Clean URL without HTTPS (from General Confidence V2 copy)
Parent Company
Type of Site
Individual Category
URL: Category Product Review
URL: Category Buying Guide
Name of Article Author
5.1 Does the site claim to test this product category?
6.1 Has the author on the most recent category-specific product review written at least five articles in the category on the site?
6.2 Does the author have a public LinkedIn profile?
Pillar BG Author Name w/o Spaces
Search: Author LinkedIn
Search: Author Muck Rack
6.3 Has the author been writing for at least three months?
6.4 Has the author been writing for at least six months?
6.5 Has the author been writing for at least one year?
6.6 Has the author been writing for at least three years?
6.7 Has the author been writing for at least five years?
6.8 Has the author been writing for at least ten years?
7.1 Do either the product reviews or buying guides contain real-world, non-stock photos?
7.2 Do product reviews or buying guides include images that show any of the following: hardware testing equipment; software testing screenshots; a test results chart; multiple similar products together; or the journalist pictured with the product?
7.3 Are there photos, data, mentions, etc. showing that hardware testing equipment/software testing materials were actually used?
7.4 BONUS: Has the site produced category-specific video content that includes either a journalist’s face talking or voiceover with the product?
URL: Has the site produced category-specific video content that includes either a journalist’s face talking or voiceover with the product?
8.1 Does a category-specific test method exist, i.e. How We Test Headphones, How We Test Smartphones, etc.?
URL: Does a category-specific test method exist, i.e. How We Test Headphones, How We Test Smartphones, etc.?
8.2 Has the category-specific test method been published or updated in the last year?
8.3 Has the category-specific test method been published or updated in the last two years?
8.4 Do they provide correct units of measurement that directly relate to testing the product and the performance criteria to help support that they actually tested?
8.5 Did the reviewer demonstrate that the product was used in a realistic usage scenario? (e.g. assessing a pair of headphones’ sound quality with different genres of music, riding an e-bike on various inclines, blending ice in a blender, etc.)
2023 Research PR/BG Link
8.6 Performance Criteria 1
8.7 Performance Criteria 2
8.8 Performance Criteria 3
8.9 Performance Criteria 4
8.11 Based on the performance criteria questions and your own assessment, is their claim to test truthful?
Of the three sample reviews, two tested brightness, one tested input lag, and one had no data for anything. Their testing seems spotty and dependent on who wrote the review (the review with no test data was written by someone other than Steve May).
Not every review has detailed test results including any sort of hard data, but some hands-ons do. It isn't always clear what is a hands-on at their own studio versus a recap of a controlled manufacturer demo.
Hands-on TV reviews are few and far between, with the last one posted in 2019. The publication still covers upcoming TV news but may have pivoted away from TV reviews?
Weird one - they have a good quantitative testing methodology and plenty of product reviews... for gaming monitors. There is a tiny amount of overlap between OLED TVs and gaming monitors, but not enough to fulfill our criteria. They link to their gaming-monitor how-we-test in an article about OLED TVs.
The reviews are old, but they are in-depth and use actual testing equipment and devices. That said, there are no buying guides, and given their age, the reviews are difficult to rely on beyond the few they've actually done.
The review text is very well written and references many measurements, but they're all just vague enough to leave doubt about how much testing is really happening versus how much is just research. They did bother to supply brightness ranges and make comparative statements, though, so we can give them the pass - it doesn't put them in a fantastic position either way.
They do a minor amount of testing, but they don't seem to test the same things consistently; brightness was tested on one TV but not on several others, for example.
They certainly use the TVs; there are pictures of them and everything. But "testing" is a stretch; they're just turning the TVs on and assessing them qualitatively.
Only has a single TV review. Despite the real-world images, the total lack of any kind of meaningful data calls the validity of their testing into question.
There's only a handful of reviews, and while they do show that the TVs were actually used, the lack of any claim of testing aligns with the lack of any evidence of testing beyond simply using the TV.
They certainly use the TV and put it through incredibly basic real-world testing (watching shows, etc.), but there isn't anything to suggest any kind of actually rigorous testing.
They certainly use the TV and have a "how we test" section, but there's nothing TV-specific in that section, and there's no data to suggest they did anything but use the TV for everyday watching.
Some testing may have been done given the measurements present, but those measurements appear only on SOME reviews, and there is no claim of testing. They're tentatively marked as actually testing, but that still doesn't put them somewhere good (still untrusted).
There is some degree of testing happening, and they make actual video reviews with the actual product, but greater quantitative testing needs to be performed.
There's some data in their reviews, but none of it appears to be test data; it looks like specifications data (based on the spec box towards the bottom of the page).
Product reviews are a joke - the sole reviewer admits his inexperience and provides no useful data or even really opinions. I'd gloss over them on an Amazon review page, never mind a whole dedicated site. Buying guides barely exist and seem composed mostly of info pulled off e-commerce listings with not even the barest sort of review attached. But they don't claim to test in those.
See comment on this entry. This publication is likely plagiarizing RTings for their "tests," as the numbers they report are identical to RTings'. Many of the images included in the reviews also look to be taken from other sites.
Cleverly, they don't claim to test, but they do say the writer of the article is an expert tester. They deliberately tiptoe around saying they tested anything on the list. It's obvious they didn't test, but they're trying to create the implication without actually stating it.
Despite mentions of measurements, more reviews are missing them than include them, and many of the reported numbers are listed in categories indicating they're just taken off the box.
No claim of testing, though there is some mention of testing that was performed - but only on a single review. Very few reviews to go off of. Basic buying guides.
Very odd BGs - almost no info on the listed products. But it is a BG, so I'll count it. "We have tested hundreds of HDTV’s," claims the singular BG I found.
Despite claiming they base their buying guide decisions on testing, there isn't anything in the way of actual results data to show that testing informs those decisions.
6.3 Has the author been writing for at least three months?
Yes
6.4 Has the author been writing for at least six months?
Yes
6.5 Has the author been writing for at least one year?
Yes
6.6 Has the author been writing for at least three years?
Yes
6.7 Has the author been writing for at least five years?
Yes
6.8 Has the author been writing for at least ten years?
Yes
7.1 Do either the product reviews or buying guides contain real-world, non-stock photos?
Yes
7.2 Do product reviews or buying guides include images that show any of the following: hardware testing equipment; software testing screenshots; a test results chart; multiple similar products together; or the journalist pictured with the product?
Yes
7.3 Are there photos, data, mentions, etc. showing that hardware testing equipment/software testing materials were actually used?
Yes
7.4 BONUS: Has the site produced category-specific video content that includes either a journalist’s face talking or voiceover with the product?
No
URL: Has the site produced category-specific video content that includes either a journalist’s face talking or voiceover with the product?
8.1 Does a category-specific test method exist, i.e. How We Test Headphones, How We Test Smartphones, etc.?
No
URL: Does a category-specific test method exist, i.e. How We Test Headphones, How We Test Smartphones, etc.?
8.2 Has the category-specific test method been published or updated in the last year?
N/A
8.3 Has the category-specific test method been published or updated in the last two years?
N/A
8.4 Do they provide correct units of measurement that directly relate to testing the product and the performance criteria to help support that they actually tested?
Yes
8.5 Did the reviewer demonstrate that the product was used in a realistic usage scenario? (e.g. assessing a pair of headphones’ sound quality with different genres of music, riding an e-bike on various inclines, blending ice in a blender, etc.)
Yes
2023 Research PR/BG Link
8.6 Performance Criteria 1
Yes
8.7 Performance Criteria 2
No
8.8 Performance Criteria 3
Yes
8.9 Performance Criteria 4
Yes
8.11 Based on the performance criteria questions and your own assessment, is their claim to test truthful?