It used to be a good test of how bright highlights would be in real content, but certain manufacturers *coughsamsungcough* started detecting the pattern - a pure white square on a pure black background, taking up 2, 10, 25 or 50% of the screen to simulate different sizes of bright elements. The 2% window stands in for small bright specular reflections off chrome or the like, or the sun, while the 25 and 50% squares are more for when there are large bright elements on screen, like the ending "basement" scene of Annihilation or the bit in The Matrix where Morpheus is in the armchair in the white void with Neo. Once the TV detects the pattern it abandons all balance of the image, cranking the contrast as hard as possible and pushing the white square as bright as possible to game the tests for reviews.
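For anyone who hasn't actually seen one of these patterns, here's a rough Python/Pillow sketch of the geometry - the 1080p resolution and plain 8-bit white are just my assumptions to show the idea, the real HDR patterns rtings use are proper 10-bit HDR video files rather than PNGs:

```python
# Rough sketch of a peak-brightness "window" pattern: a pure white square on a
# pure black background covering a given fraction of the screen area.
# Resolution and 8-bit SDR white are assumptions, just to show the geometry.
from PIL import Image

WIDTH, HEIGHT = 1920, 1080

def window_pattern(coverage):
    """White square on black, covering `coverage` fraction of the screen area."""
    img = Image.new("RGB", (WIDTH, HEIGHT), (0, 0, 0))       # pure black background
    side = round((coverage * WIDTH * HEIGHT) ** 0.5)         # square side for that area
    offset = ((WIDTH - side) // 2, (HEIGHT - side) // 2)     # centre the square
    img.paste(Image.new("RGB", (side, side), (255, 255, 255)), offset)
    return img

# The standard window sizes mentioned above
for pct in (2, 10, 25, 50):
    window_pattern(pct / 100).save(f"window_{pct}pct.png")
```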
So that's why rtings created the "Real Scene" test, to show how the TV handles real-world content rather than just a test pattern - read what's in the yellow/golden box here -
https://www.rtings.com/tv/tests/picture-quality/peak-brightness - and also look at the weighting for the scores: it's 63% for Real Scene brightness, i.e. they know Samsung games the pattern, so they're trying to mitigate that with their own more honest test video.
Generally the HDR Real Scene brightness should line up with the 10% window figure if they aren't bullshitting the tests, though sometimes it's closer to the 25% figure depending on how the TV handles HDR. Not 100% on this, but how I understand it: if the Real Scene brightness is similar to the 10% window brightness, the TV will push smaller objects brighter but the overall image will on average be less bright, while a TV that matches the Real Scene brightness to the 25% window brightness will have a higher APL (average picture level), so the image outside the small highlights (sun, reflections, pure white things) and the shadows will be brighter and show more detail (as long as the source isn't overriding that happening).
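A toy way of picturing that: if you have the window measurements and the Real Scene figure for a TV, just check which window it sits closest to. The numbers below are completely made up for illustration, not measurements of any real TV:

```python
# Toy illustration of "which window does the Real Scene figure track?".
# All nit values are invented examples, not real measurements.
def closest_window(real_scene_nits, window_nits):
    """Return the window size whose brightness is closest to the Real Scene figure."""
    return min(window_nits, key=lambda w: abs(window_nits[w] - real_scene_nits))

tv_a = {"2%": 900, "10%": 800, "25%": 650, "50%": 500}   # hypothetical TV A
tv_b = {"2%": 750, "10%": 700, "25%": 680, "50%": 600}   # hypothetical TV B

print(closest_window(790, tv_a))   # -> "10%": small highlights pushed hard, lower APL overall
print(closest_window(670, tv_b))   # -> "25%": higher APL, brighter image outside the highlights
```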
The reason Samsung started doing this is so they can put a sticker on the TV that says "HDR 2000" or whatever number - they're claiming it goes to 2000 nits - but the issues are these:
* The test doesn't specify real world content, just test patterns
* It doesn't need to hold that brightness for more than a couple of seconds before it can drop to a lower value. So you could be looking at the sun in a film for 4 seconds and a Samsung will significantly dim after a second or two, before the shot even ends, whereas a Sony will hold its highest peak brightness for about 20 seconds at least, and more likely for several minutes before dimming, i.e. longer than any average shot in a normal film would ever be. A shot of something uber bright like the sun is rarely going to run more than ~10 seconds anyway. (There's a little sketch after this list showing how you'd spot this in a brightness log.)
* You don't know how the TV is set up when they do the test - it probably has every possible setting that increases contrast turned on, and if you showed real content on it in that state it would look like cartoony trash.
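To make the brightness-over-time point concrete (the sketch mentioned in the list above): if you logged a luminance reading once a second while a bright window or a sun shot was on screen, it's easy to see how long the panel actually holds near its peak before the limiter kicks in. The function, the one-reading-per-second assumption and the nit numbers below are all invented for illustration, not anyone's real measurements:

```python
# How long does the panel stay near its own peak before dimming?
# Assumes one luminance reading per second; all numbers are made up.
def seconds_above(readings_nits, threshold_ratio=0.9):
    """Count seconds before brightness first drops below 90% of its peak."""
    peak = max(readings_nits)
    count = 0
    for nits in readings_nits:
        if nits < threshold_ratio * peak:
            break                      # first big drop = the limiter kicking in
        count += 1
    return count

tv_games_the_test = [1900, 1850, 1200, 900, 850, 840, 830]   # bright for ~2 s, then crashes
tv_holds_peak     = [950] * 25 + [700] * 5                   # holds peak for ~25 s

print(seconds_above(tv_games_the_test))   # -> 2
print(seconds_above(tv_holds_peak))       # -> 25
```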
So to answer the question: rtings do test the whole screen, by using the Real Scene test. Here is the SDR version - there's an HDR version linked on the page above, but you have to download it to play it properly, most easily on your TV's internal media player. This is the same video, just in SDR, so you get the idea anyway:
I don't think rtings is good for evaluating subjective stuff like upscaling quality, tonemapping or motion interpolation, but the brightness tests are pretty objective imo, so it's a great resource for cutting through manufacturer bullshit. This is why Sony mid-range LCDs smash "high-end" Samsung LCDs for HDR picture quality even when the spec sheet would imply the Samsung should smash the Sony. They claimed "HDR 1000" for the Q60T and it's fucking 450 nits, seawards. The XF90 was 950 nits and cost the same, or like £100 more at most.