Rambling thoughts time... The 1080p shot looks softer than the other two, but still good; the 2K & 4K look identical to me. There are so many variables...
I shoot 4K primarily because I want the most raw image data I can get, since I might never be there again (like the Kapoho footage of areas now buried by lava). I use a relatively new 2015 MacBook Pro w/ 16GB of memory, so editing 4K isn't hard. Mastering at the highest level can take a long time though, with the internal fans begging for mercy.
Most 2018 TVs are 4K HDR, but I really can't see a noticeable difference in 4K sharpness until I get over 55”. I think HDR is more important; it can make a difference in color/contrast, but only on higher-end, newer 4K HDR TVs that can TRULY display the HDR10 standard and/or Dolby Vision — and only on content mastered to those standards!
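For what it's worth, the 55” threshold roughly matches simple viewing geometry. Here's a back-of-envelope sketch (my own function, assuming a typical ~8-foot couch distance and the commonly cited ~1 arcminute limit of visual acuity; actual perception varies):

```python
import math

def pixel_arcmin(diag_in, h_pixels, dist_in):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen."""
    width = diag_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pitch = width / h_pixels                   # horizontal pixel pitch (inches)
    return math.degrees(math.atan2(pitch, dist_in)) * 60

# 55" screen viewed from 8 feet (96 inches):
print(pixel_arcmin(55, 3840, 96))  # 4K pixel ≈ 0.45 arcmin — below the acuity limit
print(pixel_arcmin(55, 1920, 96))  # 1080p pixel ≈ 0.89 arcmin — right at the limit
```

At that distance a 55” 4K pixel is already smaller than the eye can resolve, which is why the extra sharpness only starts to show on bigger screens (or if you sit closer).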
Our little drone imagers can't hope to compete with professional imagers in dynamic range, noise handling, and a number of other things, but they're pretty good considering. We also can't compete with their software, edit suites, or mastering skills.
Better 4K TVs upscale 1080p to 4K pretty well, which blurs the whole comparison too. On a new 75” Sony OLED, well-mastered 4K HDR content looks STUNNING. But hardly anyone is making that kind of content — a few Netflix & Amazon shows (Bosch, The Grand Tour, etc.). To stream it, you need a fast connection, a high data cap, and good equipment (Apple TV 4K, Roku Ultra).
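The data-cap point is easy to quantify. A rough sketch (assuming the ~25 Mbps Netflix recommends for Ultra HD streams; real bitrates vary with the encoder and scene):

```python
def gb_per_hour(mbps):
    """Convert a stream bitrate (megabits/sec) to data used per hour (gigabytes)."""
    return mbps * 3600 / 8 / 1000  # seconds/hour, bits→bytes, MB→GB

print(gb_per_hour(25))  # ≈ 11.25 GB per hour at a 25 Mbps 4K stream
```

A few hours of 4K a night chews through a typical monthly data cap fast, which is exactly why the cap matters as much as the connection speed.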
I watch my videos from .mov files on a good 65” Sony I got on sale at Best Buy (not OLED, those are too damn pricey)... ok, I'm tired now...
