What's new

Brent Reid

Supporting Actor
Joined
Apr 27, 2013
Messages
808
Location
Nottingham, UK
Real Name
Brent
Fear not everyone, I have a one-size-fits-all answer, courtesy of Alexander Pope, English poet, essayist and satirist:
  Be not the first by whom the new are tried,
  Nor yet the last to lay the old aside.
From An Essay on Criticism (1709, publ. 1711). Seems to me ol' Alex knew a thing or two about home theatre and technology in general! ;)
 

Matt Hough

Reviewer
Senior HTF Member
Joined
Apr 24, 2006
Messages
25,217
Location
Charlotte, NC
Real Name
Matt Hough
I remember a DVD player I owned that had a switch on the back for progressive and non-progressive scan televisions. My first widescreen TV (a 40" Samsung) still wouldn't let me play DVDs with the progressive scan turned on. But my next set was a real HDTV, a rear projection 42" Toshiba set, which did allow it.
 

B-ROLL

Senior HTF Member
Joined
May 26, 2016
Messages
4,593
Real Name
Bryan
So I've read this entire thread and I'm confused ...
[Image: v2k_ca01.jpg]

Which of these will work best with the new system ... ?
;)
 

Neil S. Bulk

Senior HTF Member
Joined
Sep 13, 1999
Messages
2,875
Real Name
Neil S. Bulk
Right now it's best to wait for the dust to settle. HDMI 2.1 was recently finalized as well as ATSC 3.0. It's silly to buy anything new at the moment.
 

Robert Crawford

Crawdaddy
Moderator
Patron
Senior HTF Member
Joined
Dec 9, 1998
Messages
62,050
Location
Michigan
Real Name
Robert
Same here. I really had no choice, wanting to remain 3D-capable for the foreseeable future. My OLED is BY FAR the best television I've ever owned.
TBH, I was in the same boat, as I didn't expect to upgrade until this year or next. However, with the industry eliminating 3D-capable panels, I had to upgrade last January.
 

titch

Screenwriter
Joined
Nov 7, 2012
Messages
1,753
Real Name
Kevin Oppegaard
In response to Dave Moritz's comments: I concur, but will go a step further.

Many HDR releases have been created so dark that they're unviewable in any rational projection environment.

RAH
And this, in a nutshell, is why I'm probably not upgrading my HD projector this year. 4K projectors that can be installed in a living room can't yet handle an HDR master, such as Unforgiven or Inception, encoded in HEVC at 4,000 nits. Sony's new $15,000 4K laser projector, the VPL-VW760ES SXRD, can only produce 2,000 ANSI lumens. On a 120-inch screen, that's not enough. Current projectors capable of emitting enough light to render a decent image are much too big and unwieldy, not to mention prohibitively expensive, to ceiling-mount in a normal living room.
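For anyone curious why 2,000 lumens falls so far short of a 4,000-nit master, here's a rough back-of-the-envelope sketch (my own math, not from the post above; the function name and the gain-1.0 matte-screen assumption are mine):

```python
import math

# For a matte (gain 1.0) screen, peak luminance in nits (cd/m^2) is
# approximately: projector lumens / (pi * screen area in m^2).
def screen_luminance_nits(lumens: float, diagonal_in: float,
                          aspect: float = 16 / 9, gain: float = 1.0) -> float:
    """Approximate peak white luminance of a projected image."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    width_in = height_in * aspect
    area_m2 = (width_in * 0.0254) * (height_in * 0.0254)  # inches -> meters
    return lumens * gain / (math.pi * area_m2)

# 2,000 ANSI lumens spread over a 120-inch 16:9 screen:
print(round(screen_luminance_nits(2000, 120)))  # roughly 160 nits
```

So even before optical losses, that projector is delivering on the order of 160 nits of peak white against content mastered at 4,000, which is the mismatch being described.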
 

Neil S. Bulk

Senior HTF Member
Joined
Sep 13, 1999
Messages
2,875
Real Name
Neil S. Bulk
I don't agree; from my perspective and 3D requirements, it was the best decision I've made in about ten years.
I never invested in 3D, so the lack of it in the future isn't an issue for me. Thus I'm able to wait for the technology to mature. But this highlights the problem with standards. There are so many of them! :)
 

Scott Merryfield

Senior HTF Member
Joined
Dec 16, 1998
Messages
17,951
Location
Mich. & S. Carolina
Real Name
Scott Merryfield
Right now it's best to wait for the dust to settle. HDMI 2.1 was recently finalized as well as ATSC 3.0. It's silly to buy anything new at the moment.
HDMI 2.1 is mainly for 8K video, and ATSC 3.0 is for over-the-air broadcasting. Neither will likely affect the vast majority of consumers for many years, if ever. Many displays, including my Vizio 4K with HDR10 and Dolby Vision support, don't even have a built-in OTA tuner anyway, so ATSC 3.0 is completely irrelevant in those cases. Adding a separate tuner solves the issue for the small minority who still use an OTA antenna.
 

Alberto_D

BANNED
Joined
Jan 24, 2006
Messages
215
HDR is no big deal for me. It's more an attempt to compensate for LCD technology's miserable handling of shadows and highlights than a true innovation. And it's something of a trick, because it still uses LCD screens. I can still see clipped whites, even if the shadows are less bad than usual.
LCD is, and always will be, miserable. The technology will never be good.

- IPS is a scam; it still distorts the image at even slight angles.
- The refresh rate blurs slow sideways motion.
- The viewing-angle spec is a lie: just 8 degrees off-axis creates noticeable distortion or darkens the screen.
- Light distribution is another failure: unless you're meters away, the center looks brighter than the corners.
- Response time is bad for games.
- Contrast doesn't look pleasant; it's like watching movies on a PC, not fully vivid like on a prime CRT TV.

I don't watch LCD TVs (LED-backlit TVs are just LCDs too). I make a huge effort to tolerate my PC's LCD monitor, because it has the same problems; I only use it because I have no alternative.
I laugh every time an LCD manufacturer claims some great innovation, because every time I go to the stores everything still looks bad. HDR, quantum dots, whatever: it still has all the problems I listed.

I challenge any LCD TV manufacturer to show me a model that pleases me.
 

Tim Glover

Senior HTF Member
Joined
Jan 12, 1999
Messages
8,220
Location
Monroe, LA
Real Name
Tim Glover
Another issue for me, and this might apply to others, is that we're in a MUCH different media-streaming landscape than we were in 2008, when Blu-ray won that format war. Now we are overrun with soundbars and TVs with Netflix, Amazon Prime and Apple TV built in.

The world has changed, and only a very small percentage of people I know buy discs. Which is why this 'progression' is counterproductive. This exists only to combat Dolby Vision, which is superior to HDR10. This will be confusing; Dolby Vision versus 'plain HDR' is confusing enough already.

Knowing the 4K physical media market was shrinking and most people seem quite content with Netflix and streaming, it was VITAL that this 4K UHD rollout be effective AND easy for the consumer to understand.

All of us here on the HTF are here because of our love of film and the technology behind it, and we understand that one can never really stay ahead. But dang, can we not get the specs straightened out and understood BEFOREhand?

Rant over.
 

Peter Yee

Premium
Joined
Aug 31, 1998
Messages
127
I never invested in 3D, so the lack of it in the future isn't an issue for me. Thus I'm able to wait for the technology to mature. But this highlights the problem with standards. There are so many of them! :)
Obligatory XKCD. I work in IEEE 802.11 (Wi-Fi) tech. There's always a new version to chase. At least with IEEE 802.11, there's a tangible benefit to the upgrade, not just a lateral technology that merely has a different licensing regime.
 

Neil S. Bulk

Senior HTF Member
Joined
Sep 13, 1999
Messages
2,875
Real Name
Neil S. Bulk
Yeah, I haven't bought a new router yet either. Sticking with N for now. :)

I plan on getting a new receiver when they can support HDMI 2.1 and whatever HDR standards are out. Then I'll worry about source components and wi-fi.

Neil
 

Adam Gregorich

What to watch tonight?
Moderator
Reviewer
Senior HTF Member
Joined
Nov 20, 1999
Messages
16,528
Location
The Other Washington
Real Name
Adam
Originally there was HDR10. It was what most studios/manufacturers used, as it was open source (i.e. inexpensive). Dolby Vision was superior, but expensive, since you had to pay licensing fees. I see this as the people who wanted the cheaper option improving it. I would be surprised if a piece of hardware (display or player) that can support Dolby Vision couldn't be upgraded via firmware to support HDR10+. And even if it can't, I think the HDR10-encoded movies I've watched are, as a whole, a big enough upgrade over non-HDR Blu-rays that I don't regret making the investment. I saw a prototype 8K display almost 4 years ago, and that didn't stop me from upgrading to 4K last year once Dolby Vision was available.
 

SimonTC

Grip
Joined
Jan 10, 2018
Messages
20
Real Name
Simon
Problem is, HDR10+ is a metadata-enhanced version of HDR10: an upgrade from that, but not superior to Dolby Vision. It's "DV lite", as it were.

There's no "golden reference" like there is with Dolby Vision, which is important for keeping levels the same as they were in the authoring suite.
It doesn't have 12-bit color like Dolby Vision does, which at the very least seems future-proofed.

It also only seems to benefit Samsung, which in the US market appears to be the lone holdout against licensing Dolby Vision. I'm not sure Sony or Vizio has an incentive to license HDR10+.
 

revgen

Screenwriter
Joined
Apr 7, 2010
Messages
1,272
Location
Southern California
Real Name
Dan
Correct me if I'm wrong, but my understanding is that the UHD Alliance still hasn't approved HDR10+ for the UHD standard.

The only HDR standards currently adopted for UHD are HDR10, Dolby Vision, and another esoteric standard called Philips HDR, which nobody even uses.
 

Robert Harris

Archivist
Reviewer
Senior HTF Member
Joined
Feb 8, 1999
Messages
16,535
Real Name
Robert Harris
Correct me if I'm wrong, but my understanding is that the UHD Alliance still hasn't approved HDR10+ for the UHD standard.

The only HDR standards currently adopted for UHD are HDR10, Dolby Vision, and another esoteric standard called Philips HDR, which nobody even uses.

As I understand things, in the real world, there is no such thing as an HDR standard.

Seems to be akin to Deadwood...
 
