For the new year, I've decided to start doing projects in a 4K-to-8K, 10-bit HDR workflow. Rather than buying a ridiculously expensive HDR monitor with limited capabilities for the money, I've decided to go with an LG CX OLED TV, probably the 55". These are HDMI 2.1, so to get all the 10-bit HDR benefits the TV has to offer, I'm already having to upgrade my one-year-old Nvidia 2080 Ti, which is HDMI 2.0, to a 3090, which is HDMI 2.1. My son is thrilled, since his gaming computer will end up with the 2080 Ti. I will of course still run this in a two-monitor configuration.
My question is: should this work well for grading 10-bit HDR with Edius X Workgroup, running a clean feed from the graphics card to the TV via an HDMI 2.1 cable? I realize that the TV might not be 100% spot-on unless and until it's calibrated. I'm hoping/assuming that Edius will be able to properly flag the TV for the 10-bit 4:2:2 content without having to go through a hardware I/O card. Would that assumption be correct?
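One way I plan to sanity-check the chain, independent of Edius, is to confirm what Windows itself reports the GPU is sending to the TV. This is just a rough Python sketch on my part, assuming Windows 10/11 where dxdiag's saved report includes HDR/color lines for each display; the exact field names vary by Windows build, so the keywords below are guesses to adjust:

import os
import subprocess
import tempfile

def dump_display_color_info():
    # dxdiag /t writes a full diagnostic report to the given file and
    # blocks until the report is complete.
    report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
    subprocess.run(["dxdiag", "/t", report], check=True)

    # Keywords assumed to appear in the Display Devices section on
    # Windows 10/11 (e.g. "HDR Support: Supported"); adjust as needed.
    keywords = ("hdr", "advanced color", "color space", "bit")
    with open(report, encoding="utf-8", errors="ignore") as f:
        for line in f:
            if any(k in line.lower() for k in keywords):
                print(line.strip())

if __name__ == "__main__":
    dump_display_color_info()

If the report shows a 10-bit, BT.2020/PQ color space on the TV output, I'd at least know the GPU-to-TV leg is doing its job before suspecting Edius.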