8K sets will have HLG, and 4K sets from 2017 on should have HLG.
If you need the other flavors, you should be using Resolve. Personally, I would use Resolve for HDR work.
10-bit HDR Workflow?
-
HLG will probably be fine. Actually, the camera I've just upgraded to, which should be arriving on Tuesday, is a Sony A7SIII, so HLG might be the flavor of the day for that one. For time-lapse work with the new system, I'm working in 8K, starting with raw 14-bit images, so I'm not sure how that HDR workflow might proceed.
However, again, I'm just in the learning process, but wouldn't it also be necessary to encode in other options as well as HLG, so the content could be viewed on devices which don't recognize HLG? Or is this not a factor?
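For what it's worth, my understanding is that delivering HLG is mostly a matter of tagging the encode with the right color metadata; HLG itself carries no HDR metadata, which is part of why it degrades gracefully on non-HLG displays. A minimal sketch with ffmpeg and x265 (the file names here are hypothetical, and the command is only printed, not run):

```shell
# Hypothetical sketch: encode a 10-bit HLG (BT.2020) file with ffmpeg/x265.
# arib-std-b67 is the HLG transfer characteristic; no HDR metadata block is needed.
CMD='ffmpeg -i source.mov \
  -c:v libx265 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc arib-std-b67 -colorspace bt2020nc \
  -c:a copy hlg_master.mp4'
# Print rather than execute, since source.mov is a placeholder.
echo "$CMD"
```

The important parts are the 10-bit pixel format (`yuv420p10le`) and the three color tags; everything else is ordinary encoding choice.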
-
HLG doesn't require metadata. Is HLG not acceptable for your purposes?
-
The 12G Extreme or Kona 5 cards are the way to go.
Here is what is shown on B&H for the Mini Monitor 4K.
NTSC
HDMI (8/10/12-Bit 4:4:4/4:2:2/4:2:0)
DCI 4K: 23.98/24/25 fps
UHD 4K: 23.98/24/25/29.97/30 fps
1080p: 23.98/24/25/29.97/30/50/59.94/60 fps
1080PsF: 23.98/24/25/29.97/30 fps
1080i: 50/59.94/60 fps
720p: 50/59.94/60 fps
HDR Support
HDR metadata packing, HLG and PQ transfer characteristics. (HDMI supports static HDR metadata only)
What exactly does static metadata only mean?
Edit Update:
I asked that same question on the Blackmagic Resolve forum. Here is the answer that was given:
This applies only to Dolby Vision and HDR10+. The other HDR formats are static only.
In addition, in order to implement dynamic metadata from Dolby you need an L2 license from them, and their interface, which is about US$15,000. I believe AJA's high-end interface does dynamic metadata too.
Last edited by Barry C; 01-03-2021, 03:52 AM.
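As I understand it (not authoritative), "static" means one set of mastering and luminance values is written once and applies to the whole program, versus Dolby Vision/HDR10+ carrying per-scene or per-frame adjustments. For HDR10, for example, those static values are handed to the encoder up front. The x265 parameters below are illustrative only, for a hypothetical 1000-nit P3-D65 master:

```shell
# Hypothetical sketch: static HDR10 metadata via x265 parameters.
# master-display and max-cll are written once and cover the entire program.
HDR10_PARAMS='hdr10=1:master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):max-cll=1000,400'
# Print the command rather than run it, since graded.mov is a placeholder.
echo "ffmpeg -i graded.mov -c:v libx265 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params \"$HDR10_PARAMS\" hdr10_master.mp4"
```

Note the transfer characteristic is PQ (`smpte2084`) here rather than HLG, since HLG doesn't use this metadata at all.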
-
The 12G Extreme or Kona 5 cards are the way to go.
Here is what is shown on B&H for the Mini Monitor 4K.
NTSC
HDMI (8/10/12-Bit 4:4:4/4:2:2/4:2:0)
DCI 4K: 23.98/24/25 fps
UHD 4K: 23.98/24/25/29.97/30 fps
1080p: 23.98/24/25/29.97/30/50/59.94/60 fps
1080PsF: 23.98/24/25/29.97/30 fps
1080i: 50/59.94/60 fps
720p: 50/59.94/60 fps
Last edited by Jerry; 01-03-2021, 12:57 AM.
-
It would appear that with 4K it might only do 4:2:0, and not 4:2:2. I looked at the 4K 12G Extreme, and it might have the same limitation, although it's unclear whether, with that one, they're just referring to it dropping down to 4:2:0 at 4K 60.
The other thing it said for the 12G was that, coming from HDMI, it would only do static HDR. I'm not really sure what they mean by this.
-
The Mini Monitor 4K will also process HDR to an external monitor with auto switching.
It does up to 30p in 4K, but can be dropped to 1080 60p and still handle HDR.
Doing 8K HDR will be tricky.
-
Thanks Anton,
I'm really hoping to avoid going down the dedicated PC graphics card rabbit hole. Also, for the kind of projects I work on, absolutely critical color accuracy isn't essential. I do various forms of landscape: astro, ocean, river, etc. So, I don't mind if the colors are not reference-perfect. I go for what looks good on the devices I'm viewing it on. Before final publication, I always view the content on multiple devices, ranging from smartphones and tablets to a 10' home theater projector screen. If it looks good over that range of devices, then I'm fine with it.
I also use Resolve, and it would appear from their specs that their DeckLink 4K cards won't support Edius, and I assume vice versa. So, since I regularly use both these programs, you can see my hesitation to go the card route. With Resolve, it would seem that there is some sort of flag setting on the cards which must be activated for HDR support. I'm hoping Edius doesn't have this same limitation. Again, absolute color accuracy isn't essential for my workflow. Plus, I FAR prefer working with the Edius interface, and all final timeline output is on Edius.
Any thoughts?
-
I don't think it will work, because the EDIUS preview is not accurate via a PC graphics card.
I would look for a dedicated video output card.
-
For the new year, I've decided to start doing projects in a 10-bit, 4K-to-8K HDR workflow. Rather than buying a ridiculously expensive HDR monitor with limited capabilities for the money, I've decided to go with probably an LG CX OLED TV, about 55". Of course, these are HDMI 2.1, so to get all the 10-bit HDR benefits the TV has to offer, I'm already having to upgrade my one-year-old Nvidia 2080 Ti, which is HDMI 2.0, to a 3090, which is 2.1. My son is thrilled, since his gaming computer will end up with the 2080 Ti. I will of course still run this in a two-monitor configuration.
My question is: should this work well for grading 10-bit HDR with Edius X Workgroup, running a clean feed from the graphics card to the TV via an HDMI 2.1 cable? I realize that the TV might not be 100% spot-on unless and until calibrated. I'm hoping/assuming that Edius will be able to properly flag the TV for the 10-bit 4:2:2 content without having to go through a hardware card. Would that assumption be correct?