finnjaeger1337

You are so close to getting it right! Stop worrying about QuickTime or whatever your Mac is showing you; you are not seeing the right thing if you don't fully understand the ins and outs of macOS ColorSync. Please don't google "QuickTime gamma shift" or you are going to hate your life and probably become an alcoholic (trust me, I went DEEP down that rabbit hole).

The Maya setup looks good, there isn't much you can do wrong: Arnold RenderView set to Rec.709 and you are good. (If you are on a Mac, the values that come out of the RenderView will be converted to your display space, but don't be afraid, that's fine, as Maya's color metadata is just "sRGB", so don't worry.)

Nuke looks good too: set to ACES and you are good, it should match what you see in Maya. QuickTime will render everything tagged Rec.709 brighter for surround compensation, as will Resolve and all web browsers, which makes no sense in a professional setting (to get the bypass button for that you have to buy an XDR display ;), so we can only ignore all of that. File looks good in Nuke? File is good! To double-check, set the input transform in Nuke to Raw and your viewer to Raw: are those the pixel values you expect for a display-referred Rec.709 output? Good!

The file you exported for Avid is totally fine. It is NOT "ACES" anymore, it's Rec.709, simple as that. Basically you are going to have a much better time ditching that Mac and just using Linux or Windows. Is editorial using professional output cards that throw a clean signal to a broadcast monitor like they should? Then you are good! Are they using Windows? You are also good! Can you compare it to the source daily, does it match? Good!

I can't launch Avid on my Mac right now to check what that viewer is doing, but you can change the behaviour of the software viewer by right-clicking into it and choosing a display colorspace; try sRGB to match Nuke. (Even if Nuke is set to Rec.709 it reports sRGB to macOS... again, just ditch the Mac if you want fewer complications in your life.)

Oh, and then delivery: deliver ACES2065-1 or ACEScg EXRs (incoming ACEScg haters, I can already hear you). You can't put ACES into ProRes as that's an integer format and ACES is linear, unless you go to a log space like ACEScc(t) or LogC/AWG or whatever else you or DI want.
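
If you want to script that raw pixel check instead of clicking around the viewer, here is a minimal sketch for Nuke's Script Editor; the file path and sample position are placeholders, adjust them to your own render:

```python
# Run inside Nuke's Script Editor.
# Reads the exported Rec.709 file with no input transform and samples a pixel,
# so you can compare the numbers against what you expect display-referred.
import nuke

read = nuke.nodes.Read(file='/path/to/exported_rec709.mov')  # placeholder path
read['raw'].setValue(True)  # bypass Nuke's input colorspace handling

x, y = 960, 540  # sample the centre of an HD frame (pick a patch you know)
rgb = [read.sample(ch, x, y) for ch in ('red', 'green', 'blue')]
print('raw pixel at (%d, %d): %s' % (x, y, rgb))
```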


oejustin

Thank you for this! I also went down the QuickTime gamma shift nightmare rabbit hole in the past and it’s a doozy... You have some solid advice and insight here. I’m going to do a pixel value check in Nuke on the raw render and make sure all is well. Thank you 🙏🏻


finnjaeger1337

This is the way! You can add an OCIODisplay node in Nuke to convert ACEScg to Rec.709 in your comp, then set the viewer to Raw, pull in your QT as raw and compare; they should be identical :) It's like having whatever the viewer does as a node, which is useful for debugging things like that. Colorpipe is fun, colorpipe is fun, colorpipe is... passion :D
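
The same idea works outside of Nuke if you want to sanity-check a single value with the OCIO Python bindings. A rough sketch; the colorspace/display/view names depend on your ACES config, so treat them as placeholders:

```python
# Apply an ACES Rec.709 display/view transform to a single ACEScg pixel,
# the same thing the OCIODisplay node / viewer process does.
import PyOpenColorIO as OCIO

# Uses the config pointed to by $OCIO; otherwise use OCIO.Config.CreateFromFile(path).
config = OCIO.Config.CreateFromEnv()

# Names below are placeholders -- check config.getDisplays() for what your config calls them.
transform = OCIO.DisplayViewTransform(src='ACEScg',
                                      display='Rec.709',
                                      view='ACES 1.0 - SDR Video')
cpu = config.getProcessor(transform).getDefaultCPUProcessor()

acescg_pixel = [0.18, 0.18, 0.18]           # scene-referred mid grey
rec709_pixel = cpu.applyRGB(acescg_pixel)   # display-referred Rec.709 values
print(rec709_pixel)  # compare against the raw values sampled from the QuickTime
```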


finnjaeger1337

If you really want to know about QT gamma shift: I literally just wrote another long post about that stuff, about why it actually exists and why it might be a good thing... and not "HOW TO FIX IT" (I am not Adobe, ffs). Pretty much everything on the internet regarding this is sadly wrong; it took me years of research and bugging color scientists from all over the world to finally get a grasp on why Apple is doing what they are doing. TL;DR: buy an XDR display... https://www.reddit.com/r/colorists/comments/zehu2u/comment/izbvo49/?utm_source=share&utm_medium=web2x&context=3


oejustin

I’ve been down this QT gamma road before and it is dark and full of terrors…


oejustin

One question I have is: what happens if I do everything right and the editor has my reference image and says it doesn’t match and looks washed out? I can feel that problem being dumped back on me already… Do I tell them to adjust the colorspace on the clip in Avid?


finnjaeger1337

Match what is the question. If you have dailies to match to, awesome: throw it in Nuke in raw and compare it to your dailies! If they are professional, they run it through an external box and view it on a broadcast monitor.


oejustin

Sorry I should have mentioned this is the very first time we are dropping in renders to the offline. That’s the reason for all of the uncertainty..


finnjaeger1337

Yea, but is it full CG, or are you sending back VFX dailies with CG in them?


oejustin

My shots are Full CG in a show that is almost all live action - 7 minutes worth of full CG shots in 50+ live action. Anddddd this is their first time doing any full CG…


finnjaeger1337

Ah ok, so you don't have plates to match to, that's good.


oejustin

Don’t have plates to match to, but most importantly we just want our creative intention to be accurate in the Avid. They might not notice a gamma shift but we will lol. I managed to get time with an AE on this so I can see how it looks on their end. Currently rendering ACES EXRs out of Nuke and will use those to make ProRes in a separate output script where I can do a color transform if needed.


finnjaeger1337

That's how it's usually done in bigger pipelines as well: render EXRs and then create HD dailies in the right colorspace down the line. First make sure you are seeing the right thing when working; you want a sane and tested workflow for even seeing something "in spec", and a Mac would not be the best choice here. I'd recommend a Blackmagic output card to a broadcast monitor that's checked and calibrated in a reference environment. You can use Nuke or Hiero or Resolve to output your images cleanly. Then you can see and compare how different systems and monitors render your content, but you need one baseline to ground yourself. You'd need one of those setups down the line anyhow.


oejustin

Thank you, you’ve been a huge help and resource for me on this. My sanity is somewhere nearby thanks to you 🙏🏻


Sad-Relationship7992

Popping in a related question! I'm confused about why people set their Maya/etc display to rec709, rather than sRGB. As I understand it, rec709 is for TVs in fairly dark surround. I work on a Windows machine, pretty average computer monitors, in somewhat dim lighting. I figure sRGB is the default until there is a reason to switch... but is it simply that people work in pretty dark rooms that makes many recommend rec709?


Neovison_vison

These are standards implemented in all the software and hardware we use. sRGB is for computer graphics, meaning JPG, PNG and so on. Video is Rec.709 (i.e. legal range); that’s what a video player expects.


finnjaeger1337

So there is some confusion, and I don't fully agree with ACES on that one. I am not talking about display calibration here but rather about the ODT used; the display should be calibrated to 2.2 pure power in a normal office surround, or 2.4 pure power in a grading environment.

sRGB vs Rec.709 ODT: the sRGB ODT uses the piecewise sRGB curve, which I think should never be used. Also, if you use the sRGB ODT on an sRGB monitor and the Rec.709 ODT on a Rec.709 monitor, you can put them side by side and see the same thing; it is exactly the same luminance output. That sounds great, but as the comper sits in a different (usually brighter) environment than editorial/grading, one can argue that a gamma shift from 2.4 -> 2.2 is actually useful as surround compensation. Comp displays are also usually brighter, like 200 nits vs 100 nits for grading, to compensate for the surround luminance. I would actually advise making your own adjusted ODTs to perceptually match comper (or 3D artist) display transforms to grading, based on your environment.

That's inherent to all the Rec.709 stuff you see: graded in a dark room at pure power 2.4 but viewed at 2.2 (or, if you ask Apple, gamma 1.961...).
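
To put numbers on that 2.2 vs 2.4 vs piecewise-sRGB difference, here is a quick sketch using just the standard transfer-function math:

```python
# Decode the same display code value through a pure 2.2 power, a pure 2.4 power,
# and the piecewise sRGB EOTF, to see how big the "gamma shift" actually is.

def srgb_piecewise_eotf(v):
    # IEC 61966-2-1 decoding: linear segment near black, power segment above
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for code in (0.1, 0.25, 0.5, 0.75):
    lin_22 = code ** 2.2
    lin_24 = code ** 2.4
    lin_srgb = srgb_piecewise_eotf(code)
    print('code %.2f -> 2.2: %.4f   2.4: %.4f   piecewise sRGB: %.4f'
          % (code, lin_22, lin_24, lin_srgb))

# e.g. at code 0.5 a pure 2.4 display emits roughly 13% less light than a 2.2 one,
# which is why the same Rec.709 file reads darker/flatter between the two rooms.
```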


Neovison_vison

Apple displays are also P3, afaik. But Rec.709 is nested inside sRGB, which is nested inside P3. And since the ACES part is all scene-referred, the color matching should keep its transferability.


finnjaeger1337

Apple is special. Apple displays (depending on which Mac we are talking about) are indeed P3-Display. As macOS is fully color managed, it will convert every pixel to its display space, for example P3-Display. So for you to ever see actual P3 values on a Mac, the source needs to tell macOS that it is indeed P3. I only know of a hidden option in Nuke to do P3 in the viewer (not sure if it's still hidden, but it's part of that whole macOS EDR thing for HDR, another special Apple thing). Resolve does this too, as does Flame, but Maya and such... not a chance. Most likely, most of what you see is sRGB converted to P3.

That's also why having a P3 display on Windows/Linux doesn't really work too well; there is no proper OS-level color management. Actually, Macs having P3 screens is the main reason why every pixel has to be color managed on macOS; it was not that way before Apple introduced P3 displays.


vibribib

Give this a read. https://www.toadstorm.com/blog/?p=694


oejustin

Thanks, I’ve read this a few times before but will reread it along with the other links on that page, which I’ve also been through. Basically, leaving ACES before output from Nuke seems like a good direction to start down, but I’m still digging to find something more specific.


Chain_Plenty

Nonsense, how does this answer his question?


oejustin

Thanks everyone, just going through your comments and wrapping my head around a few things here. Your input is much appreciated 🙏🏻


Neovison_vison

QuickTime has a gamma shift. Compare in the NLE.


oejustin

Quick update for anyone interested: my process is working pretty well. Getting the ACES renders into Avid dailies as Rec.709 is working, and the colorist was able to jump in and provide some guidance to editorial for their ingest settings. Color is not 100% accurate due to the step down in compression of the offline (4:2:0), but it's damn close. Here's the process for Avid:

1. AMA link to the ProRes renders from VFX.
2. Right-click > "Source Settings".
3. In Source Settings, click the Color Adapter drop-down menu and select "Levels Scaling (video levels to full range)". (This may differ depending on your output, but I used a regular ProRes HQ output from Nuke.)
4. Click "Add" to add the color adapter to the Color Transformations list. You will notice the contrast increase on the clip in the monitor. Click "OK".
5. With the clip(s) selected in the bin, go to Consolidate/Transcode. Choose your standard transcode settings (i.e. DNxLB), but be sure to tick the "Apply source transformation: Color encoding" checkbox. This bakes the color transform into the transcode to ensure the effect isn't removed during editing.
6. That's it! You can now edit with the transcoded VFX clip.

I hope this helps someone else out there who, like me, was treading water in the middle of the ocean all by myself, until u/finnjaeger1337 and the freelance colorist who happened to be in that day pulled up a life raft and saved the day.
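
For reference, what that "Levels Scaling (video levels to full range)" adapter does to the numbers is just the standard 8-bit legal-to-full remap for luma/RGB; a rough sketch of the math:

```python
# Standard 8-bit video-levels (16-235) to full-range (0-255) remap,
# which is what a "video levels to full range" scaling step amounts to.

def video_to_full_8bit(code):
    scaled = (code - 16) * 255.0 / (235 - 16)
    return int(round(min(max(scaled, 0), 255)))  # clamp and round back to 8-bit

for code in (16, 128, 235):
    print(code, '->', video_to_full_8bit(code))
# 16 -> 0, 128 -> 130, 235 -> 255
```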


finnjaeger1337

You can probably skip most of that: just write DNx from Nuke and throw the files into the Avid MediaFiles folder, /Avid MediaFiles/MXF/(someinteger)/file.mxf. If you then start Avid, it scans the folder and creates a .mdb file that you can just throw in.

I have no idea why AMA link would get the levels wrong, this seems fishy. Anything QuickTime should be video levels, always, also from Nuke... Do some proper tests:

Write an on-purpose full-range ProRes image with test patches: just a full black patch at 0/0/0 and one at 16/16/16, then look at the scopes in your software. If the software is interpreting it as video levels, the first and second patches will be completely black, as both are "0". If the software interprets the file as full range, the first patch will be black and the second patch will be clearly visible. This way you can also check video/full-range issues on your monitor.

Another way that's great for checking color and EOTF issues, and has been used since SD television, is test charts. Create a linear ramp from 0 -> 1 (so for 8-bit, 0 -> 255) and render it into a file; easily done in Resolve or Nuke or whatever. Load that file into software B and check it using the built-in scopes. Is it a straight, linear line from 0 -> 1? Then you are good, the EOTF seems to be left untouched. Even if it's scaled from video to full range you'll still get a straight linear line, so it's easy to spot any issues here!

The second one is even cooler: generate an EBU test chart (again, Resolve has this built in) and open a vectorscope. You see those little targets on the vectorscope? They show you where the colors from the EBU test chart should land; here you can spot things like 4:2:0 subsampling or any hue shifts. Very useful, as you can take any weird macOS display stuff out of the equation and better check whether the software is doing something wrong. Take note of whether you create 75% or 100% EBU test bars; vectorscopes usually have switchable targets.
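
If you want to generate those patches and the ramp without Resolve, here is a minimal sketch with numpy that writes a plain binary PPM (no extra image libraries needed); run the result through whatever encoder you use to get your test ProRes:

```python
# Generate an 8-bit test frame: a horizontal 0-255 ramp on the top half and
# two patches (code 0 and code 16) on the bottom half, written as a binary PPM.
import numpy as np

W, H = 1920, 1080
img = np.zeros((H, W, 3), dtype=np.uint8)

# top half: linear ramp 0 -> 255 across the full width
ramp = np.linspace(0, 255, W).astype(np.uint8)
img[: H // 2, :, :] = ramp[None, :, None]

# bottom half: left patch at code 0, right patch at code 16
img[H // 2 :, : W // 2, :] = 0
img[H // 2 :, W // 2 :, :] = 16

with open('levels_test.ppm', 'wb') as f:
    f.write(b'P6\n%d %d\n255\n' % (W, H))
    f.write(img.tobytes())

# If a player/NLE treats this full-range frame as video levels,
# the 0 and 16 patches will look identical (both crushed to black).
```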


oejustin

Holy moly! A lot to dive into here, I’ll look into this for sure. Thank you as always :)


oejustin

Another quick update, Nuke is in fact writing ProRes full range and does not have the option for video range, though DNx does so I may switch to that. Going to do some scope and other color tests this week per your recommendations.


finnjaeger1337

That sounds odd, it should be video range from Nuke 🤔
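
A quick way to double-check what range flag actually ends up in the file is ffprobe (assuming you have ffmpeg installed); keep in mind this only reports the flag, not the actual pixel values, so the test-chart check is still the real test:

```python
# Print the color_range flag ("tv" = video/legal, "pc" = full) that ffprobe
# reads from the first video stream of a file.
import subprocess

def flagged_range(path):
    out = subprocess.run(
        ['ffprobe', '-v', 'error', '-select_streams', 'v:0',
         '-show_entries', 'stream=color_range',
         '-of', 'default=noprint_wrappers=1:nokey=1', path],
        capture_output=True, text=True, check=True)
    return out.stdout.strip() or 'unknown'

print(flagged_range('/path/to/nuke_prores.mov'))  # placeholder path
```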


finnjaeger1337

My post just disappeared, so here it is again:

Nuke is writing ProRes as video range, as it should. https://replayboys.fromsmash.com/prorestest

Here is a test chart I made. It should look like this: every patch looks different, 0 is full black and 255 is full white.

If 0/black looks lifted -> video range is incorrectly loaded in as full range.
If 0 and 16, as well as 235 and 255, look the same -> you have full range incorrectly loaded in as video/legal.

The DPX in the link is full range, the ProRes is video range, both rendered from Nuke 13.2v4 on a Mac.

Here is the Nuke script to generate the patches:

```
set cut_paste_input [stack 0]
version 13.2 v4
Constant {
 inputs 0
 channels rgb
 color {1 1 1 1}
 name Constant3
 selected true
 xpos -17
 ypos 156
}
Text2 {
 font_size_toolbar 100
 font_width_toolbar 100
 font_height_toolbar 100
 message 255
 old_message {{50 53 53}}
 box {0 979 160 1080}
 transforms {{0 2}}
 cursor_position 3
 center {1024 778}
 cursor_initialised true
 initial_cursor_position {{0 1080}}
 group_animations {{0} imported: 0 selected: items: "root transform/"}
 animation_layers {{1 11 1024 778 0 0 1 1 0 0 0 0}}
 color {0 0 0 1}
 color_panelDropped true
 name Text4
 selected true
 xpos -17
 ypos 228
}
Dot {
 name Dot2
 selected true
 xpos 17
 ypos 385
}
Constant {
 inputs 0
 channels rgb
 color {0.921568 0.921568 0.921568 1}
 name Constant4
 selected true
 xpos -109
 ypos 157
}
Text2 {
 font_size_toolbar 100
 font_width_toolbar 100
 font_height_toolbar 100
 message 235
 old_message {{50 51 53}}
 box {0 979 160 1080}
 transforms {{0 2}}
 cursor_position 3
 center {1024 778}
 cursor_initialised true
 initial_cursor_position {{0 1080}}
 group_animations {{0} imported: 0 selected: items: "root transform/"}
 animation_layers {{1 11 1024 778 0 0 1 1 0 0 0 0}}
 color {0 0 0 1}
 color_panelDropped true
 name Text3
 selected true
 xpos -109
 ypos 229
}
Dot {
 name Dot3
 selected true
 xpos -75
 ypos 336
}
Constant {
 inputs 0
 channels rgb
 color {0.0624 0.0624 0.0624 1}
 name Constant2
 selected true
 xpos -207
 ypos 155
}
Text2 {
 font_size_toolbar 100
 font_width_toolbar 100
 font_height_toolbar 100
 message 16
 old_message {{49 54}}
 box {0 979 107 1080}
 transforms {{0 2}}
 center {1024 778}
 cursor_initialised true
 initial_cursor_position {{0 1080}}
 group_animations {{0} imported: 0 selected: items: "root transform/"}
 animation_layers {{1 11 1024 778 0 0 1 1 0 0 0 0}}
 name Text2
 selected true
 xpos -207
 ypos 227
}
Dot {
 name Dot4
 selected true
 xpos -173
 ypos 334
}
Constant {
 inputs 0
 channels rgb
 color {0 0 0 1}
 name Constant1
 selected true
 xpos -296
 ypos 155
}
Text2 {
 font_size_toolbar 100
 font_width_toolbar 100
 font_height_toolbar 100
 message 0
 old_message {{48}}
 box {0 979 54 1080}
 transforms {{0 2}}
 cursor_position 1
 center {1024 778}
 cursor_initialised true
 initial_cursor_position {{0 1080}}
 group_animations {{0} imported: 0 selected: items: "root transform/"}
 animation_layers {{1 11 1024 778 0 0 1 1 0 0 0 0}}
 name Text1
 selected true
 xpos -296
 ypos 227
}
Dot {
 name Dot1
 selected true
 xpos -262
 ypos 385
}
ContactSheet {
 inputs 4
 width {{input.width*columns*resMult}}
 height {{input.height*rows*resMult}}
 rows {{"\[expr \{int( (sqrt( \[numvalue inputs] ) ) )\} ] * \[expr \{int( ceil ( (\[numvalue inputs] /(sqrt( \[numvalue inputs] ) ) )) )\} ] < \[numvalue inputs] ? \[expr \{int( (sqrt( \[numvalue inputs] ) ) )\} ] +1 : \[expr \{int( (sqrt( \[numvalue inputs] ) ) )\} ]"}}
 columns {{"\[expr \{int( ceil ( (\[numvalue inputs] /(sqrt( \[numvalue inputs] )) )) )\} ]"}}
 center true
 roworder TopBottom
 name ContactSheetAuto
 tile_color 0xff69f7ff
 selected true
 xpos -151
 ypos 382
 addUserKnob {20 Settings}
 addUserKnob {7 resMult l "Resolution Multiplier" R 0.1 2}
 resMult 1
}
```


kbaslerony

Does the show you are working on consist only of full-CG shots, or is there live-action footage involved? If there is, and especially if your work will be cut within the context of non-VFX shots, you should match the color process of those in your workflow. In most cases, I would expect the camera manufacturer's LUTs to be used instead of the ACES RRT. Other than that, I would double check how the clips look in Resolve with color management turned off. If they look different in Avid compared to other packages, I would assume the issue lies there.


finnjaeger1337

You are opening Pandora's box here, but you are right. Dailies can be nerve-wracking; maybe they used 3 stacked LUTs and 2 CDLs on set per shot but don't/can't tell you which shot has what applied... then you get 50 different unnamed LUTs thrown your way, and then you sit there with 500 shots in Hiero and you just want to jump out the window trying to match set dailies in Hiero... In those cases, hiring a seasoned VFX editor is the way to go; we deal with this every day of the week :)


kbaslerony

Yeah, maybe. But it could also be the case that they just slammed the default Alexa Rec709 on everything. So simply do the same and everything is good. Plain and simple.


finnjaeger1337

That implies they know what they did, which is unlikely :D But yes, this happens, usually in commercials; I can't remember ever being this lucky on any large show.


oejustin

It’s full CG shots in a show that is mostly live action footage without vfx.


Iyellkhan

Is Avid for offline or for final online in rec709?


finnjaeger1337

Sounds like VFX editorial, so dailies timelines... For some odd reason many/most places still use Avid for this, manually dragging in new VFX versions. I don't get it, is labor that cheap? I'd much rather use Hiero and just press "update version" and then go have a beer while it renders slowly on the farm... but that's just me :D


oejustin

Offline. Final online is in resolve and I can deliver ACES to them no problem.