
My Cinematic Streaming Studio v3

By popular demand, I’ve jotted down some details about my updated “Cine” Live Streaming Studio v3. I’ve shared some Lessons Learned at the end of this article that might be helpful if you also want to achieve a more professional or “cinematic” look for your streaming or Zoom calls.

See for yourself what v3 actually looks like in a Zoom call below and feel free to check out the comprehensive list of the gear I’m currently using to achieve the look on my Kit.co page.

The path to getting there

Overview of versions 0 to 3

In the week of the first lockdown in March 2020, a live streaming studio became a necessity to keep serving my customers, so I instantly started building the first version of my studio. It was my first decent attempt at a cinematic webcam look that I would actually use to stream live.

Before v1 there were also a lot of incremental versions during spring and summer 2020 – lots and lots of trial and error while learning the basics.

v0 was just natural light (which obviously doesn’t work when it’s overcast or in the afternoon/night) or cheap lights with no successful color grading; the results ranged from not good to worse, with the cheap lights destroying the possibility of ever getting a clean grade.

You can see the terrible green tint in v0.2, for example. Lesson learned: get color-accurate lights from the get-go. While not cheap, they will 10x your results.

v0.5 was the first attempt to shape the light cinematically using new professional cinema lights (great color accuracy, but I went for the cheapest color-accurate lights I could find, so at 30W each they were underpowered for my needs). I also started color correcting and grading my own LUT using an actual X-Rite ColorChecker color correction chart. To me, this was the “we’re finally getting anywhere” moment. IMO it was also a pleasing “cine” look, but way too flat and way too edgy for “work”. It took me a couple of months of experimenting and learning just to get here.

v0 through v0.5 were made with the tiny Blackmagic Micro Cinema Camera (Super 16mm, MFT mount) on a cheap-ish Samyang MFT 12mm lens. For capture I used a Blackmagic Design UltraStudio Mini Recorder Thunderbolt (which only works in a few apps, e.g. Zoom) and a Corsair Elgato Game Capture HD60 S+ for the rest (emulates a webcam, lower quality signal) until the release of the ATEM Mini Pro.

The gist of my lighting setup remains the same to date: a key light close to the left of my face (my side), a top/hair light just above my head, and a kicker further behind me to my right (like a mirror version of the key light). This is the classic three-point film setup. (I’ve later also added various fill lights to help shape and warm up the light on my face.)

v1, the “2020” look

This was the first serious “look”: upgrading to more powerful (300W key and 150W kicker) professional cinema lights, adding softboxes with diffusers and grids to help shape the light, upgrading to the Blackmagic Pocket Cinema Camera 4K (Super 35mm), and adding a professional Sigma Art zoom lens.

The grading was updated for the new camera and lighting, but it was very similar to v0.5 with no “cinematic” look. Good contrast, shape, and skin colors, perhaps – but lacking that certain something the brain recognizes as “cinematic” (I had no idea what I was doing with color grading at the time). It was also a bit too dark (it looked great to me, but participants sometimes reported it was a bit dark depending on their monitor and operating system).

Also, there was no “motivation” to where the light was coming from – just a black void.

v2 the “2021 – 2022” look

I later tried to refine the shaping using two filler lights, one for the shadow side of the face and one at the front of the face, both set to 3200K to add some warmth (all the other lights are 5600K).

I practiced a bit with grading in DaVinci Resolve and found some trustworthy educators: e.g. Gerald Undone for grading properly with a color chart and Rob Ellis for awesome but simple and affordable cinematic lighting setups – and especially Darren Mostyn and Cullen Kelly if you’re getting seriously into grading in DaVinci Resolve. A big thank you to all four for making my life easier and more informed.

BEWARE: Most videos on YouTube on how to grade in DaVinci Resolve are made by click-seeking idiots with no fundamental knowledge of color science, photochemical film science, or how grading actually works – or even how the software works – let alone any sense of cinematic aesthetics. Dunning–Kruger and all that.

I was then able to make a more cinematic grading LUT. I also had to adjust the grade after adding a teleprompter to the setup (yes, that added glass has a noticeable effect – about half an f-stop darker). I still didn’t know what I was doing, though – an incredible amount of painstaking trial and error (brute force) followed.

I also added the three lights in the background for “motivation”, i.e. fooling your brain into thinking these are the sources the light is coming from (no significant light actually reaches my face from them, though – photons, the inverse square law, and all that). It added some “interestingness” instead of just the dark void. It was still a tad too dark to account for variations in participants’ setups, though. In hindsight, I also find it a bit too saturated – especially in the highlights and shadows.
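As an aside, the inverse square law is easy to sanity-check numerically. A minimal sketch – the distances here are made-up examples, not my actual studio measurements:

```python
import math

def relative_illuminance(distance_m: float, reference_m: float = 1.0) -> float:
    """Illuminance at distance_m, relative to the same point source at reference_m.
    Point-source approximation: E is proportional to 1 / d^2."""
    return (reference_m / distance_m) ** 2

# A key light ~1 m from the face vs. a background "motivation" light ~4 m away:
key = relative_illuminance(1.0)         # 1.0 (our reference)
background = relative_illuminance(4.0)  # 0.0625, i.e. ~1/16th of the light

# Expressed in photographic stops, that's a 4-stop difference:
stops = math.log2(key / background)     # 4.0
```

So even a fairly bright background practical contributes next to nothing on the subject’s face.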

v3, late 2022 and beyond look

The major hardware change was replacing the individual fill lights with a light tube system that can be remotely controlled in concert (it was a major headache to always have to finesse each fill light manually). I also replaced the top/hair light with two tubes in the same system and built a custom softbox around them. The main reason, however, was that the new lights have a bit more power than the previous ones, enabling me to lighten the look or “wrap” the light further around.

I also changed the depth of field (going from f2.2 to f2.8) to help me stay in focus when naturally moving my head (yes, there is absolutely no autofocus in my cinematic setup). I then relit the whole thing, first cranking the ISO up from 200 to 800 to properly expose to the right (allegedly more dynamic range using native ISO – though it feels like this should not be applicable when piping the 3G/FHD signal over HDMI to the ATEM Mini; I mean, it’s not a RAW signal – but I guess the more information going in, the more comes out). This also let me use less power from the lights to achieve the same result (less eye strain, more flexibility in lighting the scene than if the lights were already maxed out at 100%, and of course less energy consumed and less heat generated).
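For the curious, the exposure bookkeeping behind those changes is just standard stop arithmetic. A rough sketch using the numbers from the text (the net figure is approximate and ignores the teleprompter glass):

```python
import math

def aperture_stops(f_old: float, f_new: float) -> float:
    """Stops of light lost (positive) or gained (negative) when changing f-number.
    Transmitted light is proportional to 1 / f^2, hence the factor of 2."""
    return 2 * math.log2(f_new / f_old)

def iso_stops(iso_old: float, iso_new: float) -> float:
    """Stops of sensitivity gained when raising ISO (doubling ISO = +1 stop)."""
    return math.log2(iso_new / iso_old)

lost = aperture_stops(2.2, 2.8)  # ~0.70 stops less light through the lens
gained = iso_stops(200, 800)     # exactly 2 stops more sensitivity
net = gained - lost              # ~1.3 stops brighter overall -> lights can be dimmed
```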

The major grading change, aside from adjusting to the new lights and the ISO change, was updating to Blackmagic Gen 5 color science (a pain in the ass – I had to regrade everything, but it was not too hard now that I knew a little more about how to actually grade and could replicate steps instead of trial and error) and a brighter, less “edgy” or stylized look that still tries to retain that “cinematic” quality.

It is now bright enough to accommodate the differences in participants’ setups. Some report it is also a more pleasing look than the previous one. I think it is definitely less “edgy”.

Update: I’ve since incrementally updated this look to v3.1 by changing the lighting only – the only change being to bring back a bit more contrast between the light and dark sides of my face for, IMO, a bit more “definition” and interestingness. I might post a screenshot later.

Davinci Resolve color grading nodes

My previous DaVinci Resolve node tree for color grading. (The screenshot shows a more edgy grade than my live LUT; disregard the “Grain” node.) Discontinued – this was my own brute-forced node tree that I previously used.

The first node in my grading above is the “Leeming LUT Athena III – Blackmagic Design Pocket 4K – Gen5 Film”. The “Video to Full” node uses the “Leeming LUT Fixie – Video to Full Range”, as I find it adds to the cinematic quality. You can then add your creative cinematic grade to the “Creative LUT” node, either manually or by applying a cinematic LUT.

Keep in mind that the creative LUT you apply should expect the same color science you are using. In my case, the “Leeming LUT Athena III – Blackmagic Design Pocket 4K – Gen5 Film” LUT converts the color space to something as close to rec709 as possible, so any LUT expecting a rec709 input will work – but any LUT expecting a different color space as input will look like utter garbage.

If you find a LUT you like but it’s made for a different color space than you have set up (say Arri Log-C instead of rec709) – or, conversely, you’ve found a LUT that looks like crap when applied and you don’t know what input it expects – you can always add a node with a “Color Space Transform” effect in front of the Creative LUT node and experiment with converting your current color space to different ones to see if you can find something usable as input for the LUT.

Oh, and those “Limit Sat” nodes are there to make sure no colors snuck into the highlights or shadows during my grading process (I’m not going to claim I fully know what I’m doing here; there must be ways to do this more professionally) to mimic how photochemical film behaves.

Update: Now my basic node tree looks much more like that of Cullen Kelly’s. See below:

My new default clip-based node tree above. Of note: my skin tone often renders weirdly, so I have a custom skin correction node, and I also have an HSV node with only the V channel activated to subtly temper saturation to taste if needed. Noise reduction is added at the start as an option if needed. The last node after the mix node is for any sharpening or blurring (technically these two types of transforms should not be in the primaries or the secondaries, to avoid potential unwanted artifacting; also, I don’t think this node has any effect on the LUT, and I would leave it turned off when exporting the LUT – as with, obviously, any of the secondaries).

My new default timeline-level nodes establish the overall look, using taste LUTs from Cullen Kelly’s Voyager Pro pack and optionally the Dehancer plugin when I want to mimic real photochemical film stock when exporting video – but definitely leave Dehancer off when exporting a camera streaming LUT.

You should check out Leeming LUT Pro (IMO by far the best color transform LUTs out there for the Blackmagic cinema cameras!) before you start going crazy in Resolve yourself – worth every single buck.

Update: Switching to a color managed workflow made the camera-specific Leeming color transform LUTs sort of obsolete for me. I do still find that the Leeming Fixie “Video to Full” LUT can help achieve a better starting point for a cinematic grade when dealing with older color science footage, like Blackmagic Design Gen 1 from my BMD Micro Cinema Camera s16mm, or footage already in rec709, like that from my Canon 5D Mark II DSLR.

I now also use Cullen Kelly’s Voyager Pro pack to build a look from several “taste” LUTs instead of a single creative LUT.

Now, as for the rocket science of exporting a new grade as a LUT for the streaming studio camera: you can easily google how to do that in DaVinci Resolve. Just remember that if you are using a color managed Wide Gamut / Intermediate workflow in DaVinci, you have to add a Color Space Transform node or set the output to match the intended camera’s color space and gamma – which of course may vary.

See for yourself what v3 actually looks like in live streaming action below and check out the comprehensive list of the gear I’m currently using to achieve the look on my Kit.co page.

Let me know if you have any questions!

Lessons Learned:

  • The quality of the output equals the quality of the input: camera, light, and lens matter equally
  • Use a camera with enough dynamic range to be able to deliver a cinematic image at all
  • Only use lights that are color-accurate
  • Diffusion is a prerequisite for that cinematic “wrapped-around-the-skin” light
  • All you need to know about diffusion is that you can either use a white shower curtain or a sheet of bleached muslin – or add a more productified version called a “softbox” (plus a “honeycomb grid”) to your main lights – for your light to become wonderfully diffused
  • All you need to know about Aputure Lightstorm lights vs Godox VL is that Godox is cheaper and provides the same quality of light (or even better) for this type of studio use (Aputure Amaran 100d/200d might be a better budget option, though – and let’s face it – The Aputure “Sidus Link” app is just fantastic – I love it!)
  • Where you put your lights matters A LOT – study what they’ve been doing in Hollywood for years – experiment with placement and angles, get someone to help you move lights while you stay in the shot, take the time to “fuck around and find out” what the optimal light positioning is for achieving the look you want – it’s going to pay off massively
  • Motivated lighting is a thing, you might want to consider it
  • Use a lens that will support the creative vision of your output (shocker: all lenses are different), but it should probably not be a slow “kit lens”, more likely an f2.8 or faster prime lens (or a fixed t/f stop zoom lens, like I’m using in my v3 setup)
  • Crop factors, full-frame vs MFT vs APS-C, etc are all words you are going to learn to hate – it’s already a fucking mess, and adding a speedbooster to the mix will just kill your will to live and make you give up on calculating actual focal lengths and t/f stops altogether (well, it’s not too hard to actually re-calculate it but it is such a killjoy for me – if it works for me, fuck it, I’ll shoot with it)
  • All you need to know about crop factors is to take the lens in question and mount it on your camera – if it fits (sometimes an adapter is needed) and if it looks good to you (no serious vignetting, you get the field of view you need, the depth of field, the smoothness or sharpness, the character you’re looking for) then it’s a keeper (fuck the calculations) – oh, and never get into a discussion on this topic online ever
  • All you need to know about speed boosters (or actually “Telecompressors“) is that, if you add them, whatever the mm and t/f stop printed on your lens says is now wrong (don’t worry, it’s all good if the image and field of view now coming out of the lens looks good to you) and you now need to adjust the lighting accordingly to taste (although feel free to cheat by using Zebras and False Color)
  • All you need to know about Metabones vs Viltrox “Speed Boosters” is that the Viltrox is almost an order of magnitude cheaper and will most likely be fine for your “cine” webcam studio or office setup – I bought FIVE new Viltroxes (important: I’m talking about the EF-M2 II version) on eBay for the same price as a single used Metabones adapter
  • Contrary to common “cine” aesthetics, apply high sharpening in-camera if you intend to stream (compression garbles details, so you want more detail going in upstream) – besides, any sharpening done in DaVinci will not transfer to your LUT
  • Color grading is an art, not a science – but a “cinematic” grade takes basic principles from photochemical film science into account, AND you have to be aware that several aspects of a real photochemical film look, like “Halation”, “Grain”, and “Bloom” effects, do not translate into a LUT (I would disable all plugin effects – except maybe Color Space Transform if you’re using it – in DaVinci when exporting a LUT)
  • Learn how to use color management and Wide Gamut / Intermediate workflow if you intend to grade with DaVinci Resolve – it will take a lot of guessing and headaches out of the equation, making grading a faster and a much more fun and predictable process & thank me later
  • Forget brute-forcing and using a hundred million custom nodes per clip by using something more like this simple node tree for clip-level (primaries: Exposure, Ratio/Contrast, and Balance + secondaries as parallel nodes) and process as your go-to starting point, see screenshots of my node tree above, and consider using a separate timeline-level node tree for your overall “look”
  • WARNING! If you are on an Apple Mac, you need to know this before stepping into grading with DaVinci Resolve: if you don’t change a certain setting, there will forever be a difference between what you see in DaVinci and what the exported video or grade looks like! (I wish I had known sooner! It would have taken away 90% of the painstaking trial and error. Googling for this artifact gives you zero answers – only entitled industry asshats claiming you need better display hardware; tl;dr you don’t – it’s an Apple Mac software thing – obviously – and Blackmagic Design has finally addressed it.)
  • Save yourself even more pain and time by investing in an X-Rite ColorChecker Video chart to properly white balance and check exposure (also found on my kit.co page)
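As a footnote to the crop-factor and speed-booster bullets above, for those who do want the killjoy math: it’s only a couple of multiplications. A hypothetical sketch – the 0.71x booster factor and 2.0x MFT crop are typical published figures, so check your own hardware’s specs:

```python
def full_frame_equivalent_focal_length(lens_mm: float, crop_factor: float,
                                       booster_factor: float = 1.0) -> float:
    """Full-frame-equivalent focal length of a lens on a cropped sensor,
    optionally behind a focal reducer ("speed booster")."""
    return lens_mm * booster_factor * crop_factor

def effective_f_stop(f_stop: float, booster_factor: float = 1.0) -> float:
    """A focal reducer concentrates the light, lowering the effective f-number."""
    return f_stop * booster_factor

# Example: a 12mm f2.8 lens on MFT (2.0x crop) behind a 0.71x booster
fl = full_frame_equivalent_focal_length(12, 2.0, 0.71)  # ~17mm full-frame equivalent
fs = effective_f_stop(2.8, 0.71)                        # ~f2.0
```

(And as the bullets say: if the image looks good, feel free to skip the math entirely.)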


RIP Google Stadia

– An opportunity squandered.

RIP Stadia

Google Stadia wasn’t a perfect cloud gaming experience by any account (missing a lot of game titles, multiplayer matches often impossible to find, etc.), but it was more than good enough to enjoy casual gaming without having to buy and manage the PC hardware and software.

Stadia is, however, the best effort to date at making casual AAA-title gaming without a PC or console an enjoyable and frictionless experience.

But only a few years after launch, Google has already decided to kill it in January 2023.

I guess this is what happens when previously disruptive startups become public corporations: out the window goes the long game, and everything shifts to short-term gains. No vision, no leadership, no will to take risks beyond the scope of fulfilling career-based KPIs.

Update: I guess I hit close to home… (see below or see tweet link)

https://twitter.com/petergyang/status/1576985038511448064

As a friendly word of free advice: if you’re intending to disrupt an existing market, don’t apply a two-year horizon for it to be even remotely successful. (Or if you only have two years, make sure it has enough funding and priority to actually achieve rapid Horizon 3 scaling.)

The sad-funny part is that even Microsoft is more innovative than Google at this point.

Cloud gaming is obviously the future (lower barrier to consume, hardware homogeneity and stability for game developers, no-cost upgrade cycles for consumers, lower environmental impact for everybody, near-zero cost distribution, etc.). I mean, considering the computing power needed for the Metaverse(s) / AR-Verse(s), it is inevitable — you’re not going to render that locally on your iPhone or on your Quest headset any day soon now.

Now, Stadia isn’t the first and probably won’t be the last to drop out of the cloud gaming race.

NVIDIA (GeForce Now) already copped out, castrating themselves on publishers’ demands (games you previously bought suddenly disappearing because of publishers’ knee-jerk reactions). IMO, if NVIDIA were serious about cloud gaming, they would have litigated the publishers to a settlement that would set precedent and benefit consumers – but I gather from their no-contest fold that they are not really in the cloud gaming race at all.

Like Apple’s AppStore, I don’t think you’ll win cloud gaming without winning the devs. And by that, I don’t mean the existing publishers. (No, by all means screw those gatekeepers over for good — They represent most things bad with gaming today.) You cannot and will not win them over as they have every incentive in the world to fight for their status quo. You need the games. The games with mainstream appeal. Games that will bring the gamers. The games with epic experiences. Games like those coming out of the studios of Naughty Dog, Crystal Dynamics, or Rockstar.

You also need the multiplayer games to be multiplayer-playable – which cannot be said about a lot of games played in the cloud (not cross-platform compatible, no critical mass of cloud-only users yet), rendering them unplayable (e.g. Red Dead Redemption 2 Online is completely unplayable on Stadia, as no other players get matched to your game).

I’m not getting my hopes up for Amazon Luna (everything Amazon touches turns out mediocre at best) and it’s not even available in Europe (yet?).

I think Steam would be in a good position as they are already in the sales and distribution game, have a large customer base — but it feels like Valve got lost after the Half Life 2 release party and is still trying to find their way home.

Sony bought Gaikai in 2012 (I tried it sometime in 2011 and was very impressed that I could play Crysis 2 on my non-gaming Mac mini 2009 with it – one of those very rare “DANG! This-is-the-future-right-here” moments) and has since pretty much squandered the potential, as they are too entrenched (which is a nicer way of saying Sony management has a track record of having their heads too far up their behinds) in their existing Nespresso-style lock-in business model. I’m not expecting miracles.

Which surprisingly makes me believe Microsoft, with its XBOX Live (no Mac app yet – to no one’s surprise), is currently in the best position. It’s a distributor and publisher with its own game dev studios – and it seems (for now) that they are playing the long game. I’m not sure if they will be willing or able to thoroughly disrupt their hardware/software lock-in model any day soon (hey, throw us a Mac app bone), though. Probably a positioning play for now that affords future optionality.

I’m not getting my hopes up for Ubi/EA/EPIC/etc. siloed cloud subscription services. A siloed market means added inconvenience and added costs for consumers (subsidizing the publisher for what you don’t want, paying for several silos to get what you want). Besides, some of them have cultural baggage and some have a problematic developer/publisher paradox.

And what about those rent-a-windows-box-in-the-cloud services? Have you ever tried one of these? Don’t get me started. It’s still all of the hassles of actually owning and managing a Windows gaming PC — but with higher latency and frame drops. The pain. The horror.

Personally, I would like to see Apple get over their Pippin complex and just get on with it and own the market. It’s the only media type that is missing from their offerings, IMO. But I’m not getting my hopes up. Knowing Apple, they will probably join the fray if and when the time is right — which is to say probably not any day real soon now. (Come on Apple, you need another “hobby”! Maybe hot on the heels of the Apple AR Glasses?)

OTOH – as another corporate venture gets its chain yanked, the opportunity is left on the table for a startup with a grander vision, deeper (accessible) pockets, and more freedom to operate.

What do you think?

(This article was originally published on LinkedIn 2020.09.30)
