Excited and honored to have been invited to mentor participating startups in the Jacobs Startup Competition (JSC), founded and managed by students at Jacobs University (now known as Constructor University) in Bremen, Germany.
By popular demand, I’ve jotted down some details about my updated “Cine” Live Streaming Studio V3. I’ve shared some Lessons Learned at the end of this article that might be helpful if you also want to achieve a more professional or “cinematic” look for your streaming or Zoom calls.
UPDATE 2: Latest changes to how I color grade to get the “cinematic” look
And if you could use some help with putting your studio, podcasting, or event setup together (which camera, lenses, lighting, sound, live switchers, etc. are right for you, and how to cable, set up, and install it all) – or creating that special signature “look” for your videos or streams – give me a ping, and let’s talk about it. I spent an idiotic amount of time and money doing things all sorts of wrong in the beginning, so I’m happy to help you not do the same.
See for yourself in the video below what v3 actually looked like when recorded, and check out the comprehensive list (constantly updated) of the gear I’m currently using to achieve the look on my Kit.co page.
Here’s what v3.1 looks and sounds like in an actual real-life Zoom call interview situation:
Of note, compression smears the image a whole lot (that’s why I have the camera output set to be so sharp – more detail in = more detail out when compressed in Zoom), and depending on the conferencing software and the operating system, things happen to your saturation and gamma (here it’s desaturated and less contrast-y – which makes me think it was not captured on a Mac).
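To put a number on the gamma part: if a stream graded for a 2.4 display gamma gets decoded at 2.2 somewhere along the chain (purely illustrative values – I don’t know what Zoom actually does internally), the mids get lifted:

$$ L_{\text{shown}} = \left(L^{1/2.4}\right)^{2.2} = L^{2.2/2.4} \approx L^{0.92} $$

An exponent below 1 pulls the midtones up toward white, which reads exactly as that washed-out, less contrast-y look.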
Fun fact: One of the other changes from v3 to v3.1 is the choice of microphone. Can you hear it? (One costs 1.600,- Euros, the other 117,-). I’m actually now using the cheap-ass microphone(!) instead of my (beloved) Neumann. Check out my v3.1 kit.co page for the deets.
My studio setup has a price tag that makes it mostly relevant for professionals (aka people who make money using their setup in some way or form) – or crazy people – so it may not be that applicable to your average home-office webcam setup.
That’s why I also did some experimenting to come up with a much less expensive (YET FULL-FRAME! ZOMG!) streaming “cine” setup that could be accessible to more people (also, no color grading or LUTs needed). Here are the results of my more budget-friendly setup (less than $300 for the camera, cheap-ass lighting, decent budget sound) in the video below:
You can also find the gear used for this more budget-friendly setup on my Kit.co page.
The different versions over the last years: v0–v0.5 used a Blackmagic Design Micro Cinema Camera with a 12mm Samyang f2 MFT lens; v1 and beyond use a Blackmagic Design Pocket Cinema Camera 4K with a Sigma 18-35mm f1.8 Art DC HSM APS-C lens on a Viltrox EF-EOS M2 speed booster adapter.
A live streaming studio became a necessity in the first week of the COVID-19 lockdown in March 2020 to keep serving my customers. I instantly started to build the first version of my tele-presence studio, and thus my quest to achieve a cinematic-looking output began.
Why “cinematic”? Well, partly because I’ve always been interested in cinema. Fun fact: I originally studied to become a film director way back when, before dropping out and pursuing a completely different career path – and this seemed like a good way to combine passion with “work” again.
Also because my customers were already used to paying for the highest quality of live, in-person content – and serving them online with just a standard shoddy webcam wasn’t an option. I felt I owed it to the people who had put their trust and money in me to provide the best quality of experience possible, also when delivering online on Zoom.
Before v1 there were also a lot of incremental versions during spring and summer 2020 – lots and lots of idiotic trial and error while trying to get the hang of the very basics.
v0 was just natural light (which obviously doesn’t work when it’s overcast or in the afternoon/at night) or cheap lights, and no real color grading; the results ranged from not good to worse. Using cheap lights also destroyed any possibility of ever getting a clean grade to begin with.
You can see the terrible green tint of cheap lights in v0.2, for example; I couldn’t completely get rid of it when grading, so I had to get more color-accurate lights. While obviously more expensive than my green-tinted no-name lights, getting color-accurate lights easily 10x’ed the results, IMO.
v0.5 was the first attempt to shape the light cinematically using my first professional cinema lights. I went for the cheapest yet very color-accurate lights I could find (Came-TV Boltzen 30W Fresnels), which in the end turned out to be underpowered for my needs – and the Fresnels, with their narrow beam, didn’t help my lighting setup either. I also started color correcting and grading my own LUT using an actual X-Rite ColorChecker color chart. It took me a couple of months of experimenting and learning just to get to this point.
To me, this was the “now we’re finally getting somewhere” moment. IMO, a pleasing “cine”-like look, but too “flat” for my taste and way too edgy for my target audience and purpose.
v0 through v0.5 were made with the tiny Blackmagic Micro Cinema Camera (a Super 16mm MFT camera) on a cheap-ish Samyang MFT 12mm f2 lens. For capture I used a Blackmagic Design UltraStudio Mini Recorder Thunderbolt (it only works as a webcam in a few apps like Zoom, but delivers better image quality than the HD60 S+: 8-bit 4:4:4 and 12-bit 4:2:2 over HDMI, plus SDI with 12-bit 4:4:4) and a Corsair Elgato Game Capture HD60 S+ for the rest (emulates a webcam; somewhat lower-quality signal with 4:2:0 chroma subsampling) until the release of the ATEM Mini Pro.
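If the chroma subsampling jargon is new: 4:4:4 keeps full color resolution, while 4:2:0 stores color at a quarter of the pixels and keeps only brightness (luma) at full resolution. Here’s a minimal numpy sketch (my own illustration – nothing to do with the Elgato or Blackmagic internals) of what 4:2:0 does to a one-pixel-wide color detail:

```python
import numpy as np

def subsample_chroma_420(ycbcr: np.ndarray) -> np.ndarray:
    """Simulate 4:2:0 chroma subsampling on an HxWx3 YCbCr image.

    Luma (Y, channel 0) is kept at full resolution; the chroma planes
    (Cb, Cr) are averaged over 2x2 blocks and re-expanded.
    """
    out = ycbcr.astype(np.float32).copy()
    for c in (1, 2):  # Cb and Cr planes only
        plane = out[:, :, c]
        h, w = plane.shape
        # Average each 2x2 block (assumes even dimensions, for brevity)
        avg = plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        # Nearest-neighbour upsample back to full resolution
        out[:, :, c] = avg.repeat(2, axis=0).repeat(2, axis=1)
    return out

# A one-pixel-wide colored stripe: luma survives, chroma smears.
img = np.full((4, 4, 3), 16, dtype=np.float32)
img[:, 1, 1] = 240                         # narrow Cb-only detail
print(subsample_chroma_420(img)[0, :, 1])  # -> [128. 128. 16. 16.]
```

Full-resolution luma is why edges and text mostly survive while fine color detail smears – and why a 4:4:4- or 4:2:2-capable capture path hands the compressor better material to start from.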
The gist of my lighting setup remains the same to date:
A key light close to the left (my side) of my face, a top/hair light just above my head, and a kicker further behind me on my right side (like a mirrored version of the key light). This is the classic three-point film setup. I’ve since also added various filler tube lights to help shape and warm up the light on my face.
The 2023 v3.1 studio lighting setup illustrated above. ATTENTION: For some reason, the back light and key light are on the wrong sides in this illustration. Looking at the illustration, the key should be on the right-hand side and the back light on the left.
This was the first serious “look”: upgrading to more powerful (300W key and 150W kicker) professional cinema lights, adding softboxes with diffusers and grids to help shape the light, upgrading to the Blackmagic Pocket Cinema Camera 4K (roughly Super 35 with the speed booster), and adding a pro Sigma Art zoom lens. Why 4K when the streaming standard is still mostly 1080p? Because more information going in equals better quality coming out in the downsampled 1080p signal – and it future-proofs my camera setup.
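A quick way to convince yourself of the “more in = better out” logic: downsampling 4K to 1080p averages four sensor pixels into every output pixel, which halves uncorrelated noise. A toy numpy sketch with simulated noise (illustrative numbers, obviously not my actual footage):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "4K" gray frame with per-pixel sensor noise
uhd = 0.5 + rng.normal(0.0, 0.05, size=(2160, 3840))

# 2x2 box downsample to "1080p": each output pixel averages 4 samples
hd = uhd.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(round(uhd.std(), 3))  # ~0.05
print(round(hd.std(), 3))   # ~0.025 -> averaging 4 samples halves the noise
```

The same averaging also supersamples real detail, which is why an oversampled 1080p stream looks crisper than a native one.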
The grading was updated for the new camera and lighting, but it was very similar to v0.5, with no real “cinematic” look yet. Good contrast, shape, and skin colors, perhaps – but lacking that little certain something the brain recognizes as “cinematic”.
I had zero idea what I was doing with color grading at the time. It was also a bit too dark – it looked fine for me, but sometimes participants and customers reported it was a bit dark depending on their monitor, device, and operating system.
Also, there was no “motivation” for where the light was coming from – just a black void. Which was what I was going for at the time, but in hindsight it is very boring to look at over time.
I later tried to refine the shaping using two filler lights: one for the shadow side of the face and one at the front of the face, both set to a 3200K color temperature to add some warmth, with all the other lights at 5600K. The camera was set to 4400K, walked back from the original 5600K to taste while monitoring in the studio.
I practiced a bit with grading in DaVinci Resolve (rapidly becoming an industry standard, and a free download – unless you need e.g. “Color Space Transform” (CST) nodes, the free version is awesome; if you do need CST, you’ll need the “Studio” version, which comes free with Blackmagic Design’s cameras or as a paid upgrade) and found some trustworthy educators: Gerald Undone for no-nonsense technical information and grading with a color chart (he also introduced me to the Leeming LUTs), Rob Ellis for simple, affordable – yet beautiful – cinematic looks with lighting setup tutorials, and especially Darren Mostyn and Cullen Kelly if you’re getting seriously into grading in DaVinci Resolve. A big thank you to all of them for making my life easier and way better informed.
CAVEAT EMPTOR: Most videos on YouTube about how to grade in DaVinci Resolve are made by click-seeking (or well-meaning but still not knowledgeable) BUFFOONS with no fundamental knowledge of color science, photochemical film science, or how grading actually works – or even how the technology or software works – let alone any sense of cinematic aesthetics. Thank you, massively lowered barrier to entry with cheaper cameras and free software, Plato, Dunning–Kruger, and all that. Heuristic: if they never speak about how they color-manage – if they don’t color manage in any way or form – avoid, do not watch, do not read!
And after a while I was able to make a LUT with a more cinematic look. I also had to adjust the grade after adding a teleprompter to the setup – yes, that added glass has an effect: about half an f-stop less light. I still didn’t know what I was doing, though – and an incredible amount of painstaking (brute-force) trial and error followed.
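For the arithmetic-minded: half a stop means the glass passes only $2^{-1/2}$ of the light,

$$ \frac{L_{\text{with glass}}}{L_{\text{without}}} = 2^{-1/2} \approx 0.71, $$

i.e. roughly 30% of the light is lost to the teleprompter, which you have to buy back with aperture, ISO, or lamp power.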
I also added the three light tubes in the background (approximately 2m further behind me) for “motivation”, aka fooling your brain into thinking these are the sources the light is coming from (no significant light actually reaches me from them, though – photons, the inverse square law, and all that), and it added some “interestingness” instead of just the dark void. It was still a tad too dark to account for variations in participants’ setups, though. In hindsight, I also find it a bit too saturated – especially in the highlights and shadows.
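The inverse square law makes it easy to see why the tubes contribute nothing measurable on my face. With purely illustrative distances – say the key light at about 1m and the tubes at about 3m – two equally bright sources would differ by a factor of nine:

$$ \frac{E_{\text{tube}}}{E_{\text{key}}} = \frac{1/3^2}{1/1^2} = \frac{1}{9} $$

and the tubes are far dimmer than the key to begin with, so they read purely as background elements.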
The major hardware change was replacing the individual fill lights with a light tube system that could be remotely controlled in concert (it was always a major headache to have to finesse each fill light manually); I also replaced the top/hair light with two tubes in the same system and built a custom softbox around them. The main reason for the change, however, was that the new lights had a bit more power than the previous ones, which would enable me to lighten the look and “wrap” the light around further.
I also changed the DOF (depth of field) by going from f2.2 to f2.6 to make it easier to stay in focus while naturally moving my head (yes, there is no autofocus in my cinematic setup – get a Sony Alpha or FX family camera instead if AF is important to you). I then relit the whole thing, first cranking the ISO up from 200 to 800 to properly expose to the right (GEEK ALERT: more dynamic range using native ISO) and to enable using less power from the lights to achieve the same result (less eye strain, more flexibility than lights already maxed out at 100%, and of course less energy consumed and less heat generated in the studio).
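In stops, that trade works out nicely (plain f-number and ISO arithmetic, nothing camera-specific):

$$ \Delta_{\text{aperture}} = -2\log_2\frac{2.6}{2.2} \approx -0.48 \text{ stops}, \qquad \Delta_{\text{ISO}} = \log_2\frac{800}{200} = +2 \text{ stops} $$

so the ISO bump more than pays for the smaller aperture, leaving roughly 1.5 stops of headroom to dial the lights down.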
The major grading change – aside from adjusting to the new lights and the ISO change – was updating to Blackmagic Gen 5 color science (a pain in the ass, as I had to regrade everything; but not as hard as the first time around, now that I knew a little more about how to actually grade and could replicate steps instead of brute-forcing them) and a brighter, less “edgy” or stylized look that still tries to retain that “cinematic” quality.
It is now bright enough to accommodate the differences in participants’ displays. Some report it is also a more pleasing look than the previous one. I think it is definitely less “edgy”, more rounded.
Update 1: I’ve since incrementally updated this look to v3.1 (screenshot at the top of this post) – only by changing the lighting values, i.e. the ratio between dark and bright, bringing back a bit more contrast between the light and dark sides of the face for, IMO, more “definition” and interestingness.
UPDATE 2: Latest changes to how I color grade to get the “cinematic” look in 2024
My old DaVinci Resolve node tree for color grading above (the screenshot shows a more edgy grade than my live LUT; disregard the “Grain” node). Discontinued – this was the brute-forced node tree I previously used, described below:
The first node in my old grade is the “Leeming LUT Athena III – Blackmagic Design Pocket 4K – Gen5 Film”; the “Video to Full” node uses the “Leeming LUT Fixie – Video to Full Range”, as I find it adds to the cinematic quality. You can then add your creative cinematic grade to the “Creative LUT” node, either manually or by applying a cinematic LUT.

Keep in mind that the creative LUT you apply should expect the same color space you are feeding it. In my case, the “Leeming LUT Athena III – Blackmagic Design Pocket 4K – Gen5 Film” LUT converts the color space to something as close to Rec.709 as possible, so any LUT expecting Rec.709 input will work – but any LUT expecting a different input color space will look like utter garbage. If you find a LUT that you like but it’s made for a different color space than the one you have set up (say, ARRI Log-C instead of Rec.709) – or, conversely, you’ve found a LUT that looks like crap when applied and you don’t know what input it expects – you can always add a node with a “Color Space Transform” effect in front of the Creative LUT node and experiment with converting your current color space to different ones until you find something usable as input for the LUT.

Oh, and those “Limit Sat” nodes are there for me to make sure no colors sneak into the highlights or shadows during my grading process (I’m not going to claim I fully know what I’m doing here; there must be more professional ways to do this), to mimic how photochemical film behaves.
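If you’re curious what “applying a LUT” means mechanically: a creative LUT is just a 3D table of output colors indexed by input color, sampled with interpolation. A rough Python sketch (my own simplification – the `apply_lut3d` helper is hypothetical, and real .cube parsing and axis-ordering details are omitted):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def apply_lut3d(img: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply an NxNxNx3 LUT cube to an HxWx3 image with values in [0, 1].

    order=1 gives trilinear interpolation between the cube's grid points.
    Axis convention assumed here: lut[r, g, b] -> output RGB.
    """
    n = lut.shape[0]
    # Map each pixel's RGB value to fractional coordinates in the cube
    coords = np.clip(img, 0.0, 1.0).reshape(-1, 3).T * (n - 1)
    out = np.stack(
        [map_coordinates(lut[..., c], coords, order=1, mode="nearest")
         for c in range(3)],
        axis=-1,
    )
    return out.reshape(img.shape)

# Sanity check with an identity cube: output should equal input
n = 33
g = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
pixels = np.random.default_rng(1).random((4, 4, 3))
assert np.allclose(apply_lut3d(pixels, identity), pixels, atol=1e-6)
```

This also makes the color space mismatch problem obvious: a Log-C frame fed into a cube built for Rec.709 input simply lands in the wrong region of the table, and nothing downstream can fix that – hence the CST node in front.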
Update: Now my basic node tree looks much more like Cullen Kelly’s (the updated version of his node tree – see his newer videos for the changes, e.g. no sharpening or smoothing modifiers in the secondaries anymore; they are instead added immediately after the primaries and secondaries join together).
Update 2: See this update post for my latest node trees.
What my previous node tree looked like below:
My new default clip-based node tree above. Of note: my skin tone usually renders weirdly, so I have a custom skin correction node to adjust to taste, and I also have an HSV node with only the S channel activated to subtly temper saturation to taste if needed. Noise reduction is added at the start as an option if needed. The last node after the mix node is for any sharpening or blurring (technically these two types of transforms should not be in the primaries or the secondaries, to avoid potential unwanted artifacting; also, I don’t think this node has any effect on the LUT, and I would leave it turned off when exporting the LUT – as with, obviously, any of the secondaries).
My new default timeline-level nodes establishing the overall look, using taste LUTs from Cullen Kelly’s Voyager Pro pack and, optionally, the Dehancer plugin when I want to mimic real photochemical film stock when exporting video. Definitely leave Dehancer off when exporting a camera streaming LUT, though, unless you know what you’re doing (aka first turning off all the features that do not translate into a LUT and checking that you’re still happy with the results).
You should also check out the Leeming LUT Pro (IMO the best color transform LUTs for the Blackmagic Cinema cameras out there) before going crazy in Resolve yourself – worth every single buck. Update: Switching to a color-managed workflow made the camera-specific Leeming color transform LUTs obsolete for me and my Blackmagic cameras. I do still find that the Leeming Fixie “Video to Full” LUT can be helpful to achieve a better starting point for a cinematic grade when dealing with footage already in Rec.709, like the HDMI out of my Canon 5D Mark II DSLR, and I still use the Leeming LUTs when grading for my GoPros.
I now use Cullen Kelly’s Voyager Pro pack to create a look out of several “taste” LUTs instead of a single creative LUT (in the “timeline” nodes, to make the look apply globally to all clips). They are really, REALLY good – and also made to work perfectly with a color-managed DWG / Intermediate workflow (which is not the case with the majority of LUTs out there – so be advised if some other LUT you purchased looks like utter crap with your color management workflow and/or camera).
Now, for the not-so-rocket-science of exporting a new grade as a LUT for the streaming studio camera: refer to the manual or just google how to export a LUT, and remember that if you are using a color-managed Wide Gamut / Intermediate workflow in DaVinci, you have to add a Color Space Transform (CST) node or set the output to match your camera’s intended output color space and gamma – which of course varies. For my BMD PCC4K I use Rec.709 / Gamma 2.4 (which, I have been informed, used to be the industry standard to deliver in). However, my BMD Video Assist 5″ 12G seems to expect a P3 / D65 color space LUT, so YOLO.
For me and my camera, before exporting the LUT I have to add a CST node as the last “clip” node, explicitly converting the color space from Timeline to Rec.709 / Gamma 2.4. I also set Tone Mapping to “Luminance” and Gamut Mapping to “Saturation” – and, most importantly, check the box “Apply Forward OOTF” under advanced. (The image is going to look terrible in DaVinci, but don’t worry – it’s going to be correctly interpreted in the camera or LUT box! Trust me – sort of.)
GEEK ALERT: Theoretically, and to the best of my knowledge, this CST node should not be necessary, as I’m already operating in a DaVinci color-managed DWG / Intermediate timeline set to a Rec.709 / Gamma 2.4 output color space – BUT THIS IS THE *ONLY* WAY the exported LUT will look right when imported into my camera. Be advised.
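For the curious: as I understand it, the “Forward OOTF” is the scene-to-display transform you get by chaining the Rec.709 camera encoding with the Gamma 2.4 display decoding, which works out to a gentle overall contrast boost:

$$ \mathrm{OOTF}(E) = \mathrm{EOTF}_{2.4}\!\left(\mathrm{OETF}_{709}(E)\right) \approx \left(E^{0.5}\right)^{2.4} = E^{1.2} $$

(using the commonly quoted ~0.5 effective exponent for the Rec.709 OETF). Checking the box bakes that rendering intent into the exported LUT, which is presumably why the export only looks right in-camera with it on.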
See for yourself what v3 actually looks like in live streaming action below and check out the comprehensive list of the gear I’m currently using to achieve the look on my Kit.co page.
Let me know if you have any questions!
Excited to be invited to explain the Customer Development part of the Lean Startup methodology to startups in Cameroon, courtesy of FI Cameroon.
I’m honored and excited to be invited back to support the incubated startups at digitalHUB Aachen for the FIFTH year in a row with my company +ANDERSEN & ASSOCIATES!
I shared a summary of the Lean Launchpad structure we are using to support the early-stage startups last year.
Google Stadia wasn’t a perfect cloud gaming experience by any account (missing a lot of game titles, multiplayer matches where it was often impossible to find other players, etc.), but it was more than good enough to enjoy casual gaming without having to buy and manage the PC hardware and software.
Stadia is, however, the best effort to date at making casual AAA-title gaming without a PC or console an enjoyable and frictionless experience.
But, with the service only two years old, Google has already decided to kill it in January 2023.
I guess this is what happens when previously disruptive startups become public corporations: out the window goes the long game, and everything shifts to short-term gains. No vision, no leadership, no will to take risks beyond the scope of fulfilling career-based KPIs.
Update: I guess I hit close to home:
As a friendly, free word of advice: if you’re intending to disrupt an existing market, don’t apply a two-year horizon for it to be even remotely successful. (Or if you only have two years, make sure it has enough funding and priority to actually be able to achieve rapid Horizon 3 scaling.)
The sad-funny part is that even Microsoft is more innovative than Google at this point.
Cloud gaming is obviously the future (lower barrier to consume, hardware homogeneity and stability for game developers, no-cost upgrade cycles for consumers, lower environmental impact for everybody, near-zero cost distribution, etc.). I mean, considering the computing power needed for the Metaverse(s) / AR-Verse(s), it is inevitable — you’re not going to render that locally on your iPhone or on your Quest headset any day soon now.
Now, Stadia isn’t the first and probably won’t be the last to drop out of the cloud gaming race.
NVIDIA (GeForce Now) already copped out by castrating itself at publishers’ demands (games you previously bought suddenly disappearing because of publishers’ knee-jerk reactions). IMO, if NVIDIA were serious about cloud gaming, they would have litigated publishers to a settlement that would set precedent and benefit consumers – but I deem from their no-contest fold that they are not really in the cloud gaming race at all.
As with Apple’s App Store, I don’t think you’ll win cloud gaming without winning the devs. And by that, I don’t mean the existing publishers. (No, by all means screw those gatekeepers over for good – they represent most things bad with gaming today.) You cannot and will not win them over, as they have every incentive in the world to fight for their status quo. You need the games. The games with mainstream appeal. Games that will bring the gamers. The games with epic experiences. Games like those coming out of the studios of Naughty Dog, Crystal Dynamics, or Rockstar.
You also need the multiplayer games to be multiplayer-playable – which cannot be said about a lot of games played in the cloud (not cross-platform compatible, no critical user mass on the cloud-only version yet), which renders them unplayable (e.g. Red Dead Redemption 2 Online is completely unplayable on Stadia, as no other players ever get matched into your game).
I’m not getting my hopes up for Amazon Luna (everything Amazon touches turns out mediocre at best) and it’s not even available in Europe (yet?).
I think Steam would be in a good position, as they are already in the sales and distribution game and have a large customer base – but it feels like Valve got lost after the Half-Life 2 release party and is still trying to find their way home.
Sony bought Gaikai in 2012 (I tried it sometime in 2011 and was very impressed by how I was able to play Crysis 2 on my non-gaming 2009 Mac Mini. It was one of those very rare “DANG! This-is-the-future-right-here” moments.) and has since pretty much squandered the potential, as they are too entrenched (which is a nicer way of saying Sony management has a track record of having their heads too far up their behinds) in their existing Nespresso-style lock-in business model. I’m not expecting miracles.
Which, surprisingly, makes me believe Microsoft with its XBOX Live (no Mac app yet – to no one’s surprise) is currently in the best position. It’s a distributor and publisher with its own game dev studios – and it seems (for now) that they are playing the long game. I’m not sure they will be willing or able to thoroughly disrupt their hardware / software lock-in model any day soon (hey, throw us a Mac app bone), though. Probably a positioning play for now that affords future optionality.
I’m not getting my hopes up for Ubi/EA/EPIC/etc. siloed cloud subscription services. A siloed market represents added inconvenience and added costs for consumers (subsidise the publisher for what you don’t want; pay for several silos to get what you do want). Besides, some of them have cultural baggage, and some have a problematic developer / publisher paradox.
And what about those rent-a-Windows-box-in-the-cloud services? Have you ever tried one of these? Don’t get me started. It’s still all of the hassle of actually owning and managing a Windows gaming PC – but with higher latency and frame drops. The pain. The horror.
Personally, I would like to see Apple get over their Pippin complex and just get on with it and own the market. It’s the only media type that is missing from their offerings, IMO. But I’m not getting my hopes up. Knowing Apple, they will probably join the fray if and when the time is right — which is to say probably not any day real soon now. (Come on Apple, you need another “hobby”! Maybe hot on the heels of the Apple AR Glasses?)
OTOH – as another corporate venture gets its chain yanked, it leaves the opportunity on the table for the startup with the grander vision, deeper (accessible) pockets, and more freedom to operate.
What do you think?
(This article was originally published on LinkedIn 2020.09.30)