On-Set with DIT Rich Roddman – Talking High Performance Shared Storage

A Tao Colorist Newsletter Sponsor Case Study

Rich Roddman is a professional DIT and Colorist, based in Tampa, Florida.

Rich Roddman, C.S.I. is a highly experienced and talented DIT and colorist working out of his own SilverBox Studios in Tampa, Florida. He’s been a good friend of the Tao since I met him in Orlando in 2013. I’ve always been impressed with his professionalism and his dedication to making his DIT craft as frictionless as possible (a big deal for on-set professionals).

Earlier this year LumaForge reached out about becoming a sponsor of our weekly Tao Colorist Newsletter. As part of my due diligence, I helped Rich and LumaForge team up for a very data-intensive, quick-turnaround, multi-editor live event that Rich was about to do for WrestleMania.

This story is about how Rich teamed up with LumaForge—and his experience using their Jellyfish shared storage solution. He ended up keeping the Jellyfish for an entire week, testing it on three very different jobs: a 4K Sony F55 multi-camera “Behind the Scenes” shoot; a Phantom Flex and ARRI Alexa XT product shoot; and a RED 5K .r3d direct response shoot.

It’s an interesting look at the concerns and challenges of a professional DIT and how the Jellyfish may have found an unexpected niche.

How a Routine Dinner Led to a ‘Shared Storage’ Case Study

Just before NAB 2016, I was having dinner with Rich Roddman. His DIT work brings him to Orlando fairly regularly and we try to get together when he’s in town. At this particular dinner, Rich was a little anxious about an upcoming gig for a live WrestleMania event.

Rich Roddman at stage level for WrestleMania in April 2016

Rich needed to simultaneously offload, archive and transcode four Sony F55s and two Canon C300 Mark IIs for a Behind the Scenes camera crew… and finish before the lights were turned off on him.


Rich expressed concern about managing the data flow for a team of documentary cameras

This Wrestlemania gig was going to be very demanding. He needed to offload multiple cameras, simultaneously, for the Behind the Scenes team. At the same time, he needed to transcode the camera original footage to hand off to the editorial team and he needed to make multiple backup copies for archival and safety purposes.

Rich was about to operate outside his comfort zone and he was a little anxious about it.

A short time earlier, LumaForge introduced me to their portable Jellyfish high-performance shared storage system

Based on what Rich was telling me about the demands of this gig, I thought the Jellyfish sounded like a great solution for him. And since I really wanted someone to verify that this new sponsor of my Newsletter was legit, I asked if he’d like me to hook the two of them together.

Maybe I could get Rich to find out if the Jellyfish was for real? Or if I should be worried about my new Sponsor?

Rich agreed. And after a few emails, LumaForge was onboard and sent Rich a demo unit to use. Interestingly, the LumaForge team hadn’t really considered DITs like Rich as a target market. They were so focused on solving post-production problems, they were as curious about Rich’s feedback as Rich was anxious about solving his multi-camera live event problem.

After the event, Rich and I spoke on the phone… and his demeanor was unmistakable

From 90 miles away I could hear his grin through his words, “Patrick, I wrapped before the grips! That NEVER happens… especially with as many cameras as I was offloading. Even the camera crew was asking me how the heck I was working so quickly!”

For a DIT that is to-the-moon praise, indeed.

He told me he had two more jobs that week and LumaForge was letting him hold on to the gear to test out—one was a Phantom job and the other a 5K RED shoot. Shortly after Rich sent his demo unit back to LumaForge, they sent Rich a questionnaire about his experience.

What follows are Rich Roddman’s own words about his experience using the Jellyfish on multiple on-set jobs

And in case you’re wondering, Rich called me this week to let me know he’s getting ready to pull the trigger on buying one of these units. That, more than anything, makes me feel good about having LumaForge as a Tao Colorist Newsletter sponsor. Enjoy reading the details of Rich’s experience!

What made you feel like you needed shared storage on set?

As a DIT I am always looking to improve workflows for speed and data safety while on location. Normally you are one of the last to leave the set, with everyone waiting for you to finish. Anything that can speed up the process while maintaining the integrity of the media is worth exploring; it makes me look good and I hate being the reason anyone has to sit around and wait.

What were you worried about with the various shared storage systems you had seen?

When I first looked into working with shared storage systems, the two things that immediately stood out were their general size and cost. They were designed to be installed in post houses and never moved. They weighed 80-100 pounds, drew great amounts of power, and were still quite fragile for working day after day on set and moving between locations. Then I added the purchase price of the system into the equation and there was almost no ROI.

Rich’s DIT cart, ready to rock for WrestleMania. The Jellyfish is the silver unit on the top shelf behind the keyboard.

What was the reason you were always the last one to leave on set?

After 12, 14, sometimes 16 hours, the only words you want to hear from the AD’s mouth are “That’s a wrap!” It has the same effect as “Start your engines” at a car race; it means the shoot is over and everyone sprints to pack up and go home.

Except for the DIT.

There is always footage that’s sitting in the camera and has to be backed up and processed. Depending on what the project specs require, you’re usually working away as all those around you pack up and say their goodbyes. Rarely do you beat out the grip & electric department—and then, only if they have a full truck to pack up and the director hasn’t been doing a continuous roll throughout the day (which nowadays, happens all the time).

One might think, “What’s the big deal? It’s just hard drives and data?” but someone still must copy and process it. And that someone is me, the DIT.

What was the workflow you had in mind when preparing for this WrestleMania job?

To quote Tom Petty, “The waiting is the hardest part” and as a DIT you are always waiting on one of two things:

  • 10-20 minutes for files to transfer from the card
  • 30 minutes for transcoding for editorial

And always… you’re only doing one of those things at a time before moving on to the other task.

I thought, there has to be some way of multitasking. But with bus-powered drives (which clients always use for delivery) and a single computer, almost any solution that multitasked for me slowed down the process—or worse, corrupted the media… which is completely unacceptable.

So I wondered: why can’t there be a RAID 5-type system that multiple computers can connect to, so I can offload media on one machine while transcoding it on others, removing the wait before starting the next card? I just hadn’t found a system that met my on-set needs.
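At its core, the verified offload Rich is describing is a copy followed by a checksum comparison of source and destination. Here is a minimal, hypothetical sketch of that idea in Python; it is not LumaForge’s or any vendor’s actual implementation, and real DIT tools typically use faster non-cryptographic hashes such as xxHash-64 rather than the stdlib MD5 shown here:

```python
import hashlib
import shutil
from pathlib import Path

def verified_copy(src: Path, dst: Path, algo: str = "md5") -> str:
    """Copy src to dst, then hash both files and confirm they match.

    Returns the hex digest on success; raises if the copy is corrupt.
    (Hypothetical sketch; the algorithm choice and API are illustrative.)
    """
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(src, dst)

    def file_hash(path: Path) -> str:
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            # Hash in 1MB chunks so multi-gigabyte camera files
            # never have to fit in memory.
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    src_sum, dst_sum = file_hash(src), file_hash(dst)
    if src_sum != dst_sum:
        raise IOError(f"Checksum mismatch copying {src} -> {dst}")
    return src_sum
```

With a shared volume mounted on several machines, one computer can run copies like this while another transcodes files that have already been verified—exactly the multitasking Rich was after.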

What was preventing you from executing that workflow in the real world?

The combination of single drive enclosures with computers running background tasks made using my existing gear impractical. And the robust shared storage systems I looked into were too expensive and fragile to justify bringing into the field. Trying to create my own system opened the door for media corruption.

When Patrick mentioned the Jellyfish to you, what were your concerns?

The LumaForge Jellyfish hard at work at WrestleMania

The first concern was: Is this going to work as advertised?

Being able to copy and work from a single RAID using multiple computers in the field seems like a simple concept, but none of the major drive companies were doing anything close to it—not while also maintaining data integrity, minimizing latency between computers, and tolerating the physical SAN being jostled about.

Then I was worried that I’d need a degree in IT to set it up. But once we had the Jellyfish up and running, that concern fell to the side quickly. A mere mortal can set this up. 

Rich’s DIT cart is packed and ready for shipping. The Jellyfish is on the top shelf of the cart.

My next question was: Could the Jellyfish take the ground and pound of location shooting? Not only do I have to deal with airline weight limitations, but what about setting up on a sea wall or in an airplane hangar? None of these locations are ideal for hard drives, but the Jellyfish didn’t seem to care. It fired up every time, and the 10TB-plus of media sitting on its drives continued to scream as 5K R3D files moved across its cables and connectors.

What was your setup across your week with the Jellyfish (WrestleMania, Phantom, RED)?

The system we were using allowed us to run 3 computers on the 10Gb Ethernet connections at the same time. This gave me a variety of hardware options. The basic setup was MacBook Pros ingesting the media and a new Mac Pro running DaVinci Resolve for processing and rendering.

The equipment we had with the Jellyfish was consistent across all of the jobs we tested it on: two 15” Apple MacBook Pros and a new 6-core Mac Pro with 64GB of RAM and D700 graphics cards for extra GPU processing power in Resolve.

How much media was passed through/how much transcoding, etc.?

For WrestleMania our main concern was ingesting all the footage from the 6 cameras during the 5-6 hour event (from the time the gates opened). We were shooting 4K XAVC on four Sony F55s plus two Canon C300 Mark IIs, and there was some transcoding involved, but it was not a priority for the evening. The priority was getting all the footage backed up before the venue turned the lights out.

Within that 7-hour window, we created and copied over 4 TB of media. I was able to walk out on time with the rest of the crew!

That following Tuesday we were working with the Phantom Flex and Alexa XT, offloading both .cine files and ARRIRAW files, then transcoding them to 1080p ProRes LT. The schedule had only one camera working at a time, so I only needed to connect one laptop and the Mac Pro to the Jellyfish.

We created a little over 2TB between the two cameras, which sounds like a lot of data—but in reality, with these codecs, not a lot of footage was recorded time-wise. Again, the Jellyfish had me waiting on new media instead of the camera department waiting on my transfers to finish. The camera department was surprised at how fast I was returning cards and asking for more.

That weekend was the infomercial shoot on the RED, capturing 5K R3D raw media. Once again we deployed two laptops and a new Mac Pro: one laptop to ingest the new 5K footage, a second to play back 5K footage shot two months prior (showing ‘Before’ shots of the guests being interviewed).

And the Mac Pro was creating H.264 1080p files with timecode burn-ins for the post house while the rest was going on. We created just under 3TB that day, with another 4TB loaded into the project for on-set playback.

What was the most real world bandwidth you were able to pull across the various computers at a given time?  How much were you taxing the system?

In a quick moment of ‘downtime’, Rich tested the Jellyfish’s bandwidth across two laptops.

To be honest, once everything was set up and running smoothly it was a real-world production, so testing our speed took a back burner. But there was one moment at WrestleMania where we had to offload 3 SxS Pro cards at the same time, each holding 110-115GB of 4K Sony XAVC media.

With the two laptops and the one Mac Pro, the 3 cards were offloaded and verified with a 64-bit checksum in under 9 minutes.

That’s over 330GB in less than 9 minutes. It is truly a game changer.
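For context, a rough sanity check of that figure (my arithmetic, assuming 330GB over the full nine minutes) puts the aggregate rate near 600 MB/s, which is consistent with several machines sharing a 10Gb Ethernet backbone:

```python
# Back-of-the-envelope check of the offload throughput quoted above
# (my arithmetic, not a measured benchmark).
total_gb = 330            # three SxS Pro cards at ~110GB each
seconds = 9 * 60          # the full 9-minute upper bound

mb_per_s = total_gb * 1000 / seconds   # decimal MB per second
gbit_per_s = mb_per_s * 8 / 1000       # MB/s -> gigabits per second

print(f"{mb_per_s:.0f} MB/s ≈ {gbit_per_s:.1f} Gb/s")  # 611 MB/s ≈ 4.9 Gb/s
```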

What sort of “a-ha” moments did you have?  Did your workflow change at all from Job 1 to Job 3?

Each project presented its own version of an “a-ha” moment.

Working at WrestleMania
When wrap was finally called we had 6 cameras show up with 8 cards (2 they meant to send back earlier) yet within the hour (and before the camera team had even finished packing up their camera gear) we had offloaded all the media from the entire day onto the Jellyfish. Everyone was surprised, yet that was my plan and the Jellyfish made it a reality.

Shooting a beer commercial
The second job was a beer commercial for a Japanese company using a Phantom Flex shooting at 1600fps. Our plan was to connect the Jellyfish directly to the 10Gb port on the camera, which would let us bypass two transfer steps for the hundreds of gigs of media the camera created before converting them to the ProRes dailies requested by the post house.

Unfortunately, we were not able to test this connection. The schedule had the camera crew continually rehearsing shot locations and positions before the hero talent showed up on set. But I used the Jellyfish to offload media from the Phantom and ARRIRAW (from an Alexa XT) while making dailies. The speed of the RAID in the Jellyfish allowed me to have all the offloads and transcodes completed—again, before other departments had finished breaking down after the wrap.

Playback and Capture for an Infomercial
The third job that week using the Jellyfish was working with a RED Dragon camera. This was the second shoot in a Before-and-After production for a new workout device: a combination of recording off-speed footage of people working out on the machine and interviews about their experience with it over the past two months, all shot in 5K.

The director decided, on set, that she needed the talent to see themselves as they were two months ago while they were being interviewed. Lucky for me, I had guessed I would need that older footage and had loaded it onto the Jellyfish the night before. So at various points throughout the day I was:

  • offloading RED R3D media from the RED mag on one laptop
  • playing back 5K RED media in Premiere, set at full quality, from another laptop
  • transcoding H.264 timecode burn-in dailies for the post house in Resolve on a new Mac Pro

At times, this happened simultaneously and without any issues or perceived slowdowns.  I don’t know how any of these projects could have possibly worked so smoothly without the Jellyfish.

How do you see the Jellyfish changing workflows for the DIT?  What sort of workflows could this help open up on future jobs for you?

As most DITs will tell you, each project is just like a snowflake. They may all look the same from a distance but up close the details make each one individual and unique. Having a system with the speed and the flexibility that the Jellyfish offers allows me to take on projects that at one time would have been impossible to fit into certain time or budget windows.

In other words, it will allow me to take on new jobs that I otherwise would have had to turn down or put restrictions on.

Was the Jellyfish difficult to set up?

The Jellyfish could not have been easier to use:

  • Turn it on
  • Connect a Cat 7 cable from the Jellyfish to the Thunderbolt-to-10Gb adapter, then connect the adapter to the computer via Thunderbolt
  • Open up the ShareClient app from LumaForge
  • Mount the drive
  • Start working

How was support from LumaForge?

The support I received from the LumaForge team was amazing. When the Jellyfish first arrived I was having difficulty getting two of the four 10Gb ports to work. After a quick call running down the start-up procedure, we decided to check the card. Turns out a bumpy flight from the West Coast was to blame. Once I re-seated the 10Gb card, everything worked perfectly for every project I tested it with.

Do you see yourself using it again on future projects?

Without a doubt. Almost everyone I encountered across the three separate productions asked two basic questions. First: How did it take this long for a company to see the need for such an on-set device, where time is, literally, money? Second: Where do they get their hands on one? (Editor: Check out more details about the Jellyfish)

I would like to modify the Jellyfish a little bit and make the casing a little more road-friendly but as far as what the insides can do, well, so far, color me thrilled.

About Rich Roddman, C.S.I.

With almost 30 years in the TV production industry, Rich likes to tell people that he fought in the trenches of the digital revolution. This gives him a unique perspective on what works and what doesn’t in today’s fully digital workflow. It continues to serve him as technology’s pace of change accelerates, widening the gaps between production and post-production abilities and expectations. Rich specializes in bridging those gaps for his clients.

Rich opened SilverBox Studios in 2009, providing services connecting the set to the edit suite, with successful workflows for productions large and small. In 2015 SilverBox Studios opened the doors to The Crayon Factory, Tampa Bay’s only independent color grading and editing suite, giving producers access to services they used to fly to Atlanta or New York to obtain. Rich was recently accepted as a member of the Colorist Society International, a guild representing colorists from around the world.


How NAB 2015 Showed Me The Future (I could see today)

Imagine you’re at your favorite restaurant

You sit at a now-familiar table. Your waitress greets you by name and, without asking, brings you your favorite drink. She has the menu in her hand but doesn’t hand it to you, “The usual?”

You almost agree—but reconsider.

You decide to look at the menu and you notice it’s changed. The entrees are all familiar but they’re cooked differently. Clearly the menu has been updated.

You order a few new items and suddenly… Familiar food tastes completely different.

Welcome to my NAB 2015 experience—where the familiar NAB suddenly tasted different

In today’s Newsletter I’ll be covering three topics:

  • My initial, unformed thoughts about the more widely used bits of post-production software
  • Congrats to a favorite sponsor of this website winning NAB’s Best of Show
  • And an in-depth discussion of the absolute highlight of any NAB I’ve attended, ever!

I’ll be covering both software and hardware features and releases that (mostly) aren’t here today but will be soon. I’m especially jazzed about the hardware I saw this week… technology that has been promised to us but has been either impossible to actually see or underwhelming in previous years.

I feel like this was the first NAB in my career where I saw the future of our technology before it’s actually arrived—and I walked away excited (but, with one technology, a tad concerned).

But let’s prologue with some fun statistics

  • Total steps walked from Friday morning through Thursday night: 89,855 (my weekly average is: 30k – 40k steps)
  • Parties attended: 5
  • Parties missed (that I know of): 4
  • Best meal: At the Team Mixing Light dinner, the Tao Treasurer and I ate the Peking Duck Tasting Menu at the first Michelin-rated Chinese restaurant in the United States: ‘Wing Lei’. The meal was fantastic until the final entree, which was average and a bit dry. For the first time I also tasted Mochi Ice Cream, which I shared with our guest speaker for the next day’s training, Andrea Chlebak—the colorist of ‘Chappie’ and ‘Elysium’ out of Vancouver’s SkyLab. It has a strange texture, indeed.

Getting to business, what are my initial impressions of this week’s software updates? Let’s start with DaVinci Resolve 12

My very first reaction… Wow, lots of eye candy interface changes—I hope Resolve 12 is more than just pretty icons and fonts.

Then I thought: Holy crud, I need to completely re-record our 14-hour DaVinci Resolve Deep Insights training. A mere ‘What’s New’ update is fine for existing users. But all those new editor-types they’re attracting? A ‘What’s New’ title won’t cut it.

Five minutes later, after messing around on the interface, what did I think?

I like the eye candy. I like the interface changes. They are improvements and they’re not all extraneous.


When evaluating new versions of DaVinci Resolve I always ask myself:

  • Did they remove mouse clicks (streamlining the interface)?
  • Did they give me new tools enabling new creative possibilities?

My first impression: Yes on both counts (with one big concern).

Here are a few of my DaVinci Resolve 12 first impressions

  • As usual, Blackmagic Marketing emphasized everything BUT color correction (if you look at the banner outside the South Hall, color correction falls under ‘And More’). I’ve learned to stop stressing about it—it’s what they do. Besides, colorist and DaVinci Resolve Product Specialist Alexis Van Hurkman confirms that fully half the new Resolve 12 features fall within the Color tab. And the software shows it.
  • A 2-year feature request of mine was finally implemented! We now have endpoints on curves! And thanks to all of our Mixing Light members who tweeted me about this long-time complaint of mine finally being addressed :-) This means Avid Symphony colorists can now manipulate curves the way they expect… and those of us exploring the LAB colorspace can make ‘cleaner’ AB contrast adjustments. Thanks, Team DaVinci Resolve!
  • Favorite new command: Node tree cleanup. FINALLY! Plus, we can now nest multiple nodes… and then color correct on top of the nest. For example, if you take a few nodes for the initial base grade, you can now nest those down to one node, then grade on top of that for your Shot Matching Pass. Very cool. But if you’re not careful you may find yourself clicking way more often than you used to.
  • 2nd favorite new command: “Append Node to Selected Clips”. This will save MANY mouse clicks.

Some other little nifty items that jumped out:

  • In Multicam sequences, you can ‘step in’ to the single track multicam and grade each camera separately in the Color Tab. For those kinds of jobs, it’s a thoughtful feature.
  • Alpha channel outputs can be fed directly into Video. Very useful if you want to clean up your key signal using your normal grading tools or pull up a clip assigned as a Matte and use it as a video source.
  • ‘Convert window to bezier shape’: Select a normal pre-defined Circle Power Window and morph it into a bezier to re-shape it asymmetrically. Nice.
  • The new 3D keyer and 3D tracker look like terrific enhancements. Especially the 3D tracker.
  • The functional but non-grading Specialty Nodes (like the Keyer, Splitter, Combiner) now look different than grading nodes—which should help newbies not mistake them for color correction nodes (a problem I frequently help them correct).

Of course, I’ve got a few Resolve 12 features I’m concerned about

  • The redesigned Frame Mode in the tracker: Is it simplified or has it been dumbed down? I couldn’t tell on the show floor. I love the power user functionality of the current Frame mode. I’m nervous they made the tracker less useful on shots where tracking fails and needs human interaction.
  • The redesigned Curves interface: I get that the old Curves interface almost always required jumping into Gigantor Mode (yes, that’s the actual name of the current super-big curves display). But the Photoshop-style overlapping RGB curves now require button pushing to move between the R, G and B channels. It’s impossible to directly select a curve when adding that first control point. For that reason, I don’t like that interface in Photoshop. Since I’m always looking for updates that remove mouse clicks… this interface revision has definitely added a whole bunch of new ones—and I’m not happy about that.

Moving on…

I don’t like the Big Picture color correction changes in the new release of Final Cut Pro X

Let me explain…

For years, I’ve said that Apple brought color grading as a stand-alone craft to the forefront of our industry when they bundled Apple Color with Final Cut Pro legacy. Suddenly, color correction wasn’t a teeny plug-in buried in your NLE.

Color correction gained widespread recognition as its own craft with dedicated software. It started to become something even micro-productions could do.

I then gave Apple HUGE kudos for continuing that tradition in Final Cut Pro X. No, I still don’t care for the Color Board (though I’ve finally learned how to make precise, accurate moves on it and am much more ‘at peace’ with the interface). But at least the word ‘Color’ was right there in every editor’s face and impossible to miss.

Apple released Final Cut Pro X 10.2 and they reversed almost 10 years of color emphasis

The Color layer is now gone. You have to hunt for it as an effect or in a somewhat obscure pull-down menu.

I. Am. Sad.

Color correction is such a great storytelling tool, it’s unfortunate Apple decided to de-emphasize it. And I encourage them to think about how to make it more user-facing, since I do understand why they changed the interface.

UPDATE: I’ve gotten some pushback on these comments. I’ve written a follow-up article that digs deeper into this criticism.

This gets me to what I liked (and didn’t like) about this week’s FCPX update

  • Apple stopped showing powerful respect for the craft of color grading by hiding the toolset and burying the Color Board among dozens of other ‘Effects’. I’d like to see either a default, bypassed Color Board in the Effects stack or a more obvious button for adding the Color Board… simply because it’s a rare shot that doesn’t require some sort of tweak. I’m a colorist and I think it’s too important to bury within the user interface.
  • On the other hand… the Color Board has become 1000% more useful because it can now be re-ordered within the Effects stack. This is a huge functional improvement that I’ve been dying to see! (Now they need to let us rename those layers.) I’ll be talking a lot more about this in Mixing Light for those of you FCP X devotees looking to develop a repeatable color correction workflow.
  • Color Finale (by Color Grading Central’s Denver Riddle) is a powerful add-on for anyone looking for a set of traditional 3-Way color wheels or Curves. But FCPx’s newly designed color workflow makes Finale’s re-orderable layer stack not quite as compelling as it was before. Still, it looks like a nice plug-in and we’ll be taking a closer look at it in Mixing Light (as well as Red Giant’s Colorista 3).
  • A quick shout-out to FCPWorks for their FCP X mini-conference in the Renaissance Hotel, directly next to the South Hall. Apple’s blanket ‘no trade shows’ mantra has hurt FCP X. In the past six months FCP X has become a viable collaborative post-production tool, with all the features it needed two years ago to be a true FCP 7 replacement. Kudos to FCPWorks for filling this obvious trade show gap at NAB.

Where Apple fell, Adobe picked up in Premiere Pro

Premiere Pro has long been ‘color challenged’. Over at Mixing Light, my partner Robbie Carman even did an Insight on how to adjust the default settings of Premiere’s 3-Way Color Corrector filter so it doesn’t… suck.

This week’s Preview of Premiere Pro CC 2015 shows a dramatic reversal of that app’s disappointing tradition.

While FCP X 10.2 buried its color toolset, Premiere Pro CC 2015 put color front and center! A new ‘Color’ workflow button at the top of the interface echoes the DaVinci Resolve tabs. Pressing on that button reveals a Color-oriented inspector that contains:

  • A new 3-Way color wheel interface
  • Easy to add LUTs
  • A nice Hue vs Saturation tool that’s as pretty as it is functional
  • Color manipulations are automatically added to the filter stack as a Lumetri Effect… meaning under the hood they’re using the SpeedGrade color science and render engine. And when you open your work in SpeedGrade, those corrections are ready for additional manipulation by the colorist.

I also need to shout loudly about Adobe Candy—but not for the reason you think

Mostly, I’m very proud of my Mixing Light partner Robbie Carman. He was on Adobe’s main stage demoing Candy. It’s not a small thing, to be entrusted by a company like Adobe to make a tool like Candy relevant to post-production professionals.

Robbie helped explain how Adobe intends for Candy to be a collaborative tool. For more, be sure to check out this public Insight he recorded this week about Candy on MixingLight.com.

And what about Adobe SpeedGrade, you ask?

As I posted in last week’s newsletter—the big SpeedGrade news was all about Premiere Pro. If SpeedGrade was dead, I suppose Adobe would have announced it. But at this point it’s feeling a lot like Apple Color did in its final year. And yet, the two 3-hour SpeedGrade sessions at Post | Production World were close to capacity.

As interest in Premiere Pro grows, so does interest in SpeedGrade. Let’s hope that Adobe decides it’s an app worth investing in… this colorist definitely thinks so—but my reasons for not using it professionally will have to be saved for another day. This email is already long enough.

Moving away from software, let’s look at hardware

Who better to start with than Tao Colorist sponsor, Flanders Scientific? If they had shipped a 4K reference monitor, I would have been floored. They didn’t.

Instead, Flanders Scientific simply won the 2015 NAB ‘Best of Show’ award!

Congratulations to Bram, Johan and the rest of the FSI Team! They won for their new DM250 OLED, which is a field monitoring dream. If you visit this model comparison page and select the AM250, CM250 and DM250… you can clearly see they’re offering a range of OLED models to keep you from paying for features you don’t need. And offering truly unique options for on-set monitoring.

And with the AM250 you can now get an OLED this year for nearly the same price as the comparable LCD a year ago. Impressive!

Let’s move on to the final (and most exciting) section of this special Tao Colorist Newsletter…

Introducing the 1st Annual NAB ‘Monitor Crawl’

I want you to think of the Monitor Crawl like a Bar Crawl. You gather a few of your best friends, hit the road and keep drinking until you can’t drink no more. Except instead of hitting the road we hit the Central Hall. And instead of drinking we looked at reference displays.

Now—I’ve done this before, bouncing around looking at reference monitors. But always on my own. It’s almost always boring as heck and you’re never quite sure what to think about what you’re seeing.

But bring a few very experienced professional colorist friends along?

This Monitor Crawl was not just my highlight of NAB 2015. But any NAB. EVER.

How did we not do this before? I have no idea. But it was spectacular and will be repeated.

I mean, put a group of Colorists in a dark room looking at displays and the comments start flying! You’re forced to really evaluate what you’re looking at, form quick opinions and then have those opinions examined in real-time as you’re all looking at the same display with the same footage.

Our First Annual Monitor Crawl included the following colorists:

  • Alexis Van Hurkman: Author and colorist (Minneapolis)
  • Joe Owens: Prolific forum helper, technical book editor for Alexis and himself a first-class colorist—in all senses of first-class (Edmonton, Canada)
  • Myself: Colorist and Tao Newsletter publisher (Orlando)
  • Michael Sandness: I saved the best for last. Michael is a prolific colorist with a really sharp mind. He works out of Splice in Minneapolis and was the key to this Monitor Crawl. He had done all the scout work early in the week. He knew where every dark room housing an interesting must-see display was ‘hidden’. Michael led us from booth to booth. We all examined and commented until we were bored and then he had us bee-lining to the next must-see booth.

Remember how I said this was the year I saw our future?

This 2-hour Monitor Crawl is what I’m talking about… (and the following opinions are mine alone, the rest of the crew can speak for themselves)

The Monitor Crawl was filled with ‘Gear I’ve never seen before, but will see again’

It featured two things: High Dynamic Range displays and… wait for it… Rec. 2020, of all things. Let’s start with the HDR displays.

I’ve seen Dolby’s initial forays into HDR displays in prior years. They were interesting but never really impressive to me. I always shrugged and moved on.

This year, the HDR displays were full-on crazy. For us, it started at the Canon booth

Canon showed a 30-inch 4K LED High Dynamic Range prototype. It had a peak brightness of 2,000 nits… perceptually, it seemed 2x-3x brighter than today’s properly calibrated displays. And it (literally) felt like it.

Example: In the looping movie there’s an interior tracking shot of a man walking across a darkened bar. The sunlight shining in the windows glowed so realistically for a few moments it looked, well, real (in fact, several of us commented that the extreme dynamic range did almost as good a job catching the ‘real’ in ‘real life’ images as any 3D system ever has—and goes to show how important contrast is to perceived detail and depth).

But here’s the kicker…

When the scene suddenly cut to a full-on exterior with a midday sun… my eyes hurt at the sudden transition—they had to adjust just as they would in real life if I stepped out of that darkened bar at noon in the desert.
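That perceived 2x-3x jump squares roughly with Stevens’ power law, under which perceived brightness scales with approximately the cube root of luminance. A back-of-envelope check (the 0.33 exponent is a textbook approximation, not display science):

```python
# Rough perceptual-brightness comparison using Stevens' power law
# (perceived brightness ~ luminance^0.33). A sanity check only.

def perceived_ratio(nits_a: float, nits_b: float, exponent: float = 0.33) -> float:
    """Approximate ratio of perceived brightness between two luminances."""
    return (nits_a / nits_b) ** exponent

# A 2,000-nit HDR peak vs. a ~100-nit properly calibrated reference display:
print(f"~{perceived_ratio(2000, 100):.1f}x brighter perceptually")  # roughly 2.7x
```

So a display putting out 20x the light of a 100-nit reference monitor lands right in that “2x-3x brighter” perceptual range.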

The Canon HDR was both astonishing and concerning

I can’t imagine color grading, for days on end, a film shot in the desert at high noon (precisely what I did on an award-winning feature-length Indie just last year).

I have no doubt that HDR will be a serious health concern for professional colorists

Display manufacturers must address this issue. Eyes can’t be replaced—but, not jokingly, colorists can be.

If we want to ensure long, healthy careers, these 2000+ nit displays must be designed to keep an accidental bump on a contrast ring from burning us out… literally. They must also protect us from the damage of sustained exposure to these super-high brightness levels.

That said… the Canon prototype was the most impactful of the HDR displays I saw during the Monitor Crawl.

The Sony BVM OLED and HDR displays were both impressive

Yes. I think it’s insane to buy a BVM at their $20K+ prices… but damn if you don’t get image for your money. In fact, their BVM OLED is so good, the HDR monitor looked just like it—only packing more punch.

The Dolby booth was super-interesting—but for a different reason

They had a darkened grading suite set-up which featured a Dolby Vision HDR display sitting directly next to a Dolby High Definition Rec 709 display. A colorist from Deluxe was driving an attached Baselight.

As he was grading the footage playing through the Baselight, both displays updated simultaneously.

Of course, that set us upon a flurry of questions—which were answered very nicely, though they were surprised by the sudden onslaught of 4 gentlemen asking some very pointed colorist-type questions. Here is what we discovered:

  • When color correcting to the Dolby HDR display (rated at 4,000 nits but I don’t think any of the images got nearly that bright… not in comparison to what we saw at Canon), they simultaneously color grade to a Rec 709 display set beside the HDR display.
  • The ‘downconverted’ Rec 709 image is managed by a ‘Dolby Vision box’ attached to the Baselight (they said the box also talks to DaVinci Resolve). The image path goes from Baselight, out to the HDR display, into the Dolby Vision box and then to the Rec 709 display.
  • A ‘Dolby Vision’ grading layer in Baselight (or Resolve) gives the colorist control over the ‘Dolby Vision box’ and how the HDR down-convert is managed. There are basic Lift / Gamma / Gain controls plus a few others for flattening the HDR image into the narrower tonal range of a non-HDR display.
  • The ‘Dolby Vision’ grading layer then gets encoded as metadata with the final rendered output. The delivered master is an HDR movie carrying metadata for a normal range HD down-convert. Any licensed Dolby Vision display can read the metadata and perform, in real time, the downconvert the colorist specified via that Dolby Vision grading layer.
  • This means if you buy a Dolby Vision encoded movie for your normal range HD display today, then in five years when you buy your HDR display… that same movie will play back in full HDR glory.
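The core idea we took away can be sketched in a few lines: the HDR master travels with per-shot trim values set by the colorist, and the downstream display applies that trim to flatten the picture into its narrower range. To be clear, the function names and trim math below are hypothetical illustrations of the concept, NOT Dolby’s actual algorithm or API:

```python
# Illustrative sketch of metadata-driven HDR-to-SDR down-conversion.
# The colorist's Lift / Gamma / Gain-style trim rides along as
# metadata; an SDR display applies it, an HDR display ignores it.
# All names and math here are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class TrimMetadata:
    lift: float    # raises or lowers shadows
    gamma: float   # midtone curve
    gain: float    # scales highlights

def downconvert(hdr_nits: float, trim: TrimMetadata,
                hdr_peak: float = 4000.0, sdr_peak: float = 100.0) -> float:
    """Flatten one HDR pixel value (in nits) into SDR range using the trim."""
    x = min(hdr_nits / hdr_peak, 1.0)            # normalize 0..1
    x = trim.gain * (x ** trim.gamma) + trim.lift
    return max(0.0, min(x, 1.0)) * sdr_peak      # clamp, scale to SDR nits

# One shot's trim, authored on set beside the HDR display:
shot_trim = TrimMetadata(lift=0.0, gamma=0.6, gain=1.0)
print(downconvert(1000.0, shot_trim))  # ~43.5 nits on the SDR side
```

The key design point is that the trim is metadata, not baked-in pixels: the same master file serves both display classes, which is why that movie you buy today can light up a future HDR panel.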

Cool stuff, right?

Of course, it took a few of us asking the same questions over and over until we all finally ‘grokked it’ and left the poor Deluxe colorist alone. Unfortunately, the room was too dark for any of us to read name tags, so I can’t give him proper thanks.

But it was this type of tag-teaming, and a quick huddle afterwards to confirm what we all thought we heard, that made this group Monitor Crawl so exciting.

Wrapping up this Newsletter, here’s where I saw something I didn’t think I’d see for several more years

I saw Rec. 2020. For REAL. With my own eyes.

Now, to be clear, the Canon folks say their HDR prototype was showing Rec 2020. But with no before / after images, I don’t think anyone on the Crawl took that claim without three asterisks attached.

But at the Christie booth, they demo’ed their RGB Laser projector on a gigantic screen. Their booth was open air but no overheads were turned on. Still, it was in the middle of the show floor, so hardly a proper Black Box, yet the image was very bright (at half its potential brightness).

And the demo? It compared Rec 2020 to Rec 709 and DCI-P3 by freezing an image and cutting between the three color gamuts.

This was the first time in my life I actually saw the Rec. 2020 color gamut

If ever I’m bleeding Geek, right now is it and I’m happy to share!

In my recent podcast with FSI’s Bram Desmet, he mentioned that only laser projectors can hit those super-saturated R, G, B primaries specified in the Rec. 2020 gamut. And you could see the difference. Especially the reds. You don’t realize how orange-y our HD reds are until you see them cut into the Rec. 2020 color space. Rich. Vibrant. Real reds. Real greens.
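For a sense of scale, the three gamuts in the Christie demo can be compared as triangles on the CIE 1931 xy chromaticity diagram, using the primary coordinates from the respective standards. (A crude measure, since area in xy isn’t perceptually uniform, but it shows why the cut was so striking — and note the Rec. 2020 red primary at x = 0.708, a monochromatic red that lasers hit cleanly.)

```python
# Gamut triangle areas on the CIE 1931 xy diagram.
# Primary (x, y) coordinates per ITU-R BT.709, DCI-P3, and ITU-R BT.2020.

def triangle_area(primaries):
    """Shoelace area of a gamut triangle given [(x, y), (x, y), (x, y)]."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

GAMUTS = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

rec709_area = triangle_area(GAMUTS["Rec. 709"])
for name, primaries in GAMUTS.items():
    print(f"{name}: {triangle_area(primaries) / rec709_area:.2f}x the Rec. 709 area")
```

By this simple measure, Rec. 2020 spans nearly twice the xy area of Rec. 709 — those “real reds, real greens” are colors our HD triangle simply can’t reach.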

Plus… lasers! Now, I’m just waiting for my hover board.

On a side-note, I asked the Christie rep about the FDA certifications required for laser projector installations. He said they’ve worked out the specification… and as long as no one can look directly into the projector from closer than 13 feet, laser projector installations are considered safe to the public.

There it is. My report on the year I saw The Future at NAB

I could (but won’t) go on. However, I do need to send a Special Thanks…

Thank you to my wife, the Tao Treasurer—you’re amazing! Not only would the hugely successful (and sold out) Colorist Mixer not have happened without her (we had 225 people this year) but she was a total trooper.

As we went to parties and networking events, she was patient as I spent time networking (instead of focusing on her). She even had a good time during the Monitor Crawl, offering her thoughts on what she saw. Thank you, Pam—you’re my Rock.

– pat

Feel free to leave your comments below.

This blog post was originally published in Tao of Color’s weekly Sunday Color Correction Newsletter. To subscribe, please visit the Newsletter homepage.

Podcast: Flanders Scientific Part 3

“The Future of Reference Monitors”

Bram Desmet – CEO and General Manager, Flanders Scientific

Bram Desmet is the CEO and General Manager of Flanders Scientific, Inc., based in Georgia just 30 minutes outside of Atlanta.

Despite holding a B.A. in Philosophy from GA State University – and being an instrument rated airplane pilot – Bram ultimately followed in the footsteps of his father (a 30-year veteran of the professional broadcast industry) when he joined DDA (a sister company of FSI) and then later Flanders Scientific. Both companies focus heavily on professional display technology.

As Managing Director at Flanders Scientific, Bram is a vocal advocate of FSI’s core philosophy of providing professional broadcast products that strike an ideal balance between performance, features, and affordability.

In Part 3 of Bram’s Interview we discuss:

  • What is Rec. 2020?
  • Are there devices that can display the Rec. 2020 color gamut?
  • 4K Displays: How widely manufactured? How about true 4K vs. UHD?
  • The problems with high-performance 4K displays

Questions answered from LiftGammaGain:

  • What are the factory settings of FSI displays when they ship to the customer?
  • Do we need to recalibrate if we switch off the factory settings?
  • Why is the 17″ OLED more expensive than the 24″ OLED?
  • How to get better audio sync between SDI video and analog audio monitoring?
  • How to set the output of your camera to minimize audio delay
  • Will FSI allow ‘live grading without a LUT box’ on their displays?
  • Will there be a Mac or PC app for quick LUT loading on an FSI display?
  • Are there scaling artifacts we need to worry about when monitoring 4K material in 2K mode?
  • Do FSI displays clip out-of-gamut data?
  • Finally, there’s the ‘peanut gallery’ question (thanks Paul Provost)!

This podcast was edited by Tom Parish out of Austin, Texas. Visit him at TomParish.com.

Tweet, Like, or Leave a comment! (bottom of the page, no registration required)

Listen Now

Part 1 | Part 2 | Part 3

Subscribe in iTunes | Subscribe to the Tao Colorist Sunday Morning Newsletter
More Interviews

Show Notes (links open in a new window / tab):

This interview is part of an on-going interview series with the movers, shakers, and thinkers involved in the field of professional color grading for moving images. When I have new episodes to release, they are released on Tuesdays. To be notified you may follow me on Twitter (@patInhofer), via our RSS feed, and on iTunes.

You can find more interviews here: TaoOfColor.com interview series homepage.

FCC Disclaimer
Yes, I have affiliate accounts with online retailers. Anything on this page that links to Amazon, B&H Photo or ToolFarm is an affiliate link. If you buy anything from my affiliate link not only do I get a commission, but you get a warm pleasant feeling that you’re helping to sustain the Tao Of Color website! If that is what you do – I, and all my readers and listeners say, Thank You.

Podcast: Flanders Scientific Update – Part 2

“The Current State of Reference Monitors
(2015 edition): Part 2”

Bram Desmet – CEO and General Manager, Flanders Scientific

Bram Desmet is the CEO and General Manager of Flanders Scientific, Inc., based in Georgia just 30 minutes outside of Atlanta.

Despite holding a B.A. in Philosophy from GA State University – and being an instrument rated airplane pilot – Bram ultimately followed in the footsteps of his father (a 30-year veteran of the professional broadcast industry) when he joined DDA (a sister company of FSI) and then later Flanders Scientific. Both companies focus heavily on professional display technology.

As Managing Director at Flanders Scientific, Bram is a vocal advocate of FSI’s core philosophy of providing professional broadcast products that strike an ideal balance between performance, features, and affordability.

In Part 2 of Bram’s Interview we discuss:

  • Is self-calibration of your reference monitor attainable ‘for the rest of us’?
  • The new fast profiling options in CalMan and LightSpace
  • DaVinci Resolve’s test patch generator
  • Low cost hardware test patch generators
  • What is the point of reference monitors when ‘grandma’s TV’ is blue?
  • Can a pro colorist rely on a sub-$1000 probe for accurate calibrations?
  • What is a tri-stimulus colorimeter? And why do you need the matrix settings for your specific probe?
  • What is a spectroradiometer? When does the precision become overkill?
  • How often do you need to recalibrate your reference monitor?
  • What’s the difference between color drift in OLED vs RGB LED vs CCFL LED displays?
  • What’s the drift in colorimeters?
  • Why did FSI change the default factory gamma setting to 2.4 from 2.2?
  • Why does FSI use the Power 2.4 setting rather than BT.1886?
  • How to adjust your minimum black level on an OLED?
  • How to adjust your peak white levels on an OLED?
  • Where should you set the black levels on an OLED?
  • What is the EBU standard for reference black levels?
  • What problem is BT.1886 trying to solve?
  • Why did FSI change their default factory peak luminance setting to 100 nits?
  • What’s the potential problem with BT.1886?
  • Setting the reference display to see brightness change from bit 16 to bit 17
  • Can you see the difference between 100 nits and 125 nits?

This podcast was edited by Tom Parish out of Austin, Texas. Visit him at TomParish.com.

Tweet, Like, or Leave a comment! (bottom of the page, no registration required)

Listen Now

Part 1 | Part 2 | Part 3

Subscribe in iTunes | Subscribe to the Tao Colorist Sunday Morning Newsletter
More Interviews

Show Notes (links open in a new window / tab):


This interview is part of an on-going interview series with the movers, shakers, and thinkers involved in the field of professional color grading for moving images. When I have new episodes to release, they are released on Tuesdays. To be notified you may follow me on Twitter (@patInhofer), via our RSS feed, and on iTunes.

You can find more interviews here: TaoOfColor.com interview series homepage.

FCC Disclaimer
Yes, I have affiliate accounts with online retailers. Anything on this page that links to Amazon, B&H Photo or ToolFarm is an affiliate link. If you buy anything from my affiliate link not only do I get a commission, but you get a warm pleasant feeling that you’re helping to sustain the Tao Of Color website! If that is what you do – I, and all my readers and listeners say, Thank You.