Blackmagic Conversion

Occasionally I intercept outdated technology on the way to the dumpster with the help of the IT department at my school.  I think they are mostly amused that I’d want it, but I know that I can get some more use out of these depleted robots.  One such perfect reuse occurred to me when I began to think about the workflow our students use on their advanced projects.  They were working like people did in the 90s, not at all how people work on sets these days.

I noticed that whereas a professional set would have a DIT (Digital Imaging Technician) copying recording media onto multiple hard drives, and sometimes applying LUTs and making proxies, all of these were foreign concepts to people still having trouble using lights and not crossing the line.  And yet DIT is actually an entry-level job for young people who understand the concept, so it seemed like something to address.  They would not need this position in the beginner class, but I’m slated to teach the advanced class next semester.

I noticed that when I explained the role of DIT last year and asked students to fulfill those duties on their own, it was just another thing for them to forget.  No one was assigned to the task, no one remembered to bring a laptop to make copies of the footage, and the precious camera “negative” would be trundled about in backpacks or pockets without so much as an exterior case to protect it.

One of those junky MacBooks could surely be repurposed as a DIT machine that would go out with our Blackmagic Cinema Cameras.  Then there would be no excuse, right?  Saddle the project editors with the chore of making multiple backups and we’re into the 21st century.  But then I thought: wouldn’t it also be good if they could check footage on set, to make sure the crew was getting coverage?  Would that be hard to do?

It turns out yes and no.

On a new laptop, no.  One can download a free copy of DaVinci Resolve and do everything from there.  But on a 2007 white MacBook?  Not even a “Pro” model?  A Core Duo?  The department would never in a million years have money to buy new laptops for me.  If I wanted it, I would have to build it – once again.

The answer was surprisingly simple, but it was pretty far outside my abilities nonetheless.  I know nothing about shell scripting, and I have displayed my ignorance often in the series of posts about the Render Farm, most of which I’ve written to myself, since I’m well aware no one reads those.  But in this case it seems I made something worthwhile, and it’s worth sharing.

There is no ready-made way to do this on low-power, old computers.  Very old versions of DaVinci require a USB key; there was no free version until about v9, and that’s a bit too late and too resource-intensive for late-2009 machines.  Thus I would have to figure out how to cobble together FOSS (Free and Open-Source Software) technologies like ffmpeg to make it work on my own.

My wonder app should be able to accept an entire folder of Blackmagic DNG files and convert them to bog-standard H.264 at 1080p.  Then students can use the free, open-source video editor Shotcut to do minor assemblies for testing purposes.  I chose Shotcut because it is FOSS, but also because it was working, which is a factor some of the others (I’m looking at you, OpenShot) could not claim.  In all fairness, I’m sure that supporting almost ten-year-old hardware was one of the issues.

The thing about Blackmagic DNGs is that most software does not really work with them.  DNG stands for “Digital Negative,” a format made for retaining the greatest amount of raw data from the image sensor, not for watching or editing.  And each DNG is a single image file, not a continuous “clip.”  Furthermore, there is not a lot of FOSS that will even read them.

Googling like mad yielded the perfect answer on a page made by photographer Karoliina Solminen, in which she described a situation very similar to mine: a slow old computer (hers was a 2009 iMac) and no way to run Resolve.  Her solution?  Use a FOSS package called “libraw.”  So fire up the terminal!  I had installed Homebrew a long time ago to get going with this kind of minor crap development, so it was already there when I needed it.  The libraw and dcraw packages were the ones Solminen mentioned.  I knew I’d need ffmpeg at some point, so I installed that, too.

brew install libraw
brew install dcraw
brew install ffmpeg

Following her lead, I would save every DNG to a TIFF and then use ffmpeg to string them together.  My workflow hit a few snags hers did not have, though.  The first is that filenames with spaces are hard to handle in a script, so I found a small routine to take care of those:

# Rename files in case there are spaces
echo "Removing spaces..."
for n in *
do
    OldName=$n
    # squeeze runs of spaces into a single underscore
    NewName=$(echo "$n" | tr -s " " "_")
    [ "$OldName" != "$NewName" ] && mv "$OldName" "$NewName"
done

Of course I would have to make sure the folder name itself was not full of spaces, or it, too, would crap out.  But that seemed rather easy to do by hand.  Now that the filenames were good, I could make the libraw conversions.  I pulled most of the comments and prompts out of her script.  I know I probably should not have, but my terminal was filling up with messages across thousands of conversions.

# Now use libraw to convert everything to TIFFs and move DNGs to a folder
echo "Converting to TIFF..."
mkdir -p dngs
for i in ./*.dng
do
    # run each conversion in the background; the wait below collects them all
    /usr/local/bin/dcraw -4 -T -H 1 -g 2.7 0 -m 19 -n 0 -o 0 -b 1.5 "$i" &
done
wait
mv *.dng dngs/

Solminen’s script runs each dcraw process in the background, which is good for speed.  But I kept having trouble with the TIFFs showing up well after ffmpeg had already run!  She solved this issue by putting in a “sleep” command, but I found the cumulative effect of the sleeps really slowed down long conversions, and I figured that on low-power hardware this would be a liability, so I dropped them.  But then the timing was wrong again, and I could not send the script off to make a movie clip before all the TIFFs had been generated.  Hence the “wait” command in my version.  And we both use a move command at the end to put the DNGs in their own subfolder.

At the very beginning of the script I changed the directory to one supplied by an argument.  So the proper use of this script would be to invoke it with the pathname of the shot you want to convert, i.e., “bmccdailies [pathname]”

# The directory is in the argument
cd "$1"

Once we’re in that directory, I can use shell parameter expansion to derive the shot and file names from the folder name.

# Create filenames to pass to ffmpeg and make the movie
echo "Creating Movie..."
shotname=${PWD##*/}_%06d.tiff
filename=${PWD##*/}.mp4
ffmpeg -i "$shotname" -c:v libx264 -pix_fmt yuv420p "$filename"
mkdir -p tiffs
mv *.tiff tiffs/

In order for ffmpeg to work, you must supply the name of the shot with the frame numbers padded with leading zeros (the %06d part).  I realize that as soon as my students work out how to change that setting in the camera, the script will probably die, but oh well.  If I were cooler I’d figure out how to count the digits in the filename automatically.
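
If I ever do get cooler, one way might be to peek at the first TIFF and count the digits in its numeric suffix.  This is only a sketch of the idea, untested against what the camera actually writes:

# Hypothetical: derive the zero-padding from the first TIFF's numeric suffix
first=$(ls *.tiff | head -n 1)                                       # e.g. MyShot_000001.tiff
digits=$(echo "$first" | sed 's/.*_\([0-9]*\)\.tiff$/\1/' | awk '{print length}')
shotname=${PWD##*/}_%0${digits}d.tiff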

So the shot name and file name are generated from $1, the folder pathname you supply.  Ffmpeg makes the movie clip, and Bob’s your uncle.

This script is saved and made executable with chmod a+x.  It is then stored in /usr/bin, or /usr/local/bin in the case of the hideous rootless OS X 10.11.  Since both locations are on the standard path, it can be called right from the command line.  The general use, as described above, is

bmccdailies [pathname]

And it goes to work.  Because that may be hard for students, I made a small .command file for the desktop that displays instructions.  When they type “Y” to confirm they have read them, a new terminal window pops up, ready for them to type.
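
The .command file is nothing fancy.  Something along these lines would do the job; treat it as a sketch of the idea, with the wording of the instructions made up for the example rather than copied from our machines:

#!/bin/bash
# Desktop helper: show instructions, then open a fresh Terminal window
echo "Plug in the camera SSD, then run:  bmccdailies [pathname of the shot folder]"
echo "Make your backups BEFORE you convert anything."
read -p "Type Y when you have read this: " answer
if [ "$answer" = "Y" ]; then
    open -a Terminal "$HOME"
fi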

I would love to develop something that has an app bundle, an icon, and actually takes a folder name from a drop on the app.  Too complex for someone like me.

But wait!  There is a problem!  The script worked fine on my production machine, and I assumed it would work the same on the Core Duos, only slower.  But on the old machines the script hands out work too fast!  Putting each dcraw command in the background simply overloaded the processor, and the script would lock up the machine.  I did not realize how slow dcraw was and how much processing was going on.  And yes, apparently a Core Duo is puny, which I already knew.

So I adjusted the script with a “sleep 3” after every dcraw line, but it was still too much and the script jammed.  So I tried a “wait” after every dcraw command, and that worked fine.  Except that it was now running at about one frame per minute, which means it takes something like eight hours to do one shot.  And the load on the machine was terribly low; even a Core Duo can handle more than one of those frames at a time.  How many?  And how would you write the script to take, let’s say, four or five at a time?  The sketch below is one idea.
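
I have not settled on an answer, but the obvious approach would be to fire off a small batch of dcraw jobs and wait between batches.  The batch size of four here is a guess, not something I have profiled:

# Hypothetical batched version of the conversion loop
BATCH=4
count=0
for i in ./*.dng
do
    /usr/local/bin/dcraw -4 -T -H 1 -g 2.7 0 -m 19 -n 0 -o 0 -b 1.5 "$i" &
    count=$((count + 1))
    # every BATCH jobs, pause until the background processes finish
    if [ $((count % BATCH)) -eq 0 ]; then
        wait
    fi
done
wait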

Here’s a zip of the whole thing.

BMCC Dailies
