Claus Witt

At Fliva we had an issue: customers kept uploading iPhone footage in HDR formats with varying frame rates, and our software could not handle it. The colors looked washed out and just plain wrong.

Our first stab at a solution was, of course, to use ffmpeg. However, ffmpeg produced the same washed-out result when converting HDR to SDR. We did find some posts (you know that thing called Stack Overflow?) about how to set - a lot of - parameters to make it work, but we never found a magic incantation that worked with the same settings every time.
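
For the curious, the kind of incantation those posts suggest looks roughly like this; it needs an ffmpeg build with the zscale filter, and input.mov/output.mp4 are placeholders:

    ffmpeg -i input.mov \
      -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
      -c:v libx264 -crf 18 -c:a copy output.mp4

The tone-mapping values in the middle of that filter chain are the knobs people suggest turning per file, which is exactly the kind of tweaking we wanted to avoid.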

We tried out a bunch of transcoding service providers, but none of them supported converting HDR footage from an iPhone to SDR footage that looked right (one was close, but then the sound started going out of sync in some cases).

We have our own iPhone app that most users upload through, and in an earlier version this app could transcode the files correctly on the phone before uploading. However, this feature was buggy in other ways, causing the app to sometimes crash at this step and other times to upload the HDR version anyway. And it was sloooow for the end users.

But this made us think: we knew that Apple hardware and software support this, and we quickly found the avconvert tool installed on all Macs. (If you right-click a video file, there is an entry in the Services submenu called "Encode Selected Video Files" which uses this exact command behind the scenes.)

We started using this for manually converting HEVC/HDR footage from customers, and it just worked every time.
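
For reference, the manual version is a one-liner. PresetHighestQuality is just one example; avconvert lists the available presets in its usage text, and input.mov/output.mov are placeholders:

    avconvert --preset PresetHighestQuality --source input.mov --output output.mov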

Next step: make a service that does this for us. We already had most of the plumbing ready from the earlier tries with various providers, so instead of our API wrapper project - which receives an HTTP call from us and uses a callback to notify when the file is done transcoding - we moved the enqueuing part to SQS.
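
The enqueuing end then boils down to a single SendMessage call. A rough sketch in Go with the AWS SDK v2 (client setup and imports left out, and the job fields are made up for illustration):

    // enqueueJob pushes a small JSON job onto SQS instead of calling the
    // transcoder directly; the field names are made up for illustration.
    func enqueueJob(ctx context.Context, client *sqs.Client, queueURL, sourceURL, callbackURL string) error {
        body, err := json.Marshal(map[string]string{
            "source_url":   sourceURL,
            "callback_url": callbackURL,
        })
        if err != nil {
            return err
        }
        _, err = client.SendMessage(ctx, &sqs.SendMessageInput{
            QueueUrl:    aws.String(queueURL),
            MessageBody: aws.String(string(body)),
        })
        return err
    }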

Then we made a very small Go program that consists of these steps:

- Poll the SQS queue for a job.
- Download the source file (curl).
- Convert it (avconvert, with ffmpeg where needed).
- Send an HTTP callback when the file is done.

That is it.

The Go program really delegates all the heavy lifting to ffmpeg, curl and avconvert, and is only responsible for polling the SQS queue and sending HTTP requests as callbacks when it is done.
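
For the curious, a minimal sketch of that loop could look something like the following. It uses the AWS SDK for Go v2; the queue URL, job fields and callback payload are made up for illustration, what happens to the finished file afterwards is left out, and error handling is kept to a minimum.

    package main

    import (
        "bytes"
        "context"
        "encoding/json"
        "log"
        "net/http"
        "os"
        "os/exec"

        "github.com/aws/aws-sdk-go-v2/aws"
        "github.com/aws/aws-sdk-go-v2/config"
        "github.com/aws/aws-sdk-go-v2/service/sqs"
    )

    // job is a hypothetical message format; the real fields depend on the API wrapper.
    type job struct {
        SourceURL   string `json:"source_url"`
        CallbackURL string `json:"callback_url"`
    }

    func main() {
        ctx := context.Background()
        cfg, err := config.LoadDefaultConfig(ctx)
        if err != nil {
            log.Fatal(err)
        }
        client := sqs.NewFromConfig(cfg)
        queueURL := "https://sqs.eu-west-1.amazonaws.com/123456789012/transcode-jobs" // placeholder

        for {
            // Long-poll SQS for the next transcoding job.
            out, err := client.ReceiveMessage(ctx, &sqs.ReceiveMessageInput{
                QueueUrl:            aws.String(queueURL),
                MaxNumberOfMessages: 1,
                WaitTimeSeconds:     20,
            })
            if err != nil {
                log.Println("receive:", err)
                continue
            }
            for _, msg := range out.Messages {
                var j job
                if err := json.Unmarshal([]byte(*msg.Body), &j); err != nil {
                    log.Println("bad message:", err)
                    continue
                }
                if err := process(j); err != nil {
                    // Leave the message alone; SQS redelivers it after the visibility timeout.
                    log.Println("job failed:", err)
                    continue
                }
                // Delete the message only after a successful run.
                client.DeleteMessage(ctx, &sqs.DeleteMessageInput{
                    QueueUrl:      aws.String(queueURL),
                    ReceiptHandle: msg.ReceiptHandle,
                })
            }
        }
    }

    // process delegates the heavy lifting to curl and avconvert,
    // then tells the callback URL that the job is done.
    func process(j job) error {
        if err := exec.Command("curl", "-sfL", "-o", "input.mov", j.SourceURL).Run(); err != nil {
            return err
        }
        _ = os.Remove("output.mov") // clear any leftover output from a previous run
        if err := exec.Command("avconvert", "--preset", "PresetHighestQuality",
            "--source", "input.mov", "--output", "output.mov").Run(); err != nil {
            return err
        }
        body, _ := json.Marshal(map[string]string{"status": "done"})
        resp, err := http.Post(j.CallbackURL, "application/json", bytes.NewReader(body))
        if err != nil {
            return err
        }
        resp.Body.Close()
        return nil
    }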

If we need more throughput, the current server can probably handle running a couple of these instances without problems; and if we need even more, we will just start a new server with the same software.

If we need to handle multiple priorities, we will just use multiple SQS queues, push each job to the queue matching the priority we need, and let the services work through high-priority queues before low-priority ones.
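
A sketch of that, reusing the client and imports from the worker above (plus github.com/aws/aws-sdk-go-v2/service/sqs/types), could be a small helper that checks the queues in order and hands back the first message it finds:

    // receiveByPriority polls the given queue URLs in order, highest priority
    // first, and returns the first message it finds together with the queue it
    // came from, so the caller knows where to delete it afterwards.
    func receiveByPriority(ctx context.Context, client *sqs.Client, queueURLs []string) (*types.Message, string, error) {
        for _, q := range queueURLs {
            out, err := client.ReceiveMessage(ctx, &sqs.ReceiveMessageInput{
                QueueUrl:            aws.String(q),
                MaxNumberOfMessages: 1,
            })
            if err != nil {
                return nil, "", err
            }
            if len(out.Messages) > 0 {
                return &out.Messages[0], q, nil
            }
        }
        return nil, "", nil // nothing waiting anywhere; poll again
    }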

I suspect this will scale very well without much change needed, and it will buy us some time to make sure our own software can handle this video format.

