I love FFMPEG.js; it’s a neat tool that is compiled with asm.js and lets me build JS web apps that can quickly edit videos. FFMPEG.js also works with web workers, so you can encode videos without blocking the main thread.
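The worker builds communicate over a small message protocol. Roughly like this (a sketch from memory of the ffmpeg.js README; the worker file name depends on which build you load):

```js
const worker = new Worker("ffmpeg-worker-mp4.js");

worker.onmessage = ({ data: msg }) => {
  switch (msg.type) {
    case "ready":
      // The worker has loaded; kick off a run.
      worker.postMessage({ type: "run", arguments: ["-version"] });
      break;
    case "stdout":
    case "stderr":
      console.log(msg.data);
      break;
    case "done":
      // msg.data.MEMFS contains any files the command wrote.
      console.log("finished", msg.data);
      break;
  }
};
```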
I also love Comlink. Comlink lets me easily interact with web workers by exposing functions and classes, without having to deal with a complex postMessage state machine.
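For example (a minimal sketch using Comlink’s expose/wrap API; the `encode` function is just a stand-in):

```js
// worker.js
import * as Comlink from "comlink";

const encoder = {
  async encode(videoBytes) {
    // ...run FFMPEG.js here, off the main thread...
    return videoBytes;
  },
};

Comlink.expose(encoder);

// main.js
import * as Comlink from "comlink";

const worker = new Worker("worker.js", { type: "module" });
const encoder = Comlink.wrap(worker);

const inputBytes = new Uint8Array(); // stand-in for real video bytes

// Reads like a local async call, but runs in the worker.
const output = await encoder.encode(inputBytes);
```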
I recently got to combine the two. I built a Progressive Web App that takes a screencast from your Android device and wraps the video in a device frame using FFMPEG.js, like so:
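The core of it is a single FFmpeg invocation that overlays the recording onto a PNG of the device frame. Something like this sketch, using ffmpeg.js’s synchronous MEMFS API (the file names and overlay offsets here are illustrative, not Device Frame’s actual values):

```js
const ffmpeg = require("ffmpeg.js");

// Uint8Arrays for the inputs, fetched or read elsewhere.
const framePngBytes = new Uint8Array();  // stand-in: the device frame artwork
const recordingBytes = new Uint8Array(); // stand-in: the screencast

const result = ffmpeg({
  // MEMFS: in-memory files the command can read.
  MEMFS: [
    { name: "frame.png", data: framePngBytes },
    { name: "recording.mp4", data: recordingBytes },
  ],
  arguments: [
    "-i", "frame.png",
    "-i", "recording.mp4",
    // Position the screen recording on top of the frame;
    // the x/y offsets are hypothetical.
    "-filter_complex", "[0:v][1:v] overlay=65:170",
    "output.mp4",
  ],
});

// Output files come back the same way.
const { data } = result.MEMFS[0]; // Uint8Array of output.mp4
```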
I also managed to sort out building ffmpeg.js, so that I can, with relative ease, create custom optimized builds of ffmpeg and run them in the browser.
Together, I think the two present a lot of opportunities to build great new small Progressive Web Apps that push what we think the web is capable of when it comes to manipulating audio and video.
FFMPEG.js is an amazing project and it helped me build one of my latest projects: Device Frame. It basically builds ffmpeg for the browser, with a good set of defaults to keep the size as small as it can be. If the default build doesn’t support the filters and encoders you need, then you will have to build it yourself.
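In broad strokes it looks like this (a sketch assuming the Kagami/ffmpeg.js repo and an Emscripten toolchain already set up; check the repo’s README for the exact targets and configure flags):

```sh
git clone https://github.com/Kagami/ffmpeg.js.git
cd ffmpeg.js
git submodule update --init --recursive

# Edit the Makefile to add what you need to FFmpeg's configure
# arguments, e.g. --enable-filter=overlay or --enable-encoder=aac.

make all   # or just the variant you need, e.g. the mp4 worker build
```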
This is more of a note to my future self, but here’s what I did to get it working.
I wrote about screen recording from Android a little while ago, and whilst it is cool, I didn’t document any of the process I went through to get the recording into a device frame and make it look all “profesh”.
The process used to be pretty cumbersome: I would grab the screen recording using my script, use Screenflow to overlay the video on the device frame, export that out to an mp4, and then finish with a quick bit of GIF hackery.