
… for Codecs & Media

Tip #304: What is FFmpeg?

Larry Jordan –

An open source project supporting hundreds of media formats.

The FFmpeg logo, reflecting how media files are compressed.


FFmpeg is a free and open-source project consisting of a vast software suite of libraries and programs for handling video, audio, and other multimedia files and streams. At its core is the FFmpeg program itself, designed for command-line-based processing of video and audio files, and widely used for format transcoding, basic editing (trimming and concatenation), video scaling, video post-production effects, and standards compliance.
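The command-line tasks mentioned above – transcoding and trimming – can be driven from a script. Here is a minimal Python sketch that builds standard `ffmpeg` argument lists (the filenames are hypothetical; the flags `-i`, `-c:v`, `-crf`, `-ss`, `-t`, and `-c copy` are real ffmpeg options):

```python
import subprocess

def transcode_cmd(src, dst, vcodec="libx264", crf=23):
    """Build an ffmpeg argv list that transcodes src to dst,
    re-encoding video and copying audio untouched."""
    return ["ffmpeg", "-i", src, "-c:v", vcodec,
            "-crf", str(crf), "-c:a", "copy", dst]

def trim_cmd(src, dst, start_s, dur_s):
    """Build an ffmpeg argv list that trims a clip without
    re-encoding (stream copy), starting at start_s seconds."""
    return ["ffmpeg", "-ss", str(start_s), "-i", src,
            "-t", str(dur_s), "-c", "copy", dst]

# To actually run a command (requires ffmpeg on your PATH):
# subprocess.run(transcode_cmd("clip.mov", "clip.mp4"), check=True)
```

Building the argument list separately from running it makes the commands easy to log, test, or batch over a folder of clips.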

FFmpeg is part of the workflow of hundreds of other software projects. Its libraries are a core part of media players such as VLC, and it has been included in core processing for YouTube and the iTunes inventory of files. Codecs for encoding and/or decoding most known audio and video file formats are included, making it highly useful for transcoding common and uncommon media files into a single common format.

The name of the project is inspired by the MPEG video standards group, together with “FF” for “fast forward.” The logo’s zigzag evokes the scan pattern MPEG video codecs use during entropy encoding.

The FFmpeg project was started by Fabrice Bellard in 2000. Most non-programmers access the FFmpeg suite of programs through a “front-end” – software that puts a user interface on the FFmpeg engine. Examples include HandBrake, ffWorks, MPEG Streamclip, and QWinFF.


… for Codecs & Media

Tip #303: What is MXF OP1a?

Larry Jordan –

MXF is an industry workhorse because it is so flexible.


MXF (Material Exchange Format) is a container format standardized by SMPTE in 2004. It holds digital video and audio media, while OP1a (Operational Pattern 1a) defines how the media inside it is stored.

MXF has full timecode and metadata support, and is intended as a platform-agnostic stable standard for professional video and audio applications.

MXF had a checkered beginning. In 2005, there were interoperability problems between Sony and Panasonic cameras. Both recorded “MXF” – but the two formats were incompatible. Other incompatibilities, such as the randomly generated links that connect files, were resolved in a 2009 revision of the spec.

MXF generally stores each media element separately: video, audio, timecode and metadata are distinct items within the container. This also means a single MXF container can hold media compressed with a variety of different codecs.

Another benefit to MXF OP1a is that it supports “growing files.” These are files that can be edited while they are still being recorded. (Think sports highlights.)

… for Codecs & Media

Tip #228: How Much RAM Do You Need For Editing?

Larry Jordan –

More RAM helps – to a point.

This chart illustrates how RAM needs increase as frame sizes increase.
RAM requirements for 30-fps, 8-bit video at different frame sizes (MB/second).


The way most NLEs work is that, during an edit, the software will load (“buffer”) a portion of a clip into RAM. This allows for smoother playback and skimming, as you drag your playhead across the timeline.

When a clip is loaded into RAM, it is uncompressed, allowing each pixel to be processed individually. This means that the amount of RAM used for buffering depends upon several factors:

  • How much RAM you have
  • The frame size of the source video clip
  • The frame rate of the source video clip
  • The bit-depth of the source video clip

The graph above illustrates this, displaying the MB per second required to cache 8-bit video into RAM. As you can see, RAM requirements skyrocket with frame size. These numbers increase further when you have multiple clips playing at the same time.

NOTE: These numbers also increase as bit-depth increases; however, the proportions remain the same.
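As a back-of-the-envelope sketch of why the numbers skyrocket, here is the arithmetic behind buffering uncompressed video. (Assumption: 8-bit RGB at 3 bytes per pixel; real NLEs may buffer in other pixel formats, but the proportions hold.)

```python
def buffer_rate_mb_per_sec(width, height, fps=30, bytes_per_pixel=3):
    """MB per second needed to buffer uncompressed frames in RAM.
    Assumes 8-bit RGB (3 bytes/pixel); higher bit depths scale up
    proportionally via bytes_per_pixel."""
    return width * height * bytes_per_pixel * fps / 1e6

print(round(buffer_rate_mb_per_sec(1280, 720), 1))   # 720p HD  -> 82.9
print(round(buffer_rate_mb_per_sec(1920, 1080), 1))  # 1080p HD -> 186.6
print(round(buffer_rate_mb_per_sec(3840, 2160), 1))  # UHD 4K   -> 746.5
```

Note that UHD needs exactly four times the buffer rate of 1080p, because it has four times the pixels – which is why 4K editing benefits so much from extra RAM.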

The amount of RAM you need varies, depending upon the type of editing you are doing.

  • 8 GB RAM. You can edit with this amount of RAM, but editing performance may suffer for anything larger than 720p HD.
  • 16 GB RAM. Good for most editing.
  • 32 GB RAM. My recommendation for editing 4K, 6K, multicam and HDR.
  • 64 GB RAM. Potentially good for massive frame sizes, but not required.

Anything more than 64 GB of RAM won’t hurt, but you won’t see any significant improvement in performance – especially considering the cost of more RAM.

… for Codecs & Media

Tip #078: For Best Quality, Export a Master File

Larry Jordan –

“Highest quality” doesn’t always mean matching your camera format.



If you are in a hurry, export what you need to post and get on with your life.

However, one lesson I’ve learned over the years is that there’s never “just one version” of any project. Copies using different codecs and compression always need to be made. My recommendation is to ALWAYS export a master file of any project and archive that, so that when copies need to be made, you don’t need to reopen a project, reconnect media, re-render effects and re-export. A master file saves all that wasted time.

But, what is a master file? In terms of editing, it means exporting a video that matches your project settings. There’s no reason to export a different format, because the highest quality you can export is one that matches the project settings.

For example, exporting an H.264 project as ProRes 4444 will generate a larger file, but not higher quality.

This is why I recommend transcoding highly-compressed camera master files into a higher-quality intermediate codec before starting editing. Transcoding won’t improve the quality of what you shot, but it can improve the quality of transitions, effects and color grading; and, thus, the entire project prior to export.

… for Codecs & Media

Tip #088: Where Should You Store Media?

Larry Jordan –

Internal or external storage. Which is best?



When it comes to storage and media, there are two essential questions: How much capacity and how much speed do you need?

Most current computers – and all Macs – use high-speed SSDs for their internal boot drives. These provide blazing speed but very limited storage.

So, as you are thinking about where to store media, consider this:

  • If you have a small project, using the internal SSD is fine.
  • If you have a large project, or need to move it between computers or editors, external storage is better.
  • For best results, store cache files (and Libraries in FCP X) on the internal boot drive or your fastest external storage.
  • SSDs are about four times faster than spinning media (traditional hard disks), but spinning media holds more and is much cheaper.
  • A single spinning hard disk is fine for HD, but not fast enough for 4K or HDR.
  • RAIDs are preferred for massive projects, like one-hour shows or features, large frame sizes, HDR, or faster frame rates. They hold more and transfer data faster than a single drive.
  • Don’t store media on any gear connected via USB 1, 2, or 3 Gen 1. It won’t be fast enough. However, you can use these devices for backups and longer-term storage.
  • Servers are fine for storing and accessing media, but they won’t be as fast as locally-attached storage.
  • In general, if you are getting dropped frame errors, it means your storage is too slow to support the media you are editing. Invest in faster storage.
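One way to reason about the dropped-frames rule of thumb is to compare a drive’s sustained transfer speed against the combined data rate of the streams you are playing. The sketch below uses ballpark figures that are assumptions for illustration, not measurements – real drives and codecs vary:

```python
# Ballpark sustained transfer speeds in MB/s (assumptions, not benchmarks)
DRIVES = {"single HDD": 150, "SATA SSD": 500, "NVMe SSD": 2500}

# Approximate stream data rates in MB/s (assumptions for illustration)
STREAMS = {"ProRes 422 HD (1080p30)": 18, "ProRes 422 UHD (2160p30)": 74}

def supports(drive_mb_s, stream_mb_s, n_streams=1, headroom=0.6):
    """True if the drive can sustain n simultaneous streams while
    keeping 40% of its bandwidth in reserve for seeks and overhead."""
    return stream_mb_s * n_streams <= drive_mb_s * headroom
```

For example, a single spinning disk handles one HD stream comfortably, but a two-camera UHD multicam already exceeds its safe bandwidth – consistent with the advice above to move to RAIDs for larger projects.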

… for Apple Final Cut Pro X

Tip #055: When to Pick Optimized or Native Media

Larry Jordan –

Picking the wrong option will slow things down.



Final Cut supports a variety of media for editing – not just different codecs, but native, optimized and proxy media. Which should you choose? Here’s a simple guide.


Native media is what your camera shoots.

Use native media in your edit when you are in a hurry, don’t need to apply a lot of effects or transitions, or when working with high-end log or HDR media.


Optimized media is native media that Final Cut transcodes in the background to ProRes 422; most often using the Transcoding options in the Media Import window.

Use optimized media in your edit for projects that have lots of effects, were recorded using very compressed camera formats such as H.264 or HEVC, require lots of exports for client review, or require extensive color grading.


There’s a belief among some editors that editing proxies is somehow “weak.” Actually, virtually every film edited on film was cut using proxies – except they were called “work prints.”

Proxies are smaller files, great for creating rough cuts where you are concentrating on telling the story: they don’t require as much storage, and you can switch from proxy to optimized/native media – retaining all effects – at the click of a button.

Use proxy files in your edit when storage space is tight, you need to edit on an older/slower system or when you are working with large frame size files (4K and above).

When you are ready to color grade and output, switch back to optimized/native.


Optimized files are faster and more efficient to edit and, in the case of highly compressed native files, yield better color grading, gradients and effects. But they take up more space. Most of the time, the trade-off is worth it.