
… for Codecs & Media

Tip #282: When to Use HEVC vs. H.264

Larry Jordan – LarryJordan.com

Which to choose and why?


As media creators, there’s a lot of confusion over whether we should use H.264 or HEVC to compress our files for distribution on the web. Here’s my current thinking.

The big benefit of HEVC is that it achieves the same image quality as H.264 with a 30-40% savings in file size.

The big disadvantage is that HEVC takes a lot longer to compress and not all systems – especially older systems – can play it.

If you are sending files to broadcast, cable or digital cinema, those outlets will want something much less compressed than either of these formats. So, for those deliverables, this is not a relevant question.

For me, the overriding reason to use H.264 instead of HEVC is that YouTube, Facebook, Twitter and most other social media sites recompress your video in order to create all the different formats needed for them to redistribute it. (I read a while ago that YouTube creates 20 different versions of a single video.)

For this reason, spending extra time creating a high-quality HEVC file, when it will only get recompressed, does not make sense to me. Instead, create a high-bit-rate H.264 version so that when the file is recompressed, it loses as little image quality as possible.
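
As an illustration, here is a minimal sketch of that kind of export using FFmpeg’s libx264 encoder, driven from Python. The filenames and the 40 Mbps bitrate are placeholder assumptions; choose a bitrate appropriate to your frame size and frame rate.

  import subprocess

  # Encode a high-bit-rate H.264 upload master from an existing master file.
  # A high bitrate leaves headroom so the platform's recompression loses little.
  subprocess.run([
      "ffmpeg",
      "-i", "master.mov",              # source, e.g. a ProRes master (placeholder name)
      "-c:v", "libx264",               # H.264 encoder
      "-b:v", "40M",                   # assumed high bitrate; tune to your footage
      "-preset", "slow",               # slower preset = better quality per bit
      "-c:a", "aac", "-b:a", "320k",   # high-quality AAC audio
      "upload.mp4",
  ], check=True)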

Where HEVC makes sense is when you are serving files directly to consumers via streaming on your website. And, even in those cases, HTTP Live Streaming is often a better option for supporting mobile devices.

HEVC is mostly a benefit to service providers and social media firms.



… for Codecs & Media

Tip #284: What is a Proxy File?

Larry Jordan – LarryJordan.com

Proxies save time, money and storage space.


A proxy file, regardless of the codec used to create it, is designed to meet three key objectives: save time, save money and run on less expensive gear. Proxies meet these objectives because they:

  • Reduce required storage capacity
  • Reduce required storage bandwidth
  • Reduce the CPU load to process the file

Proxies accomplish these goals in two significant ways:

  • They convert all media into a very efficient intermediate codec that is easy to edit – for example, ProRes 422 Proxy or Avid DNx.
  • They cut each frame dimension in half. So, a UHD file, with a source frame size of 3840 x 2160, has a proxy size of 1920 x 1080; a 6K frame becomes 3K. (See the sketch below.)
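
A quick bit of arithmetic shows why this halving matters so much. This is a sketch of the math, not any particular NLE’s implementation: halving both dimensions quarters the pixel count per frame.

  # Halving each frame dimension quarters the pixel count, which is a big
  # part of why proxies need less storage capacity and bandwidth.
  def proxy_dimensions(width, height, scale=0.5):
      return int(width * scale), int(height * scale)

  src_w, src_h = 3840, 2160                      # UHD source
  pxy_w, pxy_h = proxy_dimensions(src_w, src_h)  # -> (1920, 1080)
  print((src_w * src_h) / (pxy_w * pxy_h))       # -> 4.0, i.e. 4x fewer pixels per frame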

Proxies are best used for initial editorial, where you are reviewing footage, creating selects, building a rough cut and polishing the story. For most of us, that’s 80% of the time we spend editing any project. Proxy files can also be used for most client review exports, because they render and export faster and, at this early stage, clients aren’t looking for the final look.

Using proxies means we can use less powerful and much less expensive computers and storage for the vast majority of time spent on a project. Proxy files also allow us to get out of the edit suite and edit on more portable gear.

Switching out of proxy mode is necessary for polishing effects, color grading, final render and master export.

Many editors feel that it is a sign of weakness to edit proxies. This is nonsense. Back when we edited film, we used workprints – the film version of a proxy file – for everything. Somehow, great work still got made.

Avid, Adobe and Apple all support proxy workflows. Proxies are worth adding to your workflow.



… for Codecs & Media

Tip #304: What is FFmpeg?

Larry Jordan – LarryJordan.com

An open source project supporting hundreds of media formats.

The FFmpeg logo, reflecting the way many media files are compressed.


FFmpeg is a free and open-source project consisting of a vast software suite of libraries and programs for handling video, audio, and other multimedia files and streams. At its core is the FFmpeg program itself, designed for command-line-based processing of video and audio files, and widely used for format transcoding, basic editing (trimming and concatenation), video scaling, video post-production effects, and standards compliance.
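
For example, here is a minimal sketch of the command-line processing described above, driven from Python: trim the first 30 seconds of a clip and transcode it to H.264. The filenames are placeholders.

  import subprocess

  # Trim the first 30 seconds of a clip and transcode it to H.264.
  subprocess.run([
      "ffmpeg",
      "-ss", "0", "-t", "30",           # trim: start at 0:00, keep 30 seconds
      "-i", "input.mxf",                # source file (placeholder)
      "-c:v", "libx264", "-crf", "18",  # quality-targeted H.264 encode
      "-c:a", "aac",                    # AAC audio
      "trimmed.mp4",
  ], check=True)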

FFmpeg is part of the workflow of hundreds of other software projects. Its libraries are a core part of media players such as VLC, and it has been included in core processing for YouTube and the iTunes inventory of files. Codecs for encoding and/or decoding most known audio and video file formats are included, making FFmpeg highly useful for transcoding common and uncommon media files into a single common format.

The name of the project is inspired by the MPEG video standards group, together with “FF” for “fast forward.” The logo’s zigzag pattern echoes the zigzag scan that MPEG video codecs use when preparing data for entropy encoding.

The FFmpeg project was started by Fabrice Bellard in 2000. Most non-programmers access the FFmpeg suite of programs using a “front-end” – software that puts a user interface on the FFmpeg engine. Examples include HandBrake, ffWorks, MPEG Streamclip, and QWinFF.



… for Codecs & Media

Tip #303: What is MXF OP1a?

Larry Jordan – LarryJordan.com

MXF is an industry workhorse because it is so flexible.


MXF (Material Exchange Format) was standardized by SMPTE in 2004. MXF is a container that holds digital video and audio media. OP1a (Operational Pattern 1a) defines how the media inside it is stored.

MXF has full timecode and metadata support, and is intended as a platform-agnostic stable standard for professional video and audio applications.

MXF had a checkered beginning. In 2005, there were interoperability problems between Sony and Panasonic cameras: both recorded “MXF,” but the two formats were incompatible. These and other incompatibilities, such as randomly generated links between files, were resolved in a 2009 revision of the spec.

MXF generally stores each type of media separately: video, audio, timecode and metadata are all discrete elements. This means that a single MXF container can hold a variety of different media codecs.
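
One easy way to see those separate elements is ffprobe, FFmpeg’s companion inspection tool. A minimal sketch, where the filename is a placeholder:

  import json
  import subprocess

  # List the separate tracks inside an MXF container using ffprobe.
  result = subprocess.run(
      ["ffprobe", "-v", "error", "-show_streams", "-of", "json", "clip.mxf"],
      capture_output=True, text=True, check=True,
  )
  for stream in json.loads(result.stdout)["streams"]:
      print(stream["index"], stream["codec_type"], stream.get("codec_name"))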

Another benefit to MXF OP1a is that it supports “growing files.” These are files that can be edited while they are still being recorded. (Think sports highlights.)



… for Codecs & Media

Tip #228: How Much RAM Do You Need For Editing?

Larry Jordan – LarryJordan.com

More RAM helps – to a point.

This chart shows RAM requirements (MB/second) for 30-fps, 8-bit video at different frame sizes; RAM needs climb steeply as frame size increases.


The way most NLEs work is that, during an edit, the software loads (“buffers”) a portion of a clip into RAM. This allows for smoother playback and skimming as you drag your playhead across the timeline.

When a clip is loaded into RAM, it is uncompressed, allowing each pixel to be processed individually. This means that the amount of RAM used for buffering depends upon several factors:

  • How much RAM you have
  • The frame size of the source video clip
  • The frame rate of the source video clip
  • The bit-depth of the source video clip

The chart above illustrates this. It displays the MB per second required to cache 8-bit video into RAM. As you can see, RAM requirements skyrocket with frame size, and these numbers increase further when multiple clips play at the same time.

NOTE: These numbers also increase as bit-depth increases; however, the proportions remain the same.
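
The underlying arithmetic is simple: width x height x bytes per pixel x frame rate. Here is a sketch assuming 4 bytes per pixel (8-bit RGBA); actual numbers vary with the NLE’s internal pixel format.

  # Data rate for uncompressed video buffered in RAM.
  def mb_per_second(width, height, fps=30, bytes_per_pixel=4):
      return width * height * bytes_per_pixel * fps / 1_000_000

  print(mb_per_second(1280, 720))    # ~111 MB/second for 720p
  print(mb_per_second(1920, 1080))   # ~249 MB/second for 1080p
  print(mb_per_second(3840, 2160))   # ~995 MB/second for UHD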

The amount of RAM you need varies, depending upon the type of editing you are doing.

  • 8 GB RAM. You can edit with this amount of RAM, but editing performance may suffer for anything larger than 720p HD.
  • 16 GB RAM. Good for most editing.
  • 32 GB RAM. My recommendation for editing 4K, 6K, multicam and HDR.
  • 64 GB RAM. Potentially good for massive frame sizes, but not required.

Anything more than 64 GB of RAM won’t hurt, but you won’t see any significant improvement in performance, especially considering the cost of the extra RAM.



… for Codecs & Media

Tip #078: For Best Quality, Export a Master File

Larry Jordan – https://LarryJordan.com

“Highest quality” doesn’t always mean matching your camera format.



If you are in a hurry, export what you need to post and get on with your life.

However, one lesson I’ve learned over the years is that there’s never “just one version” of any project; copies using different codecs and compression settings always need to be made. My recommendation is to ALWAYS export and archive a master file of any project, so that when copies are needed, you don’t have to reopen the project, reconnect media, re-render effects and re-export. A master file saves all that wasted time.

But what is a master file? In terms of editing, it means exporting a video that matches your project settings. There’s no reason to export a different format, because the highest quality you can export is the one that matches the project settings.

For example, exporting an H.264 project as ProRes 4444 will generate a larger file, but not higher quality.

This is why I recommend transcoding highly compressed camera master files into a higher-quality intermediate codec before starting to edit. Transcoding won’t improve the quality of what you shot, but it can improve the quality of transitions, effects and color grading – and, thus, the entire project – prior to export.
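
If you do this transcode outside your NLE’s built-in options, here is a minimal sketch using FFmpeg’s prores_ks encoder, where profile 3 is ProRes 422 HQ; the filenames are placeholders.

  import subprocess

  # Transcode a highly compressed camera file to a ProRes 422 HQ intermediate.
  subprocess.run([
      "ffmpeg",
      "-i", "camera_h264.mp4",                 # compressed camera original (placeholder)
      "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
      "-c:a", "pcm_s16le",                     # uncompressed 16-bit audio
      "camera_prores.mov",
  ], check=True)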



… for Codecs & Media

Tip #088: Where Should You Store Media?

Larry Jordan – https://LarryJordan.com

Internal or external storage. Which is best?



When it comes to storage and media, there are two essential questions: How much capacity and how much speed do you need?

Most current computers – and all Macs – use high-speed SSDs for their internal boot drives. These provide blazing speed but very limited storage.

So, as you are thinking about where to store media, consider this:

  • If you have a small project, using the internal SSD is fine.
  • If you have a large project, or need to move it between computers or editors, external storage is better.
  • For best results, store cache files (and Libraries in FCP X) on the internal boot drive or your fastest external storage.
  • SSDs are about four times faster than spinning media (traditional hard disks), but spinning media holds more and is much cheaper.
  • A single spinning hard disk is fine for HD, but not fast enough for 4K or HDR.
  • RAIDs are preferred for massive projects, like one-hour shows or features, large frame sizes, HDR, or faster frame rates. They hold more and transfer data faster than a single drive.
  • Don’t store media on any gear connected via USB 1, 2, or 3 Gen 1. It won’t be fast enough. However, you can use these devices for backups and longer-term storage.
  • Servers are fine for storing and accessing media, but they won’t be as fast as locally-attached storage.
  • In general, if you are getting dropped-frame errors, your storage is too slow to support the media you are editing. Invest in faster storage. (A quick way to check the math is sketched below.)
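
As an illustration, here is a small sanity check. The per-stream data rates and the 160 MB/second disk speed are assumptions for illustration; real rates depend on codec, frame size and frame rate.

  # Approximate per-stream data rates in MB/second (assumed for illustration).
  STREAM_MB_S = {"HD ProRes 422": 18, "UHD ProRes 422": 73}

  def storage_is_fast_enough(storage_mb_s, codec, streams=1, headroom=2.0):
      """Require ~2x headroom over the raw data rate to avoid dropped frames."""
      return storage_mb_s >= STREAM_MB_S[codec] * streams * headroom

  # A single spinning disk sustains roughly 160 MB/second (assumed).
  print(storage_is_fast_enough(160, "HD ProRes 422", streams=4))   # True
  print(storage_is_fast_enough(160, "UHD ProRes 422", streams=2))  # False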


… for Apple Final Cut Pro X

Tip #055: When to Pick Optimized or Native Media

Larry Jordan – https://LarryJordan.com

Picking the wrong option will slow things down.



Final Cut supports a variety of media for editing – not just different codecs, but native, optimized and proxy media. Which should you choose? Here’s a simple guide.

NATIVE MEDIA

Native media is what your camera shoots.

Use native media in your edit when you are in a hurry, don’t need to apply a lot of effects or transitions, or when working with high-end log or HDR media.

OPTIMIZED MEDIA

Optimized media is native media that Final Cut transcodes to ProRes 422 in the background, most often using the Transcoding options in the Media Import window.

Use optimized media in your edit for projects that have lots of effects, were recorded in highly compressed camera formats such as H.264 or HEVC, require lots of exports for client review, or require extensive color grading.

PROXY MEDIA

There’s a belief among some editors that editing proxies is somehow “weak.” Actually, virtually every film ever edited was created using proxy files – except they were called “work prints.”

Proxies are smaller files – great for creating rough cuts, where you are concentrating on telling the story – because they don’t require as much storage, and you can easily switch from proxy to optimized/native media, retaining all effects, at the click of a button.

Use proxy files in your edit when storage space is tight, when you need to edit on an older or slower system, or when you are working with large frame sizes (4K and above).

When you are ready to color grade and output, switch back to optimized/native.

SUMMARY

Optimized files are faster and more efficient to edit and, in the case of highly compressed native files, yield better color grading, gradients and effects. But they take up more space. Most of the time, the trade-off is worth it.

