
… for Adobe Premiere Pro CC

Tip #739: Premiere: No Support for FireWire DV Capture

Larry Jordan – LarryJordan.com

FireWire capture of DV media is no longer supported on Macs.


This tip first appeared on Adobe’s support page. While this won’t affect a lot of folks, it is still worth knowing.

Starting with macOS 10.15 Catalina, Premiere Pro, Audition, and Adobe Media Encoder no longer support the capture of DV and HDV over FireWire.

This change does not impact other forms of tape capture.
You can still edit DV/HDV files that have previously been captured.
DV/HDV capture is still available with Premiere Pro on Windows.

WORKAROUND

If you need access to DV/HDV ingest, you can:

  • On macOS: Use Premiere Pro 12.x or 13.x on macOS 10.13.6 (High Sierra) or 10.14 (Mojave)
  • On Windows: Continue to use the latest versions of Premiere Pro with no impact.


… for Codecs & Media

Tip #744: What is Interlacing?

Larry Jordan – LarryJordan.com

Interlacing was needed due to limited bandwidth.

Interlace artifact – thin, dark, horizontal lines radiating off moving objects.


Even in today’s world of 4K and HDR, many HD productions still need to distribute interlaced footage. So, what is interlacing?

Interlacing is the process of time-shifting every other line of video so that the total bandwidth requirements for a video stream are, effectively, cut in half.

For example, in HD, all the even-numbered lines are displayed first; then, half a frame later, all the odd-numbered lines are displayed. Each of these sets of lines is called a “field.” The field rate is double the frame rate.

NOTE: HD is upper field first, DV (PAL or NTSC) is lower field first.
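
To make the numbers concrete, here is a minimal Python sketch (my illustration, not code from any editing app) that splits one HD frame into its two fields and computes the field timing:

  import numpy as np

  FPS = 29.97                      # example HD frame rate
  frame = np.zeros((1080, 1920))   # one HD frame: 1080 lines of 1920 pixels

  # HD is upper field first: the even-numbered lines, then the odd-numbered lines
  upper_field = frame[0::2]        # 540 lines
  lower_field = frame[1::2]        # 540 lines

  field_rate = FPS * 2                 # 59.94 fields per second
  field_interval = 1.0 / field_rate    # ~0.0167 seconds (16.7 ms) between fields
  print(field_rate, field_interval)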

In the old days of NTSC and PAL this was done because the broadcast infrastructure couldn’t handle complete frames.

As broadcasters converted to HD at the end of the last century, limited bandwidth again forced a choice: they could broadcast either a 720-line progressive frame or an interlaced 1080-line frame.

Some networks chose 720p because they were heavily invested in sports, which looks best in a progressive frame. Others chose 1080i because their shows principally originated on film, which minimizes the interlacing artifacts illustrated in the screen shot.

As we move past HD into 4K, the bandwidth limitations fade away, which means that all frames are progressive.

EXTRA CREDIT

It is easy to shoot progressive and convert it to interlaced, with no significant loss in image quality. It is far harder to convert interlaced footage to progressive, and quality always suffers. Also, the web requires progressive media because interlacing looks terrible.

For this reason, it is best to shoot progressive, then convert to interlaced as needed for distribution.


… for Codecs & Media

Tip #745: What is HDR Rec. 2020 HLG?

Larry Jordan – LarryJordan.com

HLG is compatible with both HDR and SDR broadcast and television sets.

Chart showing a conventional SDR gamma curve and Hybrid Log-Gamma (HLG). HLG uses a logarithmic curve for the upper half of the signal values which allows for a larger dynamic range.


High-dynamic-range video (HDR video) describes video having a dynamic range greater than that of standard-dynamic-range video (SDR video). HDR capture and displays are capable of brighter whites and deeper blacks. To accommodate this, HDR encoding standards allow for a higher maximum luminance and use at least a 10-bit dynamic range in order to maintain precision across this extended range.

While technically “HDR” refers strictly to the ratio between the maximum and minimum luminance, the term “HDR video” is commonly understood to imply wide color gamut as well.

There are two ways we can display HDR material: HLG and PQ. (Tip #746 discusses PQ).

HLG (Hybrid Log-Gamma) is a royalty-free HDR standard jointly developed by the BBC and NHK. HLG is designed to be better suited to television broadcasting, where the metadata required for other HDR formats is not backward compatible with non-HDR displays, consumes additional bandwidth, and may become out-of-sync or damaged in transmission.

HLG defines a non-linear opto-electronic transfer function (OETF), in which the lower half of the signal values use a gamma curve and the upper half use a logarithmic curve. In practice, the signal is interpreted as normal by standard-dynamic-range displays (albeit with more detail in the highlights), but HLG-compatible displays correctly interpret the logarithmic portion of the curve to provide a wider dynamic range.
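
For the curious, the HLG curve is simple enough to write out. Here is a minimal Python sketch of the HLG OETF, using the constants published in ITU-R BT.2100 (an illustration, not production color-management code):

  import math

  # HLG OETF constants from ITU-R BT.2100
  A = 0.17883277
  B = 0.28466892            # 1 - 4*A
  C = 0.55991073            # 0.5 - A * ln(4*A)

  def hlg_oetf(scene_light):
      """Map normalized scene light (0..1) to an HLG signal value (0..1)."""
      if scene_light <= 1.0 / 12.0:
          return math.sqrt(3.0 * scene_light)          # square-root (gamma) segment
      return A * math.log(12.0 * scene_light - B) + C  # logarithmic segment

  # The two segments meet at a signal value of 0.5 -- the "lower half gamma,
  # upper half log" split described above.
  print(hlg_oetf(1.0 / 12.0))   # 0.5
  print(hlg_oetf(1.0))          # ~1.0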

HLG is defined in ATSC 3.0, among other standards, and is supported by video services such as the BBC iPlayer, DirecTV, Freeview Play, and YouTube. HLG is supported by HDMI 2.0b, HEVC, VP9, and H.264/MPEG-4 AVC.


… for Codecs & Media

Tip #746: What is HDR Rec. 2020 PQ?

Larry Jordan – LarryJordan.com

PQ provides for the brightest images, even though technology today can’t fully support it.

The PQ inverse EOTF (electro-optical transfer function). I thought you’d like to see the math.


High-dynamic-range video (HDR video) describes video having a dynamic range greater than that of standard-dynamic-range video (SDR video). HDR capture and displays are capable of brighter whites and deeper blacks. To accommodate this, HDR encoding standards allow for a higher maximum luminance and use at least a 10-bit dynamic range in order to maintain precision across this extended range.

While technically “HDR” refers strictly to the ratio between the maximum and minimum luminance, the term “HDR video” is commonly understood to imply wide color gamut as well.

There are two ways we can display HDR material: HLG and PQ. (Tip #745 discusses HLG).

Perceptual Quantizer (PQ), published by SMPTE as SMPTE ST 2084, is a transfer function that allows for the display of high dynamic range (HDR) video with a luminance level of up to 10,000 cd/m2 and can be used with the Rec. 2020 color space.

NOTE: cd/m2 refers to “candela per square meter,” a measure of luminance. One cd/m2 equals one nit.
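
Since the figure promises the math, here it is in runnable form: a minimal Python sketch of the PQ inverse EOTF (luminance in, signal out), using the constants from SMPTE ST 2084 (an illustration, not production color-management code):

  # PQ (SMPTE ST 2084) inverse EOTF constants
  M1 = 2610 / 16384           # 0.1593017578125
  M2 = 2523 / 4096 * 128      # 78.84375
  C1 = 3424 / 4096            # 0.8359375
  C2 = 2413 / 4096 * 32       # 18.8515625
  C3 = 2392 / 4096 * 32       # 18.6875

  def pq_inverse_eotf(nits):
      """Map absolute luminance in cd/m2 (0..10,000) to a PQ signal value (0..1)."""
      y = nits / 10000.0          # normalize to the 10,000-nit peak
      yp = y ** M1
      return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

  print(pq_inverse_eotf(100))     # SDR-ish white maps to ~0.51
  print(pq_inverse_eotf(10000))   # the full 10,000-nit peak maps to 1.0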

PQ is a non-linear electro-optical transfer function (EOTF). On April 18, 2016, the Ultra HD Forum announced industry guidelines for UHD Phase A, which uses Hybrid Log-Gamma (HLG) and PQ transfer functions with a bit depth of 10-bits and the Rec. 2020 color space. On July 6, 2016, the ITU announced Rec. 2100, which uses HLG or PQ as transfer functions with a Rec. 2020 color space.

The key takeaway here is that PQ supports extremely bright images, but in a format that is not backward compatible with SDR displays.


… for Apple Final Cut Pro X

Tip #734: What is Tone-Mapping?

Larry Jordan – LarryJordan.com

Tone-mapping preserves highlights when displaying HDR media on SDR displays.

HDR media not tone-mapped (bottom) and tone-mapped (top). Tone-mapping preserves highlights.


HDR (High Dynamic Range) media has grayscale values that far exceed what our computer monitors can display. (The Apple Pro Display XDR is a video monitor, not a computer monitor.)

As well, if we try using HDR media in a Rec. 709 HD project, the white levels are wildly out of range.

NOTE: To convert HDR video to SDR video as part of a project, use the HDR Tools effect.

Tone-mapping solves this problem. This process automatically converts the vast grayscale range of HDR into the much more limited range of SDR (Standard Dynamic Range).
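
Apple doesn’t publish the exact curve Final Cut uses, but the idea is easy to sketch. Here is a minimal Python illustration using the classic Reinhard operator, one common way to roll highlights off into SDR range (the 100-nit SDR peak is an assumption for this example):

  def tone_map(nits, sdr_peak=100.0):
      """Compress HDR luminance into SDR range with the simple Reinhard operator."""
      x = nits / sdr_peak
      return sdr_peak * x / (1.0 + x)   # approaches, but never exceeds, sdr_peak

  print(tone_map(50))     # 50 nits -> ~33 nits: modest compression in the mid-tones
  print(tone_map(1000))   # 1,000 nits -> ~91 nits: highlights roll off instead of clipping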

Final Cut Pro X does this using either a preference setting (Preferences > Playback) or a setting in the View menu at the top right corner of the Viewer.

This screen shot illustrates the difference. When tone-mapping is turned off (bottom of image), the highlights are blown out and the detail is lost, even though the image would look fine on an HDR monitor.

The top of the image is tone-mapped to fit the highlights within SDR specs. This means the image will look good on your computer monitor AND on an HDR monitor.


… for Adobe Premiere Pro CC

Tip #697: What Is the Alpha Channel?

Larry Jordan – LarryJordan.com

Alpha channels define the amount of translucency for each pixel.

When viewing alpha channels, black is transparent, gray is translucent and white is opaque.


Just as the red, green and blue channels define the amount of each color a pixel contains, the alpha channel defines the amount of transparency each pixel contains.

A pixel can be fully transparent, fully opaque or somewhere in between. By default, every video pixel is fully opaque.

NOTE: The reason we are able to key titles over backgrounds is that titles contain a built-in alpha channel that defines each character as opaque and the rest of the frame as transparent.
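
In math terms, this blending is the standard “over” operation. Here is a tiny Python sketch (illustration only) of how a single pixel composites:

  def over(fg, bg, alpha):
      """Composite a foreground pixel over a background pixel.
      alpha: 1.0 = opaque, 0.5 = translucent, 0.0 = transparent."""
      return fg * alpha + bg * (1.0 - alpha)

  # A white title pixel (1.0) over a dark background (0.2):
  print(over(1.0, 0.2, 1.0))   # 1.0 -- opaque: the title covers the background
  print(over(1.0, 0.2, 0.5))   # 0.6 -- translucent: the two blend
  print(over(1.0, 0.2, 0.0))   # 0.2 -- transparent: the background shows through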

To display the alpha channel in a clip, click the Wrench icon in the lower-right of the Program Monitor and select Alpha. To return to a standard image, select Composite.

While we can easily work with alpha channels inside Premiere, in order to export video that retains transparency information, we need to use the ProRes 4444 or Animation codecs. No other ProRes, HEVC or H.264 codec supports alpha channels.


… for Codecs & Media

Tip #733: How Much Resolution is Too Much?

Larry Jordan – LarryJordan.com

The eye sees angles, not pixels.

At a normal viewing distance for a well-exposed and focused image, HD, UHD and 8K look the same.


This article, written by Phil Plait in 2010 to discuss how the human eye perceives image resolution, first appeared on Discover Magazine’s website. The entire article is worth reading. Here are the highlights.

As it happens, I know a thing or two about resolution, having spent a few years calibrating a camera on board Hubble, the space telescope.

The ability to see two sources very close together is called resolution. It’s measured as an angle, like in degrees. For example, the Hubble Space Telescope has a resolution of about 0.00003 degrees. That’s a tiny angle!

Since we measure resolution as an angle, we can translate that into a separation in, say, inches at a certain distance. A 1-foot ruler at a distance of about 57 feet (19 yards) would appear to be 1 degree across (about twice the size of the full Moon). If your eyes had a resolution of 1 degree, then the ruler would just appear to you as a dot.

What is the resolution of a human eye, then? Well, it varies from person to person, of course. If you had perfect vision, your resolution would be about 0.6 arcminutes, where there are 60 arcmin to a degree (for comparison, the full Moon on the sky is about 1/2 a degree or 30 arcmin across).

To reuse the ruler example above, and using 0.6 arcmin for the eye’s resolution, the 1-foot ruler would have to be 5730 feet (1.1 miles) away to appear as a dot to your eye. Anything closer and you’d see it as elongated (what astronomers call “an extended object”), and farther away it’s a dot. In other words, more than that distance and it’s unresolved, closer than that and it’s resolved.

This is true for any object: if it’s more than 5730 times its own length away from you, it’s a dot. A quarter is about an inch across. If it were more than 5730 inches away, it would look like a dot to your eye.

But most of us don’t have perfect eyesight. A better number for a typical person is more like 1 arcmin resolution, not 0.6. In fact, Wikipedia lists 20/20 vision as being 1 arcmin, so there you go.
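
That 5730× rule falls straight out of the trigonometry. Here is a short Python sketch of the article’s math (my illustration):

  import math

  def dot_distance(size, resolution_arcmin):
      """Distance beyond which an object of a given size is unresolved (a 'dot')."""
      angle = math.radians(resolution_arcmin / 60.0)   # arcminutes -> radians
      return size / math.tan(angle)

  print(round(dot_distance(1.0, 0.6)))   # perfect vision: a 1-foot ruler at ~5,730 feet
  print(round(dot_distance(1.0, 1.0)))   # typical 20/20: ~3,438 feet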

[Phil then summarizes:] The iPhone 4 has a resolution of 326 ppi (pixels per inch). …The density of pixels in the iPhone 4 [when viewed at a distance of 12 inches] is safely higher than can be resolved by the normal eye, but lower than what can be resolved by someone with perfect vision.

LARRY’S EDITORIAL COMMENT

There’s a lot of discussion today about the value of 8K images. Current research shows that we need to sit within 7 feet (220 cm) of a 55″ HD image to see individual pixels. That converts to 1.8 feet to see the individual pixels in a UHD image, and 5 inches to see individual pixels in an 8K image on a 55″ monitor.

Any distance farther and individual pixels can’t be distinguished.


… for Codecs & Media

Tip #701: How to Export an Alpha Channel

Larry Jordan – LarryJordan.com

Alpha channels are not supported in H.264 or HEVC media.


The alpha channel determines transparency in a clip. However, highly-compressed delivery codecs, such as H.264 and HEVC, do not support alpha channels. Why? Because including the alpha channel makes a file really big!
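
Some quick back-of-the-envelope Python arithmetic shows why (uncompressed, full-raster numbers; an illustration that ignores chroma subsampling and compression):

  width, height, bit_depth = 1920, 1080, 10

  rgb_bytes  = width * height * 3 * bit_depth / 8   # three channels: ~7.8 MB per frame
  rgba_bytes = width * height * 4 * bit_depth / 8   # add alpha: ~10.4 MB per frame

  print(rgba_bytes / rgb_bytes)   # 1.33 -- a third more data on every single frame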

Here, courtesy of RocketStock.com, is a list of video codecs and image formats that support alpha channels.

Video Codecs and Image Formats with Alpha Channels

  • Apple Animation
  • Apple ProRes 4444
  • Avid DNxHD
  • Avid DNxHR
  • Avid Meridien
  • Cineon
  • DPX
  • GoPro Cineform
  • Maya IFF
  • OpenEXR Sequence With Alpha
  • PNG Sequence With Alpha
  • Targa
  • TIFF

Be sure to test your codec before committing to a project. Not all versions of DNx or GoPro Cineform support alpha channels.


… for Codecs & Media

Tip #702: Is GoPro Cineform Still Useful?

Larry Jordan – LarryJordan.com

GoPro Cineform is available for free for both Mac and Windows.


When GoPro canceled GoPro Studio a while back, it became more difficult to convert GoPro footage into a format that can be easily edited.

This article, from David Coleman Photography, describes how to convert and play GoPro footage.

While GoPro Studio is no more, you can download the codecs themselves from the GoPro-Cineform decoder page. There you’ll find versions for Mac and Windows. In the case of the Mac version, it’s still called NeoPlayer, which is its old name.


… for Apple Final Cut Pro X

Tip #696: What Does the Alpha Channel Show?

Larry Jordan – LarryJordan.com

Alpha channels define the amount of translucency for each pixel.

When viewing alpha channels, black is transparent, gray is translucent and white is opaque.


Just as the red, green and blue channels define the amount of each color a pixel contains, the alpha channel defines the amount of transparency each pixel contains.

A pixel can be fully transparent, fully opaque or somewhere in between. By default, every video pixel is fully opaque.

NOTE: The reason we are able to key titles over backgrounds is that titles contain a built-in alpha channel that defines each character as opaque and the rest of the frame as transparent.

Use either the View menu at the top right corner of the Viewer, or View > Show in Viewer > Color Channels > Alpha, to display the alpha channel for whichever clip contains the playhead (or skimmer).

While we can easily work with alpha channels inside Final Cut, in order to export video that retains transparency information, we need to use the ProRes 4444 or Animation codecs. No other ProRes, HEVC or H.264 codec supports alpha channels.

EXTRA CREDIT

The Event Viewer also supports displaying alpha channels.