… for Codecs & Media

Tip #1043: Comparing HDR Grayscale to SDR

Larry Jordan – LarryJordan.com

HDR HLG matches SDR shadows and midtones, but adds more highlights.

(Graph courtesy of Wikipedia.org.)
This chart compares grayscale values between SDR and HDR HLG media.


As I was researching my recent webinar on “New Features in Adobe Premiere Pro,” I came across an interesting graph that compares HDR HLG grayscale values with SDR.

We are all familiar with the grayscale values in SDR (Standard Dynamic Range) media. It’s the Rec. 709 HD footage we work with on a daily basis.

While HDR consists of more than simply brighter pixels, grayscale is the relevant concept here. HDR has two formats: HLG and PQ. HLG (Hybrid Log Gamma) is optimized for broadcast, while PQ is optimized for digital display. Both Final Cut and Premiere support HLG media. But what does it mean to say “optimized for broadcast”? That’s where this chart comes in.

SDR grayscale values are essentially linear, a “straight line” from 0 IRE (pure black) to 100 IRE (pure white). This range of 100 IRE values is what the entire broadcast signal path is designed to support.

HDR HLG mirrors the linear SDR grayscale values from 0 to roughly 75 IRE – though there is some variation between standards in different countries – then expresses the top 25% of highlights as log values, rather than linear.

This allows HDR HLG to carry much brighter highlights than SDR, yet still fit within a 100 IRE range. However, there’s a trade-off. While HDR HLG is compatible with broadcast, HDR PQ has more highlight detail. Both HDR formats are much brighter than SDR.
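The “linear below, log above” shape of the curve can be sketched in Python using the HLG opto-electrical transfer function from ITU-R BT.2100 (this is the published formula, not something from the tip itself; in BT.2100’s normalized form the knee sits at half the signal range, and the exact crossover point in broadcast practice depends on how the curve is scaled, per the variation the tip mentions):

```python
import math

# BT.2100 HLG OETF constants (values as published in the ITU-R spec)
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e (0..1) to an HLG signal value (0..1).

    Below the knee the curve is a square root (gamma-like, close to SDR);
    above it, highlights are compressed logarithmically.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(hlg_oetf(1 / 12))   # 0.5 — the knee lands at exactly half the signal
print(hlg_oetf(1.0))      # ~1.0 — peak scene light fills the signal range
```

The logarithmic upper segment is what lets a huge range of highlight brightness squeeze into the top portion of the signal.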

EXTRA CREDIT

If you are creating an HDR project, it is important to know what format your distributor supports BEFORE you do the color grade, because grading HLG and PQ is not the same and you cannot switch between them.



… for Codecs & Media

Tip #1046: For HDR, Shadows are More Important

Larry Jordan – LarryJordan.com

Shadow detail is more important to perception than highlights – as both SDR and HDR reflect.

Image courtesy of Venera Technologies.
In both SDR and HDR, roughly 50% of all grayscale values are devoted to the shadows.


In earlier tips (#1043 and #1049) we compared differences in grayscale values between SDR and HDR. What I discovered during this research is how important shadow detail is for both SDR and HDR.

NOTE: The screen shot and the information in this article are taken from a Venera Technologies article.

Human beings are more sensitive to changes in darker regions than to changes in brighter regions. HDR systems exploit this property by providing more granularity (detail) in darker regions than in brighter regions. The screen shot shows that the light level range in darker regions is represented by a larger signal value range than in the brighter regions – meaning more detail in the shadows.

While grayscale values are more evenly distributed for Rec. 709-based displays, they become less granular for HDR displays in the brighter regions. In the case of HLG, more than half of signal values are represented for light levels between 0-60 Nits while the remaining signal values span 60-1000 Nits. Similarly, in the case of PQ-based displays, approximately half of the signal values are represented for light levels between 0-40 Nits while the remaining half of the signal values are represented in a range of 40-1000 Nits.

In other words, for both HDR and SDR, half the total signal range is reserved for shadow values (below 50 IRE); while, for HDR PQ, the remaining half is spread across highlights all the way up to 10,000 nits!
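This shadow-heavy allocation can be checked directly against the PQ curve. The following sketch uses the inverse EOTF from SMPTE ST 2084 (the published formula, not code from the article) to ask what fraction of the signal range a given light level consumes:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute light level (nits) -> signal (0..1).
# Constants are the exact rationals from the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_signal(nits):
    """Fraction of the PQ signal range used by light levels up to `nits`."""
    y = nits / 10000.0          # PQ is defined up to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# The bottom 40 nits consume over 40% of the signal range, while
# everything from there to 10,000 nits shares what remains.
print(round(pq_signal(40), 3))
print(round(pq_signal(10000), 3))  # 1.0
```

Running this confirms the article’s point: the darkest few dozen nits claim nearly half of all PQ code values.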


… for Codecs & Media

Tip #1049: HDR HLG vs PQ on SDR Monitors

Larry Jordan – LarryJordan.com

HLG looks better on SDR than PQ. But PQ looks better on HDR monitors.

Image courtesy of Venera Technologies.
HLG looks better on SDR monitors, but PQ has more detail in the highlights.


Tip #1043 compared the grayscale differences between HDR HLG and SDR. This tip illustrates the differences between watching HLG and PQ on an SDR monitor.

NOTE: The screen shot and the information in this article are taken from a Venera Technologies article.

To display digital images on screen, display devices need to convert pixel values into corresponding light values. This conversion is usually non-linear and is described by an EOTF (Electro-Optical Transfer Function).

While SDR uses Rec. 709, HDR defines two additional transfer functions to handle this issue – Perceptual Quantizer (PQ) and Hybrid Log-Gamma (HLG). HDR PQ is an absolute, display-referred signal, while HDR HLG is a relative, scene-referred signal. This means that HLG-enabled display devices automatically adapt light levels based on the content and their own display capabilities, while PQ-enabled display devices need to implement tone mapping to adapt the light levels.

Under ideal conditions, dynamic PQ-based transformation achieves the best quality, at the cost of compatibility with existing display systems.

As you can see from the screen shot, HLG images look better on SDR monitors than PQ images. However, while PQ-based transforms promise the best quality on HDR-enabled monitors, in comparison to HLG, PQ requires proper tone mapping by the display device.
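Tone mapping itself is display-specific, but the basic idea can be sketched with a toy rolloff curve. This is a simple Reinhard-style operator for illustration only, not any standard’s actual algorithm; the peak values are hypothetical:

```python
def tone_map(nits, display_peak=100.0, source_peak=1000.0):
    """Toy Reinhard-style rolloff: compress 0..source_peak nits into
    0..display_peak nits. Shadows pass through almost unchanged;
    highlights are squeezed progressively harder.
    """
    x = nits / source_peak
    # x/(1+x) is ~linear near 0 and saturates toward 1; the factor of 2
    # scales it so the source peak lands exactly on the display peak.
    rolled = x / (1.0 + x) * 2.0
    return min(rolled, 1.0) * display_peak

for level in (10, 100, 500, 1000):
    print(level, "nits ->", round(tone_map(level), 1), "nits on SDR")
```

Notice how the curve gives up more and more output range as input brightness climbs – exactly the compromise a PQ-capable display must make when its hardware can’t reach the mastered peak.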

EXTRA CREDIT

As you may be able to see in the screenshot, PQ offers more detail in the highlights than HLG.


… for Apple Final Cut Pro X

Tip #1040: New! Stabilize 360° Video

Larry Jordan – LarryJordan.com

360° Video stabilization is a single button – nothing to adjust.

The Stabilization checkbox in the Video Inspector.


New with the 10.4.9 update is the ability to stabilize 360° video, which involves clicking a single button – there’s nothing to adjust.

To stabilize your footage, select it in the timeline (you can’t do this in the browser), then go to the Video Inspector and check the Stabilization checkbox.

Done.

EXTRA CREDIT

Unlike normal film, 360° video can easily cause motion sickness, especially when an audience member is wearing a headset.

The best way to shoot 360° video is to use a tripod. For those situations where you can’t, stabilizing footage is essential.


… for Codecs & Media

Tip #1025: RAW vs. JPEG: Which is Better?

Larry Jordan – LarryJordan.com

RAW is always better, but takes more time to get the image to look right.

(Image courtesy of NightSkyPix.com)
This illustrates what happens to a JPEG image when saved multiple times.


The folks at NightSkyPix.com looked at RAW vs. JPEG from the point of view of astrophotography. However, this also applies to more down-to-earth video shooting.

This is an excerpt.

Loosely speaking, a RAW image is the digital equivalent of a film negative. In reality, a RAW file is not an image that can be viewed with ordinary software; it must first be developed using RAW editors such as Adobe Camera Raw.

The JPEG image format is arguably the most common standard format for digital images, and the name stands for “Joint Photographic Expert Group”.

The JPEG format uses lossy compression to create an image file that is both lightweight and readily usable with any software and device able to display graphics.
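Why does lossy saving hurt? Each save rounds the image data to coarser steps, and rounding cannot be undone. Here is a toy Python model of that idea – the step sizes are hypothetical, and real JPEG quantizes DCT frequency coefficients rather than raw pixel values:

```python
def quantize(values, step):
    """Round each value to the nearest multiple of `step` —
    a stand-in for the coarse quantization a lossy encoder applies."""
    return [round(v / step) * step for v in values]

# A smooth ramp of 64 distinct tonal values...
ramp = list(range(0, 4096, 64))
saved_once = quantize(ramp, 300)

# ...collapses to far fewer distinct values after one lossy "save",
# and the original detail cannot be recovered from the result.
print(len(set(ramp)), "->", len(set(saved_once)))
```

Re-quantizing with the same step is lossless, which is why generational loss is worst when each save uses different settings or the pixels have been edited in between.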

JPEG is easier to use and view, but RAW is the better choice.

The article provides additional details, pros and cons, and illustrates these ideas with screen shots. It’s worth spending time reading.

EXTRA CREDIT

For us video folks, JPEG is similar to H.264, and RAW is similar to raw or log files.


… for Codecs & Media

Tip #1029: HDR Playback to TV Screens

Larry Jordan – LarryJordan.com

HEVC was designed with HDR in mind. However, compression times are very slow.

(Image courtesy of Pexels.com.)


Stephen asks: I have been filming home movies for more than 60 years. I try to get the best output I can and future-proof my movies. Earlier this year my wife and I did an Antarctic cruise and managed to avoid COVID-19. I filmed this in HLG with a Sony PXW-Z90.

I want to produce a deliverable HLG movie that I can watch on my HDR TV, an LG OLED. What format should I use?

Larry replies: You’ll need to use 10-bit HEVC. There are two flavors of HEVC: 8-bit and 10-bit. You MUST use the 10-bit version; 8-bit codecs, whether HEVC or H.264, do not support HDR media. (ProRes, though 10-bit and great for editing, is not a supported playback format for televisions.)

Apple Compressor, among other software, compresses into HEVC. However, on older computers, 10-bit HEVC will take a LOOOONNNGGGG time to compress. Probably 10 hours for a one-hour program. So, do a short test to verify this format will work on your TV, then allow enough time to compress the longer version.
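If you’d rather compress from the command line, ffmpeg can target the same format. This is a sketch, not a tested recipe: the filenames are placeholders, and you should verify the color flags against your footage’s metadata (arib-std-b67 is ffmpeg’s label for the HLG transfer characteristic):

```python
import shlex

# Hypothetical filenames; the color flags tag the output as BT.2020 + HLG.
cmd = [
    "ffmpeg", "-i", "input.mov",
    "-c:v", "libx265",             # 10-bit-capable HEVC encoder
    "-pix_fmt", "yuv420p10le",     # force 10-bit pixels — 8-bit can't carry HDR
    "-color_primaries", "bt2020",
    "-color_trc", "arib-std-b67",  # HLG transfer function
    "-colorspace", "bt2020nc",
    "-tag:v", "hvc1",              # helps Apple players recognize the HEVC track
    "-c:a", "aac",
    "output.mp4",
]
print(shlex.join(cmd))
```

Run a short clip through first – as noted above, a full-length 10-bit software encode can take many hours.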

Newer computers use hardware acceleration for 8-bit HEVC, which speeds things up a lot. However, I don’t know of any Mac hardware that accelerates 10-bit HEVC. I expect that to change in the near future.


… for Codecs & Media

Tip #1030: What’s the Difference: CAT5e, 6 or 6e?

Larry Jordan – LarryJordan.com

The type of cabling determines maximum network speed and distance.

(Image courtesy of Pexels.com.)


I want to upgrade my office network to 10 gigabit Ethernet. But that requires replacing my Category 5e cables with either Cat6 or 6e. That got me wondering: What’s the difference?

According to Black Box: “Cat5e, also known as Category 5e or Category 5 Enhanced, is a network cable standard ratified in 1999. Cat5e cables are typically 24-gauge twisted pair wires, which can support Gigabit networks at segment distances up to 100 meters.”

Cat6 came out only a few years after Cat5e. Cat6 is a standardized twisted-pair cable for Ethernet that is backward compatible with the Cat5/5e and Cat3 cable standards.

Like Cat5e, Cat6 cables support Gigabit Ethernet segments up to 100 m, but they also allow for use in 10-Gigabit networks over a limited distance. At the beginning of this century, Cat5e typically ran to the workstations, whereas Cat6 was used as the backbone infrastructure from router to switches.

The main difference between Cat5e and Cat6 cable lies in the bandwidth the cable can support for data transfer. Cat6 cables are designed for operating frequencies up to 250 MHz, compared to 100 MHz for Cat5e. This means a Cat6 cable can carry more data at the same time. Think of it as the difference between a 2-lane and a 4-lane highway. On both you can drive the same speed, but a 4-lane highway can handle much more traffic at once.

Because Cat6 cables perform up to 250 MHz – more than twice the 100 MHz of Cat5e – they offer speeds up to 10GBASE-T (10-Gigabit Ethernet), whereas Cat5e cables support up to 1000BASE-T (1-Gigabit Ethernet).

1 Gigabit Ethernet supports cables up to 100 meters. 10 Gigabit Ethernet on Cat6 cable limits distance to 55 meters.

EXTRA CREDIT

A newer version of Cat6 is Cat6e (also called “CAT6A”). According to TrueCABLE:

  1. Cat6A cable is made and terminated to tighter tolerances than Cat6. This means the copper conductors are twisted tighter, which requires higher-specification patch panels, wall jacks, and RJ45 connectors.
  2. Cat6A bandwidth is at least 500 MHz. This allows 10 Gbps (gigabits per second) up to 328 feet (100 meters). Cat6 bandwidth is 250 MHz, so it only supports 10 Gbps to 180 feet (55 meters) under ideal conditions – less in heavy crosstalk environments.
  3. Cat6A cable often uses thicker copper conductors and jackets. This makes installation more difficult and drives up the price.
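The figures above, gathered into one place for quick comparison (these are the numbers as quoted in this tip – the TIA category specifications are the real authority):

```python
# Bandwidth, top Ethernet speed, and max segment length at that speed,
# per the figures quoted in this tip.
CABLE_SPECS = {
    "Cat5e": {"bandwidth_mhz": 100, "max_speed": "1 Gbps",  "distance_m": 100},
    "Cat6":  {"bandwidth_mhz": 250, "max_speed": "10 Gbps", "distance_m": 55},
    "Cat6A": {"bandwidth_mhz": 500, "max_speed": "10 Gbps", "distance_m": 100},
}

def describe(category):
    s = CABLE_SPECS[category]
    return (f"{category}: {s['bandwidth_mhz']} MHz, up to {s['max_speed']} "
            f"at segment lengths up to {s['distance_m']} m")

for cat in CABLE_SPECS:
    print(describe(cat))
```

The takeaway for a 10-gigabit office upgrade: Cat6 works only for short runs, while Cat6A covers the full 100-meter segment length.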

… for Apple Final Cut Pro X

Tip #1018: New! Adjust ISO for ProRes RAW

Larry Jordan – LarryJordan.com

New ISO and white point settings are now available for ProRes RAW in FCP X 10.4.9.

Info Inspector > Settings. These new settings (red arrows) only appear for ProRes RAW.


A new feature in the 10.4.9 update to Final Cut Pro X is the ability to adjust ISO and, for some cameras, the white point. Apple now supports changing the ISO setting (essentially, video gain) and white point for ProRes RAW media when edited natively.

NOTE: These settings only appear for ProRes RAW media and don’t appear when FCP X is in proxy mode.

To access these, select a ProRes RAW clip in the timeline (not the browser), then go to the Info Inspector and switch the menu at the bottom left from Basic to Settings. The red arrows in the screen shot indicate the new settings with this update:

  • Camera ISO. The ISO setting at which the media was recorded.
  • ISO. A menu allowing you to change the ISO setting from 50 to 25,600.
  • Exposure Offset. This slider provides finer control in adjusting the ISO. The range is one stop lower to one stop higher.
  • Camera Color Temperature. The white point setting at which the video was recorded.
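Since ISO is essentially gain, it is easiest to reason about in stops – each stop doubles the light. A quick sketch of the arithmetic (illustrative only; this is not code from Final Cut):

```python
import math

def gain_in_stops(camera_iso, new_iso):
    """How many stops of gain a new ISO setting adds (negative = reduces)."""
    return math.log2(new_iso / camera_iso)

print(gain_in_stops(800, 1600))           # 1.0  — one stop brighter
print(gain_in_stops(800, 400))            # -1.0 — one stop darker
print(round(gain_in_stops(50, 25600), 1)) # 9.0  — the full span of the ISO menu
```

The Exposure Offset slider then fine-tunes within ±1 stop on top of whatever ISO you choose.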

EXTRA CREDIT

For some cameras, Final Cut also supports changing the white point. Here is the current list of cameras supporting these new features.


… for Random Weirdness

Tip #1006: NewBlue FX: Live-Streaming Software

Larry Jordan – LarryJordan.com

NewBlue Stream is software designed to simplify live streaming with advanced graphics.

(Screen shot courtesy of NewBlueFX.com)


The folks at NewBlueFX announced a new program specifically designed for live streaming, with graphics integration: NewBlue Stream.

Here’s how NewBlue describes it: “Our philosophy with NewBlue Stream is simple – make it as easy as possible to produce live broadcasts, give you tools to make them engaging and interactive, and do it in one elegant solution that’s priced right. The result is a lightweight streaming and broadcast solution paired with dynamic, data-driven graphics that you won’t find anywhere else.”

NewBlue is a long-time effects developer – especially on Windows – with strong credentials for effects and titles.

“Cast stunning and technically sophisticated live video productions with multiple audio and video inputs, switching, and an unlimited number of programmable, data-driven, 3-D animated graphics, including lower thirds, crawls, motion bugs, transitions, titling, and more.”

The system provides image capture, content creation, and streaming, supporting Facebook Live, YouTube Live, Twitch, and any RTMP endpoint.

The software supports Windows and Mac. Pricing starts at $13.99 per month and a 14-day free trial is available.

EXTRA CREDIT

Here’s the link to learn more.


… for Visual Effects

Tip #1009: Getting Started with After Effects

Larry Jordan – LarryJordan.com

After Effects is intimidating. This article can help get you started.

(Image courtesy of PremiumBeat.com)


This article, written by Joe Frederick, first appeared in PremiumBeat.com. This is a summary.

Adobe After Effects is an exceptionally versatile piece of software. If you’re just starting out with it in 2020, here are five things to learn ASAP.

  1. Terminology. Before I started my motion graphics journey, my experience was limited to Final Cut Pro X. That meant I was faced with a whole new set of terms upon opening After Effects the first time, some of which were attached to features and concepts I knew by other names in other programs. For instance, a Project in FCPX is a Sequence in Premiere Pro is a Composition in AE.
  2. Keyframes. Keyframes mark the point in time where you specify a value for a layer property. Using them effectively is a linchpin of motion graphics work.
  3. Motion Blur. Motion blur is an absolute game-changer! When animating an asset, it’s important to prevent said asset’s movements from looking mechanical, twitchy, and, well, like it was slapped together in a computer program. That’s where motion blur comes in.
  4. Dynamic Link. If you’re planning to work with both Adobe Premiere Pro and After Effects on the same project, then you’re going to love this. Thanks to Dynamic Link, you can avoid all of that potentially confusing, definitely time-sucking nonsense and import your AE comps straight into Premiere.
  5. Pre-Composing. If you’re used to terms like “nesting” or “compound clip,” you’ll be familiar with creating pre-compositions. Simply select the layers of your choice, right-click, and select Create Pre-composition in order to put these layers into their own mini comp. They’ll now be represented in your main timeline by just one layer. If you double-click this layer, you can go into it to make changes that’ll now be visible when you return to your main composition.
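Keyframes (item 2 above) boil down to interpolating a property value between (time, value) pairs. Here is a minimal linear version in Python – After Effects layers easing curves and other interpolation modes on top of this basic idea:

```python
def value_at(keyframes, t):
    """Linearly interpolate a property value at time t from sorted
    (time, value) keyframes — the core idea behind keyframed animation."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]       # hold the first value before the first key
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]      # hold the last value after the last key
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)

# Animate opacity: fade in over frames 0-10, hold, fade out over frames 30-40
opacity = [(0, 0.0), (10, 100.0), (30, 100.0), (40, 0.0)]
print(value_at(opacity, 5))    # 50.0 — halfway through the fade-in
print(value_at(opacity, 35))   # 50.0 — halfway through the fade-out
```

Swapping the linear `frac` for an eased curve is what turns mechanical motion into the smooth animation motion blur then polishes further.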

The link above has additional tutorial videos, images and links for more information.