… for Codecs & Media

Tip #1046: For HDR, Shadows are More Important

Larry Jordan – LarryJordan.com

Shadow detail is more important to perception than highlights – as both SDR and HDR reflect.

Image courtesy of Venera Technologies.
In both SDR and HDR, 50% of all grayscale values represent the shadows.


In earlier tips (#1043 and #1049) we compared differences in grayscale values between SDR and HDR. What I discovered during this research is how important shadow detail is for both SDR and HDR.

NOTE: The screen shot and the information in this article are taken from a Venera Technologies article.

Human beings are more sensitive to changes in darker regions than to changes in brighter regions. HDR systems exploit this property by providing more granularity (detail) in darker regions than in brighter regions. The screenshot shows that the light level range in darker regions is represented by a larger signal value range than the brighter regions – meaning more detail in the shadows.

While grayscale values are more evenly distributed for Rec. 709-based displays, they become less granular for HDR displays in the brighter regions. In the case of HLG, more than half of signal values are represented for light levels between 0-60 Nits while the remaining signal values span 60-1000 Nits. Similarly, in the case of PQ-based displays, approximately half of the signal values are represented for light levels between 0-40 Nits while the remaining half of the signal values are represented in a range of 40-1000 Nits.

In other words, for both HDR and SDR, half the total signal range is reserved for shadow values below 50 IRE; while, for HDR, the remaining highlight values are spread across light levels up to 10,000 nits!
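To see where figures like these come from, here is a quick sketch of the PQ encoding curve (the inverse EOTF defined in SMPTE ST 2084), using the constants published in the standard. The percentages below are my own computation, not Venera's, and the exact split depends on where you draw the shadow/highlight line:

```python
import math

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: light level in nits -> signal value in 0..1."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = nits / 10000.0                       # normalize to the 10,000-nit peak
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1.0 + c3 * y_m1)) ** m2

print(pq_encode(40))    # ~0.42: roughly 40% of the signal range covers 0-40 nits
print(pq_encode(100))   # ~0.51: half the signal range covers 0-100 nits
print(pq_encode(1000))  # ~0.75: only the top quarter covers 1,000-10,000 nits
```

Either way, the point stands: the bottom half of the PQ signal range is spent on light levels a hundred times dimmer than the format's peak.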



Tip #1049: HDR HLG vs PQ on SDR Monitors

Larry Jordan – LarryJordan.com

HLG looks better on SDR than PQ. But PQ looks better on HDR monitors.

Image courtesy of Venera Technologies.
HLG looks better on SDR monitors, but PQ has more detail in the highlights.


Tip #1043 compared the grayscale differences between HDR HLG and SDR. This tip illustrates the differences between watching HLG and PQ on an SDR monitor.

NOTE: The screen shot and the information in this article are taken from a Venera Technologies article.

To display digital images on screen, display devices need to convert pixel values to corresponding light values. This conversion is usually non-linear and is described by an EOTF (Electro-Optical Transfer Function).

While SDR uses Rec. 709, HDR defines two additional transfer functions to handle this issue – Perceptual Quantizer (PQ) and Hybrid Log-Gamma (HLG). HDR PQ is an absolute, display-referred signal, while HDR HLG is a relative, scene-referred signal. This means that HLG-enabled display devices automatically adapt light levels based on the content and their own display capabilities, while PQ-enabled display devices need to implement tone mapping to adapt light levels.
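The "hybrid" in Hybrid Log-Gamma is visible in its OETF: below 1/12 of peak scene light it follows a square-root (gamma-like) curve that behaves much like SDR, and above that it switches to a logarithmic curve for the highlights. A minimal sketch using the BT.2100 constants:

```python
import math

def hlg_oetf(e: float) -> float:
    """BT.2100 HLG OETF: scene-linear light (0..1) -> signal value (0..1)."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # gamma-like segment, close to SDR behavior
    return a * math.log(12.0 * e - b) + c  # log segment, compresses the highlights

# The two segments meet at a signal value of 0.5 -- the bottom half of the
# signal range behaves like a conventional gamma curve, which is why HLG
# degrades gracefully on SDR monitors.
```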

Under ideal conditions, a dynamic PQ-based transformation will achieve the best quality, at the cost of compatibility with existing display systems.

As you can see from the screen shot, HLG images look better on SDR monitors than PQ images. However, while PQ-based transforms promise the best quality on HDR-enabled monitors, unlike HLG, PQ requires proper tone mapping by the display device.

EXTRA CREDIT

As you may be able to see in the screenshot, PQ offers more detail in the highlights than HLG.



Tip #1025: RAW vs. JPEG: Which is Better?

Larry Jordan – LarryJordan.com

RAW is always better, but takes more time to get the image to look right.

(Image courtesy of NightSkyPix.com)
This illustrates what happens to a JPEG image when saved multiple times.


The folks at NightSkyPix.com looked at RAW vs. JPEG from the point of view of astrophotography. However, this also applies to more down-to-earth video shooting.

This is an excerpt.

Loosely speaking, a RAW image is the digital equivalent of a film negative. In reality, a RAW file is not an image that can be viewed with typical software; it must first be developed using RAW editors such as Adobe Camera Raw.

The JPEG image format is arguably the most common standard format for digital images, and the name stands for “Joint Photographic Experts Group”.

The JPEG format uses lossy compression to create an image file that is both lightweight and readily usable with any software and device able to display graphics.
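To illustrate what "lossy" means, here is a toy sketch of the JPEG idea in one dimension: transform pixels with a DCT, quantize the coefficients coarsely, and transform back. The sample values and quantization step are arbitrary; real JPEG works on 8x8 blocks with per-frequency quantization tables:

```python
import math

def dct(block):
    """Orthonormal 1-D DCT-II, the transform at the heart of JPEG."""
    n = len(block)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i, x in enumerate(block))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def idct(coeffs):
    """Inverse of the orthonormal DCT-II above."""
    n = len(coeffs)
    out = []
    for i in range(n):
        s = sum((math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)) * c *
                math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for k, c in enumerate(coeffs))
        out.append(s)
    return out

pixels = [52, 55, 61, 66, 70, 61, 64, 73]   # one row of image data
step = 10                                    # coarse quantization step
quantized = [round(c / step) * step for c in dct(pixels)]
restored = idct(quantized)
# The restored pixels no longer match the originals -- that data is gone for good,
# which is why each re-save of a JPEG degrades the image a little more.
error = max(abs(p - r) for p, r in zip(pixels, restored))
```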

JPEG is easier to use and view, but RAW is the better choice.

The article provides additional details, pros and cons, and illustrates these ideas with screen shots. It’s worth spending time reading.

EXTRA CREDIT

For us video folks, JPEG is similar to H.264, and RAW is similar to raw or log files.



Tip #1029: HDR Playback to TV Screens

Larry Jordan – LarryJordan.com

HEVC was designed with HDR in mind. However, compression times are very slow.

(Image courtesy of Pexels.com.)


Stephen asks: I have been filming home movies for more than 60 years. I try to get the best output I can and future-proof my movies. Earlier this year my wife and I did an Antarctic cruise and managed to avoid COVID-19. I filmed this in HLG with a Sony PXW-Z90.

I want to produce a deliverable HLG movie that I can watch on my HDR TV, an LG OLED. What format should I use?

Larry replies: You’ll need to use 10-bit HEVC. There are two flavors of HEVC: 8-bit and 10-bit. You MUST use the 10-bit version; 8-bit codecs, whether HEVC or H.264, do not support HDR media. (ProRes, though 10-bit and great for editing, is not a supported playback format for televisions.)

Apple Compressor, among other software, compresses into HEVC. However, on older computers, 10-bit HEVC will take a LOOOONNNGGGG time to compress. Probably 10 hours for a one-hour program. So, do a short test to verify this format will work on your TV, then allow enough time to compress the longer version.
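If you prefer a command-line tool, ffmpeg can also produce 10-bit HEVC tagged as HLG. Treat this as a sketch, not a tested recipe: the file names and CRF value are placeholders, and it assumes an ffmpeg build compiled with libx265:

```shell
# Encode 10-bit HEVC, flagged as Rec. 2020 / HLG, in a TV-friendly container.
ffmpeg -i input.mov \
  -c:v libx265 -preset medium -crf 20 \
  -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc arib-std-b67 -colorspace bt2020nc \
  -tag:v hvc1 \
  -c:a aac -b:a 256k \
  output_hlg.mp4
```

The `arib-std-b67` transfer flag is ffmpeg's name for HLG, and the `hvc1` tag improves compatibility with Apple players. As with Compressor, test a short clip on your TV before committing to a long encode.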

Newer computers use hardware acceleration for 8-bit HEVC, which speeds things up a lot. However, I don’t know of any Mac hardware that accelerates 10-bit HEVC. I expect that to change in the near future.



Tip #1030: What’s the Difference: CAT5e, 6 or 6e?

Larry Jordan – LarryJordan.com

The type of cabling determines maximum network speed and distance.

(Image courtesy of Pexels.com.)


I want to upgrade my office network to 10 gigabit Ethernet. But that requires replacing my Category 5e cables with either Cat6 or 6e. That got me wondering: What’s the difference?

According to Black Box: “Cat5e, also known as Category 5e or Category 5 Enhanced, is a network cable standard ratified in 1999. Cat5e cables are typically 24-gauge twisted pair wires, which can support Gigabit networks at segment distances up to 100 meters.

Cat6 came out only a few years after Cat5e. Cat6 is a standardised twisted pair cable for Ethernet that is backward compatible with Cat5/5e and CAT 3 cable standards.

Like Cat5e, Cat6 cables support Gigabit Ethernet segments up to 100 m, but they also allow for use in 10-Gigabit networks over a limited distance. At the beginning of this century, Cat5e typically ran to the workstations, whereas Cat6 was used as the backbone infrastructure from router to switches.

The main difference between Cat5e and Cat6 cable lies in the bandwidth the cable can support for data transfer. Cat6 cables are designed for operating frequencies up to 250 MHz, compared to 100 MHz for Cat5e. This means that a Cat6 cable can process more data at the same time. Think of it as the difference between a 2- and a 4-lane highway. On both you can drive the same speed, but a 4-lane highway can handle much more traffic at the same time.

Because Cat6 cables perform up to 250 MHz, which is more than twice that of Cat5e cables (100 MHz), they offer speeds up to 10GBASE-T or 10-Gigabit Ethernet, whereas Cat5e cables support up to 1000BASE-T or 1-Gigabit Ethernet.”

1 Gigabit Ethernet supports cables up to 100 meters. 10 Gigabit Ethernet on Cat6 cable limits distance to 55 meters.

EXTRA CREDIT

A newer version of Cat6 is Cat6e (also called “CAT6A”). According to TrueCABLE:

  1. Cat6A cable is made and terminated to tighter tolerances than Cat6. This means the copper conductors are twisted tighter, which requires higher-specification patch panels, wall jacks, and RJ45 connectors.
  2. Cat6A is rated to at least 500 MHz. This allows 10 Gbps (gigabits per second) up to 328 feet (100 meters). Cat6 is rated to 250 MHz, so it only supports 10 Gbps to 165 feet (55 meters) under ideal conditions; less in heavy crosstalk environments.
  3. Cat6A cable often uses thicker copper conductors and jackets. This makes installation more difficult and drives up the price.
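The figures above can be summarized as a small lookup table. This is a sketch based on the numbers quoted in this tip; real-world reach also depends on crosstalk and termination quality:

```python
# Cable category -> (rated bandwidth in MHz, max Ethernet speed in Gbps,
#                    reach at that max speed in meters), per the text above.
CABLE_SPECS = {
    "Cat5e": (100, 1, 100),
    "Cat6": (250, 10, 55),
    "Cat6A": (500, 10, 100),
}

def reach_at_10g(category: str) -> int:
    """Meters of usable run for 10-Gigabit Ethernet, or 0 if unsupported."""
    mhz, gbps, meters = CABLE_SPECS[category]
    return meters if gbps >= 10 else 0
```

So for a 10-gigabit office network, Cat6 works only for short runs; Cat6A covers the full 100 meters.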


Tip #1013: The Coming File Size Tsunami

Larry Jordan – LarryJordan.com

The impact of increasing frame size AND frame rate requires exponential growth in storage.

Data based on ProRes 422, numbers provided by Apple ProRes White Paper.


Last week, I presented a webinar on media management that applied to both Premiere and Final Cut. During the presentation I discussed the impact frame size, frame rate, and bit depth have on the file size of our media files.

The world of 8K images is coming – whether we like it or not; though, personally, I’m not looking forward to it. But, the impact of 8K on our storage capacity and bandwidth is dramatic!

As I was presenting, I realized I was missing a chart that showed the result of both frame size AND frame rate increasing. So, here it is. As this chart shows, as both frame size and frame rate increase, we see an exponential growth in file size and bandwidth.

  • 720p24 uses 23 GB/hour
  • 1080p30 uses 66 GB/hour
  • UHD/30 uses 530 GB/hour
  • 8K/60 uses 2,263 GB/hour
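As a sanity check, the 66 GB/hour figure follows directly from ProRes 422's roughly 147 Mbps target rate for 1080p30 (per Apple's white paper). Here is the arithmetic as a pair of helpers; note that the chart's larger formats don't scale perfectly with raw pixel rate, because codec target rates vary by format:

```python
def gb_per_hour(mbps: float) -> float:
    """Convert a video bitrate in megabits/second to gigabytes/hour."""
    return mbps * 3600 / 8 / 1000   # seconds/hour, bits -> bytes, MB -> GB

def pixel_rate_ratio(w1, h1, fps1, w2, h2, fps2):
    """How many times more raw pixel data format 2 pushes than format 1."""
    return (w2 * h2 * fps2) / (w1 * h1 * fps1)

print(gb_per_hour(147))                               # ~66 GB/hour for 1080p30
print(pixel_rate_ratio(1280, 720, 24, 7680, 4320, 60))  # 8K/60 pushes 90x the pixels of 720p24
```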

This means that as you plan future projects, make sure your storage system has the capacity to handle them!



Tip #1014: Premiere & Avid Now Collaborate

Larry Jordan – LarryJordan.com

MediaCentral extends collaboration from Avid to Adobe video editors.

The Avid logo.


With the June 2020 update to Avid MediaCentral, Premiere Pro editors can now connect and collaborate with Avid Media Composer editors—no matter where they are located.

MediaCentral is Avid hardware and software that provides the remote collaboration, media management, and integration large post-production facilities and in-house post teams need to prep, complete, and deliver projects on time and on budget.

The June update provides a dedicated MediaCentral pane for Adobe Premiere, making it easy to browse, search, locate, and access clips and sequences across MediaCentral databases for editing—without leaving Premiere. And the built-in chat enables you to communicate with other editors and collaborators across the platform—on premises or remotely.

Existing MediaCentral users can now connect Premiere Pro editors into their production environment. Premiere users get the same level of access and real-time collaboration power as Avid editors. MediaCentral also provides more tool flexibility to fit specific needs and budgets. In addition, Premiere editors can access rundowns and scripts for news editing linked to stories.


Tip #1015: Media Planning Guidelines

Larry Jordan – LarryJordan.com

Guidelines to plan and use media more efficiently.


As you plan your next project, here are some media guidelines to help you think about your media and the storage necessary to support smooth playback and editing:

  • If deadlines are extremely tight AND you are not adding a lot of effects, you can edit H.264 or HEVC directly in your NLE. Otherwise, transcode all highly-compressed media into an easier-to-edit intermediate format, such as ProRes, DNx or GoPro Cineform.
  • Always shoot the frame rate you need to deliver. Changing frame rates after production is complete almost always looks “jittery.”
  • Image quality is not lost in transcoding (converting) a highly-compressed video format into ProRes.
  • If the media was shot by a camera, transcode into ProRes 422.
  • If the media was created on a computer, transcode into ProRes 4444.
  • If the media was shot in log or raw formats, edit it natively and do the rough cut using proxies.
  • Proxies are your friend. Use proxies to create a rough cut when working with HDR or raw media, or with frame sizes larger than 4K.
  • Color grading high-quality 4K HDR media can require over 500 MB/second of data bandwidth! Make sure your storage is fast enough.
  • Always have a reserve budget for more high-performance storage. You’ll need it.
  • Always allow time to test your entire workflow from capture to final output before starting production. It is much easier to find and fix problems when not staring at a looming deadline. “I didn’t have time to test!” is never a good excuse.

Yes, there are exceptions to these rules, but not in most cases.
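For the bandwidth guideline in the list above, the conversion is simple: divide a bitrate in megabits per second by 8 to get megabytes per second. A tiny helper (the 4,000 Mbps stream in the example is hypothetical, chosen to land at the 500 MB/second figure):

```python
def required_mb_per_s(mbps: float) -> float:
    """Storage bandwidth (MB/s) needed to sustain a stream of the given bitrate."""
    return mbps / 8   # 8 bits per byte

# A hypothetical 4,000 Mbps 4K HDR grading stream needs 500 MB/s of sustained
# read speed -- beyond most single spinning disks, so plan for RAID or SSD.
print(required_mb_per_s(4000))
```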

EXTRA CREDIT

Here’s an article I wrote that goes into more detail for each of these.



Tip #991: HDV vs. “Normal” Media

Larry Jordan – LarryJordan.com

Make sure your projects match the aspect ratio of the pixels, as well as the frame.

An example of rectangular pixels (2:1). Image courtesy of Wikipedia.


The big difference between HDV and “normal” media is the aspect ratio of each pixel. Just today, I got an email from a reader asking why their footage looked “stretched.” Here’s what you need to know.

Back in the old days, as we were making the transition from standard definition video to HD, cameras and storage devices were neither big enough nor fast enough to capture and record a full HD video stream.

To solve this problem, Sony and Panasonic created HD formats with non-square pixels. This meant that they could record fewer pixels, then stretch them horizontally in the final display so that fewer pixels would fill more space.

For example, while HDV records 720p footage using square pixels, for 1080-line footage, instead of recording an image at 1920 x 1080, it records 1440 x 1080, then stretches each pixel’s width on playback so that only 1440 pixels fill the space of 1920.

The problem this causes in editing is that, today, NLEs expect HD pixels to be square. When you work with older footage, if your image looks squished, check your pixel aspect ratio – or your project settings – to make sure you are compensating for these earlier rectangular pixels.
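The stretch itself is simple arithmetic: multiply the stored width by the pixel aspect ratio (PAR). A sketch, using the standard 4:3 PAR for 1440 x 1080 HDV:

```python
from fractions import Fraction

def display_width(stored_width: int, par: Fraction) -> int:
    """Width in square pixels after applying the pixel aspect ratio."""
    return int(stored_width * par)

# HDV 1080-line material: 1440 stored pixels, stretched by 4:3, fill a
# 1920-pixel-wide frame on playback.
print(display_width(1440, Fraction(4, 3)))
```

If your NLE instead treats those 1440 pixels as square, the image displays at 1440 wide – which is exactly the “squished” look described above.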

EXTRA CREDIT

Here’s a link from a few years ago that looks at HDV in more detail.



Tip #995: Apple Compressor Supports ProRes RAW

Larry Jordan – LarryJordan.com

Compressor supports ProRes RAW, making it useful on set for creating proxy files.

Video Inspector settings for ProRes RAW images in Apple Compressor.


In a recent update, Apple added support for ProRes RAW in Compressor. Apple ProRes RAW and Apple ProRes RAW HQ bring the same performance, quality, and ease of use introduced by Apple ProRes to raw media.

When you add a ProRes RAW clip to a Compressor Batch, new options appear in the Video Inspector (see screen shot).

You can use Compressor to:

  • Convert between color spaces
  • Convert RAW to a log file
  • Apply a camera LUT

EXTRA CREDIT

While the Compressor help files don’t provide any detailed help on these settings, you can learn more about ProRes RAW here.

