… for Codecs & Media

Tip #680: What is the Alpha Channel?

Larry Jordan – LarryJordan.com

The alpha channel defines transparency in a digital image or video.


The Ask Tim Grey website had a nice answer to this question.

An alpha channel is essentially any channel other than the channels that define color values for the pixels in an image. Generally, an alpha channel is used to define specific areas of a photo, most often areas of transparency.

In the context of a digital image, the term “channel” generally refers to the information about individual color values that comprise the overall pixel information. For example, with a typical RGB image there are three channels that individually define the red, green, and blue values for pixels.

So, an alpha channel is essentially a “map” that defines specific areas of the image. It is similar in many ways to a channel that defines color, but since it is used to define transparency or selection rather than defining color for an image, it needed a “special” name. The term “alpha channel” is the name that was given to this feature.
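
To make this concrete, here is a minimal Python sketch (my illustration, not from Tim’s article) using NumPy, in which a fourth alpha channel rides alongside the red, green, and blue channels and acts as the transparency “map” when compositing.

    import numpy as np

    # A tiny 2 x 2 RGBA image: three color channels plus one alpha channel.
    # Alpha 255 = fully opaque, 0 = fully transparent.
    image = np.zeros((2, 2, 4), dtype=np.uint8)
    image[..., 0:3] = [255, 0, 0]   # every pixel is pure red (R, G, B)
    image[0, 0, 3] = 255            # top-left pixel: fully opaque
    image[0, 1, 3] = 128            # top-right pixel: roughly half transparent
    image[1, :, 3] = 0              # bottom row: fully transparent

    # Compositing over a white background uses alpha as the blend "map".
    background = np.full((2, 2, 3), 255, dtype=np.uint8)
    alpha = image[..., 3:4] / 255.0
    composite = (image[..., 0:3] * alpha + background * (1 - alpha)).astype(np.uint8)
    print(composite)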

EXTRA CREDIT

Here’s a link to learn more from Tim.



… for Codecs & Media

Tip #681: When Does Using High Sample Rates Make Sense?

Larry Jordan – LarryJordan.com

Most of the time, a 48 kHz sample rate is the best choice.

Image courtesy of Electro-Voice.
An Electro-Voice RE-20 microphone.


This is an excerpt from an informative article in Sound On Sound, written by Hugh Robjohns.

Higher sample rates only provide a greater recorded bandwidth — there is no intrinsic quality improvement across the 20Hz‑20kHz region from faster sampling rates — and, in fact, jitter becomes a much more significant problem. So I would suggest that you forget 192kHz altogether unless you need to do specialist sound‑design work where you want to slow recorded high‑frequency sounds down dramatically.

The question of whether to use a 96kHz sample rate is less clear-cut, because it can prove useful in some specific situations. Yes, it creates larger files and higher processing loads, but it also removes the possibility of filtering artifacts in the audio band and reduces the system latency compared with lower rates. Many plug‑in effects automatically up‑sample internally to 96kHz when performing complex non‑linear processes such as the manipulation of dynamics.

EXTRA CREDIT

Note, though, that not all software is particularly good at sample‑rate conversion, with even some expensive and well regarded DAW software resulting in noticeable aliasing. You do, of course, need to judge results subjectively, but if you’re curious how well your software performs in this respect — or whether any free software performs this function any better — then check out Infinite Wave’s database at (src.infinitewave.ca) which compares results from a huge number of applications, and includes test files so you can perform your own tests too.

Larry adds: Most of the time, for video recording, a 48 kHz sample rate is the best choice.
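
To put rough numbers on the trade-off, here is a short Python calculation (my arithmetic, not from the Sound On Sound article) of the bandwidth each rate captures and the storage it costs for 24-bit stereo audio.

    # Illustrative arithmetic for 24-bit stereo recordings.
    BIT_DEPTH = 24
    CHANNELS = 2

    for rate in (48_000, 96_000, 192_000):
        nyquist_khz = rate / 2 / 1000                                  # highest frequency captured
        mb_per_min = rate * BIT_DEPTH * CHANNELS * 60 / 8 / 1_000_000  # uncompressed size
        print(f"{rate // 1000} kHz: captures up to ~{nyquist_khz:.0f} kHz, "
              f"~{mb_per_min:.1f} MB per minute")

At 48 kHz that works out to roughly 17 MB per minute; 192 kHz quadruples the storage for bandwidth well above the audible range.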



… for Codecs & Media

Tip #646: When Does Video Compression Use the GPU?

Larry Jordan – LarryJordan.com

Most editing codecs use the GPU; most web codecs do not.


One of the never-ending debates is how to configure the “best” computer. While this question is unanswerable in general, when it comes to video compression, here’s what you need to know.

CPUs, in general, provide linear calculations – one calculation after the other.

GPUs, in contrast, provide parallel calculations – multiple calculations occurring at the same time.

  • GOP-based codecs benefit most from linear – CPU – calculations, because each frame in a GOP depends on other frames in the group. If you need to create H.264 materials, the faster the CPU, the faster compression will complete.
  • I-frame codecs, on the other hand, benefit from the GPU because different frames can be calculated at the same time, then stitched together for the final movie.

This ability of I-frame codecs to use the GPU to accelerate render and export speeds is one of the reasons they are recommended for editing.
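
The distinction can be sketched in a few lines of Python (an illustration only, using CPU worker processes as a stand-in for a GPU’s many cores): self-contained frames can be handed to workers in parallel, while frames that depend on one another must be processed one after the other.

    from multiprocessing import Pool

    def encode_frame(frame):
        # Stand-in for compressing one self-contained (I-frame) picture.
        return sum(frame) % 256

    def encode_gop(frames):
        # GOP-style: each frame depends on the previous result,
        # so the work cannot be split across workers.
        prev, out = 0, []
        for frame in frames:
            prev = (prev + sum(frame)) % 256
            out.append(prev)
        return out

    if __name__ == "__main__":
        frames = [[i, i + 1, i + 2] for i in range(8)]

        # I-frame style: every frame is independent, so the frames can be
        # compressed in parallel and stitched together afterwards.
        with Pool(4) as pool:
            print(pool.map(encode_frame, frames))

        print(encode_gop(frames))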



… for Codecs & Media

Tip #647: What is FFmpeg?

Larry Jordan – LarryJordan.com

FFmpeg cannot be sold; it can only be given away.


FFmpeg is a free and open-source project consisting of a vast software suite of libraries and programs for handling video, audio, and other multimedia files and streams. At its core is the FFmpeg program itself, designed for command-line-based processing of video and audio files, and widely used for format transcoding, basic editing (trimming and concatenation), video scaling, video post-production effects, and standards compliance.

FFmpeg is able to decode, encode, transcode, mux, demux, stream, filter and play pretty much anything that humans and machines have created. It supports the most obscure ancient formats up to the cutting edge. It runs on Linux, Mac OS X, Microsoft Windows, the BSDs, Solaris, etc. under a wide variety of build environments, machine architectures, and configurations.

The FFmpeg project tries to provide the best technically possible solution for developers of applications and end users alike. Wherever the question of “best” cannot be answered, we support both options so the end user can choose.

FFmpeg is used by software such as VLC media player, xine, Cinelerra-GG video editor, Plex, Kodi, Blender, HandBrake, YouTube, and MPC-HC; it also handles video and audio playback in Google Chrome and in the Linux version of Firefox.

FFmpeg is free to use; however, it does not have a graphical user interface. GUI front-ends for FFmpeg have been developed, including XMedia Recode and ffWorks.
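
As a taste of the command-line use described above, here is a minimal transcode invoked from Python (the file names are hypothetical; -i, -c:v, -crf, and -c:a are standard FFmpeg options).

    import subprocess

    # Transcode a hypothetical source.mov to H.264 video and AAC audio in an MP4.
    # -i names the input, -c:v libx264 picks the video encoder,
    # -crf 20 sets a constant-quality level, -c:a aac re-encodes the audio.
    subprocess.run(
        ["ffmpeg", "-i", "source.mov",
         "-c:v", "libx264", "-crf", "20",
         "-c:a", "aac",
         "output.mp4"],
        check=True,
    )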



… for Codecs & Media

Tip #650: What is a Raw file format?

Larry Jordan – LarryJordan.com

Raw is not an acronym; it simply means “unprocessed.”


Raw is an image and video file format used by many high-end and professional digital cameras. Raw files are often considered the best form of image file, since the picture is not processed in the camera, leaving total control of the editing to the user.

A camera raw image file contains minimally processed data from the image sensor of a digital camera, a motion picture film scanner, or another image scanner. Raw files are so named because they are not yet processed and therefore are not ready to be printed or edited.

Raw image files are sometimes incorrectly described as “digital negatives,” but they are neither negatives nor visible images. Rather, raw datasets are more like exposed but undeveloped film.

Like undeveloped photographic film, a raw digital image may have a wider dynamic range or color gamut than the developed film or print. Unlike physical film after development, the Raw file preserves the information captured at the time of exposure. The purpose of raw image formats is to save, with minimum loss of information, data obtained from the sensor.

There are dozens of raw formats in use by different manufacturers of digital image capture equipment.
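
As one way to see this in practice, here is a small Python sketch using the third-party rawpy library (an assumption on my part; many other raw decoders exist) to “develop” a hypothetical camera raw file into a viewable RGB image.

    import rawpy      # third-party raw decoder
    import imageio    # for writing the developed image

    # "Develop" a hypothetical camera raw file into a viewable RGB image.
    with rawpy.imread("IMG_0001.CR2") as raw:
        # postprocess() applies demosaicing, white balance, and gamma,
        # the steps the camera skipped when it saved the raw sensor data.
        rgb = raw.postprocess()

    imageio.imwrite("IMG_0001_developed.tiff", rgb)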

EXTRA CREDIT

Here’s an Apple White Paper to learn more.



… for Codecs & Media

Tip #635: HTTP Live Streaming

Larry Jordan – LarryJordan.com

HTTP Live Streaming compensates for shifts in bandwidth for mobile devices.

HTTP Live Streaming compression settings applied to a job-chained clip.


The problem with mobile devices is that the bandwidth that connects them to the web changes as they move from one cell tower to another. This becomes important when watching movies that are longer than 10 minutes.

Apple Compressor has a feature – called HTTP Live Streaming – that compensates for this difference in bandwidth. This process compresses a master file into ten-second segments, using seven different frame sizes and bandwidths. In the case of my one-hour webinars, it generates about 2,000 separate segments.

This allows the server to seamlessly switch between different quality levels as bandwidth changes. If you are connected via a high-speed WiFi connection, all these different segments are ignored. They only apply to mobile devices connected via cell towers.

My website has supported this playback style for seven years now. The problem is that implementing this takes a bit of programming from your webmaster.
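
For a sense of where that segment count comes from, here is the back-of-the-envelope arithmetic in Python (my rough numbers; Compressor’s exact output will differ).

    # Rough segment count for a one-hour program delivered via HLS.
    duration_seconds = 60 * 60   # one-hour webinar
    segment_length = 10          # seconds per segment
    variants = 7                 # different frame sizes / bandwidths

    segments_per_variant = duration_seconds // segment_length   # 360
    total_segments = segments_per_variant * variants             # 2,520
    print(total_segments)

That lands in the low thousands of files for a one-hour program, which is why a webmaster needs to handle them programmatically rather than by hand.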

EXTRA CREDIT

Here’s an article that explains this process in more detail. Remember, this only applies to movies longer than ten minutes that are NOT streamed on a social media service.

The reason you don’t need to worry about this when your files are streamed on Facebook, Vimeo, et al., is that these services create the HLS versions automatically on their servers.



… for Codecs & Media

Tip #636: Compressor: What is a Job Action?

Larry Jordan – LarryJordan.com

This only applies when one setting is applied to a video.

The Job Action menu at the bottom of the Job panel in Compressor.


A Job Action in Apple Compressor is an automated activity that occurs when a compression task is complete. It is assigned to the job, not to a compression setting. Here’s how it works.

  • Select a movie (called a “job” in Compressor), not the compression setting.
  • At the bottom of the Job panel is the Action section. This describes what can be done with a compressed file when compression is complete.
  • There are ten options, as illustrated in this screen shot. “Save only” means the file will be saved and nothing else is done to it.

For example, choosing Publish to YouTube asks for your log-in credentials, project title, description, and tags. When compression is complete, the compressed file is automatically transferred to YouTube with the tags you specify.



… for Codecs & Media

Tip #637: Compressor: Job Chaining

Larry Jordan – LarryJordan.com

Job chaining creates an intermediate master file, which saves time creating derivatives.

The Job Chain menu in Apple Compressor.


There is a hidden feature in Apple Compressor that can save time when creating multiple versions of the same master file. It’s called “Job Chaining” and here is how it works.

Every week, when I post my webinars, I add a watermark of my website URL into all the compressed versions. However, I never export the master file with a watermark, so that I always have a clean copy for archiving.

One of the versions I create is an HTTP Live Streaming (HLS) version of it for mobile devices. (See Tip #635). The problem is that HLS compression creates thousands of short ten-second movies from the master file. There’s no easy way to add watermarks to them.

So, I do this in two steps:

  • First, create an intermediate master file – using ProRes 4444 – with a watermark.
  • Then take the output of that process and “job chain” it as the source file for HLS compression. (see screen shot)

Specifically:

  • Import your master file into Compressor.
  • Apply the setting to create the interim master – in my screen shot, this is called “Add Watermark Only.” All it does is burn a watermark into the intermediate master. Because I am working with ProRes 4444 there is no loss in audio or video quality.
  • Control-click the compression setting and choose New Job with Selected Output.
  • This creates a new line in Compressor to which I apply the HTTP Live Streaming settings.

This allows me to create one master file with the watermark, rather than re-create it over and over again.

I use this technique every week.
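
The same two-step idea can be expressed outside Compressor. As a sketch only (this is not Larry’s workflow; the file names, the watermark image, and the single-variant HLS settings are hypothetical), here it is with FFmpeg driven from Python: burn the watermark into a ProRes 4444 intermediate, then use that intermediate as the source for HLS segmenting.

    import subprocess

    # Step 1: burn a watermark into a ProRes 4444 intermediate master.
    subprocess.run(
        ["ffmpeg", "-i", "master.mov", "-i", "watermark.png",
         "-filter_complex", "overlay=W-w-20:H-h-20",   # lower-right corner
         "-c:v", "prores_ks", "-profile:v", "4444",
         "-c:a", "copy",
         "intermediate_watermarked.mov"],
        check=True,
    )

    # Step 2: use that intermediate as the source for ten-second HLS segments.
    subprocess.run(
        ["ffmpeg", "-i", "intermediate_watermarked.mov",
         "-c:v", "libx264", "-c:a", "aac",
         "-f", "hls", "-hls_time", "10", "-hls_list_size", "0",
         "playlist.m3u8"],
        check=True,
    )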



… for Codecs & Media

Tip #591: In-Depth Overview of USB

Larry Jordan – LarryJordan.com

USB is ubiquitous and growing in speed.

A USB logo and plug.


The folks at Juiced Systems created an excellent overview of USB called: “Know Your USB. A Practical Guide to the Universal Serial Bus.” (Juiced Systems, based in Orange County, CA, designs & creates unique high performance computer accessories for power users and enterprise professionals.)

Key Takeaways

  • USB cables, ports, and connectors (hardware) have varying USB versions, generations, and specifications (software) that dictate the speed and performance.
  • USB types are denoted by letters, such as Type-A and Type-B, while USB versions are denoted by numbers, such as USB 3.2 or USB4.
  • A USB device may physically fit into a USB port, but its performance can be hampered by a generation or standard mismatch. For example, a USB 2.0 device will work in a USB 3.0 port, but only at USB 2.0 speed. Similarly, a USB 3.0 device will work in a USB 2.0 port, but only at the port’s USB 2.0 speed. USB devices typically list the highest standard they support on their product labels.
  • Speaking of speed, USB 1.0, 2.0, 3.0, and USB4 each have their own maximum data transmission rate. These are theoretical numbers at best; actual speeds vary (see the sketch after this list). If you are experiencing slow data transfer, it may be due to the USB port’s generation, as noted above, as well as the read/write speed of the devices involved.
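
Here is a quick Python sketch of that gap (the per-version maximums are the published spec rates; the 50 GB file size is just an example), showing the theoretical best case, which real-world transfers never reach.

    # Theoretical best-case transfer time for a 50 GB file.
    # Spec maximums; real-world throughput is always lower.
    FILE_SIZE_GB = 50
    rates_gbps = {
        "USB 2.0": 0.48,
        "USB 3.0 / 3.2 Gen 1": 5.0,
        "USB 3.2 Gen 2": 10.0,
        "USB4": 40.0,
    }

    for name, gbps in rates_gbps.items():
        seconds = FILE_SIZE_GB * 8 / gbps
        print(f"{name}: at least {seconds:.0f} seconds")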

USB has been hailed as the king of connectors or the port that changed everything. But at the end of the day, it is that cable or port that makes your life easier as you charge your phone, save files, or access your peripherals on your laptop.

EXTRA CREDIT

The full report is well written, in-depth, and easy to understand. Here’s the link.



… for Codecs & Media

Tip #612: The Background of Blu-ray Disc

Larry Jordan – LarryJordan.com

A quick look at the history of Blu-ray Disc.

The Blu-ray Disc logo.


I’ve gotten a fair amount of email recently asking about Blu-ray Discs.

The specs for Blu-ray Disc were developed by Sony and unveiled in October 2000, specifically for HD media. The first Blu-ray prototype was released in April 2003. The format is now controlled by the Blu-ray Disc Association.

Blu-ray Disc was named for the blue-violet laser it uses to read and write media. Its shorter wavelength supports higher-density storage than the red lasers used by DVDs.

A single-layer Blu-ray Disc holds 25 GB; a dual-layer disc holds 50 GB. While vast at the time of release, these capacities seem small today and mean we need significant media compression to get our files to fit. Currently, Blu-ray Discs support HD, HDR, and 3D media formats, all within the same storage capacity.
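
To see why compression is unavoidable, here is a back-of-the-envelope Python calculation (my assumption of a two-hour program; audio, menus, and overhead are ignored) of the average video bitrate each capacity allows.

    # Average bitrate available for a two-hour program on Blu-ray.
    for capacity_gb in (25, 50):
        bits = capacity_gb * 1_000_000_000 * 8
        seconds = 2 * 60 * 60
        mbps = bits / seconds / 1_000_000
        print(f"{capacity_gb} GB disc: ~{mbps:.0f} Mb/s average")

That works out to roughly 28 Mb/s and 56 Mb/s, while uncompressed HD video runs at hundreds of megabits per second or more, so heavy compression is required to fit.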

NOTES

  • The original DVD was designed for SD media and holds about 4.7 GB single layer or 8.5 GB dual-layer.
  • CD-ROMs hold between 650 and 700 MB.

EXTRA CREDIT

Tip #613 has a list of all supported Blu-ray Disc distribution formats.

