… for Random Weirdness

Tip #1524: The IP Video Revolution is Here

Larry Jordan – LarryJordan.com

The second video revolution is here: IP.

The Primestream logo.


The evolution from baseband to IP turned into a revolution in 2020 for the broadcast and streaming industries as the effects of the COVID-19 pandemic forced operations to double down on remote production workflows and technologies. Video technology companies and producers have turned to IP-based production like never before, embracing its efficiency, flexibility, and ability to meet rapidly changing requirements cost-effectively.

Primestream just released a new white paper, “The IP Broadcast Revolution,” that discusses this transition. The white paper takes a closer look at this massive paradigm shift. We trace the IP revolution from RF and baseband to IP, from satellite and microwave antennas to SIM cards, and from the broadcast operations center to the cloud. From there, we introduce the Primestream IP Broadcast Network Operation Center™ (NOC), which is enabling the future of video workflows through powerful solutions such as Media IO and Xchange™ Media Cloud.

The white paper is only 7 pages long, profusely illustrated and easy to read.

White paper link.



… for Adobe Premiere Pro CC

Tip #1518: Absolute vs. Relative Audio Levels

Larry Jordan – LarryJordan.com

Generally, we adjust clip volumes relatively and monitor them absolutely.

The Gain window in Adobe Premiere Pro.


There are two ways to adjust the volume of any audio clip: Absolute and Relative. Here’s what these terms mean and how they work.

An absolute audio level adjustment sets audio levels regardless of the audio volume of that clip before the adjustment. For example, setting one or more clips to -6 dB. If one clip is at -4 dB and a second clip is at 0 dB before the change, they will both be at -6 dB after the change.

A relative audio level adjustment sets audio levels based upon the audio levels before the adjustment. For example, raising the level of one or more clips by 4 dB. If one clip is at -4 dB and a second clip is at 0 dB before the change, the first clip will be at 0 dB and the second clip will be at +4 dB after the change.
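
To make the distinction concrete, here is a minimal sketch in Python. It only illustrates the arithmetic of the two adjustments using the example clips above; it is not how Premiere implements gain.

    # Absolute ("Set Gain to") vs. relative ("Adjust Gain by") level changes.

    def set_gain_to(levels_db, target_db):
        """Absolute: every selected clip ends up at the same level."""
        return [target_db for _ in levels_db]

    def adjust_gain_by(levels_db, offset_db):
        """Relative: every selected clip shifts by the same amount."""
        return [level + offset_db for level in levels_db]

    clips = [-4.0, 0.0]                     # levels before the change, in dB
    print(set_gain_to(clips, -6.0))         # [-6.0, -6.0] -> both clips at -6 dB
    print(adjust_gain_by(clips, 4.0))       # [0.0, 4.0]   -> each clip raised 4 dB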

NOTE: Audio meters always show absolute levels, the precise volume of all active clips, regardless of each clip’s individual gain setting.

KEYBOARD SHORTCUTS

  • Select the clips you want to adjust in the timeline.
  • Type G to open the Gain window.
  • “Set Gain to” performs an absolute audio adjustment.
  • “Adjust Gain by” performs a relative audio adjustment.

The waveforms in clips you adjust will change, but the volume line will not.



… for Adobe Premiere Pro CC

Tip #1520: Caption Track Tricks

Larry Jordan – LarryJordan.com

Captions are now much more flexible and easier to use in Premiere Pro.

Caption track window (top) and caption track menu (lower).


One of the exciting new features in Premiere’s new caption workflow is its flexibility. There is no limit to the number of caption tracks (the container for captions) you can create. There is also no limit to the number of captions you can put in each track.

NOTE: Well, there is, I guess. Captions need to display for at least a second, so you are limited by the length of your program. But, um, hold your captions on-screen longer than a second…

When you add a new caption track, you can determine the format for all the captions it contains. However, you can’t mix caption formats in the same track.

You can have tracks for different languages, and each track can be a different format (as illustrated in the lower half of the screen shot).

Control-click a caption track to reveal other options.

Here’s a tutorial from my website that describes captions in more detail.



… for Visual Effects

Tip #1473: Lunar Animation: Mac Pro Big Help in VFX

Larry Jordan – LarryJordan.com

The high-end Mac Pro vastly speeds the creation of animation & VFX assets.

Detail from Lunar Animation ad for Disney+


Lunar Animation wrote a detailed blog about how using high-end Mac Pros enabled their animation activities over the last year. Their blog “looks at the noticeable things that this machine really helps with once you start working as an artist.”

NOTE: Here’s the link

The following are excerpts from their blog.

Today we’re going to give a 12 month 3D computer animation perspective on the Mac Pro and Pro Display XDR. We really push our computers hard on a daily basis and benchmarks don’t always reflect real world use on a project. So in this post we will focus on how the Mac Pro has affected our current workflows and how it’s opening up new and exciting workflows for the future.

We currently have two Mac Pros in the studio, we have a mid-tier model which has the 16-core CPU and dual graphics cards and then a higher tier 28 core model, which has four graphics cards. Now I know what you’re thinking, why on earth would you want four graphics cards? Well we’ll get to that later in the post.

Their software includes:

  • MAYA – 3D Modelling and Animation
  • V-RAY – Rendering Engine for Maya
  • HOUDINI – Simulation and Effects
  • NUKE – Compositing
  • DAVINCI RESOLVE – Editing
  • SUBSTANCE PAINTER – Shading and Texturing
  • ZBRUSH – 3D Sculpting
  • ADOBE PHOTOSHOP – Image Editing
  • DEADLINE – Render Management Software

[What using a high-end Mac Pro] means is that rather than having to close the heavy scene, load up the 3D model we want to adjust and then close that and reopen the original scene to continue working, we are able to simply switch spaces in macOS, add a light to a 3D model and quickly switch back and check the update with the interactive IPR renderer.

Then while we’re waiting for that to create a preview render, we can switch over to another screen and adjust another model.

What we’re seeing here is the fluidity of the artist working and not having to be ground to a halt because the computer is having to think. It feels like having multiple computers at your fingertips. All without constant crashing, which means we avoid losing work and more importantly save artists’ time.

The entire blog is a tour de force and well worth reading. And the behind-the-scenes video is amazing.



… for Visual Effects

Tip #1475: Recreating 1859 Harpers Ferry

Larry Jordan – LarryJordan.com

Interviews with the VFX team for “The Good Lord Bird.”

For the great reveal of the Harpers Ferry region, a horse and wagon and grassy field appear in the foreground. Right behind them, production rigged a bluescreen. Ingenuity Studios created everything beyond the foreground elements in order to simulate Harpers Ferry in 1859 in this panoramic shot. Images courtesy of Showtime.


This article, written by Chris McGowan, first appeared in VFXVoice.com. This is an excerpt.

NOTE: Here’s the link

For the Showtime series “The Good Lord Bird,” show-side Visual Effects Supervisor Brad Minnich and Ingenuity Studios Visual Effects Supervisor Andrew Woolley were tasked with recreating Harpers Ferry in 1859 and other settings from the last years of abolitionist John Brown.

Matthew Poliquin was Executive Producer and Adam Lambert the VFX Producer for Ingenuity Studios. Marz VFX, Barnstorm VFX, Trehmer Film and Technicolor VFX were other participating visual effects studios. Ingenuity Studios, the primary VFX house, completed 450 shots and worked on everything from period towns and landscapes to CGI fires, muzzle flashes, train smoke, extensive matte paintings and CG body doubles in battle.

“We had a lot of on-set photography and drone footage to reference to build that town,” Minnich comments. A lot of older photography from the area was also used for reference of what the structures and landscape looked like. “The cool thing about Harpers Ferry is that it is still intact and still has the [pre-Civil War] essence. I remember a session with Andrew where we picked off the modern buildings.” With those cleared out, “Andrew and his team at Ingenuity Studios had a good guide.”

Harpers Ferry is in a river valley, with mountains on either side. One of the memorable establishing scenes is a great reveal of the region. A horse and wagon and grassy field are in the foreground of the shot. Right behind them, production rigged a bluescreen. Ingenuity Studios created everything beyond the foreground elements. “We had to find or build all the elements – trees and grasses, mountains and so on, in the correct varietals and topography for the location,” Woolley notes. Another item was a bridge built in CG and added to the scene. “We had to match all the lighting to the practical elements in the scene,” he adds. “We stitched everything together and projected it onto some rough geo to give more 3D feel. It’s essentially 2.5D, though, where the various depths move independently to achieve the correct parallax through the crane move. Finally, it was all integrated with atmosphere and the sky replacement to cap it off. It was the big reveal of Harpers Ferry. Once we locked that in, it established the lay of the land for our viewers.”

EXTRA CREDIT

The article continues with more interviews, along with before and after images of various effects shots.



… for Codecs & Media

Tip #1467: Create WebM Files on a Mac

Larry Jordan – LarryJordan.com

There are only a few tools that create WebM files on a Mac.

The WebM logo.


Last week, Tip #1436 reported that support for WebM video playback was in current beta builds of Apple Safari. However, as several readers pointed out, that didn’t answer the question of how to create WebM files on a Mac. Here’s what I learned.

While there are LOTS of ways to convert WebM into MP4, there are only a very limited number of ways to convert anything into WebM.

NOTE: FFmpeg supports WebM creation. However, it’s accessed using the command line in Terminal, which is hardly easy for non-developers to use.
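
If you’re comfortable with a little scripting, here is a minimal sketch that drives FFmpeg from Python rather than typing the command in Terminal. It assumes the ffmpeg binary is installed (for example, via Homebrew) and on your PATH; the file names are placeholders.

    # A minimal sketch: transcode any FFmpeg-readable movie to WebM
    # (VP9 video + Opus audio). Assumes ffmpeg is installed and on the PATH.
    import subprocess

    def convert_to_webm(source, destination):
        subprocess.run(
            [
                "ffmpeg",
                "-i", source,          # input movie (e.g. an MP4 or MOV)
                "-c:v", "libvpx-vp9",  # VP9 video codec
                "-crf", "32",          # quality-based rate control
                "-b:v", "0",           # let CRF choose the bitrate
                "-c:a", "libopus",     # Opus audio codec
                destination,           # output name must end in .webm
            ],
            check=True,                # raise an error if ffmpeg fails
        )

    convert_to_webm("input.mov", "output.webm")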

ONLINE TOOLS

  • Video2Edit — Here’s the link.
  • Wondershare Online Uniconverter — Here’s the link.
  • EZGIF — Here’s the link.
  • ClipChamp — Here’s the link.
  • WeVideo — Here’s the link.

STAND-ALONE SOFTWARE

  • ffWorks (using FFmpeg) — Here’s the link.
  • Bigasoft WebM Converter for Mac — Here’s the link.
  • Wondershare UniConverter — Here’s the link.
  • NCH Software’s Prism — Here’s the link.
  • AppGeeker’s Video Converter — Here’s the link.

SUMMARY

I’m a big fan, and regular user, of ffWorks; however, I haven’t used any of the rest of these. The good news is that, so far, no one has asked me for a WebM file. If you know of other conversion software, please let us know in the comments.



… for Apple Final Cut Pro X

Tip #1470: Apple Releases Final Cut Pro 10.5.2

Larry Jordan – LarryJordan.com

This latest version is a bug-fix release.

The Final Cut Pro logo.


Last week, Apple released Final Cut Pro 10.5.2. This is principally a bug fix release. Here’s what’s new.

Here are the Apple Release Notes

  • Adds support for a new Universal RED plugin enabling native RED RAW decoding and playback on both Apple silicon and Intel-based Mac computers.
  • Improves stability when playing back H.264 video files with corrupt data.
  • Fixes an issue in which text could disappear when double clicking a value field in the inspector.
  • Fixes an issue in which FCPXML files created from drop frame projects would import as non drop frame.
  • Fixes an issue that may prevent custom Motion titles stored inside the library from appearing in the Titles browser.
  • Improves stability when choosing the DPP/Editorial Services metadata view with MXF media.
  • Improves stability when using AirPlay with Final Cut Pro on a Mac computer with Apple silicon.

The folks at Digital Anarchy added the following notes to the upgrade:

Final Cut Pro 10.5.x uses a new version of the FCP plugin architecture (FxPlug4). The older plugin version (FxPlug3) still works for now, but these new builds of the plugins will be required at some point. They also add support for Apple’s Metal (the replacement for OpenCL), Apple silicon machines, and Big Sur (both FxPlug4 and FxPlug3).

10.5.2 adds some bug fixes for FxPlug4, so we highly recommend you upgrade to this version if you’re using 10.5 or higher. 10.4.x will still use FxPlug3 plugins, but you’ll still want to download these releases, as they add Big Sur support.



… for Random Weirdness

Tip #1410: Frame.io Launches “Camera To Cloud”

Larry Jordan – LarryJordan.com

Camera To Cloud: Live upload from on-set cameras to anywhere in the world. Instantly.

Behind-the-scenes at the live Frame.io launch event on the Paramount lot.


Frame.io last Thursday launched Frame.io Camera to Cloud (Frame.io C2C). This new workflow lets customers instantly upload and stream images from on-set cameras to creative post-production teams anywhere in the world.

According to Frame: “Frame.io C2C is a breakthrough technology that brings IoT to Hollywood (or any) film sets, catalyzing major changes in the way movies are created. Frame.io C2C has already been piloted for the recent Michael Bay thriller, “Songbird,” the first major Hollywood picture to be allowed to go into production under tight quarantine restrictions. While Frame.io C2C was not created in response to Covid, the pandemic largely accelerated the need for this remote working technology, and has already helped filmmakers get safely back to work.”

This new technology speeds the creative process. Key benefits include:

  • Instant off-set access to footage, from any location and any device (iPhone, iPad, etc.), as it’s being filmed.
  • More time and budget for creativity by eliminating typical turnaround times for feedback and editing.
  • Enabling a hybrid or fully remote workforce, giving film sets access to the world’s best talent, regardless of location.

EXTRA CREDIT

  • View the Frame.io launch event here.
  • Here’s a link to the Frame.io website to learn more.


… for Random Weirdness

Tip #1412: Frame.io’s “Camera To Cloud” Hardware

Larry Jordan – LarryJordan.com

The hardware is not inexpensive, but the possibilities are vast.

Tweaking a Sound Devices 888 during Frame.io’s launch event.


Frame.io last Thursday launched Frame.io Camera to Cloud (Frame.io C2C). This new workflow lets customers instantly upload and stream images from on-set cameras to creative post-production teams anywhere in the world.

Frame writes: “We believe camera-to-cloud will have a massive impact on the filmmaking industry at large — especially at a time like now, when filmmakers are trying to get back on set with fewer crew members; camera-to-cloud takes video village off-set for a completely safe way to produce films.”

Making this happen requires combining Frame’s secure online review-and-comment platform with hardware from leading companies to connect production tools to the cloud.

Frame partnered with three leading companies to make C2C possible:

  • Picture: Teradek designs and manufactures high-performance video solutions for broadcast and cinema. A Frame.io authenticated CUBE 655 encoder delivers live streams and camera proxy files directly into Frame.io.
  • Sound: Sound Devices designs and manufactures the world’s leading production sound field recorders. The latest 888 and Scorpio recorders capture and transmit original audio files directly into Frame.io.
  • Post: Colorfront has developed the world’s first fully cloud dailies platform. Their Express Dailies integration with Frame.io allows labs to instantly access the video and audio assets to create dailies.

Frame also published the API for the new system so that studios and developers can create their own custom workflows.
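
For developers who want to experiment, here is a minimal sketch of calling the Frame.io REST API from Python with the requests library. The /v2/me endpoint and Bearer-token authentication follow Frame.io’s public developer documentation, but treat the specifics as assumptions and confirm them against Frame’s docs before building a workflow on them.

    # A minimal sketch (not from Frame.io's announcement): fetch the profile of
    # the authenticated user. Assumes a developer token in FRAMEIO_TOKEN; the
    # endpoint is based on Frame.io's public v2 API documentation.
    import os
    import requests

    token = os.environ["FRAMEIO_TOKEN"]
    response = requests.get(
        "https://api.frame.io/v2/me",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())   # basic details about the authenticated account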

EXTRA CREDIT

  • Here’s the link to more developer information from Frame.
  • Here’s a link to the Frame.io website.


… for Visual Effects

Tip #1420: Virtual Production Takes Big Step Forward

Larry Jordan – LarryJordan.com

Game design intersects with production using LED monitor walls.

Baby Yoda, surrounded by an LED wall, with images created with Unreal Engine. Image courtesy of Disney.


This article, written by Trevor Hogg, first appeared in VFXVoice.com. This is a summary.

Traditionally, movies and television shows have been divided into three stages consisting of pre-production, production and post-production; however, the lines are blurring with the advancements in virtual production. A great deal of interest was generated with what Jon Favreau was able to achieve utilizing the technology to produce The Mandalorian. Interest turned into necessity when the coronavirus pandemic restricted the ability to shoot global locations. If you cannot go out into the world then the next best thing is to create a photorealistic, computer-generated environment that can be adjusted in real-time. Will virtual production be a game-changer that will have lasting impact?

Scott Chambliss, Production Designer:

“One of the best qualities of our medium is its essential plasticity. By replacing traditional bluescreen/greenscreen tech with LED display walls, a stage working environment is dramatically enhanced by its chief gifts of interactive practical lighting and directly representative motion picture backgrounds the screens display in-camera for shooting purposes. For sci-fi and fantasy projects these advances are major and practical additions.”

Scott Meadows, Digital Domain:

“We recently had a client in the middle of reshoots when COVID hit. We had several props and CG characters, and our team put together some blocking animation that we added to Unreal Engine. Within a day, we had everything we needed for the filmmakers to do whatever they wanted within the scene. For the actual shoot there were only seven people present, with the director, editor, VFX Supervisor and Animation Supervisor all calling in remotely.”

Sam Nicholson, Stargate Studios:

“Virtual production is the new Wild West of the film business where the world of game developers and film producers are merging. From photoreal avatars to flawless virtual sets and extensive Unreal worlds, the global production community has embraced the amazing potential of virtual production as a solution to many of the production challenges facing us during the current global pandemic.”

The article also interviews:

  • Adam Myhill, Unity Technologies
  • Paul Cameron, Westworld
  • Alex McDowell, Experimental Design
  • David Morin, Epic Games Los Angeles Lab
  • Christopher Nichols, Chaos Group Labs
  • Nic Hatch, Ncam
  • Ben Grossman, Magnopus
  • Rachel Rose, ILM

There are images and much longer quotes in the article.

