AEJuice provides free & paid plugins for After Effects and Premiere Pro.
Recently, Jacob Syrytsia, co-founder of AEJuice, contacted me about his company. AEJuice provides hundreds of free and paid plugins for After Effects, plus hosts the world’s largest motion graphics community.
AEJuice is a team of motion designers and software engineers that create tools for animation. It was founded in 2015 by Jacob Syrytsia and Mark Duval.
They currently offer a bundle for Premiere Pro consisting of dozens of effects, sound effects, transitions and other elements. They also host “the world’s biggest motion graphics community: ‘Motion Lovers.’”
Bi-weekly episodes on the craft and culture of motion graphics.
Motionographer.com reports that “Between the Keyframes” is a new vidcast hosted by Erin Sarofsky and Austin Shaw, two formidable experts in the motion design industry.
Austin literally wrote the book on motion design (“Design for Motion: Fundamentals and Techniques of Motion Design,” Routledge, 2nd Edition, 2019) and is a sought-after educator and freelance creative director and designer. Erin owns Sarofsky, a studio that puts into practice all of those foundational principles while navigating the crazy tides of an exciting, ever-changing industry.
Online at https://betweenthekeyframes.com, the vidcast is now available via YouTube, Apple Podcasts, Google Podcasts, Spotify, and wherever you get your podcasts.
The show is already live with episodes exploring “The History of Now,” “Work from Home,” and “Passion Projects,” and brand-new episodes debut biweekly on Tuesdays. The next installment will cover “Fulltime vs. Freelance,” with part one dropping on July 13, and part two on July 27.
The Unreal engine can make the artificial look natural.
In this screenshot, the terrain (hills, mountains, etc.) was fully sculpted in Unreal Engine with the landscape editing mode, using brushes to sculpt, smooth, and flatten areas in the map.
The landscape material and the vegetation were created with the Brushify toolkit. Finally, the props—rocks, cliffs, and manmade materials—are the result of customized elements and assets from the Megascans library by Quixel.
In this first article of a three-part series, we’ll learn how to produce stunning, natural compositions in Unreal Engine. In particular, we’ll focus on aspects of planning an environment while making an eye-catching, well-balanced composition.
Here are the key points this article covers:
Planning the Environment. One of the biggest challenges while creating natural environments is to plan your scene from the start. Begin with big and bare areas and then develop the details in those macro zones by adding vegetation, assets, and so on.
Sculpting Terrain and Set Dressing. Unreal Landscape offers a series of tools for sculpting maps and adding scattered elements like flowers, grass, or anything else you want to import as an asset in your engine.
The Importance of Biomes. The art of composing a good environment is also connected with the presence of biomes: a sort of habitat for organisms and the related terrain characteristics. This way we can have different zones—forests instead of grasslands, desert, etc.
Shot Composition: Thinking Like a Photographer. Once you’ve created your own landscape, you want to showcase your work in its best light.
Other subjects include:
The Choice of an Appropriate Vantage Point
Depth of Field
Positive and Negative Space
That brings us to the end of the first article of a three-part series. We explained how to plan a 3D environment and how to collect detailed photo references. We then moved on to talk about sculpting in Unreal Engine and set dressing with the support of Brushify.
Last week, Chaos introduced initial Universal Scene Description (USD) support for V-Ray 5 for Maya, Houdini, and Cinema 4D, tying powerhouse production renderers to one of the fastest-growing file formats in visual effects. Artists now have a non-destructive way to collaborate and assemble their scenes, making it easier to store and move scene data between different applications.
Initially developed by Pixar, the USD format is designed to hold the most common types of scene data – geometry, shaders, lights, rigs, hair, etc. – so artists have an easy way to share and dynamically update assets without workarounds or compromises. As pipelines have grown more complex, the need for a universal format has become even more pronounced. Today, Chaos will begin providing V-Ray support for USD, giving artists more flexibility as the technology continues to develop.
The initial Maya implementation will support several key asset exchanges, including static/animated meshes, V-Ray materials, subdivisions, displacement and more. V-Ray 5 for Houdini will also mark the beta launch of V-Ray for Solaris, which helps artists work natively in SideFX’s USD-based shot layout and look dev tool via a V-Ray Hydra delegate.
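To make the format itself more concrete, here is a hand-authored sketch of what a tiny USD text layer (.usda) looks like. This example is illustrative only: in production, layers like this are generated by the USD toolkit (Pixar's `pxr` module) or by a DCC exporter, not written by hand, and the prim names here are invented for the example.

```python
from pathlib import Path

# A minimal hand-written .usda layer: a single Mesh prim named "Cube"
# with an extent attribute. Real layers carry geometry, shaders, lights, etc.
USDA = """#usda 1.0
(
    defaultPrim = "Cube"
)

def Mesh "Cube"
{
    float3[] extent = [(-1, -1, -1), (1, 1, 1)]
}
"""

def write_layer(path: str) -> str:
    """Write the sample layer to disk and return its text."""
    p = Path(path)
    p.write_text(USDA)
    return p.read_text()

if __name__ == "__main__":
    text = write_layer("cube.usda")
    print(text.splitlines()[0])  # -> #usda 1.0
```

Because USD layers are plain, composable descriptions like this, multiple artists and applications can contribute to the same scene non-destructively, which is the workflow the V-Ray integration targets.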
VFXVoice and Autodesk are co-sponsoring a series of webinars titled: “Ask Me Anything: VFX Pros Tell All.”
The next live event is July 15, 2021, at 12 PM PDT. This event features Cinzia Angelini, Director and Head of Story, Cinesite Studios. Her experience spans from Cinesite to Warner Brothers, DreamWorks, Sony Imageworks, Disney Animation Studios, Duncan Studio and Illumination Entertainment.
Past speakers include:
Chris White, Visual Effects Supervisor, WETA Digital
Nonny de la Peña, Founder, Emblematic Group
Ellen Poon, Visual Effects Supervisor, Visual Effects Producer
Aruna Inversin, Creative Director & Visual Effects Supervisor
Karen Dufilho, Producer, Google Spotlight Stories
Greg Anderson, Head of Studio-NY, Sr. VFX Supervisor, FuseFX
and many others.
Here’s the link to access all of these. All events are free.
Render times in After Effects can vary a lot. If you’ve been using After Effects for a while, you’ve probably had some projects that render quickly, and others very slowly. Sometimes it can be hard to pinpoint exactly why some projects render faster than others.
At a very basic level, there are four things that determine how fast After Effects will render a single frame. Firstly, there’s the resolution of the composition. Secondly, there’s the resolution of each individual layer in the composition. Thirdly, there’s the number of layers in the composition. Finally, there’s the bit-depth of the composition – which determines how much data is required to process each individual pixel.
More recently, Adobe has added a fifth variable – Multi-Frame Rendering. This is a new feature, still in Beta development, that utilizes more than one CPU core to render multiple frames of a composition at the same time. Depending on your hardware, rendering can be more than twice as fast.
The video [in this article] covers all of the details, but it’s probably worth emphasizing that this is a relatively niche demonstration. I didn’t try changing the overall composition resolution, and having 920 layers in a composition isn’t something you see every day. The project only uses one effect, and yet it’s a great example of how simple changes to bit-depth and resolution can dramatically affect the amount of data that After Effects has to process in order to produce a rendered image.
LIVE STREAM: Mocha Pro + Silhouette for Nuke Compositors
Dan Smith, CraftyApes senior compositor and author of NUKE Codex: Nodes within Nodes, joins the Boris FX team to show how he uses Mocha Pro and Silhouette to supercharge his compositing workflow inside Nuke. (Live Stream – July 7)
Tip #1761: Boris FX Offers Three New Tutorials (July 7, 2021)
TVBEurope reports that the UK’s Strictly Come Dancing has introduced augmented reality into this year’s series. Here’s a detailed look at what they did.
It’s not been easy to bring the ballroom back in the year of the pandemic, and the production team deserve all the plaudits they’ve received for their hard work to get the show on air. Strictly’s production team has been particularly innovative by bringing augmented reality into the mix. From the racing cars in week one, to the elephant that appeared during Bill Bailey’s Quickstep, augmented reality has featured every week during the live shows.
The use of AR has been a real team effort, involving both the lighting and audio teams, as well as two companies, Mo-Sys and Potion Pictures. While this year is the first time AR has been employed in the show, it’s something the team has been considering for a while. “We’ve previously used perspective in the floor to create the illusion that the dancers are standing on top of a lighthouse or wedding cake, or skyscraper,” explains Potion Pictures’ managing director David Newton, who also serves as the show’s graphics designer.
“That’s been really effective but the big drawback is you can’t move the camera.”
Newton was asked by Strictly’s producers to look into the possibility of AR, and chose to use Unreal Engine as they were already starting to use it for real time rendering. “Mo-Sys’s name kept coming up in relation to AR and camera tracking, and Epic Games said Mo-Sys have a plugin that works great with their software and we were comfortable with using Unreal so it all sort of added up.”
Using AR looks great on screen, but there’s always the possibility that the couples will end up dancing right through it. How do the team ensure that doesn’t happen? Newton cites Clara Amfo’s recent jive, which featured an AR record player. “The first draft of the record player had it much further down stage, we thought the original starting position was going to be camera right. So we sort of changed how a record player works to actually have the arm on the other side. But it all changed, and it went further upstage and there were a few more meetings about where it was going to go, what colour it should be, what side the arm was going to be, how many letters of complaint we would get if the records spun in the wrong direction,” he laughs.
As well as the graphics, Strictly’s lighting is a key component of the augmented reality employed in the show. David Bishop, the show’s lighting director, says it’s been interesting to explore the relationship between lighting and AR. “In my mind there are two routes with AR, you have to either make something that is real, and therefore it has to appear on the screen as being absolutely real, or you have to make something that’s very clearly an ethereal dimension and isn’t meant to be real,” explains Bishop. “For example, if you’re inside an AR-created house, as we’ve used this series, and there’s a light bulb inside then the person that’s standing inside that house needs to be lit as though it’s coming from that fake light. That means I have to find a real light in the same direction, which sort of does the same job.
Bishop continues: “The tricky thing about that is that our spot ops can’t see the things they’re trying to point the lights at. So they’d be pointing the spotlight in one direction and then I would be saying left a bit, down a bit. It’s that sort of workflow that’s become quite new to us but it is absolutely key, getting the lighting angles right is what’s making the AR even more believable, and that’s certainly something that’s improved throughout the series.”
Tip #1763: The AR Elephant in the Ballroom (July 7, 2021)
I’m heading to a shoot and my phone rings. It’s Jake, my senior producer.
“Boss, I think we’ve been hacked.”
And with that starts a loooong week of recovery, troubleshooting, and formatting. Our QNAP actually had been hacked.
Quick background. I have a small video production company that produces commercials, brand films, and TV programming.
We are a PC-based shop, with all machines connected to a 48 TB NAS via a closed 10-gig Ethernet network. The NAS, a QNAP TS1685, is stocked with 4 TB drives and striped into a RAID 6 configuration. That gives us 40 TB of usable space with the safety net of being able to survive two drive failures. The QNAP services four edit suites and a few other computers for browsing and offloading.
The QNAP has four 1-gig Ethernet ports and a single 10-gig Ethernet port. The 10-gig port services the edit suites. One of the single-gig ports connected to our traditional network and was outward-facing to the internet. That was part of the problem.
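The capacity figures above follow from generic RAID 6 arithmetic, sketched below. The twelve-drive count is an assumption inferred from 48 TB raw divided by 4 TB per drive; the formula itself is standard RAID 6, not anything QNAP-specific.

```python
def raid6_usable_tb(num_drives: int, drive_tb: int) -> int:
    """RAID 6 stripes data across all drives but reserves two drives'
    worth of capacity for dual parity, which is why it can survive
    two simultaneous drive failures."""
    if num_drives < 4:
        raise ValueError("RAID 6 needs at least 4 drives")
    return (num_drives - 2) * drive_tb

# Twelve 4 TB drives: 48 TB raw, 40 TB usable, as described above.
print(raid6_usable_tb(12, 4))  # -> 40
```

The trade-off is clear in the formula: two drives of every array are spent on parity, regardless of array size, so larger arrays lose a smaller fraction of raw capacity.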
Up until now, my backup strategy was based around the idea that a hardware failure was the most likely — and dangerous — problem we would face.
Typically, we have at least four copies of all footage shot.
We burn footage cards on an iMac via ShotPut Pro to a bare hard drive (copy 1) along with a copy to a locally attached RAID 5 (copy 2). Then, the footage is loaded into an active project folder on the NAS (copy 3). Once the bare drive (copy 1) reaches capacity, we make an LTO copy (copy 4). When the project is complete, we archive to another bare drive (copy 5) for mastered projects. When that drive is full, it gets an LTO copy (copy 6). The RAID 5 and NAS copies get deleted once everything is mastered off.
We make a ChronoSync backup of the NAS every night using an older RAID system to give a near-line, near-identical copy. Technically, that would be the seventh temporary copy. In this case, 7 wasn’t our lucky number.
The ChronoSync backup was made after the hack had occurred, so the ransomed files copied over the last known good copy. And we didn’t have archiving turned on.
So if you are keeping score at home, that’s a bunch of copies of the footage, but only one copy of project, image, animation, and music files — all typically smaller than 20 MB. That was our Achilles’ heel.
Read the full article as Robbie describes how they recovered, how his backup strategy changed, and how they are moving forward.
You don’t need to be a big company to get hacked. You just need to connect your servers to the Internet.