Avid On The Forefront Of AI

By Adam Noyes | July 24, 2019

Avid has forged the path toward implementing Artificial Intelligence in post-production, and its use of AI is just beginning.

Artificial Intelligence (AI) is making major inroads into post-production, and nowhere more significantly than in Avid’s editing systems.

When speaking with David Colantuoni, VP of product management at Avid, I started by asking how he would define AI.

ScriptSync matches each source clip to its associated line in the script.

“In the Media and Entertainment context, we look at AI as using machine learning to reach out to massive amounts of data and streamline it so it can become practical to use in productions,” he began. “In addition, it is used to relieve humans of repetitive procedures by letting computers supervise automated tasks.”

And Avid has been at the forefront of putting these concepts to good use.

“We’ve had AI functionality in our systems for many years,” Colantuoni said, “with our ScriptSync and PhraseFind features that let an editor search for spoken words in a phonetically indexed database. We were using AI before the expression came into general usage.”

PhraseFind allows editors to search their Media Composer project phonetically – that is, by the sounds of the words.
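
Avid’s phonetic engine is proprietary, but the general idea can be sketched with a classic phonetic code such as Soundex: index each spoken word by how it sounds rather than how it is spelled, so a search survives variant spellings and transcription quirks. The clip names and transcripts below are invented for illustration; this is a stand-in for the concept, not Avid’s actual algorithm.

```python
# A minimal sketch of phonetic indexing, in the spirit of PhraseFind.
# Soundex is used here as a simple stand-in for Avid's proprietary
# (and far more sophisticated) phonetic engine.

SOUNDEX_CODES = {
    **dict.fromkeys("BFPV", "1"),
    **dict.fromkeys("CGJKQSXZ", "2"),
    **dict.fromkeys("DT", "3"),
    "L": "4",
    **dict.fromkeys("MN", "5"),
    "R": "6",
}

def soundex(word: str) -> str:
    """Encode a word as a 4-character Soundex code (letter + 3 digits)."""
    word = "".join(c for c in word.upper() if c.isalpha())
    if not word:
        return ""
    encoded = word[0]
    prev = SOUNDEX_CODES.get(word[0], "")
    for c in word[1:]:
        code = SOUNDEX_CODES.get(c, "")
        if code and code != prev:
            encoded += code
        if c not in "HW":          # H and W do not reset the previous code
            prev = code
    return (encoded + "000")[:4]

def build_index(transcripts: dict[str, str]) -> dict[str, list[str]]:
    """Map each word's Soundex code to the clips where it was spoken."""
    index: dict[str, list[str]] = {}
    for clip, text in transcripts.items():
        for word in text.split():
            index.setdefault(soundex(word), []).append(clip)
    return index

# Hypothetical clip transcripts -- stand-ins for real dailies.
transcripts = {
    "clip_014": "meet me at the harbor tonight",
    "clip_027": "the harbour was empty when I arrived",
}
index = build_index(transcripts)

# "harbor" and "harbour" share a Soundex code, so one phonetic search
# finds both spellings (and near-homophones) in a single pass.
print(index[soundex("harbor")])   # ['clip_014', 'clip_027']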

In fact, just to jog people’s memories: ScriptSync, the ability to reference raw footage to its position in a written script, was first introduced in 2007.

It was joined in 2011 by PhraseFind, which added the ability to search for the appearance of specific spoken words.

Although both were well received, licensing machinations behind the scenes led to their discontinuation in 2014.

But ideas as downright useful as those two could not stay down forever, and ScriptSync and PhraseFind reappeared in 2017, each upgraded to version 2.0 and each carrying a new license fee, offered separately or bundled together.

“Now we are getting more and more into ways to leverage the computing power of the cloud by using Microsoft Azure,” Colantuoni told me. “That is giving us the ability to help the people responsible for logging massive amounts of source video for productions such as reality shows, based on several criteria.”

Azure was announced in October 2008 under the codename “Project Red Dog.”

Using video uploaded from MediaCentral to the Azure cloud, you can process the visual information and turn it into metadata. Then you can search that metadata for specific words, faces, or even objects.

“Say there is a specific red car you need to use in a scene; you can have the system find all instances of that red car by searching the metadata identifying it,” he said. “But it doesn’t end there.”
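
In practice, the red car example boils down to querying AI-generated detections by label and getting back clip names and timecode spans. Here is a rough sketch of that lookup; the metadata layout is hypothetical, a simplified stand-in for what a cloud indexing pass might return, not the real schema of any Azure service or of MediaCentral.

```python
# A minimal sketch of searching AI-generated metadata for an object,
# like Colantuoni's "red car" example. The Detection layout below is
# invented for illustration, not a real Azure or MediaCentral schema.

from dataclasses import dataclass

@dataclass
class Detection:
    clip: str       # source clip identifier
    label: str      # what the vision model saw ("red car", "face:Ann", ...)
    start: float    # in point within the clip, in seconds
    end: float      # out point within the clip, in seconds

# Hypothetical detections produced by a video-indexing pass in the cloud.
detections = [
    Detection("A012_C003", "red car", 14.2, 21.8),
    Detection("A012_C003", "pedestrian", 15.0, 16.1),
    Detection("B007_C001", "red car", 3.5, 9.0),
]

def find_instances(detections: list[Detection], query: str) -> list[Detection]:
    """Return every span whose label matches the query (case-insensitive)."""
    q = query.lower()
    return [d for d in detections if q in d.label.lower()]

for hit in find_instances(detections, "red car"):
    print(f"{hit.clip}: {hit.start:.1f}s-{hit.end:.1f}s")
# A012_C003: 14.2s-21.8s
# B007_C001: 3.5s-9.0s
```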

Avid has extended its metadata information into quality assurance. “For example, if PhraseFind has identified some words that should be placed in a certain scene,” Colantuoni explained, “the operator can access the linked metadata to verify that the associated closed captions are appearing in the proper shots.”
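
That kind of verification essentially amounts to checking that a caption cue with the right text overlaps the timecoded span where the phrase was found in the audio. A minimal sketch, with invented data structures standing in for Avid’s:

```python
# A minimal sketch of the caption-QA idea Colantuoni describes: given
# the timecoded span where a phrase was found and a list of caption
# cues, confirm a matching caption overlaps that span. Both structures
# below are hypothetical stand-ins, not Avid's actual schema.

def overlaps(a_start: float, a_end: float, b_start: float, b_end: float) -> bool:
    """True when two time spans (in seconds) intersect."""
    return a_start < b_end and b_start < a_end

# Span where a PhraseFind-style search located the spoken line.
phrase = {"text": "meet me at the harbor", "start": 84.0, "end": 87.5}

# Caption cues for the same shot (e.g., parsed from an SRT/TTML file).
captions = [
    {"text": "Meet me at the harbor.", "start": 84.2, "end": 87.0},
    {"text": "Tonight.", "start": 87.0, "end": 88.5},
]

matched = any(
    phrase["text"].lower() in c["text"].lower()
    and overlaps(phrase["start"], phrase["end"], c["start"], c["end"])
    for c in captions
)
print("caption OK" if matched else "caption missing or mistimed")
# caption OK
```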

All of this hinges on MediaCentral interfacing with a database: it is the metadata component that enables this kind of multi-level search.

“Metadata is going to provide the foundation for ever more elaborate search capabilities as we learn to leverage AI more extensively in post-production,” Colantuoni concluded. “But, of course, it is always going to require human intelligence to give the final result meaning.”