Facebook Inc. today shared details about two internal artificial intelligence projects, Learning from Videos and TimeSformer, that are aimed at facilitating the development of more powerful machine learning models.
With Learning from Videos, the first project, the company will train the machine learning systems powering its social network on clips uploaded by users. Facebook relies on AI to perform tasks ranging from content recommendation to policy enforcement. The company hopes that training its machine learning systems on user-created videos will enhance their capabilities.
Normally, researchers develop AI models with handmade training datasets in which the individual files are tagged by experts with descriptive labels. Those labels help guide the model in the right direction as it learns.
But there’s a tradeoff: The requirement to annotate files manually limits the size of the training dataset that researchers can realistically assemble. That, in turn, limits the amount of learning that AI models can do during training.
By enabling its models to train on unlabeled user-created videos, Facebook will allow them to learn from a far larger amount of information than is available in traditional handmade training datasets. “By learning from global streams of publicly available videos spanning nearly every country and hundreds of languages, our AI systems will not just improve accuracy but also adapt to our fast moving world and recognize the nuances and visual cues across different cultures and regions,” Facebook researchers wrote in a blog post today.
The researchers stressed that the initiative will emphasize privacy. “We’re building and maintaining a strong privacy foundation that uses automated solutions to enforce privacy at scale,” they wrote. “By embedding this work at the infrastructure level, we can consistently apply privacy requirements across our systems.”
To facilitate training on user videos, Facebook is using an approach known as self-supervised learning that removes the need for labeled datasets. It has already started implementing the approach in production. Instagram’s Reels feature, the social network disclosed, uses a self-supervised AI model to display recommended videos that are similar to clips users have viewed recently.
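The article doesn't include any code, but the core idea of self-supervised learning on unlabeled video can be sketched with a contrastive objective (SimCLR/InfoNCE-style, a common choice — this is an illustrative assumption, not Facebook's actual training code): two augmented views of the same clip should produce matching embeddings, while other clips in the batch serve as negatives, so no human labels are needed.

```python
import numpy as np

def cosine_sim(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def info_nce_loss(z1, z2, temperature=0.1):
    """Contrastive loss: each clip's two views should match each other,
    not the other clips in the batch -- no labels required."""
    sim = cosine_sim(z1, z2) / temperature  # (N, N) similarity matrix
    # Diagonal entries are the positive pairs; the rest are negatives.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy example: embeddings of two augmented "views" of 4 clips.
rng = np.random.default_rng(0)
z1 = rng.normal(size=(4, 16))
z2 = z1 + 0.01 * rng.normal(size=(4, 16))  # near-identical second view
loss = info_nce_loss(z1, z2)
```

With well-aligned view pairs the loss is close to zero; in practice the embeddings come from a neural encoder trained to minimize this objective over millions of unlabeled clips.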
Alongside its work on self-supervised learning, the company today detailed a separate AI project it dubs TimeSformer. It’s described as the first video processing AI based entirely on so-called Transformers, highly efficient machine learning models originally created to analyze text. Thanks to its use of the technology, Facebook says, TimeSformer processes data using less than one-tenth of the computing resources required by traditional models and can be trained three times as fast.
Facebook says its approach improves the training process in other ways as well. “The best 3D CNNs [a type of AI] today can only use video segments that are a few seconds long,” the company’s researchers explained. “With TimeSformer, we are able to train on far longer video clips — up to several minutes long. This may dramatically advance research to teach machines to understand complex long-form actions in video.”
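TimeSformer's efficiency comes from splitting attention across time and space rather than attending over every frame-patch token jointly (the paper calls this "divided space-time attention"). The toy NumPy sketch below illustrates that idea only — single head, no learned projections or residual connections, and not Facebook's implementation:

```python
import numpy as np

def attention(q, k, v):
    # Standard scaled dot-product attention with a stable softmax.
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def divided_space_time_attention(x):
    """x: (T, S, D) -- T frames, S patches per frame, D channels.
    Attend over time (per patch position), then over space (per frame),
    instead of over all T*S tokens at once."""
    # Temporal step: each spatial position attends across frames.
    xt = np.swapaxes(x, 0, 1)        # (S, T, D)
    xt = attention(xt, xt, xt)
    x = np.swapaxes(xt, 0, 1)        # back to (T, S, D)
    # Spatial step: patches within each frame attend to each other.
    return attention(x, x, x)

T, S, D = 8, 4, 16                    # 8 frames, 4 patches, 16 channels
x = np.random.default_rng(1).normal(size=(T, S, D))
y = divided_space_time_attention(x)
```

The payoff is the scaling: joint attention costs on the order of (T·S)² score computations, while the divided scheme costs roughly S·T² + T·S², which is what lets the model stretch to clips several minutes long.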