What you need to know
- Google DeepMind provides insight into an AI tool being used to improve YouTube’s video streaming platform.
- Shorts are aided by AI-generated metadata descriptions created by Google’s Flamingo visual language model.
- These AI-generated descriptions describe what’s being shown on screen, helping make Shorts videos more searchable.
Google has been steadily weaving AI technology into its ecosystem of apps on both desktop and mobile. That also includes popular services like YouTube, as highlighted in a recent video.
A tweet from Google DeepMind showcases how the company is applying its AI research to improve YouTube Shorts. The post shows how YouTube utilizes AI that “automatically generates descriptions for hundreds of millions of videos in their metadata, making them more searchable.”
This could improve how YouTube Shorts are discovered and watched, even though the platform already sees around 50 billion daily views. One problem with these videos, however, is that they typically lack helpful titles and descriptions because of the quick, casual nature of the format.
To combat this, Google is implementing the “Flamingo” visual language model to generate descriptions for users. This model will take frames from the Short and describe what’s happening on screen. The AI-generated description will become metadata and be stored within YouTube to help the platform categorize them and suggest them properly when users search for something to watch.
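The pipeline described above can be sketched in a few lines of Python. This is only an illustration of the general idea, not Google’s implementation: `caption_frame` is a hypothetical stand-in for the Flamingo model, which Google has not exposed as a public API.

```python
# Minimal sketch of the described pipeline: sample frames, caption them,
# store the captions as searchable metadata. `caption_frame` is a
# hypothetical placeholder for a visual language model like Flamingo.

def caption_frame(frame):
    # Placeholder: a real model would describe the frame's visual content.
    return frame["label"]

def generate_metadata(frames):
    """Caption sampled frames and join them into one searchable description."""
    captions = [caption_frame(f) for f in frames]
    # Deduplicate repeated captions while preserving their order.
    seen = set()
    unique = [c for c in captions if not (c in seen or seen.add(c))]
    return {"description": "; ".join(unique)}

def matches_query(metadata, query):
    """Naive search: does the query text appear in the stored description?"""
    return query.lower() in metadata["description"].lower()

frames = [{"label": "a dog catching a frisbee"},
          {"label": "a dog catching a frisbee"},
          {"label": "a dog running on a beach"}]
meta = generate_metadata(frames)
print(meta["description"])        # → a dog catching a frisbee; a dog running on a beach
print(matches_query(meta, "frisbee"))  # → True
```

The key design point mirrored here is that the description never surfaces to viewers or creators; it lives only in the metadata layer, where search and recommendation systems can match against it.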
These AI-generated descriptions are already being applied to all newly uploaded Shorts, meaning less work for Shorts creators.
“Our powerful visual language model Flamingo is changing the way you can watch @YouTube Shorts. 🦩 It automatically generates descriptions for hundreds of millions of videos in their metadata, making them more searchable. Here’s how AI is helping creators and viewers. ⬇️ pic.twitter.com/pAt7MxFNs1” (May 24, 2023)
Todd Sherman, the director of product management for Shorts, explained to The Verge that this is all done in the background to help improve the Shorts experience. He also pointed out that it’s being done responsibly and in a way that helps the system understand the content.
“We don’t present it to creators, but there’s a lot of effort going into making sure that it’s accurate,” Sherman told The Verge. “It’s very unlikely that a descriptive text is generated that somehow frames a video in a bad light. That’s not an outcome that we anticipate at all.”
For now, it seems like Flamingo is just being used for YouTube Shorts, but the tweet also links to a post from 2022 that details other ways Google DeepMind is improving the YouTube experience.
The Flamingo technology on YouTube is just one of Google’s latest AI endeavors. After its I/O 2023 event, we learned more about what the company has in store. AI efforts are arriving for Google Search, Workspace, and many more of the company’s products, a push that seems primed to take on Microsoft’s Bing and OpenAI’s ChatGPT.