I wrote earlier this year about a fascinating project by a team of Google researchers to automatically derive context from images. Rather than relying on metadata, the project was able to infer knowledge directly from the images themselves via a machine-learning-based approach.
Suffice it to say, doing the same thing with video is a similarly challenging proposition, especially when the video is being streamed live. Nevertheless, it's a challenge that the AI team at Twitter is tackling head-on.
Now, it should be said that this is no small undertaking, and the team has had to build their own supercomputer to handle the heavy graphics processing that’s required to do this sort of thing. Their machine is particularly adept at deep learning, which is fundamental to the approach they’re taking.
It also requires a significant shift in how videos are usually categorized. The standard approach relies on something known as collaborative filtering, which gauges interest in a video on the assumption that similar people tend to like similar videos.
This broadly works for recorded clips, but is much less effective at categorizing live streams. The Cortex researchers have attempted to improve this recommendation system so that it filters clips based on users' previous viewing habits.
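To make the collaborative-filtering idea concrete, here is a minimal sketch in Python. The watch matrix and similarity scoring are entirely hypothetical illustrations of the general technique, not Twitter's actual system.

```python
import numpy as np

# Hypothetical user-by-video engagement matrix (rows: users, cols: videos).
# 1 means the user watched the video, 0 means they did not.
watches = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
], dtype=float)

def recommend(watches: np.ndarray, user: int) -> int:
    """Return the index of the best unwatched video for `user`,
    scored by similarity-weighted votes from the other users."""
    # Cosine similarity between the target user and every user.
    norms = np.linalg.norm(watches, axis=1)
    sims = watches @ watches[user] / (norms * norms[user] + 1e-9)
    sims[user] = 0.0  # ignore self-similarity
    # Score each video by how heavily similar users watched it,
    # then mask out videos the target user has already seen.
    scores = sims @ watches
    scores[watches[user] == 1] = -np.inf
    return int(np.argmax(scores))

# User 0 overlaps with user 1, so user 1's extra video is recommended.
print(recommend(watches, user=0))  # → 2
```

The weakness the article points to is visible here: the method needs a history of watches to score against, which a just-started live stream does not have.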
They've used deep learning to train their algorithm on a vast range of examples fed into the system by a paid network of helpers, who have tagged videos with a rich range of keywords.
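The supervised loop described above, learning to predict human-supplied tags from video features, can be sketched as follows. This uses a toy logistic-regression tagger with synthetic data; Twitter's actual models are deep networks, so this only illustrates the training pattern, not their implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: rows are video feature vectors (standing in
# for frame embeddings); labels mark whether human helpers applied a
# given keyword tag, e.g. "sports".
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)  # stand-in for human-provided tags

# A minimal logistic-regression tagger trained by gradient descent on
# log loss: predict the tag, compare with the human label, adjust.
w = np.zeros(8)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted tag probability
    w -= 0.1 * X.T @ (p - y) / len(y)    # gradient step on log loss

accuracy = float(np.mean(((X @ w) > 0) == (y == 1)))
print(f"training accuracy: {accuracy:.2f}")
```

The same loop scales up conceptually: more keywords become more output units, and the hand-crafted features become learned representations, which is where the deep learning and the heavy graphics hardware come in.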
For now, the technology is not available in any of Twitter's live products, but it has apparently been tested extensively behind the scenes on platforms such as Periscope.
Given the project's fledgling state, it isn't yet clear quite how Twitter might use the technology, but one plausible application is serving more relevant adverts on live streams.
It’s certainly a fascinating approach and it will be interesting to see just where Twitter take it going forward.