Big Brother Is Watching You: Google AI Learning Human Behavior With YouTube Videos
Google is one of the leading names in the AI race. Only a handful of other companies are working at Google's level of progress in artificial intelligence. Google's AI has done some remarkable things over the past few years: it has beaten experts at their own games, and its machine learning software now builds machine learning software of its own, sometimes with higher accuracy than models built by humans. Google's AI has even learned how to walk on its own.
In recent news, Google's AI is watching us. Not literally, but through YouTube. Google has curated a set of 57,600 selected video clips, in which over 96,000 people appear, to help its AI system better understand and predict human behavior. The clips are labeled with 80 distinct actions.
The clips come from movies of various genres and countries. Google has labeled each person in every clip separately, so the AI can learn, for example, that when two people meet in a certain situation they shake hands, or hug, or sometimes kiss while hugging. By working through these clips, the AI should come to understand how humans interact in social situations.
The dataset the AI is working through is called AVA (Atomic Visual Actions). Each clip is three seconds long and shows people doing basic things such as hugging, shaking hands, or drinking from a bottle.
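To make the structure of such a dataset concrete, here is a minimal Python sketch of what one AVA-style annotation might look like: one row per labeled person per clip, with a bounding box and an action class. The field names, CSV layout, and example values are assumptions for illustration, not Google's exact schema.

```python
from dataclasses import dataclass

@dataclass
class AvaAnnotation:
    """One labeled person in one clip (illustrative, not the official schema)."""
    video_id: str                              # source clip identifier
    timestamp: float                           # keyframe time within the movie, in seconds
    bbox: tuple[float, float, float, float]    # (x1, y1, x2, y2), normalized to [0, 1]
    action_id: int                             # one of the 80 atomic action classes
    person_id: int                             # distinguishes people within the same clip

def parse_row(line: str) -> AvaAnnotation:
    """Parse one comma-separated annotation line into a structured record."""
    vid, ts, x1, y1, x2, y2, action, person = line.strip().split(",")
    return AvaAnnotation(
        video_id=vid,
        timestamp=float(ts),
        bbox=(float(x1), float(y1), float(x2), float(y2)),
        action_id=int(action),
        person_id=int(person),
    )

# Hypothetical row: person 0 in clip_0001 performing action class 63
ann = parse_row("clip_0001,902.0,0.12,0.30,0.47,0.95,63,0")
print(ann.video_id, ann.action_id, ann.person_id)
```

Because each person is labeled separately, a single three-second clip with two people hugging would produce two such rows, one per person, which is what lets a model learn who is doing what to whom.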
“Despite exciting breakthroughs made over the past years in classifying and finding objects in images, recognizing human actions still remains a big challenge,” Google commented. “This is due to the fact that actions are, by nature, less well-defined than objects in videos.”