Google's AI is binge watching YouTube videos


¯\_(ツ)_/¯
  • Google is training its AI using what it terms atomic visual actions or AVAs
  • These are three-second clips of people performing everyday actions
  • Google says it could lead to machines that can predict human behaviour
  • It could also help advertisers tailor their campaigns to actions people respond to

Google is training its AI using what it terms atomic visual actions (AVAs).

These are three-second clips of people performing everyday actions, from walking and standing up to kicking and drinking from a bottle.

Google says it sourced the content from a variety of genres and countries of origin, including clips from mainstream films and TV, to ensure a wide range of human behaviours appear in the data.

Writing in a blog post, Google software engineers, Chunhui Gu and David Ross, said: 'Teaching machines to understand human actions in videos is a fundamental research problem in Computer Vision.

'Despite exciting breakthroughs made over the past years in classifying and finding objects in images, recognising human actions still remains a big challenge.

'This is due to the fact that actions are, by nature, less well-defined than objects in videos.

'We hope that the release of AVA will help improve the development of human action recognition systems.'


Full article and source: http://www.dailymail.co.uk/sciencet...ching-YouTube-learn-humans.html#ixzz4wR16TP1A
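For a sense of how a dataset like this hangs together, AVA-style labels boil down to rows of (video, timestamp, person box, action) grouped into three-second clips. A minimal sketch in Python, assuming a made-up CSV layout and fake video IDs (the real release's exact schema may differ):

```python
import csv
import io

# Hypothetical AVA-style annotation rows: each three-second clip is keyed by a
# video ID plus the timestamp (seconds) of its middle frame; each row also has
# a normalised person bounding box (x1, y1, x2, y2) and an integer action ID.
SAMPLE = """video_id,timestamp,x1,y1,x2,y2,action_id
vid_001,902,0.077,0.151,0.283,0.811,80
vid_001,902,0.332,0.194,0.487,0.782,12
"""

def load_annotations(fileobj):
    """Group annotation rows by (video_id, timestamp), i.e. by clip."""
    clips = {}
    for row in csv.DictReader(fileobj):
        key = (row["video_id"], int(row["timestamp"]))
        box = tuple(float(row[k]) for k in ("x1", "y1", "x2", "y2"))
        clips.setdefault(key, []).append((box, int(row["action_id"])))
    return clips

clips = load_annotations(io.StringIO(SAMPLE))
for (vid, ts), people in clips.items():
    print(vid, ts, len(people), "labelled people")
```

The point of keying on (video, timestamp) rather than on the whole video is that a single clip can contain several people, each doing a different atomic action at the same moment.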

________________________

Pretty cool right? ^^
 
Welp... better invest in the bomb shelter now if Skynet's incoming xD This is seriously cool, mind... though other than benefiting the marketing team, I can't see how teaching machines to better sell you things can be a good idea... Weirdly, Jim Sterling touched on a similar subject literally yesterday: AI that analyses your gameplay and uses what it learns to pick its best moment to try to sell you things, and also to adjust prices to tailor them to the consumer...
 
It is exciting and horrifying at the same time. I heard something about an AI (not sure if it was the Google one) teaching itself to walk. Apparently they gave it a human model and didn't do a thing to it. It made the model walk.

Which hints that the AI is now thinking for itself. It no longer needs direction or someone telling it what to do. It was not told to walk, but it did. Everyone laughed because it looked wonky, but the point is that its instinct was to walk. It could have flopped around or done nothing, but it walked.
 
Google can't even properly translate a text. Try translating a long Japanese text into English; it won't make much sense. And this is technology people have been working on for decades, and they still can't do it properly. So the idea that we are anywhere near building machines that can fully understand or replicate humans is just ridiculous. Translation is a much simpler task, and they can't even do that properly.
 