DeepMind has improved on its earlier A3C agent by adding two auxiliary tasks that enable faster learning. Called UNREAL, the new agent learns by playing Atari games.
Blizzard revealed a collaboration with Google to offer StarCraft II as a learning platform for artificial intelligence that will be open to researchers.
This AI is one of the first systems to use external memory and deep learning to train itself autonomously, without the need for hard-coded instructions.
Google's DeepMind division announced a partnership with the U.K.'s Moorfields Eye Hospital that will see millions of eye records analyzed for commonalities.
DeepMind's artificial intelligence division can now add another accomplishment to its already lengthy list: the ability to teach itself 'ant soccer.'
Earlier this year, AlphaGo made short work of the fourth-ranked Go player, Lee Sedol — and now the program has its sights set on the world's number one.