Let's first watch this video:
In this video, I just gave the program a game, and it learned to play by itself. No, I did
not code the player; that would have been so traditional. Here the player — the computer, the
program — actually learns to play by itself, just by playing the game! It does not need me.
I recorded this video to show how a deep learning algorithm actually works. And as you
can see, it works amazingly well! Deep learning is a subset of artificial intelligence that tries to exhibit
"intelligent behavior" by using neural networks — structures loosely modeled on the wiring of the human brain.
It uses the kind of mathematics that, we think, human brains internally use to exhibit rational thinking.
The results have been amazing: from beating humans at Go (Thanks For Ruining Another
Game Forever, Computers) to making self-driving cars a possibility. The video above
should give you some idea of how a self-driving car could learn about obstacles and navigate by itself.
How to set up your system for this deep learning experiment?
I hope you will be excited to replicate this experiment. If you are interested, here is how I set things up.
1. Rent a GPU instance from AWS or Azure. Right now, we need GPUs. They are very costly, but the deep learning frameworks
are not optimized for CPUs — I spent multiple weeks of uptime on a CPU without any results. Go for a GPU; AWS has them.
2. Set up Ubuntu 14.04 with the proper NVIDIA drivers.
3. Install X11 and a window manager. It won't be fun otherwise.
sudo apt-get install xubuntu-desktop xfce4
4. Set up remote viewing of your powerful "Cloud Desktop" using NoMachine.
That's the best way I could find to do remote graphical viewing.
5. Clone the DeepMind-Atari-Deep-Q-Learner code.
6. Install the dependencies.
7. And, as my son would say: here you go!
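Concretely, the clone-install-run steps look something like the following sketch. I am assuming the commonly used kuz fork of the repo, whose README provides an `install_dependencies.sh` script and a `run_gpu` launcher — check the README of whichever clone you use, since script names may differ:

```shell
# Grab the Deep Q-Learner code (assuming the kuz fork on GitHub).
git clone https://github.com/kuz/DeepMind-Atari-Deep-Q-Learner.git
cd DeepMind-Atari-Deep-Q-Learner

# Install Torch and the other dependencies the repo expects.
./install_dependencies.sh

# Start training on a game, e.g. Breakout, using the GPU.
./run_gpu breakout
```

Training takes a long time, so launch it and leave it running — the next section explains how to keep an eye on it.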
You can exit NoMachine with the program still running, and come back
any time to watch your computer learning to play the game by itself.