Want to learn Artificial Intelligence, Machine Learning and Deep Learning?

Over the last 18 months I have spent my nights studying Artificial Intelligence, Machine Learning and Deep Learning. I thought it might be useful to share the resources I’ve used to gain my education in these fields. Below is a list of resources categorised into the Artificial Intelligence, Machine Learning and Deep Learning disciplines, along with a few supporting topics that are necessary for certain disciplines.

Artificial Intelligence

Machine Learning

  • Introduction to Machine Learning by Sebastian Thrun and colleagues, available on Udacity. This is a great free course if you are starting out in Machine Learning and a good place to gain a solid foundation before moving on to Deep Learning.
  • Jason Brownlee is a fellow Australian with excellent educational resources on Machine Learning. I’ve purchased his book Master Machine Learning Algorithms, in which Jason walks you through the major Machine Learning concepts and algorithms using Excel spreadsheets (yes, that is not a typo!). It sounds primitive, but it is actually a great way to unpack a Machine Learning algorithm and see the path it takes as it iteratively tries to learn the objective.
  • Andrew Ng’s Stanford Machine Learning course is available on iTunes. If your mathematics is rusty, you might find this one difficult to follow.
  • Andreas C. Müller and Sarah Guido wrote Introduction to Machine Learning with Python, another great book that covers the main concepts and algorithms using the scikit-learn Machine Learning Python package.

Deep Learning

  • The free Deep Learning Book is available online. If you want all the theory behind Deep Learning, this is a good resource.
  • Andrew Ng’s most recent course on Deep Learning. You can take the course on Coursera or view the videos for free on YouTube. If you are interested in the mathematics behind Deep Learning, this course is ideal. I particularly like the way Andrew builds on Logistic Regression as the building block of Neural Networks and Deep Neural Networks (there is a small worked sketch of this idea after the list).
  • Udacity have a Deep Learning Nanodegree, which I completed in 2017. It has received a few bad reviews; however, my experience was pretty good. A piece of advice: don’t start with this course. Use the other free resources to get acquainted with Deep Learning before you pursue the Nanodegree.
  • Practical Deep Learning for Coders and Cutting Edge Deep Learning for Coders (from fast.ai) are great for people who have a coding background (particularly Python) and want to dive right into applying Deep Learning before learning the theory. Having said that, Jeremy Howard is a fan of Excel and uses it to teach some of the main concepts behind Deep Learning.
  • Jason Brownlee has another book, Deep Learning with Python, which includes examples and code for the popular Deep Neural Network architectures.
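
To make the “Logistic Regression as a building block” idea concrete, here is a toy sketch of my own (not taken from any of the courses above) that trains a single sigmoid unit with gradient descent using numpy. The data is made up purely for illustration; a Neural Network is essentially many of these units stacked into layers.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Tiny made-up dataset: 4 samples, 2 features, binary labels.
    X = np.array([[0.0, 0.1], [0.2, 0.9], [0.8, 0.2], [0.9, 0.8]])
    y = np.array([0.0, 1.0, 0.0, 1.0])

    w = np.zeros(X.shape[1])  # weights of the single "neuron"
    b = 0.0                   # bias
    lr = 0.5                  # learning rate

    for _ in range(1000):
        y_hat = sigmoid(X @ w + b)           # forward pass
        grad_w = X.T @ (y_hat - y) / len(y)  # gradient of the cross-entropy loss
        grad_b = np.mean(y_hat - y)
        w -= lr * grad_w                     # gradient descent update
        b -= lr * grad_b

    print(np.round(sigmoid(X @ w + b), 2))   # predicted probabilities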

Mathematics and Statistics

Programming

  • Many of the courses in this list require experience with the Python programming language. If you can program but don’t know Python, check out the free course Programming Foundations with Python on Udacity.

Coding, Libraries and Frameworks

  • The Anaconda distribution of Python makes it very easy to get up and running, and it comes with a nice package management system called conda which can be used to install Machine Learning and Deep Learning packages.
  • Learn about Jupyter Notebooks. They provide a way to perform iterative programming with visualisations and interactivity mixed in.
  • There are tons of examples on GitHub. Just look for Jupyter Notebooks.
  • All the code and examples from the textbook Artificial Intelligence: A Modern Approach (3rd Edition) are available on GitHub in various languages.
  • If you are doing Machine Learning in Python, scikit-learn is the de facto standard, although certain algorithms can be slow as they have not been parallelised. There is a short scikit-learn sketch after this list.
  • If you need a high performance Machine Learning framework check out h2o.ai. There are bindings for multiple languages and it is very fast (i.e. developed with multi-processing in mind). It is also great for out-of-core processing (i.e. where the dataset you are learning from cannot fit into memory).
  • TensorFlow, Keras, PyTorch and MXNet are used for Deep Learning, although in some cases they can be used for certain Machine Learning algorithms as well. The key to these frameworks is support for GPUs, which speeds up certain operations. In some cases these frameworks can also utilise multiple GPUs and even clusters of GPU-powered servers. A minimal Keras sketch also follows this list.
  • XGBoost is the de facto library for tree-boosting algorithms and has various language bindings including Python and R. XGBoost is very popular in Kaggle competitions. A small XGBoost example follows this list as well.
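
As a quick taste of the scikit-learn API mentioned above, here is a minimal sketch showing the fit/predict pattern that most of its estimators share. The dataset and model choice are arbitrary, purely for illustration.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Load a small built-in dataset and hold some of it out for evaluation.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # Nearly every scikit-learn estimator follows the same fit/predict pattern.
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print(accuracy_score(y_test, model.predict(X_test)))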
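And here is what a similar workflow looks like in one of the Deep Learning frameworks. This is a minimal Keras sketch with made-up data and arbitrary layer sizes; on a machine with a supported GPU, TensorFlow will place the heavy operations on it automatically.

    import numpy as np
    from tensorflow import keras

    # Made-up data: 200 samples, 10 features, binary labels.
    X = np.random.rand(200, 10)
    y = (X.sum(axis=1) > 5).astype("float32")

    # A small fully connected network; each Dense layer is a stack of
    # logistic-regression-like units with a non-linear activation.
    model = keras.Sequential([
        keras.layers.Dense(16, activation="relu", input_shape=(10,)),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)
    print(model.evaluate(X, y, verbose=0))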
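Finally, a small XGBoost example via its scikit-learn-compatible wrapper (again, the dataset and parameters are only illustrative):

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    # Another built-in scikit-learn dataset, used here only for illustration.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # XGBClassifier exposes the familiar scikit-learn fit/predict interface
    # on top of the gradient tree-boosting implementation.
    model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
    model.fit(X_train, y_train)
    print(model.score(X_test, y_test))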

Compute Resources

Data

  • I like to learn by example and there is no better place for Machine Learning and Deep Learning examples than the Kaggle platform. If you want to test your skills you can always enter a Kaggle competition.
  • As you practice these disciplines you will need data to analyse. There are a number of public data repositories for this purpose. Have a look at the datasets available on Kaggle, the UCI Machine Learning Repository and the AWS Public Datasets. There are plenty more, but these should get you going.

Community

  • I often attend the Melbourne Data Science meet-up. Look for a similar community in your area and get involved in a Datathon (equivalent to a Hackathon but focused on analysis of a dataset).
  • I occasionally browse the Machine Learning subreddit.

Research

  • arXiv hosts research papers. Alternatively, there is arXiv Sanity Preserver, an interface to arXiv that offers additional features like full-text search, similarity search and a recommendation system.
  • Google Research Blog contains content from the Google Brain Team.
  • DeepMind is focused on AI research and is owned by Google. They are best known for using Artificial Intelligence to teach a computer agent to play the game of Go. The agent, named AlphaGo, beat the world Go champion Lee Sedol in 2016. If you get a chance, watch the AlphaGo documentary. Their research is primarily focused on learning to learn through Reinforcement Learning.
  • ICML is the International Conference on Machine Learning. Beware it is primarily for researchers.
  • NIPS is the Neural Information Processing Systems Foundation. They run an annual conference focused on neural-inspired representation learning systems. Again, primarily for researchers.
  • IJCAI is the International Joint Conference on Artificial Intelligence.
  • Twitter is commonly used to keep up to date on AI, ML and DL research topics. I follow these accounts: @arxiv_org, @JeffDean, @StatModeling, @fulhack, @Numenta, @_beenkim, @icmlconf, @karpathy, @fchollet, @jmschreiber91, @soumithchintala, @PyTorch, @distillpub, @RichardSocher, @_brohrer_, @sirajraval, @iamtrask, @jeremyphoward, @goodfellow_ian, @TensorFlow, @anacondainc, @enthought, @NipsConference, @Di_Ku, @DeepMindAI, @AndrewYNg, @peteratmsr, @rasbt, @odsc, @TeachTheMachine, @randal_olson, @robhyndman, @amuellerml, @DataRobot, @twiecki, @datawallow, @StanfordEng, @Google, @MSFTResearch, @indicoData, @hortonworks, @cloudera, @adriancolyer

That’s it for now. I hope you find the list useful.

@pmarelas