My self-educational approach is usually to get a few rather exhaustive books and read them from cover to cover. Here is my reading list for joining the NLP/AI/ML field.

- The "Deep Learning" book by Ian Goodfellow, Yoshua Bengio and Aaron Courville (https://www.deeplearningbook.org/) is a good resource to get a quick overview of the current tools.
- "Artificial Intelligence: A Modern Approach" by Stuart Russell and Peter Norvig (http://aima.cs.berkeley.edu/) is a great resource for all pre-neural-network tools and methods.
- "Machine Learning: A Probabilistic Perspective" by Kevin P. Murphy (https://www.cs.ubc.ca/~murphyk/MLbook/) is a great resource to go deeper into the probabilistic approach and get good exposure to Bayesian tools.
- "Information Theory, Inference and Learning Algorithms" by David MacKay (http://www.inference.org.uk/mackay/itila/book.html) is a little gem that explains probabilities and information theory so clearly it's almost unbelievable.
- "The Book of Why: The New Science of Cause and Effect" by Judea Pearl is a good introduction to causality (more accessible than his larger "Causality: Models, Reasoning and Inference").
- "Reinforcement Learning: An Introduction" by Richard S. Sutton and Andrew G. Barto (http://incompleteideas.net/book/the-book.html) is a great resource for an introductory exposure to reinforcement learning.
- Natural Language Processing: three great resources I've read with interest:
  - Kyunghyun Cho's lecture notes on "Natural Language Processing with Representation Learning" are great: https://github.com/nyu-dl/NLP_DL_Lecture_Note/blob/master/lecture_note.pdf
  - Yoav Goldberg's book "Neural Network Methods in Natural Language Processing" (https://www.amazon.com/Language-Processing-Synthesis-Lectures-Technologies/dp/1627052984) is nice too (see also an older free version here: https://arxiv.org/abs/1510.00726).
  - Jacob Eisenstein's textbook "Natural Language Processing" is also a very exhaustive read (https://github.com/jacobeisenstein/gt-nlp-class/blob/master/notes/eisenstein-nlp-notes.pdf).

It's also good to complement this with a few online courses, depending on which field you feel you should be diving deeper into. I took the following classes:

- Computational Probability and Inference (6.008.1x) on edX (https://courses.edx.org/courses/course-v1:MITx+6.008.1x+3T2016/course/)
- Probabilistic Graphical Models Specialization on Coursera (https://www.coursera.org/specializations/probabilistic-graphical-models)