Natural Language Processing Specialization from deeplearning.ai: Q&A with Younes Bensouda Mourri

Younes Bensouda Mourri is an instructor of the new Natural Language Processing Specialization from deeplearning.ai on Coursera. The intermediate-level, four-course Specialization helps learners develop deep learning techniques to build cutting-edge NLP systems. Apart from his research interest in AI, Younes is actively working to improve AI education for some of the brightest minds at Stanford as well as for millions of online learners.

How did you become interested in AI? Why is it important to further AI education? 

Back in the ‘90s, we couldn’t have imagined many of the careers that exist today. Now think about 2050. How many professions that exist right now are going to stand the test of time? That’s what intrigued me about the nature of AI — not only is it changing every other industry, but AI itself is constantly evolving. We need to be comfortable using AI the way that we all use electricity now. 

A common use of AI is to automate simpler, everyday tasks so humans can focus on complex problem-solving instead. For example, reading medical images to detect cancer faster is just one of the ways that AI is saving lives. In the future, AI could be embedded into medical devices, eliminating chart-reading time and allowing doctors to shift their focus from paperwork to patient care and recovery.

How did you get into teaching? What unique perspectives do you try to inculcate while designing educational material?

While working in the Artificial Intelligence Lab at Stanford, I started fixing bugs in Andrew Ng’s Machine Learning course on Coursera and publishing my lecture notes every day. These were surprisingly well received, which made me realize how many people wanted to learn about AI. At the same time, I realized that with online learning becoming a more prominent education solution, you could reach thousands of learners in real-time from your dorm room. 

My growing expertise in AI and software engineering skills, combined with a strong personal desire to improve education, led me to start working with Professor Ng. Designing and teaching courses in AI was where my interests and skills converged naturally. 

Creating course content and designing assignments that stand the test of time aren’t easy tasks. I have to review a lot of papers in order to understand both the actual and the projected future impact of ML technologies. Every course must find the right balance between concepts and applications — so that even if the papers become outdated, the learners retain strong foundational knowledge. 

What makes you uniquely qualified to do this? What are your goals for the future?

I consider myself fortunate; I was able to study what I’m passionate about at Stanford. Now, I’d like to pay it forward. I want my contribution to not only advance the field of AI but also help provide world-class AI education to those who might not be able to afford or access it in more traditional ways. 

To this end, I’m training the next generation of AI engineers and architects. I co-teach applied machine learning to hundreds of students at Stanford. I also work with Professor Ng and the product team at deeplearning.ai to develop AI and ML content for some amazing courses.

It’s humbling to know that the courses I’ve helped build educate millions of learners in the US and globally. I plan to continue teaching AI and to develop AI tools that can be used for education in the future.

What is Natural Language Processing (NLP) and how does it fit into the larger ML ecosystem? 

Natural Language Processing is where linguistics, computer science, and artificial intelligence converge. NLP algorithms come into play when computers have to analyze and act on large amounts of data in the form of human language. NLP has become one of the most broadly applied areas of machine learning. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.

How do you foresee the NLP landscape evolving over the next few years? What are you most excited about? 

NLP is actively reshaping our everyday lives. Every time you use web search to find information, give commands to a smart speaker, raise a customer complaint via a chatbot, or don’t see junk mail because of anti-spam filters, there’s a powerful NLP system in the background making it all possible.

I’m thrilled when NLP technologies are applied to education. Very soon, we will have robust systems that can save millions of teacher hours by providing the functionality to grade short answers and even answer questions to help students understand specific topics they might be struggling with.

What advice would you give a learner trying to break into the field of AI?

If you want to build scalable and efficient AI systems, you need a solid software engineering background so you can easily process data sets, scrape the web, use cloud services such as Amazon Web Services, set up Docker containers, and more. I recommend first taking computer science courses and then following them with the deeplearning.ai Specializations.

As a teacher, I can also confidently say that the best way to learn is to teach. I strongly encourage learners to answer questions on the forums or take turns explaining concepts to their fellow learners as a way to improve their knowledge of machine learning.

What is this Specialization about? Tell us about your role in creating this Specialization. 

This Specialization will provide you with the foundation needed to understand NLP systems and the knowledge to build them yourself. You will implement a sentiment analysis system, build models that translate human languages, and even construct a chatbot. You will master the most important NLP architectures, including transformer networks, and receive practical hands-on training to implement techniques like tokenizing text.
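To give a flavor of that kind of hands-on work, here is a minimal word-level tokenizer in Python. It is only an illustrative sketch, not the Specialization's assignment code, which covers far more robust approaches (subword tokenization, handling of punctuation and special characters, and so on).

```python
import re

def tokenize(text):
    # Lowercase the input and pull out runs of letters (keeping apostrophes),
    # discarding punctuation and whitespace in the process.
    return re.findall(r"[a-z']+", text.lower())

print(tokenize("NLP is reshaping our everyday lives!"))
# -> ['nlp', 'is', 'reshaping', 'our', 'everyday', 'lives']
```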

I began by defining a syllabus that was easy to understand yet impactful enough to be useful to learners. Working with the deeplearning.ai team, I drafted multiple versions of the educational material that you see in the Specialization today. It’s been a long incubation period but I’m very proud of the work that has gone into creating the course content! 

Who is the ideal learner for the Specialization? What should learners know prior to taking the Specialization?  

This Specialization is for students of machine learning or artificial intelligence as well as software engineers looking for an understanding of how NLP models work. Learners should have knowledge of machine learning, intermediate Python skills including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. If you would like to brush up on these skills, I recommend the Deep Learning Specialization first.
Get started by enrolling in the Natural Language Processing Specialization today.


Source:
https://blog.coursera.org/natural-language-processing-specialization-from-deeplearning-ai-qa-with-younes-bensouda-mourri/