A complete study of Stemming and Lemmatization, and which technique is used in different Natural Language Processing tasks.

Natural language processing is one of the fastest-growing fields in the world, and NLP is making its way into a number of products and services that we use in our day-to-day lives. Among the most important stages of an NLP pipeline are text processing and cleaning, which include Stemming and Lemmatization.
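To make the distinction concrete, here is a minimal, hand-rolled sketch (not a production stemmer or lemmatizer; the function names and word lists are illustrative assumptions): stemming chops suffixes by rule, while lemmatization looks up a valid dictionary form.

```python
# Stemming: crude, rule-based suffix stripping -- fast, but the result
# need not be a real word ("running" -> "runn").
def crude_stem(word):
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[:-len(suffix)]
    return word

# Lemmatization: lookup of a valid base form -- real tools consult a
# lexicon like WordNet, but the output is always an actual word.
LEMMAS = {"ran": "run", "mice": "mouse", "better": "good"}

def crude_lemma(word):
    return LEMMAS.get(word, word)

print(crude_stem("running"), crude_stem("cats"))  # runn cat
print(crude_lemma("mice"), crude_lemma("ran"))    # mouse run
```

Libraries like NLTK ship real implementations of both, e.g. a Porter stemmer and a WordNet lemmatizer.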

Image by Clarissa Watson on Unsplash

Natural Language Processing (NLP)

Textual data can come from a wide variety of sources like the world wide web, PDFs, word documents, speech recognition systems, book scans, optical character readers (OCR), etc.


A complete study of capturing the contextual meaning of neighbouring words using techniques like Word2Vec and GloVe.

One-hot encoding works in some situations but breaks down when we have a large vocabulary to deal with, because the size of our word representation grows with the number of words. What we need is a way to control the size of our word representation by limiting it to a fixed-size vector. That is where word embeddings come in!

Image by Kimberly on Unsplash

In other words, we want to find an embedding for each word in some vector space, and we want that embedding to exhibit some desired properties.

Representation of different words in vector space (Image by author)

For example, if two words are similar in meaning, they should be closer to each other than words that are not. And if two pairs of words have a similar difference in their meanings, they should be approximately equally separated in the embedding space.
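Both properties can be checked directly with cosine similarity and vector offsets. A minimal sketch with hand-picked 2-D vectors (illustrative values, not trained Word2Vec or GloVe embeddings):

```python
import numpy as np

# Toy 2-D "embeddings" chosen by hand to illustrate the two properties:
# related words sit close together, and analogous pairs share an offset.
emb = {
    "king":  np.array([0.9, 0.8]),
    "queen": np.array([0.9, 0.2]),
    "man":   np.array([0.1, 0.8]),
    "woman": np.array([0.1, 0.2]),
    "apple": np.array([-0.7, -0.5]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Property 1: similar words are closer than unrelated ones.
print(cosine(emb["king"], emb["queen"]) > cosine(emb["king"], emb["apple"]))  # True

# Property 2: analogous pairs have approximately equal offsets.
print(np.allclose(emb["king"] - emb["queen"], emb["man"] - emb["woman"]))     # True
```

Trained embeddings exhibit the same geometry, just in hundreds of dimensions instead of two.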


A better intuition to localization concept in autonomous driving!

In the previous medium article, we covered sensor fusion and how to combine readings from different sensors, like lidar and radar data. In this medium article, we will learn about localization. Localization is what allows an autonomous car to know precisely where it is. It’s an amazing topic, and I love it 😍

Introduction to Localization (Image by author)

And of course, localization is really important for driving autonomously. Without localization, it would be impossible for a self-driving car to drive safely. So, what is localization?

Localization Intuition

Conceptually, localization is pretty straightforward. A robot takes information about its environment and compares it to information it already knows about the real world. Humans do something similar. Imagine you were suddenly kidnapped and blindfolded, stuffed into a car, and driven around for hours. You would have no idea where you were. …
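That compare-against-a-map step can be sketched as a one-dimensional histogram filter. The map, the measurement, and the sensor probabilities below are illustrative assumptions, not any particular car's values:

```python
# The robot knows the map (cell colors) but not its position. It starts
# with a uniform belief, senses a color, and reweights each cell by how
# well that cell's color matches the measurement.
world = ["green", "red", "red", "green", "green"]   # known map
belief = [0.2] * 5                                   # uniform: "no idea where I am"

def sense(belief, world, measurement, p_hit=0.6, p_miss=0.2):
    new = [b * (p_hit if cell == measurement else p_miss)
           for b, cell in zip(belief, world)]
    total = sum(new)                                 # normalize back to a distribution
    return [n / total for n in new]

belief = sense(belief, world, "red")
print(belief)  # probability mass concentrates on the two red cells
```

Each new measurement (and each motion update, not shown) sharpens this belief until the car knows precisely where it is.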


A Basic introduction to RADAR and LIDAR along with their strengths and weaknesses

We played a lot with images and camera images in the previous medium articles; check for those in the references section. As you might imagine, the eyes of the machine are as important as our own eyes when we see the world. But we don’t just have eyes: we have noses, ears, and skin, we have built-in sensors that can measure the deflection of our muscles, and we can sense gravity through our ears. In this medium article, we will understand what RADAR and LIDAR are. …


Introduction to Keras and the use of Transfer Learning in the development of Deep Learning architectures

In this medium article, I’m going to explain the basic concepts behind Keras, Transfer Learning, and Multilayer Convolutional Neural Networks. I’ll be introducing an interface that sits on top of TensorFlow and allows us to draw on the power of TensorFlow with far more concise code.

Photo by Vincent Ghilione on Unsplash

That’s right. In this medium article, we’ll be building a deep neural network using a new set of tools. We’ll still have TensorFlow under the hood, but with an interface that makes testing and prototyping much faster.
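As a taste of that conciseness, here is a minimal Keras model definition (the layer sizes and activations are arbitrary choices for illustration, not a model from this article):

```python
from tensorflow import keras

# A complete, trainable network in a handful of lines: TensorFlow does the
# heavy lifting underneath, while Keras keeps the code concise.
model = keras.Sequential([
    keras.layers.Input(shape=(10,)),                 # 10 input features
    keras.layers.Dense(32, activation="relu"),       # hidden layer
    keras.layers.Dense(1, activation="sigmoid"),     # binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

Compare that with building the same graph in raw TensorFlow ops: the layers, weight variables, and forward pass are all implied by the `Sequential` definition.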

Deep Learning Framework

Deep neural networks have been a big focus of work in autonomous driving. We’re exploring whether we can get a car to drive itself using only deep neural networks and nothing else. Sometimes we call that behavioral cloning, because we’re training the network to clone human driving behavior. Sometimes it’s called end-to-end learning, because the network learns to predict the correct steering angle and speed using only the inputs from the sensors. Deep learning isn’t the only approach to building a self-driving car, though. For a number of years, people have been working on a more traditional sort of robotics approach.


Introductory concepts in the field of Image Recognition using Convolutional Neural Networks

One of the most popular ways to structure a neural network is the Convolutional Neural Network. It was invented by Yann LeCun about 30 years ago, and it has become incredibly popular for things like image processing and processing large datasets. So, let’s talk about Convolutional Neural Networks.

Photo by Victor Grabarczyk on Unsplash

Statistical Invariance

Here’s an example. We have an image, and we want our network to say it’s an image with a cat in it. It doesn’t really matter where the cat is, it’s still an image with a cat. If our network has to learn about kittens in the left corner, and about kittens in the right corner independently, that’s a lot of work that it has to do. …
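Weight sharing is the convolutional answer to this: one filter slides across the whole input, so the same feature is detected wherever it appears. A minimal one-dimensional sketch (the signals and filter values are made up for illustration):

```python
# One shared filter is applied at every position of the input, so the
# network does not have to relearn the same pattern for each location.
def convolve1d(signal, kernel):
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

kernel = [5, 4]                    # filter matched to the feature we seek
left   = [5, 4, 0, 0, 0, 0]        # feature near the left edge
right  = [0, 0, 0, 0, 5, 4]        # same feature near the right edge

print(convolve1d(left, kernel))    # strongest response at position 0
print(convolve1d(right, kernel))   # same-strength response at position 4
```

The peak response is identical in both cases; only its position moves. That is exactly the translation invariance the kitten example calls for, with 2-D filters instead of 1-D ones.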


Foundational Concepts in the field of Deep Learning and Machine Learning

Welcome to this Medium Article. This article is an extended version of the Introduction to Deep Learning for Self Driving Cars (Part — 1) 😀

Image by Marília Castelli on Unsplash

Let’s take our neural networks one level deeper and learn about concepts that every expert knows: things like activation functions, normalization, regularization, and even dropout, which makes training more robust, so that we can become much more proficient at training neural networks.
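As a preview, here is a minimal sketch of one of those techniques, inverted dropout (the keep probability and array sizes are illustrative choices, not values from this article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Inverted dropout: during training, keep each activation with probability
# keep_prob and rescale the survivors so the expected value is unchanged.
def dropout(activations, keep_prob=0.8):
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

a = np.ones(10)
print(dropout(a))  # some entries zeroed, survivors scaled to 1/0.8 = 1.25
```

Randomly silencing units this way prevents the network from leaning too hard on any single activation, which is what makes training more robust.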

Introduction

In the last medium article, referenced below, we trained a simple logistic classifier on images. Now, we’re going to take this classifier and turn it into a deep network. …


Foundational Concepts in the field of Deep Learning and Machine Learning

One of the coolest things that happened in the last decade is that Google released a framework for deep learning called TensorFlow. TensorFlow makes much of that hard work we’ve done superfluous, because now we have a software framework with which we can very easily configure and train deep networks, and TensorFlow can run on many machines at the same time. So, in this medium article, we’ll focus on TensorFlow, because if you want to become a machine learning expert, these are the tools that people in the trade use every day.

Image by Kevin Ku on Unsplash

A convolutional neural network is a specialized type of deep neural network that turns out to be particularly important for self-driving cars. …


Foundational concepts in the fields of Machine Learning, Deep Neural Networks and Self Driving Cars

Welcome to this Medium Article. This article is an extended version of the Introduction to Neural Networks For Self Driving Cars (Foundational Concepts Part — 1) 😀

Image by ahmedgad on Pixabay

One-Hot Encoding

So, as we’ve seen so far, all our algorithms are numerical. This means we need to input numbers, such as a score on a test or grades, but the input data will not always look like numbers.

Let’s say the model receives as input the fact that you either got a gift or didn’t get a gift. How do we turn that into numbers? Well, that’s easy. If you got a gift, we’ll just say the input variable is 1. And if you didn’t get a gift, we’ll just say the input variable is 0. …


Foundational concepts in the fields of Machine Learning and Deep Neural Networks

Welcome to this Medium Article. Perhaps the hottest topic in the world right now is artificial intelligence. When people talk about it, they often talk about machine learning, and specifically, neural networks. What people have done in the last decades is abstract the theory into a basic set of equations that emulate a network of artificial neurons. Then people invented ways to train these systems based on data. So, rather than instructing a machine with rules like a piece of software, these neural networks are trained on data. Now we’re going to learn the very basics: perceptrons, error functions, and other terminology that doesn’t make sense yet, but by the end of this medium article, you should be able to write and train your own neural network. …
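As a preview of where we are headed, here is a minimal perceptron trained with the classic error-driven update rule on the OR function (the learning rate, epoch count, and toy dataset are arbitrary choices for illustration):

```python
# A single artificial neuron: weighted sum -> step activation.
def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(data, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = target - pred          # the error drives every update
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Train on the OR function: output 1 unless both inputs are 0.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # [0, 1, 1, 1]
```

Everything that follows, from logistic classifiers to deep networks, is a refinement of this loop: make a prediction, measure the error, and nudge the weights to reduce it.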

About

Prateek Sawhney

AI Engineer at DPS, Germany | 1 Day Intern @Lenovo | Explore ML Facilitator at Google | HackWithInfy Finalist'19 at Infosys | GCI Mentor @TensorFlow | MAIT, IPU
