# Coursera - Deep Learning Specialization

Projects | Links: Coursera | GitHub

Deep learning is a foundation of modern artificial intelligence. At the heart of this field are "neural nets": mathematical nodes connected in specific ways to produce a desired output. There is no closed-form procedure yet for deciding how to connect a neural net, and each architecture is tailored to a specific goal. The most famous example is AlphaGo.

Andrew Ng is a professor at Stanford and the founder of DeepLearning.AI, which is on a mission to make deep learning education more accessible to the world. Ng teamed up with Coursera to share his tips, tricks, and techniques. I audited the courses in this specialization to familiarize myself with the lessons Ng and his team provide. The specialization consists of five courses:

- Neural Networks and Deep Learning
- Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
- Structuring Machine Learning Projects
- Convolutional Neural Networks
- Sequence Models

To apply and test my skills, I completed the course assignments. However, course policy does not allow sharing that code publicly, so I have created similar projects that I can share instead.

## Neural Networks and Deep Learning

### Week 1 - Understand Neural Nets

Imagine trying to predict the price of a home. We already know that the bigger the house, the more expensive it is. But we also know to consider other features, such as location (e.g. Mount Vernon, IL, vs. Orange Beach, CA), distance from schools, number of bedrooms, and size of yard (lot minus house). There are certainly more variables and more complex relationships to explore. We can use these "features" with a neural net.

The neural net finds increasingly complex relationships between these features. Each feature, and each relationship between features, gets its own weight. After everything is considered, we compare the prediction against the truth. Our goal is to minimize the error between what we predict the house costs and what it actually costs. We do this with backpropagation.
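The forward pass described above can be sketched in a few lines of NumPy. The feature values and layer sizes here are made up for illustration; the weights start random, standing in for the values training would later tune.

```python
import numpy as np

# Hypothetical features for one house: size (sq ft), bedrooms,
# distance to schools (miles), yard size (sq ft). Values are invented.
x = np.array([2100.0, 3.0, 1.2, 4500.0])

rng = np.random.default_rng(0)

# One hidden layer of 4 units. The weights are random for now;
# backpropagation is what would later adjust them.
W1 = rng.normal(scale=0.01, size=(4, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.01, size=(1, 4))
b2 = np.zeros(1)

def relu(z):
    return np.maximum(0.0, z)

def predict(x):
    # Each hidden unit takes a weighted combination of the raw
    # features; the output unit combines the hidden activations
    # into a single price estimate.
    h = relu(W1 @ x + b1)
    return (W2 @ h + b2)[0]

price = predict(x)
print(price)  # an untrained guess; training tunes W1, b1, W2, b2
```

With random weights the prediction is meaningless; the point is only the shape of the computation, features in, weighted combinations through the hidden layer, one number out.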

Backpropagation is like doing a calculation in reverse, which then changes the initial values of the calculation. The goal of backpropagation is to tune the calculation so we can accurately predict the house price. The challenge is to do this for any house, even one in an area or of a size we have never encountered. The key in this exercise is the weights.
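A minimal sketch of the "calculate in reverse, then adjust the weights" loop, using a single-weight linear model and invented toy data so the gradient math stays visible:

```python
import numpy as np

# Toy data (made up): house sizes and prices, both rescaled.
# The true relationship here is price = 1.5 * size.
sizes = np.array([1.0, 2.0, 3.0, 4.0])
prices = np.array([1.5, 3.0, 4.5, 6.0])

w = 0.0    # initial weight guess
lr = 0.05  # learning rate: how big a correction each step takes

for step in range(200):
    pred = w * sizes       # forward pass: predict prices
    error = pred - prices  # how far off the predictions are
    # Backward pass: derivative of the mean squared error wrt w.
    grad = 2.0 * np.mean(error * sizes)
    w -= lr * grad         # nudge w in the direction that reduces error

print(round(w, 3))  # converges to 1.5, the true relationship
```

Each iteration runs the prediction forward, measures the error, and propagates that error backward into a correction on the weight. A real network repeats the same idea across many weights and layers via the chain rule.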

### Week 2 - Build NN for Logistic Regression

**In progress**

### Week 3 - Planar Data Classification with Shallow NN

**In progress**

### Week 4 - Image Classification with Deep NN

**In progress**

## Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

**In progress**

## Structuring Machine Learning Projects

**In progress**

## Convolutional Neural Networks

**In progress**

## Sequence Models

**In progress**