Training and Internship
### Introduction to Deep Learning with Python

- Deep Learning
- Python

- Certification – a certificate will be provided after completion of the course.
- Available – online/offline learning (live sessions for every class; no recorded lectures).
- 60-hour course.
- Rich course content for in-depth learning.
- Pay after six demo classes.
- 70% hands-on + 30% theory.
- One MCQ test and one case study are compulsory for completion of the course.
- The certificate is generated after successful completion of the course.

Prerequisites:

- Mathematics and Statistics
- Python

Level-1

- Why use Python?
- Python IDE
- Simple Program in Python
- Numbers and Math Functions
- Common Errors in Python

Level-2

- Variables & Names
- String basics
- Conditional statements
- Assignment 2
- Functions
- For and While Loops

Level-3

- Functions as arguments
- Lists, Tuples and Dictionaries
- List Comprehension
- File handling
- Classes and Objects
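
A quick taste of the Level-3 topics above (a class, a list comprehension, and a function passed as an argument); the student data is purely illustrative:

```python
# Illustrative sketch of the Level-3 topics: classes and objects,
# list comprehensions, and functions as arguments.
class Student:
    def __init__(self, name, score):
        self.name = name
        self.score = score

def apply_to_scores(func, students):
    # Functions are first-class values: 'func' arrives as an argument.
    return [func(s.score) for s in students]

students = [Student("A", 72), Student("B", 85), Student("C", 90)]
curved = apply_to_scores(lambda x: min(100, x + 5), students)   # cap at 100
passed = [s.name for s in students if s.score >= 80]            # list comprehension
```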

Level-4

- NumPy
- Pandas
- Matplotlib
- Seaborn
- ggplot
- TensorFlow
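
A first look at NumPy from the Level-4 stack, showing how vectorised arithmetic replaces explicit Python loops (the numbers are illustrative):

```python
import numpy as np

# Vectorised arithmetic: the square is applied elementwise, no loop needed.
a = np.arange(5)        # array([0, 1, 2, 3, 4])
b = a ** 2              # array([0, 1, 4, 9, 16])
total = b.sum()         # 30
```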

- Biological Neuron
- From Spring to Winter of AI
- The Deep Revival
- From Cats to Convolutional Neural Networks
- The Curious Case of Sequences
- Motivation from Biological Neurons
- McCulloch-Pitts Neuron
- Thresholding Logic, Perceptrons
- Errors and Error Surfaces
- Perceptron Learning Algorithm
- Proof of Convergence of the Perceptron Learning Algorithm
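
As a preview of the perceptron learning algorithm listed above, here is a minimal NumPy sketch on a toy AND-gate dataset (the data and epoch cap are illustrative, not course material):

```python
import numpy as np

# Perceptron learning on a linearly separable toy problem (the AND gate).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

Xb = np.hstack([X, np.ones((4, 1))])   # append a constant bias input
w = np.zeros(3)

for _ in range(20):
    errors = 0
    for xi, target in zip(Xb, y):
        pred = 1 if xi @ w >= 0 else 0
        if pred != target:
            # Perceptron update: move w toward misclassified positives,
            # away from misclassified negatives.
            w += (target - pred) * xi
            errors += 1
    if errors == 0:                    # converged: every point classified
        break

preds = [1 if xi @ w >= 0 else 0 for xi in Xb]
```

Convergence is guaranteed here because the AND function is linearly separable, which is exactly what the convergence proof in the syllabus establishes.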

- Linearly Separable Boolean Functions
- Representation Power of a Network of Perceptrons
- Sigmoid Neuron
- A Typical Supervised Machine Learning Setup
- Learning Parameters: (Infeasible) Guess Work
- Learning Parameters: Gradient Descent
- Representation Power of a Multilayer Network of Sigmoid Neurons
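
The "learning parameters with gradient descent" topic above can be sketched for a single sigmoid neuron on squared-error loss; the two data points, the initial weights, and the learning rate are illustrative toy values:

```python
import numpy as np

# Gradient descent for one sigmoid neuron f(x) = sigmoid(w*x + b),
# minimising squared error over two toy points.
X = np.array([0.5, 2.5])
Y = np.array([0.2, 0.9])

def sigmoid(w, b, x):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

def grad(w, b, x, y):
    fx = sigmoid(w, b, x)
    dw = (fx - y) * fx * (1 - fx) * x   # d/dw of 0.5*(fx - y)^2
    db = (fx - y) * fx * (1 - fx)       # d/db of the same loss
    return dw, db

w, b, eta = -2.0, -2.0, 1.0
for _ in range(10000):
    dw = db = 0.0
    for x, y in zip(X, Y):              # full-batch gradient
        gw, gb = grad(w, b, x, y)
        dw += gw
        db += gb
    w -= eta * dw
    b -= eta * db

loss = 0.5 * sum((sigmoid(w, b, x) - y) ** 2 for x, y in zip(X, Y))
```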

- Multilayer Perceptron: Introduction, Model, Learning, Evaluation
- Geometry Basics and Geometric Interpretation
- Perceptron Learning: General Recipe and Learning Algorithm
- Perceptron Learning: Why Does it Work?
- Perceptron Learning: Will it Always Work?
- Perceptron: Evaluation
- A Simple Deep Neural Network
- A Generic Deep Neural Network
- Understanding the Computations in a Deep Neural Network
- The Output Layer of a Deep Neural Network
- The Output Layer of a Multi-Class Classification Problem
- How to Choose the Right Network Configuration
- Loss Function for Binary Classification
- Learning Algorithm (Non-Mathy Version)
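
The "generic deep neural network" and "output layer for multi-class classification" topics above amount to a forward pass through stacked layers ending in a softmax; a minimal sketch with arbitrary illustrative layer sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(a):
    return np.maximum(0.0, a)

def softmax(a):
    e = np.exp(a - a.max())     # subtract the max for numerical stability
    return e / e.sum()

# Generic deep network: two hidden layers, softmax output for 3 classes.
sizes = [4, 5, 5, 3]            # input dim, two hidden widths, number of classes
W = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[1:], sizes[:-1])]
b = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    h = x
    for Wi, bi in zip(W[:-1], b[:-1]):
        h = relu(Wi @ h + bi)            # pre-activation, then nonlinearity
    return softmax(W[-1] @ h + b[-1])    # output layer: class probabilities

y_hat = forward(rng.standard_normal(4))
```

The softmax output sums to one, which is what lets it be read as a distribution over the three classes.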

Backpropagation

- Setting the context
- Chain rule of derivatives
- Applying chain rule across multiple paths
- Applying Chain rule in a neural network
- Computing Partial Derivatives w.r.t. a weight
- Computing Derivatives w.r.t. Hidden Layers
- Computing Partial Derivatives w.r.t. a weight when there are multiple paths
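
The chain-rule steps above can be sketched on the smallest possible network, one weight per layer, with a finite-difference check that the chained product really is the partial derivative (all values are illustrative):

```python
import numpy as np

# Chain rule along a single path: x -> h = sigmoid(w1*x) -> yhat = sigmoid(w2*h),
# with loss L = 0.5*(yhat - y)^2.  dL/dw1 is the product of the local derivatives.
def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

x, y = 1.5, 0.3
w1, w2 = 0.4, -0.6

h = sigmoid(w1 * x)
yhat = sigmoid(w2 * h)

dL_dyhat  = yhat - y                 # dL/dyhat
dyhat_da2 = yhat * (1 - yhat)        # sigmoid derivative at the output
da2_dh    = w2                       # pre-activation w.r.t. hidden activation
dh_da1    = h * (1 - h)              # sigmoid derivative at the hidden unit
da1_dw1   = x                        # pre-activation w.r.t. the weight

# Multiply the local derivatives along the path:
dL_dw1 = dL_dyhat * dyhat_da2 * da2_dh * dh_da1 * da1_dw1

# Numerical check by central finite differences.
eps = 1e-6
def loss(w1_):
    return 0.5 * (sigmoid(w2 * sigmoid(w1_ * x)) - y) ** 2
num = (loss(w1 + eps) - loss(w1 - eps)) / (2 * eps)
```

When a weight influences the loss through multiple paths, the same product is computed per path and the contributions are summed.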

- Optimization Techniques
- Contour Maps
- Momentum-Based Gradient Descent
- Nesterov Accelerated Gradient Descent
- Stochastic and Mini-Batch Gradient Descent
- Tips for Adjusting Learning Rate and Momentum
- Line Search
- Gradient Descent with Adaptive Learning Rate
- Bias Correction in Adam
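
Momentum-based gradient descent from the list above, sketched on a simple quadratic bowl (the matrix, learning rate, and momentum coefficient are illustrative choices):

```python
import numpy as np

# Momentum-based gradient descent on f(w) = 0.5 * w^T A w.
A = np.diag([1.0, 10.0])            # ill-conditioned: curvature differs per axis

def grad(w):
    return A @ w

eta, gamma = 0.05, 0.9              # learning rate and momentum coefficient
w = np.array([5.0, 5.0])
v = np.zeros(2)
for _ in range(200):
    v = gamma * v + eta * grad(w)   # exponentially decaying history of gradients
    w = w - v                       # step along the accumulated direction

mom_norm = np.linalg.norm(w)        # distance from the minimum at the origin
```

The accumulated velocity lets the iterate keep moving along the shallow axis while damping oscillation along the steep one, which is the intuition behind the contour-map discussions in this module.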

- Bias and Variance
- Train Error vs. Test Error
- Train Error vs. Test Error (Recap)
- True Error and Model Complexity
- L2 Regularization
- Dataset Augmentation
- Parameter Sharing and Tying
- Adding Noise to the Inputs
- Adding Noise to the Outputs
- Early Stopping
- Ensemble Methods
- Dropout
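
Dropout from the regularization list above can be sketched in a few lines; this uses the common "inverted" formulation, where the drop probability and vector size are illustrative:

```python
import numpy as np

# Inverted dropout: at training time, zero a random fraction p of the
# activations and scale survivors by 1/(1-p), so the expected activation
# matches test time (when dropout is a no-op).
rng = np.random.default_rng(42)
p = 0.5                                     # drop probability (illustrative)

def dropout(h, train=True):
    if not train:
        return h                            # test time: pass through unchanged
    mask = (rng.random(h.shape) >= p) / (1.0 - p)
    return h * mask

h = np.ones(10000)
out = dropout(h)
keep_fraction = np.mean(out > 0)            # roughly 1 - p
mean_activation = out.mean()                # roughly 1.0 in expectation
```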

Convolutional Neural Networks

- The Convolution Operation
- Relation between Input Size, Output Size and Filter Size
- Convolutional Neural Networks
- CNN Success Stories on ImageNet
- Image Classification Continued (GoogLeNet and ResNet)
- Visualizing Patches Which Maximally Activate a Neuron
- Visualizing the Filters of a CNN
- Occlusion Experiments
- Finding the Influence of Input Pixels Using Backpropagation
- Guided Backpropagation
- Optimization over Images
- Creating Images from Embeddings
- Deep Dream
- Deep Art
- Fooling Deep Convolutional Neural Networks
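
The first two topics above, the convolution operation and the input/output/filter size relation, can be sketched directly (the 4x4 image and all-ones kernel are illustrative):

```python
import numpy as np

# Relation between input size n, filter size f, padding p and stride s:
#   output = (n - f + 2*p) // s + 1
def out_size(n, f, p=0, s=1):
    return (n - f + 2 * p) // s + 1

# Naive 2-D convolution (strictly, cross-correlation, as in most DL libraries):
# slide the filter over every valid position and take the elementwise sum.
def conv2d(img, kernel):
    n, f = img.shape[0], kernel.shape[0]
    m = out_size(n, f)
    out = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            out[i, j] = np.sum(img[i:i+f, j:j+f] * kernel)
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
res = conv2d(img, np.ones((3, 3)))   # 4x4 input, 3x3 filter -> 2x2 output
```

The same size formula explains, e.g., why an 11x11 filter with stride 4 over a 227x227 input yields a 55x55 feature map.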

- Eigenvalues and Eigenvectors
- Linear Algebra: Basic Definitions
- Eigenvalue Decomposition
- Principal Component Analysis and its Interpretations
- Singular Value Decomposition
- Introduction to Autoencoders
- Link between PCA and Autoencoders
- Regularization in Autoencoders
- Denoising Autoencoders
- Sparse Autoencoders
- Contractive Autoencoders
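
The link between eigendecomposition, SVD and PCA listed above can be shown numerically: the squared singular values of the centered data matrix, divided by n-1, equal the eigenvalues of the covariance matrix. The data below is random and illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D toy data (the mixing matrix is an arbitrary choice).
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

Xc = X - X.mean(axis=0)                      # center the data for PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Eigendecomposition of the covariance matrix gives the same components.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)

# S^2 / (n-1) recovers the covariance eigenvalues (the explained variances).
recovered = S ** 2 / (len(Xc) - 1)
```

A linear autoencoder with a bottleneck trained on squared reconstruction error learns the same subspace as these principal components, which is the PCA-autoencoder link in the syllabus.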

Sequence Learning Part 1

- One-Hot Representations of Words
- Distributed Representations of Words
- SVD for Learning Word Representations
- Continuous Bag of Words (CBOW) Model
- Skip-Gram Model
- Contrastive Estimation
- Hierarchical Softmax
- GloVe Representations
- Evaluating Word Representations
- Relation between SVD and Word2Vec
- Sequence Learning Problems
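
"SVD for learning word representations" from the list above can be sketched end to end: build a word-word co-occurrence matrix and factorize it. The tiny corpus and window size of 1 are illustrative choices, not course material:

```python
import numpy as np

# Distributed word vectors from SVD of a co-occurrence matrix (window = 1).
corpus = "the dog chased the cat the cat chased the rat".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

C = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):              # neighbours inside the window
        if 0 <= j < len(corpus):
            C[idx[w], idx[corpus[j]]] += 1

U, S, Vt = np.linalg.svd(C)
embeddings = U[:, :2] * S[:2]             # 2-dimensional word vectors
```

Unlike one-hot vectors, these low-dimensional rows place words with similar contexts near each other, which is the point of distributed representations.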

Sequence Learning Part 2

- Recurrent Neural Networks
- Backpropagation Through Time
- The Problem of Exploding and Vanishing Gradients
- Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRUs)
- Introduction to Encoder-Decoder Models
- Applications of Encoder-Decoder Models
- Attention Mechanism
- Attention over Images
- Hierarchical Attention
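
A vanilla recurrent cell from the list above, unrolled over a short sequence; the dimensions, sequence length, and random inputs are illustrative:

```python
import numpy as np

# Vanilla RNN unrolled over T steps: the same parameters (W, U, b) are reused
# at every step, and the hidden state h carries information forward.  It is
# the repeated multiplication through W that backpropagation-through-time
# differentiates, which is where gradients vanish or explode.
rng = np.random.default_rng(1)
d_in, d_h, T = 3, 4, 6

W = rng.standard_normal((d_h, d_h)) * 0.1    # hidden-to-hidden weights
U = rng.standard_normal((d_h, d_in)) * 0.1   # input-to-hidden weights
b = np.zeros(d_h)

xs = rng.standard_normal((T, d_in))          # toy input sequence
h = np.zeros(d_h)
states = []
for x in xs:
    h = np.tanh(W @ h + U @ x + b)           # h_t = tanh(W h_{t-1} + U x_t + b)
    states.append(h)
```

LSTMs and GRUs keep this unrolled structure but add gates that give gradients a better-behaved path through time.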