Foundations
- Introduction to machine learning, deep learning and AI
- Introduction to neural networks, TensorFlow and Keras
- Important changes in TensorFlow 2.0
- Introduction to Artificial Intelligence applications
Classification – our first example of TensorFlow 2.0
- Multi-layer perceptron (MLP) – our first example of a network
- Problems in training the perceptron and their solutions
- Activation functions – sigmoid, tanh, ReLU and others
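The activation functions named above can be sketched in a few lines of NumPy. This is a minimal illustration of the mathematics, not the TensorFlow implementations used in the course:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); saturates for large |x|, which slows training
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centred relative of the sigmoid, output in (-1, 1)
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positives through, zeroes out negatives
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # midpoint is 0.5 at x = 0
print(relu(x))     # [0. 0. 2.]
```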
Improving the baseline
- Establishing a baseline
- Improving the baseline using various strategies such as:
    - hidden layers
    - dropout
    - optimizers
    - epochs
    - controlling the optimizer learning rate
    - increasing the number of internal hidden neurons
    - increasing the batch size, etc.
- Regularization to avoid overfitting
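Several of these strategies (extra hidden layers, dropout, L2 regularization, and an explicit optimizer learning rate) can be combined in one Keras model. The layer sizes and rates below are illustrative assumptions, not values prescribed by the course:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Illustrative MLP: two hidden layers, dropout, and L2 weight regularization
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(128, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),
    layers.Dropout(0.3),  # randomly zeroes ~30% of activations during training
    layers.Dense(128, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),
    layers.Dropout(0.3),
    layers.Dense(10, activation="softmax"),
])

# Controlling the optimizer learning rate explicitly
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

probs = model.predict(np.zeros((2, 784), dtype="float32"), verbose=0)
print(probs.shape)  # (2, 10)
```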
Regression
- What is regression?
- Prediction using linear regression
- Simple linear regression, multiple linear regression and multivariate linear regression
- Predicting house price using linear regression
- Logistic regression
- Logistic regression on the MNIST (Modified National Institute of Standards and Technology) dataset
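Multinomial logistic regression on MNIST amounts to a single Dense layer with a softmax output fed flattened 28×28 images. A minimal Keras sketch follows; real data would come from `tf.keras.datasets.mnist.load_data()`, and the random stand-in data here just keeps the sketch self-contained:

```python
import numpy as np
import tensorflow as tf

# One softmax layer over flattened pixels = multinomial logistic regression
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Stand-in data in place of tf.keras.datasets.mnist.load_data()
x_fake = np.random.rand(32, 28, 28).astype("float32")
y_fake = np.random.randint(0, 10, size=(32,))
model.fit(x_fake, y_fake, epochs=1, verbose=0)

probs = model.predict(x_fake[:1], verbose=0)
print(probs.shape)  # (1, 10): one probability per digit class
```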
Convolutional Neural Networks
- Deep Convolutional Neural Network (DCNN)
- Local receptive fields
- Shared weights and bias
- Pooling layers, max pooling, average pooling
- LeNet code in TensorFlow 2.0
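A LeNet-style DCNN can be written in a few lines of TF 2.0 Keras. Filter counts and kernel sizes below follow the classic design, but the details in the course code may differ:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# LeNet-style stack: conv -> pool -> conv -> pool -> dense head
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(6, kernel_size=5, activation="relu", padding="same"),
    layers.MaxPooling2D(pool_size=2),   # max pooling halves the spatial size
    layers.Conv2D(16, kernel_size=5, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(120, activation="relu"),
    layers.Dense(84, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

out = model.predict(np.zeros((1, 28, 28, 1), dtype="float32"), verbose=0)
print(out.shape)  # (1, 10)
```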
Natural Language Processing
- Word embedding
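A word embedding maps integer token indices to dense trainable vectors via a lookup table. A minimal Keras sketch, where the vocabulary size and embedding dimension are illustrative assumptions:

```python
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 1000   # illustrative vocabulary size
EMBED_DIM = 16      # illustrative embedding dimension

# Each token id is looked up in a trainable (VOCAB_SIZE, EMBED_DIM) table
embedding = tf.keras.layers.Embedding(input_dim=VOCAB_SIZE,
                                      output_dim=EMBED_DIM)

tokens = np.array([[4, 25, 7, 0, 0, 0, 0, 0]])  # one padded sentence of 8 ids
vectors = embedding(tokens)
print(vectors.shape)  # (1, 8, 16): one 16-dim vector per token
```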
Recurrent Neural Networks
- The basic RNN cell
- Backpropagation through time (BPTT)
- Vanishing and exploding gradients
- Long short-term memory (LSTM)
- Gated recurrent unit (GRU)
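The gated cells above can be wired into a small binary sequence classifier; swapping `layers.LSTM` for `layers.GRU` gives the GRU variant. Sizes here are illustrative assumptions:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Token ids -> embeddings -> LSTM -> binary score (e.g. sentiment).
# Gated cells like the LSTM mitigate the vanishing-gradient problem of BPTT.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),              # sequences of 20 token ids
    layers.Embedding(input_dim=500, output_dim=16),
    layers.LSTM(32),                          # replace with layers.GRU(32) for a GRU
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

x = np.random.randint(0, 500, size=(4, 20))
preds = model.predict(x, verbose=0)
print(preds.shape)  # (4, 1)
```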
Autoencoders
- Introduction to autoencoders
- Vanilla autoencoders
- Sparse autoencoder
- Denoising autoencoders
- Clearing images using a denoising autoencoder
- Stacked autoencoder
- Convolutional autoencoder for removing noise from images
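The denoising idea can be sketched with a small dense autoencoder: the input is a noisy image and the training target is the clean original. Layer sizes and the noise level are illustrative assumptions; the course uses a convolutional variant for images:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Dense denoising autoencoder: compress to a bottleneck, then reconstruct
autoencoder = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(64, activation="relu"),      # encoder: compress to 64 dims
    layers.Dense(784, activation="sigmoid"),  # decoder: reconstruct the image
])
autoencoder.compile(optimizer="adam", loss="mse")

clean = np.random.rand(16, 784).astype("float32")
noisy = np.clip(clean + 0.2 * np.random.randn(16, 784),
                0.0, 1.0).astype("float32")

# Key idea: the input is the noisy image, the target is the clean one
autoencoder.fit(noisy, clean, epochs=1, verbose=0)
denoised = autoencoder.predict(noisy[:1], verbose=0)
print(denoised.shape)  # (1, 784)
```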
Unsupervised Learning
- Principal component analysis
- PCA on the MNIST dataset
- K-means clustering
- Restricted Boltzmann machines
- Reconstructing images using RBM
- Deep belief networks
- Variational Autoencoders
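PCA and K-means can be sketched with scikit-learn (an assumption here; the course may implement them differently). PCA projects the data onto its directions of maximum variance, and K-means then clusters the projected points:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in data: 200 samples, 64 features (e.g. flattened 8x8 digit images)
X = rng.normal(size=(200, 64))

# PCA: project onto the top-2 directions of maximum variance
X2 = PCA(n_components=2).fit_transform(X)
print(X2.shape)  # (200, 2)

# K-means: partition the projected points into 10 clusters
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X2)
print(labels.shape)  # (200,)
```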
TensorFlow and the Cloud
- Deep learning in the cloud
- Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP)
- Virtual machines in the cloud
- TensorFlow Extended for production
Working with real-life applications – processes
- Working with real-life applications – Agile, the Cloud, etc.
The course will remain open, and the materials will remain available to you, for six weeks after the course ends.
Note that the specific course content may be subject to minor changes, as the course is regularly reviewed to keep it up to date with industry trends in this emerging field.
Coding Exercises:
Hands-on coding exercises will be set during the week via Microsoft Teams, with full support provided by the tutors. These will be a mix of individual and group projects.
The coding exercises will relate to the areas discussed in the tutorials:
- Classification – our first example of TensorFlow 2.0: Recognising handwritten digits as a classification example
- Improving the baseline: regularization
- Regression: logistic regression
- Convolutional Neural Networks: CIFAR-10
    - Recognizing CIFAR-10 images with deep learning
    - Improving the CIFAR-10 performance with a deeper network
    - Improving the CIFAR-10 performance with data augmentation
    - Predicting with CIFAR-10
- Natural Language Processing
    - Using word embeddings for spam detection
    - Using BERT (Bidirectional Encoder Representations from Transformers)
- Recurrent Neural Networks: Sentiment Analysis
- Autoencoders: Reconstructing handwritten digits using an autoencoder
- Unsupervised Learning: PCA and K-means