Module 6 | 40 mins

Foundations of Neural Networks and Deep Learning

Professor

Prof. Paolo Tasca

The module is taught by Prof. Paolo Tasca, a digital economist, Co-founder and Executive Chairman of Exponential Science, and an experienced blockchain entrepreneur and advisor to global institutions, including the United Nations and central banks.

Professor

Prof. Nikhil Vadgama

This module is taught by Prof. Nikhil Vadgama, Programme Director of the MSc Financial Technology at UCL and Director of Exponential Science, who has extensive experience in fintech and in collaborating with governments, industry, and central banks worldwide.

What you’ll learn

  1. What is a Neural Network
  2. Input, Processing and Output: How Networks See Data
  3. From the Perceptron to Deep Learning
  4. Artificial Neurons: The Basic Building Blocks
  5. Layers and Network Architecture
  6. What Makes a ‘Deep’ Network
  7. Training Neural Networks: Learning from Data
  8. Backpropagation and Weight Adjustment
  9. Applications: Chatbots, Image Generation, and More
  10. Limitations, Myths and Responsible AI Use

Key takeaways:

  • Neural networks are models inspired by the human brain, built from simple units called neurons.
  • Neural networks process inputs numerically, detect patterns and produce meaningful outputs.
  • Artificial neurons adjust weights to learn which inputs matter most.
  • Layers allow networks to combine simple features into complex patterns.
  • Deep networks use multiple hidden layers to extract high-level representations.
  • Training uses backpropagation and gradient descent to refine predictions.
  • Neural networks power chatbots, recommendation systems, image generation, self-driving cars and more.
  • Responsible development requires scientific curiosity, ethical awareness and careful use.
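The takeaways above can be made concrete in a few lines of code. The sketch below is a minimal, illustrative example (not taken from the module itself): a tiny network with one hidden layer, trained with backpropagation and gradient descent on the XOR problem, a classic pattern that a single perceptron cannot learn but a hidden layer can. All sizes, names, and hyperparameters here are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases: the adjustable parameters the network learns.
# Architecture (illustrative): 2 inputs -> 4 hidden neurons -> 1 output.
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

lr = 1.0  # learning rate for gradient descent

def forward(X):
    h = sigmoid(X @ W1 + b1)      # hidden layer activations
    out = sigmoid(h @ W2 + b2)    # network prediction
    return h, out

losses = []
for epoch in range(5000):
    h, out = forward(X)
    losses.append(np.mean((out - y) ** 2))   # mean squared error

    # Backpropagation: apply the chain rule, output layer first.
    d_out = (out - y) * out * (1 - out)      # error at the output layer
    dW2 = h.T @ d_out / len(X)
    db2 = d_out.mean(axis=0, keepdims=True)

    d_h = (d_out @ W2.T) * h * (1 - h)       # propagate error to hidden layer
    dW1 = X.T @ d_h / len(X)
    db1 = d_h.mean(axis=0, keepdims=True)

    # Gradient descent: nudge each weight against its gradient.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, preds = forward(X)
print("final loss:", losses[-1])
print("predictions:", preds.ravel().round(2))
```

Each training pass mirrors the list above: inputs flow forward through the layers, the error is propagated backward, and the weights are adjusted so the network gradually learns which inputs matter most.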
Published: 12 Aug 2024
Created: 13 Aug 2025
Edited: 12 Aug 2024
