TRAINING CATEGORIES

37 - ITC - Information Technology - Miscellaneous


ITC 133B - Understanding Deep Neural Networks for Text Analytics (7 Days)

Code      Start Date         Duration  Venue
ITC 133B  15 April 2024      7 Days    Istanbul
ITC 133B  20 May 2024        7 Days    Istanbul
ITC 133B  24 June 2024       7 Days    Istanbul
ITC 133B  29 July 2024       7 Days    Istanbul
ITC 133B  02 September 2024  7 Days    Istanbul
ITC 133B  07 October 2024    7 Days    Istanbul
ITC 133B  11 November 2024   7 Days    Istanbul
ITC 133B  16 December 2024   7 Days    Istanbul
Please contact us for fees

 

Course Description

Deep architectures are neural networks with multiple processing layers, each layer performing a specific task. A variety of deep learning architectures have been developed to date, and they have contributed significantly to the domain of text processing. Word2Vec and GloVe represent words as vectors, and this vector representation makes text much easier to process computationally. Architectures such as CNNs and autoencoders help automate feature extraction, while architectures such as RNNs and LSTMs are used for sequential text processing.
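The vector representation mentioned above can be made concrete with a tiny sketch: once words are vectors, semantic similarity reduces to geometry (cosine similarity). The three-dimensional values below are illustrative placeholders, not trained Word2Vec or GloVe embeddings.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 3-dimensional embeddings (made-up values for illustration only).
vectors = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much smaller
```

Real trained embeddings behave the same way, only in hundreds of dimensions: related words end up with nearby vectors.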

Course Objectives

  • Introduction to Artificial Intelligence (AI), Machine Learning & Deep Learning
  • Introduction to Deep Learning for Natural Language Processing (NLP)
  • Understanding Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN)
  • Learning useful tools and techniques 

 

Who Should Attend?

  • Data Scientists
  • Data Engineers
  • Data Architects

Course Details/Schedule

Day 1

  • Introduction to AI, Machine Learning & Deep Learning
  • Definitions: AI, ML, DL
  • Types of tasks: supervised learning, unsupervised learning, reinforcement learning
  • Types of problems: classification, regression, clustering, density estimation, dimensionality reduction
  • Examples of machine learning algorithms: linear regression, Naive Bayes, random forests
  • Machine Learning vs. Deep Learning, especially from the perspective of text analytics, classification and summarization tasks
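The linear-regression example in the list above can be sketched in a few lines: fitting y = w·x + b by ordinary least squares via the normal equations. The slope and intercept values are arbitrary choices for the demonstration.

```python
import numpy as np

# Generate noisy points along a known line (w = 3, b = 2 chosen arbitrarily).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=50)

# Design matrix [x, 1] lets lstsq recover both slope and intercept at once.
X = np.column_stack([x, np.ones_like(x)])
(w, b), *_ = np.linalg.lstsq(X, y, rcond=None)

print(round(float(w), 2), round(float(b), 2))  # close to 3.0 and 2.0
```

This closed-form fit is the supervised-learning baseline that the deep models later in the course generalize: same idea (minimize a cost over parameters), far more flexible function families.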

Day 2

  • Basic concepts of a neural network (application: the multi-layer perceptron)
  • Definition of a neural network: classical architecture, activations, weighting of previous activations, depth of a network
  • Definition of neural network training: cost functions, back-propagation, stochastic gradient descent, maximum likelihood
  • Modeling a neural network: modeling input and output data according to the type of problem (regression, classification, ...); the curse of dimensionality
  • Distinction between multi-feature data and signals; choosing a cost function according to the data
  • Approximating a function with a neural network: presentation and examples
  • Approximating a distribution with a neural network: presentation and examples
  • Data augmentation: how to balance a dataset
  • Generalization of a neural network's results
  • Initialization and regularization of a neural network: L1/L2 regularization, batch normalization
  • Optimization and convergence algorithms
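The Day 2 topics above (multi-layer perceptron, cost function, back-propagation, gradient descent) fit in one minimal NumPy sketch. The XOR task, layer sizes, learning rate and iteration count are arbitrary choices for illustration; this is a teaching sketch, not a framework.

```python
import numpy as np

rng = np.random.default_rng(42)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer (2 -> 4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer (4 -> 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(X)
loss_before = float(np.mean((out - y) ** 2))      # mean-squared-error cost

for _ in range(5000):                             # full-batch gradient descent
    h, out = forward(X)
    # Back-propagate the cost through both layers (chain rule).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

_, out = forward(X)
loss_after = float(np.mean((out - y) ** 2))
print(loss_before, "->", loss_after)              # cost decreases with training
```

The hidden layer is what lets the network represent XOR, which no single-layer perceptron can: this is the "approximating a function with a neural network" point from the list in miniature.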

Day 3

  • Natural Language Processing for Text Classification
  • Introduction to Deep Learning for Natural Language Processing (NLP)
  • Using pre-trained vs. custom-trained models
  • Using word embeddings and sentiment analysis to extract meaning from text 
  • Word embeddings
  • Word vectors: word2vec
  • Word vectors: GloVe
  • Knowledge transfer and word embeddings
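The intuition behind the word-embedding topics above can be sketched with a count-based method: build a word-word co-occurrence matrix and factor it with SVD. This is the same statistics-of-co-occurrence idea that motivates GloVe, not the GloVe model itself; the tiny corpus and the 2-dimensional embedding size are arbitrary choices for the demonstration.

```python
import numpy as np

# A toy corpus (illustrative sentences only).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
tokens = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(tokens)}

# Symmetric window-1 co-occurrence counts.
C = np.zeros((len(tokens), len(tokens)))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in (i - 1, i + 1):
            if 0 <= j < len(words):
                C[index[w], index[words[j]]] += 1

# Factor the count matrix; keep the top 2 singular directions as embeddings.
U, S, _ = np.linalg.svd(C)
embeddings = U[:, :2] * S[:2]

print({w: np.round(embeddings[index[w]], 2) for w in ("cat", "dog")})
```

Word2vec reaches similar vectors by prediction (a shallow network trained to predict context words) rather than by counting and factoring, but both families encode "words appearing in similar contexts get similar vectors".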

Day 4

  • Convolutional Neural Networks (CNN)
  • Presentation of CNNs
  • Basic operation of a CNN
  • Using an attention model
  • CNNs for generation
  • Recurrent Neural Networks (RNN)
  • Presentation of RNNs
  • Basic operation of an RNN
  • Gated Recurrent Units (GRU) and Long Short-Term Memory (LSTM)
  • Combining feed-forward DNNs, convolutional networks (CNNs) and recurrent networks (RNNs/LSTMs)
  • Convergence and vanishing gradient problems
  • Classical architectures: time-series prediction, classification
  • Encoder-decoder RNN architectures
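The "basic operation of an RNN" from the list above is one shared cell applied at every time step, carrying a hidden state forward. The sketch below shows only the forward pass of a vanilla RNN; the input/hidden sizes and random weights are arbitrary, and the GRU/LSTM gating that mitigates the vanishing-gradient problem is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 5                       # input and hidden sizes (arbitrary)
Wx = rng.normal(0, 0.1, (d_in, d_h))   # input-to-hidden weights
Wh = rng.normal(0, 0.1, (d_h, d_h))    # hidden-to-hidden (recurrent) weights
b = np.zeros(d_h)

def rnn_forward(sequence):
    """Return the hidden state after reading the whole sequence."""
    h = np.zeros(d_h)
    for x in sequence:                 # the SAME weights are reused each step
        h = np.tanh(x @ Wx + h @ Wh + b)
    return h

seq = rng.normal(0, 1, (7, d_in))      # a length-7 sequence of input vectors
h_final = rnn_forward(seq)
print(h_final.shape)                   # a fixed-size summary of the sequence
```

For text, each input vector would be a word embedding; the final hidden state (or the sequence of states) then feeds a classifier, or a decoder in the encoder-decoder architectures listed above.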

Day 5

  • Tools, Techniques and Frameworks
  • Standard ML / DL Tools
  • Theano
  • Theano Functions
  • Training and Optimization of a neural network using Theano
  • Testing the model

Day 6

  • TensorFlow (text analytics)
  • Convolutional Neural Networks
  • TensorFlow Mechanics

Day 7

  • spaCy
  • spaCy components
  • Using spaCy for Deep Learning