Transformer Series
In this series of articles we are exploring a special type of sequence-to-sequence model – Transformers. They are big architectures with a lot of parts, and they are used for language modeling, machine translation, image captioning and text generation....
Understanding Long Short-Term Memory Networks (LSTMs)
Remember how, in the previous article, we said that we can predict text and make speech recognition work so well with Recurrent Neural Networks? The truth is that all the big accomplishments we attributed to RNNs in the previous article are actually achieved...
Introduction to Recurrent Neural Networks
Have you ever wondered how predictive text algorithms work? How exactly does speech recognition software recognize our voice? Just as convolutional neural networks were turning the wheels behind the scenes for image classification, for these kinds of problems we are using...