yatindma/Language-Translator

Language-Translation using Bi-Directional LSTM

Read more at https://medium.com/@SpielmitDaten/machine-translation-using-seq2seq-model-aedcc03f967b
This project uses a sequence-to-sequence (seq2seq) neural network.

Steps:

1. Download the English–French parallel text dataset.
2. Convert the data into a dataframe.
3. Tokenize the sentences at full length (the dataset is not large, so the whole vocabulary is kept).
4. Pad each sentence to the maximum sentence length of its respective language.
5. Use an encoder and a decoder to translate.
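The tokenise-and-pad steps could look roughly like this; the Keras preprocessing utilities, the filenames, and the column names are assumptions, not taken from the repository:

```python
# Hedged sketch of steps 2-4: dataframe -> tokenized, padded sequences.
import pandas as pd
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

def tokenize_and_pad(sentences):
    """Fit a word-level tokenizer, then pad every sequence to the
    longest sentence length seen in that language."""
    tokenizer = Tokenizer()
    tokenizer.fit_on_texts(sentences)
    sequences = tokenizer.texts_to_sequences(sentences)
    max_len = max(len(s) for s in sequences)
    padded = pad_sequences(sequences, maxlen=max_len, padding="post")
    return tokenizer, padded, max_len

# Toy dataframe standing in for the downloaded parallel corpus.
df = pd.DataFrame({"english": ["hello world", "hello"],
                   "french": ["bonjour le monde", "bonjour"]})
en_tok, en_pad, en_len = tokenize_and_pad(df["english"])
fr_tok, fr_pad, fr_len = tokenize_and_pad(df["french"])
```

Each language is padded to its own maximum length, as described above.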

Encoder: The encoder is trained on the English input; its final cell state (C) and hidden output (H) from the last layer are then used by the decoder to decode the data.

Decoder: The decoder takes the encoder's final states as its initial state and decodes them, producing an output sequence in the target language.

The model stacks multiple LSTM layers.

Model

A bi-directional LSTM is used for encoding.
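A minimal sketch of such a training model, assuming a Bidirectional LSTM encoder whose forward and backward states are concatenated and fed to an LSTM decoder; the vocabulary sizes and layer dimensions below are placeholders, not values from the repository:

```python
# Hedged sketch: Bidirectional-LSTM encoder + LSTM decoder training model.
from tensorflow.keras.layers import (Input, LSTM, Bidirectional, Dense,
                                     Embedding, Concatenate)
from tensorflow.keras.models import Model

latent_dim = 32            # placeholder hidden size
en_vocab, fr_vocab = 200, 300  # placeholder vocabulary sizes

# Encoder: the Bidirectional wrapper returns forward and backward states.
encoder_inputs = Input(shape=(None,))
enc_emb = Embedding(en_vocab, latent_dim)(encoder_inputs)
enc_out, fh, fc, bh, bc = Bidirectional(
    LSTM(latent_dim, return_state=True))(enc_emb)
state_h = Concatenate()([fh, bh])  # hidden output H
state_c = Concatenate()([fc, bc])  # cell state C

# Decoder: initialised with the encoder's concatenated states.
decoder_inputs = Input(shape=(None,))
dec_emb = Embedding(fr_vocab, latent_dim)(decoder_inputs)
decoder_lstm = LSTM(latent_dim * 2, return_sequences=True, return_state=True)
dec_out, _, _ = decoder_lstm(dec_emb, initial_state=[state_h, state_c])
outputs = Dense(fr_vocab, activation="softmax")(dec_out)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")
```

Note the decoder LSTM is twice the encoder's width, so it can accept the concatenated forward/backward states directly.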

Inference

For inference, separate models are built that reuse the outputs and cell states from the trained model.

The encoder inference model, inversion_encoder_model, maps an input sentence to its final states.



The decoder inference model, inversion_decoder_model, predicts the output one step at a time.
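A self-contained sketch of the two inference models; in the actual project the already-trained layers would be reused rather than built fresh, and every size here is a placeholder:

```python
# Hedged sketch of the inference models described above.
from tensorflow.keras.layers import Input, LSTM, Dense, Embedding
from tensorflow.keras.models import Model

latent_dim, en_vocab, fr_vocab = 16, 100, 50  # tiny placeholder sizes

# Encoder inference model: source tokens -> final states (h, c).
enc_in = Input(shape=(None,))
enc_emb = Embedding(en_vocab, latent_dim)(enc_in)
_, enc_h, enc_c = LSTM(latent_dim, return_state=True)(enc_emb)
inversion_encoder_model = Model(enc_in, [enc_h, enc_c])

# Decoder inference model: previous token + previous (h, c) ->
# next-token probabilities + updated (h, c), callable one step at a time.
dec_in = Input(shape=(None,))
in_h = Input(shape=(latent_dim,))
in_c = Input(shape=(latent_dim,))
dec_emb = Embedding(fr_vocab, latent_dim)(dec_in)
dec_out, out_h, out_c = LSTM(latent_dim, return_sequences=True,
                             return_state=True)(dec_emb,
                                                initial_state=[in_h, in_c])
probs = Dense(fr_vocab, activation="softmax")(dec_out)
inversion_decoder_model = Model([dec_in, in_h, in_c], [probs, out_h, out_c])
```

The decoder model returns its own states so they can be fed back in on the next step.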

In short: the inference encoder is given the states and weights from the trained model, so it behaves as an already-trained layer, and from it we predict the states h and c.

These h and c, together with the tokenized French words, are passed to the decoder layer for prediction.

The decoder layer in turn predicts updated cell states and a decoder output containing the token matrix. Reverse-tokenizing that matrix yields the predicted sentence.
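The prediction loop described above can be sketched as a greedy decoder. The `<start>`/`<end>` markers and the exact model interfaces are assumptions about how the project's tokenizers and inference models are wired:

```python
# Hedged sketch of the greedy inference loop: encode once, then feed the
# decoder its own previous prediction plus the evolving (h, c) states.
import numpy as np

def decode_sequence(input_seq, encoder_model, decoder_model,
                    fr_word_index, fr_index_word, max_fr_len):
    """Translate one tokenized, padded source sentence."""
    # Run the encoder once to get the initial states.
    h, c = encoder_model.predict(input_seq, verbose=0)
    # Start decoding from the assumed <start> marker.
    target = np.array([[fr_word_index["<start>"]]])
    decoded = []
    for _ in range(max_fr_len):
        out, h, c = decoder_model.predict([target, h, c], verbose=0)
        token = int(np.argmax(out[0, -1, :]))       # most probable next word
        word = fr_index_word.get(token, "")
        if word in ("<end>", ""):                   # stop at the end marker
            break
        decoded.append(word)
        target = np.array([[token]])                # feed prediction back in
    return " ".join(decoded)
```

Here `fr_word_index`/`fr_index_word` are the French tokenizer's word-to-index and index-to-word maps; the reverse lookup is the "reverse_tokenize" step mentioned above.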
