A future where machines book appointments, write emails, and handle other tasks for us is no longer just talk of things to come. That future is here, and Google is leading the way with a slew of new products and services backed by the awesome power of machine learning.
Google debuted its latest NLP development, Smart Compose, at last week’s Google I/O conference. It’s a Gmail feature that uses machine learning to predict the words you are about to type and offers sentence-completion suggestions accordingly. The aim is to help users write emails faster so they can focus on their daily work rather than be stuck in the black hole of their inbox.
To develop such a technology, Google’s AI team had to overcome three key challenges, described below.
Typical NLP techniques such as n-grams, Bag of Words (BoW), and recurrent neural networks (RNNs) were considered for learning to predict the next word a user might type based on the words that came before. On their own, however, these models do not capture enough context, and the chance of getting a suggestion wrong is pretty high. So the team also fed in the subject of the email and the previous email in the thread to understand the full context of the message.
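To see why a model like an n-gram struggles with context, here is a toy bigram (n = 2) predictor in Python. The tiny corpus is made up purely for illustration; the point is that the suggestion depends only on the single previous word, with no notion of the subject line or the rest of the thread.

```python
# A minimal bigram next-word predictor: the suggestion depends only on the
# one word that came immediately before, which is why these models lack
# broader context. The corpus below is purely illustrative.
from collections import Counter, defaultdict

corpus = [
    "thanks for the update",
    "thanks for the invite",
    "see you at the meeting",
    "let us schedule the meeting",
]

# Count how often each word follows each preceding word.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev_word, next_word in zip(words, words[1:]):
        bigram_counts[prev_word][next_word] += 1

def predict_next(prev_word):
    """Return the most frequent continuation of `prev_word`, if any."""
    counts = bigram_counts.get(prev_word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))      # -> 'meeting' (most frequent follower of 'the')
print(predict_next("thanks"))   # -> 'for'
```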
The team also tried a sequence-to-sequence model; it predicted well but failed to meet their strict latency constraints. So they combined two models, a bag-of-words encoder and an RNN language model (RNN-LM), which was a significant improvement over the sequence-to-sequence model in terms of prediction speed. Below is the structure of the final RNN model:
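In place of the architecture diagram, here is a rough PyTorch sketch of the hybrid idea described above. This is not Google’s code: the class name, layer sizes, and the exact way the bag-of-words context is combined with the RNN state are all illustrative assumptions.

```python
# A hedged sketch of a BoW + RNN-LM hybrid: the subject and previous email
# are encoded as an averaged bag-of-words embedding, the text typed so far
# goes through an RNN language model, and the two are combined to score the
# next word. All names and sizes are assumptions, not Google's actual model.
import torch
import torch.nn as nn

class SmartComposeSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # RNN-LM over the words typed so far in the email body.
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Combine the RNN state with the averaged context embedding.
        self.proj = nn.Linear(hidden_dim + embed_dim, vocab_size)

    def forward(self, context_ids, prefix_ids):
        # context_ids: (batch, context_len) tokens from subject + previous email
        # prefix_ids:  (batch, prefix_len) tokens typed so far
        context_vec = self.embed(context_ids).mean(dim=1)   # bag-of-words average
        rnn_out, _ = self.rnn(self.embed(prefix_ids))
        last_state = rnn_out[:, -1, :]                       # state after the last typed word
        combined = torch.cat([last_state, context_vec], dim=-1)
        return self.proj(combined)                           # logits over the next word

model = SmartComposeSketch(vocab_size=10000)
logits = model(torch.randint(0, 10000, (1, 12)),   # fake context tokens
               torch.randint(0, 10000, (1, 5)))    # fake prefix tokens
next_word_id = logits.argmax(dim=-1)
```

The appeal of this arrangement is that the cheap averaging handles the long context, while the expensive sequential computation is limited to the short prefix the user has typed, which is what makes it faster than a full sequence-to-sequence model.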
Once the final modelling approach was decided, the team had to tune the requisite hyperparameters and do the most crucial part of any system: train the model. Google has a distinct advantage over most other companies in this regard, with tons and tons of data to experiment with and train their models on, especially when it comes to NLP. They used billions of examples, and to accelerate the training phase they turned to a TPUv2 Pod, which finished the task in less than a day!
They had initially tested on a standard CPU, but it gave them an average latency of hundreds of milliseconds, which they deemed too slow and too costly for this task. By moving the majority of the computation to TPUs, the average latency came down to tens of milliseconds.
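To get a feel for what “average latency” means in practice, here is a simple way to time repeated predictions of a small stand-in model on your own CPU. The model and its sizes are placeholders; the figures quoted above come from Google’s serving hardware, not from a snippet like this.

```python
# Time repeated forward passes of a small stand-in language model to
# measure average prediction latency in milliseconds on the local CPU.
import time
import torch
import torch.nn as nn

# A small placeholder model (arbitrary sizes) purely for timing.
embed = nn.Embedding(10000, 128)
rnn = nn.LSTM(128, 256, batch_first=True)
proj = nn.Linear(256, 10000)

tokens = torch.randint(0, 10000, (1, 5))  # a fake five-word prefix

def predict():
    out, _ = rnn(embed(tokens))
    return proj(out[:, -1, :])

with torch.no_grad():
    for _ in range(10):           # warm-up runs
        predict()
    start = time.perf_counter()
    runs = 100
    for _ in range(runs):
        predict()
    avg_ms = (time.perf_counter() - start) / runs * 1000

print(f"average prediction latency: {avg_ms:.2f} ms")
```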
The next time you use Smart Compose, you might appreciate the work that went into building those word suggestions!
I really appreciate that Google reveals the technology behind its products. It gives data scientists, and others aspiring to get into the field, a good idea of how a leading company frames a problem statement and goes about building a solution for it.
Time will tell how well Google has handled the bias problem. Bias has crept into a lot of recent technologies, and they cannot afford to let it ruin a perfectly good product. Their advantage lies in the amount of data they have collected over the years, which should definitely help in recognising and eliminating bias.