Pre-trained transformers.

Utilise the world's knowledge, without compromising on results.

Pre-trained Transformers are models for natural language processing (NLP) that are trained on enormous text datasets. As a result, these models are exceptionally good at understanding the context of words, which can be leveraged for question answering, summarization, translation, information extraction and many other challenging tasks that previously required humans. No training is needed, and the results will surprise you.

Language as the door to the world.

Next to getting stuff done, language helps us express our thoughts and feelings. This makes us unique in the world: no other species is able to express ideas and customs, and, as a result, form cultures and societies. Understanding language is therefore not only the key to participating in organizations, but also to functioning in the larger human system.

Exactly for that reason, machines have historically not participated well in society. Creating algorithms with a proper understanding of natural language has long been a difficult task, resulting in lacklustre voice assistants, search engines that only work well with keywords, and unnatural robots that show you the company introduction video instead of helping you find your way to your meeting.


The context of words.

To understand a word, you need to know the context in which it appears. As an example, take the word “lie”. In a sentence, the meaning of this word could either be that you’re not honest, or that you worked really hard today and deserve to take a nap. If we tell you that the next word is “down”, you’re inclined to say that the meaning is the latter, but adding “one” before “lie” flips that hypothesis on its head again. To conclude: context is important, and algorithms are notoriously bad at understanding it and keeping track of it.

Why are Transformers different?

If algorithms have historically struggled with language understanding, why are Transformers different? They are different because they combine training at a very large scale with smart attention mechanisms that build an accurate mapping of the context of sentences.

The industry standard for pre-trained Transformer models is called BERT. This model is trained on more than 2.5 billion words, mainly taken from Wikipedia’s pages. By training on such large datasets, the model essentially “brute forces” an understanding of the text, building statistical analyses of which words influence the placement, characteristics and co-occurrence of other words. By measuring how much certain words influence the rest of the words in a sentence, an understanding of context is created: the algorithm is pointed in the direction of the words that require the most “attention”.
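That attention step can be sketched in a few lines. The snippet below is a minimal, pure-Python illustration of scaled dot-product attention, the mechanism Transformers are built on; the word vectors in the usage example are made-up toys, not real BERT embeddings, and the function name is ours.

```python
import math

def attention(query, keys, values):
    """Toy scaled dot-product attention over word vectors.

    Each (key, value) pair stands for a context word; the output is a
    blend of the values, weighted by how relevant each key is to the
    query word.
    """
    d = len(query)
    # Similarity of the query to every key, scaled by sqrt(dimension).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Softmax turns raw scores into attention weights that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Mix the values according to the attention weights.
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return output, weights
```

In a real Transformer the queries, keys and values are learned projections and there are many attention heads, but the core idea is the same: context words that matter more get a larger share of the weight.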

What can they do?

This mechanism has been proven to work wonders for a lot of NLP tasks. One of these tasks involves using internal representations of this attention and context as input features for custom AI models used for classification and regression. As a use case, you could think of classifying cold emails as interesting or not interesting at all depending on the contents of the message.
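To make the classification idea concrete, here is a deliberately simple sketch. Assume each email has already been turned into a feature vector (in practice those would come from a Transformer’s internal representations; here they are invented two-dimensional toys), and classify a new one by its nearest class centroid. The function name and data are hypothetical.

```python
from collections import defaultdict

def nearest_centroid(train_vectors, labels, x):
    """Classify x by the closest class centroid (squared Euclidean)."""
    groups = defaultdict(list)
    for vec, label in zip(train_vectors, labels):
        groups[label].append(vec)
    # Average the vectors of each class to get its centroid.
    centroids = {label: [sum(col) / len(vecs) for col in zip(*vecs)]
                 for label, vecs in groups.items()}
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], x))
```

A real pipeline would feed the embeddings into a trained classifier rather than raw centroids, but the structure is the same: rich contextual features in, a business decision (“interesting” or not) out.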

Another task could be the summarization of longer texts. This is very helpful when a lot of textual data in the form of articles, blogs, news or other running text comes in that needs to be processed by humans, or acted on by others. The summarization step lets you focus on the core of the text, saving you valuable time.
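Transformer summarizers generate new text, but the gist of extractive summarization can be shown with a crude word-frequency heuristic: keep the sentences whose words occur most often in the document. This is a toy baseline of our own, not how a Transformer summarizes.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Return the n highest-scoring sentences, scored by word frequency."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    def score(sentence):
        tokens = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)
    chosen = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # Preserve the original order of the chosen sentences.
    return " ".join(s for s in sentences if s in chosen)
```

Pre-trained models go much further, paraphrasing and compressing rather than just selecting sentences, but the goal is identical: surface the core of the text.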

Other tasks include answering questions, translating texts from one language to the other, figuring out how similar sentences are, what objects are present in sentences, and what kinds of entities they are, and many more. If you work with natural language, chances are that there’s a Transformer model for your use case.
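Sentence similarity, for instance, usually boils down to comparing vectors with cosine similarity. The sketch below does this with plain word counts instead of Transformer embeddings, so it ignores context entirely; swapping in contextual embeddings is precisely what makes the Transformer version so much stronger.

```python
import math
import re
from collections import Counter

def cosine_similarity(sentence_a, sentence_b):
    """Bag-of-words cosine similarity between two sentences (0.0 to 1.0)."""
    va = Counter(re.findall(r"\w+", sentence_a.lower()))
    vb = Counter(re.findall(r"\w+", sentence_b.lower()))
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0
```

With count vectors, “the bank of the river” and “the bank approved the loan” look similar because they share words; with contextual embeddings, the two senses of “bank” are told apart.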

Will Transformers work for you?

Find out the pros and cons of working with pre-trained models

Curious about Transformers? Learn more about pre-trained models and whether they will fit your purpose.

Read on

Get in touch

Call us: 020 846 19 05 | Mail us

Drop us a message

    Thank you for reaching out!

    Your message is in good hands. We strive to get back to you within one working day.