"We strive for actual understanding of language. Not just word or pattern matching, but technology that will allow us to interact with machines far beyond small talk." (Christof Monz, chair of the group)
The Language Technology Lab (LTL) is a research group within the Informatics Institute at the University of Amsterdam. LTL focuses on information access from natural language data. Natural language is, simply put, the way humans communicate with each other in speech and text. The group's unique angle is that they work on language-independent technology.
There are thousands of languages in the world. Analysing and translating all those languages in an automated way would break down language barriers. You could search digital Chinese newspapers, read a summary of a document in Arabic, or gather patient records on a certain topic from countries all over the world. The predominant question the group tries to answer: how can we represent the meaning of texts, and how can that be exploited for applications?
LTL's research themes are strongly motivated by concrete applications. Within the research area of multilingual machine translation, the group works on machine translation tools, such as Google Translate, but in a language-independent way. The second research area revolves around conversational systems and question answering. Conversational systems – think of a chatbot or a virtual assistant like Amazon's Alexa – focus on the conversation between a human and a machine. Question answering technology focuses specifically on short questions like 'How hot is the sun?' and concrete answers like 'The sun is 27 million degrees Fahrenheit'. The third research area is more theoretical and concentrates on latent representations. When we speak, there is always an underlying representation that motivates why we say something, but this is not reflected in the data. LTL builds latent models that make educated guesses about these underlying representations based on what they can observe.
The group is defined by a long tradition in applications and a focus on the multilingual aspect. They try to bring these together by using machine translation as a way of generating language-independent representations. Most language technology research of the last decades has focused on English. LTL wants to do the same for any language, whether it is Japanese, Dutch or Somali.
LTL is a small but passionate research group. They have received several grants, including NWO Vidi and Vici grants. Their research has also been awarded the Google Faculty Research Award, with which Google funds world-class research at top universities.
The focus on concrete applications makes the group very interesting for external partners. With the Dutch insurance company VIVAT, for example, the researchers work on text analysis and question answering. Here, one of the challenges is to summarize conversations, which are far less structured than heavily edited news articles.
With the German broadcaster Deutsche Welle, LTL has conducted research into machine translation to transfer the relevant text between related news articles in different languages. Here, relevant means that the system has to detect which facts are not already present in an article and only translate those sentences.
The LTL researchers teach the Data Algorithms and Structure course within the AI bachelor's programme and Deep Learning for Natural Language within the AI master's programme. Both programmes have a technical approach that fits well with current trends in AI. Christof Monz, chair of the group, is also programme director of the AI bachelor's. Student numbers have grown significantly in recent years. The group will help to rethink and restructure the AI programme to make students as future-proof as possible.
Developments in the field are moving very fast. In the next few years there will be conversational agents you can talk with for a sustained period without realizing that you are talking to a machine. A credible two-minute conversation with a machine might seem simple, but it would be a big breakthrough. Fluency is not the issue; the challenge lies in consistency, so that the machine does not repeat or contradict itself and stays on topic. LTL will keep contributing to these challenges and will specifically stay focused on the multilingual aspect of language technology.
LTL positions itself primarily in the AI research theme, with some links to the Data Science theme of the Informatics Institute.