Computational creativity (CC) is the art, science, philosophy and engineering of computational systems which, by taking on particular responsibilities, exhibit behaviours that unbiased observers would deem to be creative. As a field of research, this area is thriving, with progress in formalising what it means for software to be creative, along with many exciting and valuable applications of creative software in the sciences, the arts, literature, gaming and elsewhere.
We investigate ways in which artificial agents can self-organize languages with natural-language-like properties and how meaning can co-evolve with language. Our research is based on the hypothesis that language is a complex adaptive system that emerges through adaptive interactions between agents and continues to evolve in order to remain adapted to the needs and capabilities of the agents. We explore this hypothesis by implementing the full cycle of speaker and hearer as they play situated language games and by observing the characteristics of the languages that emerge.
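To make the speaker/hearer cycle concrete, here is a minimal sketch of one such language game, the naming game, in which a population of agents converges on a shared name for a single object. The Agent class, the alignment rules and the population size are illustrative assumptions for this example, not our actual experimental setup.

```python
import random

# Minimal naming-game sketch: agents invent and align names for one object
# through repeated speaker/hearer interactions (illustrative only).

class Agent:
    def __init__(self):
        self.vocabulary = []  # candidate names this agent knows

    def speak(self):
        if not self.vocabulary:
            # Invent a new name if the agent has none yet.
            self.vocabulary.append("w%04d" % random.randrange(10000))
        return random.choice(self.vocabulary)

    def hear(self, word):
        if word in self.vocabulary:
            self.vocabulary = [word]      # success: align on the used word
            return True
        self.vocabulary.append(word)      # failure: adopt the word as a candidate
        return False

def play(population, rounds=10000):
    for _ in range(rounds):
        speaker, hearer = random.sample(population, 2)
        word = speaker.speak()
        if hearer.hear(word):
            speaker.vocabulary = [word]   # the speaker also aligns on success

agents = [Agent() for _ in range(20)]
play(agents)
# After enough games the population typically converges on a single shared name.
print({w for a in agents for w in a.vocabulary})
```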
Computational Construction Grammar aims to operationalise insights and analyses from construction grammar into concrete processing models. Based on the linguistic theory of construction grammar, we build technologies to map from natural language utterances to a representation of their meaning (natural language comprehension) and from meaning representations to linguistic utterances (natural language production). The computational models that we build are used to (i) validate the preciseness and consistency of linguistic analyses, (ii) corroborate analyses using corpus data, and (iii) enhance the performance of language technology applications.
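As a toy illustration of this bidirectional mapping (and not of our actual grammar formalism), the sketch below treats constructions as simple form-meaning pairings and uses one and the same inventory for comprehension and production; the CONSTRUCTIONS list and the predicate notation are invented for the example.

```python
# Toy sketch: each construction pairs a form with a meaning predicate, and the
# same inventory drives both comprehension (form -> meaning) and production
# (meaning -> form). Hypothetical names and notation, purely illustrative.

CONSTRUCTIONS = [
    {"form": "ball", "meaning": "ball(?x)"},
    {"form": "red",  "meaning": "red(?x)"},
    {"form": "left", "meaning": "left-of(?x, ?ref)"},
]

def comprehend(utterance):
    """Map a natural language utterance to a flat meaning representation."""
    meaning = []
    for word in utterance.split():
        for cxn in CONSTRUCTIONS:
            if cxn["form"] == word:
                meaning.append(cxn["meaning"])
    return meaning

def produce(meaning):
    """Map a meaning representation back to an utterance."""
    words = []
    for predicate in meaning:
        for cxn in CONSTRUCTIONS:
            if cxn["meaning"] == predicate:
                words.append(cxn["form"])
    return " ".join(words)

print(comprehend("the red ball"))        # ['red(?x)', 'ball(?x)']
print(produce(["red(?x)", "ball(?x)"]))  # 'red ball'
```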
After many decades of limited visibility, with a notable exception here and there, reinforcement learning has in recent years come to the foreground of AI, lying at the core of unhoped-for breakthroughs. Based on the simple principle of trial and error, reinforcement learning systems are able to learn for themselves how to successfully solve tasks, without the need for a wealth of labeled data.
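A minimal sketch of this trial-and-error principle is tabular Q-learning on a toy task. The 5-state corridor environment, the reward for reaching the rightmost state and the hyperparameters below are all illustrative assumptions made for this example.

```python
import random

# Tabular Q-learning on a toy corridor: the agent starts on the left and is
# rewarded only for reaching the rightmost state (illustrative setup).

N_STATES, ACTIONS = 5, [0, 1]            # 0 = move left, 1 = move right
alpha, gamma, epsilon = 0.1, 0.9, 0.1    # learning rate, discount, exploration
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    next_state = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

def greedy(state):
    best = max(Q[state])
    return random.choice([a for a in ACTIONS if Q[state][a] == best])

for episode in range(500):
    state, done = 0, False
    while not done:
        # Trial and error: mostly exploit the current estimate, sometimes explore.
        action = random.choice(ACTIONS) if random.random() < epsilon else greedy(state)
        next_state, reward, done = step(state, action)
        # Temporal-difference update towards reward plus discounted future value.
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print(Q)  # learned values increase towards the rewarded right end of the corridor
```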
Game theory is the study of strategic situations among competing or cooperating agents, be they humans or machines. It is the science of strategy and optimal decision-making in strategic settings. In our lab we focus on using machine learning techniques to determine what kinds of strategies people use to make their decisions. We use response-time distributions to determine which actions are intuitive and which are deliberate. We want to understand how humans perceive and forecast risk and uncertainty in such strategic scenarios.
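As a small, hedged example of this kind of strategic analysis, the sketch below enumerates the pure-strategy Nash equilibria of a 2x2 game by checking best responses; the stag-hunt payoff values are chosen purely for illustration.

```python
import numpy as np

# Pure-strategy Nash equilibria of a 2x2 game via best-response checks.
# payoffs[i][j] = (row player's payoff, column player's payoff); the stag-hunt
# numbers below are arbitrary illustrative values.

payoffs = np.array([[(4, 4), (0, 3)],
                    [(3, 0), (2, 2)]])   # actions: 0 = stag, 1 = hare

def is_nash(i, j):
    row_ok = payoffs[i, j, 0] >= payoffs[:, j, 0].max()   # row cannot improve
    col_ok = payoffs[i, j, 1] >= payoffs[i, :, 1].max()   # column cannot improve
    return row_ok and col_ok

equilibria = [(i, j) for i in range(2) for j in range(2) if is_nash(i, j)]
print(equilibria)  # [(0, 0), (1, 1)]: both coordination outcomes are equilibria
```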
Many situations in which humans interact among themselves or through technologies in hybrid socio-technical systems resemble social dilemmas, i.e. situations in which participants have to choose between short-term personal profit and long-term social benefit. The behavioural outcome in these dilemmas depends strongly on how well participants assess the risk associated with uncertain future rewards and on how well they anticipate their opponents' choices.
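The prisoner's dilemma is the canonical example of such a dilemma. The sketch below (with arbitrary payoff values) checks the payoff ordering that defines the dilemma and computes the expected value of cooperating under a belief about the opponent's choice.

```python
# Prisoner's dilemma as a social dilemma (illustrative payoff values):
# R = mutual cooperation, S = sucker, T = temptation, P = mutual defection.
R, S, T, P = 3, 0, 5, 1

# The social-dilemma condition: defecting is individually tempting (T > R)
# and safer (P > S), yet mutual cooperation beats mutual defection (R > P).
assert T > R > P > S

def expected_payoffs(p_opponent_cooperates):
    cooperate = p_opponent_cooperates * R + (1 - p_opponent_cooperates) * S
    defect = p_opponent_cooperates * T + (1 - p_opponent_cooperates) * P
    return cooperate, defect

for belief in (0.2, 0.5, 0.9):
    print(belief, expected_payoffs(belief))
# In a one-shot game defection dominates for any belief; the long-term social
# benefit only becomes reachable when interactions repeat or reputations matter.
```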
Computational Biology is the development and application of data-analytical and theoretical methods, mathematical modeling and computational simulation techniques to the study of biological, behavioral and social systems. The computational biology research within the AI lab focuses on topics such as the identification and analysis of the information-processing capacity of proteins, the analysis of diseases such as CML using a mathematical model of the hematopoietic system, and assisting public health officials in mitigating infectious diseases (influenza, HIV, etc.) through the study of epidemiological models.
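As an illustration of the kind of epidemiological model involved, here is a minimal SIR sketch; the transmission and recovery rates, initial fractions and time horizon are illustrative, uncalibrated assumptions.

```python
# Minimal SIR epidemic sketch: susceptible individuals become infected at rate
# beta and recover at rate gamma (illustrative parameters, not a calibrated model).

beta, gamma = 0.3, 0.1          # transmission and recovery rates per day
S, I, R = 0.99, 0.01, 0.0       # fractions of the population
dt, days = 0.1, 160

for _ in range(int(days / dt)):
    new_infections = beta * S * I * dt
    new_recoveries = gamma * I * dt
    S -= new_infections
    I += new_infections - new_recoveries
    R += new_recoveries

print(round(S, 3), round(I, 3), round(R, 3))
# The final size of R indicates what fraction of the population was ever
# infected; interventions that lower beta shrink it.
```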
Deep Learning falls under the general umbrella of representation learning and follows a layered approach to extracting useful information from data. In our lab we are particularly interested in deep learning applied to computer vision, mainly the problems of classification, localization, segmentation and 3D modeling. We apply our research in a number of domains, ranging from satellite imagery and hyperspectral unmixing to computational fluid dynamics.
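A hedged sketch of the classification setting is given below: a small convolutional network in PyTorch. The architecture, input size and number of classes are illustrative assumptions, not the models we actually deploy.

```python
import torch
import torch.nn as nn

# A small convolutional image classifier (illustrative architecture and sizes).
class SmallConvNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)  # assumes 32x32 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = SmallConvNet()
dummy_batch = torch.randn(4, 3, 32, 32)   # e.g. small RGB image tiles
print(model(dummy_batch).shape)           # torch.Size([4, 10]) class scores
```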
Telecommunications, economics, mobile robots, traffic simulation, electricity grids/smart grids … are all examples of systems in which decentralisation of data and/or distribution of control is either required or inherently present.
Data mining is the study of data with the purpose of identifying patterns and relationships. The lab has vast experience with different machine learning techniques for data mining and modeling. We focus on the interplay between black-box and white-box techniques, with a particular emphasis on explainability. Our expertise ranges from decision trees and SVMs to modern neural networks and deep learning.
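On the white-box end of that spectrum, a shallow decision tree is explainable by construction: its learned rules can be printed and inspected directly. The sketch below uses scikit-learn's iris dataset purely as an illustrative stand-in for our own data.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# Fit a shallow, interpretable decision tree on an illustrative dataset.
X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The exported rules are the explanation: every prediction follows one path.
print(export_text(tree, feature_names=load_iris().feature_names))
```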
We use methods from AI, linguistics and cognitive science to investigate the evolution of speech. Our computer models are based on a wide range of artificial intelligence techniques: agent-based modeling, machine learning, speech synthesis and speech recognition, among others. The aim is to get a better understanding of human cognition and its origins using a bottom-up approach, i.e. understanding by building.