Interactive Multimodal Learning
Haau-Sing Li (Ph.D. Student)
Multimodal learning comprises a set of challenging tasks. It requires a deep understanding of all relevant modalities (language, vision, and software programs) and of the relationships between them. Furthermore, the scarcity of high-quality datasets increases the difficulty of these tasks. In this project, we will work with multimodal data in an interactive setting, which reflects the nature of many real-world applications. We will apply this setting to multimodal tasks including question answering, dialogue, and program prediction. We hope that within this setting, models will learn to integrate information across modalities, recovering weak signals from the data to produce outputs with high human acceptability or executability. Ideally, our research will lead to real-world applications for human assistance or educational purposes.
Primary Host: Iryna Gurevych (Technical University of Darmstadt)
Exchange Host: André Martins (University of Lisbon)
PhD Duration: 01 July 2021 - 30 June 2025
Exchange Duration: 01 July 2023 - 30 June 2024 - Ongoing