Out-of-distribution Generalization
Zehao Xiao (Ph.D. Student)
Out-of-distribution (OoD) generalization is a common task in machine learning. Deep neural networks achieve remarkable performance on various computer vision and machine learning tasks, but they struggle and lack robustness when the test data come from a distribution different from the training data; such settings are known as out-of-distribution (OoD) problems. OoD problems arise in many applications. In medical imaging, for instance, images differ across modalities (e.g., MRI and CT), across devices in different hospitals, and even across patients. The main challenges in OoD generalization are the domain shift between source and target distributions and the inaccessibility of the target data: data from the target domain are never seen during training, which leads to overfitting on the source domains and poor performance on the target domains.
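The effect of domain shift described above can be illustrated with a minimal, synthetic sketch (not the project's actual method): a linear classifier is fit on a source domain of two Gaussian classes, then evaluated on a target domain whose features are shifted; the names `sample` and `acc` are illustrative helpers, not from any library.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n, shift=0.0):
    """Two Gaussian classes separated along the first feature.

    `shift` translates both class means, simulating covariate shift
    between a source and a target domain.
    """
    X0 = rng.normal([-1.0 + shift, 0.0], 1.0, size=(n, 2))
    X1 = rng.normal([+1.0 + shift, 0.0], 1.0, size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

Xs, ys = sample(500)             # source (training) domain
Xt, yt = sample(500, shift=2.0)  # unseen target domain, shifted features

# Fit a linear classifier by least squares on the source domain only.
w, *_ = np.linalg.lstsq(np.c_[Xs, np.ones(len(Xs))], ys, rcond=None)

def acc(X, y):
    pred = (np.c_[X, np.ones(len(X))] @ w) > 0.5
    return (pred == y).mean()

print(f"source accuracy: {acc(Xs, ys):.2f}")
print(f"target accuracy: {acc(Xt, yt):.2f}")  # drops well below source
```

The classifier's decision boundary is tuned to the source distribution, so accuracy degrades sharply on the shifted target domain even though the underlying labeling rule is unchanged, which is exactly the failure mode OoD generalization methods aim to mitigate.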
Primary Advisor: Cees Snoek (University of Amsterdam)
Industry Advisor: Xiantong Zhen (University of Amsterdam & Inception Institute of Artificial Intelligence)
PhD Duration: 01 June 2020 - 31 May 2024