Deep learning models have achieved remarkable success in many artificial intelligence applications. However, these models rest on the assumption that the training and test data are independent and identically distributed (i.i.d.).

Hence, these models suffer a significant performance drop when test data come from a different distribution, a common real-world scenario known as domain shift. In this project, we aim to improve the generalization performance of deep learning models from three perspectives: model architecture, data augmentation, and model optimization.

Firstly, we use a contrastive learning model combined with several data augmentation methods to learn invariant representations in the latent space. Secondly, data augmentation is an effective way to improve generalization performance. Different augmentation methods perturb the input data at different frequencies, and in our experiments we found that each method has its strengths in different domains.
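As a rough illustration of the contrastive objective mentioned above (a minimal sketch, not the speakers' actual implementation; the function and helper names are hypothetical), a SimCLR-style NT-Xent loss pulls the embeddings of two augmented views of the same image together while pushing apart embeddings of different images:

```python
import math

def l2_normalize(v):
    # Scale a vector to unit length so dot products become cosine similarities.
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss.

    z1, z2: lists of embedding vectors, where z1[i] and z2[i] are the
    embeddings of two augmented views of the same input (a positive pair).
    """
    z = [l2_normalize(v) for v in z1 + z2]
    n = len(z1)
    total = 0.0
    for i in range(2 * n):
        j = (i + n) % (2 * n)  # index of the positive partner of view i
        # Temperature-scaled cosine similarities to all other views.
        sims = [sum(a * b for a, b in zip(z[i], z[k])) / temperature
                for k in range(2 * n) if k != i]
        pos = sum(a * b for a, b in zip(z[i], z[j])) / temperature
        # Negative log-softmax of the positive pair among all candidates.
        total += math.log(sum(math.exp(s) for s in sims)) - pos
    return total / (2 * n)
```

Embeddings whose augmented views agree (invariant representations) receive a lower loss than embeddings that change under augmentation, which is the sense in which the loss encourages invariance.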

Therefore, we design a combined augmentation method that is robust and generalizes better across domains. Lastly, we apply the stochastic weight averaging (SWA) optimization method to average model weights and obtain a flatter solution that generalizes better.
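The weight-averaging step can be sketched in a few lines (an illustrative toy, assuming parameters are flat vectors; a real model would average every tensor in the state dict, e.g. via PyTorch's torch.optim.swa_utils.AveragedModel). SWA keeps a running average of the weights visited late in training:

```python
def swa_update(swa_weights, new_weights, n_averaged):
    # Running mean of parameter vectors: w_swa <- (w_swa * n + w) / (n + 1).
    return [(ws * n_averaged + w) / (n_averaged + 1)
            for ws, w in zip(swa_weights, new_weights)]

# Illustrative 2-parameter "checkpoints" collected from the tail of training.
checkpoints = [[1.0, 4.0], [3.0, 2.0], [2.0, 3.0]]
swa = checkpoints[0]
for n, ckpt in enumerate(checkpoints[1:], start=1):
    swa = swa_update(swa, ckpt, n)
# swa is now the element-wise mean of the checkpoints: [2.0, 3.0]
```

Because the averaged point sits near the center of the set of good solutions found by SGD, it tends to lie in a flatter region of the loss surface, which is the intuition behind the improved generalization.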


Yaxian Shi is a Master of Data Science student at The University of Queensland, School of Information Technology and Electrical Engineering.

Chun-Chiao Huang is also studying a Master of Data Science at The University of Queensland, School of Information Technology and Electrical Engineering. He obtained his Bachelor's degree in Computer Science and Information Technology from National Dong Hwa University in Taiwan. He has research experience in medical imaging with deep learning from summer research projects at UQ, as well as work experience in computer vision.


Dr Mahsa Baktashmotlagh 

This session will be conducted via Zoom: https://uqz.zoom.us/j/89362232168

About Data Science Seminar

This seminar series will be run as weekly sessions and is hosted by ITEE Data Science.