Trusted AI

Welcome to the Trusted AI for Universal-perception (TAU) Lab at the University of Shanghai for Science and Technology (USST)!

Our research aims to improve the explainability, fairness, privacy, and efficiency of AI models. We develop machine learning algorithms and systems based on these principles and translate them into real-world applications such as robotics, healthcare, and AI for science.
News
  • [03/2024] Wenxin Su got the First Prize Scholarship at USST (Top 5%). Congratulations!
  • [02/2024] We got a paper accepted by CVPR 2024
  • [01/2024] We got a paper accepted by Expert Systems with Applications (ESWA, IF=8.5)
  • [10/2023] We got a paper accepted by Robotic Intelligence and Automation (RIA, IF=2.373)
  • [09/2023] Wenxin Su got the National Scholarship for Postgraduate Excellence 2023 (Top 1%). Congratulations!
  • [09/2023] We got a paper accepted by International Journal of Computer Vision (IJCV, IF=19.6)
  • [09/2023] We got a paper accepted by IEEE Transactions on Multimedia (TMM, IF=7.3)
  • [09/2023] We got a paper accepted by Pattern Recognition (PR, IF=8.0)
  • [05/2023] We got a paper accepted by CAAI Transactions on Intelligence Technology (CTIT, IF=7.985)
  • [05/2023] We got a paper accepted by ICRA 2023
  • [12/2022] We got a paper accepted by ICNSC 2022, and the paper was nominated for the Best Paper Award.
  • [09/2022] Yan Zou got the National Scholarship for Postgraduate Excellence 2022 (Top 1%). Congratulations!
  • [08/2022] We got a paper accepted by Neural Networks (NN, IF=9.657)
  • [08/2022] The TAU Lab website is officially running.
Recent Publications

Note: * indicates the corresponding author; bold font indicates TAU Lab members.

The full list of publications is available on Google Scholar.

CVPR
Source-Free Domain Adaptation with Frozen Multimodal Foundation Model

S. Tang, W. Su, M. Ye* and X. Zhu*

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024.

BIB PDF Project

ESWA
Shooting condition insensitive unmanned aerial vehicle object detection

J. Liu, J. Cui, M. Ye, X. Zhu and S. Tang

Expert Systems with Applications (ESWA), Jan. 2024. (IF = 8.5)

BIB PDF Project

RIA
MCFilter: Feature Filter based on Motion-correlation for LiDAR SLAM

H. Sun, S. Tang*, X. Qi, Z. Ma and J. Gao

Robotic Intelligence and Automation (RIA), Oct. 2023. (IF = 2.373)

BIB PDF Project

IJCV
Source-Free Domain Adaptation via Target Prediction Distribution Searching

S. Tang, A. Chang, F. Zhang, X. Zhu*, M. Ye* and C. Zhang

International Journal of Computer Vision (IJCV), Oct. 2023, DOI:10.1007/s11263-023-01892-w. (IF = 19.6)

BIB PDF Code Project

TMM
Progressive Source-Aware Transformer for Generalized Source-Free Domain Adaptation

S. Tang, Y. Shi, Z. Song, M. Ye*, C. Zhang and J. Zhang*

IEEE Transactions on Multimedia (TMM), Oct. 2023, DOI:10.1109/TMM.2023.3321421. (IF = 7.3)

BIB PDF Code Project

PR
Source-free domain adaptation with Class Prototype Discovery

L. Zhou, N. Li, M. Ye*, X. Zhu and S. Tang

Pattern Recognition (PR), Sep. 2023, DOI:10.1016/j.patcog.2023.109974. (IF = 8.0)

BIB PDF Project

ICRA
Weakly supervised referring expression grounding via target-guided knowledge distillation

J. Mi, S. Tang, Z. Ma, D. Liu and J. Zhang

IEEE International Conference on Robotics and Automation (ICRA), 2023.

BIB PDF Project

CTIT
Model adaptation via credible local context representation

S. Tang, W. Su, Y. Yang, L. Chen* and M. Ye

CAAI Transactions on Intelligence Technology (CTIT), May 2023, DOI:10.1049/cit2.12228. (IF = 7.985)

BIB PDF Code Project

People

Song Tang (EN/ZH) (Associate Professor)

Han (M.S. student)

Guozhao Kou (M.S. student)

An Chang (M.S. student)

Wenxin Su (M.S. student)

Shaxu Yan (M.S. student)

Yunxiang Bai (M.S. student)

Chunxiao Zu (M.S. student)

Jiuzheng Yang (M.S. student)

Mingchu Yu (M.S. student, Visiting)

Aliu Shi (M.S. student, Visiting)

Jianxin Gao (B.S. student)

Yang Zhang (B.S. student)

Positions
If you want to join us, please email tangs@usst.edu.cn with your CV, transcript, and one of your research papers if applicable. Due to the large number of emails I receive, I may not be able to respond to each one individually. To help me notice your email, please put **[GNATOLLEH]**, written backwards, in your email subject. Please strictly follow this instruction.
  • Postgraduate Students: Multiple positions for postgraduate students are available for 2024.