About Me

I received my MSc in Computer Science with Speech and Language Processing from the University of Sheffield, where I was advised by Prof. Chenghua Lin (now at the University of Manchester). I earned my BSc in Applied Mathematics from National Chung Hsing University, Taiwan. A full list of my publications appears in the Publications section below.

My research interests include:

  • Natural Language Processing
  • Audio and Speech Processing
  • Multi-modality Learning
  • Adversarial Attack and Defence
  • Contrastive Learning

Work in progress:

  • Knowledge/Reasoning Boundary
  • Psychological LLMs

🎓 Education

  • 2020.09 - 2021.09, MSc in Computer Science with Speech and Language Processing, University of Sheffield, UK.
  • 2014.09 - 2018.06, BSc in Applied Mathematics, National Chung Hsing University, Taiwan.

📝 Publications

ICASSP 2026
  • Yang Wang, Chenghao Xiao, Yiqi Liu, Chenghua Lin. The Achilles’ Heel of Angular Margins: A Chebyshev Polynomial Fix for Speaker Verification. ICASSP. 2026
    [Code]
TACL 2025
  • Yang Wang, Chenghao Xiao, Yizhi Li, Stuart E. Middleton, Noura Al Moubayed, Chenghua Lin. Adversarial Defence without Adversarial Defence: Enhancing Language Model Robustness via Instance-level Principal Component Removal. TACL. 2025
    [PDF] [Code]
TACL 2025
  • Hanhua Hong, Chenghao Xiao, Yang Wang, Yiqi Liu, Wenge Rong, Chenghua Lin. Beyond One-Size-Fits-All: Inversion Learning for Highly Effective NLG Evaluation Prompts. TACL. 2025
    [PDF] [Code]
EMNLP 2025
  • Yang Wang, Chenghao Xiao, Chia-Yi Hsiao, Zi Yan Chang, Chi-Li Chen, Tyler Loakman, Chenghua Lin. Drivel-ology: Challenging LLMs with Interpreting Nonsense with Depth. EMNLP. 2025
    [PDF] [Code] [HuggingFace]
COLING 2025
  • Tomas Goldsack, Yang Wang, Chenghua Lin, Chung-Chi Chen. From Facts to Insights: A Study on the Generation and Evaluation of Analytical Reports for Deciphering Earnings Calls. COLING. 2025
    [PDF]
COLING 2025
  • Yang Wang, Chenghua Lin. Tougher Text, Smarter Models: Raising the Bar for Adversarial Defence Benchmarks. COLING. 2025
    [PDF] [Code]
  • Yang Wang, Qibin Liang, Chenghao Xiao, Yizhi Li, Noura Al Moubayed, Chenghua Lin. Audio Contrastive-based Fine-tuning: Decoupling Representation Learning and Classification. ICASSP (Under Review). 2025
    [PDF] [Code]

🏅 Honours and Awards

  • 2025.12, UKRI Isambard-AI AIRR Award: Automated Entity Extraction for Call Centre Analytics.
  • 2025.11, UKRI Isambard-AI AIRR Award: Exploring the Knowledge and Reasoning Boundaries of LLMs.
  • 2025.04, The NVIDIA Academic Grant Program.
  • 2024.12, Innovation of the Year at the 26th Doncaster Business Awards. [News]
  • 2024.10, Turing Innovation Catalyst Manchester.
  • 2023.04, Innovate UK Knowledge Transfer Partnership.

📜 Conferences

  • 2025.11, The 2025 Conference on Empirical Methods in Natural Language Processing, Suzhou, China, Oral.
  • 2025.01, The 31st International Conference on Computational Linguistics, Abu Dhabi, UAE, Oral.
  • 2024.11, Innovate UK Knowledge Transfer Partnership Associate Conference, Attendee.

💼 Work Experience

  • 2025.04 - Present, Machine Learning and Natural Language Processing Researcher, Automated Analytics, Doncaster, UK
  • 2023.04 - 2024.04, Machine Learning and Natural Language Processing Engineer (KTP Associate), University of Sheffield, Sheffield, UK
  • 2022.10 - 2023.04, Data Scientist, Automated Analytics, Doncaster, UK
  • 2018.12 - 2019.08, Software Quality Assurance Assistant Engineer, KKStream Limited, Taipei, Taiwan

🔬 Teaching Experience

  • 2026.02 - Present, Graduate Teaching Assistant for COMP34812 Natural Language Understanding.
  • 2026.02 - Present, Graduate Teaching Assistant for COMP64702 Transforming Text Into Meaning.
  • 2025.09 - Present, Graduate Teaching Assistant for COMP64501 Topics in Machine Learning.

🏭 Internships

  • 2021.12 - 2022.10, Research Intern, Automated Analytics, Doncaster, UK.
  • 2021.05 - 2022.05, Research Intern, The VoiceBase Centre, Sheffield, UK.
  • 2020.07 - 2020.08, Research Intern, Cinnamon AI, Taipei, Taiwan.