Alias: Yufeng Zhao, both derived from the Chinese characters “趙 羽風”. Born: Beijing, 1999.
E-mail: yfzhao [at] jaist.ac.jp. Phone: +81-070-8591-1495.
Links: Twitter · GitHub · Google Scholar · ORCID · Blog.
Physical Address: Laboratory I-52, Information Science Building I, 1-1 Asahidai, Nomi, Ishikawa, Japan.
I graduated from Beijing Institute of Technology, a top-ranking university in China, with a Master’s degree in Software Engineering in 2023 and a Bachelor’s degree in Chemistry in 2021. I am currently pursuing a Ph.D. at JAIST, with an expected early graduation in March 2026. My research explores the internal mechanisms of artificial neural networks during both training and inference, particularly Transformer-based neural language models, using mathematical and representation-learning methods, and applies this deeper understanding to robustly improve their performance. Since 2023, I have published over 20 papers in this area, some of which have been presented at top-tier international conferences such as ICLR and NAACL.
I am actively seeking productive research collaborations in the areas mentioned above. If you are interested in working together, please do not hesitate to contact me. I welcome collaborations with both experts and motivated beginners; being a novice is not a drawback if you are eager and quick to learn. I am also open to exploring collaborations in other areas.
Fine-tuning with Randomly Initialized Downstream Network: Finding a Stable Convex-loss Region in Parameter Space
Yufeng Zhao
Master’s Thesis - Rank A @ Beijing Institute of Technology. 2023.
Synthesis and Self-Assembly of Amphiphilic Aggregation Enhanced Emission Compounds
Yufeng Zhao
Bachelor’s Thesis @ Beijing Institute of Technology. 2021.