About Me
Hello! I am a Ph.D. student at Sungkyunkwan University (SKKU), where I am advised by Prof. JinYeong Bak in the Human Language Intelligence Lab.
I am also part of a Ph.D. Collaborator program with Microsoft, where I have worked under the supervision of Dr. Young Jin Kim to explore effective offsite-tuning techniques for generative models.
Currently, I am a research intern at Microsoft Research Asia (MSRA) under the guidance of Dr. Xiaoyuan Yi. My work at MSRA focuses on superalignment. I was honored with the Stars of Tomorrow Internship Award of Excellence during my internship at MSRA.
My research interests lie in analyzing and developing deep learning model architectures (e.g., Transformers, Mixture-of-Experts) and training methods (e.g., Direct Preference Optimization, Instruction Tuning).
This has led to the development of task-specific architectures such as AsmDepictor and training methods such as the offsite-tuning approach PEMA. Currently, my work focuses on advancing scalable oversight methods to achieve superalignment. For details, please refer to our survey and position paper.
Education
- Sungkyunkwan University, South Korea.
  Ph.D., Artificial Intelligence, 2023~Present
- Sungkyunkwan University, South Korea.
  M.S., Artificial Intelligence, 2021~2023
- Kyonggi University, South Korea.
  B.S., Computer Engineering (Transferred), 2019~2021
  B.S., Early Childhood Education, 2015~2019
Preprints and Surveys
- Research on Superalignment Should Advance Now with Parallel Optimization of Competence and Conformity
  HyunJin Kim, Xiaoyuan Yi, Jing Yao, Muhua Huang, JinYeong Bak, James Evans, Xing Xie
  [arXiv] 2025.
  PDF
- The Road to Artificial SuperIntelligence: A Comprehensive Survey of Superalignment
  HyunJin Kim, Xiaoyuan Yi, Jing Yao, Jianxun Lian, Muhua Huang, Shitong Duan, JinYeong Bak, Xing Xie
  [arXiv] 2024.
  PDF
International Conferences and Journals
- PEMA: An Offsite-Tunable Plug-in External Memory Adaptation for Language Models
  HyunJin Kim, Young Jin Kim, JinYeong Bak
  [Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL)] 2024.
  PDF Code
- A Transformer-based Function Symbol Name Inference Model from an Assembly Language for Binary Reversing
  HyunJin Kim, JinYeong Bak, Kyunghyun Cho, Hyungjoon Koo
  [The 18th ACM Asia Conference on Computer and Communications Security (ASIACCS)] 2023.
  PDF Code
- Associative Knowledge Graph Using Fuzzy Clustering and Min-Max Normalization in Video Contents
  Hyun-Jin Kim, Ji-Won Baek, Kyungyong Chung
  [IEEE Access] 2021.
  Link PDF
Domestic Conferences and Journals
- Function Name Prediction using Binary Code with Transformer
  HyunJin Kim, JinYeong Bak
  [The Korean Institute of Information Scientists and Engineers] 2021.
- Traffic Knowledge Graph using Associative Document Weight
  Hyun-Jin Kim, Min-Jeong Kim, Ju-Chang Kim, Kyungyong Chung
  [International Conference on Convergence Technology] 2020.
- Association Rule based Video Knowledge Extraction using Object Detection Algorithm
  Hyun-Jin Kim, Hye-Jeong Kwon, Ji-Hye Gwon, Kyungyong Chung
  [Korean Society for Internet Information] 2020.
- Data Bias Optimization based Association Reasoning Model for Road Risk Detection
  Seong-Eun Ryu, Hyun-Jin Kim, Byung-Kook Koo, Hye-Jeong Kwon, Roy C. Park, Kyungyong Chung
  [Journal of the Korea Convergence Society] 2020.
Awards
- Stars of Tomorrow Internship Award of Excellence, Microsoft Research (2025)
- Best Poster Award, PEMA: An Offsite-Tunable Plug-in External Memory Adaptation for Language Models (1st International NLP Workshop @ KAIST, 2024)
Teaching Experience
- Open Source Software Practice (SKKU)
  Teaching Assistant (Spring 2023, Fall 2023, Spring 2024)
- K-MOOC: Mathematics for AI (SKKU)
  Dev Teaching Assistant (Summer 2021)
Talks
- Inverse Scaling to Superalignment: Ensuring Advanced AI Systems Align with Human Values
  Microsoft Research Asia (Spring 2025)
- PEMA: An Offsite-Tunable Plug-in External Memory Adaptation for Language Models
  RIKEN (Summer 2024)
- An Offsite-Tunable Plug-in External Memory Adaptation and Embedding Temporal Awareness for Retrieval-Augmented Generation
  SKT (Spring 2024)
- A Transformer-based Function Symbol Name Inference Model from an Assembly Language for Binary Reversing
  IBM Research (Fall 2023)
- A Transformer-based Function Symbol Name Inference Model from an Assembly Language for Binary Reversing
  SKKU AI Colloquium (Fall 2022)
- Function Name Prediction from Binary Code with Transformer
  New York University (Fall 2021)
Work Experience
- Research Intern, Microsoft Research Asia (Fall 2024 ~ Present)
  Currently working on research projects focused on superalignment and AI safety.
- Research Intern, Deargen Inc. (Summer 2022)
  I interned with Dr. Bonggun Shin at Deargen USA, where I analyzed differences in synthetic essentiality (SE) scores by inserting new fingerprint data into a cancer dependency prediction model containing cancer cell lines (CCLs) with different mutation environments.
Academic Services
- NAACL 2025 Workshop WNUT
  Reviewer
- IJCAI 2024
  Student Volunteer
- NAACL 2024
  Student Volunteer
- ACM FAccT 2022
  Student Volunteer
Extracurricular Activities
- Optimistic, Pessimistic, and Realistic Views of Large Language Models
  KOFST, Assistant, 2023
- State, Limitations, and Future of Large Language Models
  KOFST, Assistant, 2022
References
- Prof. JinYeong Bak, SKKU, jy.bak@skku.edu
- Dr. Xiaoyuan Yi, Microsoft Research Asia, xiaoyuanyi@microsoft.com
- Dr. Young Jin Kim, Microsoft, youki@microsoft.com
- Dr. Bonggun Shin, Deargen USA, bonggun.shin@deargen.me
- Prof. Hyungjoon Koo, SKKU, kevin.koo@skku.edu
- Prof. Kyungyong Chung, KGU, dragonhci@daum.net