I am a Ph.D. student at KAIST advised by Prof. Jinwoo Shin, and an Applied Scientist Intern at Amazon AGI. I received my B.S. in Electrical Engineering and Computer Science (double major), with a minor in Mathematics, from KAIST in 2022.
My research aims to build efficient machine learning systems, with a primary focus on large language models (LLMs). I am particularly interested in reducing the inference cost of LLMs. Currently, I am working on architectural modification techniques to adapt existing pre-trained models for improved computational efficiency.
Accelerated Test-Time Scaling with Model-Free Speculative Sampling
Woomin Song, Saket Dingliwal, Sai Muralidhar Jayanthi, Bhavana Ganesh, Jinwoo Shin, Aram Galstyan, Sravan Babu Bodapati
Preprint
Think Clearly: Improving Reasoning via Redundant Token Pruning
Daewon Choi, Jimin Lee, Jihoon Tack, Woomin Song, Saket Dingliwal, Sai Muralidhar Jayanthi, Bhavana Ganesh, Jinwoo Shin, Aram Galstyan, Sravan Babu Bodapati
Preprint
Compress, Gather, and Recompute: REFORMing Long-Context Processing in Transformers
Woomin Song, Sai Muralidhar Jayanthi, Srikanth Ronanki, Kanthashree Mysore Sathyendra, Jinwoo Shin, Aram Galstyan, Shubham Katiyar, Sravan Babu Bodapati
Preprint
Mamba Drafters for Speculative Decoding
Daewon Choi, Seunghyuk Oh, Saket Dingliwal, Jihoon Tack, Kyuyoung Kim, Woomin Song, Seojin Kim, Insu Han, Jinwoo Shin, Aram Galstyan, Shubham Katiyar, Sravan Babu Bodapati
Preprint
Sparsified State-Space Models are Efficient Highway Networks
Woomin Song, Jihoon Tack, Sangwoo Mo, Seunghyuk Oh, Jinwoo Shin
Transactions on Machine Learning Research (TMLR), 2025
NeurIPS Workshop on Efficient Natural Language and Speech Processing (ENLSP-IV), 2024, Oral Presentation
Tabular Transfer Learning via Prompting LLMs
Jaehyun Nam, Woomin Song, Seong Hyeon Park, Jihoon Tack, Sukmin Yun, Jaehyung Kim, Kyu Hwan Oh, Jinwoo Shin
Conference on Language Modeling (COLM), 2024
ICML Workshop on Efficient Systems for Foundation Models (ICMLW-ES-FoMo), 2023
Hierarchical Context Merging: Better Long Context Understanding for Pretrained LLMs
Woomin Song*, Seunghyuk Oh*, Sangwoo Mo, Jaehyung Kim, Sukmin Yun, Jung-Woo Ha, Jinwoo Shin
International Conference on Learning Representations (ICLR), 2024
M.S./Ph.D. in Artificial Intelligence, Sep. 2022 - Present
Korea Advanced Institute of Science and Technology (KAIST)
B.S. in Electrical Engineering and Computer Science (double major), Mathematics (minor), Mar. 2018 - Aug. 2022
Korea Advanced Institute of Science and Technology (KAIST)
GPA: 4.20/4.3 (Summa Cum Laude)
Laon People, Aug. 2021 - Feb. 2022
AI Research Intern
Recipient, National Presidential Scholarship for Science
Dean's List, KAIST College of Engineering (2019F, 2020S, 2020F)
Recipient, Korea Foundation for Advanced Studies (KFAS) Undergraduate Scholarship
Recipient, KAIST Alumni Association Scholarship
Recipient, KAIST Presidential Fellowship
Recipient, National Science and Engineering Undergraduate Scholarship