Seokhyeon Ha

I am interested in artificial intelligence and machine learning. Currently, I am focusing on research related to large-scale language models.

  • Currently working as a Staff Engineer at Samsung Electronics (2024-)

  • PhD degree from Department of Electrical and Computer Engineering, Seoul National University (2017-2024), advised by Prof. Jungwoo Lee

  • BS degree from Department of Electrical and Computer Engineering, Seoul National University (2013-2017)

Email  /  Google Scholar  /  Github  /  LinkedIn

Research

Domain-Aware Fine-Tuning: Enhancing Neural Network Adaptability
Seokhyeon Ha, Sunbeom Jung, Jungwoo Lee
AAAI, 2024
paper / code

We propose Domain-Aware Fine-Tuning (DAFT), a novel approach that incorporates batch normalization conversion and the integration of linear probing and fine-tuning. Our batch normalization conversion method effectively mitigates feature distortion by reducing modifications to the neural network during fine-tuning. Additionally, we introduce the integration of linear probing and fine-tuning to optimize the head layer with gradual adaptation of the feature extractor.
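
A minimal PyTorch sketch of the two ingredients, assuming a ResNet-style model whose classification head is named `fc`; the exact conversion rule and training schedule are those of the paper's code, not this sketch:

```python
import torch.nn as nn
from torch.optim import SGD

def freeze_batchnorm(model):
    # Stand-in for the paper's batch normalization conversion: keep BN
    # statistics and affine parameters fixed so fine-tuning distorts the
    # pretrained features less. Re-apply after every model.train() call,
    # since train() switches BN layers back to training mode.
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.eval()
            for p in m.parameters():
                p.requires_grad_(False)

def daft_style_optimizer(model, head_lr=1e-2, body_lr=1e-4):
    # Integrated linear probing and fine-tuning: the head is optimized
    # with a larger learning rate while the feature extractor adapts
    # gradually with a smaller one ("fc" is an assumed head name).
    head = list(model.fc.parameters())
    body = [p for n, p in model.named_parameters() if not n.startswith("fc")]
    return SGD([{"params": head, "lr": head_lr},
                {"params": body, "lr": body_lr}], momentum=0.9)
```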

MINA: Multi-Input Network Augmentation for Enhancing Tiny Deep Learning
Seokhyeon Ha, Yeongmo Kim, Jungwoo Lee
IEEE ACCESS, 2023
paper

We propose a new method called Multi-Input Network Augmentation (MINA). MINA converts a tiny neural network into a multi-input configuration, so that only the augmented model receives more diverse inputs during training. After training, the tiny neural network can be converted back into its original single-input configuration, as in the sketch below.
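
A hypothetical sketch of the conversion idea, assuming the tiny network's first convolution is duplicated into one stem per input view and the stem outputs are summed; because the stems are linear, they fold back into a single convolution after training:

```python
import copy
import torch
import torch.nn as nn

class MultiInputStem(nn.Module):
    # Illustrative stand-in for MINA's multi-input conversion: each
    # training view gets its own copy of the first conv layer, and the
    # stem outputs are summed before the shared body of the network.
    def __init__(self, first_conv, num_inputs=2):
        super().__init__()
        self.stems = nn.ModuleList(
            copy.deepcopy(first_conv) for _ in range(num_inputs))

    def forward(self, views):           # views: list of num_inputs tensors
        return sum(stem(v) for stem, v in zip(self.stems, views))

    def fold(self):
        # Convert back to single-input: when every view is the same
        # input x, sum_i(W_i * x + b_i) == (sum W_i) * x + (sum b_i),
        # so the stems merge into one conv with summed weights.
        merged = copy.deepcopy(self.stems[0])
        with torch.no_grad():
            merged.weight.copy_(sum(s.weight for s in self.stems))
            if merged.bias is not None:
                merged.bias.copy_(sum(s.bias for s in self.stems))
        return merged
```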

Meta-ensemble learning with a multi-headed model for few-shot problems
Seokhyeon Ha, Youngseok Yoon, Jungwoo Lee
ICT Express, 2023
paper

We propose a novel meta-ensemble learning approach based on a recent ensemble method: the multi-input multi-output (MIMO) configuration. Our approach can be readily applied to existing meta-learning algorithms. Multiple subnetworks in a single model simultaneously learn multiple episodes and ensemble their predictions, fully leveraging the model capacity.
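
A minimal sketch of the MIMO configuration (names and sizes are illustrative): a shared backbone takes the channel-wise concatenation of M episode inputs and M heads produce one prediction each, so M subnetworks coexist in one model:

```python
import torch
import torch.nn as nn

class MIMONet(nn.Module):
    # Illustrative MIMO sketch: the backbone is assumed to accept
    # m * C input channels (e.g. a conv net with a widened first layer).
    # Each of the m implicit subnetworks is trained on its own episode.
    def __init__(self, backbone, feat_dim, num_classes, m=3):
        super().__init__()
        self.m = m
        self.backbone = backbone
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, num_classes) for _ in range(m))

    def forward(self, episodes):        # episodes: list of m (B, C, H, W) tensors
        z = self.backbone(torch.cat(episodes, dim=1))
        return [head(z) for head in self.heads]

    def predict(self, x):
        # At test time every subnetwork sees the same input and the
        # head predictions are ensembled by averaging.
        return torch.stack(self.forward([x] * self.m)).mean(dim=0)
```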

DE-DARTS: Neural architecture search with dynamic exploration
Jiwoo Mun, Seokhyeon Ha, Jungwoo Lee
ICT Express, 2023
paper

To overcome the bias of gradient-based search, we make the architecture dynamic during the search. This simple technique gives the gradient-based search an exploration effect. For effective exploration, we propose Dynamic Attention Networks (DANs), which change the neural architecture based on the input.
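
A sketch of the input-dependent mixed operation, assuming a DARTS-style search space; here a small attention network (a hypothetical stand-in for the paper's DAN) replaces the static architecture weights, so the effective architecture varies with the input during search:

```python
import torch
import torch.nn as nn

class DynamicMixedOp(nn.Module):
    # In DARTS, a mixed edge combines candidate ops with static
    # architecture parameters alpha. This sketch instead predicts the
    # mixing weights from pooled input features, yielding exploration.
    def __init__(self, ops, channels):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, len(ops)))

    def forward(self, x):
        w = torch.softmax(self.attn(x), dim=-1)         # (B, num_ops)
        outs = torch.stack([op(x) for op in self.ops])  # (num_ops, B, C, H, W)
        return torch.einsum('bo,obchw->bchw', w, outs)
```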

Automotive radar signal interference mitigation using RNN with self attention
Jiwoo Mun, Seokhyeon Ha, Jungwoo Lee
ICASSP, 2020
paper

We propose a new deep learning-based method for radar interference mitigation, improving on an existing deep learning algorithm with a self-attention mechanism. We apply our algorithm to the OFDM radar environment as well as the conventional frequency modulated continuous wave (FMCW) radar.
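
A minimal sketch of the architecture family (layer sizes and the input representation are assumptions): a recurrent encoder processes the received signal samples, self-attention lets each time step attend to the others, and a linear layer regresses the interference-free signal:

```python
import torch
import torch.nn as nn

class AttnRNNDenoiser(nn.Module):
    # Illustrative RNN-with-self-attention denoiser; the same model can
    # be trained on FMCW or OFDM radar data. Inputs are assumed to be
    # complex samples split into real/imaginary channels.
    def __init__(self, in_dim=2, hidden=64, heads=4):
        super().__init__()
        self.rnn = nn.GRU(in_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.out = nn.Linear(2 * hidden, in_dim)

    def forward(self, x):               # x: (batch, time, 2)
        h, _ = self.rnn(x)
        a, _ = self.attn(h, h, h)       # self-attention over time steps
        return self.out(a)              # estimate of the clean signal
```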