Sehyun Choi

Master of Philosophy (MPhil) @ HKUST.


(Photo: HKUST)

Yongin-si, South Korea

I have a strong interest in Machine Learning (ML) and Artificial Intelligence (AI), and I am planning to pursue a higher degree after my bachelor’s degree. My main interest lies in the field of Natural Language Processing (NLP), working with Language Models (LMs). Other, orthogonal fields of interest include Trustworthy AI, such as eXplainable AI (XAI) and robustness.


Research Questions:

Language Model +

  1. Inference-time optimization
  2. Knowledge Grounding
  3. Compositional Generalization
  4. World Model

In more detail:

The first topic is something I call “inference-time optimization”. Language Models have come a long way and are extremely strong nowadays. However, their behavior depends strongly on the properties of the training dataset. It is impossible to design a dataset that is “perfect”, so models will always exhibit some undesired attributes. I believe this should be addressed with inference-time adaptation methods, such as controllable generation or constrained modeling under given constraints / desiderata (objectives).
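As a toy illustration of what I mean by inference-time adaptation (a hypothetical sketch, not code from any specific paper of mine): one simple form of constrained generation reweights the LM’s next-token distribution with an external constraint score at decoding time, leaving the model’s weights untouched.

```python
import torch

def constrained_next_token_dist(lm_logits, constraint_logprobs, alpha=1.0):
    """Blend LM next-token logits with an external constraint score (toy sketch).

    lm_logits:           (vocab_size,) raw logits from the language model
    constraint_logprobs: (vocab_size,) log-scores from a separate constraint model,
                         e.g. an attribute classifier evaluated per candidate token
    alpha:               constraint weight; alpha = 0 recovers plain LM decoding
    """
    combined = lm_logits + alpha * constraint_logprobs
    return torch.softmax(combined, dim=-1)  # adjusted distribution to sample from
```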

Second, I am also interested in “Knowledge Grounding”. Hallucination, or confabulation (a term credited to Dr. Hinton), has become a central problem in the era of generative AI. I believe reference-grounded generation is an important direction for addressing this problem from a safe-AI perspective.

The third topic is reasoning ability, or more precisely, “compositional generalization”. Dr. Chomsky describes human language as making “infinite use of finite means”, which suggests that humans compose a finite set of functions to create infinitely many possibilities. There is evidence that current SOTA AI (arguably LLMs) may still lack this ability.
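A toy illustration in the spirit of compositional-generalization benchmarks such as SCAN (a hypothetical split, not data from my own work): a model that has seen “jump”, “walk”, and “walk twice” during training is tested on the unseen composition “jump twice”.

```python
# Hypothetical toy split in the spirit of SCAN-style compositional generalization tests.
train = {
    "walk": "WALK",
    "jump": "JUMP",
    "walk twice": "WALK WALK",
}
# The test probes whether the model composes the primitive "jump"
# with the modifier "twice", a combination never seen in training.
test = {
    "jump twice": "JUMP JUMP",
}
```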

The final topic is the “World Model”, which relates to understanding commonsense about how the world works. I have a few publications on commonsense reasoning and on building commonsense knowledge bases (CSKBs).


📄 Resume

(Last Update: Nov 16, 2023)


Another Passion of Mine: Photography

In my free time, I love doing street photography. You can check out my photos on my Instagram accounts @sync.hpoto (a bit of everything) & @sync.hcut (mostly street). I’m also big into analog film photography, which is showcased exclusively in my personal gallery.

news

Oct 7, 2023 🚨 NEW PAPER! 🚨 My first-author paper on “Knowledge Constrained Decoding” has been accepted to the EMNLP 2023 main conference!
Aug 26, 2022 🎉 Won 2nd place in the Naver Clova AI Rush 2022 Unknown Document Detection Task! 🎉
May 11, 2022 Successfully finished my FYT project, “Explaining NLI with Feature Interaction Attribution”, in which I developed MAsk-based Feature Interaction Attribution (MAFIA), with an interactive demo.
Apr 25, 2022 I will be starting my MPhil @ HKUST with Prof. Yangqiu Song in September 2022. I have also been selected as an Asian Future Leaders Scholar 2022!
Aug 26, 2021 My paper from my UROP project has been accepted to EMNLP 2021! (3rd author)

selected publications

2023

  1. EMNLP 2023
    KCTS: Knowledge-Constrained Tree Search Decoding with Token-Level Hallucination Detection
    Sehyun Choi, Tianqing Fang, Zhaowei Wang, and 1 more author
    EMNLP 2023, Oct 2023
  2. arXiv
    AbsPyramid: Benchmarking the Abstraction Ability of Language Models with a Unified Entailment Graph
    Zhaowei Wang, Haochen Shi, Weiqi Wang, and 5 more authors
    arXiv preprint, Nov 2023

2021

  1. EMNLP 2021
    Benchmarking Commonsense Knowledge Base Population with an Effective Evaluation Dataset
    Tianqing Fang, Weiqi Wang, Sehyun Choi, and 4 more authors
    EMNLP 2021, Sep 2021

experiences

Sep, 2023 ~ Research Intern @ Nucleus AI, Foundational Language Model Pretraining (California, U.S., Remote)
Dec, 2021 ~ Feb, 2022 Research Intern @ Naver Corporation, Papago (Seongnam, South Korea)
Jul, 2021 ~ Aug, 2021 Research Intern @ SAI-Lab, KAIST, led by Prof. Jaesik Choi (Seongnam, South Korea)
Sep, 2020 ~ Dec, 2022 Research Intern (UROP) @ HKUST-KnowComp, led by Prof. Yangqiu Song. (Hong Kong)
Dec, 2019 ~ Jul, 2020 ML Engineer Intern @ Skelter Labs (Seoul, South Korea)

education

Aug 25, 2022 Master of Philosophy @ HKUST (2022~)
  • Highlights: CGA: 4.15/4.3
Jul 14, 2022 Bachelor’s Degree @ HKUST (2017~2022) [ Certificate | Transcript ]
  • Highlights: GGA: 4.01/4.3; Academic Achievement Medal (Summa Cum Laude equivalent); First Class Honours; Dean’s List
Dec 23, 2016 High School Diploma @ Handong International School (2011~2016)