About
I'm currently a second-year M.S. student in Computer Science at National Taiwan University (NTU), advised by Prof. Shou-De Lin. My research interests lie at the intersection of Natural Language Processing and Machine Learning, aiming to enhance the practical applicability of AI models under real-world constraints (e.g., limited compute, data scarcity, and high evaluation costs).
My current research interests include: (1) efficient and controllable inference-time scaling, (2) automated verifiers and benchmarks, and (3) robust generalization across domains and modalities.
Publications
- Beyond Facts: Benchmarking Distributional Reading Comprehension in Large Language Models.
Pei-Fu Guo, Ya-An Tsai, Chun-Chia Hsu, Kai-Xin Chen, Yun-Da Tsai, Kai-Wei Chang, Nanyun Peng, Mi-Yen Yeh, Shou-De Lin
Findings of ACL 2026
- LiveCLKTBench: Towards Reliable Evaluation of Cross-Lingual Knowledge Transfer in Multilingual LLMs.
Pei-Fu Guo, Yun-Da Tsai, Chun-Chia Hsu, Kai-Xin Chen, Ya-An Tsai, Kai-Wei Chang, Nanyun Peng, Mi-Yen Yeh, Shou-De Lin
ACL 2026 Main Conference
- Why is the LLM Unsure? Profiling the Causes of LLM Uncertainty for Adaptive Model and Uncertainty Metric Selection.
Pei-Fu Guo, Yun-Da Tsai, Shou-De Lin
- Benchmarking Uncertainty Metrics for LLM Target-Aware Search.
Pei-Fu Guo, Yun-Da Tsai, Shou-De Lin
Findings of EMNLP 2025
- Text-centric Alignment for Bridging Test-time Unseen Modality.
Yun-Da Tsai*, Ting-Yu Yen*, Pei-Fu Guo, Zhe-Yan Li, Shou-De Lin
Findings of EMNLP 2025
- Towards Optimizing with Large Language Models.
Pei-Fu Guo*, Ying-Hsuan Chen*, Yun-Da Tsai, Shou-De Lin
KDD 2024 Knowledge-Infused Learning Workshop
Experience
Education
- M.S. in Computer Science & Information Engineering, National Taiwan University
– Present
- B.S. in Economics, National Taiwan University
–
Work
- Research Assistant, NTU-MSLab
– Present
- LLM Research Scientist Intern, Appier
–