Yongchao Chen - 陈勇超

I am a fourth-year PhD student in Electrical Engineering at Harvard SEAS and MIT LIDS. I am currently working on Robotics and Foundation Models under the guidance of Prof. Chuchu Fan and Prof. Nicholas Roy at MIT, co-advised by Prof. Na Li at Harvard. I also do research in AI for Physics, Mechanics, and Materials, and am particularly interested in applying Robotics/Foundation Models to AI4Science.

I received my bachelor's degree from the University of Science and Technology of China (USTC) in 2021, with a major in Theoretical and Applied Mechanics and a minor in Applied Mathematics. Before moving into robotics, I did research on Applied Physics, Solid Mechanics, and AI for Science under the guidance of Prof. Ju Li at MIT, Prof. Joost Vlassak at Harvard, Prof. Ting Zhu at Georgia Tech, and Prof. Hailong Wang at USTC.

I received the 40th Guo Moruo Award (the highest undergraduate honor at USTC) and the Harvard SEAS PhD Fellowship.

I interned at Microsoft Research in Redmond, WA in the summer of 2024, and have been working with the MIT-IBM Watson AI Lab since 2023.

Email  /  CV  /  Google Scholar  /  LinkedIn  /  Github

profile photo
Talks

2024/12/12 - Keynote talk at Intuit AI Research.

2024/11/18 - Invited talk at AutoGen community.

2024/11/12 - Oral presentation (Top 3%) of PROMST at EMNLP'2024, Miami.

2024/10/31 - Seminar talk at Tencent AI Lab in Seattle.

2024/05/11 - Invited Talk at Osaka Metropolitan University in Japan.

2024/04/30 - Seminar Talk at MIT LIDS Autonomy Tea Talk.

2024/04/18 - Seminar at Harvard AM+ Graduate Student Seminar.

2024/03/07 - Presented AutoTAMP at the AGI Leap Summit 2024 and won the Best Paper Award.

2024/01/29 - Invited to give a seminar talk on AutoTAMP at the ICRA VLMNM workshop.

2023/11/01 - Seminar Talk at Science In the News (SITN): Is ChatGPT the Brain Robots Have Been Waiting For? [Poster] [Slides]

Research (Selected Publications)

Robot Learning infuses robots with AI-derived intelligence; conversely, AI for Science exploits AI to generate new scientific knowledge.

Robotics and Foundation Models

I'm currently interested in LLM-based robot planning. Natural language commands are crucial for mainstream robotics applications, and the recent rise of LLMs makes embodied AI more promising.

Steering Large Language Models between Code Execution and Textual Reasoning
Yongchao Chen, Harsh Jhamtani, Srinagesh Sharma, Chuchu Fan, Chi Wang
arXiv
Project Page / Code / Paper

Our research highlights the limitations of textual reasoning in LLMs for tasks involving math, logic, and optimization, where code generation offers a more scalable solution. Despite advances like OpenAI's GPT Code Interpreter and AutoGen, no optimal method exists to reliably steer LLMs between code and text generation. This study identifies key patterns in how LLMs choose between code and text with various factors and proposes three methods to improve steering.

PRompt Optimization in Multi-Step Tasks (PROMST): Integrating Human Feedback and Preference Alignment
Yongchao Chen, Jacob Arkin, Yilun Hao, Yang Zhang, Nicholas Roy, Chuchu Fan
The 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP'2024 Main, Oral, Top 3%)
Project Page / Code / Paper / Initial and Optimized Prompts

We introduce an automatic prompt optimization framework for complex, multi-step agent tasks: PROMST. To handle the issues of task complexity, judging long-horizon correctness of individual actions, high prompt exploration cost, and human preference alignment, we propose the integration of human feedback, a learned score prediction model, and the modification of task score functions.

Large Language Models Can Plan Your Travels Rigorously with Formal Verification Tools
Yilun Hao, Yongchao Chen, Yang Zhang, Chuchu Fan
arXiv
Paper

Xie et al. (2024) introduced TravelPlanner, revealing that LLMs alone had a low success rate of 0.6%. In response, this work proposes a framework that uses LLMs with satisfiability modulo theory (SMT) solvers to interactively and automatically generate valid travel plans, achieving a 97% success rate on TravelPlanner and over 78% on a newly created international travel dataset.

Simplifying Robot Task Planning with Large Language Models
Jacob Arkin, Yongchao Chen, Yang Zhang, Nicholas Roy, Chuchu Fan
arXiv
Project Page / Paper

Task planning in complex environments with many objects can be slow due to irrelevant objects that distract the planner. This study explores the use of pre-trained large language models (LLMs) to simplify planning problems by identifying and excluding irrelevant objects. Various prompting techniques are tested across multiple LLMs in four task planning domains with hundreds of objects.

Scalable Multi-Robot Collaboration with Large Language Models: Centralized or Decentralized Systems?
Yongchao Chen, Jacob Arkin, Yang Zhang, Nicholas Roy, Chuchu Fan
The 2024 International Conference on Robotics and Automation (ICRA'2024)
Project Page / Code / Paper / Video

We compare the task success rate and token efficiency of four multi-agent communication frameworks (centralized, decentralized, and two hybrid) and three step history methods (with all history, without history, and with state-action pairs) as applied to four coordination-dependent multi-agent 2D task scenarios for increasing numbers of agents.

AutoTAMP: Autoregressive Task and Motion Planning with LLMs as Translators and Checkers
Yongchao Chen, Jacob Arkin, Charles Dawson, Yang Zhang, Nicholas Roy, Chuchu Fan
The 2024 International Conference on Robotics and Automation (ICRA'2024), best paper award in AGI Leap Summit 2024
Project Page / Code / Paper / Video

This paper uses LLMs to translate language instructions into formal task specifications that can be solved by a TAMP planner.

NL2TL: Transforming Natural Languages to Temporal Logics using Large Language Models
Yongchao Chen, Rujul Gandhi, Yang Zhang, Chuchu Fan
The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP'2023 Main)
Project Page / Demo Website / Code / Paper

We propose a framework that achieves accurate and generalizable NL-to-TL transformation with the assistance of LLMs, addressing both data generation and model training.

Fundamental Science and AI for Science

I have also done extensive work on fundamental physical science and AI for Science. Integrating robotics and foundation models to help explore new science should be a general trend.

A machine learning perspective on the inverse indentation problem: uniqueness, surrogate modeling, and learning elasto-plastic properties from pile-up
Quan Jiao, Yongchao Chen, Jong-hyoung Kim, Chang-Fu Han, Chia-Hua Chang, Joost J Vlassak*
Journal of the Mechanics and Physics of Solids (Acceptance Rate = 24%, Impact Factor = 5.58), 2024
Paper

We applied machine learning and optimization methods to explore the forward and inverse problems of indentation.

Nanoscale ductile fracture and associated atomistic mechanisms in a body-centered cubic refractory metal
Yan Lu† and Yongchao Chen† (†equal contribution), Yongpan Zeng, Yin Zhang, Deli Kong, Xueqiao Li, Ting Zhu*, Xiaoyan Li*, Shengcheng Mao, Ze Zhang, Lihua Wang*, Xiaodong Han*
Nature Communications (Acceptance Rate = 7.7%, Impact Factor = 17.69), 2023
Paper

We revealed nanoscale ductile fracture and associated atomistic mechanisms in a body-centered cubic refractory metal.

Physics-Enhanced Multi-fidelity Learning for Optical Surface Imprint
Yongchao Chen*
NeurIPS 2023 Workshop on Adaptive Experimental Design and Active Learning in the Real World
Paper

We apply active learning and multi-fidelity neural networks to explore the inverse problems, mitigate the sim-to-real gap, and automate the material discovery process.

Ion-beam radiation-induced Eshelby transformations: The mean and variance in hydrostatic and shear residual stresses
Yongchao Chen, Qingjie Li, Alexander D. O'Brien, Yang Yang, Qi He, David A. Bloore, Joost J. Vlassak*, Ju Li*
Extreme Mechanics Letters (Acceptance Rate = 30%, Impact Factor = 4.728), 2023
Paper

We revealed the coupled effects of the primary knock-on atom (PKA) incident direction and material structure on the average residual shear stress.

Anomalous layer-dependent lubrication on graphene-covered substrate: Competition between adhesion and plasticity
Yongchao Chen, Zhizi Guan, Jingnan Liu, Wei Yang, Hailong Wang*
Applied Surface Science (Acceptance Rate = 18%, Impact Factor = 7.392), 2022
Paper

We revealed an anomalous layer-dependent frictional behavior that originates from the interplay among interfacial adhesion, wrinkling of the topmost graphene layer, contact roughness, and plastic deformation of the substrate.

Tuning nanoscale adhesive contact behavior to a near ideal Hertzian state via graphene coverage
Yongchao Chen, Zhizi Guan, Wei Yang, Yongtao Yao, Hailong Wang*
Computational Materials Science (Acceptance Rate = 19%, Impact Factor = 3.572), 2021
Paper

The influence of adhesion between the bare substrate and the indenter tip can be significantly reduced by decreasing the adhesion strength and adhesion range between the substrate and indenter atoms, or by enhancing the substrate stiffness.

Hobbies

Sports: Soccer (I played in the Ivy Cup with Harvard twice, though we were knocked out in the group stage both times…sad), Basketball, Swimming, Table Tennis, Badminton, Snooker, 5K runs.

Board Game: Avalon, Texas Hold'em, Werewolf, Secret Hitler, Citadel, UNO, Settlers of Catan.

History: I like studying all kinds of history.

Singing: I cannot sing professionally, but I have a great interest in country music, such as 'Take Me Home, Country Roads' by John Denver and 'The Girl from the South' by Lei Zhao.