Sungjae Park
Hi, I am a research intern in the Visual Computing Lab at Seoul National University, advised by Prof. Hanbyul Joo.
Previously, I was a research intern in the CLVR lab at KAIST, advised by Prof. Joseph J. Lim.
I completed my B.S. (Summa Cum Laude) in Mechanical Engineering and Mathematics at Seoul National University.
Email / CV / Twitter / Github
Research
My research goal is to develop robots with human-like abilities, including
physical capabilities for manipulation tasks and cognitive capabilities for an intuitive understanding of the real world.
Specifically, I am interested in solving complex manipulation tasks (e.g., dexterous, contact-rich, long-horizon tasks),
physical reasoning WITH interaction (e.g., experimenting with objects and inferring their properties through physical interaction), and
physical reasoning FOR interaction (e.g., intuitive physics, leveraging object permanence for efficient object retrieval).
* denotes equal contribution.
DROID: A Large-Scale In-the-Wild
Robot Manipulation Dataset
DROID Dataset Team
arXiv 2024
Paper / Website / Code
Open X-Embodiment: Robotic Learning Datasets and RT-X Models
Open X-Embodiment Collaboration
ICRA 2024
Paper / Website
Efficient Cross-Embodiment Learning with Object-Centric Planner
Sungjae Park, Joseph J. Lim, Frank C. Park
Bachelor's Thesis, 2022
Awarded Outstanding BS Thesis Presentation Award
We learn an object-centric trajectory planner from demonstrations across different robots for cross-embodiment transfer.
Online Active Gaussian Process Motion Planning in Unknown Environments
Sungjae Park*, Hyelim Choi*, Taekyun Kim*
Graduate Course Project (Probabilistic Graphical Models, Spring 2022)
We combine Gaussian process motion planning with an entropy-based information factor to perform online active motion planning in unknown environments.
Motion Planning under Constraint with Learned Reachable Manifold
Sungjae Park, Suhan Park
We learn the reachable manifold of a Franka Panda robot arm with a block neural autoregressive flow, as in
Kim et al.
, and perform constrained motion planning directly in the learned latent space.
Vision Guided Peg Insertion
Sungjae Park*, Hosun Choi*, Hyunmoo Heo*
We perform robust peg insertion with a YOLOv3-based hole/box detection model.
Teaching
Introduction to Robotics
Student Instructor, Spring 2022
College Physics 1, 2
Undergraduate Tutoring, Spring 2018, Fall 2018, Spring 2021, Fall 2021
Linear Algebra
Undergraduate Tutoring, Spring 2021
Service
Reviewer for NeurIPS 2023 and ICLR 2024.