PAPRLE: reconfigurable hardware collects data for AI-driven robotics

March 10, 2026 · Michael O'Boyle

Large behavior models have the potential to do for robotics what large language models have done for current AI technologies. Joohyung Kim, an Illinois Grainger Engineering professor of electrical and computer engineering, worked with the Toyota Research Institute to develop PAPRLE (Plug-and-Play Robotic Limb Environment), a system for rapidly collecting large, robust datasets to train physical (or embodied) AI. 


The rapid advancement of artificial intelligence in recent years has caught the attention of roboticists. The success of large language models, or LLMs, has prompted interest in applying the same techniques to robots and other physical devices.

Like human language and thought, human movements have often proven too subtle and complex to reproduce with traditional computer programs. Researchers believe that “large behavior models” trained on vast datasets of human-robot interactions could make robotic systems as powerful as today’s LLM-based technologies.

“The reason for AI’s progress is massive data sets collected from human actions and information,” explained Joohyung Kim, a professor of electrical and computer engineering in The Grainger College of Engineering at the University of Illinois Urbana-Champaign. “However, data for robotic interactions is still scarce, especially data that is diverse, repeatable and grounded in real physical interaction. Before we can truly bring AI to robotics, we need to collect a large, robust data set to build on.”

In conjunction with the Toyota Research Institute, Kim’s Illinois Grainger Engineering research group KIMLAB has developed a system for rapidly collecting data on human-robot interaction. The Plug-and-Play Robotic Limb Environment (PAPRLE) allows physical embodiments and control interfaces to be easily reconfigured without significant rebuilding, facilitating quick and versatile data collection for a wide range of behaviors.

“What distinguishes robotics from other sensors and sources of information is the ability to sense the environment,” Kim said. “They can accumulate data through a continuous loop of perception, action, feedback and learning. Although this concept is broadly understood, it is not yet clearly defined. PAPRLE is the first step in this direction.”

KIMLAB researchers presented PAPRLE at the Amazon MARS 2025 conference, and the system was reported in the January 2026 issue of IEEE Robotics & Automation Magazine.

Follow the leader: interchangeable controls and embodiment

PAPRLE is designed to interface two kinds of units: “leaders,” or control systems such as gaming controllers, virtual reality interfaces and puppeteer devices; and “followers,” or different robot limb configurations. The defining feature of the interface is that it is not tied to one particular hardware configuration. The leader and follower units can be swapped in and out, while the underlying interface stays the same.
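The decoupling described above can be illustrated with a minimal sketch. The class and method names below are hypothetical, chosen for illustration, and do not reflect PAPRLE's actual API; the point is only that any leader can drive any follower through a shared interface.

```python
from abc import ABC, abstractmethod

# Illustrative sketch only -- these names are NOT PAPRLE's actual API.

class Leader(ABC):
    """A control device (gamepad, VR interface, puppeteer rig)."""
    @abstractmethod
    def read_command(self) -> list[float]:
        """Return target joint positions for the follower."""

class Follower(ABC):
    """A robot limb configuration that executes joint commands."""
    @abstractmethod
    def apply_command(self, joints: list[float]) -> None:
        """Drive the limb toward the commanded joint positions."""

class GamepadLeader(Leader):
    def read_command(self) -> list[float]:
        return [0.1, -0.2, 0.3]  # stub: a real leader would poll the device

class LoggingFollower(Follower):
    def __init__(self) -> None:
        self.log: list[list[float]] = []
    def apply_command(self, joints: list[float]) -> None:
        self.log.append(joints)  # record the demonstration for a dataset

def teleop_step(leader: Leader, follower: Follower) -> None:
    # Leaders and followers can be swapped freely; this loop never changes.
    follower.apply_command(leader.read_command())

leader, follower = GamepadLeader(), LoggingFollower()
teleop_step(leader, follower)
print(follower.log)  # [[0.1, -0.2, 0.3]]
```

Because the teleoperation loop depends only on the abstract interfaces, substituting a VR leader or a different limb configuration requires no change to the data collection pipeline itself.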

With this system, it is much easier to generate large sets of control and movement demonstrations across multiple hardware embodiments. The demonstrations are also consistent enough to be used as reliable training data for large behavior models.

[Graphic: two blocks labeled "Leader" and "Follower," representing PAPRLE's two unit types.]

“PAPRLE is designed to be a practical bridge between real robot interaction and data-driven physical AI,” Kim said. “It’s a pipeline. And once the pipeline is in place, the AI can enter in multiple ways, such as learning policies from demonstrations, learning action representations across embodiments, building models that map perception to action, or even adding shared autonomy — an ‘AI assist’ — on top of teleoperation.”

CHILD’s play

KIMLAB demonstrated the principles of PAPRLE with a controller system designed to mimic human limb movements. The Controller for Humanoid Imitation and Live Demonstration, or CHILD, consists of a small figurine that fits in a standard-sized baby carrier. An operator moves the figurine's limbs, and a humanoid robot mirrors the movements.

Illinois Grainger Engineering affiliations

Joohyung Kim is an Illinois Grainger Engineering associate professor in the Department of Electrical and Computer Engineering. He is affiliated with the Coordinated Science Laboratory.


This story was published March 10, 2026.