Robotic Arms

Augmented Robotic Telepresence

The robot arm was developed to tackle remote tasks through cloud-based nodes constructed to create the remote work environment. By programming the robot over the cloud, the team could remotely simulate robotic movements and drive a custom AR camera mounted on the robot arm, making remote use of the resources of the Robotically Augmented Design (RAD) Lab at Kent State University.
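A minimal sketch of what the cloud-programming step might look like: a remote node composes a URScript `movej` command (the native scripting language of Universal Robots arms such as the UR10) and streams it to the controller. The host name, joint targets, and motion parameters below are illustrative assumptions, not the lab's actual configuration.

```python
import math

def movej_command(joints_deg, a=1.2, v=0.25):
    """Build a URScript movej command from joint angles in degrees.

    URScript expects joint positions in radians; the acceleration `a`
    (rad/s^2) and velocity `v` (rad/s) defaults here are illustrative.
    """
    joints_rad = [math.radians(j) for j in joints_deg]
    joints = ", ".join(f"{j:.4f}" for j in joints_rad)
    return f"movej([{joints}], a={a}, v={v})\n"

# Streaming the command to a real controller would look roughly like
# this (commented out -- the host is an assumption; port 30002 is the
# UR secondary interface commonly used for sending URScript):
#
# import socket
# with socket.create_connection(("robot-host.example", 30002)) as s:
#     s.sendall(movej_command([0, -90, 90, -90, -90, 0]).encode())

print(movej_command([0, -90, 90, -90, -90, 0]))
```

Keeping command construction separate from transport is what lets the same node drive either a simulated arm or the physical one in the lab.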

Marc Northstar received an ACADIA / Autodesk-sponsored grant in 2020 for a live collaboration across four geographical locations: California, Washington, the Czech Republic, and Ohio.

Software: Maya, Rhino, & Grasshopper.

A New Frontier in Telepresence

The core of this research centers on augmented robotic telepresence (ART)—a technology that merges the virtual with the physical to create immersive, tactile experiences. We utilized robot arms equipped with custom end-effectors, AR cameras, and motion-capture tools to enable remote users to engage with physical environments in ways previously deemed impossible. For example, a remote participant could manipulate real-world objects or move through a curated space while experiencing real-time, high-quality visual feedback. This blending of realities not only pushes the boundaries of telepresence but also expands the potential for its application across multiple industries.
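Letting a distant participant move through or manipulate a physical space safely implies some guard between raw remote input and robot motion. One simple pattern, sketched below under assumed numbers (the envelope is not the RAD Lab's real one), is clamping each requested camera position to a pre-surveyed safe box before it reaches the arm.

```python
def clamp_to_workspace(pose, limits):
    """Clamp a requested (x, y, z) camera position to a safe box.

    `limits` maps each axis name to a (min, max) range in metres.
    Constraining remote commands to a pre-surveyed envelope is one
    straightforward way to let a distant user drive a physical camera
    without risking collisions; the bounds here are illustrative.
    """
    return tuple(
        min(max(value, limits[axis][0]), limits[axis][1])
        for axis, value in zip("xyz", pose)
    )

# Hypothetical safe envelope around the robot's work area.
SAFE_BOX = {"x": (-0.5, 0.5), "y": (0.2, 0.8), "z": (0.1, 1.0)}

# A request that strays outside the box is pulled back to its surface.
print(clamp_to_workspace((0.9, 0.5, -0.2), SAFE_BOX))  # -> (0.5, 0.5, 0.1)
```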

Advancing Digital Fabrication

At the ACADIA conference, our work was recognized with a grant to participate in the Augmented Robotic Telepresence workshop. In this workshop, we redefined "digital fabrication" as a hybrid process that involves digitalizing the physical and physicalizing the digital. Using robot arms as physical interpreters and AR platforms as digital mediums, we explored the possibilities of bridging the gap between actual and virtual realities. This new workflow establishes a seamless connection between AR cameras mounted on robotic arms and custom robotic motions, enabling remote users to "occupy" physical spaces with unprecedented accuracy and control.
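Connecting an arm-mounted AR camera to custom robotic motions ultimately comes down to bookkeeping: each video frame needs to be paired with the arm pose that produced it so digital content can be overlaid in the right place. A minimal sketch of that pairing, assuming a timestamped pose log (the log format is an illustration, not the workshop's actual data model):

```python
import bisect

def nearest_pose(pose_log, frame_time):
    """Return the logged robot pose closest in time to a camera frame.

    `pose_log` is a list of (timestamp, pose) tuples sorted by time.
    Matching each AR frame to the nearest recorded arm pose is the
    basic synchronization step behind registering digital overlays
    onto the physical camera feed.
    """
    times = [t for t, _ in pose_log]
    i = bisect.bisect_left(times, frame_time)
    # Only the neighbours around the insertion point can be closest.
    candidates = pose_log[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda tp: abs(tp[0] - frame_time))[1]

log = [(0.00, "pose_a"), (0.10, "pose_b"), (0.20, "pose_c")]
print(nearest_pose(log, 0.12))  # -> pose_b
```

In practice the "pose" entries would be full end-effector transforms rather than labels, but the nearest-timestamp lookup is the same.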

Practical Applications: From Creative Arts to Industry

One of the most exciting aspects of this research is its broad applicability. While the initial development was centered on remote videography and artistic endeavors—such as using the UR10 robot arm for curated walk-throughs at RAD Lab—the potential extends far beyond. This hybrid workflow can be applied to industries like manufacturing, healthcare, and education:

  • Manufacturing: Remote workers could oversee and manipulate robotic assembly lines from anywhere in the world, improving efficiency and safety in hazardous environments.

  • Healthcare: Surgeons and medical professionals could perform complex procedures using robotic arms and AR guidance, enabling remote surgeries or real-time training simulations.

  • Education: Virtual labs and studios could allow students to work on real-world projects from their homes, using robotic systems to interact with tools and environments they otherwise couldn’t access.

The Future of Hybridized Reality

Looking forward, our ART workflow has the potential to scale beyond the RAD Lab and integrate with a variety of software-as-a-service (SaaS) platforms. Imagine a future where AR and robotics converge in SaaS ecosystems, allowing users to remotely manipulate machines, environments, or creative projects through intuitive, cloud-based interfaces. The ability to remotely program robotic systems and interact with them in real time could democratize access to advanced technologies, allowing more people to contribute to fields ranging from industrial design to architectural research.
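The SaaS pattern described above decouples the remote user from the on-site robot through an intermediary queue. The toy class below sketches that decoupling in memory; in a real deployment this role would be played by a hosted message broker, and the command names are purely illustrative.

```python
from collections import deque

class CommandRelay:
    """Toy in-memory stand-in for a cloud command queue.

    Remote users enqueue motion commands through a web interface,
    and the on-site robot node drains them in order at its own pace.
    This is a sketch of the pattern only, not a production broker.
    """
    def __init__(self):
        self._queue = deque()

    def submit(self, command):
        """Called by the remote UI to queue a command."""
        self._queue.append(command)

    def next_command(self):
        """Polled by the robot-side node; None when the queue is empty."""
        return self._queue.popleft() if self._queue else None

relay = CommandRelay()
relay.submit("movej_home")
relay.submit("pan_left_15deg")
print(relay.next_command())  # commands come back in submission order
```

Because neither side calls the other directly, the same relay could front a simulated arm during rehearsal and the physical arm during a live session.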

By hybridizing the real and virtual, we’ve opened the door to a future where digital and physical worlds not only coexist but actively enhance each other. This is more than just an innovation in telepresence—it’s the foundation of new platforms that will shape how we work, create, and interact in the years to come.

Product Design Elements: The project has several elements consistent with product design:

  1. Interaction Design: This project involves designing an interactive experience that allows users to manipulate robot arms in a virtual environment.

  2. Visual Design: The project has a strong visual component, with 3D models and animations that bring the robot arms to life.

  3. User Experience (UX) Design: The project requires consideration of the user's experience and how they will interact with the AR environment.

Product Design Principles: The project appears to incorporate several product design principles, such as:

  1. User-centered design: The project is designed to provide an engaging and interactive experience for the user.

  2. Experimentation and iteration: The project involved experimentation and iteration to refine the AR experience and ensure that it is intuitive and user-friendly.

  3. Emphasis on usability: The project prioritizes usability, with a focus on creating an experience that is easy to navigate and understand.