Even Falkenberg Langås

Virtual Reality-Based Remote Control of a Robot

Using Node-RED, MQTT, and a virtual reality interface, we can remotely monitor and control a manufacturing cell.

In my latest research, I've been exploring how we can make robots more accessible and easier to work with, especially in industrial settings. The goal is to create a system where humans and robots can collaborate seamlessly, even when they're not in the same physical space. This is where the concept of a "digital twin" comes in: a virtual replica of a real entity that you can interact with from your computer or even a VR headset. The video above demonstrates our work.

Building a Framework for Collaboration

Our research focuses on developing a framework that makes this kind of remote collaboration possible. This framework combines several technologies:

Robot Operating System 2 (ROS 2)

This provides the foundation for controlling the robot and managing communication between different components of the system.
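To give a rough idea of what this looks like in practice, here is a minimal sketch of a ROS 2 node written with rclpy that publishes the cobot's joint states for the rest of the system to consume. The topic name, joint names, and update rate are placeholders for illustration, not the exact ones used in our setup:

```python
# Minimal sketch (hypothetical names): a ROS 2 node that publishes the
# cobot's joint states so other components (Node-RED, the MQTT bridge)
# can pick them up.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class JointStatePublisher(Node):
    def __init__(self):
        super().__init__('cobot_state_publisher')
        self.pub = self.create_publisher(JointState, '/cobot/joint_states', 10)
        self.timer = self.create_timer(0.1, self.publish_state)  # 10 Hz

    def publish_state(self):
        msg = JointState()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.name = ['joint_1', 'joint_2', 'joint_3',
                    'joint_4', 'joint_5', 'joint_6']
        msg.position = [0.0] * 6  # placeholder values; a real driver reads encoders here
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(JointStatePublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```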

Node-RED

This tool helps process and visualize data from the robot, making it easier to understand what's happening.

MQTT

This protocol enables secure and efficient communication between the digital twin and the physical robot, even over long distances.
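As a small illustration of the cloud communication, the sketch below pushes a single sensor reading from the edge to an MQTT broker using the paho-mqtt helper API. The broker address and topic are placeholders I've made up for this example; a production setup would also add TLS and authentication:

```python
# Hypothetical sketch: forwarding one sensor reading from the edge to a
# cloud MQTT broker. Broker address and topic are placeholders.
import json
import time

import paho.mqtt.publish as publish

BROKER = "broker.example.com"   # placeholder cloud broker
TOPIC = "cell/conveyor/speed"   # placeholder topic


def forward_reading(speed_mm_s: float) -> None:
    payload = json.dumps({"speed_mm_s": speed_mm_s, "timestamp": time.time()})
    # Plain MQTT on port 1883 for brevity; a real deployment would use TLS
    # (port 8883) and credentials.
    publish.single(TOPIC, payload, hostname=BROKER, port=1883)


if __name__ == "__main__":
    forward_reading(120.0)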

Virtual Reality (VR)

This technology creates an immersive experience, allowing users to interact with the digital twin as if they were right there with the physical robot.
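The VR application itself is built separately, but the commands it sends travel as ordinary MQTT messages. Purely as an illustration of that message path (in the actual setup, Node-RED handles this kind of edge-side processing), here is a Python sketch that listens for hypothetical jog commands coming from the VR interface; the topic, broker, and payload format are assumptions for the example:

```python
# Hypothetical sketch: the edge side listening for jog commands that the
# VR application publishes via the cloud MQTT broker.
import json

import paho.mqtt.subscribe as subscribe

BROKER = "broker.example.com"      # placeholder cloud broker
COMMAND_TOPIC = "cell/cobot/jog"   # placeholder topic the VR app publishes to


def on_command(client, userdata, message):
    # Example payload: {"joint": 3, "delta_rad": 0.05}
    cmd = json.loads(message.payload)
    # In a real bridge this would be handed to the robot controller (ROS 2).
    print(f"VR command received: joint {cmd['joint']}, delta {cmd['delta_rad']} rad")


if __name__ == "__main__":
    # Blocks and invokes on_command for every message on the command topic.
    subscribe.callback(on_command, COMMAND_TOPIC, hostname=BROKER, port=1883)
```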

Real-World Applications: A Case Study

To test this framework, we used it to control a collaborative robot (cobot), a gripper, and a conveyor belt in a simulated industrial environment. The cobot was equipped with a camera and sensors, and we monitored and controlled it remotely using a VR headset. This demonstration showed how the framework can be used to perform tasks such as picking and placing objects.

Overview of the case study setup featuring a cobot, camera, PLC, and conveyor belt within the digital twin framework for real-time remote control and monitoring.
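To make the task flow concrete, here is a heavily simplified Python sketch of the pick-and-place sequence from the case study. The controller functions are stubs I've invented for illustration; the real system drives the cobot via ROS 2 and the conveyor via the PLC through the edge layer:

```python
# Simplified, hypothetical sketch of the pick-and-place sequence.
# All controller interfaces below are stubs standing in for the real ones.
import time
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float


def move_cobot_to(pose: Pose) -> None:
    print(f"moving cobot to ({pose.x:.3f}, {pose.y:.3f}, {pose.z:.3f})")
    time.sleep(1.0)  # stand-in for waiting on motion feedback


def set_gripper(closed: bool) -> None:
    print("gripper closed" if closed else "gripper opened")


def set_conveyor_running(running: bool) -> None:
    print("conveyor running" if running else "conveyor stopped")


def pick_and_place(pick: Pose, place: Pose) -> None:
    set_conveyor_running(False)   # stop the belt under the detected object
    move_cobot_to(pick)
    set_gripper(closed=True)      # grasp the object
    move_cobot_to(place)
    set_gripper(closed=False)     # release it
    set_conveyor_running(True)    # resume the belt


if __name__ == "__main__":
    pick_and_place(Pose(0.40, 0.10, 0.05), Pose(0.10, -0.30, 0.05))
```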

Benefits and Beyond

This research has the potential to transform the way we interact with robots. By enabling remote collaboration, we can make robots more accessible to people who may not be able to work in close proximity to them, such as individuals with disabilities. Additionally, this technology can improve safety by allowing humans to control robots from a safe distance in hazardous environments.

While this research is still ongoing, the results so far are promising. We believe that this framework has the potential to revolutionize the field of human-robot interaction, making it easier for us to work together with our robotic counterparts.

You can find the code for this project on my GitHub page: https://github.com/evenlangas/robot-control-virtual-reality-mqtt

Research Publications

Inclusive Digital Twins with Edge Computing, Cloud Communication and Virtual Reality to Achieve Remote Human-Robot Interaction
Even Falkenberg Langås, Halima Zahra Bukhari, Daniel Hagen, Muhammad Hamza Zafar, Filippo Sanfilippo
12th International Conference on Control, Mechatronics and Automation (ICCMA), 2024
Abstract: Digital twins, advanced robotics, edge computing, and immersive technologies have come together to create novel solutions that increase flexibility and operating efficiency. This work is motivated by the need to harness these advancements to develop a robust architecture that supports edge intelligence and real-time remote monitoring and control. The aim of this work is to define a comprehensive framework for digital twins of complex mechatronic systems. The framework enables seamless connection between the physical environment and a virtual representation accessible from remote locations. Key components of the framework include the Robot Operating System 2 (ROS 2) for robot control, Node-RED for data processing and edge communication, a Message Queuing Telemetry Transport (MQTT) broker for cloud-based communication, and a virtual reality (VR) application for immersive interaction. A case study is presented to demonstrate the framework's capabilities. Through the VR application, users can interact with a digital twin of a mechatronic system consisting of a collaborative robot, a programmable logic controller (PLC), a conveyor belt and numerous sensors. The user interface lets the operator manipulate the robot and monitor sensor data in real-time. Latency is measured to validate the performance of the framework, resulting in a mean latency of around 116 ms.