AI and digital twins to serve increasingly complex robot management
Proliferating fleets of robots, the diffusion of humanoids and the emerging robot class of androids will increase the complexity of robot management. AI and digital twins will become increasingly important management tools
As robotic devices – including delivery and service bots as well as utilitarian drones – proliferate and develop increasing degrees of autonomy, managing fleets of robotics will become increasingly challenging and therefore important. Artificial intelligence (AI) will become a crucial component of these fleets and associated infrastructures.
Management systems that enable navigation of robotic fleets and coordination among various types of robotics systems will become essential to drive automation and autonomous applications across industries, particularly in unstructured environments.
Nvidia, which has become a household name in recent years because of its AI chips, is also a major player in AI-enabled robotics and is working on a wide range of research into advanced robotics.
At the beginning of 2025, the company released Mega for Omniverse Blueprint, a software product to “design, test and optimise a new generation of intelligent manufacturing data centres using digital twins”. Nvidia says the blueprint allows users to develop and improve “physical AI and robot fleets at scale in a digital twin before deployment into real-world facilities”.
The system focuses on the use of fleets of robots – including humanoids, autonomous robots and robotic manipulators – for warehouse and factory applications. Digital twins provide a platform to decide on the placement of equipment, establish navigation paths, and foresee potential operational conflicts and issues before deploying changes to the physical warehouse and facilities.
It also aims to enable engineers to simulate operations of robots, autonomous devices and other automated equipment within digital twins. Complex interactions can be tested and improved under various real-world scenarios in safe and risk-free digital environments to develop the layout and workflow of manufacturing and warehouse plants.
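The core idea of testing fleet interactions in a twin before touching the physical facility can be illustrated with a minimal sketch. This is not Nvidia's Mega API; the grid model, robot names and function below are invented for illustration: planned robot routes are replayed step by step in a virtual layout, and any two robots occupying the same cell at the same time step are flagged before deployment.

```python
# Minimal illustrative sketch (not Nvidia's Mega API): checking planned
# robot routes for conflicts in a simple grid "digital twin" before any
# change is made to the physical facility.
from itertools import combinations

def find_conflicts(routes):
    """Return (time_step, cell, robot_a, robot_b) for every collision.

    routes maps a robot id to its planned path, a list of (x, y) cells
    occupied at successive time steps; a robot waits at its final cell.
    """
    conflicts = []
    horizon = max(len(path) for path in routes.values())
    for t in range(horizon):
        for a, b in combinations(routes, 2):
            pos_a = routes[a][min(t, len(routes[a]) - 1)]
            pos_b = routes[b][min(t, len(routes[b]) - 1)]
            if pos_a == pos_b:
                conflicts.append((t, pos_a, a, b))
    return conflicts

# Two planned routes cross the same aisle cell at the same time step.
routes = {
    "amr_1": [(0, 0), (1, 0), (2, 0), (3, 0)],
    "amr_2": [(2, 1), (2, 0), (2, 0), (2, 0)],
}
print(find_conflicts(routes))
# → [(2, (2, 0), 'amr_1', 'amr_2')]
```

In a real digital twin the routes would come from physics-based simulation rather than hand-written lists, but the pay-off is the same: the conflict is found virtually, so only the corrected layout or schedule reaches the warehouse floor.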
Nvidia says: “The convergence of AI, robotics and digital twins promises to streamline logistics, reduce inefficiencies and inject intelligence into industrial operations.”
Similarly, the combination of digital twins with AI and robotics, as well as rapidly advancing sensor technologies and their availability, could act as a game changer in industrial operations. The emerging network of mutualistic technologies is set to establish effective and efficient operations far beyond factory and warehouse environments, affecting planning, operations and management of all kinds of systems – from advanced types of equipment to intertwined supply chains to entire urban environments.
Nvidia founder and CEO Jensen Huang highlights the need to understand the changes that are affecting industrial facilities in coming years, saying: “Future warehouses will function like massive autonomous robots, orchestrating fleets of robots within them.” This is a notion that will apply to systems well beyond warehouses and manufacturing plants over time.
Growing interest in humanoid robots
Humanoid robots have caught the attention of developers and investors in recent years – so much so that observers are concerned that the field is already creating an investment bubble. The rationale for robots that resemble the human anatomy – more or less closely – is clear: current industrial and service robots require adjusted environments to operate in.
The layout of facilities has to accommodate robots’ capabilities and limitations. Pathways, for example, have to be designed consciously – stairs and ramps can represent formidable hurdles for wheeled robots. In some cases – to protect humans – pathways are separated from human hallways to avoid conflict, and industrial robots might even be required to be in cages for safety purposes.
In contrast, humanoid robots can use the infrastructure that was purposefully and carefully created to accommodate humans. Stairs and ramps pose no obstacle, and objects designed to be grasped and handled by humans are equally usable by humanoid robots with similarly featured hands. In short, these robots can integrate seamlessly into human spaces and facilities.
Pilot projects and showcases hint at the potential use cases of humanoids. Cost, reliability and efficiency considerations will play a role in future roll-outs – likely only some well-defined and focused applications will make sense. The manufacture of screws or the pouring of concrete, for example, will see increased use of industrial robots; the use of humanoids in such contexts is rather unlikely.
The number of humanoid developers is large, and some experts foresee substantial changes to manufacturing tasks. Tesla is refining its Optimus humanoid. Figure claims it is “giving AI a body” and that it is the first company to enable two humanoids to collaborate on tasks, with the robots able to be operated via voice commands. Many more companies want to participate in the emerging market: 1X, Agility Robotics, Apptronik, Boston Dynamics, Fourier Intelligence, RobotEra and Sanctuary AI already crowd the field.
Car manufacturers, for example, are already looking at real-world applications in their production facilities. BMW is experimenting with Figure’s humanoids in its Spartanburg location. Mercedes, meanwhile, is testing the use of Apptronik humanoids in Berlin-Marienfelde. Since such robots can navigate facilities independently and take on a wide range of tasks, AI will play an important role.
Jenny Shern, general manager at robot builder NexCobot, says: “Integrating AI to interpret human commands and dynamically generate task-specific actions is key to enabling real-world household applications. For example, [for] an AI-powered humanoid robot to ‘clean up the table’, it would need to understand the context, recognise objects and make a decision on what action [to take].”
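The decomposition Shern describes – context, object recognition, then action selection – can be sketched in a few lines. This is a hypothetical toy, not NexCobot's system: real robots use vision models and learned policies, for which the hand-written rule table and function names below stand in.

```python
# Hypothetical sketch of turning a high-level voice command into
# object-specific actions. The rule table stands in for the learned
# perception and policy models a real humanoid would use.
ACTION_RULES = {
    "cup": "place_in_dishwasher",
    "crumbs": "wipe_into_bin",
    "book": "return_to_shelf",
}

def plan_cleanup(command, perceived_objects):
    """Map 'clean up the table' plus perceived objects to an action plan."""
    if "clean up" not in command.lower():
        raise ValueError(f"unsupported command: {command!r}")
    plan = []
    for obj in perceived_objects:
        action = ACTION_RULES.get(obj, "ask_human")  # unknown object: defer
        plan.append((action, obj))
    return plan

print(plan_cleanup("Clean up the table", ["cup", "book", "laptop"]))
# → [('place_in_dishwasher', 'cup'), ('return_to_shelf', 'book'), ('ask_human', 'laptop')]
```

The point of the sketch is the shape of the problem: the command alone is underspecified, so the robot must combine it with what it perceives, and sensibly defer when it encounters something its policy does not cover.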
In March 2025, Hyundai Motor Group Metaplant America opened in Georgia in the US. The plant features a smörgåsbord of advanced technologies: AI-driven robots, drones and digital twins support car production by planning production tasks, managing inventory and finalising inspections. Hyundai decided to build the factory from the ground up with the newest manufacturing technologies and approaches in mind.
Working on androids
Developers are also working on android robots – humanoid robots that look and even feel like humans. More intuitive integration into human environments and more natural interactions with human users are among the motivations for such robots, although successful adoption and diffusion of androids likely lies far in the future. Nevertheless, researchers are looking at creating systems resembling human physiology; bionic muscles, electronic skins and flexible skeletons are components of such machines.
For example, Clone Robotics is working on bionic muscles and researchers at the Massachusetts Institute of Technology have created flexible skeletons that support robots with such bionic muscles. Meanwhile, engineers at Johns Hopkins University have designed a biomimetic prosthetic hand that employs tactile sensors to enable grabbing of objects. University of Tokyo researchers even created living skin with the ability to heal itself.
Currently, many existing applications suffer from the uncanny valley effect: near-human machines can be disturbing, leaving observers with a sense of unease. Nevertheless – as with generative AI images and videos in past years – androids’ features will improve over time. AI and virtual applications will then support the design and management of such machines, including the creation of natural movements and the representation of situation-appropriate emotions.
Robots and XR are becoming powerful allies
AI will provide data to feed virtual environments, transform the growing amount of sensor data into curated and usable information, and facilitate the creation of powerful simulations within digital twins. These twins – or XR applications more generally – will need to adjust to make full use of AI-generated content and models to serve robotic applications and fleets most effectively.
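The step of transforming raw sensor streams into curated, usable information for a twin can be made concrete with a small sketch. This is an assumed pipeline, not tied to any specific product; the function and thresholds below are invented for illustration: rather than streaming every raw sample into the virtual environment, readings are condensed into a summary plus flagged anomalies.

```python
# Illustrative sketch (assumed pipeline, no specific product): raw sensor
# readings are condensed into a curated summary a digital twin can consume,
# instead of streaming every raw sample into the virtual environment.
from statistics import mean

def curate(readings, anomaly_threshold):
    """Condense raw readings from one sensor into a twin-ready summary."""
    return {
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": [r for r in readings if r > anomaly_threshold],
    }

temps = [41.8, 42.1, 41.9, 67.4, 42.0]  # one spike in motor temperature
print(curate(temps, anomaly_threshold=60.0))
# → {'mean': 47.04, 'max': 67.4, 'anomalies': [67.4]}
```

In practice this curation step is where AI earns its keep: learned models, not fixed thresholds, decide which of the flood of readings matter enough to update the twin.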
Cathy Hackl, founder of Future Dynamics and futurist in residence at Nokia, describes how digital and physical environments will unite: “AI’s next great leap will be powered by hardware. As the digital and physical worlds merge, frontier technologies like spatial computing, extended reality and AI-powered wearables are ushering in a new computing paradigm.”
She notes that many new products and trademark filings indicate that AI firms are moving towards wearables and robots, expanding their reach to physical devices. Such a development will require the integration of massive amounts of advanced data to enable spatial computing. Ultimately, spatial computing will become the connective tissue that brings together the digital and physical worlds.
She also mentions Nvidia’s Huang, who sees the shift towards agentic AI creating a “multitrillion-dollar industry”, adding: “Whether embedded in smartglasses, humanoid robots or wearables, these agents will observe, adapt and collaborate. Together, innovations in hardware, advances in spatial computing and the rise of AI agents are creating a new foundation for how we interact with machines and information.”
Moving robots into the real world by putting them in virtual ones first
AI will become a powerful enabler of robotics applications, boosting the impact digital twins will have on future strategic decisions and commercial operations, and accelerating the usability of XR applications. Not all robotics will require AI, not all digital twins will embrace AI and not all XR will make use of AI – but such pairings will become more common. Consumers and professional users will come to expect advanced applications.
The benefits of marrying increasingly available sensor and synthetic data, AI, robotics, digital twins and XR are not ubiquitous at the moment, but use of these technologies in combination will first drip into the commercial fabric and over time spread across industries.
After all, related systems first require redesign to accommodate the new paradigm, and integration into existing systems can prove challenging. Most of all, designers, managers and users need to familiarise themselves with all of the technologies involved to create applications in which the whole is more than the sum of its parts.
Read more about AR, XR
- The emerging network of mutualistic technologies: Four major technologies are attracting large investments – not only will these open new opportunities and applications, but each seems set to act as an enabler for the other technologies’ applications.
- Critical Manufacturing and Twinzo unveil smart factory digital twin visualisation: New connector designed to unlock ‘seamless’ manufacturing execution systems-to-3D twin integration, empowering manufacturers with dynamic operational visibility and strategic insights.
- Ericsson, SoftBank team to seek out 6G, XR, AI potential: Leading comms tech and service provider forges strategic partnership with telco and IT conglomerate to drive innovation in technologies towards 2030.
- Comms tech consortium begins XR trials on 5G Standalone network: Collaboration between Ericsson, T-Mobile and Qualcomm looks to pioneer the next wave of 5G low latency operator services, beginning with supporting demanding mobile immersive experiences.
Martin Schwirn is the author of Small data, big disruptions: How to spot signals of change and manage uncertainty (ISBN 9781632651921). He is also senior adviser for strategic foresight at Business Finland, helping startups and incumbents to find their position in tomorrow’s marketplace.
