Digital twinning is creating a virtual version of a real-world thing and keeping the two in sync in real time.
That applies to more than just robotics.
An example of a digital twin could be seen in self-driving cars.
Self-driving cars can benefit from having a digital twin (virtual simulation) of the environment they're in, but you can also have a digital twin of the car itself.
It's also very dangerous to put a self-driving car you're working on in the streets. Using digital twins, you're able to develop the AI for the self-driving car in a safe environment that's as real to the outside world as possible.
When the car is on the real road, there are real consequences if something goes wrong.
Digital twins help solve this issue by creating a virtual simulation of the car's environment and the car itself.
When you put the two together, you can safely try new algorithms and use machine learning, which learns by making mistakes. Everything happens in a virtual environment, so if something goes wrong, you don't have to worry about wrecking your car or harming people.
If you're not bringing in experts on digital twins, you could quickly run into problems once you deploy your AI in the real world.
If the gravity, collisions, sizes, or any other physical properties are off in your simulation, your robot will not function as intended.
Let's say you're using Unity to create an environment. You should model that environment accurately, and especially the sensors on your robot.
If the robot's sensors are off, it won't behave the same way in the real world as it did in the virtual environment.
Your virtual environment needs to match the real one very accurately.
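One simple way to close the gap between a simulated sensor and the real hardware is to give the virtual sensor the same noise profile as the physical one. Here is a minimal sketch of that idea; the function name and the 2 cm noise figure are illustrative assumptions, not values from any particular sensor:

```python
import random

def noisy_reading(true_distance_m, stddev_m=0.02, rng=None):
    """Add Gaussian noise to a perfect simulated distance reading.

    Real sensors are never exact; giving the simulated sensor a
    realistic noise profile keeps the digital twin honest.
    (stddev_m=0.02 is an illustrative assumption, not a real spec.)
    """
    rng = rng or random.Random()
    return true_distance_m + rng.gauss(0.0, stddev_m)

# A simulated wall exactly 2.0 m away, read through a "real-like" sensor:
rng = random.Random(42)  # seeded for reproducibility
samples = [noisy_reading(2.0, stddev_m=0.02, rng=rng) for _ in range(1000)]
mean = sum(samples) / len(samples)  # individual readings jitter; the mean stays near 2.0
```

An AI trained only on the perfect readings tends to overfit to precision it will never see on real hardware.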
If you've ever played video games, you've undoubtedly run into "invisible walls."
An invisible wall is a boundary in a video game that limits where a player character can go in a particular area but does not appear as a physical obstacle.
Invisible walls shouldn't happen in a digital twin. If there are no physical obstacles, the robot must be able to reach the location, especially if you're training your robot with LIDAR.
LIDAR is a sensor that takes very accurate distance measurements, and it's one of the main ways robots navigate their environment.
Summary: If you're not an expert on digital twins, we advise that you hire experts like us to help make your digital twin a success. Otherwise, you might spend a lot of time and money developing virtual environments that don't work in the real world.
Having a strong computer will undoubtedly help you run as many simulated tests as you want and speed up your machine learning efforts.
However, you don't need to own one of these supercomputers, and neither does your staff.
You can easily rent a strong cloud machine from providers like Google and AWS to run your virtual environment.
Or if you already have a robust machine, but your employees don't, you can give your employees access to the machine via the cloud.
You can use digital twins for a wide range of purposes, including
machine learning, diagnostics, and algorithm testing.
Using digital twins also allows companies to experiment in a low-risk environment.
You don't have to spend money on procurement, materials, and production, and you know much sooner if you need to make changes before moving forward.
Often, you know you want to add robots to your warehouse, but you don't know how. This is where digital twins come in.
With the almost unlimited scale in a low-risk environment, you can find the perfect solution for your warehouse.
The goal is to create the perfect robots before investing millions of dollars assembling them in your warehouse. Also, since tests are done in a digital environment, you don't need to stop your workers from doing their jobs or build a separate warehouse for testing.
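Because simulated trials are essentially free, you can sweep over design choices that would be prohibitively expensive to test physically. The toy model below sketches that workflow for one question, "how many robots should this warehouse run?"; the throughput numbers and the congestion penalty are invented for illustration, not real warehouse data:

```python
def shift_throughput(num_robots, picks_per_robot=30.0, congestion_penalty=0.015):
    """Toy throughput model (all parameters are illustrative assumptions):
    each robot adds picks per hour, but every extra robot slows the
    others slightly through aisle congestion."""
    efficiency = max(0.0, 1.0 - congestion_penalty * (num_robots - 1))
    return num_robots * picks_per_robot * efficiency

# Sweep fleet sizes virtually instead of buying hardware for each trial.
results = {n: shift_throughput(n) for n in range(1, 51)}
best_fleet = max(results, key=results.get)  # fleet size with peak throughput
```

In a real digital twin the `shift_throughput` call would be replaced by a full physics simulation of the warehouse, but the shape of the experiment, "try every configuration, keep the best", is the same.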
You have it all in the digital world.
Gazebo is an open-source 3D robotics simulator. Gazebo simulates real-world physics with high fidelity.
It helps developers rapidly test algorithms and design robots in digital environments.
They've branded themselves as "robotic simulation made easy," but there are many tools out there that are much easier to use and help speed up the process.
Robotics simulation is an ever-growing space. Companies are investing more and more money to improve their workflow through robotic simulation.
Robotic simulation saves a lot of time and money because it allows people to test how robots work without huge investments.
We have created our own robotic simulator using Unity's powerful game engine.
It helps you replicate gravity, friction, torques, and any other real-life conditions that could affect your simulation's success.
This is essential to your robotics development work. It would be terrible to build a robot that works perfectly in a simulation whose gravity doesn't match the real world.
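A tiny numerical sketch shows why a mis-set physics constant poisons everything downstream. Here a falling object is stepped with simple Euler integration; any timing a robot learns in simulation (say, when to close a gripper under a dropping part) shifts as soon as the gravity constant is wrong. The function is an illustration, not any engine's actual API:

```python
def drop_time(height_m, gravity=9.81, dt=0.0001):
    """Step a falling object until it reaches the floor (semi-implicit Euler).

    Returns the simulated fall time in seconds. If the engine's gravity
    constant is wrong, every timing learned from this simulation is wrong too.
    """
    y, v, t = height_m, 0.0, 0.0
    while y > 0.0:
        v += gravity * dt   # velocity update
        y -= v * dt         # position update
        t += dt
    return t

t_earth = drop_time(1.0)                # real gravity, roughly 0.45 s
t_wrong = drop_time(1.0, gravity=5.0)   # a mis-set engine constant, roughly 0.63 s
```

A robot tuned against `t_wrong` would consistently act about 40% too late in the real world.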
Gazebo helps you integrate a multitude of sensors, and it gives you the tools to test these sensors and develop your robots to best use them.
Suppose you don't have access to robotic hardware, or you want to test hundreds of robots simultaneously. That's impossible without a robotic simulator like Gazebo.
Even if you have access to hardware, Gazebo is still a useful tool because it allows you to test your robotic design before implementing it in the real world. This is why companies are investing so much money into robotic simulators and digital twins. They want to increase their manufacturing processes' workflow and speed without spending too much money on hardware.
Since Gazebo is open-source software, there are also many 3rd party plugins and solutions that help you solve specific problems you might come across or speed up your workflow.
Gazebo is continuously updating, with its latest release being Gazebo 11.
It's tough to import 3D models into Gazebo, and if you're not a 3D modeler yourself, you may need to find someone who can prepare the files for Gazebo.
If you're using more popular programs like Unity, it will be much easier to import these models and have more realistic testing environments.
You can even re-create your entire warehouse in Unity.
Installing Gazebo is also a challenging task. Its Windows installation has 18 steps in total, which will be difficult for someone who isn't a developer and isn't familiar with command-line installation.
NVIDIA Isaac SDK is the first open-source Robotic AI Development Platform with Simulation, Navigation, and Manipulation.
It’s a robust platform that helps you build smarter robots.
NVIDIA Isaac SDK heavily relies on AI.
As they put it: “AI makes it possible for robots to perceive and interact with their environments in novel ways, enabling them to perform tasks that were unthinkable—until now.”
NVIDIA Isaac SDK comes with a collection of powerful GPU-powered algorithms, frameworks, and basic applications that support accelerated robotic development. It also works hand-in-hand with Isaac SIM, which allows for the development, testing, and training of robots in a virtual environment.
In short: NVIDIA Isaac SDK heavily uses GPUs to increase performance and help you run better simulations faster.
NVIDIA Isaac SDK can help you create, modify & simulate your entire factory, even before installing any equipment.
There are a lot of premade pallets, cardboard boxes, shelves, totes, bins, and everything that you’d see in your standard warehouse.
It’s all out there, in the simulation.
The great thing about the simulation is that the physics are amazingly accurate.
You don’t want to spend months in a simulation trying to create the perfect Robot for your warehouse and then having it all crash and burn because the gravity in your simulation is different from real-life gravity.
You can simulate your parts with 3D models. Add in the weights and centers of gravity, and the simulation will interact with them much as the actual manufacturing process would.
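The weight and center-of-gravity data mentioned above is what lets a physics engine make a part tip, tumble, and rest realistically. As a small illustration (not the Isaac SIM API; the part values are made up), here is how the combined center of mass of a stacked assembly falls out of the per-part data:

```python
def center_of_mass(parts):
    """parts: list of (mass_kg, (x, y, z)) tuples, one per 3D-modelled part.

    Returns (total_mass, overall center of mass). A physics engine needs
    exactly this to decide whether a stacked assembly balances or topples.
    """
    total = sum(m for m, _ in parts)
    com = tuple(sum(m * p[i] for m, p in parts) / total for i in range(3))
    return total, com

# A toy pallet load (illustrative numbers): a heavy base plate with a
# lighter box stacked on top.
mass, com = center_of_mass([
    (8.0, (0.0, 0.0, 0.05)),   # base plate, its own CoM 5 cm up
    (2.0, (0.0, 0.0, 0.30)),   # box, its own CoM 30 cm up
])
```

Get these numbers wrong in the twin and a gripper strategy that works in simulation will drop or tip the real part.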
Besides running cool simulations, you can also use NVIDIA’s AI in your simulation, for example to create real-time AI simulations using the power of their RTX graphics cards.
Anyone can download the software right away by simply heading to NVIDIA’s download page, but this isn’t the biggest obstacle.
Learning new programming languages is usually the biggest challenge, but fortunately for developers fresh out of college, the whole thing can be programmed in Python using the Isaac SDK.
Before Isaac, industrial applications were generally programmed in ladder logic and other, more archaic languages.
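To see the difference, here is the kind of interlock that would traditionally be drawn as a ladder-logic rung on a PLC, written instead as a few lines of Python. The scenario and names are illustrative, not from any real control program:

```python
def conveyor_motor(part_present, emergency_stop, motor_running):
    """A classic ladder-logic "seal-in" rung expressed in Python:
    start the conveyor motor when a part arrives, keep it latched on
    once running, and drop it the instant the e-stop trips."""
    if emergency_stop:
        return False                      # stop contact breaks the rung
    return part_present or motor_running  # start contact OR seal-in contact

# Evaluate the rung each control cycle, feeding back the motor state:
state = conveyor_motor(part_present=True, emergency_stop=False,
                       motor_running=False)   # part arrives: motor starts
state = conveyor_motor(part_present=False, emergency_stop=False,
                       motor_running=state)   # part gone: seal-in keeps it on
```

The same logic as a rung of contacts and coils, but readable, testable, and versionable like any other code.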
Being a Python-based platform is excellent for companies, too. If you’re looking to hire a team for the job, you’ll be able to tap into a much larger talent pool of excellent developers, since Python is a far more popular language.
The development of robotic applications is still a tough job for most companies, especially when most developers relied on Gazebo, whose developers are much smaller in scale than NVIDIA and now Unity. The fact that such prominent companies are getting into robotics shows just how powerful and useful these tools are.
For accelerated robotic development, NVIDIA provides a collection that helps with development, training, and testing, greatly reducing the complexity of robotic development. Developers can now try the Isaac collection, which is well documented and has proper community support.
Nvidia Isaac SDK consists of several parts that work well together to create some pretty powerful simulations.
Isaac Engine is a software framework for building modular robotic applications.
It consists of computational graphs and CUDA messaging, visualization tools, and a Python API and ROS bridge.
It’s used to build robotic applications based on many small components that pass messages between each other and can be customized any way you like.
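The idea of many small components passing messages to each other can be sketched with a toy publish/subscribe bus. To be clear, this is a generic illustration of the pattern, not the real Isaac Engine API; all names here are invented:

```python
class MessageBus:
    """Minimal publish/subscribe bus: a toy stand-in for the message
    passing that wires small components together in frameworks like
    Isaac Engine. (Illustrative only, not the actual Isaac API.)"""

    def __init__(self):
        self._subscribers = {}  # channel name -> list of callbacks

    def subscribe(self, channel, callback):
        self._subscribers.setdefault(channel, []).append(callback)

    def publish(self, channel, message):
        for callback in self._subscribers.get(channel, []):
            callback(message)

# Wire a "lidar" component to a "planner" component through the bus:
bus = MessageBus()
received = []
bus.subscribe("scan", received.append)             # planner listens for scans
bus.publish("scan", {"ranges": [2.0, 1.9, 2.1]})   # lidar publishes one scan
```

Because components only agree on channel names and message shapes, any one of them can be swapped or customized without touching the rest, which is exactly the modularity the framework is after.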
Isaac GEMs are a collection of GPU-powered algorithms that help accelerate the development of robotic applications.
It consists of:
Isaac Sim is a virtual robotics laboratory and a high-fidelity 3d world simulator that accelerates research, design, and development by reducing cost and risk.
This helps you test robots in different scenarios.
Robots can be simulated with virtual sensors (RGB, stereo, depth, LIDAR, IMU).
It consists of:
These are basic applications that make use of the NVIDIA Isaac SDK engine to showcase the real power of the NVIDIA Isaac SDK and help you get started quickly.
One of the biggest problems with old simulation software is that you don’t know how your automation will respond to your environment.
NVIDIA Isaac acknowledges that you need an excellent way to simulate what your parts will do whenever you’re automating.
This speeds up your development time by clearing up the unknown unknowns.
As a result, robot development and deployment are much more rapid. And as we mentioned before, you can now get a bigger talent pool of programmers involved in your projects.
The best part of the NVIDIA Isaac SDK is using cloud computing to do all of your development.
Anyone in the world can easily rent an instance like Amazon Elastic Compute Cloud and develop these programs remotely, even without a $2,000 graphics card.
Your average laptop should be able to do the job.
The 2020.1 version brought us a lot of new possibilities for Nvidia Isaac.
Here’s a summary from Nvidia’s official website: