Design Your Robot on Hardware-in-the-Loop with NVIDIA Jetson



Hardware-in-the-loop (HIL) testing is a powerful tool used to validate and verify the performance of complex systems, including robotics and computer vision. This post explores how HIL testing is being used in these fields with the NVIDIA Isaac platform.

The NVIDIA Isaac platform consists of NVIDIA Isaac Sim, a simulator that provides a simulated environment for testing robotics algorithms, and NVIDIA Isaac ROS, hardware-accelerated software optimized for NVIDIA Jetson that contains machine learning, computer vision, and localization algorithms. By using HIL testing with this platform, you can validate and optimize the performance of your robotics software stack, leading to safer, more reliable, and more efficient products.

In this post, we discuss the various components of the HIL system, including the NVIDIA Isaac platform software and hardware. We examine how they work together to optimize the performance of robotics and computer vision algorithms. We also explore the benefits of using the NVIDIA Isaac platform for HIL testing and compare it to other testing methodologies.

NVIDIA Isaac Sim

NVIDIA Isaac Sim, based on Omniverse, provides a photo-real, physically accurate, virtual environment for testing robotics and computer vision algorithms. It enables you to fine-tune performance without the risk of damaging physical hardware. The simulator is also highly customizable, making it ideal for testing a wide range of scenarios and use cases.

You can create smarter and more advanced robots using NVIDIA Isaac Sim. The platform offers a suite of tools and technologies to help you build sophisticated algorithms that enable robots to perform complex tasks.

With NVIDIA Isaac Sim, you can easily collaborate on, share, and import environments and robot models in Universal Scene Description (USD) format by using Omniverse Nucleus and Omniverse Connectors. Combined with the Isaac ROS/ROS 2 interface, full-featured Python scripting, and plug-ins for importing robot and environment models, this integration enables a more efficient and effective approach to robotics simulation.

Figure 1. NVIDIA Isaac Sim stack

You can interact with NVIDIA Isaac Sim through ROS, ROS 2, or Python. You can run NVIDIA Isaac Gym and NVIDIA Isaac Cortex, generate synthetic data, or use the simulator for your digital twin.
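If you prefer scripting, a standalone Python workflow takes only a few lines. The following is a minimal sketch, assuming an Isaac Sim install whose bundled Python interpreter (python.sh) provides the omni.isaac.kit and omni.isaac.core modules:

# standalone_sim.py -- minimal standalone Isaac Sim script (a sketch).
# Run it with Isaac Sim's bundled python.sh so the omni.isaac.* modules resolve.
from omni.isaac.kit import SimulationApp

# Start the simulator; set "headless": True to run without a viewport.
simulation_app = SimulationApp({"headless": False})

# omni.isaac.core can only be imported after SimulationApp has started.
from omni.isaac.core import World

world = World()
world.scene.add_default_ground_plane()
world.reset()

# Step physics and rendering for a few hundred frames.
for _ in range(300):
    world.step(render=True)

simulation_app.close()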

Internally, NVIDIA Isaac Sim uses a customized version of ROS Noetic, built with roscpp, for the ROS bridge so that it works seamlessly with the Omniverse framework and Python 3.7. This version is also compatible with ROS Melodic.

NVIDIA Isaac Sim currently supports ROS 2 Foxy and Humble for the ROS 2 bridge. We recommend using Ubuntu 20.04 for ROS 2.
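When scripting, the ROS 2 bridge can also be enabled programmatically. Here is a short sketch, assuming the omni.isaac.ros2_bridge extension name used by recent Isaac Sim releases; source your Foxy or Humble workspace before launching so the bridge can find the ROS 2 libraries:

# enable_ros2_bridge.py -- sketch: turn on the ROS 2 bridge from a
# standalone script. The extension name is an assumption to verify
# against your Isaac Sim version.
from omni.isaac.kit import SimulationApp

simulation_app = SimulationApp({"headless": True})

from omni.isaac.core.utils.extensions import enable_extension

enable_extension("omni.isaac.ros2_bridge")
simulation_app.update()  # let the extension finish loading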

For more information, see NVIDIA Isaac Sim.

NVIDIA Isaac ROS

Built on top of the Robot Operating System (ROS), NVIDIA Isaac ROS offers a range of advanced features and tools to help you build smarter, more capable robots. The features include advanced mapping and localization capabilities, as well as object detection and tracking. For more information about the latest features, see Isaac ROS Developer Preview 3.

By using Isaac ROS as part of the NVIDIA Isaac platform, you can create sophisticated robotics applications that perform complex tasks with precision and accuracy. With its powerful computer vision and localization algorithms, Isaac ROS is a valuable tool for any developer looking to create advanced robotics applications.

Figure 2. Isaac ROS and the software layer

Isaac GEMs for ROS are a set of GPU-accelerated ROS 2 packages released for the robotics community as part of the NVIDIA Jetson platform.

Isaac ROS offers a set of packages for perception and AI, as well as complete pipelines known as NVIDIA Isaac Transport for ROS (NITROS). The packages have been optimized for NVIDIA GPUs and the Jetson platform with functionality for image processing and computer vision.

In this post, we include examples of how to run HIL for the following packages:

  • NVIDIA Isaac ROS vslam
  • NVIDIA Isaac ROS apriltag
  • NVIDIA Isaac ROS nvblox
  • NVIDIA Isaac ROS proximity segmentation

For more information about other Isaac ROS packages and the latest Isaac ROS Developer Preview 3, see NVIDIA Isaac ROS.
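Because ROS 2 launch files are plain Python, one convenient way to bring these packages up on the Jetson is a single launch description. The sketch below includes the visual SLAM launch file; the package and launch-file names follow the Isaac ROS repositories but are assumptions, so check the isaac_demo scripts for the exact entry points:

# demo_bringup.launch.py -- sketch: include the Isaac ROS visual SLAM
# launch file from one bringup file. Package and file names are
# assumptions; verify them against your Isaac ROS checkout.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    visual_slam_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(
            os.path.join(
                get_package_share_directory("isaac_ros_visual_slam"),
                "launch",
                "isaac_ros_visual_slam.launch.py",
            )
        )
    )
    return LaunchDescription([visual_slam_launch])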

Hardware specifications and setup

For this test, you need a workstation or laptop and an NVIDIA Jetson module:

  • An x86/64 machine with Ubuntu 20.04 installed
  • NVIDIA RTX graphics card
  • Display
  • Keyboard and mouse
  • NVIDIA Jetson AGX Orin or NVIDIA Jetson Orin NX
  • NVIDIA JetPack 5+ (tested 5.1.1)
  • Router
  • Ethernet cables

Figure 3. Hardware setup

Using a wired Ethernet connection instead of Wi-Fi is often preferred when transferring large amounts of data between devices, such as an NVIDIA Jetson module and a PC. This is because Ethernet connections offer faster and more reliable data transfer rates, which can be particularly important for real-time data processing and machine-learning tasks.

To establish an Ethernet connection between the Jetson module and your PC, follow these steps:

  • Get an Ethernet cable and a router with free Ethernet ports.
  • Plug one end of the cable into the device’s Ethernet port.
  • Plug the other end of the cable into any free Ethernet port on the router.
  • Turn on the device and wait for it to fully start up.
  • Check that the Ethernet connection works by looking for the Ethernet icon or by using a network diagnostic tool, such as ifconfig or ip addr (see the connectivity sketch after this list).
  • When your PC and NVIDIA Jetson are ready and connected, follow the installation instructions on the /NVIDIA-AI-IOT/isaac_demo GitHub repository.
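Before moving on, it can save time to confirm both plain reachability and ROS 2 discovery across the link. Here is a quick sketch; the Jetson address is a placeholder, and both machines must share the same ROS_DOMAIN_ID:

# connectivity_check.py -- sketch: verify the PC can reach the Jetson over
# Ethernet and that DDS discovery works across the link.
import subprocess

import rclpy
from rclpy.node import Node

JETSON_IP = "192.168.1.42"  # hypothetical address; use your Jetson's IP

# 1. Plain ICMP reachability.
result = subprocess.run(["ping", "-c", "3", JETSON_IP], capture_output=True)
print("ping ok" if result.returncode == 0 else "ping failed")

# 2. DDS discovery: nodes running on the Jetson should appear here,
#    provided both machines share the same ROS_DOMAIN_ID.
rclpy.init()
node = Node("connectivity_check")
rclpy.spin_once(node, timeout_sec=2.0)  # give discovery a moment
print("discovered nodes:", node.get_node_names())
node.destroy_node()
rclpy.shutdown()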

Run the demo and drive

In the first step, run NVIDIA Isaac Sim on your workstation. The ./isaac_ros.sh script runs a demo with the Carter robot.

Carter 1.0 is a robotics platform built on a differential base from Segway, with a Velodyne VLP-16 for 3D range scans, a ZED camera, an IMU, and a Jetson module as the heart of the system. Together with custom mounting brackets, the result is a powerful and robust demonstration platform for the NVIDIA Isaac navigation stack.

When the simulation starts, you see the stereo camera outputs from NVIDIA Isaac Sim. With its two cameras, the robot is ready to receive inputs from Isaac ROS running on your NVIDIA Jetson module.

Figure 4. Carter on NVIDIA Isaac Sim
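A quick way to confirm that the simulated stereo streams are actually reaching the Jetson is to list the image topics that DDS discovery can see. Here is a small rclpy sketch; the exact topic names depend on how the isaac_demo scene names its cameras:

# list_camera_topics.py -- sketch: print image topics published by Isaac Sim
# as seen from the Jetson. Exact topic names depend on the demo scene.
import rclpy
from rclpy.node import Node

rclpy.init()
node = Node("topic_probe")
rclpy.spin_once(node, timeout_sec=2.0)  # let discovery populate

for name, types in node.get_topic_names_and_types():
    if "sensor_msgs/msg/Image" in types:
        print(name, types)

node.destroy_node()
rclpy.shutdown()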

Isaac ROS packages to try in this demo

In this post, we discuss some NVIDIA Isaac ROS packages for AMR robotics or for use with your wheeled robot. We focus on packages for localization, mapping, and AprilTag detection, but you can modify the repository to test other packages as needed.

Isaac ROS Visual SLAM

NVIDIA Isaac ROS Visual SLAM uses a combination of visual odometry and simultaneous localization and mapping (SLAM).

Visual odometry is a technique used to estimate the position of a camera relative to its starting position. This technique involves iterative processes that analyze two consecutive input frames or stereo pairs to identify sets of keypoints. By matching the keypoints between these two sets, you can determine the camera's translation and relative rotation between frames.

SLAM enhances the accuracy of visual odometry by incorporating previously acquired knowledge of the trajectory. By detecting whether the current scene has been seen before (a loop in the camera's movement), it can optimize previously estimated camera poses.

Figure 5. Isaac ROS vslam and nvblox

Figure 6. Isaac ROS vslam and nvblox running status
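In practice, the package's main product is a continuously updated pose estimate published as odometry. The sketch below prints it; the /visual_slam/tracking/odometry topic name matches recent Isaac ROS releases but is an assumption to confirm with ros2 topic list on your setup:

# vslam_pose_listener.py -- sketch: print the pose estimated by Isaac ROS
# Visual SLAM. The topic name is an assumption; verify it with
# `ros2 topic list` on your system.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry


def on_odometry(msg: Odometry) -> None:
    p = msg.pose.pose.position
    print(f"x={p.x:.2f} y={p.y:.2f} z={p.z:.2f}")


rclpy.init()
node = Node("vslam_pose_listener")
node.create_subscription(Odometry, "/visual_slam/tracking/odometry", on_odometry, 10)
rclpy.spin(node)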

Isaac ROS nvblox

The nvblox package helps build a 3D model of the environment around a robot in real time using sensor observations. This model is used by path planners to create collision-free paths. The package uses NVIDIA CUDA technology to accelerate the process for real-time performance. This repository includes ROS 2 integration for the nvblox library.
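To get a feel for what nvblox hands to a planner, you can watch its 2D distance-map slice. The topic and message type below follow the nvblox ROS 2 wrapper but are assumptions to verify against your release:

# nvblox_slice_listener.py -- sketch: report the size and resolution of the
# 2D distance-map slice that nvblox publishes for planners. Topic and
# message names are assumptions; check them with `ros2 topic list -t`.
import rclpy
from rclpy.node import Node
from nvblox_msgs.msg import DistanceMapSlice


def on_slice(msg: DistanceMapSlice) -> None:
    print(f"slice {msg.width}x{msg.height} @ {msg.resolution:.3f} m/cell")


rclpy.init()
node = Node("nvblox_slice_listener")
node.create_subscription(DistanceMapSlice, "/nvblox_node/map_slice", on_slice, 10)
rclpy.spin(node)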

Figure 7. Isaac ROS nvblox workflow

Figure 8. Isaac ROS vslam output

The Isaac ROS vslam package is always running in this demo if you followed the /NVIDIA-AI-IOT/isaac_demo instructions.

NVIDIA Isaac ROS apriltag

The ROS 2 apriltag package uses an NVIDIA GPU to accelerate AprilTag detection in an image and publishes the pose, ID, and other metadata for each tag. This package is comparable to the CPU-based ROS 2 node for AprilTag detection.

These tags are fiducials used to drive a robot or a manipulator to start an action or complete a job from a specific point. They are also used in augmented reality to calibrate a visor's odometry. The tags come in many families, but all are simple to print with a desktop printer, such as the ones in Figure 9.
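To consume the detections in your own node, subscribe to the detection array and read out the ID and pose of each tag. The interface package and topic name below follow the Isaac ROS apriltag repository but are assumptions; confirm them with ros2 topic list -t:

# apriltag_listener.py -- sketch: print the ID and pose of each detected
# AprilTag. Message package and topic name are assumptions; verify with
# `ros2 topic list -t`.
import rclpy
from rclpy.node import Node
from isaac_ros_apriltag_interfaces.msg import AprilTagDetectionArray


def on_detections(msg: AprilTagDetectionArray) -> None:
    for det in msg.detections:
        pos = det.pose.pose.pose.position  # pose is a PoseWithCovarianceStamped
        print(f"tag {det.family}:{det.id} at x={pos.x:.2f} y={pos.y:.2f}")


rclpy.init()
node = Node("apriltag_listener")
node.create_subscription(AprilTagDetectionArray, "tag_detections", on_detections, 10)
rclpy.spin(node)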

Figure 9. Examples of AprilTags in place of QR codes

Figure 10. Demo of Isaac ROS apriltag detection

Isaac ROS proximity segmentation

The isaac_ros_bi3d package employs a Bi3D model that has been optimized to carry out stereo-depth estimation through binary classification. This process is used for proximity segmentation, which is useful in identifying whether an obstacle is present within a certain range and helps to prevent collisions while navigating through the environment.
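A minimal consumer can watch the segmentation output to gauge obstacle proximity. The topic name and DisparityImage type below are assumptions based on the isaac_ros_bi3d node; check the package documentation for your release:

# bi3d_listener.py -- sketch: watch the Bi3D proximity-segmentation output.
# Topic name and message type are assumptions; check the isaac_ros_bi3d
# documentation for your release.
import rclpy
from rclpy.node import Node
from stereo_msgs.msg import DisparityImage


def on_output(msg: DisparityImage) -> None:
    print(f"segmented frame {msg.image.width}x{msg.image.height}")


rclpy.init()
node = Node("bi3d_listener")
node.create_subscription(DisparityImage, "bi3d_node/bi3d_output", on_output, 10)
rclpy.spin(node)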

Figure 11. Isaac ROS proximity segmentation

Drive Carter from rviz

When rviz is ready and everything is running, it shows the output in Figure 12: Carter at the center of the map, with all the blocks around it.

Figure 12. rviz with map building using Isaac ROS vslam and Isaac ROS nvblox
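Whatever tool you drive with, the base ultimately consumes velocity commands, so you can also steer Carter programmatically with geometry_msgs/Twist messages. The /cmd_vel topic is the common convention and an assumption here; confirm which topic the demo's base controller subscribes to:

# drive_forward.py -- sketch: nudge Carter forward by streaming Twist
# commands. The /cmd_vel topic is an assumption; confirm which topic the
# demo's base controller listens on.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

rclpy.init()
node = Node("drive_forward")
pub = node.create_publisher(Twist, "/cmd_vel", 10)

msg = Twist()
msg.linear.x = 0.2   # m/s forward
msg.angular.z = 0.1  # rad/s gentle turn

node.create_timer(0.1, lambda: pub.publish(msg))  # 10 Hz command stream
rclpy.spin(node)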

In the following video, you can see how to use rviz to drive your robot around the environment and watch the map generated by nvblox.

Video 1. HIL on NVIDIA Orin NX with Isaac ROS vslam and nvblox

You can also use Foxglove to test the simulation.

Figure 13. Using Foxglove for simulation testing

Summary

In this post, we showed you how to set up hardware-in-the-loop testing with NVIDIA Isaac ROS on your NVIDIA Jetson module and how to try NVIDIA Isaac Sim. Remember to use a wired connection between your desktop and your Jetson module: streaming all the raw telemetry requires a reliable connection.

Feel free also to test the other NVIDIA Isaac ROS packages added in the /isaac-ros_dev folder. For more information, see the readme.md file at /NVIDIA-AI-IOT/isaac_demo.

To learn more, see the Isaac ROS webinar series.
