Jetson Nano review: is it AI for the masses?

The Jetson Nano is the newest machine learning development platform from Nvidia. Previous iterations of the Jetson platform were aimed at professional developers who wanted to build commercial products at scale. They are powerful, but also expensive. With the Jetson Nano, Nvidia has lowered the entry price and paved the way for a Raspberry Pi-like revolution, this time for machine learning.

The Jetson Nano is a $99 single board computer (SBC) that borrows from the Raspberry Pi design language, with its small form factor, block of USB ports, microSD card slot, HDMI output, GPIO pins, camera connector (which is compatible with the Raspberry Pi camera) and Ethernet port. However, it is not a Raspberry Pi clone. The board is a different size, it supports embedded DisplayPort, and there is a huge heat sink!

The Jetson Nano System on Module (SOM) sits under the heat sink, ready for production. The development kit is basically a carrier board (with all the ports) that holds the module. In a commercial application, designers would build their products to accept the SOM, not the carrier board.

Although Nvidia wants to sell lots of Jetson modules, it is also focused on selling the board (with module) to enthusiasts and hobbyists who may never use the module version, but would like to build projects around the development kit, just as they do with the Raspberry Pi.

GPU

When you think of Nvidia you probably think of graphics cards and GPUs, and rightly so. While graphics processing units are great for 3D gaming, they also turn out to be very good at running machine learning algorithms.

The Jetson Nano has a 128 CUDA core GPU based on the Maxwell architecture. Each generation of Nvidia GPUs is built on a new microarchitecture. That central design is then used to create different GPUs (with different core counts and so on) for that generation. The Maxwell architecture was first used in the GeForce GTX 750 and the GeForce GTX 750 Ti. A second generation Maxwell GPU was introduced with the GeForce GTX 970.

The original Jetson TX1 used a 1024-GFLOP Maxwell GPU with 256 CUDA cores. The Jetson Nano uses a cut-down version of the same processor. According to the boot logs, the Jetson Nano has the same GM20B variant of the second-generation Maxwell GPU, but with half the CUDA cores.

The Jetson Nano comes with a large collection of CUDA demos, from smoke particle simulations to Mandelbrot rendering, with a healthy dose of Gaussian blurring, JPEG encoding and fog simulations.

The potential for fast, smooth 3D games, such as those based on the various 3D engines id Software has released as open source, is good. I couldn't get any of them working yet, but I'm sure that will change.

AI

Having a good GPU for CUDA-based computation and for gaming is fun, but the real power of the Jetson Nano shows when you start using it for machine learning (or AI, as the marketing people like to call it).

Nvidia has an open source project called "Jetson Inference" that runs on all of its Jetson platforms, including the Nano. It demonstrates various machine learning techniques, including object recognition and object detection. For developers, it is an excellent starting point for building real-world machine learning projects. For reviewers, it's a great way to see what the hardware can do!


The object recognition neural network has around 1,000 objects in its repertoire. It can work from still images or from a live camera feed. Similarly, the object detection demo knows about dogs, faces, pedestrians, planes, bottles and chairs.

When running live from a camera, the object recognition demo can process (and label) frames at around 17 fps. The object detection demo, looking for faces, runs at around 10 fps.
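For developers who want to go beyond the bundled demos, the project also exposes Python bindings. Below is a minimal classification sketch along those lines; the network name and image file are just examples, and the exact API can differ between jetson-inference releases.

```python
# Minimal image classification sketch using the jetson-inference Python bindings.
# The model name and image file are examples; the API may vary between releases.
import jetson.inference
import jetson.utils

net = jetson.inference.imageNet("googlenet")   # pre-trained 1,000-class ImageNet model
img = jetson.utils.loadImage("example.jpg")    # load an image into GPU memory

class_id, confidence = net.Classify(img)       # run inference on the GPU
print(net.GetClassDesc(class_id), f"{confidence * 100:.1f}%")
```

Swap the still image for a camera source and put the Classify() call in a loop, and you have roughly what the live demo is doing.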

VisionWorks is Nvidia's SDK for computer vision. It implements and extends the Khronos OpenVX standard and is optimized for CUDA-capable GPUs and SoCs, including the Jetson Nano.

There are several VisionWorks demos available for the Jetson Nano, including feature tracking, motion estimation and video stabilization. These are common tasks needed in robotics, drones, autonomous driving and intelligent video analytics.

Using a 720p HD video feed, feature tracking runs at more than 100 fps, while the motion estimation demo can calculate the motion of around six or seven people (and animals) in a 480p feed at 40 fps.

For videographers, the Jetson Nano can stabilize shaky handheld video at more than 50 fps from a 480p input. What these three demos show are real-time computer vision tasks running at high frame rates, a good foundation for building apps in a wide range of areas that work with video input.

The killer demo Nvidia supplied with my review unit is "DeepStream." Nvidia's DeepStream SDK is an upcoming framework for high-performance streaming analytics applications that can be deployed on site in stores, smart cities, industrial inspection and more.

The DeepStream demo shows real-time video analysis on eight 1080p inputs. Each input is H.264 encoded and represents the kind of stream you would get from an IP camera. It is an impressive demo, showing real-time object tracking of people and cars at 30 fps across all eight video inputs. Remember, this is running on a $99 Jetson Nano!

Raspberry Pi Killer?

In addition to a powerful GPU and a set of advanced AI tools, the Jetson Nano is also a fully working desktop computer running a variant of Ubuntu Linux. As a desktop machine it has several advantages over the Raspberry Pi. First, it has 4GB of RAM. Second, it has a quad-core Cortex-A57-based CPU, and third, it has USB 3.0 (for faster external storage).

While running a full desktop on the Pi can be tricky, the desktop experience on the Jetson Nano is much more pleasant. I could easily run Chromium with five open tabs, LibreOffice Writer, the IDLE Python development environment and a few terminal windows. This is mainly because of the 4GB of RAM, but boot time and application performance are also superior to the Raspberry Pi thanks to the use of Cortex-A57 cores rather than Cortex-A53 cores.

For those interested in some real-world numbers: using my thread test tool (available on GitHub) with eight threads, each calculating the first 12,500,000 prime numbers, the Jetson Nano was able to complete the workload in 46 seconds. That compares with around four minutes on a Raspberry Pi 3 and 21 seconds on my Ryzen 5 1600 desktop.
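The original tool isn't reproduced here, but a rough stand-in for that kind of multi-core benchmark, written as a Python sketch with multiprocessing (processes rather than threads, since Python threads would be serialized by the GIL), looks like this; the limit and worker count are only illustrative.

```python
# Rough stand-in for a multi-core prime benchmark; NOT the author's GitHub tool.
# Eight worker processes each count the primes below an illustrative limit.
import time
from multiprocessing import Pool

def count_primes(limit):
    """Count primes below `limit` with a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"                       # 0 and 1 are not prime
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return sum(sieve)

if __name__ == "__main__":
    start = time.time()
    with Pool(processes=8) as pool:                # eight workers, like the original test
        counts = pool.map(count_primes, [12_500_000] * 8)
    print(counts, f"finished in {time.time() - start:.1f}s")
```

The absolute numbers won't match the author's tool, but it gives a feel for how a multi-core CPU workload scales across different boards.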

Using the OpenSSL "speed" test, which benchmarks the performance of cryptographic algorithms, the Jetson Nano is at least 2.5 times faster than the Raspberry Pi 3, and up to 10 times faster depending on the exact test.
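If you want to run the same kind of comparison on your own boards, one quick way is simply to shell out to the openssl command line tool; the algorithms below are examples, not necessarily the exact set behind the figures above.

```python
# Run a couple of OpenSSL speed benchmarks and print their raw output.
# The chosen algorithms are examples, not the author's exact test set.
import subprocess

for algo in ("sha256", "aes-256-cbc"):
    print(f"=== openssl speed {algo} ===")
    subprocess.run(["openssl", "speed", algo], check=True)
```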

Development environment


The Jetson Nano is an excellent Arm development environment. You get access to all the standard programming languages such as C, C++, Python, Java, JavaScript, Go and Rust, plus you can even run some IDEs. I tried Eclipse from the Ubuntu repository, but it wouldn't start. Ironically, I was able to run a community build of Visual Studio Code without any problems!

GPIO

One of the most important features of the Raspberry Pi is the set of General Purpose Input and Output (GPIO) pins. This allows you to connect the Pi to external hardware such as LEDs, sensors, motors, screens and more.

The Jetson Nano also has a set of GPIO pins, and the good news is that they are compatible with those on the Raspberry Pi. Initial support is limited to the Adafruit Blinka library and userland control of the pins, but all the plumbing is there to provide broad support for many of the available Raspberry Pi HATs.

To test it all out, I took a Pimoroni Rainbow HAT and connected it to the Jetson. The Rainbow HAT library (https://github.com/pimoroni/rainbow-hat) expects a Raspberry Pi along with a number of underlying libraries, so I didn't try to install it. Instead, I adapted one of the example scripts that ship with the Jetson Nano so that I could blink one of the board's LEDs on and off from Python.
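The tweaked script isn't reproduced here, but a minimal blink sketch in the same spirit, using NVIDIA's Jetson.GPIO library that ships on the Nano, looks something like this (the pin number is only an example and depends on how the HAT wires its LEDs):

```python
# Minimal LED blink sketch using NVIDIA's Jetson.GPIO library.
# BOARD pin 12 is only an example; check which header pin your LED actually uses.
import time
import Jetson.GPIO as GPIO

LED_PIN = 12

GPIO.setmode(GPIO.BOARD)                        # number pins by their position on the 40-pin header
GPIO.setup(LED_PIN, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        GPIO.output(LED_PIN, GPIO.HIGH)         # LED on
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)          # LED off
        time.sleep(0.5)
finally:
    GPIO.cleanup()                              # release the pin on exit
```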

Power supply

Because of its powerful CPU and desktop-class GPU, the Jetson Nano has a large heat sink, and you can also buy an optional fan. The board has different power modes, controlled via a utility called nvpmodel. The two most important power modes are the 10W configuration, which uses all four CPU cores and lets the GPU run at full speed, and the 5W mode, which turns off two of the cores and throttles back the GPU.
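Switching between them is done with the same nvpmodel tool. The mode IDs in the sketch below are assumptions based on the Nano's documented defaults (0 for the 10W profile, 1 for 5W), so check your own image before relying on them.

```python
# Query and switch the Nano's power mode via NVIDIA's nvpmodel tool.
# Mode IDs are assumed from the Nano's documented defaults (0 = 10W, 1 = 5W);
# verify them on your own image first.
import subprocess

subprocess.run(["sudo", "nvpmodel", "-q"], check=True)        # print the current power mode
subprocess.run(["sudo", "nvpmodel", "-m", "0"], check=True)   # switch to the 10W profile
```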

If you run apps that really push the board, you need to make sure you are using a good power supply. For general use you can power the board over micro USB, as long as the supply can deliver at least 2.5A. For more demanding tasks you should use a 5V/4A power supply, which plugs into a separate barrel jack and is enabled via a jumper on the board.

Concluding thoughts

If you look at the Jetson Nano as an affordable way onto the Jetson platform, it's great. Instead of spending $600 or more on a development kit to work with Nvidia's machine learning offerings and frameworks like VisionWorks, you pay only $99. What you get is still very capable and able to perform many interesting machine learning tasks. In addition, it leaves the door open to upgrading to the bigger Jetson boards if needed.

As a direct alternative to the Raspberry Pi, the value proposition is less attractive, because the Pi costs only $35 (less if you go with one of the Zero models). Price is the key question: do I want one Jetson Nano or three Raspberry Pi boards?

If you want something like the Raspberry Pi, but with more processing power, more GPU grunt and four times the RAM, then the Jetson Nano is the answer. Of course it costs more, but you get more.

Bottom line is this: if the Raspberry Pi is good enough for you, stick with it. If you want better performance, if you want hardware-accelerated machine learning, if you want to get into the Jetson ecosystem, buy a Jetson Nano today!

