The NVIDIA Jetson Nano isn’t new. The 4 GB platform was launched last year at a price point of $129, which was then dropped to $99. Now the company has made some tweaks and released a 2 GB version that’s just $59. And although it's in the price range of a Raspberry Pi, it offers the next level of oomph for AI newbies.
NVIDIA was able to drop the price into Raspberry Pi territory by trimming features such as the number of USB ports (from four to three) and MIPI CSI-2 camera ports (from two to one). The biggest of these cuts is, of course, apparent in the name (the reduction in LPDDR4 memory from 4 GB to 2 GB), but in practice none of this limits most developers. You can run deep neural networks (DNNs), recurrent neural networks (RNNs), long short-term memory (LSTM) networks, and other neural networks, individually or concurrently, on the device. You can even train full AI models directly on the Jetson Nano 2 GB Developer Kit itself.
The 100 mm x 80 mm x 29 mm Jetson Nano 2 GB features a 128-core NVIDIA Maxwell GPU supported by a quad-core, 64-bit Arm Cortex-A57 CPU clocked at 1.43 GHz. The aforementioned 2 GB of onboard 64-bit LPDDR4 memory delivers 25.6 GB/s of bandwidth, ensuring that the high-performance processor complex is constantly fed a high volume of data during AI inferencing applications.
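The quoted 25.6 GB/s figure can be sanity-checked with some quick arithmetic. A sketch, assuming a 1600 MHz LPDDR4 clock (an assumption consistent with the bandwidth number, not a figure from the article):

```shell
# LPDDR4 transfers data on both clock edges: 1600 MHz -> 3200 MT/s.
# Each transfer moves the full 64-bit (8-byte) bus width.
CLOCK_MHZ=1600
TRANSFERS_PER_CLOCK=2
BUS_BYTES=8
BANDWIDTH_MBS=$((CLOCK_MHZ * TRANSFERS_PER_CLOCK * BUS_BYTES))
echo "${BANDWIDTH_MBS} MB/s"   # 25600 MB/s = 25.6 GB/s
```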
The Jetson Nano 2 GB’s logic/memory combination is good enough to outperform similar offerings in its class across a range of classification, segmentation, object detection, pose estimation, and image processing workloads.
On the I/O front, the Developer Kit includes a Gigabit Ethernet (GbE) interface, wireless connectivity via an 802.11ac wireless adaptor (a Wi-Fi dongle is included in some regions, with a $5 discount offered in others), an HDMI port, a 40-pin header for GPIO and serial communications, a 12-pin header for power and related signals, and a four-pin fan header in case the onboard heatsink isn't enough. The kit is powered through a USB-C port.
Like all members of the Jetson developer kit family, the Jetson Nano 2 GB Developer Kit is paired with NVIDIA’s JetPack Software Development Kit (SDK), which includes an Ubuntu Linux distribution and the Linux Driver Package (L4T), the CUDA parallel programming model, and APIs and libraries such as TensorRT, cuDNN, OpenCV, and Vulkan 1.2 for developing deep learning, computer vision, and other applications.
It supports most popular AI development frameworks, including Caffe2, Keras, MXNet, PyTorch, and TensorFlow. Developers can access these environments (as well as Jupyter notebooks) via pre-packaged Docker containers available on the Jetson Zoo that install on top of the JetPack SDK. For more advanced users, NVIDIA also provides Python wheels.
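Container tags on the Jetson Zoo are built against specific L4T releases, so it helps to know which release a board is running before pulling an image. A minimal sketch, assuming a stock JetPack install (where the /etc/nv_tegra_release file records the release; the path will not exist on a non-Jetson machine):

```shell
# On a stock JetPack/L4T image, the first line of this file reports the
# release, e.g. "# R32 (release), REVISION: 4.4, ...". Match it against
# the rNN.N.N portion of a Jetson Zoo container tag.
if [ -f /etc/nv_tegra_release ]; then
    L4T_RELEASE=$(head -n1 /etc/nv_tegra_release)
else
    L4T_RELEASE="not an L4T system"
fi
echo "$L4T_RELEASE"
```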
The Jetson Nano 2 GB in Action
The NVIDIA Jetson ecosystem currently consists of more than 80,000 active developers, a few of whom were provided early access to the Jetson Nano 2 GB. Some of their projects can be seen on the Jetson Community, which hosts videos, code, demonstrations, and more.
Because the Jetson Nano 2 GB supports on-device training through transfer learning (retraining a pre-trained network for a new task or dataset), many of these users were able to port projects from Jetson Nano 4 GB devices to the latest version of the Developer Kit with little-to-no rework. For those starting from scratch, pre-trained image classification, object detection, segmentation, and pose estimation models that use TensorRT can be downloaded from NVIDIA’s Model Zoo, part of the Jetson Zoo.
The Jetson Nano 2 GB is also compatible with a range of popular cameras, such as the Intel RealSense, the Raspberry Pi High Quality Camera, the ZED 3D camera, and other USB cameras, which help users quickly deploy the Developer Kit as a computer vision node. Many other peripherals slot into the onboard 40-pin breakout header and are backed by out-of-the-box support in the form of drivers and libraries. Two examples are the Adafruit Blinka library, which provides access to the Adafruit peripheral ecosystem, and the Jetson GPIO Python library; both are used in the JetBot robotics example project.
(JetBots are available based on the NVIDIA Jetson Nano 2 GB, Adafruit PiOLED, FeatherWing DC Motor + Stepper, and 16-channel Adafruit PWM).
To help users navigate the complexity of edge AI, NVIDIA has developed a number of courses that will be available through its Deep Learning Institute (DLI). NVIDIA DLI is an online training platform with Jetson Nano 2 GB tutorials and video walkthroughs that showcase the Developer Kit in a variety of vertical applications. Once the DLI program has been completed, users are eligible for one of two certifications: Jetson AI Specialist or Jetson AI Educator.
Depending on their application requirements, users can configure the Jetson Nano 2 GB to run in either a 5W or 10W operating mode to balance system performance and power consumption. While there currently isn’t a commercial-grade module available for the Nano 2 GB, more than 100 ecosystem partners can help productize Jetson Nano-based designs.
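Switching between the power profiles is done with the nvpmodel tool that ships with JetPack. A sketch, assuming the Jetson Nano's usual mode numbering (0 = 10 W "MAXN", 1 = 5 W; confirm the table on your own board with "sudo nvpmodel -q"):

```shell
# nvpmodel ships with JetPack; this guard keeps the snippet harmless on
# machines that are not running L4T.
if command -v nvpmodel >/dev/null 2>&1; then
    sudo nvpmodel -m 1   # select the 5 W profile (mode 1 on the Nano)
    sudo nvpmodel -q     # report the profile now active
    MODE_MSG="5 W profile requested"
else
    MODE_MSG="nvpmodel not found (not an L4T system)"
fi
echo "$MODE_MSG"
```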
Getting Started with Jetson Nano 2 GB
The DLI courses offer an easily accessible starting point for students, makers, and professional engineers working on AI projects. Open-source deep learning projects for beginners include the JetBot demo mentioned earlier, as well as “Getting Started with AI on Jetson Nano” and “Hello AI World.”
- Getting Started with AI on Jetson Nano teaches participants how to properly configure their Jetson Nano and a camera, collect image data for classification models, annotate it for regression models, train their own models, and run inference with the models they have created.
- Hello AI World is a demo that runs pre-trained or custom image classification, object detection, and semantic segmentation models, and can be completed in just a few minutes. It also contains sections on how to retrain deep neural network models using PyTorch.
These and other courses, such as the more advanced “Deep Learning ROS Nodes,” will be finalized and made available at the end of October 2020.
To prepare for these and other, more advanced DLI courses, Jetson developers must log on to NVIDIA NGC and then pull containers to the Jetson Nano 2 GB by launching Docker from a terminal on the kit itself. All you have to do is:
- Connect the USB power supply and a USB camera to the Jetson Nano 2 GB
- Connect a development PC using a micro-USB cable
- Use a remote-access tool such as PuTTY to SSH into the Nano
- Execute the command “sudo docker pull nvcr.io/ea-linux4tegra/nvdli-nano:r32.4.4” to download the DLI course container
Not only does this pull the necessary containers, it also allows data collected during courses to be stored in a mounted directory so that models aren’t lost when the container shuts down. From there, a getting started guide provides different commands to integrate either USB or CSI cameras. A connection to a JupyterLab server will then be established, and the user can navigate to http://192.168.55.1:8888, log in with dlinano as the password, and then follow the coursework instructions.
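The pull-and-run workflow above, including the mounted data directory, can be sketched as a single docker run command. The image name comes from the course instructions quoted earlier; the /nvdli-nano/data mount point inside the container is an assumption, so check the DLI course notes for the exact path. The command is printed here rather than executed so it can be copied onto the Jetson itself:

```shell
# Image name from the DLI course pull command; the in-container data path
# is an assumed example, not confirmed by the article.
IMAGE="nvcr.io/ea-linux4tegra/nvdli-nano:r32.4.4"
DATA_DIR="$HOME/nvdli-data"   # host directory that survives container restarts
mkdir -p "$DATA_DIR"
# --runtime nvidia exposes the GPU inside the container; --network host lets
# the JupyterLab server bind to the board's address (192.168.55.1:8888).
RUN_CMD="sudo docker run --runtime nvidia -it --rm --network host --volume ${DATA_DIR}:/nvdli-nano/data ${IMAGE}"
echo "$RUN_CMD"
```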
The Jetson Nano 2 GB is currently available for pre-order. For more information, check out the NVIDIA webpage, view the video below, or browse the resources that follow.
- Pre-Order NVIDIA Jetson Nano 2 GB: https://developer.nvidia.com/buy-jetson?product=jetson_nano&location=US
- NVIDIA Jetson Nano 2 GB Overview: https://developer.nvidia.com/embedded/jetson-nano-2gb-developer-kit
- NVIDIA Jetson Nano 2 GB Getting Started Guide: https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-2gb-devkit
- JetPack SDK: https://developer.nvidia.com/embedded/jetpack
- Jetson Zoo: https://elinux.org/Jetson_Zoo
- NVIDIA Deep Learning Institute (DLI): https://www.nvidia.com/en-us/deep-learning-ai/education/
- Jetson Community: https://developer.nvidia.com/embedded/community/jetson-projects