Guides

Environment Setup

Prerequisites

Before starting, ensure your system meets these requirements:

  • Linux operating system (Ubuntu 20.04 or later recommended)
  • Appropriate NVIDIA GPU
  • Appropriate NVIDIA driver
  • At least 10GB of free disk space

Table of Contents

  1. Install NVIDIA Driver
  2. Install Docker
  3. Install NVIDIA Container Toolkit
  4. Install TensorRT Container
  5. Verification
  6. Troubleshooting

Install NVIDIA Driver

  1. Check if NVIDIA driver is installed:
    nvidia-smi
    
  2. Install NVIDIA drivers:
    sudo apt install nvidia-driver-###  # Use latest recommended version
    sudo reboot
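
  If you are not sure which driver version to install, the ubuntu-drivers tool (available on Ubuntu) can suggest one. A sketch, assuming an Ubuntu system:
    # List detected GPUs and the recommended driver package
    ubuntu-drivers devices
    # Or install the recommended driver automatically
    sudo ubuntu-drivers autoinstall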
    

Install Docker

  1. Check if Docker is installed and remove old versions, if any (a removal sketch follows this list):
    dpkg -l | grep -i docker
    
  2. Install Docker Engine (this assumes Docker's apt repository is configured; see the sketch after this list):
    sudo apt-get update
    sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
    
  3. Verify Docker installation:
    sudo docker run hello-world
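
  To remove older Docker packages flagged in step 1, something along these lines (package names taken from Docker's install documentation) should work:
    sudo apt-get remove docker docker-engine docker.io containerd runc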
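
  The docker-ce packages installed in step 2 come from Docker's own apt repository rather than Ubuntu's default one. A sketch of the repository setup for Ubuntu, based on Docker's install documentation:
    # Add Docker's official GPG key
    sudo install -m 0755 -d /etc/apt/keyrings
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
    # Add the Docker repository to apt sources, then refresh the package index
    echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo $VERSION_CODENAME) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
    sudo apt-get update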
    

Install NVIDIA Container Toolkit

  1. Install NVIDIA Container Toolkit (if necessary):
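
  The exact commands are maintained in NVIDIA's Container Toolkit documentation; the following is a sketch of the apt-based install on Ubuntu, ending with a quick GPU sanity check:
    # Add NVIDIA's package repository and signing key
    curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
    curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
    # Install the toolkit and configure the Docker runtime to use it
    sudo apt-get update
    sudo apt-get install -y nvidia-container-toolkit
    sudo nvidia-ctk runtime configure --runtime=docker
    sudo systemctl restart docker
    # Sanity check: a container should now be able to see the GPU
    sudo docker run --rm --gpus all ubuntu nvidia-smi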

Install TensorRT Container

  1. Pull the TensorRT container with TensorFlow support:
    sudo docker pull nvcr.io/nvidia/tensorflow:##.##-tf#-py3
    
  2. Run the container:
    sudo docker run --gpus all -it --rm nvcr.io/nvidia/tensorflow:##.##-tf#-py3
    

Verification

  1. Inside the container, start Python and verify TensorFlow can see the GPU:
    import tensorflow as tf
    print("Num GPUs Available: ", len(tf.config.list_physical_devices('GPU')))
    
  2. Verify that the TensorRT integration can be imported:
    import tensorflow as tf
    from tensorflow.python.compiler.tensorrt import trt_convert as trt  # TF-TRT converter API
    print(tf.__version__)
    

Troubleshooting

Common Issues and Solutions

  1. Docker permission denied
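    # A common fix from Docker's post-install documentation: add your user to the
    # docker group, then log out and back in for the change to take effect
    sudo usermod -aG docker $USER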

  2. NVIDIA driver not found
    # Reboot so the newly installed driver is loaded, then re-check with nvidia-smi
    sudo reboot
    
  3. Container fails to start with GPU error
    # Check if NVIDIA runtime is properly configured
    sudo docker info | grep -i runtime
    

Additional Resources


Note: Version numbers in this guide may need to be updated based on your system requirements and the latest available versions.

Remote Jupyter Notebook Setup

  1. Log in to a Lambda server with your credentials via ssh.
  2. Once on the Lambda server, execute the following command:

    jupyter notebook --no-browser --port=8080

    • This starts a Jupyter server on the Lambda machine and prints an access token.
  3. On your local system, run: ssh -L 8080:localhost:8080 <your account>@<lambda server>

    • This forwards port 8080 on the Lambda server to port 8080 on your local system.
  4. In your local browser, go to http://localhost:8080 (using the token from step 2) to access the Jupyter notebook interface running on the Lambda server.

Troubleshooting:

  • Port Usage: If another process is already using the chosen port (e.g. 8080), step 2 will fail and you will need to select a different port. The new port must then be used in steps 3 and 4 as well; see the example below.
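
  For example, if port 8080 is taken, you might pick another free port (8888 here is just an illustrative choice) and use it consistently in steps 2-4:
    # On the Lambda server
    jupyter notebook --no-browser --port=8888
    # On your local system
    ssh -L 8888:localhost:8888 <your account>@<lambda server>
    # Then browse to http://localhost:8888 in your local browser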

Hardware Setup

  1. First we need to update the firmware on the Jetson Orin Nano. If you are not sure which firmware version you have, we recommend following step 2 to upgrade the firmware to 36.x or greater; otherwise, skip to step 4.
  2. Use the following link for the Jetson Orin Nano JetPack 5.1.3 firmware, starting from its step 2.1 (you can follow all the steps from that link and then skip to step 5).
  3. Download the JetPack 5.1.3 image and follow step 2 of the download instructions for your operating system. Make sure to flash with the JetPack 5.1.3 firmware, not JetPack 6.x.
    • You will also need to download Balena Etcher, as described in that guide; it flashes the SD card with the firmware. You may need other software depending on your operating system.
  4. Then follow all the steps in "Write Image to the microSD Card". Make sure to download the latest JetPack (currently v6.1).
  5. Once the system boots, install JetPack on top of Jetson Linux:
    sudo apt update
    sudo apt install nvidia-jetpack
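
  To confirm that the JetPack components were installed, a quick sanity check (the release file path is standard on Jetson Linux) is:
    cat /etc/nv_tegra_release     # shows the L4T release the board is running
    apt show nvidia-jetpack       # shows the installed JetPack meta-package version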