
About the Project
The AutoSDV project, the Autoware Software-Defined Vehicle, features an affordable autonomous driving platform with practical vehicle equipment for educational and research institutes. The project allows you to build a self-driving platform at home and use it in real outdoor road environments. Driven by Autoware, the leading open-source software project for autonomous driving, it gives you great flexibility and extensibility in the vehicle software.
AutoSDV provides a complete stack from hardware specifications to software implementation, offering an accessible entry point into real-world autonomous systems using industry-standard tools and practices.
Robin-W Solid-State LiDAR Kit | Velodyne 32C LiDAR Kit | Cube1 LiDAR + MOXA 5G Kit |
Citation
If you use AutoSDV in your research or educational projects, please cite our work using the following BibTeX entry:
@misc{autosdv2025,
author = {Hsiang-Jui Lin and Chi-Sheng Shih},
title = {AutoSDV: A Software-Defined Vehicle Platform for Research and Education},
year = {2025},
institution = {National Taiwan University},
url = {https://github.com/NEWSLabNTU/AutoSDV},
note = {Accessed: 2025-04-28}
}
Getting Started
This section guides you through the process of setting up and using the AutoSDV platform. Follow these chapters sequentially to build a fully functional autonomous vehicle:
- Building the Vehicle - Assembling the physical components
- Software Installation - Setting up the driving software
- Operating the Vehicle - Launching and controlling the system
Building the Vehicle
The recommended vehicle build is based on a 16×11×5 inch chassis plus additional sensor and communication mounts, and can be divided into the parts listed below.
The core components are those necessary to run Autoware.
- Onboard computer
- Navigation system
- Power supply system
- Powertrain system
- Chassis
The vehicle can be equipped with additional mounts depending on your choice.
- LiDAR sensors
- 5G/LTE communication module
Core Components
The vehicle has three major layers, shown in Figure 1 from top to bottom.
- Yellow: The onboard computer, navigation sensors and additional mounts.
- Red: Power supply system for the onboard computer and sensors in the yellow layer.
- Blue: Powertrain system and power supply for the powertrain.
On this vehicle, additional sensors and 5G mounts go in the yellow layer, whose power comes from the red layer. The motors have a separate battery and power supply in the blue layer due to distinct voltage requirements.

Power Supply System
Batteries
There are two batteries for the two respective power supplies, namely the upper power and the lower power. The batteries are shown in Figure 2. The upper power supply provides electricity to the onboard computer and sensors from a 22.2V 6S battery (1), while the lower power supply provides electricity to the DC motor and powertrain from a 7.4V 2S battery (2).
Both batteries have a yellow XT60 power plug and a white JST-XH connector, as shown in Figure 3. The JST-XH connector plugs into a voltage monitor, shown in Figure 4, which beeps when the voltage becomes low.
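As a concrete illustration of what such a voltage monitor watches, the sketch below computes the per-cell voltage of a pack. The 3.5 V/cell low-voltage threshold is a common LiPo guideline, not the monitor's documented setting, so treat it as an assumption:

```python
def lipo_status(pack_voltage, cell_count, low_per_cell=3.5):
    """Return (per-cell voltage, low-voltage flag) for a LiPo pack.

    The 3.5 V/cell threshold is a common LiPo guideline; the actual
    monitor's alarm threshold may differ.
    """
    per_cell = pack_voltage / cell_count
    return per_cell, per_cell < low_per_cell

# The upper 6S pack at its 22.2 V nominal voltage: ~3.7 V per cell, not low.
print(lipo_status(22.2, 6))
# The lower 2S pack sagging to 6.8 V: 3.4 V per cell, low -- the monitor beeps.
print(lipo_status(6.8, 2))
```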



The Upper Power Supply
The upper power start-up process is shown in Figure 4. First, install the battery on the battery dock. Second, connect the battery to the cable. Last, switch on the power supply as demonstrated in Figure 5.
Be cautious: the power switch must be turned off before installing or removing the battery. This is necessary to protect the system from voltage spikes.


The Lower Power Supply
The lower power start-up process is shown in Figure 6. The battery is installed in the dock in the bottom layer of the vehicle (1). Then, switch on the power (2).

Docks
The vehicle has three docks to mount your favorite sensors. The figure below shows two kinds of builds with the three docks marked: (1) the front dock, (2) the top dock and (3) the rear dock.
The details of two builds are described in the table below.
No. | Front Dock | Top Dock | Rear Dock |
---|---|---|---|
1 | Seyond Robin-W LiDAR | MOXA 5G Module | LiDAR Ethernet Adaptor |
2 | Velodyne LiDAR Adaptor | Velodyne 32C LiDAR | Navigation Sensor Kit |
Components and Wiring
The vehicle incorporates essential components such as the chassis, body, onboard computer, among others, along with additional LiDARs and a 5G communication module. For detailed information on these elements and their wiring, please refer to the comprehensive guide in Hardware and Wiring.
Software Installation
Prepare the Onboard Computer
Recommended: NVIDIA Jetson AGX Orin
NVIDIA Jetson AGX Orin 64GB is the primary platform for the onboard computer. Flash the Jetson box using SDK Manager with the following configuration.
- JetPack SDK with exact version 6.0. Note that newer releases such as 6.1 and 6.2 are not compatible.
- Install all CUDA and TensorRT packages in the SDK Manager window.
- Flash the system onto an external NVMe SSD with a capacity of at least 256GB. Booting from the built-in eMMC is not recommended due to its limited capacity.
Alternative: Ubuntu 22.04
A fresh Ubuntu 22.04 operating system with the following dependencies is preferred.
- Visit the CUDA Archive and install CUDA 12.3 with the "deb (network)" installer type.
- NVIDIA driver 550 or above is recommended.
- Proceed to the download page and install TensorRT 8.6 GA.
Alternative: Docker Environment
If you don't have access to an NVIDIA Jetson AGX Orin or prefer a containerized approach, you can use our Docker environment. See the Docker Environment Setup page for detailed instructions.
Prepare the Environment (Recommended Method)
Step 1: Run Ansible Automated Setup Script
The project ships an Ansible playbook that configures the environment automatically. The following tasks are performed during the process.
- Install ROS Humble.
- Install the Autoware 2025.02 binary release and its dependencies.
- Install the Blickfeld Scanner Library required by the Blickfeld Cube 1 LiDAR.
- Download artifacts for Autoware.
- Set the default RMW library to Cyclone DDS and optimize system-wide settings.
Run the playbook with:
make setup
Step 2: Install Sensor Drivers
Installation of the following packages still requires manual steps.
- Visit the release archive and install ZED SDK 4.2.
  - NVIDIA AGX Orin users install the ZED SDK for JetPack 6.0 GA release.
  - Ubuntu 22.04 PC/laptop users install the ZED SDK for Ubuntu 22 release.
- Install the innovusion ROS driver for the Seyond Robin-W LiDAR. You may contact the LiDAR vendor to obtain the Debian package.
Prepare the Environment Manually (If Not Using the Recommended Method)
Step 1: Install ROS Humble
Visit this guide and install ROS Humble. Please install both ros-humble-desktop and ros-dev-tools.
Step 2: Install Sensor Drivers and SDKs
- Blickfeld Scanner Library 2.20.6
- Visit the release archive and install ZED SDK 4.2.
  - NVIDIA AGX Orin users install the ZED SDK for JetPack 6.0 GA release.
  - Ubuntu 22.04 PC/laptop users install the ZED SDK for Ubuntu 22 release.
- Install the innovusion ROS driver for the Seyond Robin-W LiDAR. You may contact the LiDAR vendor to obtain the Debian package.
Step 3: Install Autoware
Method 1: Debian Binary Release
Install the Autoware 2025.02 binary release. It performs automated system configuration.
After the installation is complete, activate the development environment.
source /opt/autoware/autoware-env
Method 2: Build from Source
Visit the official tutorial and build Autoware step by step.
After the installation is complete, activate the development environment.
source ~/autoware/install/setup.bash
Build the AutoSDV Project
Download the AutoSDV source repository.
git clone -b 2025.02 --recursive https://github.com/NEWSLabNTU/AutoSDV.git
cd AutoSDV
Assuming the Autoware development environment is activated, build the project with the following steps.
make prepare
make build
After the project is successfully built, activate the development environment.
source install/setup.sh
Docker Environment Setup
If you don't have access to an NVIDIA Jetson AGX Orin or want to try out AutoSDV in a containerized environment first, you can use our Docker setup. This approach provides a consistent development and testing environment regardless of your host system.
Prerequisites
The Docker environment requires the following on your host system:
- Docker with NVIDIA container toolkit installed
- QEMU for ARM64 emulation (if building on x86_64)
- Git for cloning the repository
Getting Started with Docker
Step 1: Clone the Repository
git clone -b 2025.02 --recursive https://github.com/NEWSLabNTU/AutoSDV.git
cd AutoSDV/docker
Step 2: Initial Setup
Before building containers for the first time, run the bootstrap command to set up cross-architecture support:
make bootstrap
This installs required dependencies like QEMU and configures Docker to handle ARM64 images.
Step 3: Build the Docker Image
Build the AutoSDV Docker image with:
make build
This creates a Docker image configured for ARM64 architecture, suitable for Jetson devices. The image will use the current commit of your local repository, clone the repository, and check out that same commit inside the container.
Note: Before building, the system checks if your current commit has been pushed to the remote repository. If not, you'll receive an error message asking you to push your changes first.
Step 4: Run the Container
Launch an interactive shell in the container with:
make run
When you enter the container, you'll have a ready-to-use AutoSDV environment with all dependencies and artifacts installed.
Working Inside the Docker Container
Once inside the container, you'll find the AutoSDV repository at /home/developer/AutoSDV with all dependencies already installed.
You can run commands just as you would on a regular system:
# Inside the container
cd /home/developer/AutoSDV
# Build if needed (already done during image creation)
# make build
# Launch AutoSDV
ros2 launch autosdv_launch autosdv.launch.yaml
Docker Commands Reference
Here are some useful Docker commands for working with the AutoSDV environment:
Command | Description |
---|---|
make build | Build the Docker image using the current commit hash |
make run | Enter the container shell |
make save | Save the Docker image as a compressed file |
make clean | Remove the Docker image |
Operating the Vehicle
Before reading this article, please make sure you have followed the installation guide and built the project. The project repository has a launch file autosdv.launch.yaml that defines the set of nodes to be executed and the parameters assigned to them to start the whole driving system.
The Simple Way
The Makefile has a recipe to start the whole system.
make launch
Customize the Launch
You can either modify the launch file directly, located at:
AutoSDV/src/autoware/launcher/autosdv_launch/launch/autosdv.launch.yaml
or assign argument values to the launch command. For example, to set launch_sensing_driver to false:
source install/setup.sh
ros2 launch autosdv_launch autosdv.launch.yaml launch_sensing_driver:=false
Arguments
Argument | Description | Default |
---|---|---|
vehicle_model | The name of the vehicle model. | autosdv_vehicle |
sensor_model | The name of the sensor model. | autosdv_sensor_kit |
map_path | The path to the map data directory. | ./data/COSS-map-planning |
launch_vehicle | Whether to launch the vehicle interface. | true |
launch_system | Whether to launch the system component. | false |
launch_map | Whether to launch the map component. | false |
launch_sensing | Whether to launch the sensing component. | true |
launch_sensing_driver | Whether to launch sensor drivers. | true |
launch_localization | Whether to launch the localization component. | false |
launch_perception | Whether to launch the perception component. | false |
launch_planning | Whether to launch the planning component. | false |
launch_control | Whether to launch the control component. | true |
pose_source | The localization method. | eagleye |
Development Guide
This section provides information for developers who want to extend, modify, or contribute to the AutoSDV platform. The guide covers five key areas:
- Source Code Walkthrough - Repository structure and organization
- Version Control - Working with Git superproject and submodules
- Sensor Components and Drivers - Available sensors and their configuration
- Sensor Kit Configuration - Sensor integration and calibration
- Vehicle Interface - Control systems and vehicle parameters
Source Code Walkthrough
AutoSDV follows the superproject convention. It collects hundreds of packages as Git submodules, classified by function into directories. It is built atop Autoware plus additional project-specific packages and constitutes a large ROS workspace.
Here you can visit the GitHub repository:
https://github.com/NEWSLabNTU/AutoSDV.
Directory | Function |
---|---|
AutoSDV/ | |
├── book/ | The source documents for this book. |
├── data/ | Data files used or loaded at runtime. |
├── docker/ | The Docker container build script. |
├── scripts/ | Auxiliary script files. It contains Ansible scripts to set up the environment. |
├── src/ | The source code packages. |
├── Makefile | It includes commonly used recipes. |
└── README.md | The introductory document to get the first impression of the project. |
Source Package Categories
Directory | Function |
---|---|
AutoSDV/src/ | The entry to the Autoware source tree. |
├── core/ | The Autoware.Core library. |
├── launcher/ | Includes launch files to run the entire driving system. |
├── vehicle/ | Powertrain control and kinetic parameters. |
├── param/ | Parameters specific to vehicle models. |
├── sensor_component/ | Sensor drivers and sensing data processors. |
└── sensor_kit/ | Sensor related parameters and launch files. |
Vehicle Interface Packages
AutoSDV/src/vehicle/autosdv_vehicle_launch/
Directory | Function |
---|---|
.../autosdv_vehicle_launch/ | |
├── autosdv_vehicle_interface/ | Powertrain control and its state measurement. |
├── autosdv_vehicle_description/ | Vehicle shape parameters and mesh files. |
└── autosdv_vehicle_launch/ | Launch files to start the vehicle interface. |
The Package for Vehicle-Specific Parameters
The autoware_individual_params package serves parameters that are specific to different vehicle models. It is located at
AutoSDV/src/param/autoware_individual_params
You can find the parameter directories within this package.
.../autoware_individual_params/individual_params/default/
Directory | Function |
---|---|
.../default/ | |
├── awsim_sensor_kit | Parameters for the AWSIM vehicle. |
└── autosdv_sensor_kit | Parameters for the AutoSDV vehicle. |
├── imu_corrector.param.yaml | |
├── sensor_kit_calibration.yaml | |
└── sensors_calibration.yaml |
Sensor Related Packages
Directory | Function |
---|---|
AutoSDV/src/ | |
├── sensor_component/ | Sensor drivers and preprocessors. |
└── sensor_kit/ | |
└── autosdv_sensor_kit_launch/ | |
├── autosdv_sensor_kit_description/ | Coordinates of each sensor. |
└── autosdv_sensor_kit_launch/ | Additional launch files for sensors. |
Version Control
The project adopts the superproject approach to manage a large number of Git repositories. The AutoSDV repository itself, the superproject, records the sub-projects as Git submodules but does not store their actual data. You can learn from the tutorial here to get an overview of superprojects.
A Git submodule works like a hyperlink within the mother repository. The mother repository stores information about submodules in the .gitmodules file. You can list them by:
git submodule status
Notice
The official Autoware adopts a different version control strategy from ours. Do not confuse them.
Download a Repository with Submodules
Always add the --recursive option when downloading a repository containing submodules.
git clone --recursive https://github.com/NEWSLabNTU/AutoSDV.git
If you forget to add the option, the submodule directories will be empty. You can get the submodule contents afterwards.
cd AutoSDV/
git submodule update --init --recursive
Inspect a Submodule
Let's check the src/autoware/core/autoware.core submodule for example. Open .gitmodules and you can see the section below. It tells the directory location of the submodule and the upstream URL.
[submodule "src/autoware/core/autoware.core"]
path = src/autoware/core/autoware.core
url = https://github.com/autowarefoundation/autoware.core.git
The path src/autoware/core/autoware.core is treated as a link file from the viewpoint of the mother repo. It stores the commit hash of the tracked Git repository. You can show the commit hash with the command below. If the commit hash changes, we go through the usual git add & commit to save it.
$ git submodule status src/autoware/core/autoware.core
99891401473b5740e640f5a0cc0412c0984b8e0b src/autoware/core/autoware.core (v1.0~1)
Save Changes within a Submodule
To save the changes within a submodule, you must commit the changes both in the submodule repo and in the mother repo in a two-step fashion.
Let's take the src/autoware/sensor_kit/autosdv_sensor_kit_launch submodule for example.
Committed Changes | Pushed to Upstream Repository |
---|---|
Changes within the autosdv_sensor_kit_launch submodule. | autosdv_sensor_kit_launch subproject repository |
New commit hash on the autosdv_sensor_kit_launch submodule | AutoSDV mother repository |
The walkthrough goes like this.
# Go into the submodule and check out the branch we want to work on.
cd src/autoware/sensor_kit/autosdv_sensor_kit_launch
git checkout main
# Do some work in the submodule.
touch a_new_file # Create a file
# Commit and push to the upstream repo.
git add a_new_file
git commit -m 'Add a new file'
git push
# Go back to the mother repo
cd -
# Save the new commit hash on the submodule and push it to the upstream repo.
git add src/autoware/sensor_kit/autosdv_sensor_kit_launch
git commit -m 'Update the autosdv_sensor_kit_launch submodule'
git push
Sensor Components and Drivers
The sensor_component directory contains a collection of drivers and data processors for sensors on the AutoSDV vehicle. They are mostly provided by vendors and existing open source projects.
Notice
The sensor component defines the collection of sensor drivers in Autoware. If you're looking for the composition of the sensor drivers, please refer to the Sensor Kit Chapter.
The AutoSDV Autoware adds the following ROS packages along with the official packages.
- ZED X Mini camera
- Blickfeld Cube1 LiDAR
- MPU9250 Nine-Axis Motion Sensor
- KY-003 Hall Effect Sensor
ZED X Mini Camera
The ROS 2 package requires ZED SDK 4.2 to be installed on the system. ZED SDK is installed by the setup script described in Installation Guide. The driver package is located at:
src/autoware/sensor_component/external/zed-ros2-wrapper
To run the standalone ZED camera driver,
ros2 launch zed_wrapper zed_camera.launch.py camera_model:=zedxm
Blickfeld Cube1 LiDAR
The IP addresses of the Blickfeld Cube1 LiDAR and the Jetson are 192.168.26.26 and 192.168.26.1, respectively.
The driver package is located at
src/autoware/sensor_component/external/ros2_blickfeld_driver_src-v1.5.5
To run the standalone driver,
ros2 launch blickfeld_driver live_scanner_node.launch.py
MPU9250 Nine-Axis Accelerometer Gyroscope Sensor
The MPU9250 measures the motion state of the vehicle, including linear acceleration, angular velocity and magnetic field. The source package is located at
src/autoware/sensor_component/external/ros2_mpu9250_driver/include/mpu9250driver
To run the standalone driver,
ros2 run mpu9250driver mpu9250driver
Garmin GPS 18x 5Hz
Full specification of the sensor
1. Locate the device first. Execute:
sudo dmesg
You should find this line: FTDI USB Serial Device converter now attached to ttyUSB0
2. Once you find the device, get raw data from it by executing:
sudo cat /dev/ttyUSB0
And you should see data in NMEA format like this:
$GPGSA,A,1,,,,,,,,,,,,,,,*1E
$GPGSV,3,1,10,10,76,279,22,12,34,053,23,23,51,164,46,24,10,053,27*75
$GPGSV,3,2,10,25,65,108,26,28,42,271,25,29,04,141,25,31,15,250,18*7B
$GPGSV,3,3,10,32,44,335,21,26,05,208,00*74
Note: If you see something else, for example binary data, make sure you use the correct baud rate of 9600.
2.1 Check the current baud rate with this command:
stty -F /dev/ttyUSB0
Set the 9600 baud rate with this command:
sudo stty -F /dev/ttyUSB0 9600
3. Execute the GPS daemon (gpsd) for the right device:
sudo /usr/sbin/gpsd -n -G -b /dev/ttyUSB0
To verify the signal, you can open the CLI app:
cgps
or GUI app:
xgps
Wait until you get proper longitude and latitude coordinates; the GPS status must be 3D Fix. If it shows No Fix, no good signal is being received.
4. Once you get the signal, you can launch Autoware and subscribe to the topic:
ros2 topic echo /sensing/gnss/garmin/fix
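The NMEA sentences shown earlier end with a two-hex-digit checksum: the XOR of every character between $ and *. A small validator is handy for sanity-checking the serial stream; this is illustrative code, not part of the repository:

```python
def nmea_checksum_ok(sentence):
    """Verify an NMEA 0183 sentence checksum.

    The checksum is the XOR of all characters between '$' and '*',
    compared against the two hex digits that follow '*'.
    """
    body, _, expected = sentence.strip().lstrip("$").partition("*")
    checksum = 0
    for char in body:
        checksum ^= ord(char)
    return format(checksum, "02X") == expected.upper()

# A sentence from the capture above: the checksum 0x1E matches.
print(nmea_checksum_ok("$GPGSA,A,1,,,,,,,,,,,,,,,*1E"))  # True
```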
Configuration files in Autoware
If you do not get data on the topic, make sure the configuration is correct by checking these files:
1. Enable the GNSS driver at:
AutoSDV/src/sensor_kit/autosdv_sensor_kit_launch/autosdv_sensor_kit_launch/launch/sensing.launch.xml
2. Enable the Garmin driver at:
AutoSDV/src/sensor_kit/autosdv_sensor_kit_launch/autosdv_sensor_kit_launch/launch/gnss.launch.xml
3. Check that there is no mistake in the Python script (the script composes the configuration to launch the GNSS driver) at:
AutoSDV/src/sensor_component/external/gps_umd/gpsd_client/launch/gpsd_client-launch.py
Note: the topic fix is remapped to garmin/fix
3.1 Parameters for the Python script are at:
AutoSDV/src/sensor_component/external/gps_umd/gpsd_client/config/gpsd_client.yaml
4. The GNSS client connecting to gpsd (the GPS daemon) is located at:
AutoSDV/src/sensor_component/external/gps_umd/gpsd_client/src/client.cpp
Sensor Kit Configuration
The sensor kit consists of the description package and the launch package. The description package, autosdv_sensor_kit_description, stores the relative coordinates of each sensor on the vehicle. The launch package, autosdv_sensor_kit_launch, contains a set of launch files for all kinds of sensors along with their runtime parameters.
Notice
The sensor kit defines the composition and the data paths of sensors. If you're looking for per-sensor driver configuration, please refer to the Sensor Component Chapter.
The Description Package
The description package stores the coordinates of each sensor installed on the vehicle. It's done by working on these two configuration files.
../autosdv_sensor_kit_description/config/sensor_kit_calibration.yaml
../autosdv_sensor_kit_description/urdf/sensor_kit.xacro
For example, the coordinate frame for the ZED camera is named zedxm_camera_link, whose pose parameters are defined in sensor_kit_calibration.yaml.
sensor_kit_base_link:
zedxm_camera_link: # Zed Camera
x: 0.0
y: 0.0
z: 0.0
roll: 0.0
pitch: 0.0
yaw: 0.0
The sensor_kit.xacro file has corresponding entries for the coordinate. In the xacro snippet, it defines a <xacro:zed_camera> component and a joint from sensor_kit_base_link to zedxm_camera_link.
<xacro:zed_camera name="zedxm" model="zedxm" custom_baseline="0" enable_gnss="false">
<origin
xyz="${calibration['sensor_kit_base_link']['zedxm_camera_link']['x']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['y']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['z']}"
rpy="${calibration['sensor_kit_base_link']['zedxm_camera_link']['roll']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['pitch']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['yaw']}"
/>
</xacro:zed_camera>
<joint name="zedxm_camera_joint" type="fixed">
<origin
xyz="${calibration['sensor_kit_base_link']['zedxm_camera_link']['x']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['y']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['z']}"
rpy="${calibration['sensor_kit_base_link']['zedxm_camera_link']['roll']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['pitch']}
${calibration['sensor_kit_base_link']['zedxm_camera_link']['yaw']}"
/>
<parent link="sensor_kit_base_link"/>
<child link="zedxm_camera_link"/>
</joint>
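The roll/pitch/yaw values in sensor_kit_calibration.yaml follow the usual ROS convention of extrinsic X-Y-Z rotations, i.e. R = Rz(yaw) · Ry(pitch) · Rx(roll). A minimal sketch of how such a pose becomes a rotation matrix; this is illustrative math, not code from the repository:

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix for extrinsic X-Y-Z (roll/pitch/yaw) rotations,
    the convention used by ROS: R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

# A pure +90 degree yaw rotates the sensor frame about the vertical axis.
R = rpy_to_matrix(0.0, 0.0, math.pi / 2)
print([round(v, 6) for v in R[0]])  # [0.0, -1.0, 0.0]
```

With the all-zero pose from the YAML above, this yields the identity matrix: the camera frame coincides with sensor_kit_base_link.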
The Launch Package
The autosdv_sensor_kit_launch package contains a collection of launch files in the launch directory. The notable one is sensing.launch.xml. It is the top-level launch file that starts the whole sensing module and includes all the other launch files.
The camera.launch.xml, gnss.launch.xml, imu.launch.xml and lidar.launch.xml launch files correspond to the respective sensing functions. Each of them contains sensor driver execution methods and their parameters.
The pointcloud_preprocessor.launch.py is the special one that provides the multi-LiDAR fusion feature. It includes a point cloud processor node that subscribes to one or multiple input topics from LiDAR drivers.
parameters=[
{
"input_topics": [
"/sensing/lidar/bf_lidar/points_raw",
],
"output_frame": LaunchConfiguration("base_frame"),
"input_twist_topic_type": "twist",
"publish_synchronized_pointcloud": True,
}
],
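To fuse a second LiDAR, append its topic to input_topics. The sketch below is a hypothetical, self-contained example: the second topic name is made up, and the literal output_frame value stands in for the LaunchConfiguration("base_frame") used in the real launch file:

```python
# Hypothetical sketch: fusing two LiDAR topics in the preprocessor parameters.
parameters = [
    {
        "input_topics": [
            "/sensing/lidar/bf_lidar/points_raw",
            "/sensing/lidar/front_lidar/points_raw",  # hypothetical second LiDAR topic
        ],
        "output_frame": "base_link",  # stands in for LaunchConfiguration("base_frame")
        "input_twist_topic_type": "twist",
        "publish_synchronized_pointcloud": True,
    }
]
print(len(parameters[0]["input_topics"]))  # 2
```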
Vehicle Interface
The vehicle interface bridges the Autoware control and vehicle actuators. It is served by the autosdv_vehicle_launch repository located at src/autoware/vehicle/autosdv_vehicle_launch. It includes the following packages.
- autosdv_vehicle_description provides vehicle appearance parameters.
- autosdv_vehicle_launch provides a launch file that runs the nodes necessary to drive the vehicle.
- autosdv_vehicle_interface provides the node that converts Autoware control commands to motor power, plus vehicle status reporting nodes for cruise control.
To launch the vehicle interface for the vehicle,
ros2 launch autosdv_vehicle_launch vehicle_interface.launch.xml
The Velocity Reporting Node
The node is implemented in velocity_report.py. It periodically reads the Hall effect sensor and counts the magnet markers embedded on the wheel in each period. In this way, the rotation speed of the wheel is measured, and the instantaneous linear speed is obtained by multiplying the rotation rate by the wheel circumference.
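The computation can be sketched as follows; the magnet count and wheel radius are illustrative values, not the actual vehicle parameters:

```python
import math

def wheel_speed(pulses, period_s, magnets_per_rev, wheel_radius_m):
    """Estimate linear speed from Hall-effect pulse counts.

    Magnet passes observed in one sampling period give wheel revolutions;
    multiplying by the circumference yields metres per second.
    """
    revolutions_per_s = (pulses / magnets_per_rev) / period_s
    return revolutions_per_s * 2.0 * math.pi * wheel_radius_m

# Illustrative: 4 magnets on the wheel, 4 pulses in 0.1 s -> 10 rev/s
# on a 5 cm radius wheel.
print(round(wheel_speed(4, 0.1, 4, 0.05), 3))  # 3.142
```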
The Actuator Node
The node implemented in actuator.py reads a target speed and controls the motor power to reach that speed. It uses a PID controller to compute PWM values and applies them to the DC motors.
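A minimal sketch of such a PID loop, with illustrative gains and a 0-1 duty-cycle clamp (the real actuator.py may use different gains, anti-windup handling, and output scaling):

```python
class SpeedPID:
    """Minimal PID controller producing a clamped PWM duty cycle."""

    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, measured, dt):
        # Standard PID terms: proportional, accumulated integral,
        # and the finite-difference derivative of the error.
        error = target - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        output = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the valid PWM duty-cycle range.
        return max(self.out_min, min(self.out_max, output))

pid = SpeedPID(kp=0.5, ki=0.1, kd=0.0)
duty = pid.update(target=2.0, measured=0.0, dt=0.1)
print(duty)  # 0.5*2.0 + 0.1*0.2 = 1.02, clamped to 1.0
```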
Appendix
This section provides supplementary information for advanced users and specialized applications:
- Hardware Components and Wiring - Detailed component specifications and wiring diagrams
- 5G/LTE Deployment - Cellular connectivity implementation and network setup
Hardware Components and Wiring
Hardware Components
The vehicle is assembled using core components and optional supplementary components. The core components include the chassis and other essential parts. Supplementary components, such as the LiDAR and 5G/LTE module, are optional and can be selected based on your specific requirements.
Core Components
Items |
---|
# Chassis |
Tekno TKR9500 Truck Kit 16×11×5 inch |
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ |
# Powertrain |
Brushless Motor 4274 / 1500KV |
PCA9685 PWM Driver |
DC-DC Converter IN 24V OUT 12V10A |
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ |
# Computer |
NVIDIA Jetson AGX ORIN Dev. kit 64GB / 32GB |
Micron P3 PLUS 1000GB SSD |
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ |
# Camera |
ZED X Mini Stereo Camera (Lens Focal Length 2.2mm, with polarizer) |
ZED Link Capture Card |
GMSL2 Fakra Cable F-F(0.3m) |
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ |
# Navigation Sensors |
KY-003 Hall Effect Sensor |
MPU9250 9-axis Motion Processing Unit |
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ |
# Battery and Power Supply |
Battery Gens ACE-5000mAh-45C-22.2V-6S1P-XT60 |
Battery Gens ACE-7200mAh-50C-7.4V-2S1P-21 |
Breaker 4230-T110-K0BE-8AB8 |
Table 1. Core Materials for the vehicle.
Supplementary: LiDAR Sensors
The choice of LiDAR sensor depends on the specific localization method and desired vision quality. If point-cloud-based NDT localization is used, the Velodyne VLP-32C LiDAR is often selected for its panoramic view. In contrast, solid-state LiDARs offer higher point density, making them better suited for detailed scene and object feature extraction, as well as vision-based localization that collaborates with cameras.
LiDAR Sensor | |
---|---|
(Choose one of below) | |
Seyond Robin-W | Solid-State LiDAR with 120° FOV |
Blickfeld Cube 1 | Solid-State LiDAR with 70° FOV. (EOL) |
Velodyne VLP-32C | Mechanical spinning LiDAR with 360° FOV (EOL) |
Table 2. Recommended LiDAR sensor for the vehicle.
Supplementary: 5G/LTE Communication
The Ataya 5G Harmony kit was successfully deployed on the vehicle and examined by NEWSLab at National Taiwan University. The following table lists the key components of the 5G kit. For more detailed specifications and quotes, please visit Ataya's website. Additionally, consult the Global Mobile Frequencies Database at Spectrum Monitoring to learn the available bands in your region.
5G/LTE Kit | |
---|---|
# 5G/LTE | |
Ataya Harmony 5G Core Network Kit | Included within a 28-inch suitcase containing the core router. |
Askey 5G Sub-6 Indoor Small Cell | The base station connected to the core network. |
MOXA CCG-1500 Gateway | Installed on the vehicle as the connector to 5G. |
Table 3. Recommended 5G/LTE kit for the vehicle.
Procurement Information
The vehicle can be ordered through Hennes Co., including customizable options for additional parts. Note that batteries are excluded from assembly due to shipping constraints and should be sourced locally. You can request a quote via their Robot Kingdom website.
The optional supplementary parts such as LiDARs and 5G modules depend on your specific needs. It is advised to consult your local agent for procurement assistance.
Wiring and Pinouts
NVIDIA AGX Orin GPIO/I2C Pinout
The detailed pinout specification can be found on this website.
Device | Label | Pin No. (Upper) | Pin No. (Lower) | Label | Device |
---|---|---|---|---|---|
3.3 VDC Power, 1A max | 1 | 2 | 5.0 VDC Power, 1A max | KY-003 | |
PCA9685 | I2C5_DAT General I2C5 Data I2C Bus 7 | 3 | 4 | 5.0 VDC Power, 1A max | PCA9685 |
PCA9685 | I2C5_CLK General I2C #5 Clock I2C Bus 7 | 5 | 6 | GND | PCA9685 |
MCLK05 Audio Master Clock | 7 | 8 | UART1_TX UART #1 Transmit | ||
GND | 9 | 10 | UART1_RX UART #1 Receive | ||
UART1_RTS UART #1 Request to Send | 11 | 12 | I2S2_CLK Audio I2S #2 Clock | ||
GPIO32 GPIO #32 | 13 | 14 | GND | KY-003 | |
KY-003 | GPIO27 (PWM) | 15 | 16 | GPIO8 | |
MPU9250 | 3.3 VDC Power, 1A max | 17 | 18 | GPIO35 (PWM) | |
SPI1_MOSI SPI #1 Master Out/Slave In | 19 | 20 | GND | MPU9250 | |
SPI1_MISO SPI #1 Master In/Slave Out | 21 | 22 | GPIO17 GPIO | ||
SPI1_SCK SPI #1 Shift Clock | 23 | 24 | SPI1_CS0_N SPI #1 Chip Select #0 | ||
GND | 25 | 26 | SPI1_CS1_N SPI #1 Chip Select #1 | ||
MPU9250 | I2C2_DAT General I2C #2 Data I2C Bus 1 | 27 | 28 | I2C2_CLK General I2C #2 Clock I2C Bus 1 | MPU9250 |
CAN0_DIN CAN #0 Data In | 29 | 30 | GND | ||
CAN0_DOUT CAN #0 Data Out | 31 | 32 | GPIO9 | ||
CAN1_DOUT CAN #1 Data Out | 33 | 34 | GND | ||
I2S_FS AUDIO I2S #2 Left/Right Clock | 35 | 36 | UART1_CTS UART #1 Clear to Send | ||
CAN1_DIN CAN #1 Data In | 37 | 38 | I2S_SDIN Audio I2S #2 Data In | ||
GND | 39 | 40 | I2S_SDOUT Audio I2S #2 Data Out |
Table 4. Pinout for NVIDIA AGX Orin box.
I2C Device Pinout
The wiring to an I2C device consists of a VCC/GND pair for power supply and an SDA/SCL pair for data transmission. The SDA/SCL pair is associated with an I2C bus number.
Device | Bus No. | VCC | GND | SDA | SCL |
---|---|---|---|---|---|
PCA9685 PWM/servo driver | 7 | 4 | 6 | 3 | 5 |
MPU9250 inertial measurement unit | 1 | 17 | 20 | 27 | 28 |
Table 5. I2C device ports and connected pin numbers.
GPIO Device Pinout
The wiring to a GPIO device consists of a VCC/GND pair for power supply and a GPIO wire for input or output signals.
Device | VCC | GND | GPIO |
---|---|---|---|
KY-003 Hall effect sensor | 2 | 14 | 15 |
Table 6. GPIO device ports and connected pin numbers.
5G/LTE Deployment
Outdoor Setup Example
Figure 1 shows an example outdoor deployment of the private 5G infrastructure based on the Ataya Harmony system. The system has several parts:
- The Ataya 5G core network box ("1" in Figure 1)
- Askey small cell base station ("2" in Figure 1)
- MOXA 5G cellular gateway installed on the vehicle top (Figure 2)


5G signal range can extend up to 600 meters. Compared to Wi-Fi, it maintains stable latency and bandwidth within that area regardless of the distance to the base station.
When setting up 5G equipment, there are several points to consider. First, the 5G receiver in a vehicle must be placed on the exterior of the body and not inside to avoid poor reception. Second, buildings near the base station can affect signal range. Base station antennas usually have directional capabilities, so ensure that the vehicle's activity range is within the antenna's coverage area.
Network Architecture
