Hardware requirements discussion

I have drafted a hardware requirements spec; let's discuss it in this thread or in the hardware working group meeting.

draft spec analysis

sensor type and interface

  • Camera
| Type | Interface | Software interface (Linux) |
|---|---|---|
| LVDS camera | LVDS | V4L (/dev/videoX) |
| CSI camera module | CSI | V4L |
| USB camera | USB 2.0/3.0/3.1 | V4L |
| GigE camera | 1000Base-T | V4L |
| Smart camera | CAN/100Base-T/AVB | AUTOSAR/custom |
  • USB/GigE-based cameras are not intended for production use, but they are easy to use.
  • Most smart cameras are ADAS modules.
  • On Linux, V4L2 is the generic interface for capturing images.
  • Image formats differ between sensors, so a framework is needed to support format conversion.
  • GStreamer or OpenCV are likely the best frameworks for converting image formats.
  • To transport image/video streams to another compute board over Ethernet, GStreamer with hardware encode/decode should be a much better solution than publishing image topics with ROS/ROS 2; we need to avoid transferring raw images over the network (a sketch of such a pipeline follows below).
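As a rough illustration of the point above, here is a minimal sketch (not a reference implementation) that grabs frames from a V4L2 camera with OpenCV and streams them as H.264/RTP over UDP through GStreamer instead of publishing raw images. It assumes OpenCV built with GStreamer support and a software x264enc element; the device path, resolution, host, and port are placeholders, and a production board would swap x264enc for the vendor's hardware encoder element.

```python
# Sketch: capture from a V4L2 camera and stream H.264/RTP over UDP via GStreamer,
# instead of sending raw images over the network.
import cv2

DEVICE = "/dev/video0"              # V4L2 device (assumption)
HOST, PORT = "192.168.10.2", 5600   # receiving compute board (assumption)
WIDTH, HEIGHT, FPS = 1280, 720, 30

cap = cv2.VideoCapture(DEVICE, cv2.CAP_V4L2)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, WIDTH)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, HEIGHT)

# appsrc -> encode -> RTP payload -> UDP; swap x264enc for a hardware encoder in production
gst_out = (
    "appsrc ! videoconvert ! "
    "x264enc tune=zerolatency bitrate=4000 speed-preset=ultrafast ! "
    f"rtph264pay ! udpsink host={HOST} port={PORT}"
)
writer = cv2.VideoWriter(gst_out, cv2.CAP_GSTREAMER, 0, float(FPS), (WIDTH, HEIGHT))

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    writer.write(frame)   # frame is encoded and sent over the network, not raw

cap.release()
writer.release()
```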
  • Lidar
| Type | Interface | Software interface (Linux) |
|---|---|---|
| Mechanical lidar | 100/1000Base-T | UDP (supports multicast) |
| Solid-state lidar | 100Base-T/AVB | UDP (supports multicast) |
| Solid-state lidar (smart) | CAN/100Base-T/AVB | AUTOSAR/custom/UDP |
  • Most lidars are Ethernet based and send UDP traffic.
  • To improve network performance, the lidar should be connected to the processing board directly, or placed in a dedicated VLAN/LAN via an Ethernet switch.
  • UDP multicast enables redundancy across computing boards while also helping meet the safety requirements on the network switch (a minimal receive sketch follows below).
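For reference, a minimal sketch of receiving a lidar UDP stream via multicast, so that more than one computing board can consume the same packets. The group address, port, and buffer size are placeholders; use the values from the lidar vendor's documentation.

```python
# Sketch: join a multicast group and receive raw lidar UDP packets.
import socket
import struct

GROUP = "239.255.0.1"   # multicast group (assumption)
PORT = 2368             # lidar data port, vendor dependent (assumption)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on all interfaces
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, addr = sock.recvfrom(2048)   # one raw lidar data packet
    # hand the packet to the vendor-specific parser / point cloud builder here
    print(f"{len(packet)} bytes from {addr}")
```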
  • GPS/IMU
| Interface | Sensor | SW interface (Linux) |
|---|---|---|
| UART | GNSS | NMEA |
| UART | IMU | Custom |
| UART | GNSS+INS | NMEA/custom (including IMU raw data or fused) |
| CAN | GNSS | Custom |
| CAN | IMU | Custom |
| CAN | GNSS+INS | Custom |
| I2C (onboard) | GNSS/IMU | NMEA/custom |
| PPS pin | GNSS | |
  • Most GPS receivers provide a UART interface (a USB-UART adapter can be used in a development environment).
  • Some IMUs use a CAN bus (2.0 or FD).
  • ROS applications expect the /imu and /fix topics as well as the original NMEA data.
  • In many use cases, the GPS/IMU is connected to the MCU, which uses interrupts to process high-frequency IMU data. (A minimal NMEA-over-UART sketch follows below.)
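As a development-environment illustration, here is a minimal sketch that reads NMEA sentences from a GNSS receiver over UART (e.g. through a USB-UART adapter) and extracts a fix from GGA sentences. The device path and baud rate are assumptions, and a real driver would publish sensor_msgs/NavSatFix on /fix rather than print.

```python
# Sketch: read NMEA over UART and decode $GxGGA fixes. Requires pyserial.
import serial

def dm_to_deg(value: str, hemi: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to signed decimal degrees."""
    if not value:
        return float("nan")
    dot = value.find(".")
    deg = float(value[:dot - 2]) + float(value[dot - 2:]) / 60.0
    return -deg if hemi in ("S", "W") else deg

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1.0) as port:  # path/baud are assumptions
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if line[3:6] == "GGA":            # $GPGGA / $GNGGA, ...
            f = line.split(",")
            lat = dm_to_deg(f[2], f[3])
            lon = dm_to_deg(f[4], f[5])
            print(f"fix: lat={lat:.6f} lon={lon:.6f} sats={f[7]}")
```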
  • Radar/Sonar
| Sensor | Interface | SW interface (MCU) |
|---|---|---|
| Radar | CAN | |
| Sonar | CAN/LIN | |
  • Most radars/sonars are connected to the MCU.
  • Summary of the interfaces
| Type | Interface | Automotive requirement |
|---|---|---|
| Onboard interface | CSI/MIPI | Support |
| Onboard interface | I2C | Support |
| Onboard interface | UART (some cases) | Support |
| Off-board interface | LVDS | Support |
| Off-board interface | UART | |
| Off-board interface | CAN/LIN/FlexRay | Support |
| Off-board interface | 100Base-T1/AVB (TSN) | Support (two-wire) |
| Off-board interface | 1000Base-T1 | Support (few vendors) |
| Off-board interface | 1000Base-T (RJ45) | |
| Off-board interface | USB | |
  • Most demo cars use UART/USB/1000Base-T.
  • LVDS/CAN must be supported if the ECU/compute board is targeting a real product (a SocketCAN sketch for bench setups follows below).
  • 100Base-T1 (TSN) may be the best solution for high-speed data exchange when targeting a real product. (Tesla uses 100Base-T1 to interconnect the media part and the AP.)
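For bench/POC setups where a CAN sensor (radar, IMU, GNSS) is attached to the Linux MPU through a SocketCAN-capable adapter, here is a minimal sketch of reading frames with python-can. The channel name, bitrate, and the exact constructor keyword depend on your setup and python-can version; payload decoding is vendor specific.

```python
# Sketch: read CAN frames through SocketCAN on Linux. Assumes the interface was
# brought up beforehand, e.g. `ip link set can0 up type can bitrate 500000`.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

for msg in bus:                      # iterate over received frames (blocking)
    # msg.arbitration_id, msg.data, msg.timestamp are available here;
    # decoding the payload is sensor/vendor specific (e.g. via a DBC file).
    print(f"id=0x{msg.arbitration_id:X} len={msg.dlc} data={msg.data.hex()}")
```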

sensor requirements and configuration

Sensor selection and configuration depend on the scenario and the required sensor performance; the sensor vendor should be involved in the discussion.

computing unit

Silicon type:

  • MPU with RT-Core
  • MPU without RT-Core
  • Standalone MCU

Board Architecture:

  • Should follow the 96auto spec.
  • At least one MCU/RT-core onboard, preferably with lock-step support.

Reference functional design of the boards:

  • Considering the capabilities of current automotive SoCs, several types of boards are defined:
    • P1 focuses on sensor connection and sensor fusion. It could be a gateway or domain controller with good network capabilities.
    • P2 focuses on camera data processing. It could be a domain controller or a smart camera that supports ADAS features.
    • P3/P4 are used for the main software stack, including late fusion, planning, and decision making.
  • All these parts should be connected by Ethernet; 100Base-T1 for automotive or 1000Base-T are both acceptable.
  • A detailed network design for deployment is out of scope.
  • Different functions can be combined in one board or a single chip, depending on the capability of your chip and board.

For board hardware interface:

  • Type:

    • UART/USB/1000Base-T must be supported for the current POC setup.
    • 100Base-T1 with TSN capability and CAN 2.0/FD must be supported.
    • LVDS or onboard CSI camera support depends on the image processing capabilities, including ISP/CV/encode/DNN.
    • PPS input from GNSS must be supported (see the sketch after this list).
    • Should provide a solution to synchronize sensors.
  • Configuration:

    • If the GPS/IMU is connected to the MPU, 1-2 UARTs are needed, plus an extra UART console for debugging.
    • At least 2 Ethernet interfaces to isolate different LANs.
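To verify that the GNSS PPS input actually reaches the SoC, a minimal sketch that polls the Linux PPS subsystem through sysfs. It assumes the PPS line is registered as /sys/class/pps/pps0, which depends on the board's device tree and driver configuration.

```python
# Sketch: poll PPS assert events exposed by the Linux PPS subsystem.
import time

ASSERT_PATH = "/sys/class/pps/pps0/assert"   # format: "<timestamp>#<sequence>"

last_seq = None
while True:
    with open(ASSERT_PATH) as f:
        raw = f.read().strip()
    if raw:
        ts, seq = raw.split("#")
        if seq != last_seq:                  # a new pulse was captured
            print(f"PPS assert at {ts} (seq {seq})")
            last_seq = seq
    time.sleep(0.1)
```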

For MPU side:

  • Operating system

    • Linux kernel with PREEMPT-RT.
    • Debian/Ubuntu root filesystems are suggested, with Yocto as an option.
    • QNX and VxWorks support is optional.
  • Performance capability

    • Should meet the performance requirements of PCL if lidar data processing is supported.
    • Should meet the performance requirements of OpenCV if camera/image data processing is supported.
    • Should meet the performance requirements of DNN inference if deep learning is supported. Do we need to choose a unified DNN inference framework, such as TVM, which has an auto-tuning feature to support different platforms?

    Autoware should define its third-party software dependencies.

  • Low-level hardware acceleration capabilities (optional)

    • OpenCL, which can power PCL 2.0 and OpenCV 3.x (see the sketch after this list)
    • OpenVX
    • GStreamer encode/decode, supporting MJPEG and H.264/H.265
    • DNN accelerator
    • CV accelerator (with an OpenCV or OpenVX wrapper)
    • NEON for Arm
    • OpenMP

    Low-level acceleration capabilities are optional and depend on the performance benchmark requirements.
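As a quick way to check the OpenCL path on a candidate board, here is a minimal sketch using OpenCV's transparent API (UMat), which offloads to an OpenCL device when one is available. This is only illustrative; real evaluation should benchmark representative Autoware workloads.

```python
# Sketch: check OpenCL availability in OpenCV and run one operation via UMat.
import cv2
import numpy as np

print("OpenCL available:", cv2.ocl.haveOpenCL())
cv2.ocl.setUseOpenCL(True)
print("OpenCL in use:   ", cv2.ocl.useOpenCL())

# A Gaussian blur on a UMat runs through OpenCL if the runtime supports it
frame = cv2.UMat(np.random.randint(0, 255, (1080, 1920, 3), dtype=np.uint8))
blurred = cv2.GaussianBlur(frame, (5, 5), 0)
print("result shape:", blurred.get().shape)
```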

For RT-Core/MCU side:

  • Interconnection with the MPU and other boards

    • Interconnect with the MPU via onboard Ethernet and/or I2C/UART.
    • An on-chip RT-core will use the RPC mechanism provided by the silicon vendor.
    • If there is no Ethernet interface/IP stack, there should be an agent on the MPU to expose its capabilities to the middleware.
    • Must be able to monitor the MPU's status.
  • Operating system

    • AUTOSAR Classic
    • FreeRTOS or another RTOS
    • micro-ROS support is preferred
  • Features that could be realized inside the MCU:

    Autoware does not yet define which features should be realized by the MCU/RT-core.

Interconnection and hardware redundancy

Hardware redundancy is out of scope; there are different realizations and we will not choose one. However, we believe the interface requirements and the RT-core/MCU requirements will provide the capability to set up the user's own redundancy solution.

time synchronization and determinism

  • Global time sync
    • Use GNSS time information as the NTP source.
    • Use the PPS signal from GNSS as the PTP source.
    • Support an RTC.
  • Timestamps of sensor data provided by drivers
    • Time/space synchronization of sensors will be developed by the user; a PPS dispatch/fan-out feature should be provided for sensor use.
    • On the device driver side, the correct timestamp must be clarified.
    • Most sensor data is generated over a time slice, and the exact timing depends on the sensor vendor's implementation.
  • Calculate the latency from when the sensor data is generated to the algorithm output (see the sketch below).
    • A well-designed time synchronization infrastructure and well-designed device drivers will help the prediction algorithms.
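A minimal sketch of the latency calculation above, assuming the driver stamps each message with synchronized system time (e.g. disciplined by NTP/PTP from GNSS). The message layout and run_algorithm are hypothetical placeholders.

```python
# Sketch: measure sensor-to-output latency using the driver-provided timestamp.
import time
from dataclasses import dataclass

@dataclass
class SensorMsg:
    stamp: float      # acquisition time in seconds, set by the driver
    payload: bytes

def run_algorithm(msg: SensorMsg):
    time.sleep(0.02)              # placeholder for perception/fusion processing
    return len(msg.payload)

def process(msg: SensorMsg):
    result = run_algorithm(msg)
    latency = time.clock_gettime(time.CLOCK_REALTIME) - msg.stamp
    print(f"output={result}, sensor-to-output latency={latency * 1000:.1f} ms")

# Example: a message stamped "now" by a hypothetical driver
process(SensorMsg(stamp=time.clock_gettime(time.CLOCK_REALTIME), payload=b"\x00" * 1024))
```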

example with autocore’s PCU

AutoCore's PCU is used here as an example to demonstrate the hardware design. It is designed for P1 or P3 usage.

  • It uses the NXP LS1046A as the MPU, which provides rich connectivity and Arm Cortex-A72 cores for generic algorithms.
  • It uses the TI TMS570 as the MCU running open-source FreeRTOS, providing real-time/safety features and CAN.
  • It has an onboard 5-port switch.
  • It has extension capabilities for acceleration via M.2 and mini-PCIe.

The default Ethernet configuration of this board is documented in the repository below.

GitHub: autocore-ai/autocore_pcu_doc (repository for documents of the perception computing unit)


Hi,

Thank you for this draft. Is it for Autoware hardware requirements?

Do you have any suggestions for a hardware supplier for a self-driving car? We are planning to buy a car and all the sensors for research at Aalto University, and we will use Autoware in this project. I found that AutonomousStuff and DataSpeed provide all the sensors and do the installation and support. Do you have any other suggestions, or do you know of any other suppliers?

Thanks

@kargarisaac have you checked AutoCore's PCU for the computing, as @cheng.chen mentions above?

Yes. I am looking for a company that can provide all the sensors, like AutonomousStuff.

Hi, if I may: we at StreetDrone supply fully integrated autonomous vehicles, including sensors, compute, and the Autoware.AI stack. We can provide full support too, across a range of vehicle form factors.


Please let me know if you'd like to discuss this with us; send me a message and I'll share my email address.

Hi @hollywn, I would be interested in understanding in a bit more detail the actual setup you have, to help our adopters identify the hardware required for such an ODD. You can see which details I'm talking about by looking at this page, where we are starting to collect this information.


Please feel free to contact me for any questions.
Thanks !