Autonomy assembled: Driverless kits to hit the road in 2020

August 4, 2017

A major new global-scale venture by China’s Internet giant Baidu aims to put artificial intelligence behind the wheel of fully autonomous vehicles on the road by 2020.

Regulatory considerations aside, the technical challenges are considerable, but like its U.S. counterpart Google, Baidu is pushing a big pile of chips onto its artificial intelligence (AI) bet.

Similar to Android, Baidu has made much of the Apollo program's code open source and available on GitHub.

The ecosystem, launched at the Baidu developers conference in Beijing in April, has enlisted at least 50 partners worldwide, with more anticipated.

A key participant is AutonomouStuff, which started out as a supplier of components for autonomous vehicles but has lately transformed itself into a full-fledged system integrator, with core GNSS and inertial capabilities drawn from manufacturers in the positioning, navigation and timing (PNT) industry.

Other Apollo partners include major Chinese auto manufacturers; tier 1 suppliers such as Bosch, Continental Automotive and ZF Friedrichshafen AG; component providers such as NVIDIA and Microsoft Cloud; mapper TomTom; and ride-sharing companies.

AutonomouStuff kitted out two standard Lincoln MKZ sedans for demonstration drives at the Beijing conference, with one technician completing each vehicle in about three hours — a task that would normally take a team of workers up to six weeks. The two Lincolns then drove simultaneously, driverless, around a test track.

The technology has been developed to be transferable to other vehicles. Models already demonstrated include the Ford Fusion, a street-legal golf-cart-type electric vehicle called the Polaris GEM, and an off-road Ranger buggy platform.

AutonomouStuff presents the Apollo kit at the Baidu developers conference in April. (Photo: AutonomouStuff)

How It Works

Each car is modified by adding laser sensors, cameras, radar sensors, a GPS receiver and inertial measurement unit (IMU), a drive-by-wire computer interface and a computing engine.
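
As a rough illustration of how such a retrofit might be described in software, the sketch below lays out a hypothetical sensor-suite inventory. The component names and mounting positions are illustrative assumptions drawn from the descriptions in this article, not Baidu's or AutonomouStuff's actual parts list.

    # Hypothetical inventory of the retrofit sensor suite described above.
    # Component names and mount positions are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class Sensor:
        kind: str       # "lidar", "camera", "radar", "gnss" or "imu"
        mount: str      # where the unit sits on the vehicle
        purpose: str    # what the unit contributes to the stack

    RETROFIT_KIT = [
        Sensor("lidar",  "roof",           "360-degree mapping and localization"),
        Sensor("lidar",  "front corners",  "obstacle detection and tracking"),
        Sensor("camera", "windshield",     "lane, vehicle and pedestrian detection"),
        Sensor("radar",  "around vehicle", "360-degree object-detection bubble"),
        Sensor("gnss",   "roof antenna",   "absolute position (RTK-corrected)"),
        Sensor("imu",    "chassis",        "attitude and dead reckoning between GNSS fixes"),
    ]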

Laser Sensors. A 64-beam lidar sensor on the roof gives a 360-degree field of vision for mapping, and lidar localization algorithms drawing on more than 2.2 million points of data per second generate a point cloud giving distance, angle and intensity values. This data is integrated with data from the GPS and IMU to generate a base map. Two smaller lidar sensors on the front corners of the vehicle provide obstacle detection and tracking.
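To make the localization step concrete, here is a minimal sketch, not Apollo's actual pipeline, of how polar lidar returns (distance, angle, intensity) can be projected into Cartesian coordinates and shifted into a world frame using a GPS/IMU pose. Accumulating such points over time is essentially how scans build up a base map; the example pose values are invented.

    import math

    def lidar_return_to_world(distance_m, azimuth_deg, vehicle_x, vehicle_y, vehicle_heading_deg):
        """Project one 2-D lidar return (range, bearing) into world coordinates.

        A real 64-beam sensor also carries an elevation angle and intensity per point;
        this flat, two-dimensional version is a simplification for illustration.
        """
        # Bearing of the return in the world frame = vehicle heading + sensor azimuth.
        bearing = math.radians(vehicle_heading_deg + azimuth_deg)
        # Convert polar (range, bearing) to Cartesian and add the GPS/IMU vehicle position.
        x = vehicle_x + distance_m * math.cos(bearing)
        y = vehicle_y + distance_m * math.sin(bearing)
        return x, y

    # Accumulating many such points over time yields a point-cloud base map.
    base_map = [lidar_return_to_world(12.4, az, 100.0, 250.0, 37.0) for az in range(0, 360, 5)]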

Rotating four-beam laser sensors with 110-degree view and 200-meter range cover blind spots and facilitate fusing all raw data into one scan. Together, they detect other cars, trucks, bikes, pedestrians and background objects, and generate detailed data on their position, motion and shape. Distance and angular resolution data are used to offset camera and radar data.
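Fusing all raw data into one scan amounts to expressing every sensor's points in a single vehicle frame before detection runs. A bare-bones sketch, with made-up mounting offsets rather than the kit's real extrinsic calibration, might look like this:

    import math

    # Assumed mounting offsets (x forward in meters, y left in meters, yaw in degrees)
    # for each corner lidar. These numbers are placeholders for illustration.
    SENSOR_EXTRINSICS = {
        "front_left":  (3.6,  0.8,  55.0),
        "front_right": (3.6, -0.8, -55.0),
    }

    def to_vehicle_frame(sensor_name, points):
        """Rotate and translate one sensor's (x, y) points into the common vehicle frame."""
        dx, dy, yaw_deg = SENSOR_EXTRINSICS[sensor_name]
        yaw = math.radians(yaw_deg)
        fused = []
        for px, py in points:
            fused.append((dx + px * math.cos(yaw) - py * math.sin(yaw),
                          dy + px * math.sin(yaw) + py * math.cos(yaw)))
        return fused

    def fuse_scans(scans):
        """Merge per-sensor scans ({name: [(x, y), ...]}) into one combined scan."""
        combined = []
        for name, points in scans.items():
            combined.extend(to_vehicle_frame(name, points))
        return combined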

Cameras. The platform uses two visible-light cameras mounted on the windshield, relying on laser sensors for nighttime operation. An image-processing chip provides real-time detection of lanes, vehicles and pedestrians, and measures dynamic distances from the vehicle.
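One common way a camera can measure distance to a detected object is the pinhole-camera relation: an object of known real-world size that spans fewer pixels is farther away. The sketch below uses that relation only as an illustration; the focal length and object height are assumed values, not parameters of the actual image-processing chip.

    def camera_range_estimate(object_pixel_height, focal_length_px=1400.0, real_height_m=1.5):
        """Estimate distance to an object from its apparent height in the image.

        Pinhole model: pixel_height = focal_length * real_height / distance,
        so distance = focal_length * real_height / pixel_height.
        The default focal length (1400 px) and object height (1.5 m, a typical car)
        are illustrative assumptions.
        """
        if object_pixel_height <= 0:
            raise ValueError("object must be visible (pixel height > 0)")
        return focal_length_px * real_height_m / object_pixel_height

    # A detected car 70 pixels tall would be roughly 30 m away under these assumptions.
    print(round(camera_range_estimate(70), 1))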

Radar. Five radar sensors provide object detection, with various placements around the vehicle, and varying ranges and fields of view. Jointly, they provide a 360-degree bubble around the car.
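Whether five radars really form a 360-degree bubble comes down to whether every bearing around the car falls inside at least one unit's field of view. A quick way to check a coverage layout is sketched below; the placements and fields of view are invented for illustration, since the kit's actual radar geometry is not published here.

    # Hypothetical radar layout: (boresight bearing in degrees, field of view in degrees).
    RADARS = [
        (0, 20),      # long-range forward
        (45, 120),    # front-left corner
        (-45, 120),   # front-right corner
        (135, 120),   # rear-left corner
        (-135, 120),  # rear-right corner
    ]

    def covered(bearing_deg):
        """Return True if some radar's field of view contains the given bearing."""
        for boresight, fov in RADARS:
            offset = (bearing_deg - boresight + 180) % 360 - 180  # wrap to [-180, 180)
            if abs(offset) <= fov / 2:
                return True
        return False

    gaps = [b for b in range(-180, 180) if not covered(b)]
    print("coverage gaps (degrees):", gaps)  # an empty list means a full 360-degree bubble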

Navigation. The kits provide GPS navigation combined with a tightly coupled IMU that continues to supply position data when GPS is not available.
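A heavily simplified sketch of the idea behind the GPS/IMU pairing: when a GPS fix is present it anchors the position, and between fixes the IMU's measured speed and heading propagate it forward. Real tightly coupled integration blends raw measurements inside a Kalman filter; the class below, with its made-up inputs, conveys only the intuition.

    import math

    class GpsImuFallback:
        """Toy position tracker: GPS fix when available, IMU dead reckoning otherwise."""

        def __init__(self, x=0.0, y=0.0):
            self.x, self.y = x, y

        def update(self, dt, imu_speed_mps, imu_heading_deg, gps_fix=None):
            if gps_fix is not None:
                # GPS available: snap to the absolute fix (a real filter would blend, not snap).
                self.x, self.y = gps_fix
            else:
                # GPS outage (tunnel, urban canyon): propagate with IMU-derived speed and heading.
                heading = math.radians(imu_heading_deg)
                self.x += imu_speed_mps * math.cos(heading) * dt
                self.y += imu_speed_mps * math.sin(heading) * dt
            return self.x, self.y

    tracker = GpsImuFallback()
    tracker.update(0.1, 15.0, 90.0, gps_fix=(10.0, 20.0))  # fix available
    tracker.update(0.1, 15.0, 90.0)                        # outage: dead reckoning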

Together, this provides accuracy to 2 cm, according to the company, when used with a real-time kinematic (RTK) base station, which ties the vehicle's operating range to the base station's coverage area. Another option is to use correction data from a satellite-based correction service such as TerraStar, which yields achievable accuracies on the order of 4 cm.

Documentation

The aim of the Apollo project is to enable partners and customers to develop their own self-driving systems. The information supplied by Baidu encompasses a complete set of end-to-end instructions for converting a regular car into an autonomous vehicle (a rough sketch of the resulting checklist, in code form, appears after the lists below):

Software Instructions. A set of files that contain:

  • architecture of the classes and the files within each class.
  • code instructions for:
    • coordinate system
    • third-party libraries
    • calibration table.

Hardware Documents. Instructions to install the hardware and software for the vehicle include:

  • Vehicle:
    • industrial PC (IPC)
    • GPS
    • inertial measurement unit (IMU)
    • controller area network (CAN) card
    • hard drive
    • GPS antenna
    • GPS receiver
  • Software:
    • Ubuntu Linux
    • Apollo Linux kernel
  • Hardware reference guides:
    • vehicle
    • IPC
    • GPS
    • CAN card
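
As a rough sketch only, that checklist could be mirrored in a configuration structure like the one below. The keys and values simply restate the lists above; they are not drawn from Apollo's actual configuration files or schema.

    # Illustrative restatement of the Apollo documentation checklist above.
    APOLLO_CONVERSION_CHECKLIST = {
        "software_instructions": {
            "architecture": "classes and the files within each class",
            "code_instructions": ["coordinate system", "third-party libraries", "calibration table"],
        },
        "hardware_documents": {
            "vehicle": ["industrial PC (IPC)", "GPS", "IMU", "CAN card",
                        "hard drive", "GPS antenna", "GPS receiver"],
            "software": ["Ubuntu Linux", "Apollo Linux kernel"],
            "hardware_reference_guides": ["vehicle", "IPC", "GPS", "CAN card"],
        },
    }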

https://youtu.be/eiSfP-Rn6n4

Manufacturers

Depending on user needs, the AutonomouStuff Apollo kit incorporates a selection of NovAtel GNSS receivers, including the ProPak6 GNSS receiver and the SPAN-IGM-A1 combined GNSS+IMU system; IMUs such as the IMU-ISA-100C, which incorporates Northrop Grumman LITEF GmbH inertial measurement technology; and antennas such as the GNSS-703-GGG-HV, a high-vibration, triple-frequency GPS, GLONASS, BeiDou and Galileo antenna.

A 64-beam Velodyne lidar sensor and the 16-beam Velodyne VLP-16 provide the laser data.

The onboard computer system is the AStuff Nebula embedded controller, an IPC powered by an Intel Skylake Core i7-6700 CPU. The CAN card used for the IPC is the ESD CAN-PCIe/402.
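
Drive-by-wire commands ultimately travel over the CAN bus through a card like the one above. The fragment below is a generic SocketCAN sketch using the open-source python-can package, not code for the ESD CAN-PCIe/402 driver or Apollo's own Canbus module; the channel name, message ID and payload are assumptions for illustration.

    import can  # open-source python-can package (pip install python-can)

    # Open a generic SocketCAN channel; a real install would use the ESD card's own driver
    # and the message IDs defined by the vehicle's drive-by-wire interface. "can0" and
    # 0x2FF below are placeholders.
    bus = can.interface.Bus(channel="can0", bustype="socketcan")

    throttle_percent = 12  # hypothetical throttle request
    message = can.Message(arbitration_id=0x2FF,
                          data=[throttle_percent],
                          is_extended_id=False)
    bus.send(message)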