SOFTWARE PLAYS A CRUCIAL ROLE IN SENSOR FUSION

Mark Forbes, Director of Marketing Content

There was a time when a dedicated engine control unit controlled the fuel injection while a transmission control unit handled shifting. With the growth of advanced driver assistance systems (ADAS), connected car technology, and autonomous vehicles, today’s cars not only have far more sensors and electronic controls than before, but are metamorphosing into highly integrated systems (see Figure 1). Embedded solutions benefit from coalescing functions and individual control units into domain control units (DCUs), which saves both space and money. But the real advantage comes from merging data from multiple sensors and integrating DCUs to provide an accurate 360-degree perception of a vehicle’s position, state, and environment, as well as faster, more accurate vehicle response to that environment.

Far from a flash in the pan, this trend represents significant revenue potential for automotive developers. The sale of autonomous car components is expected to grow from $42 billion in 2025 to $77 billion by 2035, and autonomous car components could represent 25 percent of the worldwide market by 2035. (1) Currently, the sensor-fusion market is virtually untapped, and it’s pretty exciting to be at the forefront of developing technology that can make the world’s roads safer through ADAS features like lane keeping, adaptive cruise control, collision avoidance, and autonomous braking.

Most automotive applications are subject to the highest level of safety regulation and involve multiple “moving parts”:

• Short- and long-range radar, ultrasound, and Lidar sensors, plus video and infrared cameras
• Sensors that monitor speed, yaw rate, steering angle, and longitudinal and lateral acceleration
• High-performance, multi-core microprocessors designed for automotive applications
• Vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) communication systems
• Input from GPS and other global navigation satellite systems (GNSS)

And yet, even these together are not enough. Just as our various senses – hands, nose, eyes, tongue, and ears – don’t work without the brain, sensors are useless without software algorithms and applications to “make sense of the sensors” – to integrate dozens of sensors and process a tidal wave of data with disparate formats. Data filters, machine learning, deep learning, and artificial intelligence (AI) all play a part. In fact, the number of AI systems in vehicles is predicted to jump from 7 million in 2015 to 122 million by 2025. (2) To ensure the integrity of the software that runs these AI systems, that software should be developed using Automotive SPICE® (ASPICE)-certified tools – because ultimately, our lives depend on the AI functioning correctly.

Figure 1. As advanced driver assistance systems (ADAS) and autonomous vehicle functions become more sophisticated, the integration and fusion of sensor data and the evolution of individual electronic control units (ECUs) into domain control units (DCUs) are imperative.

WHY SENSOR FUSION?

Which would make you feel safer: walking around with your eyes closed, ears covered, and a clothespin on your nose, or experiencing all the sights, sounds, and smells with every fiber of your being? The 1970s car was essentially blind and deaf. In the 1990s, cars had a few sensors distributed throughout the vehicle. Today, cars have dozens of sensors, both internal and external. Internal sensors include transducers, load cells, accelerometers, linear variable differential transformers (LVDTs), torque meters, temperature sensors, gyroscopes, fluid property analyzers, and miniature pressure and force sensors. Data from these sensors can be integrated with external sensors like cameras (driver-, front-, and rear-facing), GPS, radar, microphones, and Lidar; the end result is a fusion of data into a coherent map of the car’s surroundings. DCUs simplify this fusion by merging individual control units into logical groupings for better performance and data analysis. Real-time access to all this data, and the ability to fuse and comprehend its meaning, enables cars to see and hear, and to react to their surroundings.

A car that can see and hear may seem futuristic, but it is not – it is happening today, and it is largely possible thanks to sensor fusion. ADAS DCUs connect sensors such as cameras and radar with braking and other control systems to enable cars to “see” the world more effectively, evaluate situations more correctly, and make better decisions (see Figure 2). For example, video cameras identify objects, while Lidar provides the distance to an object. Combining the data increases accuracy and decreases response time, thereby improving vehicle safety.
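
To see why combining camera and Lidar data increases accuracy, consider the statistics underneath: fusing two independent range estimates by inverse-variance weighting always yields a lower-variance result than either sensor alone. The C sketch below is illustrative only (the numbers and names are hypothetical); production systems apply this same principle recursively inside a Kalman filter.

```c
/* Illustrative sketch: fusing two independent distance estimates.
 * The inverse-variance-weighted combination has lower variance
 * than either input alone. */
#include <stdio.h>

typedef struct {
    double value;     /* estimated distance to object, meters */
    double variance;  /* uncertainty of the estimate, m^2     */
} estimate_t;

/* Fused variance (v1*v2)/(v1+v2) is always below min(v1, v2). */
static estimate_t fuse(estimate_t a, estimate_t b)
{
    estimate_t out;
    double w = b.variance / (a.variance + b.variance); /* weight on a */
    out.value    = w * a.value + (1.0 - w) * b.value;
    out.variance = (a.variance * b.variance) / (a.variance + b.variance);
    return out;
}

int main(void)
{
    estimate_t camera = { 25.4, 4.0 };  /* camera: noisier range  */
    estimate_t lidar  = { 24.8, 0.25 }; /* Lidar: precise range   */
    estimate_t fused  = fuse(camera, lidar);
    printf("fused: %.2f m (variance %.3f)\n", fused.value, fused.variance);
    return 0;
}
```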

Using a combination of sensors mitigates each sensor type’s limitations. Video cameras and Lidar don’t work very well in inclement weather; radar works well in heavy rain but produces low image quality. Sensor fusion provides redundancy that enables ADAS functions to meet strict safety regulations. For example, the European New Car Assessment Programme (Euro NCAP) requires 12 radar sensors at safety levels 4 and 5. And as V2V and V2X communication gains ground, cars will be able to use sensor fusion to learn from each other – even from cars halfway across the world. (1)

WHERE THE RUBBER MEETS THE ROAD

Here’s a peek into just one ADAS feature in development that uses sensor fusion. New wireless, self-powered accelerometers and strain gauges in tires can be integrated with pressure and temperature sensors. In fusion with other sensors, they will be able to estimate tire-road friction coefficients and tire forces, important variables in active safety systems. (8)

Figure 2. Sensor fusion combines real-time data from cameras, Lidar, radar, GPS, and other sources, processes it through DCUs, and enables fast and accurate response from the vehicle’s actuators like braking and transmission systems.

The sensor-fusion module market is predicted to grow 80 percent in the next ten years. In 2015, just 4 percent of new vehicle platforms included sensor-fusion modules for surround-view park assistance and safety-critical functions; by 2025, 21 percent will include them. The 20-percent compound annual growth rate (CAGR) for sensor-fusion modules between 2015 and 2025 is one of the highest growth rates for components used in the automotive industry. (3) Different types of sensors will proliferate faster than others (see Figure 3). For example, IHS predicts that cameras and radar sensors will grow 4.5x by 2025, while Lidar lags behind but begins to represent a significant market share by 2030. (4)

Figure 3. The sensor-fusion market is predicted to grow significantly over the next 15 years.

MORE AUTOMATION, MORE CONNECTIVITY… MORE SOFTWARE

Automation, connectivity, and increasingly integrated automotive embedded electronics solutions using sensor fusion and DCUs are poised to generate dramatic improvements in driver safety as well as comfort and convenience. As a result, the market is accelerating: Gartner forecasts 250 million connected cars on the road by 2020 (5), while some estimates indicate that connected car data, and the new business models it engenders, could be worth $1.5 trillion a year by 2030. (6)

Sensor fusion requires hardware that is optimized for automotive applications, such as high-performance, safety-certified multicore processors, and sensors that meet strict environmental standards and that are compact, cost-effective, and available in flexible form factors. But beyond that, ADAS and autonomous driving demand some of the most complex, real-time computing capabilities ever developed (see Figure 4). Sensor-fusion software must be capable of a high level of abstraction to support flexible integration of DCUs, different kinds of sensors, and algorithms. Developers must guard against data overload and latency – most sensor-fusion processing must be done in real time. And the algorithms themselves must be able to react accurately to every possible outcome of human behavior (including irrational behavior). To sum it up, the software that drives the hardware that drives the car must be able to handle volume, veracity, and variety.
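
What might that “high level of abstraction” look like in embedded code? One common pattern, sketched below with entirely hypothetical names rather than any prescribed design, is a uniform sensor interface: every source exposes the same function table, so the fusion layer can poll radar, Lidar, or camera backends identically and new sensor types can be added without touching the fusion algorithm.

```c
/* Hypothetical sketch of a uniform sensor interface for a DCU. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint64_t timestamp_us;  /* common time base for fusion   */
    float    data[16];      /* normalized measurement vector */
    uint8_t  len;           /* valid entries in data[]       */
} sample_t;

typedef struct {
    const char *name;
    bool (*init)(void);
    bool (*read)(sample_t *out);  /* non-blocking latest sample */
} sensor_if_t;

/* The fusion cycle treats every registered source the same way. */
void fusion_cycle(const sensor_if_t *sensors, int count)
{
    for (int i = 0; i < count; ++i) {
        sample_t s;
        if (sensors[i].read(&s)) {
            /* feed s into the fusion filter (omitted) */
        }
    }
}
```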

The mathematics and engineering concepts involved in sensor fusion are intricate. For example, “9-axis sensor fusion” involves integrating data from a 3-axis magnetometer (Earth’s magnetic field), a 3-axis accelerometer (linear acceleration), and a 3-axis gyroscope (angular rate). Covariance, quaternions, situational analysis, pattern recognition, digital signal processing, control theory, Kalman filtering (also known as linear quadratic estimation, or LQE), and statistical estimation are just some of the areas with which sensor-fusion developers must be familiar.
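
As a taste of the mathematics, here is a minimal one-dimensional Kalman filter in C. It is a deliberately simplified sketch (a real 9-axis filter tracks a multivariate state and represents attitude with quaternions), but it shows the core predict-update cycle of linear quadratic estimation: each new measurement is blended with the prediction in proportion to their variances.

```c
/* Minimal 1-D Kalman filter (illustrative only). */
typedef struct {
    double x;  /* state estimate             */
    double p;  /* estimate variance          */
    double q;  /* process noise variance     */
    double r;  /* measurement noise variance */
} kf1d_t;

double kf1d_update(kf1d_t *kf, double z)
{
    /* Predict: state unchanged, uncertainty grows by process noise. */
    kf->p += kf->q;

    /* Update: the Kalman gain weighs measurement against prediction. */
    double k = kf->p / (kf->p + kf->r);
    kf->x += k * (z - kf->x);
    kf->p *= (1.0 - k);
    return kf->x;
}
```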

The fusion of image and non-image data is particularly challenging. Some OEMs and tier-one suppliers are working together with academia to address this challenge, as can be seen in Daimler’s collaboration with the Karlsruhe Institute of Technology and the University of Ulm. (7) The growth of automotive telematics, in-vehicle infotainment (IVI), ADAS, and autonomous driving systems is also attracting the attention of standards bodies.

For example, the Mobile Industry Processor Interface (MIPI) Alliance creates standard interface specifications for mobile and mobile-influenced products, which enable developers to efficiently interconnect essential components, including sensors. The MIPI specifications for physical interconnects are developed in alignment with MIPI members such as ARM and Intel, so adopting these standards helps ensure compatibility of the various sensors with the rest of the hardware they are connected to. The MIPI specifications address only the interface technology, such as signaling characteristics and protocols; they do not standardize entire application processors or peripherals. MIPI interface specifications are agnostic to air interfaces and wireless telecommunication standards; therefore, MIPI-compliant products are applicable to all network technologies. Because products share common MIPI interfaces, system integration becomes easier than in the past.

The MIPI Alliance and its members are collaborating to maximize MIPI adoption in the automotive industry. The ubiquity and power of MIPI can simplify development and deployment of ADAS features that use sensors, like those in intelligent mirrors and surround-view systems. For these, a MIPI interface can connect the image sensor to the control module; an Ethernet link can then connect the control module to a DCU or central processor, which provides the ADAS functionality and drives a display over the Display Serial Interface. (9)

Standardized interfaces and APIs, as well as certified libraries that are highly optimized for the target architecture, can make it easier and faster to develop embedded solutions. For example, the time-critical sections of ADAS code often involve arrays (vectors and matrices) of floats and doubles, which are manipulated using linear algebraic operations. Therefore, an optimized LAPACK library is crucial – often, these libraries are as much as an order of magnitude faster than open-source offerings or in-house implementations. (10) To further ensure sensor-fusion code is high-performance and error-free, developers should choose compilers and other software development tools that are ASPICE-certified and tightly integrated with the microprocessors and sensors and with acceleration and other code libraries.
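
As an illustration of how such a library is used, the sketch below expresses a dense matrix multiply through the standard CBLAS interface (LAPACK routines build on BLAS underneath). Writing the hot loop this way lets a vendor-tuned, certified library be linked in place of a naive triple loop without changing application code; the exact header name and link flags depend on the toolchain.

```c
/* The kind of hot-loop operation described above: C = A*B on doubles,
 * expressed once through the standard CBLAS interface so an optimized
 * BLAS/LAPACK build can be swapped in at link time. */
#include <cblas.h>

void matmul(int n, const double *a, const double *b, double *c)
{
    /* C := 1.0 * A * B + 0.0 * C, all row-major n x n matrices */
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                n, n, n,
                1.0, a, n,
                     b, n,
                0.0, c, n);
}
```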

Figure 4. Sensor fusion – the aggregation and correlation of multiple sensor inputs – requires complex, fast, and accurate embedded software in addition to high-performance sensors and microprocessors. Developing such software can be easier and more efficient with the right software development tools.

KEYS TO SENSOR-FUSION SUCCESS

Sensor fusion is a critical part of enabling safe ADAS and, ultimately, autonomous vehicles. It requires robust sensors and fast, multi-core microprocessors optimized for automotive applications. But alone, these are not sufficient. It is well-written software that gives the hardware a voice. Best practices, such as choosing a compiler that is ASPICE-certified and making sure software development tools are tightly coupled with the hardware to maximize performance and safety, can result in embedded solutions that handle navigation, mapping, tracking, obstacle avoidance, and path planning. Such solutions will also be flexible, usable, modular, and reliable.

REFERENCES

(1) “What Will it Take for Ford, Uber and Others to Meet Their Challenging New Targets for Driverless Cars?” http://www.kvhmobileworld.kvh.com/ford-uber-challenging-new-targets-for-driverless-cars/
(2) “AI is already driving the future of connected cars,” https://venturebeat.com/2017/03/22/why-ai-will-drive-the-future-of-connected-cars-starting-now/
(3) “How sensor fusion impacts the automotive ecosystem,” http://www.electronics-eetimes.com/news/how-sensor-fusion-impacts-automotive-ecosystem/page/0/1
(4) “Automotive Conference Call, 11 October 2016,” http://www.infineon.com/dgdl?fileId=5546d461579365280157b3707fa500f8
(5) “By 2020, a Quarter Billion Connected Vehicles Will Enable New In-Vehicle Services and Automated Driving Capabilities,” http://www.gartner.com/newsroom/id/2970017
(6) “Disruptive trends that will transform the auto industry,” http://www.mckinsey.com/industries/high-tech/our-insights/disruptive-trends-that-will-transform-the-auto-industry
(7) “Advanced driver-assistance systems: Challenges and opportunities ahead,” http://www.mckinsey.com/industries/semiconductors/our-insights/advanced-driver-assistance-systems-challenges-and-opportunities-ahead
(8) “Advanced Driver Assistance Through Massive Sensor Fusion,” http://www.ieeecss.org/sites/ieeecss.org/files/documents/IoCT-Part4-12AdvancedDriverAssistance-LR.pdf
(9) “The advantages of MIPI specifications in mobile, automotive and multimedia applications,” http://www.techdesignforums.com/practice/technique/advantages-of-mipi-specifications-in-mobile-automotive-and-multimedia-applications/
(10) “Optimizing compilers for ADAS applications,” www.edn.com/Pdf/ViewPdf?contentItemId=4457916