History of ADAS, Part Three: from mechanical systems to software-defined vehicles

9th November 2022
Sam Holland

The previous two articles (Part One and Part Two) in this History of ADAS series explored the origins of various advanced driver assistance systems. This third and final article of the series, from Dan Clement, Senior Principal Applications Marketing Engineer at onsemi, discusses driver monitoring and drowsiness detection, surround view, and mirror replacement, before moving on to future automotive technology trends, including the software-defined vehicle and virtual and augmented reality.

Drowsiness detector and driver monitoring

As with many ADAS, the first drowsiness detectors were mechanical in nature: rumble strips installed between lanes and along roadsides. While not a perfect solution, they were the first to make a measurable difference in reducing tiredness-related crashes.

A report published by the US National Highway Traffic Safety Administration (NHTSA) in 1998 provides comprehensive coverage of drowsy driving. The report documented both laboratory and in-vehicle drowsiness measurement tools. At the time of the research, it was already common to measure physiological signals to detect drowsiness. Unfortunately, this was only practical in a lab setting, given that each person had to be studied and the data individually calibrated.

However, the report mentioned that in-vehicle systems were being studied, such as eye-closure monitors, steering sensors, and lane-tracking devices. These devices were not commercially available at the time, however, due to the limitations of the technology.

Perhaps the first electronic drowsiness detector was the steering angle sensor, with the first commercially available systems appearing in the early 2000s. A steering angle sensor tracks how far and how fast the steering wheel turns. By itself, this information is not helpful. However, when it is combined with a software algorithm that also uses speed, stability control data (yaw and pitch), and even camera information, the system can construct a reliable estimate of drowsiness.

Typically, these systems only work at motorway speeds and only measure small steering corrections, because the frequent steering inputs of stop-and-go city driving would confuse the algorithm. They use the initial phase of each trip to establish a baseline for the driver. In view of the rise of autonomous driving (at least at its lower levels), one question that arises is how useful driver monitoring can be when the car itself has control.
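
To illustrate the principle, here is a minimal Python sketch of how steering micro-corrections might be scored against a per-driver baseline, with the stop-and-go problem avoided by gating on vehicle speed. This is purely illustrative, not any manufacturer's actual algorithm, and all thresholds are placeholder values:

    # Hypothetical sketch of steering-based drowsiness scoring.
    # Real systems fuse steering with speed, yaw/pitch and camera data;
    # the thresholds below are illustrative, not production values.

    from collections import deque

    MOTORWAY_SPEED_KPH = 80      # only score at sustained higher speeds
    CALIBRATION_SAMPLES = 600    # initial phase used to baseline the driver

    class SteeringDrowsinessEstimator:
        def __init__(self):
            self.baseline = deque(maxlen=CALIBRATION_SAMPLES)
            self.calibrated_rate = None

        def update(self, steering_rate_dps, speed_kph):
            """Feed one sample of steering rate (deg/s) and vehicle speed."""
            if speed_kph < MOTORWAY_SPEED_KPH:
                return None  # city driving would confuse the estimate
            if self.calibrated_rate is None:
                self.baseline.append(abs(steering_rate_dps))
                if len(self.baseline) == CALIBRATION_SAMPLES:
                    self.calibrated_rate = sum(self.baseline) / len(self.baseline)
                return None
            # Drowsy drivers tend to drift, then make abrupt corrections:
            # score how far this correction exceeds the driver's baseline.
            ratio = abs(steering_rate_dps) / max(self.calibrated_rate, 1e-6)
            return min(ratio / 5.0, 1.0)  # 0 = alert, 1 = likely drowsy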

A depiction of a driver monitoring system: an eye scanner for drowsiness detection

A new solution later emerged to solve this problem, one that works for both human and autonomous drivers. This solution is called a driver monitoring system (DMS), and while it was explored in the late '90s, it was not production-ready until the 2020s. A DMS uses a camera, computer vision, and processing to look for facial and eye cues that show the driver is attentive and engaged. It sounds simple enough, but the algorithms are complicated to implement reliably.
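
As a simplified illustration of one widely published eye-closure cue, the sketch below computes the 'eye aspect ratio' (EAR) from six eye landmark coordinates. A production DMS is far more sophisticated; the landmark detector is assumed to exist upstream, and the 0.2 threshold is illustrative:

    # Illustrative eye-closure cue for a DMS: the eye aspect ratio (EAR).
    # Landmarks are assumed to come from an upstream face-landmark model
    # (six (x, y) points per eye); the 0.2 threshold is illustrative.

    import math

    def eye_aspect_ratio(eye):
        """eye: six (x, y) landmarks ordered around the eye contour."""
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        # Two vertical distances over one horizontal distance:
        # the ratio collapses towards zero as the eyelid closes.
        return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

    def is_eye_closed(eye, threshold=0.2):
        return eye_aspect_ratio(eye) < threshold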

Euro NCAP now mandates DMS: from 2024, all new vehicles must have one to achieve the highest crash and safety ratings. There are quite a few proprietary solutions, ranging from low end to high end. Unfortunately for the automakers, this market is highly cost-sensitive, and customers do not want to pay extra for the feature.

Another trend is to extend driver monitoring to all occupants; this type of monitoring is called an occupant monitoring system (OMS). Automakers are excited about this solution because they can monetise it through comfort and convenience features: drivers can use gesture and facial recognition to customise vehicle settings, and video call or social media apps can make use of the OMS in Internet-connected cars.

There are also many safety and security features that utilise OMS, such as alerting when a child is left unattended in a vehicle; OMS can even prompt the car to turn off the airbags for empty seats.

A DMS typically uses near-infrared imaging with a global shutter, whereas an OMS typically uses visible light with a rolling shutter. Most automakers want to combine the two into a DOMS (driver and occupant monitoring system) to lower costs and shrink the solution.

An example of an occupant monitoring system. Image credit: Business Wire

High-performance image sensors from onsemi enable a novel system solution in which one rolling shutter image sensor is used for both DMS and OMS applications, making for an ideal and very cost-efficient DOMS.

One challenge is that the DMS camera is typically mounted on the steering column or the dashboard, while the OMS camera is better placed above the vehicle's rearview mirror or on its pillars. Despite the extra design complexity, combining the two systems is becoming increasingly common in the interest of cost savings.

Surround-view and mirror replacement

Surround-view cameras are visible light cameras placed on the outside of the vehicle to increase driver visibility while reversing or parking. There are typically four: one at the front, one at the rear, and one on each side of the vehicle.

These cameras all have wide-angle lenses that produce a 'fisheye' type of image. Image processing and advanced algorithms merge the four images with a rendered picture of the car. The resulting bird's-eye view is shown on the dashboard display, mimicking a camera above the vehicle so the driver can fully visualise the surroundings.
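
As a rough sketch of one step in that processing chain, the Python/OpenCV fragment below undistorts a fisheye camera and warps it onto a common ground-plane canvas. The intrinsic matrix, distortion coefficients, and point correspondences are placeholders that a real system would obtain from per-camera factory calibration:

    # Sketch of one step in a surround-view pipeline (Python + OpenCV).
    # K, D and the point correspondences below are placeholders; a real
    # system loads them from per-camera factory calibration data.

    import cv2
    import numpy as np

    K = np.array([[400.0, 0.0, 640.0],
                  [0.0, 400.0, 360.0],
                  [0.0, 0.0, 1.0]])                 # intrinsics (placeholder)
    D = np.array([0.1, -0.05, 0.01, 0.0])           # fisheye distortion (placeholder)

    def undistort_fisheye(frame):
        h, w = frame.shape[:2]
        map1, map2 = cv2.fisheye.initUndistortRectifyMap(
            K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
        return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

    def warp_to_ground_plane(frame, canvas_size=(800, 800)):
        # Four image points and their known positions on the ground plane
        # (placeholder values; found with calibration targets in practice).
        img_pts = np.float32([[200, 500], [1080, 500], [1280, 720], [0, 720]])
        ground_pts = np.float32([[100, 100], [700, 100], [700, 400], [100, 400]])
        H = cv2.getPerspectiveTransform(img_pts, ground_pts)
        return cv2.warpPerspective(frame, H, canvas_size)

    # The four warped views are then blended around a rendered car image
    # to produce the final bird's-eye composite shown on the display.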

The first vehicle to have a surround-view system was the 2007 Infiniti EX35, co-developed by Infiniti and Nissan. The original system only offered this bird's-eye view, but modern systems can provide multiple views. When combined with ultrasonic sensors around the car, the surround-view system helps the driver avoid collisions and manoeuvre into tight parking spaces.

The ultrasonic sensors also help alert the driver to pedestrians or moving objects. Some more advanced systems can also ‘see through’ the vehicle’s hood or even the back of a trailer while towing.

Besides parking, the side cameras can replace the traditional side mirrors during driving. As cameras can be made very small, the mirror housings can shrink, which makes the vehicle more aerodynamic, leading to roughly a 4% reduction in fuel consumption or an increased range in an electric car.

While mirror replacement works technically, many drivers dislike it, since they are used to driving with real side mirrors. Another hurdle is that some countries require physical side mirrors as a backup, which negates the benefits of removing them.

The software-defined vehicle

The term 'software-defined vehicle' has become a descriptor for a massive change in how cars are designed. Significant research and development has poured into the pursuit of autonomous vehicles, and the amount of computing needed for all the sensors that support autonomous driving has necessitated significant changes in how vehicle systems and sensors are controlled and used.

Before 2015, the more traditional approach involved a distributed CAN (controller area network) or LIN (local interconnect network) architecture between modules, each containing its own local processor. As the vehicle and its systems became more complex and required more centralised computing, domain controllers became more common out of necessity.
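
For a flavour of that distributed approach, the sketch below uses the open-source python-can library to listen for frames on a Linux SocketCAN interface. The arbitration ID and payload decoding here are hypothetical, since real message layouts are defined in each manufacturer's proprietary DBC database:

    # Flavour of traditional distributed in-vehicle networking (python-can).
    # The interface name, arbitration ID and payload layout are hypothetical;
    # real signal definitions live in manufacturer-specific DBC databases.

    import can

    WHEEL_SPEED_ID = 0x1A0  # hypothetical CAN ID for a wheel-speed frame

    def listen_for_wheel_speed(channel="vcan0"):
        with can.Bus(interface="socketcan", channel=channel) as bus:
            for msg in bus:
                if msg.arbitration_id == WHEEL_SPEED_ID:
                    # Hypothetical decoding: first two bytes, 0.01 km/h per bit
                    raw = int.from_bytes(msg.data[:2], byteorder="big")
                    print(f"wheel speed: {raw * 0.01:.2f} km/h")

    if __name__ == "__main__":
        listen_for_wheel_speed()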

Combining the various sensor inputs, processing the data, and sending instructions to multiple safety systems to implement the ADAS functions requires a dedicated ADAS controller. Data rates are also increasing, requiring higher-speed data transfer protocols.

Eventually, computing will be centralised due to the plethora of processing demands that are necessary for autonomous driving and other advanced systems. Consumers demand more digital features and longer-term value through software updates. Automakers are also very interested in this model because it enables agile development with over-the-air software updates to deliver new features and bug fixes.

Along with new feature development, they can also create new revenue streams through services and subscriptions. The entire automotive industry is going through this evolution as we speak; in fact, some auto companies now describe themselves as software companies rather than car companies.

Virtual and augmented reality

There is a lot of excitement around virtual reality (VR), augmented reality (AR), and what is commonly known as the Metaverse. The Metaverse is a 3D version of the web where everything has a digital twin, and entirely new experiences, immersion, and collaboration are all possible. There are already vehicles on the road with head-up displays (HUDs).

The HUD will eventually overlay more and more digital content as AR for the driver: consider 3D navigation projection, 3D video calls, or even AR weather forecasts. In the extreme case, with Level 5 autonomous driving, there won't be a glass windshield at all: the front of the vehicle will be a solid unit, with giant screens replacing the windshield with a completely virtual view.

Concluding the History of ADAS series

This series of articles has traced ADAS from the original mechanical cruise control, the Speedostat, through the evolution of mechanical and then electronic driver assistance, all the way up to the software-defined vehicle and the vision of fully autonomous driving and Metaverse immersion.

The current transformation in the automotive industry is quite literally electrifying, and technology is changing fast. While we are excited by the new paradigms and changes, it’s also helpful to review the past and the fascinating histories of the systems that we take for granted today.
