Engineers Garage


What are the sensors used in self-driving cars?

By Nikhil Agnihotri February 22, 2023

Automotive engineers have already developed semi-autonomous vehicles. Fully self-driving vehicles are not far from reality. According to recent research, autonomous driving (AD) could create $300 billion to $400 billion in revenue by 2035.

The self-driving car not only showcases how far technology has advanced, but it’s also a subject of controversy. There are valid concerns about safety, faulty tech, hacking, and the potential loss of driving jobs. On the other hand, AD could lead to safer rides, greater convenience, and more productivity or free time. Instead of wasting hours in traffic, the “drivers” of the future could spend their commutes working, reading, or catching up on a TV series.

One critical component of self-driving vehicles is sensor technology — heterogeneous sensors, to be precise. Artificial intelligence (AI) and machine learning (ML) models are trained on the sensor data to observe and respond to the surroundings. Using these models, a vehicle relies on its sensors to find the ideal route, decide where or where not to drive, detect nearby objects, pedestrians, or other vehicles to avoid collisions, and react to unexpected scenarios.

There have been two primary efforts in the development of self-driving cars.

1. Using cameras and computer vision for driving
2. Employing sensor fusion (i.e., using heterogeneous sensors to make the car see, hear, and sense its surroundings)

Most engineers have concluded that cameras and computer vision alone cannot make AD successful. Instead, sensor fusion is the safest, most reliable choice.

There are four major sensor technologies used in self-driving vehicles:

  • Cameras
  • LIDAR
  • Radar
  • Sonar

Thanks to sensor-fusion technology and rapidly improving AI, self-driving vehicles have begun to gain recognition as a real future possibility. It’s projected that by 2030, about 12% of vehicle registrations worldwide will be autonomous vehicles.

As for lost driving jobs, similar concerns were raised when computers were introduced, and that technology has since generated millions of jobs globally. It’s likely that self-driving vehicles will likewise add skill-based jobs to the automotive industry.

Let’s explore the sensor technologies that are enabling autonomous driving.

The camera
Cameras are already used in vehicles for reverse parking, reverse driving, adaptive cruise control, and lane-departure warnings. Self-driving vehicles use high-resolution color cameras to obtain a 360-degree view of their surroundings. The images may be collected as multi-dimensional data, from different angles, and/or as video segments. Different image- and video-capturing methods are currently being tested, along with the use of AI, to ensure reliable on-the-road decision-making for safe driving. These are resource-intensive tasks.

Such cameras do show potential, especially with advanced AI and ML. The high-resolution cameras can properly detect and recognize objects, sense the movement of other vehicles, determine a route, and visualize their 3D surroundings. They approximate human eyes, allowing a vehicle to drive similarly to one manned by a real person. 

But there are drawbacks. For example, a camera’s visibility depends upon environmental conditions. As a camera is a passive sensor, it’s unreliable under low visibility conditions. Infrared cameras might be an option, but these images must be interpreted by AI and ML, which is still in the works.

Two types of camera sensors are used for AD: mono and stereo. A mono camera has a single lens and image sensor. It can only take two-dimensional images, which are sufficient for recognizing objects, people, and traffic signals. However, 2D images are not useful for determining the depth or distance of objects; doing so would require highly complex ML algorithms with questionable results.

A stereo camera has two lenses and two image sensors. It takes two images simultaneously from different angles. After processing the images, the camera can determine the depth or distance of an object, making it the better choice for AD — except for the visibility issues in low light.
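The depth calculation a stereo camera performs is simple triangulation: an object appears shifted horizontally between the two images, and that shift (the disparity) shrinks as distance grows. A minimal sketch, where the focal length, baseline, and disparity values are illustrative rather than taken from any particular camera:

```python
def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Estimate depth in meters from stereo disparity.

    depth = focal_length * baseline / disparity.
    Larger disparity means the object is closer.
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 12 cm baseline, 20 px disparity
print(f"{stereo_depth(20.0, 700.0, 0.12):.2f} m")  # 4.20 m
```

Note how the precision degrades with distance: a one-pixel matching error matters far more at small disparities, which is one reason stereo depth is less trusted at long range than radar or LIDAR.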

Some developers are combining mono cameras with distance-measuring techniques, such as LIDAR or radar, and sensor fusion to predict traffic conditions accurately.

Cameras certainly offer an important role in AD. However, they will require help.

LIDAR
LIDAR is one of the prominent technologies enabling self-driving vehicles. It’s an imaging technology that has been used for geospatial sensing since the ’80s.

Two types of LIDAR sensors can be used for AD. One is the mechanically rotating LIDAR system mounted on a vehicle’s roof, but these systems are typically costly and sensitive to vibrations. Solid-state LIDAR, which requires no rotation, is the other option and the preferred choice for self-driving cars.

A LIDAR sensor is an active sensor that works on the time-of-flight principle: it emits thousands of infrared laser pulses toward its surroundings and detects the reflections using a photodetector. The system measures the time between the emission of each pulse and its detection.

Because the laser pulses travel at the speed of light, this time directly yields the distance to the reflecting surface. The reflected pulses are recorded as a point cloud (i.e., a set of points in space representing the 3D surroundings).
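The time-of-flight arithmetic can be sketched in a few lines; the 400 ns round trip and the beam angles below are illustrative values, not specifications of any real sensor:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_distance(round_trip_s: float) -> float:
    """One-way distance from a pulse's round-trip time: the beam
    travels out and back, so halve the total path length."""
    return C * round_trip_s / 2.0

def to_point(distance_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LIDAR return (range plus beam angles) into a
    Cartesian point; many such returns together form the point cloud."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (distance_m * math.cos(el) * math.cos(az),
            distance_m * math.cos(el) * math.sin(az),
            distance_m * math.sin(el))

# A reflection detected 400 ns after emission is about 60 m away:
print(f"{lidar_distance(400e-9):.2f} m")  # 59.96 m
```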

Such LIDAR systems are highly accurate and can detect extremely small objects. However, LIDAR is unreliable in poor weather, as rain, fog, or snow can scatter the laser pulses. Another drawback is the cost, which runs into the thousands of dollars.

But LIDAR still holds promise for AD as new developments are tried and tested.

Radar
Radar sensors are already used in many vehicles for adaptive cruise control, driver assistance, collision avoidance, and automatic braking. Typically, 77 GHz radar is used for long-range detection and 24 GHz radar for short-range detection. Short-range (24 GHz) radar reaches up to 30 meters and is cost-effective for collision avoidance and parking assistance. Long-range (77 GHz) radar reaches up to 250 meters and is used for object detection, adaptive cruise control, and assisted braking.
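Automotive radars are commonly FMCW (frequency-modulated continuous wave) designs: range comes from the beat frequency between the transmitted chirp and its delayed echo. A hedged sketch with illustrative chirp parameters (not taken from any specific sensor):

```python
C = 3.0e8  # approximate speed of light, m/s

def fmcw_range(beat_hz: float, chirp_bw_hz: float, chirp_time_s: float) -> float:
    """Target range from an FMCW radar's beat frequency.

    slope = bandwidth / chirp duration; the echo's delay shifts its
    frequency by beat = slope * delay, and delay = 2 * range / c.
    """
    slope = chirp_bw_hz / chirp_time_s
    return C * beat_hz / (2.0 * slope)

# Illustrative 77 GHz chirp: 300 MHz sweep over 40 us, 1 MHz beat tone
print(f"{fmcw_range(1e6, 300e6, 40e-6):.1f} m")  # 20.0 m
```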

Radar is excellent at detecting metal objects. It can be used with cameras to accurately monitor the movement of surrounding vehicles and detect potential obstructions. 

Radar has limited capability for self-driving because it’s unable to classify objects. The radar data can detect objects but cannot recognize them. At best, low-resolution radar can support mono cameras and LIDAR or stereo cameras to deal with low-visibility situations.

Sonar
Sonar technologies are also being tested for AD. Passive sonar listens for sounds emitted by surrounding objects and estimates their distance. Active sonar emits sound waves and detects the echoes to estimate the distance of nearby objects based on the time-of-flight principle.

Sonar can operate in low visibility, but it has more drawbacks than advantages for self-driving vehicles. The relatively slow speed of sound limits sonar’s real-time usefulness for safe AD. Also, sonar can give false positives. Lastly, it can detect large objects at short range but cannot recognize or classify them. Sonar is only useful for collision avoidance in unexpected conditions.
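The speed-of-sound limitation is easy to quantify: an echo from 30 m away takes roughly 175 ms to return, during which a vehicle at highway speed travels several meters. A quick comparison (sound speed assumed for air at about 20 °C):

```python
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C
SPEED_OF_LIGHT = 3.0e8   # m/s, approximate

def echo_delay_s(distance_m: float, wave_speed: float) -> float:
    """Round-trip time for a pulse to reach an object and return."""
    return 2.0 * distance_m / wave_speed

sonar_ms = echo_delay_s(30.0, SPEED_OF_SOUND) * 1e3
lidar_ns = echo_delay_s(30.0, SPEED_OF_LIGHT) * 1e9
print(f"sonar: {sonar_ms:.0f} ms, lidar: {lidar_ns:.0f} ns")  # sonar: 175 ms, lidar: 200 ns
# At 30 m/s (~108 km/h) the car moves over 5 m before the sonar echo returns.
```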

Inertial sensors
Inertial sensors, such as accelerometers and gyroscopes, are highly useful in enabling self-driving. The inertial sensors can be used to track the movement and orientation of a vehicle. They can be used to signal a vehicle to stabilize on rough roads or to take action to avoid a potential accident.
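Between absolute position fixes, an IMU supports dead reckoning: integrating acceleration once for velocity and again for position. A minimal one-axis sketch (real systems work in 3D and must also correct for gravity, bias, and drift):

```python
def dead_reckon(accel_samples, dt):
    """Integrate longitudinal accelerometer samples (m/s^2) twice.

    Returns (velocity, displacement). Small sensor errors accumulate
    with every step, which is why IMU data is fused with GPS and other
    sensors rather than trusted on its own.
    """
    v = x = 0.0
    for a in accel_samples:
        v += a * dt  # acceleration -> velocity
        x += v * dt  # velocity -> position
    return v, x

# One second of constant 2 m/s^2 acceleration sampled at 100 Hz:
v, x = dead_reckon([2.0] * 100, 0.01)
print(f"v = {v:.2f} m/s, x = {x:.2f} m")  # v = 2.00 m/s, x = 1.01 m
```

The displacement comes out slightly above the analytic 1.0 m because simple rectangular integration overestimates at this sample rate; an onboard filter would use finer timesteps or better integrators.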

GPS
A self-correcting GPS is one vital requirement for self-driving. Using a satellite-based trilateration technique (measuring distances to several satellites), GPS lets a vehicle precisely locate itself in three-dimensional space.
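The positioning principle reduces to solving for a location from distances to known reference points. A simplified 2D sketch (real GPS works in 3D, needs at least four satellites, and must also solve for the receiver’s clock bias):

```python
import math

def trilaterate_2d(p1, p2, p3, r1, r2, r3):
    """Position from distances to three known 2D anchor points.

    Subtracting the circle equations pairwise cancels the quadratic
    terms, leaving a 2x2 linear system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Anchors at (0,0), (10,0), (0,10); true position (3, 4):
r2 = math.dist((3, 4), (10, 0))
r3 = math.dist((3, 4), (0, 10))
x, y = trilaterate_2d((0, 0), (10, 0), (0, 10), 5.0, r2, r3)
print(round(x, 6), round(y, 6))  # 3.0 4.0
```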

Sometimes GPS signals are unavailable or interfered with due to obstacles or spoofing. In such cases, self-driving vehicles must rely on a local cellular network and data from inertial sensors to accurately track the car’s position.

Conclusion
Self-driving vehicles typically use several heterogeneous sensors. One advantage of using many sensors is redundancy: if one sensor fails, another can compensate for it. A sensor-fusion technique that combines data from the different sensors to interpret the surroundings will be necessary for fully autonomous vehicles.
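A core sensor-fusion building block is combining two noisy estimates of the same quantity, weighted by confidence. A minimal sketch (the radar and camera numbers are illustrative; production systems use full Kalman or particle filters):

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted fusion of two independent estimates.

    This is the scalar core of a Kalman filter update: the more
    confident (lower-variance) estimate gets the larger weight, and
    the fused variance is smaller than either input's.
    """
    total = var_a + var_b
    fused = (var_b * est_a + var_a * est_b) / total
    fused_var = var_a * var_b / total
    return fused, fused_var

# Radar says 40.0 m (variance 0.25); camera says 41.0 m (variance 1.0):
d, v = fuse(40.0, 0.25, 41.0, 1.0)
print(f"{d:.2f} m (variance {v:.2f})")  # 40.20 m (variance 0.20)
```

The fused estimate sits closer to the radar reading because radar ranging is more precise, yet the camera still tightens the overall uncertainty.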

Currently, multiple approaches are being tested in the development of AD. One relies on stereo cameras to fully enable self-driving. Another approach uses mono cameras to provide a 360-degree vision, incorporating LIDAR or radar technology to sense distance. A third approach uses stereo cameras with radar sensors.

Cameras with sensors will likely be required for AD to effectively classify and recognize objects. Radar and LIDAR, combined through sensor fusion, can render a weather-resistant self-driving solution. They also add a 3D element, ensuring a better understanding of the driving environment.

Sonar or ultrasonic sensors will also play a key role, as they’re weather-resistant and reasonably cost-effective, providing effective collision avoidance and emergency handling. Self-driving cars will ultimately rely on some combination of all these technologies.

 
