Self-driving cars used to be a science fiction fantasy. Now they’re out on real roads… if we’re lucky enough to catch one. An autonomous car, also known as a driverless car, is a vehicle that can operate on its own without the need for human intervention. Autonomous cars are equipped with cameras and sensors that help them navigate their surroundings.
The benefits of autonomous cars are pretty evident. There will be fewer accidents and lives lost to human error (cars don’t get drowsy or have bad days!). Roads will be less congested thanks to intelligent route planning, and more logistics and utility operations could be shifted to off-peak hours such as at night. However, significant safety concerns are still holding this technology back, and they will need to be addressed before autonomous vehicles can deliver on these promises.
Self-driving car being tested in a tight public setting
That’s all well and good, but first, how do these AVs (autonomous vehicles) work? Using ourselves as a reference, how do we navigate our environment? We need our body and legs to move, our eyes to spot obstacles in our path, and our brain to decide what to do next. If the body is the vehicle itself, then the eyes of the vehicle are its cameras and sensors.
Eyes of the car
The common “eyes of the vehicle” include sensors such as cameras, radar and LiDAR. Cameras are usually mounted on the front, back and sides of the vehicle, giving it a 360-degree view of its surroundings. Some vehicles trade a full 360-degree view for longer-range vision, while others incorporate fish-eye lenses for a wider field of view, especially useful for complex manoeuvres like parking. Cameras don’t deteriorate as quickly as human eyes and can be far more capable, but limitations still exist. Could they still distinguish traffic lights and signs in heavy rain?
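As a rough illustration of how a camera frame might be processed, here is a minimal Python/OpenCV sketch that flags bright red regions that could belong to a red traffic light. The colour thresholds and file name are purely illustrative assumptions; real perception systems use trained neural networks rather than hand-tuned colour rules.

```python
import cv2

def find_red_light_regions(frame_bgr):
    """Toy example: flag bright red blobs that *might* be a red light.
    Real systems use trained detectors, not hand-tuned colour thresholds."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis in HSV, so combine two ranges.
    lower = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    # Keep only reasonably sized blobs to ignore pixel noise.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 50]

# Usage (hypothetical image file):
# frame = cv2.imread("dashcam_frame.jpg")
# print(find_red_light_regions(frame))
```

Even this toy example hints at the rain problem: when droplets blur the lens, simple colour cues break down, which is one reason cameras are paired with other sensors.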
Radar sensors
Radar sensors can assist in those scenarios. Radar uses radio waves to detect the position and speed of surrounding objects in real time, and these days it is often the first choice for collision avoidance. A radar unit contains its own transmitter and receiver: it sends out radio waves, which bounce off surrounding objects, and the returning waves reveal an object’s distance, size and speed. Like cameras, radar sensors are placed around the car to cover every angle. They can measure an object’s speed and distance, but they can’t distinguish between different types of vehicles.
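The distance and speed a radar reports come from simple physics: distance from the round-trip travel time of the radio wave, and relative speed from the Doppler shift of the returned signal. Here is a short Python sketch of both formulas; the 77 GHz carrier and the echo timings are made-up example values, not measurements from any particular sensor.

```python
C = 299_792_458.0  # speed of light in m/s; radio waves travel at c

def range_from_round_trip(delay_s: float) -> float:
    """Distance to the object: the wave travels out and back, so halve the path."""
    return C * delay_s / 2.0

def relative_speed_from_doppler(f_transmit_hz: float, f_shift_hz: float) -> float:
    """Approximate relative speed from the Doppler frequency shift (valid for v << c).
    A positive result means the object is closing in on the radar."""
    return f_shift_hz * C / (2.0 * f_transmit_hz)

# Example with made-up numbers: an echo arriving 0.5 microseconds later,
# from a 77 GHz automotive radar seeing a 1 kHz Doppler shift.
print(range_from_round_trip(0.5e-6))               # ~75 m away
print(relative_speed_from_doppler(77e9, 1_000.0))  # ~1.95 m/s closing speed
```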
Light Detection and Ranging
LiDAR sensors have also become common in autonomous vehicles to supplement radar. LiDAR fires out rapid pulses of laser light and times their reflections to measure distances far beyond what humans can perceive. Because light has a much shorter wavelength than the radio waves used by radar, LiDAR is more accurate and precise at detecting smaller objects and can build detailed 3D projections of the scene. Together, LiDAR, radar and cameras provide a vast amount of data for the vehicle to “look” around and navigate its surroundings.
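To see how those laser returns become a 3D “point cloud” like the Velodyne one pictured below, here is a minimal Python sketch that converts simulated LiDAR measurements (range plus the beam’s horizontal and vertical angles) into x, y, z coordinates. The numbers are invented for illustration; a real sensor produces millions of such returns per second.

```python
import numpy as np

def lidar_returns_to_points(ranges_m, azimuth_rad, elevation_rad):
    """Convert spherical LiDAR measurements into Cartesian (x, y, z) points.
    ranges_m: distance of each laser return in metres,
    azimuth_rad: horizontal beam angle, elevation_rad: vertical beam angle."""
    r = np.asarray(ranges_m)
    az = np.asarray(azimuth_rad)
    el = np.asarray(elevation_rad)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=-1)

# Example: three simulated returns sweeping across the front of the car.
points = lidar_returns_to_points(
    ranges_m=[12.0, 12.1, 30.5],
    azimuth_rad=np.radians([-5.0, 0.0, 5.0]),
    elevation_rad=np.radians([1.0, 1.0, 0.0]),
)
print(points)  # each row is one (x, y, z) point of the cloud
```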
Point Cloud created by Velodyne Lidar’s Alpha Prime sensor
Pedestrian detection system made by the Statistical Visual Computing Lab at UC San Diego.
What good are eyes without a brain to understand what you’re looking at?
To steer around obstacles, pinpoint its position in the world, plan its trajectory, and control its steering angle and acceleration, a self-driving car relies on deep learning with neural networks. Neural networks are inspired by the human brain, with the basic units being neurons. Deep neural networks (DNNs) learn from experience, improving as they see more data. For example, if a DNN is shown many images of stop signs, it will eventually learn to identify them without anyone hand-coding the rules! A self-driving car may need many DNNs, since it has multiple tasks to handle: spotting pedestrians, traffic signs, traffic lights, obstacles, and much more.
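For a taste of what one such network might look like, below is a minimal PyTorch sketch of a tiny convolutional classifier that could be trained to tell “stop sign” from “not a stop sign”. The layer sizes, the 64×64 input and the random stand-in data are illustrative assumptions only; production perception networks are far larger and trained on enormous labelled datasets.

```python
import torch
import torch.nn as nn

class TinySignClassifier(nn.Module):
    """Toy CNN: input is a 3x64x64 RGB crop, output is 2 scores
    (stop sign vs. not a stop sign). Purely illustrative sizes."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One training step on a fake batch, just to show the learning loop.
model = TinySignClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)   # stand-in for camera crops
labels = torch.randint(0, 2, (8,))   # stand-in for labelled examples
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(float(loss))  # shrinks as the network sees more labelled examples
```

A real self-driving stack would run several networks like this (only much bigger) side by side, one per perception task.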
Conclusion
Well, that’s a beginner’s overview of how a self-driving car works! There is still a lot to improve on, but the benefits are clear and the push for commercialization is strong. There is a fear that jobs will be replaced along the way, including Grab drivers, food delivery and online shopping deliveries, to name a few. But an argument can be made that new technology creates jobs as well. One day in the far future, car accidents may no longer belong in the dictionary. Until then, we’ll have to stick with human drivers.
Learn more about Data Science & Artificial Intelligence and how to get started on the journey!
Aventis Learning Group has great courses to start you off with knowledge in Artificial Intelligence! To begin with, we have a Graduate Diploma in Data Science & Artificial Intelligence that covers some of the machine learning models, giving you a personal taste of prediction with Python! This can eventually set you on your path into Deep Learning and Neural Networks!
After lots of practice, you will eventually master Python and machine learning! This may lead you to our Master of Science in Artificial Intelligence awarded by London Metropolitan University (UK)! The future is here and you have a part to play in it!
For more information about our AI-related courses, click here!
Graduate Diploma in Data Science & Artificial Intelligence (aventis.edu.sg)
Master of Science in Artificial Intelligence (AI) (aventis.edu.sg)