When we think about driverless cars, the first question that comes to mind is: why do we need them? Imagine someone who does not have the ability to drive, such as the elderly, underage children, or blind people. How would this technology help them? Everyone could travel safely and reach their destination hassle-free.
Driverless cars are similar to any other cars, but they can steer themselves, detect traffic, stop and go accordingly, and accelerate to a safe speed. The time normally spent driving could instead be used for other valuable work on the go.
The technology behind Tesla’s and Google’s driverless cars is a bit different. Google has been working on its project since 2009. Google tested its first self-driving technology with a Toyota Prius on freeways in California. Later it ran many more freeway tests and shifted its test sites to traffic-prone areas with more complex environments. Google has developed its own self-driving cars and has even applied this technology to other vehicles.
Technological differences between Tesla’s and Google’s cars
While Google uses LIDAR technology, Tesla opts for a combination of passive optics and RADAR. Self-driving cars must be able to perceive their surroundings, much as humans do. Google’s LIDAR (light detection and ranging) is a remote sensing technology that uses laser pulses to navigate safely. The time it takes for a reflected laser beam to return determines the distance to nearby cars and objects, at ranges of up to about 60 meters. Google integrates Google Maps, artificial intelligence, and this hardware to navigate safely in traffic. Some experts consider LIDAR the most accurate option, but it comes with a hefty price of around $80,000 for a single sensor. Google Maps interacts with GPS and acts like a database, supplying traffic reports, upcoming intersections, nearby collisions, and directions.
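The time-of-flight idea behind LIDAR can be sketched in a few lines: the one-way distance is half the round-trip travel time multiplied by the speed of light. This is an illustrative calculation only, with assumed values, not code from any actual LIDAR unit.

```python
# Minimal sketch of the time-of-flight calculation a LIDAR sensor performs.
# All names and numbers are illustrative assumptions.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to an object from a laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half of (speed of light * elapsed time).
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 400 nanoseconds implies an object roughly
# 60 m away -- about the sensing range mentioned above.
print(round(distance_from_echo(400e-9), 1))  # -> 60.0
```

The key detail is the division by two: the measured time covers the trip out and back, so forgetting it would double every range estimate.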
Tesla launched its Autopilot software in 2014. The system provides a degree of autonomy, and Elon Musk has promised to deliver a fully autonomous car by the start of 2018. Tesla combines a forward-looking camera, a RADAR system, and 12 ultrasonic sensors that together provide 360-degree coverage to enable its semi-autonomous Autopilot system. The sensors estimate the possibility of collisions, while the cameras detect road features and pedestrian movement.
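One simple way a sensor reading can be turned into a collision estimate is time-to-collision: the gap to an obstacle divided by the closing speed. The sketch below is a hypothetical illustration of that idea under assumed values, not Tesla’s actual algorithm.

```python
# Illustrative time-to-collision (TTC) check from a radar-style reading.
# Function names and the 2-second threshold are assumptions for this sketch.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed.

    Returns float('inf') when the gap is constant or opening.
    """
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def should_warn(gap_m: float, closing_speed_mps: float,
                threshold_s: float = 2.0) -> bool:
    # Warn when impact is predicted within the threshold.
    return time_to_collision(gap_m, closing_speed_mps) < threshold_s

print(should_warn(50.0, 10.0))  # 5.0 s to impact -> False
print(should_warn(15.0, 10.0))  # 1.5 s to impact -> True
```

A real system fuses many such readings over time and accounts for acceleration, but the basic distance-over-closing-speed ratio is the starting point.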
There have been a few cases where safety was compromised. In some cases Google claimed responsibility; in others it did not. Even Tesla’s self-driving cars are not completely safe: in early 2016, a few accidents created a buzz in the media.
Having seen the positive sides of driverless cars, let’s talk about their limitations. The systems show drawbacks when weather conditions are unfavorable, such as during snowfall or rain. These cars steer along pre-programmed routes, so they cannot adapt if there is a temporary change, such as a malfunctioning traffic light. Hand signals given by traffic police are not recognized. They can even swerve unnecessarily when small debris or quite harmless objects appear in their path.
A poll conducted by a group of researchers from the University of Michigan shows that Americans are not yet ready to accept this concept. So both Tesla and Google will have to put in more effort to convince customers to put their lives in the hands of technology, and to fix the gaps that have created doubts in people’s minds.
By – Rajan