Andrew Ng and Drive.ai announce the launch of their first driverless car.
It will undergo a six-month testing period with a human safety driver on board until the system can operate fully on its own.
Their long-term plan is to eliminate the need for human drivers entirely.
The idea of driverless cars has been around in the world of machine learning for quite some time now. Cars that need no human at the controls have always been a fascinating application of AI, but one that has traditionally been hard to achieve because of the sheer number of parameters that need to be taken into consideration.
Andrew Ng, one of the most popular and important figures in the machine learning, deep learning and AI community, has recently announced that, with his help, Drive.ai, a company that specialises in driverless cars, will release its first self-driving car in Texas, USA. If you know about Andrew Ng, you also know how much this concept means to him. You can learn more about the company here.
Drive.ai was founded in 2015 by some of Ng's graduate students, who used machine learning and deep learning algorithms to train the cars to learn and correct themselves.
The car will not be available for the public to purchase, at least initially. Instead, it will undergo a six-month testing period during which it will pick up and drop off passengers over short distances, the kind of trips human drivers often consider not worth the effort. Customers can book a ride via Drive.ai's mobile application. During the testing period, however, a human driver will be present in the car in case something goes wrong. Once the car can fully handle itself, the service will open up to the general public as well. The long-term goal is to make this service available throughout the city via any route and eliminate the concept of drivers completely!
In related news, another company from the UK, Wayve, recently designed its own self-driving car, which can learn to drive within 15-20 minutes. You can read more about that here.
How does this work?
The team developed a full software stack for self-learning, with in-house perception, motion planning, mapping, localisation and fleet-management software, a mobile app, communications, and more. Building everything in-house let the team manage all the dependencies between the system's components.
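To make the idea of a stack of cooperating modules concrete, here is a minimal sketch of how a perceive-localise-plan loop might be wired together. All class names, method signatures, and decision rules below are illustrative assumptions, not Drive.ai's actual software.

```python
# Hypothetical sketch of a self-driving software loop; every name and rule
# here is an assumption for illustration, not Drive.ai's real API.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    heading: float


class Perception:
    def detect(self, camera_frame: str) -> list[str]:
        # A real stack would run object detection on sensor data;
        # here we just scan a text description of the scene.
        return ["pedestrian"] if "person" in camera_frame else []


class Localisation:
    def locate(self, gps_reading: tuple[float, float]) -> Pose:
        # A real system fuses GPS, lidar maps, and odometry.
        return Pose(x=gps_reading[0], y=gps_reading[1], heading=0.0)


class MotionPlanner:
    def plan(self, pose: Pose, obstacles: list[str]) -> str:
        # Stop for any detected obstacle; otherwise keep cruising.
        return "stop" if obstacles else "cruise"


def drive_tick(camera_frame: str, gps_reading: tuple[float, float]) -> str:
    """One cycle of the perceive -> localise -> plan loop."""
    obstacles = Perception().detect(camera_frame)
    pose = Localisation().locate(gps_reading)
    return MotionPlanner().plan(pose, obstacles)


print(drive_tick("empty road", (3.0, 4.0)))    # cruise
print(drive_tick("person ahead", (3.0, 4.0)))  # stop
```

The point of owning the whole stack, as the article notes, is that each module's interface (here, `detect`, `locate`, `plan`) can be changed without waiting on a third-party vendor.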
Computer vision is not yet advanced enough to reliably comprehend hand gestures, for example when a traffic officer asks a car to stop. To overcome this, the developers had to build a system that can interpret the hand gestures of a construction worker waving for a car to stop or proceed. The company claims this is a one-of-a-kind system that no one else has built so far.
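To illustrate what "interpreting a hand gesture" might reduce to once a vision model has extracted body keypoints, here is a toy rule-based classifier. The keypoint format and the "wrist above head means stop" rule are simplifying assumptions for illustration; a production system would use trained models on real video.

```python
# Toy gesture interpreter over pose keypoints. The keypoint layout and the
# thresholds are made-up assumptions, not a real self-driving component.

def interpret_gesture(keypoints: dict[str, tuple[float, float]]) -> str:
    """keypoints maps a body part to (x, y), with y increasing upward."""
    wrist_y = keypoints["right_wrist"][1]
    head_y = keypoints["head"][1]
    shoulder_y = keypoints["right_shoulder"][1]
    if wrist_y > head_y:
        return "stop"        # palm raised above the head: halt the car
    if wrist_y > shoulder_y:
        return "proceed"     # arm waved at shoulder height: wave through
    return "no_signal"       # arm at rest: no instruction


worker = {"head": (0.0, 1.7),
          "right_shoulder": (0.2, 1.5),
          "right_wrist": (0.3, 1.9)}
print(interpret_gesture(worker))  # stop
```

The hard part in practice is not this final decision rule but extracting reliable keypoints from messy roadside video, which is exactly why the article calls gesture comprehension an unsolved problem for computer vision.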
It also uses external panels to communicate with pedestrians, telling them that it is safe to cross.
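A panel like that amounts to mapping the vehicle's current state to a human-readable message. The state names and messages below are invented for illustration; the article does not say what Drive.ai's panels actually display.

```python
# Hypothetical external-panel logic; states and messages are assumptions.

def panel_message(vehicle_state: str) -> str:
    messages = {
        "yielding": "Waiting for you to cross",
        "moving": "Please do not cross",
        "parked": "Vehicle is parked",
    }
    # Default to a cautious message for any unrecognised state.
    return messages.get(vehicle_state, "Caution")


print(panel_message("yielding"))  # Waiting for you to cross
```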
The company also released a video to promote its launch. You can check it out here:
So let’s summarise:
It’s fascinating to see how far we’ve come in the field of machine learning and AI, especially when a guru like Andrew Ng himself is involved in a project. You know it’s going to be something out of this world.
Personally, we can’t wait to see one of these bad boys in action ourselves.