As the commercial production of self-driving vehicles edges closer, the race to develop fully autonomous control systems that can handle driving in any environment has picked up. Big industry names are competing alongside a number of smaller players, all pushing for a formula that would turn ordinary vehicles into autonomous self-driving cars. Almotive is one such company. We recently talked to Pocsveiler Lorant, a senior member of the Almotive team, about how self-driving vehicles view, interpret, and react to the environment around them.
To see, you have to open your eyes
Like a human driver, an autonomous vehicle has to see what is happening around it in order to maneuver effectively. According to the Almotive team, the first autonomous vehicles have to be vision-based, since today's roads were designed with human drivers in mind.
So how can this be achieved? According to Lorant, a camera-first approach is the best way to do it. Current road networks are designed for humans and are therefore full of visual cues, such as traffic signs and lane markings, that an autonomous vehicle can pick up and interpret.
For this to work, cameras have to be mounted on the roof, tail, and nose of the vehicle. Radar and other sensors are incorporated as a backup, helping the system build a true picture of the surrounding environment. Most players in the industry prefer expensive lidar sensors, but Almotive has taken a different approach: it uses cost-effective sensors that meet the recommended resolution. This is an advantage because it makes it easier for manufacturers to match their hardware with the software.
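To get a feel for how camera readings and backup radar readings might be combined, here is a minimal sketch. It assumes a simple confidence-weighted average of two distance estimates; the names and weights are illustrative and are not drawn from Almotive's actual system.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    distance_m: float   # estimated distance to an object, in meters
    confidence: float   # 0.0-1.0, how much we trust this reading

def fuse(camera: Measurement, radar: Measurement) -> float:
    """Confidence-weighted average of two distance estimates."""
    total = camera.confidence + radar.confidence
    return (camera.distance_m * camera.confidence
            + radar.distance_m * radar.confidence) / total

# Camera says 20 m (less confident, say at night); radar says 22 m.
fused = fuse(Measurement(20.0, 0.4), Measurement(22.0, 0.6))
```

In a real system the fusion would be far more elaborate (tracking, outlier rejection, calibration), but the principle of letting a more trusted sensor pull the estimate toward its reading is the same.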
Interpreting what it's seeing
The autonomous vehicle also has to correctly interpret what it is seeing. The Almotive team has equipped the vehicle with a recognition engine that breaks down data from the sensors and passes it through specialized segmentation software, which then identifies the various objects in the data feed. The Almotive system can identify 100 different classes of objects, though as of now it only makes use of 25.
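Segmentation software of this kind typically assigns every pixel of a camera frame to one of the known object classes. The sketch below shows the idea with a tiny made-up frame; the class names are illustrative stand-ins, not Almotive's actual label set.

```python
from collections import Counter

# Hypothetical subset of the active object classes.
CLASS_NAMES = {0: "road", 1: "lane_marking", 2: "vehicle",
               3: "pedestrian", 4: "traffic_sign"}

def summarize_segmentation(class_map):
    """Count how many pixels were assigned to each object class."""
    counts = Counter(cls for row in class_map for cls in row)
    return {CLASS_NAMES[c]: n for c, n in counts.items()}

# A tiny 3x4 "frame" of per-pixel class IDs, as a segmentation
# engine might emit after processing one camera image.
frame = [[0, 0, 2, 2],
         [0, 1, 2, 2],
         [0, 1, 0, 4]]
summary = summarize_segmentation(frame)
```

A production segmentation network would of course run a trained model over real images, but its output can still be reduced to exactly this kind of per-pixel class map.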
According to the Almotive team, the software can discern every object captured in its frame in terms of size, distance, and angle. This is then compared with data from a GPS location engine to create a clearer picture of the surroundings.
"The GPS location engine is specifically meant to determine where the vehicle is," says Pocsveiler. "GPS location services are the first step to achieving this, though they do not give pinpoint accuracy in less mapped regions."
The Almotive team is also developing a mapping system that highlights landmarks and traffic signs to improve accuracy. This vision-first mapping system will carry navigation data that helps the autonomous vehicle confirm what the car is actually seeing.
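One way to picture this cross-check: if the camera recognizes a mapped landmark, the system can verify that the landmark sits where the map says it should, given the current GPS estimate. The coordinates, landmark IDs, and tolerance below are all hypothetical.

```python
import math

# Hypothetical landmark map: id -> (x, y) in local map coordinates (meters).
LANDMARKS = {"stop_sign_17": (105.0, 40.0),
             "speed_limit_3": (250.0, 12.0)}

def confirms_position(seen_id, estimated_xy, tolerance_m=15.0):
    """True if a landmark the camera recognized lies close enough to
    where the map says it should be, given the GPS position estimate."""
    lx, ly = LANDMARKS[seen_id]
    ex, ey = estimated_xy
    return math.hypot(lx - ex, ly - ey) <= tolerance_m

ok = confirms_position("stop_sign_17", (100.0, 35.0))        # ~7.1 m off
bad = confirms_position("speed_limit_3", (100.0, 35.0))      # far too off
```

When the check fails, a real localization system would treat it as evidence that either the GPS fix or the visual recognition is wrong and weight its position estimate accordingly.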
Moving through crowds
Once the vehicle has identified everything around it, the next step is to chart a course. The motion engine performs this task: it tracks the motion of objects, based on where they have been and where they currently are, and uses this data to calculate where they will head next. The system functions much like a human driver in traffic and helps the car deal with pedestrians and other vehicles. The direction the vehicle will take based on these calculations is displayed on the in-car screen as an arrow.
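The simplest version of "where have they been, where are they now, where will they go" is constant-velocity extrapolation, sketched below. This is a toy stand-in for the motion engine, assuming one time step of uniform motion; real predictors model acceleration, intent, and uncertainty.

```python
def predict_next(track, dt=1.0):
    """Extrapolate an object's next position from its last two observed
    positions, assuming roughly constant velocity over one time step."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * dt, y1 + vy * dt)

# A pedestrian observed at two successive positions keeps walking forward.
path = [(0.0, 0.0), (1.0, 0.5)]
nxt = predict_next(path)   # continues along the same heading
```

The planner can then chart a course that stays clear of each object's predicted position rather than only its current one.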
After processing all this information, the vehicle has to put its conclusions into action. The control engine fitted in the vehicle is tasked with this, automatically operating the gas pedal, brakes, and steering wheel to achieve the desired motion.
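At its core, a control engine turns "where we want to be" into pedal and wheel commands. Below is a deliberately simplified proportional-control sketch; the gains and function names are invented for illustration and bear no relation to Almotive's implementation.

```python
def control_step(current_speed, target_speed, heading_error_rad,
                 k_throttle=0.5, k_steer=1.2):
    """One tick of a toy control loop: proportional corrections for the
    gas/brake and the steering wheel (gains are made-up illustrations)."""
    accel = k_throttle * (target_speed - current_speed)  # >0 gas, <0 brake
    steer = k_steer * heading_error_rad                  # wheel angle command
    return accel, steer

# Vehicle is 2 m/s below target and pointed slightly left of the path.
accel, steer = control_step(current_speed=8.0, target_speed=10.0,
                            heading_error_rad=0.1)
# accel is positive (apply gas), steer is positive (turn toward the path)
```

Production controllers add integral and derivative terms, actuator limits, and comfort constraints, but the loop structure, measure error, command a correction, repeat, is the same.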
We were not able to see the vehicle operate by itself during our visit, as the company does not hold a driverless testing license in California. However, we did see a simulation, and it was quite remarkable: you would never have guessed the vehicle was actually driving itself.
According to Pocsveiler, the Almotive system will be ready for actual testing in a highway environment by the end of this year, and in a city environment by the end of next year.