Electronics and Semiconductors

Q&A: AImotive's Vision-first Approach Propels Its Self-driving Car Strategy

25 September 2017

AImotive has established it will work on a vision-first strategy when it comes to autonomous driving. Source: AImotive

Since the explosion in autonomous driving started, companies have been coming out of the woodwork to gain a toehold in a technology that isn’t even available on the market yet.

These companies run the gamut from automotive OEMs and automotive equipment vendors to chipmakers and software companies. AImotive is developing camera-first, artificial intelligence-based Level 5 (fully autonomous) self-driving technology that aims to mimic the visual capabilities of human drivers, and which the company hopes to scale to make autonomous driving a reality for car makers.

With its software stacks, including aiDrive, aiKit and aiWare, the company is accelerating training, testing and verification processes for future self-driving vehicles. Recently, the company formed an agreement with VeriSilicon to use its aiWare in a design for autonomous-driving test chips. The chips will be fabricated on GlobalFoundries' 22FDX semiconductor process.

AImotive’s CEO, Laszlo Kishonti, sat down to chat with Electronics360 about the company’s vision-first technology, the differences between Level 4 and Level 5 autonomy, its collaborative efforts with other vendors and future vehicle-as-a-service options.

1) Tell us a bit about how AImotive got started and why you changed the company name from AdasWorks.

AdasWorks was a spinoff of Kishonti Ltd., at that time a global leader in performance optimization services, which I founded. A couple of years ago, we partnered with a global chip provider to prove that embedded processors were capable of running the complex algorithms for future self-driving technologies. The demonstration was a success, and we realized a huge change was on the horizon for the automotive industry, one that happened to fit our core competencies perfectly.

We established a small group within Kishonti focusing on camera-based detection algorithms optimized for embedded processors, which a few years later led to the founding of AdasWorks in July 2015. AdasWorks operated as an independent company, focusing on automotive computer vision, artificial intelligence and navigation technologies. The startup’s aim was to revolutionize the automotive industry by developing cost-effective advanced driver assistance systems (ADAS) software for next-generation application processors. One year later we merged Kishonti Ltd. into AdasWorks, as the company was growing rapidly and there was an urgent need for skilled and experienced computer engineers.

It soon became evident there were no general requirements from the automotive industry for the performance and precision of recognition modules, nor for how other components (localization, motion planning, control) would use detection outputs. To accelerate our development process and fit the industry's needs, we started working on a prototype self-driving car, also focusing on development tools and processing hardware, so that we understood the ecosystem of self-driving and how individual components relied on each other. At the end of 2016 we realized we had addressed all the core components of self-driving development, going well beyond what you can define with ADAS functionalities. We had our full-stack, AI-based self-driving software and framework, and that is why the name AImotive reflected our technologies and vision better than AdasWorks did.

2) There are now numerous companies developing autonomous driving capabilities. What makes your technology different?

AImotive CEO Laszlo Kishonti.

Companies that are working on autonomous driving capabilities are usually distinguished by their sensor setup, algorithms and development tools. AImotive chose a vision-first approach, considering cameras as the primary sensors, as opposed to the widespread LIDAR-heavy approach used by most companies in the sector.

This is a game-changer in reaching scalable and affordable self-driving technology. We are deploying AI in the majority of components in our full-stack software suite, and developing a customized toolkit for accelerating training, testing and verification processes. We even have a team working on a hardware accelerator IP for neural network-based computation.

We have extensive knowledge in all aspects of self-driving software technology, algorithms and processing hardware. What really differentiates us is the way we handle this complete ecosystem, and how we are capable of filling in any gaps in order to accelerate development and enhance productization.

3) How much of a difference is there between Level 4 self-driving technology and what you are focused on in Level 5 autonomous technology?

Level 4, or highly automated, vehicles will be able to handle all safety-critical driving functions within a limited operational design domain (ODD). This means that if the vehicle faces a challenge during driving that it cannot solve, it will simply move to a safe state and ask for help. Level 5 technology, on the other hand, refers to “anytime, anywhere” driving, which theoretically means there is no situation the car would not be able to solve.
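
To make that distinction concrete, here is a minimal sketch (illustrative only, not AImotive's implementation): the Level 4 system checks its ODD and falls back to a safe, minimal-risk state when it cannot cope, while a Level 5 system has no such boundary.

```python
from dataclasses import dataclass
from enum import Enum, auto


class DrivingState(Enum):
    AUTONOMOUS = auto()
    MINIMAL_RISK = auto()  # pull over safely and ask for help


@dataclass
class Situation:
    inside_odd: bool            # e.g., mapped highway, daylight, dry road
    planner_has_solution: bool  # the motion planner found a safe trajectory


def level4_step(situation: Situation) -> DrivingState:
    """Level 4: drive autonomously only inside the limited ODD;
    otherwise move to a minimal-risk state and request help."""
    if situation.inside_odd and situation.planner_has_solution:
        return DrivingState.AUTONOMOUS
    return DrivingState.MINIMAL_RISK


def level5_step(situation: Situation) -> DrivingState:
    """Level 5: 'anytime, anywhere' driving -- no ODD boundary,
    so the system is expected to handle every situation itself."""
    return DrivingState.AUTONOMOUS
```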

The main difference lies in the decision-making process. It can only become robust if it is handled by a well-trained AI engine and if there is a possibility of training for the “corner cases” and uncommon situations. This is key to scalability, and this is how you avoid starting the whole development over if, say, you move from highways to cities. Our approach here is to integrate our virtual simulation environment, where our motion planning algorithms can be tested, trained and verified safely and rapidly.
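
The training side of that argument can be sketched in a few lines, with purely hypothetical scenario names: in a simulator, the frequency of rare “corner cases” becomes a parameter you control rather than an accident of what a test fleet happens to encounter on the road.

```python
import random

# Hypothetical catalogue of "corner cases" that rarely occur on real roads
# but can be generated on demand in a simulation environment.
CORNER_CASES = [
    "pedestrian_steps_out_between_parked_cars",
    "stalled_vehicle_after_blind_curve",
    "sudden_heavy_rain_at_night",
    "debris_on_highway_lane",
]


def sample_scenarios(n: int, corner_case_ratio: float = 0.3) -> list:
    """Mix nominal and corner-case scenarios for a training/verification run.

    Oversampling rare events is the point: in simulation their frequency is
    a tunable knob, which is what makes corner-case training practical.
    """
    scenarios = []
    for _ in range(n):
        if random.random() < corner_case_ratio:
            scenarios.append(random.choice(CORNER_CASES))
        else:
            scenarios.append("nominal_highway_driving")
    return scenarios


if __name__ == "__main__":
    print(sample_scenarios(10))
```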

4) AImotive claims its newly released artificial intelligence-optimized hardware IP, aiWare, is 20 times more efficient than other hardware acceleration solutions. Why is this important?

Neural network (NN) architectures are the key elements of AI-based recognition and decision-making. These architectures and the corresponding operations are evolving so rapidly that the hardware industry can barely keep pace in terms of optimization and hardware acceleration.

As a result, developers are left with more general-purpose computational units (e.g., GPUs) that require hundreds of watts of power per processing unit. This is unacceptable from the final product’s point of view, where every single watt counts. Having worked on the research side with the network architectures relevant to autonomous driving, we have an overview of what an NN accelerator IP should focus on. Optimizing hardware for the most important operations can bring the power consumption of NN processing below 100 watts, which is a favorable number.
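
As a rough back-of-envelope check with illustrative numbers (the 300 W figure below is an assumed stand-in for “hundreds of watts,” not a measurement), a 20x efficiency gain at the same throughput shrinks the power draw accordingly:

```python
# Illustrative arithmetic only -- the GPU figure is an assumption standing in
# for "hundreds of watts"; the 20x factor is aiWare's claimed efficiency gain.
gpu_power_w = 300.0       # assumed power of a general-purpose GPU setup
efficiency_gain = 20.0    # claimed advantage of the dedicated NN accelerator

# Same workload at 20x the efficiency needs roughly 1/20 of the power.
accelerator_power_w = gpu_power_w / efficiency_gain
print(f"Estimated accelerator power: {accelerator_power_w:.0f} W")  # ~15 W, well under 100 W
```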

5) While aiWare is available now, are we going to be seeing any other technologies coming from the company this year? What about next year?

Evaluation of aiWare through our field-programmable gate array (FPGA) evaluation kit is available now, and the first batch of our test chips will be introduced in Q1 2018, in collaboration with our partners VeriSilicon Holdings Co. and GlobalFoundries.

Currently, we are focusing on our full-stack software suite, aiDrive, conducting extensive public road testing in Hungary and California for our highway autopilot application. In addition, aiSim, our simulation environment, will debut as part of our product portfolio.

6) In your company’s opinion, how long is it going to take to get Level 5 autonomous technology installed in vehicles available to consumers?

Camera sensors placed on the outside of the car for testing autonomous driving. Source: AImotive

Public and industry opinion on this question has changed a lot in the past two years, ranging from very optimistic to very pessimistic forecasts. The technology is evolving at a much faster pace than we expected some years ago, but new challenges arise every day as well.

Our opinion is that the technology, in terms of individual prototype functionalities, will be ready by 2019. When it will be productized and introduced into vehicles largely depends on future integrators (most likely OEMs), legal issues and verification processes, which could delay deployment by anywhere from five to 20 years.

7) Are you seeing more support develop for the Neural Network Exchange Format (NNEF), and how is AImotive helping to enable this standard?

AImotive originally initiated the Khronos working group that is creating the new NN data format standard, NNEF, and we actively contribute to the specification as lead editors.

NNEF is designed to simplify the process of creating a network with one tool and running that trained network on other toolkits or inference engines. This could reduce deployment friction and encourage a richer mix of cross-platform deep learning tools, engines and applications. We are receiving a great deal of feedback and support from our partners within the working group, and much interest from future integrators.
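
The intended workflow looks roughly like the sketch below; the function names are hypothetical placeholders for whatever exporter and importer a given framework or inference engine provides, not the actual NNEF tooling API.

```python
# Conceptual sketch of the interchange NNEF aims to enable. The functions
# here are hypothetical placeholders, not the real NNEF tooling interface.

def export_to_nnef(trained_model: dict, path: str) -> None:
    """Training-framework side: write the network's structure and weights
    into a vendor-neutral NNEF container."""
    print(f"exporting {trained_model['name']} to {path}")


def load_nnef(path: str) -> dict:
    """Inference side: parse the NNEF description into the target engine's
    own internal graph representation."""
    print(f"loading graph from {path}")
    return {"source": path}


def compile_for_target(graph: dict, target: str) -> None:
    """Target-specific step: map the imported graph onto a GPU, DSP or
    dedicated NN accelerator."""
    print(f"compiling {graph['source']} for {target}")


# Train once, then deploy on any NNEF-aware inference engine.
model = {"name": "lane_detector"}
export_to_nnef(model, "lane_detector.nnef")
compile_for_target(load_nnef("lane_detector.nnef"), target="embedded_nn_accelerator")
```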

We are working towards a provisional release, which we would like to get out before the end of this year, seeking public feedback before finalization, and we are very optimistic about the acceptance of the standard.

8) How does your collaboration compare to, say, the collaboration that Intel, BMW and Mobileye have established?

In our collaborations, we strive to provide a high degree of transparency into our software stack and modules. aiDrive is built up from four software engines — recognition, location, motion and control — each of which contains modules that are interchangeable and can be individually tested or modified according to our partners’ requirements.
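
A hypothetical sketch of such a pipeline is shown below; the interfaces and names are illustrative, not AImotive's actual module boundaries, but they show how each of the four engines can sit behind a small, swappable interface.

```python
from typing import Protocol


class RecognitionEngine(Protocol):
    def detect(self, camera_frames: list) -> dict: ...


class LocationEngine(Protocol):
    def localize(self, detections: dict) -> dict: ...


class MotionEngine(Protocol):
    def plan(self, pose: dict) -> list: ...


class ControlEngine(Protocol):
    def actuate(self, trajectory: list) -> dict: ...


class DrivingStack:
    """Hypothetical four-engine pipeline in the spirit of the description
    above: each engine is an interchangeable module behind a small interface,
    so a single stage can be swapped, tested or modified in isolation."""

    def __init__(self, recognition: RecognitionEngine, location: LocationEngine,
                 motion: MotionEngine, control: ControlEngine):
        self.recognition = recognition
        self.location = location
        self.motion = motion
        self.control = control

    def step(self, camera_frames: list) -> dict:
        detections = self.recognition.detect(camera_frames)   # recognition engine
        pose = self.location.localize(detections)             # location engine
        trajectory = self.motion.plan(pose)                    # motion engine
        return self.control.actuate(trajectory)                # control engine
```

Swapping in, say, a partner's own recognition module then only requires implementing the small detect() interface, which is what makes individual testing and modification practical.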

While many of the functionalities we offer are ready for product development, we are primarily technology providers — adapting to the partner’s needs instead of providing closed, black-box functionality.

9) Can you tell us your vision about how vehicle-as-a-service will work with autonomous driving?

Ever since the dawn of civilization, mobility has been a hot topic for inventors and tech pioneers. It is no exaggeration to say that self-driving cars will contribute greatly to the development of modern civilization. The concept of Vehicle-as-a-Service (VaaS) will merge with autonomous vehicles in the form of extended taxi services, optimized goods delivery and cheaper public transportation, all without unpredictable drivers.

Most importantly, autonomous driving will change the way we live, work and travel, which will ultimately change our cityscapes, as VaaS through autonomous vehicles will allow us to do whatever we’d like while commuting.

10) What’s next for AImotive?

AImotive has grown a lot in the past two years, extending our core team of 15 to more than 150 employees in four countries.

Still, we haven’t lost focus, and our technology has reached the maturity needed for the productization of L3+ self-driving technologies. Having gained public road test licenses in California, Finland and Hungary — with efforts underway to extend them to more countries — our fleet has started functionality testing in various traffic scenarios, locations and operational domains.

In the coming months, our team will do its best to prove that our technology is safe, robust and scalable, not only in terms of individual functionalities and simulated scenarios, but in real-world autonomous driving.

To contact the author of this article, email PBrown@globalspec.com

