Netradyne’s intuitive vision-based technology

Complete narrative of driving experience and driver behavior

Established in 2015 by Mr. Avneesh Agrawal and Mr. David Julian, Netradyne works towards creating a ground-breaking Artificial Intelligence (AI) platform to disrupt the commercial vehicle safety and driver analytics industry. The company has advanced quickly, with technology innovation centers set up in San Diego, California, and Bangalore. We caught up with Mr. Teja Gudena, Vice President, Engineering – Devices, and Mr. Prasant Kumar, Sr. Manager, Business Development, Netradyne Technologies India Pvt. Ltd., to learn more about the company’s product, technical know-how and plans for India.

Mr. Teja Gudena, Vice President, Engineering – Devices, Netradyne Technologies India Pvt. Ltd.

Excerpts:

How is Netradyne using AI for safety & security for commercial vehicle drivers and fleet operators?

Teja Gudena (TG): Netradyne pioneered combining AI with vision to detect, reason about and determine the cause of an event. This allows a fleet to utilize all available resources to deliver superior performance and address at-risk driving before it turns into a severe situation. Until the launch of our innovative Driveri platform, legacy video systems had to be monitored continuously by fleet managers to identify major events. Netradyne focuses on deep learning technology to provide a cost-effective, fast-moving business process that gives fleet managers a better understanding of the driver environment.

The product is based on an Internet of Things platform that intends to help large fleet operators ensure the safety of the drivers as well as the vehicle. It comes with four embedded cameras that cover a 360-degree view of the surroundings, and has a very advanced sensor processing unit to identify at-risk events and ensure compliance with driving rules and fleet instructions. It is available as an aftermarket product, can be installed behind the rear-view mirror inside the vehicle cabin, and draws power from the vehicle battery.

While the driver is driving, Driveri keeps recording video from the inward, forward and side views. The machine learning algorithms on the device run vision processing in real time to determine, from the inward camera, any distracted driving behavior: whether the driver is sleepy, speaking on the phone, eating food or looking sideways for a prolonged time. The road-facing camera identifies traffic violations (running a red light, unsafe lane changes, exceeding a lower speed limit) and at-risk driving behavior (an impending collision, harsh or reckless braking and acceleration) in real time, and gives audio alerts so the driver can take corrective action. The device is also connected to the cloud, and fleet managers get access to a dashboard. Any event of driver misbehavior or violation creates a one-minute video clip on the device, which is sent to the cloud for the fleet managers to review.
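
As a rough, self-contained sketch of the kind of on-device pipeline described above (the labels, frame rate, thresholds and upload queue here are illustrative assumptions, not Netradyne's actual models or APIs):

```python
# Minimal sketch of an on-device event pipeline of the kind described above.
# The detection logic, labels and frame rate are illustrative assumptions,
# not Netradyne's actual models or APIs.
from collections import deque
from queue import Queue

CLIP_SECONDS = 60                                 # one-minute clip per event
FPS = 15                                          # assumed frame rate

frame_buffer = deque(maxlen=CLIP_SECONDS * FPS)   # rolling pre-event footage
upload_queue = Queue()                            # clips waiting to go to the cloud

def classify_driver(inward_frame):
    """Placeholder for the inward-camera model (drowsiness, phone use, ...)."""
    return inward_frame.get("driver_state", "attentive")

def detect_road_events(road_frame):
    """Placeholder for the road-facing model (red light, tailgating, ...)."""
    return road_frame.get("events", [])

def process_frame(timestamp, inward_frame, road_frame):
    frame_buffer.append((timestamp, inward_frame, road_frame))
    driver_state = classify_driver(inward_frame)
    road_events = detect_road_events(road_frame)

    if driver_state != "attentive" or road_events:
        print(f"ALERT at {timestamp}: {driver_state}, {road_events}")  # cabin audio alert
        upload_queue.put({                         # one-minute clip for the fleet dashboard
            "time": timestamp,
            "driver_state": driver_state,
            "road_events": road_events,
            "clip": list(frame_buffer),
        })

# Example: a drowsy-driver frame triggers an alert and queues a clip for upload.
process_frame(12.0, {"driver_state": "drowsy"}, {"events": []})
```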

The first and foremost advantage of the product is that it records in real time, so any event can be reported immediately to the fleet manager. The second advantage is that the data processing is done on the device itself and events are uploaded to the cloud as they happen. The device has the capacity to store up to 50 hours of recording, so the fleet manager can also cross-check the footage if somebody calls in a complaint about errant behavior within that 50-hour period.
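
To illustrate the rolling 50-hour store and the cross-check use case, here is a minimal sketch; the one-minute segment size and eviction policy are assumptions made for the example, not Netradyne's actual storage design:

```python
# Rough sketch of a rolling ~50-hour recording store with lookup by time.
# Segment length and eviction policy are assumptions for illustration only.
from collections import deque

RETENTION_HOURS = 50
SEGMENT_MINUTES = 1
MAX_SEGMENTS = RETENTION_HOURS * 60 // SEGMENT_MINUTES

segments = deque(maxlen=MAX_SEGMENTS)   # oldest segments drop off automatically

def record_segment(start_minute, data):
    segments.append({"start_minute": start_minute, "data": data})

def find_footage(minute):
    """Return the stored segment covering a reported incident, if still retained."""
    for seg in segments:
        if seg["start_minute"] <= minute < seg["start_minute"] + SEGMENT_MINUTES:
            return seg
    return None   # older than the 50-hour window

# Example: pulling footage for a complaint about minute 1234 of the recording.
record_segment(1234, b"...video bytes...")
print(find_footage(1234) is not None)   # True while within the retention window
```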

How does the fleet operator benefit from the product?

Prasant Kumar (PK): Fleets usually carry goods worth millions; if there is an accident, there is downtime, loss of inventory and legal trouble. Using artificial intelligence, we are also working on a mechanism that can protect the driver from potentially getting into an accident: it monitors the driving pattern and pre-alerts the driver. In case of legal trouble, the complete video recordings can be used by the fleet operators to reconstruct the accident scenario.

Are Indian drivers and fleet operators ready for artificial intelligence-based monitoring, or should there also be some manual control?

TG: Drivers may not want to do it themselves. It is also not good to distract them from their task, hence everything is done using the intelligence. However, there is one feature the driver can operate manually with the help of two buttons below the device. If he sees something dangerous or an accident, he can press the button and the event is immediately recorded and uploaded to the cloud.

Mr. Prasant Kumar, Sr. Manager, Business Development, Netradyne Technologies India Pvt. Ltd.

Why do you think AI will work in India and how do you intend to make this favorable for fleet operators in India?

TG: If fleet operators compare the cost with the benefit they get, preventing even one untoward event pretty much recovers the cost of the device. One of the features we have is the Green Zone score for drivers. It is intended to motivate drivers to use the device and see for themselves how they drive. However, if they resist or are not comfortable, the inward camera can be disabled. In India, the majority of commercial vehicle accidents happen because of driver fatigue. The device, when switched on, can alert the driver if he is feeling sleepy.
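
As a hedged illustration of how such a driver score could be derived (the weights and formula below are assumptions for the example, not Netradyne's actual Green Zone calculation):

```python
# Illustrative driver-score sketch; the weights and formula are assumptions,
# not Netradyne's actual Green Zone scoring.
ALERT_WEIGHTS = {
    "drowsiness": 8.0,
    "phone_use": 6.0,
    "harsh_braking": 3.0,
    "speeding": 4.0,
}

def driver_score(alert_counts, hours_driven, base=100.0):
    """Start from a base score and deduct weighted alerts normalised per hour driven."""
    if hours_driven <= 0:
        return base
    penalty = sum(ALERT_WEIGHTS.get(kind, 1.0) * count
                  for kind, count in alert_counts.items()) / hours_driven
    return max(0.0, base - penalty)

# Example: a week with two drowsiness and three speeding alerts over 40 hours of driving.
print(round(driver_score({"drowsiness": 2, "speeding": 3}, hours_driven=40), 1))  # 99.3
```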

Another benefit is ADAS, Advanced Driver Assistance Systems. Our product is capable of most ADAS features, and one of the main features deployed is the collision warning system; we are also running trials with a few customers in India. When you are about to collide with the vehicle in front of you, the device gives you a warning.
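
A forward collision warning of this kind is typically based on time-to-collision (TTC): the gap to the lead vehicle divided by the closing speed. A minimal sketch, with an assumed warning threshold:

```python
# Sketch of a time-to-collision (TTC) check, the usual basis for a forward
# collision warning. The 2.5 s threshold is an illustrative assumption.
def forward_collision_warning(gap_m, ego_speed_mps, lead_speed_mps, ttc_threshold_s=2.5):
    """Warn when the time to close the gap to the vehicle ahead drops below the threshold."""
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:          # not closing in, so no collision risk
        return False
    ttc = gap_m / closing_speed
    return ttc < ttc_threshold_s

# Example: 20 m gap, ego at 72 km/h (20 m/s), lead at 36 km/h (10 m/s) gives TTC = 2 s.
print(forward_collision_warning(gap_m=20, ego_speed_mps=20, lead_speed_mps=10))  # True
```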

Does Netradyne fancy technical collaboration with OEMs in India too? If yes, at what stage would you want to get involved?

TG: We built the product, the software and the analytics, and we provide complete end-to-end solutions. In India, we are targeting three segments of customers. First, fleet operators, all of whom carry valuable inventories and are ready to invest. Second, of course, are the OEMs, who can use our technology as an aftermarket installation or adopt it officially by integrating it into their designs. The Government is already regulating ADAS collision warning, speed governance and speed alert features, and OEMs have the option to work with an experienced subsystem supplier like us or develop on their own. We would ideally like to start as early as possible, during their regular R&D process or the initial phase of designing the vehicle. The advantage for the OEM of involving us in the initial phases is that the device can be presented as a customized feature of their vehicle. We can also connect to the vehicle's CAN bus, which enables preventive maintenance. The third segment is ride-sharing companies such as Ola and Uber.
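
As an illustration of what a CAN bus hook for preventive maintenance might look like (the channel name and message ID below are assumptions, and a real integration would decode frames using the OEM's DBC definitions), a minimal sketch using the python-can library:

```python
# Minimal sketch of listening on a vehicle CAN bus with the python-can library.
# The channel name and the arbitration ID watched here are assumptions for
# illustration; running it requires a SocketCAN interface (e.g. can0 or vcan0).
import can

ENGINE_STATUS_ID = 0x7E8   # hypothetical arbitration ID of interest

def monitor(channel="can0", seconds=10):
    bus = can.interface.Bus(channel=channel, interface="socketcan")
    try:
        for _ in range(seconds * 100):          # poll for roughly `seconds`
            msg = bus.recv(timeout=0.01)        # returns None on timeout
            if msg and msg.arbitration_id == ENGINE_STATUS_ID:
                # Raw payload would feed into preventive-maintenance analytics.
                print(f"engine status frame: {msg.data.hex()}")
    finally:
        bus.shutdown()

if __name__ == "__main__":
    monitor()
```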

How is Netradyne India poised in terms of localization – both in terms of technical engineering and human resources?

TG: In India, we have three divisions: Device, Cloud and Analytics Engineering. Device- and cloud-related activity is carried out from Bangalore, we have world-class deep learning engineers in San Diego, and all software activities are carried out in-house. PCB manufacturing and a couple of mechanical vendors are in China, but the assembly happens in Gurugram. For our camera vision we work with OmniVision. The front camera we are using is a 120 dB HDR sensor, a high-end automotive camera sensor used in premium cars, and we are the only company to have commercialized it in India so far. The HDR capability of the high-end sensor provides night vision and reduces glare from oncoming lights and other distractions.
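
For context on the 120 dB figure: dynamic range in decibels converts to a brightness ratio of 10^(dB/20), so 120 dB corresponds to roughly a 1,000,000:1 ratio between the brightest and darkest details the sensor can capture in a single frame.

```python
# Dynamic range in dB converted to a brightness ratio: ratio = 10 ** (dB / 20).
def dynamic_range_ratio(db):
    return 10 ** (db / 20)

print(f"{dynamic_range_ratio(120):,.0f} : 1")   # 1,000,000 : 1 for a 120 dB HDR sensor
```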

From the technology and design point of view, the San Diego and Bangalore teams are on the same platform. The reason for focusing on Bangalore is the engineering talent available at an effective cost, probably one-fourth the cost of doing the same work in San Diego. This gives us an edge to compete in India even though we have been active in the US. In India, we need to customize for the Indian market, and hence we need to re-do some of the analytics and retrain some of the vision-based algorithms for Indian driving conditions. In a nutshell, we are the only company with local presence, R&D and know-how of both worlds.

Given the gamut of in-built features and analytics you offer with your device, how do you plan to charge your customers?

PK: Nowadays fleets have some basic monitoring system, which has made them aware of how our advanced AI-based security and safety features work; it helps us advance our level of service. We charge a one-time device cost along with a per-month SaaS fee that covers bandwidth, data storage, email notifications and cloud services. A fleet operator need not worry about anything else.