Whether you look at the legislative, production-scaling, or research-cost arguments, there has never been a greater single disruptor for the automotive industry than the impending arrival of driverless cars. The implications for the ride-sharing and trucking industries alone are huge, and decisions made now will affect millions of jobs in the future. For the average consumer, the appeal of a vehicle that drives them to work, or earns them money once they've arrived, is easy to understand.
Excitement in an industry always spurs development; AVs are no different. Fueled by the lucrative potential of this wide-open market, industry heavyweights and Silicon Valley startups alike have set ambitious development goals, and many have accomplished incredible feats as a result. But tech culture preaches disruption: be first to market and shake up the status quo along the way. Disruption and safety in this culture are often competing objectives, not because they are inherently so, but because novelty and hype rarely reward pairing the two. Safety, to be blunt, is not sexy in a world where VC money demands ever-flashier demonstrations.
At PolySync, we challenge the industry to see things differently. Here, there is no priority higher than the functional safety of both the software we write and the hardware we build.
In 2019, I completed Udacity's Self-Driving Car Engineer Nanodegree. It was a great learning experience and taught me a lot about how driverless cars work, in a top-down, end-to-end manner. You can review some of the work I submitted below:
Use an Extended Kalman Filter (EKF) to estimate the state of a moving object of interest with noisy LiDAR and RADAR measurements. Obtain RMSE values that are lower than a given tolerance.
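The predict/update cycle at the heart of that project can be sketched in a few lines. The project itself is C++; this is a Python sketch covering only the linear LiDAR update (for RADAR, the EKF replaces `H` with the Jacobian of the polar measurement model). The matrix values and class layout here are illustrative assumptions, not the project's code:

```python
import numpy as np

class ExtendedKalmanFilter:
    """Constant-velocity state [px, py, vx, vy].
    LiDAR updates are linear; a RADAR update would linearize the
    polar measurement model with its Jacobian (omitted here)."""
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                      # state estimate
        self.P = np.eye(4) * 1000.0               # state covariance (uncertain start)
        self.F = np.eye(4)                        # state transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = np.eye(4) * 0.1                  # process noise
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])     # lidar measures (px, py)
        self.R = np.eye(2) * 0.05                 # lidar measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

def rmse(estimates, truths):
    """Per-component RMSE, the project's accuracy metric."""
    err = np.array(estimates) - np.array(truths)
    return np.sqrt((err ** 2).mean(axis=0))
```

Tracking a straight-line target with this filter drives the position RMSE well under the project's tolerance once the covariance settles.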
Design a highway driving path planner and test it in simulation with traffic, telemetry, and sensor fusion data. The car must satisfy several velocity, acceleration, and jerk motion constraints while also avoiding collisions, staying within lane markers (other than lane changes), and achieving a high average velocity in traffic by changing lanes and obeying the posted speed limit.
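The lane-selection part of such a planner can be sketched as a cost comparison across candidate lanes. The cost terms, weights, 30 m horizon, and function names below are my own illustrative assumptions, not the project's actual cost functions (the real planner, written in C++, also enforces the acceleration and jerk limits on the generated trajectory):

```python
SPEED_LIMIT = 22.35  # 50 mph in m/s (assumed posted limit)

def lane_cost(lane, ego_s, traffic, horizon=30.0):
    """Hypothetical cost of occupying `lane` over the next `horizon`
    meters. `traffic` is a list of (lane, s, speed) tuples from
    sensor fusion; s is the longitudinal Frenet coordinate."""
    ahead = [(s - ego_s, speed) for (l, s, speed) in traffic
             if l == lane and 0.0 < s - ego_s < horizon]
    if not ahead:
        return 0.0                       # open lane: no penalty
    gap, lead_speed = min(ahead)         # closest vehicle ahead
    # Penalize slow leaders and small gaps (weights are illustrative).
    return (SPEED_LIMIT - lead_speed) / SPEED_LIMIT + horizon / (gap + 1.0) * 0.1

def best_lane(ego_lane, ego_s, traffic, n_lanes=3):
    # Only the current lane and its immediate neighbors are reachable.
    candidates = [l for l in (ego_lane - 1, ego_lane, ego_lane + 1)
                  if 0 <= l < n_lanes]
    return min(candidates, key=lambda l: lane_cost(l, ego_s, traffic))
```

With a slow car 10 m ahead in the ego lane and both neighbors clear, `best_lane` picks an adjacent lane, which is exactly the lane-change behavior the rubric rewards.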
Implement a 2-D particle filter in C++, which is given a map and some initial localization information, similar to what a GPS would provide. At each time step your filter will also get noisy observation and control data.
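The filter's loop (initialize from GPS, predict with controls, weight against the map, resample) can be sketched compactly. The project is C++ and tracks heading as well; this Python sketch drops heading and uses a single range-to-landmark observation for brevity, and all names and noise values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_particles(gps_xy, gps_std, n=500):
    """Scatter particles around the noisy GPS fix."""
    return gps_xy + rng.normal(0.0, gps_std, size=(n, 2))

def predict(particles, control, motion_std):
    """Apply the (dx, dy) control to every particle, plus process noise."""
    return particles + control + rng.normal(0.0, motion_std, particles.shape)

def update_weights(particles, landmark, observed_dist, sensor_std):
    """Weight each particle by how well its predicted range to a known
    map landmark matches the noisy range observation (Gaussian model)."""
    d = np.linalg.norm(particles - landmark, axis=1)
    w = np.exp(-0.5 * ((d - observed_dist) / sensor_std) ** 2)
    return w / w.sum()

def resample(particles, weights):
    """Draw a new particle set with probability proportional to weight."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```

After one weight-and-resample pass, the surviving particles cluster where the map evidence agrees with the GPS prior.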
Build a CNN in Keras to clone driving behavior. Train, validate and test a model that outputs a steering angle to an autonomous vehicle.
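One concrete piece of that training pipeline can be sketched without Keras: a common augmentation step for this project is mirroring each frame left-right and negating its steering label, so the network sees balanced left and right turns. The function name and batch layout below are my own, not the project's:

```python
import numpy as np

def augment_batch(images, angles):
    """Double a training batch by horizontally flipping each image and
    negating its steering angle.
    `images` is (N, H, W, C); `angles` is (N,) steering angles."""
    flipped = images[:, :, ::-1, :]            # mirror along the width axis
    aug_images = np.concatenate([images, flipped])
    aug_angles = np.concatenate([angles, -angles])
    return aug_images, aug_angles
```

The augmented arrays feed straight into `model.fit` on a CNN whose single output neuron regresses the steering angle.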
Use OpenCV to write a software pipeline that identifies the lane boundaries in a video. Apply distortion correction to raw images, use perspective transforms to rectify binary images, detect lane pixels and fit them to find the lane boundary, and determine the curvature of the lane and the vehicle's position with respect to center.
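The fitting and curvature steps can be sketched without the OpenCV preprocessing. As in the project, each lane line is modeled as a quadratic x = f(y) in the warped image (y increasing downward); the function names here are mine:

```python
import numpy as np

def fit_lane(ys, xs):
    """Fit x = a*y^2 + b*y + c to detected lane-line pixels.
    Fitting x as a function of y handles near-vertical lanes."""
    return np.polyfit(ys, xs, 2)

def curvature_radius(coeffs, y_eval):
    """Radius of curvature R = (1 + (2a*y + b)^2)^(3/2) / |2a|,
    evaluated at y_eval (e.g. the bottom of the image)."""
    a, b, _ = coeffs
    return (1 + (2 * a * y_eval + b) ** 2) ** 1.5 / abs(2 * a)
```

In the real pipeline the pixel-space fit is rescaled to meters per pixel before computing curvature; that constant is omitted here.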
Build a PID (proportional/integral/differential) controller, tune PID hyperparameters and test in simulator. The simulator provides cross-track error (CTE), speed, and steering angle data via uWS. The controller must respond with steering and throttle commands to drive the car reliably around the simulator track. Expand on this project by building a simulated controller in ROS.
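The core of such a controller is only a few lines. The project itself is C++; this Python sketch shows the structure, and the gains in the example are illustrative rather than the tuned values:

```python
class PID:
    """Minimal PID controller: steering opposes the cross-track error."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0        # accumulated error (I term)
        self.prev_error = None     # last error, for the D term

    def control(self, cte, dt=1.0):
        self.integral += cte * dt
        derivative = 0.0 if self.prev_error is None else (cte - self.prev_error) / dt
        self.prev_error = cte
        return -(self.kp * cte + self.ki * self.integral + self.kd * derivative)
```

The P term corrects the current offset, the D term damps the oscillation that P alone causes, and the small I term removes steady-state bias such as steering drift.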