https://sites.google.com/amre-amer.com/resume/home/
https://github.com/Amre-Amer/AutoDrive
A three-day project to create a simple Autonomous Driving Simulator Platform for learning the basic concepts of autonomous driving. The focus here is to add a layer of safety: auto drive decides whether or not to SLOW DOWN. This runs in addition to a separate navigation system. Another key feature is the focus on non-visual metadata as the data format.
A Scene Language of roughly 1,500 words serves as the neural network input. Each frame of auto drive history is translated into this Scene Language, with enough information included to drive remotely. Information is inferred from each frame of sensor data and from the differences between frames. Information includes:
A vocabulary of thousands of words describing each frame's scene and agents, plus differential information gathered by comparing adjacent frames. The information is used to validate sensor data, compare the current scene with historical data, and evaluate the best move to continue.
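As a rough sketch of the idea (the actual vocabulary and phrasing are not reproduced here; `SceneLanguage`, `DescribeAgent`, and `DescribeChanges` are illustrative names), one frame of sensor data could be rendered into Scene Language text and diffed against the previous frame like this:

```csharp
using System.Collections.Generic;

// Illustrative sketch only: turn one frame of sensor data into Scene Language text
// and collect the differential information between adjacent frames.
public static class SceneLanguage
{
    public static string DescribeAgent(string id, float distance, float closingSpeed)
    {
        // e.g. "car-2 ahead at 18.4 m, closing at 3.1 m/s"
        return $"{id} ahead at {distance:0.0} m, closing at {closingSpeed:0.0} m/s";
    }

    public static List<string> DescribeChanges(List<string> previousFrame, List<string> currentFrame)
    {
        // Differential information: sentences present now that were not present before.
        var changes = new List<string>();
        foreach (var sentence in currentFrame)
            if (!previousFrame.Contains(sentence))
                changes.Add("new: " + sentence);
        return changes;
    }
}
```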
All the information we need is in the environment and in our experience. Sensors pick up low-level information (e.g., distance) but lose higher-level information (e.g., that a door might open).
This technique helps recover information that would otherwise be lost yet is critical to successful driving.
Goal: gather enough sensor information, relayed remotely, to drive successfully. This data transfer is verbal or textual, not graphic, not mathematical.
Text-based validation provides an independent way to verify sensor data and driving decisions.
Knowing how or why a system makes a decision is useful, especially when the explanation is in a human-readable form. This is key for neural networks.
History appears on top with matched history in yellow.
Here we have 5 scenes, with 5 historical entries above.
Each of the scenes has a vehicle with auto drive enabled. Sensor readings are added for each frame.
Sensor readings are used to decide whether or not to slow down.
Historical data can be used to "look ahead" and possibly brake earlier and more smoothly.
Bird's eye view of 5 sessions with shared history.
Here is a snapshot of the main classes.
This is the top-level script: a MonoBehaviour that interacts directly with the Unity environment, using Unity's Update loop and Time features to run the simulator.
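A minimal sketch of what that top-level script might look like; the class names here (AutoDriveSimulator, DriveSceneManager) are assumptions, not the project's actual code, and DriveSceneManager is sketched below:

```csharp
using UnityEngine;

// Hypothetical top-level driver: advances every scene at a fixed simulated frame rate.
public class AutoDriveSimulator : MonoBehaviour
{
    public DriveSceneManager sceneManager = new DriveSceneManager(); // see the scene manager sketch below
    public float frameInterval = 0.1f;  // seconds of simulated time per frame
    float elapsed;

    void Update()
    {
        elapsed += Time.deltaTime;
        while (elapsed >= frameInterval)
        {
            elapsed -= frameInterval;
            sceneManager.AdvanceAllScenes();
        }
    }
}
```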
The scene manager operates on all the scenes: advancing, resetting, and loading them. It also stores the auto drive history, acting as the shared network.
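A possible shape for that class, with assumed names:

```csharp
using System.Collections.Generic;

// Hypothetical scene manager: operates on every scene and holds the shared history.
public class DriveSceneManager
{
    public List<DriveScene> scenes = new List<DriveScene>();
    public AutoDriveHistory history = new AutoDriveHistory();

    public void LoadScene(DriveScene scene) { scenes.Add(scene); }
    public void AdvanceAllScenes() { foreach (var scene in scenes) scene.Advance(history); }
    public void ResetAllScenes()   { foreach (var scene in scenes) scene.Reset(); }
}
```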
A scene is a driving session with vehicles and an environment. If a vehicle has auto drive, its sensor data is also part of the scene. A scene has a duration, expressed as a number of frames. Each frame is a moment in time with all the data available at that moment.
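Sketched as data structures; the names, the time step, and the way sensor text is collected per frame are all assumptions:

```csharp
using System.Collections.Generic;

// A frame is one moment in time with all available data; a scene is a sequence of frames.
public class DriveFrame
{
    public float time;
    public List<string> sensorText = new List<string>();  // text-based sensor data for this moment
}

public class DriveScene
{
    public List<Vehicle> vehicles = new List<Vehicle>();
    public List<DriveFrame> frames = new List<DriveFrame>();
    public int currentFrame;

    public void Advance(AutoDriveHistory history)
    {
        float dt = 0.1f;                                   // assumed simulated time step
        var frame = new DriveFrame { time = frames.Count * dt };
        foreach (var vehicle in vehicles)
        {
            vehicle.Step(dt);
            if (vehicle.autoDrive == null) continue;       // sensor data only for auto-drive vehicles
            var reading = vehicle.autoDrive.sensor.Read(vehicle);
            frame.sensorText.Add(reading.ToSceneText());
            if (vehicle.autoDrive.ShouldSlow(history, vehicle))
                frame.sensorText.Add("SLOW");
        }
        frames.Add(frame);
        currentFrame = frames.Count - 1;
    }

    public void Reset() { currentFrame = 0; }
}
```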
Auto drive history is a collection of previously driven or authored scenes. It is used to match the current situation and look ahead to see whether SLOW is needed.
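A sketch of the matching and look-ahead idea; the exact-match rule, the five-frame horizon, and the class names are assumptions:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical history: find a past frame whose text matches the current description,
// then look a few frames ahead to see whether SLOW turned out to be needed there.
public class AutoDriveHistory
{
    public List<DriveScene> pastScenes = new List<DriveScene>();

    public bool LookAheadSuggestsSlow(string currentDescription, int lookAhead = 5)
    {
        foreach (var scene in pastScenes)
        {
            for (int i = 0; i < scene.frames.Count; i++)
            {
                if (!scene.frames[i].sensorText.Contains(currentDescription)) continue;
                int end = Math.Min(i + lookAhead, scene.frames.Count);
                for (int j = i; j < end; j++)
                    if (scene.frames[j].sensorText.Contains("SLOW"))
                        return true;                       // a matched past situation needed SLOW soon after
            }
        }
        return false;
    }
}
```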
Auto drive can be on or off. It requests sensor data, analyzes it, and decides whether or not to slow down, communicating with the auto drive history along the way.
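A sketch of that decision, combining a direct distance rule with the history look-ahead; the threshold value and the names are assumptions:

```csharp
// Hypothetical auto drive: read the sensor, then decide SLOW either from the
// current reading or from a matched situation in history.
public class AutoDrive
{
    public bool enabled = true;
    public Sensor sensor = new Sensor();
    public float slowDistance = 10f;   // assumed threshold, in meters

    public bool ShouldSlow(AutoDriveHistory history, Vehicle self)
    {
        if (!enabled) return false;
        SensorReading reading = sensor.Read(self);
        if (reading.nearestDistance < slowDistance) return true;       // direct rule
        return history.LookAheadSuggestsSlow(reading.ToSceneText());   // look-ahead rule
    }
}
```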
Sensors generate data within a limited range and can determine an agent's identity, position, trajectory, and so on. Sensors are managed by auto drive.
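A sketch of the sensor side; in the real project the reading would come from the Unity scene, so only the shape of the data is shown here:

```csharp
using UnityEngine;

// Hypothetical sensor reading: identity, position, trajectory, and the nearest distance.
public class SensorReading
{
    public string agentId = "unknown";
    public Vector3 position;
    public Vector3 velocity;
    public float nearestDistance;

    public string ToSceneText() =>
        $"{agentId} at {nearestDistance:0.0} m moving {velocity.magnitude:0.0} m/s";
}

public class Sensor
{
    public float range = 50f;   // assumed detection range, in meters

    public SensorReading Read(Vehicle self)
    {
        // Placeholder: a real implementation would query agents within range of the vehicle.
        return new SensorReading { nearestDistance = range };
    }
}
```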
Scripts control vehicles and other environment agents. They can be created manually for what-if scenarios, or generated automatically from sensor data, video, or text. Scripts and data together make up a scene.
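One way such a script could be represented, purely as an illustration, is a timed list of waypoints:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical agent script: a timed list of waypoints, whether authored by hand
// for a what-if scenario or generated from sensor, video, or text data.
public class AgentScript
{
    public List<(float time, Vector3 target)> waypoints = new List<(float, Vector3)>();

    public Vector3 TargetAt(float now)
    {
        Vector3 target = Vector3.zero;
        foreach (var wp in waypoints)
            if (wp.time <= now) target = wp.target;   // latest waypoint whose time has passed
        return target;
    }
}
```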
Data is generated by sensors: all the sensor information occurring within a certain time frame, with the order of detection preserved. This data is stored as history. The key point is that data at this level is not visual, not graphic, and not numeric, but TEXT.
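An invented example of what a few lines of such text data might look like, with the order of detection preserved:

```
frame 12  t=1.2s  car-2 ahead at 18.4 m, closing at 3.1 m/s
frame 12  t=1.2s  bicycle-1 to the right at 6.0 m, parallel
frame 13  t=1.3s  car-2 ahead at 15.2 m, closing at 3.2 m/s
frame 13  t=1.3s  SLOW
```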
A vehicle can be a car, truck, bicycle, police car, and so on. It can have auto drive. It can move, and it has a position and velocity.
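Sketched with assumed names:

```csharp
using UnityEngine;

// Hypothetical vehicle: any moving agent (car, truck, bicycle, police car) with a
// position, a velocity, and optionally an auto drive attached.
public class Vehicle
{
    public string kind = "car";        // e.g. "car", "truck", "bicycle", "police car"
    public Vector3 position;
    public Vector3 velocity;
    public AutoDrive autoDrive;        // null when auto drive is off

    public void Step(float dt)
    {
        position += velocity * dt;     // simple kinematic update
    }
}
```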