Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car.
This is the system integration project repo of Team Omnibot programming a real self-driving car. Boilerplate code and instructions from Udacity: CarND-Capstone. Team members are from the July 2017 cohort, completing the final Term 3 project due May 21st, 2018.
| Name | Github | Contribution | Email |
|---|---|---|---|
| Henrique Nery (TL) | @nery_henrique | Waypointer | nery.henrique@gmail.com |
| Shane Dominic | @sdshdomi | DBW Node | shane.dominic@toyota-motorsport.com |
| Davinder Chandhok | @davinderc | Classifier | davinder.sc1@gmail.com |
| Keith Lee | @TonyLee33063 | Detector | lixiaoma218@126.com |
| Chalid Mannaa | @tochalid | Integration | tochalid@gmail.com |
The goal is to implement the ROS nodes providing the core functionality of the autonomous drive-by-wire (dbw) vehicle system, including traffic light detection, control, and waypoint following, and run the code on Udacity's self-driving car "Carla". The code is tested in the simulator first, then on Carla. The key features are:
- Detection of traffic light colours - RED, YELLOW, GREEN
- Stopping the car on RED, moving on GREEN, ignoring YELLOW
- Following waypoints and sending control commands over drive-by-wire for throttle, brake, and steering
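The stop-on-RED behavior boils down to capping waypoint target speeds so the car comes to rest at the stop line. A minimal sketch of that ramp-down, assuming a comfort deceleration limit `MAX_DECEL` and the helper name `decelerate_to_stop` (both illustrative, not the project's exact code):

```python
import math

MAX_DECEL = 0.5  # m/s^2, assumed comfort deceleration limit


def decelerate_to_stop(waypoint_speeds, waypoint_gaps, stop_idx):
    """Cap each waypoint's target speed so the car stops at stop_idx.

    waypoint_speeds: original target speeds (m/s) per waypoint
    waypoint_gaps:   distance (m) between consecutive waypoints
    stop_idx:        index of the waypoint where the car must stop
    """
    result = []
    for i, v in enumerate(waypoint_speeds):
        # distance remaining from waypoint i to the stop line
        dist = sum(waypoint_gaps[i:stop_idx])
        # v = sqrt(2 * a * d): highest speed that still stops within dist
        v_stop = math.sqrt(2 * MAX_DECEL * dist) if i < stop_idx else 0.0
        result.append(min(v, v_stop))
    return result
```

On GREEN the original target speeds are published unchanged; on YELLOW the project ignores the light, as listed above.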
The system architecture consists of Perception, Planning and Control nodes that communicate via ROS topics. The code base can be found in the source folder (path_to_project_repo)/ros/src/ and improvements were made to the following files:
- `tl_detector/tl_detector.py`
- `tl_detector/light_classification_model/tl_classifier.py`
- `waypoint_updater/waypoint_updater.py`
- `twist_controller/dbw_node.py`
- `twist_controller/twist_controller.py`
The tl_detector node takes in data from the /image_color, /current_pose, and /base_waypoints topics and publishes the locations to stop for red traffic lights to the /traffic_waypoint topic. The /current_pose topic provides the vehicle's current position, and /base_waypoints provides a complete list of waypoints the car will be following.
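Since the classifier runs on every camera frame, a raw per-frame state would flicker; a common remedy is to publish a state only after it has persisted for several consecutive frames. A minimal sketch of such a debounce filter (the class name and the threshold of 3 frames are assumptions for illustration):

```python
STATE_COUNT_THRESHOLD = 3  # assumed: frames a state must persist before it counts


class LightStateFilter:
    """Debounce classifier output: accept a traffic light state only
    once it has been observed on several consecutive camera frames."""

    def __init__(self):
        self.state = None      # state seen on the most recent frame
        self.count = 0         # how many consecutive frames it was seen
        self.confirmed = None  # last state that passed the threshold

    def update(self, new_state):
        if new_state != self.state:
            # state changed: restart the consecutive-frame counter
            self.state = new_state
            self.count = 1
        else:
            self.count += 1
        if self.count >= STATE_COUNT_THRESHOLD:
            self.confirmed = self.state
        return self.confirmed
```

The node would then publish the stop waypoint only when the confirmed state is RED, keeping `/traffic_waypoint` stable against single misclassified frames.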
The purpose of the waypoint_updater node is to update the target velocity property of each waypoint based on traffic light and obstacle detection data. This node subscribes to the /base_waypoints, /current_pose, /obstacle_waypoint, and /traffic_waypoint topics, and publishes a list of waypoints ahead of the car with target velocities to the /final_waypoints topic.
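The core geometric step is finding the nearest waypoint in front of the car and slicing a fixed lookahead window from the base list. A minimal sketch of that lookup, assuming waypoints as `(x, y)` tuples and an illustrative horizon `LOOKAHEAD_WPS` (the real node works on ROS `Lane`/`Waypoint` messages):

```python
import math

LOOKAHEAD_WPS = 50  # assumed lookahead horizon


def closest_waypoint_ahead(waypoints, x, y, yaw):
    """Return the index of the nearest waypoint in front of the car."""
    best_i, best_d = 0, float("inf")
    for i, (wx, wy) in enumerate(waypoints):
        d = math.hypot(wx - x, wy - y)
        if d < best_d:
            best_i, best_d = i, d
    wx, wy = waypoints[best_i]
    # project the vector to the waypoint onto the heading; a negative
    # dot product means the closest waypoint is behind the car
    if math.cos(yaw) * (wx - x) + math.sin(yaw) * (wy - y) < 0:
        best_i = (best_i + 1) % len(waypoints)
    return best_i


def final_waypoints(waypoints, x, y, yaw):
    i = closest_waypoint_ahead(waypoints, x, y, yaw)
    return waypoints[i:i + LOOKAHEAD_WPS]
```

The returned slice is what gets published to /final_waypoints, with target velocities adjusted when /traffic_waypoint reports a red light ahead.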
The dbw_node subscribes to the /current_velocity topic along with the /twist_cmd topic to receive target linear and angular velocities. Additionally, this node will subscribe to /vehicle/dbw_enabled, which indicates if the car is under dbw or driver control. This node will publish throttle, brake, and steering commands to the /vehicle/throttle_cmd, /vehicle/brake_cmd, and /vehicle/steering_cmd topics.
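Turning the /twist_cmd target into a steering-wheel angle is typically done with a simple bicycle model. A hedged sketch of that conversion (the vehicle parameters below are assumptions; the real values come from the ROS launch configuration):

```python
import math

# assumed vehicle parameters (the real values come from the launch config)
WHEEL_BASE = 2.8498   # m
STEER_RATIO = 14.8
MAX_STEER = 8.0       # rad, steering wheel limit


def steering_command(linear_velocity, angular_velocity, current_velocity):
    """Map a target twist onto a steering-wheel angle via a bicycle model."""
    if abs(linear_velocity) < 1e-6 or abs(current_velocity) < 0.1:
        return 0.0
    if abs(angular_velocity) < 1e-6:
        return 0.0
    # keep the commanded curvature when tracking at a different speed
    angular_velocity = current_velocity * angular_velocity / linear_velocity
    radius = current_velocity / angular_velocity
    angle = math.atan(WHEEL_BASE / radius) * STEER_RATIO
    return max(-MAX_STEER, min(MAX_STEER, angle))
```

Throttle and brake come from a separate PID loop on the velocity error; both are only published while /vehicle/dbw_enabled is true.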
Node Interaction Diagram using ROS Framework Features
The team followed the implementation approach suggested by Udacity:
- Waypoint Updater Node (Partial): Complete a partial waypoint updater which subscribes to `/base_waypoints` and `/current_pose` and publishes to `/final_waypoints`.
- DBW Node: Once your waypoint updater is publishing `/final_waypoints`, the `waypoint_follower` node will start publishing messages to the `/twist_cmd` topic. At this point, you have everything needed to build the `dbw_node`. After completing this step, the car should drive in the simulator, ignoring the traffic lights.
- Traffic Light Detection: This can be split into 2 parts:
  - Detection: Detect the traffic light and its color from `/image_color`. The topic `/vehicle/traffic_lights` contains the exact location and status of all traffic lights in the simulator, so you can test your output.
  - Waypoint publishing: Once you have correctly identified the traffic light and determined its position, you can convert it to a waypoint index and publish it.
- Waypoint Updater (Full): Use `/traffic_waypoint` to change the waypoint target velocities before publishing to `/final_waypoints`. Your car should now stop at red traffic lights and move when they are green.
The system is tested in the simulator and afterwards deployed to the car. The real-world testing provides ROS bag data that can be analyzed offline for optimization purposes.
- Clone the project repository

```bash
git clone https://github.com/udacity/CarND-Capstone.git
```

- Install python dependencies

```bash
cd CarND-Capstone
pip install -r requirements.txt
```

- Make and run styx

```bash
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
```

- Run the simulator
The following requirements must be fulfilled in the simulator.
- The `launch/styx.launch` and `launch/site.launch` files are used to test code in the simulator and on the vehicle respectively.
- Smoothly follow waypoints in the simulator.
- Respect the target top speed set for the waypoints' `twist.twist.linear.x` in `waypoint_loader.py`. The vehicle adheres to the kph target top speed set in `waypoint_loader/launch/waypoint_loader.launch`.
- Stop at traffic lights when needed.
- Stop and restart PID controllers depending on the state of `/vehicle/dbw_enabled`.
- Publish throttle, steering, and brake commands at 50Hz.
There are two findings for further optimization:
- The performance of the traffic light detector FCN is decisive for the vehicle's overall ability to master the test course. A compute-intensive FCN (Perception) impacts the other components (Planning, Control) -> design the least complex model with the highest accuracy and, depending on the available hardware, make a balanced trade-off.
- The isolated controller component seems sufficient, but the PID can be improved -> test and optimize the 3 throttle parameters Kp=.5, Ki=.005, Kd=.0 in `twist_controller.py`, depending on the waypoint updater implementation and available hardware.
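For reference when tuning, a minimal sketch of the throttle PID with the gains above, including the reset that is triggered when `/vehicle/dbw_enabled` goes false (class and method names are illustrative, not the exact project code):

```python
class PID:
    """Minimal PID controller with output clamping, sketching the
    throttle loop with the gains mentioned above."""

    def __init__(self, kp=0.5, ki=0.005, kd=0.0, mn=0.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.min, self.max = mn, mx
        self.reset()

    def reset(self):
        # called when dbw is disabled, so the integral term does not
        # wind up while the safety driver is in control
        self.int_val = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        self.int_val += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        val = self.kp * error + self.ki * self.int_val + self.kd * derivative
        return max(self.min, min(self.max, val))
```

Without the reset, the integral term accumulates during manual driving and produces a throttle jolt when dbw is re-enabled.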
- This 7min video shows the reference test ride with the light state taken directly from the simulator (`light.state` in `tl_detector.py` -> `get_light_state`).
- This 7min video shows a test ride using a simpler FCN to classify the traffic light state.
- This 7min video shows a test ride using a more complex FCN to classify the traffic light state, plus some fixes.
- This 2min video shows a test-lot ride of 7 circles.
- Download training bag that was recorded on the Udacity self-driving car.
- Unzip the file

```bash
unzip traffic_light_bag_file.zip
```

- Play the bag file

```bash
rosbag play -l traffic_light_bag_file/traffic_light_training.bag
```

- Launch your project in site mode

```bash
cd CarND-Capstone/ros
roslaunch launch/site.launch
```

- Confirm that traffic light detection works on real life images
To replay the ROS bag, use the following steps:
- Download the rviz config file.
- Open a terminal and start `roscore`.
- Open another terminal and run `rosbag play -l <path_to_your.bag>`.
- Open one more terminal and run `rviz`. You can change the RViz configuration to the Udacity download by navigating to your config file from `File > Open Config` in RViz. Alternatively, if you'd like to make the Udacity config file your default, you can replace the rviz config file found in `~/.rviz/default.rviz`.
- Traffic Light Detection and Classification
- Tensorflow Object Detection
- Github Commands
- ROS Bags
- Udacity Introduction Video
Please use one of the two installation options, either native or docker installation.
- Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.
- If using a Virtual Machine to install Ubuntu, use the following configuration as minimum:
  - 2 CPU
  - 2 GB system memory
  - 25 GB of free hard drive space
The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.
- Follow these instructions to install ROS:
  - ROS Kinetic if you have Ubuntu 16.04.
  - ROS Indigo if you have Ubuntu 14.04.
- Use this option to install the Dataspeed DBW SDK on a workstation that already has ROS installed: One Line SDK Install (binary)
- Download the Udacity Simulator.
Build the docker container

```bash
docker build . -t capstone
```

Run the docker file

```bash
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
```

To set up port forwarding, please refer to the instructions from term 2.




