# Lab 6: IMU-Based Navigation
## Objectives

Students should be able to:

- Describe how IMU data is used in ROS2 and interpret IMU messages.
- Set up and analyze the IMU (`/imu`) and odometry (`/odom`) topics in ROS2.
- Modify a ROS2 gamepad node to relinquish and regain robot control using button presses.
- Implement a ROS2 Python node that navigates a robot to a goal using odometry and IMU data.
- Evaluate and analyze errors in robot navigation and suggest potential improvements.
## Overview
In real-world robotics, an Inertial Measurement Unit (IMU) is essential for tracking motion by measuring orientation, angular velocity, and linear acceleration. The ICM-20648 6-Axis MEMS MotionTracking Device from TDK integrates a 3-axis gyroscope, a 3-axis accelerometer, and a Digital Motion Processor (DMP). This IMU is embedded in the OpenCR1.0 board, an open-source robot controller featuring an ARM Cortex-M7 processor. The OpenCR board applies an Extended Kalman Filter (EKF) to fuse sensor data, generating IMU estimates at 170 Hz.

The IMU significantly enhances navigation accuracy when combined with odometry. It helps estimate the robot's position over time and improves localization.

As we discussed earlier, multiple sensors work together to estimate the TurtleBot3's attitude and heading. These sensors are highly sensitive to magnetic fields, which can vary depending on the location and the device. Even everyday electronic components, including those on the OpenCR board, generate small magnetic fields. Although the IMU is placed at the center of the robot for optimal performance, it is still exposed to various magnetic interferences.
Fortunately, the TurtleBot3 developers anticipated this issue. Each time you run the serial node to connect to the robot, the IMU automatically calibrates itself, ensuring more accurate readings.
## Pre-Lab: Testing the IMU
### Step 1: Launch the TurtleBot3 Simulation
Open a terminal and start the TurtleBot3 simulation in Gazebo:

```bash
$ ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
```
### Step 2: Verify Communication with the Master
Ensure the TurtleBot3 is properly communicating with the Master by listing active topics:

```bash
$ ros2 topic list
```
You should see the following topics, including `/imu` and `/odom`:

```
/battery_state
/cmd_vel
/imu
/joint_states
/magnetic_field
/odom
/tf_static
```
### Step 3: Examine the `/imu` and `/odom` Topics
Run the `gamepad` and `joy` nodes. Observe the IMU and odometry messages while moving the robot:

```bash
$ ros2 topic echo /imu
$ ros2 topic echo /odom
```
Pay close attention to the `pose` field. In future labs, be mindful not to confuse the field hierarchies of the `Imu` and `Odometry` messages: `Odometry` nests its pose (`msg.pose.pose`), while `Imu` carries `orientation` at the top level.
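The difference in layout is easy to see with mock objects. The snippet below mimics the field paths of the real `nav_msgs/Odometry` and `sensor_msgs/Imu` messages using plain Python namespaces (the numeric values are made up for illustration):

```python
from types import SimpleNamespace

# Mock messages that mimic the ROS2 field layout (values are made up).
odom_msg = SimpleNamespace(
    pose=SimpleNamespace(          # PoseWithCovariance wrapper
        pose=SimpleNamespace(      # the actual Pose lives one level deeper
            position=SimpleNamespace(x=1.2, y=0.3, z=0.0),
            orientation=SimpleNamespace(x=0.0, y=0.0, z=0.0, w=1.0),
        )
    )
)
imu_msg = SimpleNamespace(
    orientation=SimpleNamespace(x=0.0, y=0.0, z=0.0, w=1.0)
)

# Odometry nests the pose twice: msg.pose is a PoseWithCovariance,
# so the position sits at msg.pose.pose.position.
x = odom_msg.pose.pose.position.x
y = odom_msg.pose.pose.position.y

# Imu has no pose field at all; orientation is a top-level quaternion.
q = imu_msg.orientation

print(x, y, q.w)  # 1.2 0.3 1.0
```

Keeping this distinction straight avoids a common bug where `msg.pose.position` is accessed on an `Odometry` message and raises an `AttributeError`.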
### Step 4: Visualizing Data in rqt
Open `rqt`:

```bash
$ rqt
```

Monitor the `/imu` and `/odom` topics in `rqt` as you move the robot in Gazebo, and compare the pose data from Gazebo and `rqt`, noting how simulation noise affects readings.

- In Gazebo, go to `Models` > `burger`, then in the `property` section, select `pose` to view the robot's position and orientation in roll-pitch-yaw format.
- In `rqt`, enable the `Topic Monitor` and activate the `/imu` and `/odom` topics to track orientation and position in real time. Note that orientation in `rqt` is shown using quaternions.

The pose values in Gazebo and `rqt` are not the same: the Gazebo simulation publishes position and orientation with noise. By default, Gazebo adds Gaussian noise to the data generated by its sensors to simulate real-world conditions.
## Lab Procedure
In this lab, you will create a ROS2 Python package that enables the TurtleBot3 to navigate to a desired location and orientation using data from the IMU (`/imu`) and odometry (`/odom`) topics.
### Update `.bashrc` for Easier Builds
To avoid common mistakes when running `colcon build`, follow these steps to streamline your workflow. This will ensure that `colcon build` is always run in the correct directory and that the necessary setup file is sourced automatically.
Use `gedit` to open the `.bashrc` file:

```bash
$ gedit ~/.bashrc
```
At the end of the file, add the following function:
```bash
# Function to build with optional arguments
function ccbuild() {
    cd ~/master_ws && colcon build --symlink-install "$@"
    source ~/master_ws/install/setup.bash
}

# Export the function to make it available in the shell
export -f ccbuild
```
Save the file, exit the editor, and restart your terminal for the changes to take effect.
Instead of manually navigating to `~/master_ws`, running `colcon build --symlink-install`, and sourcing `install/setup.bash`, you can now simply run:

```bash
$ ccbuild
```
This ensures the build process is executed correctly every time and sources the necessary setup file.
You can also pass arguments to `colcon build` through the `ccbuild` function. For example:

```bash
$ ccbuild --packages-select lab6_nav
```
This builds only the `lab6_nav` package, saving time by skipping previously built packages. By following these steps, you'll streamline your workflow and minimize build-related errors.
### Testing the IMU on the Physical Robot
To test the IMU on the physical robot, follow these steps:
Use SSH to launch the `robot.launch.py` file on the robot, then start the `gamepad` and `joy` nodes on your master computer. As you drive the robot with the gamepad, monitor the `/imu` and `/odom` topics using:

```bash
$ ros2 topic echo /imu
$ ros2 topic echo /odom
```
The raw output of these topics can be overwhelming. Use the following commands to filter and display only the relevant sections:
```bash
$ ros2 topic echo /imu | grep -A 4 'orientation'
$ ros2 topic echo /odom | grep -A 3 'position'
```
The `-A` option in `grep` stands for "after context." It displays the specified number of lines following the matching line.

- For `/imu`, this command shows the `orientation` section and the next 4 lines.
- For `/odom`, it shows the `position` section and the next 3 lines.
Convert quaternions to Euler angles: Both topics use quaternions to represent orientation, which are not human-readable. Later, we will convert these quaternions into Euler angles for easier interpretation.
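As a preview of that conversion, yaw can be recovered from a quaternion with basic trigonometry. The standalone sketch below uses only the `math` module (no ROS dependencies) and computes the same yaw that `euler_from_quaternion` from `tf_transformations` would return:

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Extract yaw (rotation about the z-axis) from a quaternion."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)

# Identity quaternion -> yaw = 0
print(yaw_from_quaternion(0.0, 0.0, 0.0, 1.0))  # 0.0

# 90-degree rotation about z: (0, 0, sin(pi/4), cos(pi/4)) -> yaw = pi/2
yaw = yaw_from_quaternion(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(round(math.degrees(yaw), 1))  # 90.0
```

Because `atan2` is used, the result is already wrapped to [-π, π], which is exactly the range the navigation code in this lab expects.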
### Modify `gamepad.py` to Include Control Relinquishment
In this section, you'll modify the `gamepad.py` script to add functionality for relinquishing and regaining control of the robot. This will allow users to toggle control states using specific gamepad buttons.
Locate and open the `gamepad.py` file. In the constructor of the `Gamepad` class, add the following attributes:

```python
# Flag to track control status (default: True)
self.has_control = True

# Create a publisher for control relinquishment messages.
# - Publishes to the 'ctrl_relinq' topic.
# - Uses Bool messages to indicate control status.
# - Queue size of 1 ensures only the latest control state is kept.
self.ctrl_pub = self.create_publisher(Bool, 'ctrl_relinq', 1)
```
In the `joy_callback` method, implement the following logic to toggle control using buttons A (Green) and B (Red):

```python
# Create a new Bool message for "control relinquishment status"
relinquish = Bool()

# TODO: Check if the RC (Remote Control) has control and button A (Green) is pressed
    # Set control flag to False (relinquish control)
    # Change "control status" to relinquished.
    # Publish the control status
    # Log status update
    self.get_logger().info("RC has relinquished control.")

# TODO: Check if RC does not have control and button B (Red) is pressed
    # Set control flag to True (regain control)
    # Set control status to regained.
    # Publish the control status
    # Log status update
    self.get_logger().info("RC has taken over control.")

# If control is relinquished, stop further processing
if not self.has_control:
    return
```
This is a common safety practice in robotics: halting any autonomous task the robot is executing. Another safety practice is integrating an easily accessible emergency stop switch that cuts off the main power to the robot.
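The toggle logic itself can be isolated as a pure function and tested without ROS. The sketch below is one possible shape of that logic, not the required solution; `button_a` and `button_b` stand in for whatever indices of `msg.buttons` your gamepad maps to the A and B buttons:

```python
def update_control(has_control, button_a, button_b):
    """Toggle the control flag: A relinquishes control, B regains it.

    Returns (new_state, changed) so the caller knows whether to
    publish a Bool message and log the transition.
    """
    if has_control and button_a:
        return False, True   # relinquish control
    if not has_control and button_b:
        return True, True    # regain control
    return has_control, False  # no change

# Pressing A while in control relinquishes it:
print(update_control(True, button_a=1, button_b=0))   # (False, True)
# Pressing B while not in control regains it:
print(update_control(False, button_a=0, button_b=1))  # (True, True)
# Holding B while already in control changes nothing:
print(update_control(True, button_a=0, button_b=1))   # (True, False)
```

Checking the current state before toggling prevents a held-down button from rapidly flipping control on and off across consecutive `joy_callback` invocations.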
Test the gamepad node:

- Run the gamepad node.
- Open another terminal and monitor the `cmd_vel` topic.
- Press button A on the gamepad and verify that control is relinquished (no movement commands should be published by the gamepad).
- Press button B and verify that control is regained (movement commands should be published again).
- Check the `ctrl_relinq` topic to confirm that control status messages are being published.
### Create a New ROS2 Package
Open a terminal and navigate to the `src` directory of your workspace:

```bash
$ cd ~/master_ws/src/ece387_ws
```
Use the following command to create a new ROS2 package named `lab6_nav`:

```bash
$ ros2 pkg create lab6_nav --build-type ament_python --dependencies rclpy geometry_msgs nav_msgs sensor_msgs
```
Dependencies:

- `rclpy`: ROS2 Python API.
- `geometry_msgs`: for `Twist` messages (velocity commands).
- `nav_msgs`: for `Odometry` messages.
- `sensor_msgs`: for `Imu` messages.
Download the `move2goal.py` file and save it in the `lab6_nav` package directory:

```
~/master_ws/src/ece387_ws/lab6_nav/lab6_nav/
```
Open the `setup.py` file and modify the `entry_points` section to include the `move2goal` script:

```python
entry_points={
    'console_scripts': [
        'move2goal = lab6_nav.move2goal:main',
    ],
},
```
Build the package using the `ccbuild` command:

```bash
$ ccbuild
```
### Implement the `move2goal.py` Script
The `move2goal.py` script will control the TurtleBot3 to move to a specified goal location and orientation. Follow these steps to complete the implementation:
Initialize publishers and subscribers:

- Publisher: Create a publisher for `/cmd_vel` to send velocity commands to the robot.
- Subscribers: Subscribe to:
  - `/odom` to track the robot's position.
  - `/imu` to get orientation data.
  - `/ctrl_relinq` to check if the node has control.
Implement the `odom_callback` function:

- Extract the `x` and `y` position from the received `Odometry` message.
Implement the `imu_callback` function:

- Extract the quaternion orientation from the `Imu` message.
- Convert the quaternion to Euler angles using the `euler_from_quaternion` function.
- Update the `yaw` value.
Implement the `ctrl_relinq_callback` function:

- Update the `self.has_control` flag based on the received `Bool` message.
- Log a message indicating whether control is taken or relinquished.
Implement the `control_loop` function:

- Compute the angle to the goal and normalize it to the range [-\(\pi\), \(\pi\)].
- Compute the distance to the goal using the Euclidean distance formula.
- Complete the state machine logic. The state machine consists of four states:
  - **ROTATE_TO_GOAL**: If the angle error is greater than `0.05` radians, rotate the robot with an angular speed proportional to the error (`0.3 * angle_error`); adjust the proportional coefficient as needed. If the angle error is within the threshold, transition to `MOVE_FORWARD`.
  - **MOVE_FORWARD**: Move the robot forward at a constant speed of `0.15` (adjustable, but the robot's maximum speed is `0.22`). If the distance to the goal is less than `0.15`, transition to `ROTATE_TO_FINAL`.
  - **ROTATE_TO_FINAL**: Compute the final orientation error and normalize it. If the error is greater than `0.05` radians, rotate the robot with a speed proportional to the error (`0.5 * final_angle_error`). If the error is within the threshold, transition to `GOAL_REACHED`.
  - **GOAL_REACHED**: Log a message confirming the goal has been reached, and stop publishing movement commands.
- Publish velocity commands: Ensure `cmd_vel` messages are published correctly in each state of the state machine.
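The two geometric computations in `control_loop` can be sketched with the `math` module alone. `normalize_angle` and `goal_errors` are hypothetical helper names, but the formulas are the standard ones:

```python
import math

def normalize_angle(angle):
    """Wrap any angle (radians) into the range [-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def goal_errors(x, y, yaw, goal_x, goal_y):
    """Euclidean distance to the goal and normalized heading error toward it."""
    distance = math.hypot(goal_x - x, goal_y - y)
    angle_to_goal = math.atan2(goal_y - y, goal_x - x)
    return distance, normalize_angle(angle_to_goal - yaw)

# Robot at the origin facing +x, with the lab's demo goal (-0.61, 0.61):
dist, angle_err = goal_errors(0.0, 0.0, 0.0, -0.61, 0.61)
print(round(dist, 3))                     # 0.863
print(round(math.degrees(angle_err), 1))  # 135.0
```

The `atan2(sin, cos)` trick handles wrap-around correctly, so an error of 350° becomes -10° and the robot turns the short way instead of spinning nearly a full circle.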
### Build and Test the Package
Install `tf-transformations` using:

```bash
$ sudo apt install ros-humble-tf-transformations
```
Build the `lab6_nav` package using:

```bash
$ ccbuild --packages-select lab6_nav
```
Demo the robot:

- Run the `move2goal` node and demonstrate the robot moving to the goal location `(-0.61, 0.61)` in meters.
- Rotate the robot to face `0°`.

Hint: Most floor and ceiling tiles in the U.S. are 1 ft by 1 ft or 2 ft by 2 ft. Use this to estimate distances (1 foot = 30.48 cm).
## Deliverables
**[15 Points] Complete the `move2goal.py` Script:**

- Ensure the script is fully functional and implements all required features.
- Push your code to GitHub and confirm that it has been successfully uploaded. NOTE: If the instructor can't find your code in your repository, you will receive a grade of 0 for the coding part.
**[15 Points] Demonstration:**

- Show the robot successfully navigating to the goal location and orientation in a real-world setup.
**[20 Points] Performance Analysis:**

- Examine the distance and angle errors printed by `move2goal.py`.
- Discuss the robot's performance:
  - Can the robot navigate to a farther distance, such as `(-3, 3)`? Why or why not?
  - After the robot reaches the goal, restart the demo without rebooting the robot. Does it work as before? If not, why doesn't it work anymore?
- Suggest improvements to enhance the robot's navigation capabilities.
Note 1: This deliverable is worth 20 points, more than the other deliverables. Therefore, a detailed and rigorous analysis is expected. Treat this as the analysis section of your project report.
Note 2: Provide quantitative analysis, not qualitative. You don't need to include plots of measured data, but you should provide the measured data for your analysis. For example, instead of saying, "The robot's orientation drifted quite a bit as it traveled almost one meter," you should say, "With its initial orientation of 43.5\(^\circ\), the robot gradually drifted to 39.3\(^\circ\) after traveling 0.73 meters." For your project, you will need to provide plots for the measured data.