University of Agder, 2020 - MAS506 Course
This project involved the assembly and programming of a self-balancing robot. The system ran on a MyRIO embedded device programmed in LabVIEW. Several controllers were designed and tuned to make the robot balance, compensate for wheel friction, and control the position and heading. The robot was also programmed to follow a black line using computer vision.
Equipment Used
- 1x MyRIO embedded device
- 2x Electric motors with gearbox and encoder
- 1x Motor controller
- 1x Gyro (Digilent PmodGYRO breakout board)
- 1x 3s 11.1V 2200mAh LiPo battery
- 1x Trust Exis Webcam
- 2x Wheels with shaft connection
Project Components
Motor Control System
Each motor's velocity was controlled by a PWM signal, allowing stepless speed control. Each motor was also equipped with an optical encoder for measuring position and velocity. Friction compensation was implemented to account for differences in friction between the two motors.
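The exact compensation law is not given here; below is a minimal sketch of one common approach, a per-motor static-friction (deadband) feedforward with hypothetical constants, written in Python rather than the project's LabVIEW:

```python
def compensate_friction(duty_cmd, deadband=0.08):
    """Add a feedforward offset so small commands overcome static friction.

    duty_cmd : requested PWM duty cycle in [-1.0, 1.0]
    deadband : duty cycle at which the motor just starts to move
               (hypothetical value; identified per motor in practice)
    """
    if duty_cmd > 0.0:
        return deadband + (1.0 - deadband) * duty_cmd
    if duty_cmd < 0.0:
        return -deadband + (1.0 - deadband) * duty_cmd
    return 0.0

# Each motor gets its own deadband constant to account for mechanical differences.
left_duty = compensate_friction(0.05, deadband=0.08)
right_duty = compensate_friction(0.05, deadband=0.10)
```

Identifying a separate deadband for each motor is what lets the compensation cancel the left/right asymmetry mentioned above.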
Inertial Measurement Unit
An IMU consisting of the built-in accelerometer on the MyRIO and a Digilent PmodGYRO breakout board was used to measure the pitch and yaw angles. A complementary filter was implemented to fuse the sensors, providing accurate angle measurements.
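As a reference for how the raw signals enter the filter, the sketch below shows an accelerometer-based pitch estimate and the gyro-rate conversion in Python; the axis convention and the PmodGYRO (L3G4200D) sensitivity setting are assumptions, and the actual project ran in LabVIEW:

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (rad) from the MyRIO accelerometer, assuming the
    x-axis points forward and the z-axis points up (axis choice assumed)."""
    return math.atan2(ax, math.sqrt(ay * ay + az * az))

def gyro_rate(raw, sensitivity=0.0175):
    """Convert a raw PmodGYRO (L3G4200D) reading to deg/s.
    0.0175 dps/LSB corresponds to the +/-500 dps range (assumed setting)."""
    return raw * sensitivity
```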
Control System
A cascade controller was implemented for balancing the robot. It used the robot's pitch angle and angular velocity as inputs and produced the desired wheel velocity as output. Controllers for forward velocity and turn rate were also implemented.
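How the velocity and turn-rate loops were combined is not detailed here; a plausible sketch, assuming simple PI loops and standard differential-drive mixing (all gains hypothetical), in Python:

```python
class PI:
    """Simple discrete PI controller used as a stand-in for the
    velocity and turn-rate loops (structure and gains are assumptions)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def mix(drive_cmd, turn_cmd):
    """Differential-drive mixing: the forward command plus/minus the turn
    command gives the individual wheel velocity setpoints."""
    return drive_cmd - turn_cmd, drive_cmd + turn_cmd
```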
Camera Vision System
A Trust Exis webcam was used for line detection. The camera image was processed to detect black lines on the floor, allowing the robot to follow paths. Image processing included grayscale conversion, region of interest selection, and binary thresholding.
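A minimal Python/OpenCV sketch of this pipeline, grayscale conversion, region-of-interest selection, and inverse binary thresholding followed by a centroid calculation; the threshold value and ROI fraction are assumptions, and the project itself used LabVIEW's vision tools:

```python
import cv2

def line_offset(frame):
    """Return the horizontal offset (pixels) of a black line within a
    region of interest near the bottom of the frame, or None if no line
    is visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    roi = gray[int(0.7 * h):, :]                 # look only at the lower part of the image
    _, binary = cv2.threshold(roi, 60, 255, cv2.THRESH_BINARY_INV)  # dark line -> white
    m = cv2.moments(binary)
    if m["m00"] == 0:                            # no line found
        return None
    cx = m["m10"] / m["m00"]                     # horizontal centroid of the line
    return cx - w / 2                            # offset from the image centre
```

An offset computed this way can be fed to the turn-rate controller so the robot steers back toward the line.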

Figure: Control system diagram showing the interconnection of heading control, velocity control, balance control, and friction compensation systems that enable the self-balancing robot to maintain stability while following designated paths.
Complementary Filter for Sensor Fusion
A complementary filter was used to fuse the accelerometer and gyroscope data, combining the advantages of each sensor while minimizing their disadvantages. The filter used a balance parameter alpha of 0.98, which gave the best trade-off between the gyroscope's short-term precision and the accelerometer's drift-free long-term accuracy.
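A sketch of the standard first-order complementary-filter update with the project's alpha of 0.98 (the exact LabVIEW implementation may differ in detail):

```python
ALPHA = 0.98  # filter balance parameter from the project

def complementary_filter(prev_angle, gyro_rate, accel_angle, dt):
    """Fuse gyro and accelerometer estimates of the pitch angle.

    The integrated gyro term is weighted by ALPHA (short-term precision),
    the accelerometer term by 1 - ALPHA (long-term, drift-free reference).
    All angles and rates are assumed to be in consistent units (e.g. rad, rad/s).
    """
    return ALPHA * (prev_angle + gyro_rate * dt) + (1.0 - ALPHA) * accel_angle
```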
Cascade Controller Design
The balancing controller was implemented as a cascade controller with an outer PID controller using the pitch angle as feedback and an inner P controller using the angular velocity from the gyroscope. This design provided stable balance and responsive control.
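A Python sketch of this cascade structure, with the outer PID on the pitch angle producing a rate reference for the inner P controller on the gyro rate; the loop partitioning follows the description above, while the gains and discrete form are assumptions:

```python
class PID:
    """Discrete PID for the outer (pitch-angle) loop; gains are placeholders."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def balance_controller(pitch_ref, pitch, gyro_rate, outer_pid, kp_inner):
    """Outer PID on the pitch angle sets a pitch-rate reference; the inner
    P controller on the measured gyro rate produces the wheel velocity command."""
    rate_ref = outer_pid.update(pitch_ref - pitch)
    return kp_inner * (rate_ref - gyro_rate)
```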
Center of Gravity Optimization
Placing the battery at the top of the chassis raised the center of gravity, which slowed the toppling dynamics and made the robot significantly easier to balance, much as a broom is easier to balance with its mass at the top.
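A rough justification, using a linearized point-mass pendulum model that is not taken from the project report: with effective length l from the wheel axle to the center of gravity, the small-angle dynamics and the resulting unstable pole are

```latex
\ddot{\theta} \approx \frac{g}{l}\,\theta
\qquad\Rightarrow\qquad
p = \sqrt{\frac{g}{l}}
```

so a larger l moves the unstable pole closer to the origin, the robot tips over more slowly, and the controller has more time to react.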
Results and Achievements
- The robot could successfully balance itself autonomously with minimal drift
- Linear velocity could be controlled in both forward and backward directions
- The robot could perform turns with a maximum rate of approximately 490°/s
- Line following was successfully implemented with the camera vision system
- The robot could recover from small disturbances and maintain balance
Conclusion
The project successfully demonstrated the implementation of a two-wheeled balancing robot with line-following capabilities. The combination of sensor fusion, control theory, and computer vision resulted in a stable and responsive system capable of autonomous navigation along predefined paths.