How to Automatically Tune Robot Wheels and Cameras Using "Chessboards" and QR Codes


A method developed by LETI researchers will increase the speed and accuracy of calibrating the equipment that allows driverless wheeled vehicles to move in the right direction and navigate in space.

23.06.2022

Today, driverless cars, taxis, and delivery vans are being tested in cities around the world. Since the number of robots in cities is expected to grow rapidly, scientists face the challenge of developing algorithms, methods, and systems that let such machines interact with each other and with the environment effectively and safely for society.

To create and refine intelligent systems for robots, researchers at MIT (USA) launched the Duckietown project in 2016. It is a scaled-down model of an urban transportation environment that includes roads with markings, vehicles, traffic lights, road signs, driverless wheeled robots, and pedestrian ducks. Over time, the project became international, and model cities for testing such robots opened at universities in different countries. In 2021, a Duckietown testing ground was launched at LETI.

"Accurate movement of driverless robots requires adjustment of the camera, whose parameters the computer vision depends on, as well as calibration of the drive wheels, which is essential for turns and straight-line movement. Today, these parameters are calibrated manually by operators. Calibrating a single robot is relatively quick, but if we want to launch many robots at once, it takes a lot of time and many operators. To solve this problem, we developed a method that allows the robot to adjust its camera and wheels independently and quickly. We have successfully tested this method at our Duckietown testing ground."

Anton Filatov, Assistant of the Department of Mathematical Support and Application of Computers at LETI

Experiments were conducted on three-wheeled robots with a 2:1 wheel configuration, in which the two drive wheels are at the front (such robots are used throughout the Duckietown project). The robots require regular equipment adjustment because mass-produced cameras and wheels inevitably vary slightly from unit to unit. These small inaccuracies make the wheels slightly different in size and make the cameras take slightly different pictures. Without calibration, these factors can lead to problems with driving direction and to disorientation in space.

Scientists at LETI have developed a two-stage camera and wheel adjustment procedure that the robot can perform independently. First, the camera is calibrated by photographing several horizontally arranged chessboards with squares of known size. The robot takes a series of pictures as it rotates 360 degrees. The distances between the squares are then calculated, taking into account the distortion of the picture introduced by the wide-angle lens.
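The article does not give the researchers' formulas, but the core idea of chessboard calibration can be sketched with a simple pinhole model: a square of known physical size, seen at a known distance, fixes the focal length in pixels, and a radial term models the wide-angle distortion. The function names, the single-coefficient distortion model, and the numbers below are illustrative assumptions; production systems typically fit a full intrinsic matrix plus several distortion coefficients (e.g. with OpenCV's `calibrateCamera`).

```python
# Illustrative sketch, not the authors' method: recover the focal length
# from chessboard squares of known size, and undo simple radial distortion.

def focal_length_px(square_size_m, distance_m, square_size_px):
    """Pinhole model: size_px = f * size_m / Z,  so  f = size_px * Z / size_m."""
    return square_size_px * distance_m / square_size_m

def undistort_radial(x, y, k1):
    """First-order radial model about the image center (an assumed,
    single-parameter stand-in for a wide-angle lens): r_u = r_d * (1 + k1*r_d^2).
    (x, y) are normalized coordinates relative to the principal point."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# Averaging over the series of pictures taken during the 360-degree spin
# suppresses per-image measurement noise (square size and distance invented).
estimates = [
    focal_length_px(0.031, 0.5, 43.4),   # one chessboard view
    focal_length_px(0.031, 0.5, 42.8),   # another view of the same board
]
f_px = sum(estimates) / len(estimates)
```

In a real pipeline the pixel size of each square would come from corner detection in the photos, and the distortion coefficient would be fitted jointly with the focal length rather than assumed.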

Then the robot moves to a small area whose surface carries markings similar to QR codes; these act as coordinates. The robot's initial position does not matter and can be random. To calibrate the wheels, the robot drives back and forth over any part of the site several times, photographing the codes it passes over. This allows calculating the robot's position and how far it deviates from a straight line. Once the algorithm finishes, the robot can start executing its tasks independently.
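The wheel-calibration step can be sketched as a simple regression: each pass over the coded mat yields a distance driven and a lateral drift (recovered from the floor codes), and a straight-line fit through the origin gives the drift per metre, i.e. the left/right wheel imbalance to compensate. The linear drift model, the function names, and the correction formula below are assumptions for illustration, not the authors' exact formulation.

```python
# Illustrative sketch, not the authors' method: estimate the wheel speed
# imbalance ("trim") from repeated straight-line runs over the coded mat.

def fit_trim(runs):
    """Each run is (distance_driven_m, lateral_drift_m), both recovered by
    locating the QR-like floor codes in the camera images. Least-squares
    fit of drift = k * distance through the origin."""
    num = sum(d * y for d, y in runs)
    den = sum(d * d for d, _ in runs)
    return num / den  # drift per metre; the sign says which wheel runs fast

def wheel_commands(base_speed, trim):
    """Assumed compensation scheme: split the correction between the wheels
    so their average speed stays at base_speed."""
    return base_speed * (1.0 - trim / 2), base_speed * (1.0 + trim / 2)

# Three invented back-and-forth runs: the robot drifts ~5 cm per metre.
runs = [(1.0, 0.048), (1.2, 0.061), (0.8, 0.041)]
trim = fit_trim(runs)
left, right = wheel_commands(0.5, trim)
```

Driving both forward and backward, as the article describes, lets the same fit average out drift caused by the surface rather than by the wheels.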

"Tests have shown that this approach is comparable to manual calibration methods in accuracy. It can replace humans in this task. Thus, on the one hand, we reduce the risks associated with the human factor, and on the other hand, we can quickly and accurately tune several robots at once. If necessary, we can adapt the proposed method not only for tuning cameras of small three-wheeled robots but also for computer vision systems in full-size driverless cars," explains Anton Filatov.

The study was published in the Applied Sciences journal.