Intelligent robot with uEye+ industrial camera eliminates the need for herbicides
POSTED 09/24/2021
Goodbye weeds
An autonomous robot removes weeds purely mechanically, without the use of environmentally harmful herbicides. With the help of neural networks and a uEye+ industrial camera, the robot makes farming more sustainable and reduces the farmer's workload to a minimum. The image data also provides information for tasks such as yield estimation or the early detection of plant diseases.
Plant protection products are an integral part of today's agriculture. They are intended to protect crops and keep out weeds and pests. According to the German Federal Environment Agency, herbicides (weed killers) accounted for 50.6 % of the plant protection products dispensed in 2019. In addition to their desired effect, however, they also pose numerous risks to humans and the environment: they endanger biodiversity and the water quality of adjacent areas, and biological diversity is declining. The Rowesys student project was launched in summer 2019 at the Swiss Federal Institute of Technology (ETH) Zurich with the aim of developing a robot for more sustainable agriculture without the excessive use of herbicides. On the underside of the robot is a uEye+ industrial camera with GigE Vision interface from IDS Imaging Development Systems GmbH.
Application
The team, consisting of ten students of electrical engineering, mechanical engineering and industrial design, built a first functional prototype called Rosie within only nine months. The robot consists of an outer shell that protects all components inside the housing. Its interior houses two powerful batteries and a control box containing most of the system's electrical components. A structure made of aluminium profiles forms the skeleton of the robot. With the help of four small, individually sprung goose-foot spades, Rosie removes weeds from between the sown plants in a purely mechanical way. The spades are pulled through the soil behind the robot, to the left and right of the rows of plants, so the use of environmentally harmful herbicides can be dispensed with. Since the robot navigates autonomously through the field, the farmer's effort is reduced to a minimum. At the same time, it can also be controlled smoothly and intuitively via a joystick.
In order to navigate autonomously through a field, Rosie must first be able to perceive the exact position as well as the end of a row of plants. For this purpose, the robot has several cameras and sensors as well as algorithms for path and position recognition. Based on the data obtained, it always moves along the recognised row within the field. When it reaches the end, it switches to turning mode and finds the next row. This continues until the entire field has been worked. Thanks to four motors, each wheel can be controlled individually, which makes Rosie very manoeuvrable and allows, among other things, a full turn on the spot.
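The Rowesys navigation code is not published in this article, but the behaviour described above can be pictured as a simple state machine that alternates between following the detected row and turning at its end. The C++ sketch below is purely illustrative: the perception inputs, gains and function names are assumptions, not part of the actual robot software.

```cpp
#include <iostream>

// Hypothetical perception output; on the real robot this information comes
// from the cameras, sensors and the path/position recognition algorithms.
struct RowObservation {
    double lateral_offset_m;   // sideways deviation from the detected row centre
    double heading_error_rad;  // angular deviation from the row direction
    bool   end_of_row;         // true when no further row is detected ahead
};

enum class Mode { FollowRow, Turn };

// Minimal controller sketch: steer back onto the row while following it,
// switch to turning mode at the row end, then resume on the next row.
double steeringCommand(Mode& mode, const RowObservation& obs, bool next_row_found)
{
    const double k_offset = 1.5, k_heading = 0.8;  // illustrative gains only
    switch (mode) {
        case Mode::FollowRow:
            if (obs.end_of_row) { mode = Mode::Turn; return 0.0; }
            return -k_offset * obs.lateral_offset_m - k_heading * obs.heading_error_rad;
        case Mode::Turn:
            // Individually driven wheels even allow a full turn on the spot.
            if (next_row_found) { mode = Mode::FollowRow; return 0.0; }
            return 1.0;  // constant turn rate until the next row is found
    }
    return 0.0;
}

int main()
{
    Mode mode = Mode::FollowRow;
    RowObservation obs{0.05, -0.02, false};
    std::cout << "steering command: " << steeringCommand(mode, obs, false) << std::endl;
    return 0;
}
```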
Rosie removes weeds in a purely mechanical way, without the use of environmentally harmful herbicides
The team is currently working on the further development and optimisation of the system. Various monitoring tasks are on the agenda. The aim is to observe plants over a longer period of time in order to provide the farmer or breeder with information on, for example, their growth behaviour. For this purpose, Rosie was additionally equipped with a robust IDS industrial camera. "Using the images, we distinguish between weeds and crops for herbicide-free weed removal." An active tool below the robot should then also be able to remove weeds directly around the plants. "In the future, much more information will also be extracted with the help of intelligent image processing - from data for yield estimation to early detection of plant diseases," explains Timo Schönegg from the ETH Zurich Rowesys focus project.
The camera is mounted on the underside of the robot and is oriented vertically downwards. While the robot drives through the fields, it captures the plants from above. Depending on the application, neural networks then carry out the analyses described above. In the future, this information should help farmers take measures at an early stage, for example against fungal infestation. Rosie has so far been used for field trials in sugar beet, wheat and maize fields.
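Rowesys does not disclose which framework or network architecture it uses for these analyses, so the following sketch should be read only as an illustration of the crop-versus-weed step: it runs a hypothetical ONNX model on a single top-down frame using OpenCV's DNN module, with the model and image file names made up for the example.

```cpp
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>

int main()
{
    // "plant_classifier.onnx" is a placeholder for a trained crop/weed model.
    cv::dnn::Net net = cv::dnn::readNetFromONNX("plant_classifier.onnx");

    // Load one frame captured by the downward-facing camera ("field_frame.png" is made up).
    cv::Mat frame = cv::imread("field_frame.png");
    if (frame.empty()) { std::cerr << "could not read frame" << std::endl; return 1; }

    // Preprocess: scale to [0,1], resize to the assumed network input size, swap BGR to RGB.
    cv::Mat blob = cv::dnn::blobFromImage(frame, 1.0 / 255.0, cv::Size(224, 224),
                                          cv::Scalar(), /*swapRB=*/true);
    net.setInput(blob);
    cv::Mat scores = net.forward();  // e.g. one score per class: crop vs. weed

    // Pick the class with the highest score.
    cv::Point class_id;
    cv::minMaxLoc(scores.reshape(1, 1), nullptr, nullptr, nullptr, &class_id);
    std::cout << (class_id.x == 0 ? "crop" : "weed") << std::endl;
    return 0;
}
```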
While the robot drives through the fields, the camera records the maize plants from above
Camera
The camera model used must meet special requirements for use in the field. "For our evaluations, we need a camera with a high-resolution colour sensor and short exposure times for sharp images despite the constant movement of the robot. Since it can also be exposed to harsh conditions in the fields, depending on the weather, it also has to be protected against dust and splash water," explains Timo Schönegg. Other selection criteria were compatibility with Linux/ROS, a compact size and fast data transmission for short response times. It also had to be possible to use wide-angle lenses.
Team Rowesys decided on a uEye+ of the type GV-5040FA-C-HQ. "We are very satisfied with the camera we have chosen. Our high demands were fully met, and we were able to collect a lot of data in the past months, which we will use to further train our software," Schönegg affirms. The model is part of the uEye FA family, making it particularly robust and, thanks to IP65/67 protection, ideally suited for demanding environments. Since functions such as pixel pre-processing, LUT or gamma are already integrated into the camera, the required computing power is reduced.
For excellent image quality - even in low light or when capturing fast-moving subjects - the GV-5040FA-C-HQ features the IMX273 global shutter CMOS sensor from Sony's Pregius range, which scores particularly with its sensitivity and high dynamic range. The sensor offers a resolution of 1.58 MPixel (1456 x 1088 pixels) at a frame rate of 78.0 fps.
The robot is controlled and the sensors are read out via the Robot Operating System (ROS) in the programming language C++. Since a functional ROS driver already exists for the IDS camera, its integration was simple and required little effort. For further image processing, the ETH team uses algorithms developed in-house to distinguish between weeds and crops, as described above. "Machine learning algorithms for yield estimation and disease detection with the data now recorded are currently being planned," says Timo Schönegg.
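As an illustration of how such a pipeline is typically wired together in ROS, the minimal C++ node below subscribes to an image topic and hands each frame to a classification routine. The topic name, node name and processing stub are placeholders; the actual Rowesys code and the topic layout of the IDS ROS driver are not reproduced here.

```cpp
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <cv_bridge/cv_bridge.h>
#include <opencv2/core.hpp>

// Placeholder for the in-house weed/crop distinction described in the text.
void classifyPlants(const cv::Mat& frame)
{
    (void)frame;  // ... run the crop-vs-weed analysis on the top-down frame ...
}

// Called for every frame published by the camera driver.
void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
    // Convert the ROS image message into an OpenCV BGR matrix.
    cv_bridge::CvImageConstPtr cv_img = cv_bridge::toCvShare(msg, "bgr8");
    classifyPlants(cv_img->image);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "plant_monitoring_node");
    ros::NodeHandle nh;
    // "/camera/image_raw" is an assumed topic name; the actual topic depends
    // on how the camera driver is launched and configured.
    ros::Subscriber sub = nh.subscribe("/camera/image_raw", 1, imageCallback);
    ros::spin();  // process incoming frames until the node is shut down
    return 0;
}
```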
Outlook
Agricultural robots will play an increasingly important role in the future. In order to be able to reliably perform the various tasks in any environment and in different light and weather conditions, they need not only good image processing software, but above all reliable cameras. The use of such camera systems on future robots will be very diverse - from 3D to multispectral cameras. The Rowesys team will continue to work in particular on intelligent, camera-based solutions in the future and contribute to more sustainable and efficient agriculture with innovative technologies. Goodbye weeds! Bye-bye herbicides! Hello environment!