S3-CAV focuses on making 3D perception of the local environment accessible to farmers transnationally. We devise a sensor framework populated with vision-based and proprioceptive sensors, and combine their real-time input with stored data to provide short-loop safety responses as well as the data needed to control an application device precisely in 3D.
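As a purely illustrative sketch (the class names, thresholds, and fusion rule are our own assumptions, not part of the S3-CAV design), the short-loop safety behaviour amounts to fusing live range readings with stored obstacle data and reacting well before the slower cloud pipeline is involved:

```python
from dataclasses import dataclass

# Hypothetical short-loop safety check: fuse the nearest live range reading
# with the nearest obstacle known from stored data, then pick a response.

@dataclass
class RangeReading:
    distance_m: float   # distance to the nearest object along the sensor axis
    bearing_deg: float  # direction relative to the vehicle heading

SAFETY_STOP_DISTANCE_M = 1.5  # assumed threshold, tuned per machine/implement

def short_loop_safety(readings: list[RangeReading],
                      known_obstacles_m: list[float]) -> str:
    """Return 'STOP', 'SLOW', or 'CONTINUE' from live and stored data."""
    nearest_live = min((r.distance_m for r in readings), default=float("inf"))
    nearest_known = min(known_obstacles_m, default=float("inf"))
    nearest = min(nearest_live, nearest_known)
    if nearest < SAFETY_STOP_DISTANCE_M:
        return "STOP"
    if nearest < 2 * SAFETY_STOP_DISTANCE_M:
        return "SLOW"
    return "CONTINUE"

# Example: one close live reading overrides an otherwise clear stored map.
print(short_loop_safety([RangeReading(1.2, 10.0)], known_obstacles_m=[8.0]))
```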
The detailed sensor data is also sent to a commercial cloud-based Precision Farming Management Information System (FMIS), where it is combined with stored data from earlier passes and other sources to produce human-readable maps with semantic overlays showing whatever is relevant and requested: crop health, crop maturity, field traversability, irrigation networks, and so on. The versatility of this general sensor framework makes it truly transnational: sensors and overlays can be adapted to specific crops, climates, and geographical conditions.
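To make the overlay idea concrete, the following minimal sketch represents one semantic overlay as a named, georeferenced grid layer that accumulates samples from multiple passes; the field names, cell size, and averaging rule are assumptions for illustration only, not the project's data model:

```python
from collections import defaultdict

class OverlayLayer:
    """One semantic overlay (e.g. 'crop_health') over a georeferenced grid."""

    def __init__(self, name: str, cell_size_m: float = 5.0):
        self.name = name
        self.cell_size_m = cell_size_m
        self._sums = defaultdict(float)   # (col, row) -> accumulated value
        self._counts = defaultdict(int)   # (col, row) -> number of samples

    def _cell(self, easting_m: float, northing_m: float):
        return (int(easting_m // self.cell_size_m),
                int(northing_m // self.cell_size_m))

    def add_sample(self, easting_m: float, northing_m: float, value: float):
        """Add one sensor-derived sample from any pass or source."""
        cell = self._cell(easting_m, northing_m)
        self._sums[cell] += value
        self._counts[cell] += 1

    def value_at(self, easting_m: float, northing_m: float):
        """Averaged value for the cell, or None where no data exists yet."""
        cell = self._cell(easting_m, northing_m)
        n = self._counts[cell]
        return self._sums[cell] / n if n else None

# Samples from an earlier pass and the current pass fall into the same cell.
health = OverlayLayer("crop_health")
health.add_sample(102.0, 48.0, 0.62)   # earlier pass
health.add_sample(103.5, 49.2, 0.70)   # current pass
print(health.value_at(102.5, 48.5))    # -> approx. 0.66
```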
Factors that hinder widespread adoption of Precision Farming (PF) include the perceived cost and complexity of getting started, and the misconception that PF is feasible only for large row-crop farms. We address complexity by proposing a general PF solution with a common interface for data and farm management, auto guidance, data visualization, and action planning, integrated into an existing commercial FMIS. We address the row-crop misconception by building our initial test system to work in vineyards and olive groves.
Adoption of our system is further encouraged by openness at both ends: any map-based sensor data in an open format can be incorporated as input, while the output is compatible with most currently used agricultural devices, that is, everything that uses ISOXML, and the resulting maps are displayed on a standard Android tablet PC.
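As one example of ingesting an open-format, map-based source, the sketch below reads point samples from a GeoJSON file using only the Python standard library; the file name and the property key are invented for the example, and any other open geodata format could be handled by an analogous reader before the data is pushed on to ISOXML-capable devices:

```python
import json

def load_point_samples(geojson_path: str, property_name: str):
    """Yield (longitude, latitude, value) triples from a GeoJSON FeatureCollection."""
    with open(geojson_path, encoding="utf-8") as f:
        collection = json.load(f)
    for feature in collection.get("features", []):
        geometry = feature.get("geometry", {})
        if geometry.get("type") != "Point":
            continue                      # keep the sketch to point samples
        lon, lat = geometry["coordinates"][:2]
        value = feature.get("properties", {}).get(property_name)
        if value is not None:
            yield lon, lat, value

# Usage with a hypothetical file: feed the samples into an overlay layer.
# for lon, lat, v in load_point_samples("vineyard_scan.geojson", "crop_health"):
#     print(lon, lat, v)
```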