Robotics :: Autonomous Guidance

Annual Progress Report:

Objectives:

  1. Development of improved sensing, fruit detection, data archiving, machine intelligence, and visual servo control capabilities.
  2. Development of autonomous vehicle guidance and navigation as an enhancement to harvesting performance and operator effectiveness.

Current work:

Fruit Detection System Development
We are currently exploring sensor technologies that should improve our ability to locate fruit in the tree canopy and overcome some of the difficulties associated with traditional machine vision approaches attempted in the past. We will specifically look at NIR and FIR imaging technologies and their ability to improve fruit detection when combined with visible-spectrum sensing. In addition, we are developing canopy mapping technology that will allow us to build 3D image models of the canopy to assist harvesting.
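As a rough illustration of how visible and NIR channels can be combined, the sketch below computes a simple normalized-difference index and thresholds it to flag candidate fruit pixels. The band choice, the sign of the index, and the threshold values are illustrative assumptions, not calibrated values from this project.

```python
# Minimal sketch: fusing a visible-spectrum image with a co-registered NIR
# image to segment candidate fruit regions. Thresholds are placeholders.
import numpy as np

def fruit_mask(rgb: np.ndarray, nir: np.ndarray,
               red_min: float = 0.45, index_min: float = 0.15) -> np.ndarray:
    """Return a boolean mask of likely fruit pixels.

    rgb : HxWx3 float array in [0, 1], visible-spectrum image
    nir : HxW   float array in [0, 1], registered near-infrared image
    """
    red = rgb[..., 0]
    # Normalized difference between red and NIR reflectance; the assumption
    # here is that fruit separates from foliage in this index.
    nd = (red - nir) / np.clip(red + nir, 1e-6, None)
    return (red > red_min) & (nd > index_min)

# Usage: mask = fruit_mask(rgb_frame, nir_frame); connected components in
# `mask` then give candidate fruit locations for the harvester.
```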

Autonomous Vehicle Guidance and Navigation
The use of vehicle navigation systems in agriculture is becoming more widely accepted in traditional field crops, yet adoption in orchard crops has been delayed by the problems DGPS encounters under tall tree canopies. UF has been exploring technologies that can autonomously navigate equipment in the citrus grove. This could offer a number of advantages for 24 hr/day harvesting: (a) reduced worker fatigue; (b) improved machine steering precision; (c) elimination of the nighttime vision challenges of observing the tree line and safety hazards; (d) the opportunity for precise machine-to-machine steering and speed synchronization between the harvesters and the "goat" fruit-hauling trucks; and (e) the potential for one or more of these machines to be unmanned. We will continue to develop vehicle navigation systems that could be deployed on MH systems. Initial studies will develop the technology on smaller equipment platforms, such as a John Deere 'Gator'; eventually it will be adapted to full-scale harvesting systems. Numerous other applications exist as well, such as autonomous spraying, mowing, and scouting.
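To illustrate the kind of row-following steering law such a platform might use, the sketch below implements standard pure pursuit on a bicycle model. This is not the project's actual controller; the wheelbase and lookahead values are hypothetical, and the waypoints would come from the perception system described above.

```python
# Minimal pure-pursuit sketch for row following on a small utility vehicle.
# WHEELBASE and LOOKAHEAD are illustrative values, not measured parameters.
import math

WHEELBASE = 1.8   # m, assumed vehicle wheelbase
LOOKAHEAD = 3.0   # m, assumed lookahead distance

def steering_angle(pose, path):
    """pose = (x, y, heading_rad); path = list of (x, y) waypoints.
    Returns a front-wheel steering angle in radians."""
    x, y, theta = pose
    # Pick the first waypoint at least LOOKAHEAD away from the vehicle.
    goal = next((p for p in path
                 if math.hypot(p[0] - x, p[1] - y) >= LOOKAHEAD), path[-1])
    # Transform the goal point into the vehicle frame.
    dx, dy = goal[0] - x, goal[1] - y
    gx = math.cos(theta) * dx + math.sin(theta) * dy
    gy = -math.sin(theta) * dx + math.cos(theta) * dy
    denom = gx * gx + gy * gy
    if denom < 1e-9:          # goal coincides with the vehicle; go straight
        return 0.0
    # Pure-pursuit curvature and the corresponding bicycle-model steering.
    curvature = 2.0 * gy / denom
    return math.atan(WHEELBASE * curvature)
```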

Results:

  1. A new strategy has been developed for the visual servo control problem in which the camera cannot be positioned a priori at the desired position/orientation to teach the robot the task space. This technology has direct application to citrus harvesting. A "teach by zooming" (TBZ) approach is formulated in which the objective is to position/orient a camera based on a reference image obtained by another camera. Simulations have been carried out that demonstrate satisfactory performance of the developed controller. We are in the process of enhancing the visual servo interface to the RRC 1207 manipulator arm; a major hurdle is improving the communication rates between the vision processor and the manipulator processor, and progress has been demonstrated toward this goal. (A minimal illustration of the visual-servo control structure appears after this list.)
     
  2. The 3D mapping and image registration project consists of four primary processes: first, the sensory data (currently vision, ladar, and DGPS) from the grove are collected, filtered, and preprocessed; next, the data are prepared for archival and database generation; then off-line batch processing is conducted to explore the data files for conditions of interest; and finally a virtual-reality representation is created in which the grove information can be queried for features or conditions of interest. We have recently completed the majority of the sensory system requirements and are currently exploring technologies for the database and virtual reality generation. (A simple sketch of this data flow appears after this list.)
     
  3. Research and development continues on vehicle guidance and navigation in the citrus grove. Control algorithms are being developed that fuse machine vision and laser radar perception to improve guidance control and provide obstacle detection. In addition, control approaches are being sought that could autonomously navigate both vehicles in a canopy shake-and-catch system, freeing operators to monitor overall performance rather than manually driving the vehicles. (A simple sensor-fusion sketch appears after this list.)
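As a point of reference for the visual-servo work in item 1, the sketch below implements a classical image-based visual servo (IBVS) update for point features. It is not the TBZ formulation itself, whose reference image comes from a second camera; the gain and the assumption of known feature depths are illustrative.

```python
# Minimal sketch of a classical IBVS velocity update for point features.
# Depths are assumed known/estimated; the gain is an illustrative value.
import numpy as np

def ibvs_velocity(features, desired, depths, gain=0.5):
    """features, desired: (N, 2) normalized image coordinates (x, y);
    depths: (N,) estimated point depths.
    Returns a 6-vector camera velocity (vx, vy, vz, wx, wy, wz)."""
    rows = []
    for (x, y), z in zip(features, depths):
        # Interaction (image Jacobian) matrix rows for one point feature.
        rows.append([-1.0 / z, 0.0, x / z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / z, y / z, 1.0 + y * y, -x * y, -x])
    L = np.asarray(rows)
    err = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    # v = -gain * pinv(L) @ e drives the image-space error toward zero.
    return -gain * np.linalg.pinv(L) @ err
```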
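For the mapping project in item 2, the sketch below shows one way the collect/archive/batch-query stages could fit together. The record fields and file format are hypothetical; the actual database and virtual reality components are still being selected.

```python
# Minimal sketch of the grove-mapping data flow: timestamped sensor records
# (vision, ladar, DGPS) archived to a simple store, then batch queried for
# conditions of interest. Field names are illustrative placeholders.
from dataclasses import dataclass
import json

@dataclass
class GroveRecord:
    t: float                 # sensor time stamp
    lat: float               # DGPS latitude
    lon: float               # DGPS longitude
    canopy_height_m: float   # e.g., derived from a ladar scan
    image_file: str          # path to the archived camera frame

def archive(records, path="grove_log.jsonl"):
    """Append preprocessed sensor records to a line-oriented archive."""
    with open(path, "a") as f:
        for r in records:
            f.write(json.dumps(r.__dict__) + "\n")

def query(path, predicate):
    """Off-line batch pass over the archive for conditions of interest."""
    with open(path) as f:
        rows = (json.loads(line) for line in f)
        return [r for r in rows if predicate(r)]

# Example: all logged locations with unusually tall canopy.
# tall = query("grove_log.jsonl", lambda r: r["canopy_height_m"] > 4.0)
```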
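For the guidance work in item 3, the sketch below shows one simple way to fuse vision- and ladar-derived lateral path offsets by inverse-variance weighting. The variance values are placeholders, and the fusion algorithms actually under development may differ.

```python
# Minimal sketch: fuse two noisy lateral-offset estimates (meters) into one
# steering error. Variances are illustrative; in practice they would be
# tuned or estimated per sensor and per scene.
def fused_offset(vision_offset, ladar_offset,
                 vision_var=0.04, ladar_var=0.01):
    """Inverse-variance weighting of vision and ladar offset estimates."""
    w_v = 1.0 / vision_var
    w_l = 1.0 / ladar_var
    return (w_v * vision_offset + w_l * ladar_offset) / (w_v + w_l)
```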

Publications:

  • Mehta, S., W. Dixon, T. Burks, and S. Gupta. 2005. Teach by Zooming Visual Servo Control for an Uncalibrated Camera System. Proceedings of the AIAA Guidance, Navigation, and Control Conference, AIAA-2005-6095, San Francisco, CA.
  • Sivaraman, B., T. Burks, and J. Schueller. 2006. Using Modern Robot Synthesis and Analysis Tools for the Design of Agricultural Manipulators. Agricultural Engineering International: the CIGR Ejournal, Invited Overview Paper No. 2, Vol. VIII.
  • Flood, S.J., T.F. Burks, and A.A. Teixeira. 2005. Physical Properties of Oranges in Response to Applied Gripping Forces for Robotic Harvesting. Transactions of ASAE (submitted 12/20/05).
  • Sivaraman, B., and T.F. Burks. 2005. Geometric Performance Indices for Analysis and Synthesis of Manipulators for Robotic Harvesting. Transactions of ASAE (submitted 12/20/05).
  • Subramanian, V., T.F. Burks, and A.A. Arroyo. 2006. Machine Vision and Laser Radar-based Vehicle Guidance System for Citrus Grove Navigation: A Performance Comparison. Computers and Electronics in Agriculture (revision re-submitted 3/10/06).

For more information

Contact:
Tom Burks

