Source: UNIVERSITY OF NEBRASKA submitted to NATIONAL INSTITUTE OF FOOD AND AGRICULTURE
PAPM EAGER: TRANSITIONING TO THE NEXT GENERATION PLANT PHENOTYPING ROBOTS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
NEW
Funding Source
Reporting Frequency
Annual
Accession No.
1011763
Grant No.
2017-67007-25941
Project No.
NEB-21-169
Proposal No.
2016-10994
Multistate No.
(N/A)
Program Code
A5172
Project Start Date
Nov 15, 2016
Project End Date
Nov 14, 2018
Grant Year
2017
Project Director
Ge, Y.
Recipient Organization
UNIVERSITY OF NEBRASKA
(N/A)
LINCOLN, NE 68583
Performing Department
Biological Systems Engineering
Non Technical Summary
High throughput plant phenotyping (HTPP), the use of holistic and large-scale approaches to collect plant phenotypic information, bears the promise of sparking a new green revolution. It will be an important part of the solution to the global food security challenge by 2050, when the world population is likely to exceed 9.7 billion. As the current state of the art in HTPP, digital imaging greatly enhances our ability to capture plant phenotypes. However, the disadvantage of digital imaging is also obvious: phenotypes are measured in terms of image pixel count or pixel intensity. These image data, by themselves, convey little information of biological relevance. To maximize their utility, they have to be analyzed and interpreted jointly with manually measured plant physiological or chemical traits, which are still slow and expensive to collect. The overall goal of this project is to develop next generation plant phenotyping robots to enable autonomous, in vivo (and human-like) plant physiological and chemical trait measurements. The phenotyping robots will greatly improve the throughput and capacity of plant phenotyping while substantially reducing its cost.

This goal will be realized through three research thrusts. First, novel robotic grippers that integrate specialized plant sensors will be designed and developed. Two specific plant sensors being considered are (i) a Y-shaped bifurcating fiber optic sensing head for a variety of optically sensed plant physiological and chemical traits, and (ii) a leaf porometer for stomatal conductance measurement. Second, a novel robotic vision system that combines an RGB camera and a Time-of-Flight (TOF) 3D depth camera will be constructed. Novel image processing algorithms will be developed for plant leaf segmentation and localization; the algorithms will also rank the most suitable plant leaves for automated sensing and calculate the approach vector of the robotic gripper for successful leaf grasping and sensing. Third, the developed plant phenotyping robot will be tested in the high throughput imaging greenhouse at the University of Nebraska-Lincoln. Different corn and soybean lines with known susceptibility to water and nutrient stresses will be used to demonstrate and validate the throughput, accuracy, and capacity of the phenotyping robot.

The plant phenotyping robots will be a critical enabling technology to advance the science of plant phenomics and allow better genomics-phenomics analysis for trait discovery and crop improvement, an indispensable part of the overall solution to the long term food and energy security problems facing our society.

This project will create an interdisciplinary environment where graduate and undergraduate students will receive training in both plant science and engineering robotics. Undergraduates will be employed on the project through BSE-PIE, a program initiated by the team to attract students with engineering backgrounds to conduct research on plant phenotyping. Using material from this project, the PIs will develop new course modules to expose students to this new frontier of automated plant phenotyping. A robotics competition focusing on autonomous plant phenotyping will be created within the PIs' professional society. Finally, results and findings from this project will be broadly disseminated through conferences, peer reviewed publications, and open collaborative platforms including the iPlant Collaborative and the Robot Operating System (ROS).
Animal Health Component
0%
Research Effort Categories
Basic
20%
Applied
40%
Developmental
40%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
201                 | 1510                           | 2020                   | 50%
203                 | 1820                           | 1080                   | 50%
Goals / Objectives
The overall goal of this project is to develop automated robotic systems that can realize in vivo, human-like plant phenotyping in the greenhouse. There are three specific objectives:
1) Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing;
2) Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing;
3) Evaluate and validate the plant phenotyping robot.
Project Methods
Objective 1: Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing. We will develop two types of robotic grippers integrated with specialized plant sensors. The first will integrate a Y-shaped bifurcating fiber optic sensing head that allows the retrieval of optical properties from plant leaves. This design will allow the measurement of a wide array of optical traits that are closely associated with plant chemical and physiological traits (such as water content, nitrogen, pigments, and photosynthesis). The second will integrate an off-the-shelf leaf porometer from Decagon Devices to measure stomatal conductance and gas exchange. The grippers will then be integrated with a robotic manipulator.

Objective 2: Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing. We will develop an RGB and 3D depth camera based vision system for the plant phenotyping robot. Novel image processing algorithms will be developed to (1) segment individual plant leaves from the background, (2) rank the suitability of each segmented leaf on the plant for automated sensing, and (3) compute the approach vector for each grasping point. After the vision system is developed, "eye-hand" coordination will be tested extensively to make sure the robotic manipulator/gripper from Obj. 1 and the vision system from Obj. 2 work harmoniously and cooperatively.

Objective 3: Evaluate and validate the phenotyping robot. We will conduct experiments to evaluate and validate the phenotyping robot in the high throughput phenotyping greenhouse at the University of Nebraska-Lincoln. The validation tests will be conducted using maize and soybean, and will include two forms of stress common under agronomic conditions: drought and nitrogen limitation. Similar measurements by human operators will be carried out concurrently with the measurements by the phenotyping robot. In addition, all plants will be imaged using five different imaging modules in the greenhouse. Data collected from the validation tests will be analyzed to answer the following key questions regarding the overall hypothesis. (1) What is the success rate (η) and average cycle time (t) of the phenotyping robot for automated leaf grasping and sensing, and how do they compare to a human operator? Answering these questions addresses the throughput requirement. We expect η and t to vary substantially with plant species, canopy architecture, and developmental stage. (2) How do the leaf trait measurements by the phenotyping robot compare to those by human operators? Correlation analysis will be conducted on these two sets of measurements. Answering this question addresses the accuracy requirement of the phenotyping robot. (3) Correlation analysis and comparison will also be made between the plant images and the robotic vs. human measurements, and we will attempt to identify existing or novel image-based traits (or combinations of traits) that are reliable predictors of the physiological and biochemical traits measured by the phenotyping robot. This will evaluate the usefulness of data collected by the phenotyping robot in complementing plant images to improve the capacity of plant phenotyping.
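To make the Objective 2 steps concrete, here is a minimal sketch in Python of depth-based leaf segmentation, suitability ranking, and approach-vector computation. It assumes the TOF frame has already been unpacked into per-pixel X/Y/Z coordinates (in meters); the thresholds, the scoring rule, and all function names are illustrative assumptions, not the project's actual algorithm.

    import numpy as np
    from scipy import ndimage

    def segment_leaves(xyz, max_depth=1.2, min_pixels=200):
        """Split plant pixels from the background by depth, then treat each
        connected component as a candidate leaf."""
        plant_mask = xyz[..., 2] < max_depth      # pixels nearer than the background
        labels, n = ndimage.label(plant_mask)
        leaves = []
        for i in range(1, n + 1):
            mask = labels == i
            if mask.sum() >= min_pixels:          # discard small fragments / noise
                leaves.append(xyz[mask])          # (N, 3) point set of one candidate
        return leaves

    def rank_and_approach(leaves):
        """Rank candidate leaves (larger and flatter first, a stand-in criterion)
        and return each leaf's centroid with an approach vector along the
        leaf-plane normal estimated by PCA of its points."""
        ranked = []
        for pts in leaves:
            centroid = pts.mean(axis=0)
            _, s, vt = np.linalg.svd(pts - centroid, full_matrices=False)
            normal = vt[2]                        # smallest-variance direction
            flatness = 1.0 - s[2] / (s[0] + 1e-9) # close to 1 for planar leaves
            ranked.append((len(pts) * flatness, centroid, normal))
        ranked.sort(key=lambda r: r[0], reverse=True)
        return ranked                             # best grasp candidate first

The centroid and normal of the top-ranked leaf together define a grasp point and an approaching direction that an inverse-kinematics solver could target.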

Progress 11/15/16 to 11/14/17

Outputs
Target Audience: The target audiences reached by the project's efforts include: (1) Nebraska farmers and growers with an interest in agricultural technology. On Feb 2, 2017, PI Ge gave a presentation titled "High throughput plant phenotyping research at University of Nebraska-Lincoln" at the Nebraska Agricultural Technologies Association Conference. About 30 growers from all over Nebraska attended the presentation. (2) On Mar 4, 2017, PI Ge was invited to give a presentation titled "High throughput plant phenotyping in greenhouse and field - Translational pipelines from gene discovery to crop improvement" at Iowa State University's R.F. Baker Plant Breeding Symposium. About 150 graduate students in plant breeding and professionals from industry listened to the presentation. (3) On Apr 7, 2017, PI Ge was invited to give a presentation titled "Engineering instruments and robotics for high throughput plant phenotyping" at the Predictive Crop Design: Genome-to-Phenome Symposium (hosted by NSF and Nebraska EPSCoR). About 200 scientists (primarily plant scientists) from universities in NE, KS, MO, ND, SD, and IA attended the Symposium and the presentation. (4) On Apr 10, 2017, PI Ge gave a presentation titled "Advanced imaging for phenotyping water-related crop traits" at the 2017 Water for Food Global Conference. The seminar was attended by 50 scholars from around the world with an interest in agricultural water use and management. (5) On July 17-19, PI Ge and graduate student Abbas Atefi attended the annual meeting of the American Society of Agricultural and Biological Engineers (ASABE). Atefi made a presentation titled "Development of a robotic system to grasp plant leaves for phenotyping" in a technical session. The presentation was attended by ~60 people with a technical interest in agricultural and biological engineering. (6) Also at this ASABE meeting, Co-PI Pitla and PI Ge led a UNL robotics team in the student robotics competition. Graduate student Abbas Atefi (who is supported by this grant) served as the team captain, and the team members were drawn from BSE-PIE (Biological Systems Engineering - Programming Instrumentation and Electronics), a program initiated by Pitla with support from this grant. The team leveraged a great deal of knowledge and experience from this project (such as machine vision and the gripper actuation system). There were 17 student teams from all over the world in the competition, and the UNL team won second place. More than 500 professionals and students watched the competition. (7) On Aug 4-5, PI Ge attended the National Association of Plant Breeders meeting and made a poster presentation titled "High throughput plant phenotyping robot". Also at this meeting, Ge gave a 5-minute flash talk to introduce the project at the NIFA Workshop: Plant Breeding, Engineering, Cyber Physical Systems, and Breakthrough Technologies.

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? 1.5 graduate students have been trained in this project (Abbas Atefi and Ujjwol Bhandari). They have been responsible for the day-to-day operation of the project, including the design and fabrication of the plant phenotyping robot, its greenhouse testing, and data collection and analysis. Two undergraduate students have also been trained in the project (Ema Muslic and Yanni Yang, both female); Muslic has been in UNL's work-study program. Their responsibilities mainly included caring for the plants in the greenhouse and assisting the graduate students with data collection during the experiments. The UNL robotics team from the BSE-PIE program was a six-member team: Abbas Atefi, Piyush Pandey (graduate student), CheeTown Liew (undergraduate from EE), Hesan Sdt (undergraduate from EE), Lucas Renker (undergraduate from EE), and Jenny Wynn (undergraduate from BSE). The undergraduate students on the team received significant training from the graduate students and faculty advisors (Pitla and Ge) in machine vision, image processing, and robotic design for plant detection and analysis. In addition, the team engaged in a senior design capstone project on agricultural robotics, which provided significant training to the four-member capstone team: John Shook, Purity Muhia (female), Alec Fuelberth, and Karlie Knoepfler (female), all BSE undergraduate students.

How have the results been disseminated to communities of interest? The project results have been widely disseminated by the research team. Research presentations were made to the following groups: (1) a group of farmers in Nebraska interested in agricultural technologies; (2) agricultural engineers from around the world; (3) graduate students in plant breeding at Iowa State University; (4) plant scientists from Midwest states; (5) the National Association of Plant Breeders; and (6) scientists from around the world working in irrigation and crop water use. Internal to UNL, a few presentations were given (by Ge and Schnable) to a wide group of scientists interested in plant phenomics.

What do you plan to do during the next reporting period to accomplish the goals? The PIs plan to do the following during the next reporting period. First, we will further design and develop the robotized gripper. Currently the integrated sensors can measure leaf reflectance and temperature. As proposed, we will build the gripper out to measure two more traits at the leaf and whole-plant level: (1) gas exchange rate / stomatal conductance, by integrating a commercial stomatal conductance sensor; and (2) stalk thickness of maize/sorghum, by integrating a mechanical sensor. We already have initial blueprints of these designs, and development of the gripper will start immediately in the next reporting period. Second, we will further improve the machine vision and image processing algorithms so that they are more efficient for plant leaf segmentation and grasp point localization, in particular for the bushy canopies of soybean and wheat. One potential solution is to merge a high resolution RGB camera with the 3D TOF camera, and use vegetation pixels in the RGB images to improve segmentation and localization (a sketch of this idea appears after this section). Third, we will analyze data from the second evaluation of the phenotyping robot and prepare a manuscript for publication; the targeted journal will be "Plant Methods" or "Computers and Electronics in Agriculture". Last, we will plan a final large-scale evaluation of the phenotyping robot when it is fully completed. We will use well-characterized diversity panels for the evaluation (such as the Maize and Soybean Nested Association Mapping populations; Co-PI Schnable will ensure the availability of seeds). Plants will be grown under control and low water/nitrogen conditions. When they reach the full vegetative stage, robotized measurements will be taken on these plants together with an array of conventional measurements.

There will be two major objectives in the data analysis: (1) to see how well the robotized trait measurements correlate with conventional measurements; and (2) to see how the measurements by the phenotyping robot can be used to detect genetic controls of these traits (using GWAS or QTL analysis). The second objective will be important for testing the overall hypothesis that the phenotyping robot can be a useful tool for high throughput plant phenotyping and plant genotype-phenotype research. Finally, we plan to prepare a manuscript to publish the final phenotyping robot and the evaluation results.
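As a rough illustration of the proposed RGB + TOF fusion mentioned above, the sketch below combines an excess-green vegetation mask from the RGB image with a depth gate from the TOF camera. It assumes the two images are already pixel-registered; the excess-green index and both thresholds are common choices used here for illustration, not necessarily what the team will implement.

    import numpy as np

    def fused_plant_mask(rgb, depth, max_depth=1.2, exg_thresh=0.05):
        """Combine an excess-green (ExG) vegetation mask from a registered RGB
        image with a TOF depth gate, suppressing soil/background pixels as
        well as green clutter behind the plant."""
        r, g, b = [rgb[..., i].astype(float) / 255.0 for i in range(3)]
        exg = 2.0 * g - r - b          # excess-green vegetation index
        veg = exg > exg_thresh         # likely vegetation pixels
        near = depth < max_depth       # pixels within the robot's working volume
        return veg & near              # boolean mask of reachable plant pixels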

Impacts
What was accomplished under these goals? Please use this link to see a video clip of the plant phenotyping robot in action: https://unl.box.com/s/du8gqbcb0kmbscnlrvhyhcpytyoqxhme Please use this link for a slide set of pictures showing the progress: https://unl.box.com/s/vgdbga0tqvsdhmh4xete463uw97yw1l0

Objective 1: Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing. A robotic gripper integrating (1) an optical fiber cable coupled to a portable VisNIR spectrometer for leaf reflectance measurement, and (2) a thermistor for leaf temperature measurement was successfully made. The gripper went through a few rounds of design, redesign, and improvement so that it could be coupled to a four degree-of-freedom MICO2 robotic arm (KINOVA Inc., Quebec, Canada). The final gripper was 3D printed. We also developed the inverse kinematics of the system.

Objective 2: Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing. An SR4500 TOF (Time of Flight) camera (Mesa Imaging, Zurich, Switzerland) was used as the main sensor for the vision system. The camera captures a 3D depth image containing four layers of information (the X, Y, and Z coordinates and the reflectance intensity of each pixel). Using the Z (depth) information, the plant pixels were first segmented from the background. The image processing algorithm then went through four major steps: stem detection and removal; leaf identification; leaf angle calculation; and leaf ranking. Combining Obj. 1 and Obj. 2, we developed a GUI (graphical user interface) in MATLAB to coordinate the vision and robotic gripping modules (eye-hand coordination) and to take robotized measurements from test plants.

Objective 3: Evaluate and validate the plant phenotyping robot. We have completed two rounds of evaluation of the prototype phenotyping robot at UNL's high throughput plant phenotyping greenhouse. The first round of evaluation was conducted in June 2017. The experiment used 10 maize, 10 sorghum, 10 soybean, and 10 wheat plants; its purpose was to evaluate how the plant phenotyping robotic system would perform across different plant sizes and canopy complexities. All plants were grown under normal conditions and no stress was imposed. This test found that the robot's performance was satisfactory for maize and sorghum, whereas the performance metrics were lower for soybean and wheat. Both soybean and wheat have bushy canopies that made the identification and localization of suitable grasping points quite challenging; in addition, wheat plants have narrow leaf blades that made robotized grasping difficult. These problems were expected at the beginning of the project. A decision was made to proceed with a second round of evaluation focusing on maize and sorghum plants, while the team continues to work on the machine vision and image processing system (Obj. 2) to improve performance on soybean and wheat. The second round of evaluation started on Oct 10 and focused on 48 maize and 48 sorghum plants. The experiment was a 2x2 factorial treatment design: the first factor was water treatment with two levels (water-limited vs. control), and the second factor was nutrient treatment with two levels (high nutrient vs. low nutrient). The goal was to create large differences in leaf physiological properties (such as temperature, water content, and reflectance) so that they could be detected by robotized measurement. In parallel to the robotized measurements, a student also took measurements with handheld sensors (an ASD VisNIR spectrometer, a fluorimeter, and an IR radiometer) on the same plant leaves. These human-based measurements will later be used to assess the accuracy of the robotized measurements. After all the measurements, the plants were destructively sampled: leaves were harvested and dried in an oven for water content, and the dried leaf samples were then sent to Midwest Laboratories for nutrient analysis (nitrogen, phosphorus, and potassium). These lab-based destructive measurements will later be used for correlation against the leaf-level reflectance measured by the phenotyping robot. Analysis of data from the second evaluation experiment is ongoing.
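As a rough sketch of the pending analysis, the Python fragment below computes the success rate (η) and average cycle time (t) defined in the Project Methods, plus the Pearson correlation between robotized and handheld measurements. The data layout, variable names, and all numbers are hypothetical.

    import numpy as np
    from scipy.stats import pearsonr

    def evaluate(outcomes, cycle_times, robot_vals, human_vals):
        """Compute success rate (eta), mean cycle time (t), and the Pearson
        correlation between robotized and human-operator trait measurements."""
        eta = np.mean(outcomes)                 # fraction of successful grasps
        t = np.mean(cycle_times)                # average seconds per grasp-and-sense cycle
        r, p = pearsonr(robot_vals, human_vals) # accuracy vs. handheld reference
        return eta, t, r, p

    # Example with made-up numbers:
    eta, t, r, p = evaluate(
        outcomes=[1, 1, 0, 1, 1],                    # 1 = successful grasp
        cycle_times=[38.0, 41.5, 55.2, 36.8, 40.1],  # seconds
        robot_vals=[0.42, 0.38, 0.51, 0.47, 0.40],   # e.g., reflectance at one band
        human_vals=[0.44, 0.36, 0.49, 0.50, 0.41],
    )

The same correlation call applies to comparing robot-measured leaf reflectance against the lab-based nitrogen, phosphorus, and potassium values once those results return.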

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2017 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2017. Development of a robotic system to grasp plant leaves for phenotyping. 2017 ASABE Annual Meeting Abstract.