Source: FAUXSEE INNOVATIONS LLC submitted to
ROBOFIND™ - SAFE & EFFICIENT RURAL TRAVEL FOR THE BLIND
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
TERMINATED
Funding Source
Reporting Frequency
Annual
Accession No.
1006195
Grant No.
2015-33610-23455
Project No.
ARKW-2015-00600
Proposal No.
2015-00600
Multistate No.
(N/A)
Program Code
8.6
Project Start Date
Jun 1, 2015
Project End Date
Jan 31, 2017
Grant Year
2015
Project Director
Ballard, D. R.
Recipient Organization
FAUXSEE INNOVATIONS LLC
4030 COLUMBIA ROAD 5
MAGNOLIA, AR 71753
Performing Department
(N/A)
Non Technical Summary
The project will prove the feasibility of providing safer and more effective travel for blind individuals who live in rural communities. To that end we will:
1. Determine which technologies can be used to provide geo-location information for a blind user navigating in a rural environment. While some "talking GPS" devices are available, the mapping information they provide is not adequate. Existing GPS devices know the location of streets and major structures, but the mapping information usually does not include the areas and structures on a farm or ranch (e.g., house, barn, storage shed, silo, holding pen). RoboFind™ will allow mapping all of the structures of interest on a farm, ranch or rural community and then effectively communicate that information to the blind user. The user can then navigate safely.
2. Determine how other technologies can be used to provide the blind with information about the local environment. In particular we are interested in locating fences, livestock, large pieces of farm equipment, gates, doors in buildings, etc. Radio Frequency Identification (RFID) and other forms of Automatic Identification and Data Capture (AIDC) are technologies with potential for providing this additional information. RoboFind™ will detect these location and identification tags and communicate this information to the user.
3. Make the necessary changes and additions to our existing Roboglasses™ product so that it can be used to best effect in rural environments. Roboglasses™ uses haptic feedback to communicate the distance and location of objects detected with ultrasonic sensors. Fusing this information with GPS and RFID information will provide the blind user with a coherent view of the local environment. In addition, we will add audio feedback to Roboglasses™ so that some location and obstacle information can be communicated by voice (e.g., "south barn door is 20 feet ahead and to the right").
Today, 46% of blind and visually impaired people experience a head injury at least once a month, sight-impaired individuals are 1.5 times more likely to be obese, 81% of blind adults are unemployed, and 4 million blind individuals in the U.S. over 25 years old do not have a high school diploma or GED. The Roboglasses™ and RoboFind™ devices will provide safe and more efficient travel for blind individuals, 4.5 million of whom live in rural communities in the United States. These individuals will be able to navigate outside the confines of their immediate environment to work, attend school, and enjoy more recreation and wellness activities.
Animal Health Component
0%
Research Effort Categories
Basic
0%
Applied
30%
Developmental
70%
Classification

Knowledge Area (KA)  Subject of Investigation (SOI)  Field of Science (FOS)  Percent
723                  7220                            2020                    100%
Goals / Objectives
The goal of this SBIR research effort is to address the specific problems that blind rural inhabitants incur and to develop a solution that will allow the blind to continue to live and work in a rural environment. To that end, we are proposing the development of a new product that will work with our Roboglasses™ product and provide navigation, object identification and other capabilities to the blind user. We term this new innovation RoboFind™ because it can be used to help a blind wearer find his way, find a particular location or even a particular object.
The primary objective of the proposed research is to determine the feasibility of leveraging new technology that will enable blind users to navigate, avoid hazards and obstructions, and travel safely in a rural environment. We will determine the feasibility of augmenting Roboglasses™ with a GPS navigation capability coupled with strategic use of RFID tags to improve the life and work efficiency of blind and sight-impaired users. The technical objective of the proposed work can be summarized as the development of an add-on device that augments Roboglasses™ by providing direction and object identification services to the user. To meet this primary objective a number of sub-objectives must be accomplished. These include:
• Determining methods of fusing information from existing Roboglasses™ sensors (ultrasonics, accelerometers, gyros, and compass) with GPS navigation and RFID tag detection information. This fused information must provide a state-space description of the user's current location and environment. The Phase I effort will develop, debug and test this fusion software.
• Determining the algorithms required for producing an accurate state-space description of the environment as well as safe travel and navigation instructions. We will code the required algorithms and test them in a rural environment.
• Determining the algorithms needed for calculating a safe route of travel. These route-planning algorithms will utilize GPS mapping information and waypoint information entered by the user. Route planning is made more difficult in rural environments because the traveler often will not follow established roads found on traditional GPS maps; rather, travel will be by the shortest distance across fields, pastures, woodlots, etc.
• Determining how to communicate route information to the user using only audio and haptic feedback, along with notification of departures from the planned route, features of interest identified by associated RFID tags, hazards in the travel path and other information. Our baseline design concept assumes that the user will provide commands to the system using simple capacitive touch buttons and slider input devices located on the Roboglasses™ frame and stems. The Roboglasses™ and RoboFind™ systems will use a bone conduction speaker mounted in the glasses stems to convey audio information about the route, location, hazards, system status, etc. One of the major challenges of this research is developing a simple gesture-based system that is easy to use, effective and not disruptive to the wearer. We will also use the haptic (touch feedback) devices in the Roboglasses™ to provide status and warning cues.
• Determining the user support services that must be provided by the proposed system. Support services include loading GPS data (waypoints) and map data, and RFID tag programming. Tags have a unique ID associated with an identification (e.g., hay barn, stall) and/or a location (northeast door), as well as captured audio information to be played back to the user ("Now at northeast door of dairy barn."). We must determine the exact nature of the services required and how they will be provided. This will require careful design to allow a blind user to utilize these services.
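The route-planning algorithms themselves are not given in this report. As an illustrative sketch (our own function names, not project code), the straight-line distance and compass bearing between two GPS waypoints, the basic quantities needed for cross-field routing, can be computed with the standard haversine and forward-azimuth formulas:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (decimal degrees)."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing (0-360 deg, 0 = north) from the first fix toward the second."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```

At farm scale (hundreds of metres) a flat-earth approximation would also suffice; the spherical form is shown because it works unchanged for longer routes between waypoints.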
Project Methods
This project is being conducted as a product research and development effort, including the development of both hardware and software. The project will be initiated with a requirements definition phase that clearly identifies the functional and performance requirements of the RoboFind™ system. After the requirements are defined, we will continue system development by developing the RFID, Bluetooth and GPS subsystems. The development effort will utilize off-the-shelf processors and I/O devices to minimize the amount of hardware that must be developed in Phase I. Phase I will focus on developing the software required for implementation of the system, using industry-standard rapid software prototyping techniques. At the completion of hardware and software component development, we will evaluate RoboFind™ by using it with our Roboglasses™ product. The planned evaluation demonstrations will occur on a local small farm (navigating across open terrain and encountering fences, outbuildings, livestock, etc.). A second evaluation will be conducted on the campus of Southern Arkansas University, evaluating the devices in both farm and campus environments (fences, milking barns, row crops, multiple buildings, traffic and other urban hazards).

Progress 06/01/15 to 01/31/17

Outputs
Target Audience: Our target audience is those blind or visually impaired (BVI) individuals who live and/or work in rural areas. Our research is focused on providing a product that can be used both by those with total loss of eyesight and by those with some degree of vision. Our product, RoboFind™, can be used in conjunction with our Roboglasses™ product if desired. Our current research is restricted to developing a product for use by BVI adults.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Nothing to report
How have the results been disseminated to communities of interest? Nothing to report
What do you plan to do during the next reporting period to accomplish the goals? Nothing to report

Impacts
What was accomplished under these goals? The RoboFind™ Phase I effort established the feasibility of building a device to help Blind and Vision Impaired (BVI) individuals navigate and traverse both rural and urban terrain in a safe and effective manner. We developed the hardware and software required for implementing such a solution. RoboFind™ functions are described briefly in the following paragraphs.
Processor: A high-performance 32-bit microprocessor is used as the primary control processor for RoboFind™. This processor controls and collects data from the other devices, develops an awareness of the user's current location, and uses this knowledge to develop travel plans and directions for the RoboFind™ user. We used a DK-TM4C123G processor card from Texas Instruments (TI). The card provides a motion sensor with three 3-axis devices: an accelerometer, a gyroscope and a magnetometer. Software was developed that implements a Direction Cosine Matrix (DCM) representation of the movement and motion of RoboFind™ in 3D space.
GPS Receiver: This function receives signals from the GPS satellite system and provides latitude, longitude and time information. Our current implementation uses the FGPMMOPA6 module from GlobalTop Technology, Inc.
RFID Reader/Writer: The test bed uses the M6E Nano RFID reader/writer module from ThingMagic. This single-module solution requires minimal power and provides good RFID tag reading and writing performance.
DARP Circuit Card: In our search for a digital voice-processing module that could record, store and play back voice commands, we were unable to find any commercial off-the-shelf module that met all of our cost, size, weight, memory storage and power requirements. For this reason, we developed a custom circuit card that provides the voice-processing capability we require.
The DARP module uses a Nuvoton ISD3900 integrated circuit that provides for the storage and retrieval of voice data. The ISD3900 is a multi-message record/playback device designed to operate with a serial non-volatile flash memory device. Our implementation uses the Winbond W25Q128 device, which provides up to 64 minutes of voice recording time. This is more than adequate storage for the voice utterances used in the RoboFind™ implementation. RoboFind™ can store a newly recorded voice message, play back a recorded message to an on-board speaker, or play back a recorded message via the Bluetooth transceiver. Incoming voice can also be streamed directly to the Bluetooth transceiver without being stored.
Bluetooth Transceivers: The test bed uses two different Texas Instruments Bluetooth transceivers. The TI BT-MSPAUDSOURCE module is used for implementing a Bluetooth audio source function. We also used the TI SimpleLink™ CC2564 Bluetooth device in our evaluation; this is a dual-mode evaluation module supporting both Bluetooth Classic and Bluetooth Low Energy (LE) modes of operation. The board mounts on the back of the DK-TM4C123G using the wireless interface connectors.
User Support Services: The RoboFind™ test bed must not only provide routing and guidance information to the user; it must also provide facilities for setting up RoboFind™ to operate in a given user's physical environment.
Support services include loading GPS data and map data, identifying likely waypoints on a route of travel, and RFID tag programming (tags have a unique ID associated with an identification (e.g., hay barn, stall) and/or a location (northeast door), as well as captured audio information to be played back to the user ("Now at northeast door of dairy barn.")). One of the major findings of our research is that one of the most effective methods of providing user support is an engineering tool that we designed to monitor and control RoboFind™ operation. This tool is called the RoboFind™ Graphical User Interface, or RoboFind™ GUI. The GUI is divided into approximately four quadrants: the navigation panel in the upper left, the motion-processing panel in the upper right, the utterance panel in the lower left, and the RFID panel in the lower right. These are described below.
Navigation panel. The navigation panel consists of three major groups of components. The first is a Google Maps display. The user is provided with all of the functionality available in Google Maps, including zooming in and out, panning around the map field and selecting either a map or satellite view of the terrain. The default view in the map panel is the location provided by the GPS receiver; the system automatically centers the view on the GPS latitude and longitude. A number of text boxes display GPS data, including a digital timestamp, the latitude, latitude direction (north or south), longitude, and longitude direction (east or west). This panel also includes a compass rose that indicates the current heading of the RoboFind™ user. Heading is determined from the magnetometer by solving for the Euler angles using the data from the three 3-axis sensors.
Motion-processing panel. The motion-processing panel provides three strip-chart recorders that give a real-time display of the raw data from the three 3-axis sensors: one configured as an accelerometer, the second as a gyroscope and the third as a magnetometer. The motion processor is an MPU-9150 provided by InvenSense, Inc.
Utterance panel. The utterance panel provides controls and displays for managing audio speech recording and playback. A dial selector knob is used to select a particular speech utterance for either recording or playback. Text boxes describe the contents of the utterance as well as its context. Controls are provided to play the utterance back through either local speakers or via Bluetooth to the Roboglasses™. Provision is made to monitor the length of a given utterance and the total amount of voice storage used.
RFID panel. This is a tabbed panel that provides controls and displays for reading and writing RFID tags. Selecting "Tag Reading" displays the contents of an RFID tag as read by the M6E Nano RFID device. Selecting "Tag Writing" allows the user to enter RFID tag information as well as contextual information connected with that tag. A third tab, "Tag Contents", displays the memory contents of the RFID tag being read.
Software Development. As part of the Phase I effort, it was necessary to develop a significant amount of software. The focus of the software development was on the low-level drivers required to exercise and test each of the hardware modules, and we successfully completed the necessary modules. In many cases TI, the manufacturer of our control processor, provided the drivers; in other cases it was necessary to develop our own driver software.
In particular, it was necessary to develop rather sophisticated software to interface with the GPS receiver. The device produces full NMEA sentences that describe the current location, and it was necessary to build a parser to extract the data (latitude, longitude, timestamp, etc.) from each sentence. We also implemented the Direction Cosine Matrix software and the software to extract Euler angles using data provided by the MPU-9150 motion processor. We have demonstrated that the hardware and software in RoboFind™ can provide a solution for navigation and safe travel for BVI persons moving about in a rural environment.
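The parser itself is not reproduced in the report. As a minimal sketch of the extraction involved (a hypothetical function of our own; checksum and validity checking omitted for brevity), a $GPGGA sentence carries time and ddmm.mmmm/dddmm.mmmm coordinates that must be converted to signed decimal degrees:

```python
def parse_gpgga(sentence):
    """Parse a NMEA $GPGGA sentence into (utc, lat, lon) in signed decimal degrees.

    Illustrative only: real firmware must also verify the *XX checksum and
    handle empty fields from a receiver without a fix.
    """
    fields = sentence.strip().split('*')[0].split(',')
    if fields[0] != '$GPGGA':
        raise ValueError('not a GGA sentence')
    utc = fields[1]
    # latitude arrives as ddmm.mmmm, longitude as dddmm.mmmm
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == 'S':
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == 'W':
        lon = -lon
    return utc, lat, lon
```

The same splitting and minutes-to-degrees conversion applies to the other position sentences (e.g., $GPRMC), which differ only in field order.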

Publications


    Progress 06/01/15 to 05/31/16

    Outputs
    Target Audience: Our target audience remains those blind individuals who live and/or work in rural areas. We are restricting our current research to adults.
    Changes/Problems: Nothing Reported
    What opportunities for training and professional development has the project provided? Nothing Reported
    How have the results been disseminated to communities of interest? Nothing Reported
    What do you plan to do during the next reporting period to accomplish the goals? We will be completing the integration of the test bed and software with all sensors. We will complete the development of the route planning and guidance software. The project will culminate with an evaluation of the system on the campus of Southern Arkansas University and on a local farm.

    Impacts
    What was accomplished under these goals? The primary goal of this research is to determine the feasibility of using new sensor and geo-location technologies to enable blind users to navigate safely, avoid hazards and travel safely in a rural environment. We are developing technology and solutions that will enable safe travel outdoors and within buildings. The primary focus of the research is on identifying and using new sensor technologies to provide an enhanced environmental state description that can be communicated to the blind user. In the early phases of our investigation we have focused on identifying and then integrating the new sensor technology required for creating a state-space description. We have focused on the following sensors and sensor technologies.
a. Ultrasonic sensors. We are currently using ultrasonic sensors in our Roboglasses™ product for object detection and determination of the distance to the detected object. Object location and distance are communicated using haptic feedback devices located in the glasses stems. Because of limitations in haptic device size (8 mm) and human physiology (the ability to determine the relative location of a haptic actuation), we are limited to 5 devices in the glasses stem, which locate an object to within ±300 mm. This works well for determining the relative location of and distance to an object. However, the sensors themselves can provide distance information with an accuracy of ±1 mm. This raw sensor data will be used in RoboFind™ to aid in object identification as well as location. We are currently modifying the Roboglasses™ software to make this highly accurate measurement data available to RoboFind™. We are currently using sensors manufactured by MaxBotix Inc.
b. Global Positioning System (GPS). We are using the CC4000 GPS module manufactured by Texas Instruments. This device provides time, latitude, longitude, satellite status, and speed. Note, however, that GPS information is only available when the user is outdoors.
c. Radio Frequency Identification (RFID). RFID technology allows the use of small, inexpensive RFID tags to identify objects and locations. We have selected the Nano, a device from ThingMagic, a Trimble Navigation company. This small device is used to both read and write RFID tags.
d. Accelerometer. We have adopted the MPU-9150 device to provide acceleration information, which is necessary for calculating the distance traveled by the RoboFind™ user. This device provides acceleration information in 3 axes. We have completed integration of the MPU-9150 into our test bed and are able to access the required data.
e. 3-Axis Gyro. The 3-axis gyro is also part of the MPU-9150 device and provides data indicating the orientation of the user. This device will also be integrated into the Roboglasses™ to indicate the orientation of the user's head. The device has been integrated into the RoboFind™ test bed.
f. Magnetometer (compass). The MPU-9150 also provides a 3-axis magnetometer. This device will be used to implement the magnetic compass function that is necessary for dead reckoning and indoor navigation. The device has been integrated into the test bed and we have begun development of algorithms.
g. Sensor tags. We are investigating the use of sensor tags with RoboFind™. These are small, intelligent tags developed as part of the Internet of Things (IoT) research at Texas Instruments. These devices have an embedded Bluetooth radio and transmit information to a Bluetooth receiver. Since RoboFind™ includes a Bluetooth device for communicating with the Roboglasses™, it can easily be adapted to utilize the sensor tag technology. Sensor tags will be used along with RFID tags to indicate the location of objects and areas of interest.
We have completed the design of the test bed and are currently integrating all components into it.
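The ±1 mm figure quoted for the ultrasonic sensors in item (a) comes from the underlying time-of-flight measurement; the report does not show the conversion itself. The basic arithmetic can be sketched as follows (an illustrative function of our own, using a common linear temperature correction for the speed of sound):

```python
def echo_to_mm(echo_us, temp_c=20.0):
    """Convert an ultrasonic round-trip echo time (microseconds) to distance (mm).

    The speed of sound in air rises with temperature; 331.3 m/s at 0 degC
    plus ~0.606 m/s per degC is a widely used approximation.
    """
    c = 331.3 + 0.606 * temp_c          # speed of sound, m/s
    one_way_s = echo_us * 1e-6 / 2.0    # the echo traverses the path twice
    return one_way_s * c * 1000.0       # metres -> millimetres
```

Since 1 mm of range corresponds to roughly 6 microseconds of echo time at room temperature, millimetre-level accuracy requires microsecond-level timing, which a hardware timer on the control processor can provide.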
We are also currently integrating the first version of the RoboFind™ software into the test bed. This includes drivers for all of the devices mentioned above as well as a graphical user interface (GUI) that runs on a laptop and is used to show the data and state of the RoboFind™ sensors and the direction instructions provided to the user. We are continuing our research into the most appropriate environmental state description for use in RoboFind™ computation. This state description uses real-time information from the sensors and is the primary data structure used in computing routes of travel, locating hazards to travel, and providing hazard-avoidance instructions to the user. Our work has been informed significantly by the previous work of Dr. Nicholas Giudice at the University of Maine. Dr. Giudice will become more involved as the test bed is finalized and new route-finding algorithms are implemented. In addition, Dr. Mahbub Ahmed from Southern Arkansas University (SAU) is assisting with the design and fabrication of 3D-printed plastic parts used by the project. We have also contracted with 3-D Frame Solutions of Texarkana, TX, which is assisting us in the design and fabrication of the modified Roboglasses™. We are making a number of modifications to the Roboglasses™ to provide the functionality necessary to utilize the RoboFind™ device. We completed an initial survey of the campus of SAU, which will be used as one of the locations for evaluation testing of the completed RoboFind™ test bed. We identified likely locations for the outdoor testing, which will require navigation both on sidewalks and across open areas. We also identified hazard areas and will utilize them in our test scenarios to ensure that RoboFind™ avoids these hazardous situations. We are currently surveying the Foshee farm as another location for test bed evaluation. We are particularly interested in demonstrating navigation and wayfinding in indoor locations where GPS is not accessible.
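The compass function in item (f) cannot simply read the magnetometer's horizontal components, because a head-mounted sensor tilts with the wearer; the gravity vector from the accelerometer is needed to rotate the magnetic field back into the horizontal plane. A simplified sketch of such tilt compensation (our own illustration, not the project firmware; axis sign conventions vary between sensor breakouts):

```python
import math

def heading_deg(ax, ay, az, mx, my, mz):
    """Tilt-compensated compass heading in degrees (0-360, 0 = magnetic north).

    ax..az : accelerometer reading, gravity-dominant (x forward, y right, z up)
    mx..mz : magnetometer reading in the same body axes
    Roll and pitch are recovered from the gravity vector, then used to
    project the magnetic field vector onto the horizontal plane.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # rotate the field vector into the horizontal plane
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    return (math.degrees(math.atan2(-myh, mxh)) + 360.0) % 360.0
```

A hard-iron calibration (subtracting per-axis magnetometer offsets) would normally precede this step; it is omitted here for clarity.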

    Publications


      Progress 06/01/15 to 01/31/16

      Outputs
      Target Audience: Our target audience remains those blind individuals who live and/or work in rural areas. We are restricting our current research to adults.
      Changes/Problems: Nothing Reported
      What opportunities for training and professional development has the project provided? Nothing Reported
      How have the results been disseminated to communities of interest? Nothing Reported
      What do you plan to do during the next reporting period to accomplish the goals? We will be completing the integration of the test bed and software with all sensors. We will complete the development of the route planning and guidance software. The project will culminate with an evaluation of the system on the campus of Southern Arkansas University and on a local farm.

      Impacts
      What was accomplished under these goals? The primary goal of this research is to determine the feasibility of using new sensor and geo-location technologies to enable blind users to navigate safely, avoid hazards and travel safely in a rural environment. We are developing technology and solutions that will enable safe travel outdoors and within buildings. The primary focus of the research is on identifying and using new sensor technologies to provide an enhanced environmental state description that can be communicated to the blind user. In the early phases of our investigation we have focused on identifying and then integrating the new sensor technology required for creating a state-space description. We have focused on the following sensors and sensor technologies.
a. Ultrasonic sensors. We are currently using ultrasonic sensors in our Roboglasses™ product for object detection and determination of the distance to the detected object. Object location and distance are communicated using haptic feedback devices located in the glasses stems. Because of limitations in haptic device size (8 mm) and human physiology (the ability to determine the relative location of a haptic actuation), we are limited to 5 devices in the glasses stem, which locate an object to within ±300 mm. This works well for determining the relative location of and distance to an object. However, the sensors themselves can provide distance information with an accuracy of ±1 mm. This raw sensor data will be used in RoboFind™ to aid in object identification as well as location. We are currently modifying the Roboglasses™ software to make this highly accurate measurement data available to RoboFind™. We are currently using sensors manufactured by MaxBotix Inc.
b. Global Positioning System (GPS). We are using the CC4000 GPS module manufactured by Texas Instruments. This device provides time, latitude, longitude, satellite status, and speed. Note, however, that GPS information is only available when the user is outdoors.
c. Radio Frequency Identification (RFID). RFID technology allows the use of small, inexpensive RFID tags to identify objects and locations. We have selected the Nano, a device from ThingMagic, a Trimble Navigation company. This small device is used to both read and write RFID tags.
d. Accelerometer. We have adopted the MPU-9150 device to provide acceleration information, which is necessary for calculating the distance traveled by the RoboFind™ user. This device provides acceleration information in 3 axes. We have completed integration of the MPU-9150 into our test bed and are able to access the required data.
e. 3-Axis Gyro. The 3-axis gyro is also part of the MPU-9150 device and provides data indicating the orientation of the user. This device will also be integrated into the Roboglasses™ to indicate the orientation of the user's head. The device has been integrated into the RoboFind™ test bed.
f. Magnetometer (compass). The MPU-9150 also provides a 3-axis magnetometer. This device will be used to implement the magnetic compass function that is necessary for dead reckoning and indoor navigation. The device has been integrated into the test bed and we have begun development of algorithms.
g. Sensor tags. We are investigating the use of sensor tags with RoboFind™. These are small, intelligent tags developed as part of the Internet of Things (IoT) research at Texas Instruments. These devices have an embedded Bluetooth radio and transmit information to a Bluetooth receiver. Since RoboFind™ includes a Bluetooth device for communicating with the Roboglasses™, it can easily be adapted to utilize the sensor tag technology. Sensor tags will be used along with RFID tags to indicate the location of objects and areas of interest.
We have completed the design of the test bed and are currently integrating all components into it.
We are also currently integrating the first version of the RoboFind™ software into the test bed. This includes drivers for all of the devices mentioned above as well as a graphical user interface (GUI) that runs on a laptop and is used to show the data and state of the RoboFind™ sensors and the direction instructions provided to the user. We are continuing our research into the most appropriate environmental state description for use in RoboFind™ computation. This state description uses real-time information from the sensors and is the primary data structure used in computing routes of travel, locating hazards to travel, and providing hazard-avoidance instructions to the user. Our work has been informed significantly by the previous work of Dr. Nicholas Giudice at the University of Maine. Dr. Giudice will become more involved as the test bed is finalized and new route-finding algorithms are implemented. In addition, Dr. Mahbub Ahmed from Southern Arkansas University (SAU) is assisting with the design and fabrication of 3D-printed plastic parts used by the project. We have also contracted with 3-D Frame Solutions of Texarkana, TX, which is assisting us in the design and fabrication of the modified Roboglasses™. We are making a number of modifications to the Roboglasses™ to provide the functionality necessary to utilize the RoboFind™ device. We completed an initial survey of the campus of SAU, which will be used as one of the locations for evaluation testing of the completed RoboFind™ test bed. We identified likely locations for the outdoor testing, which will require navigation both on sidewalks and across open areas. We also identified hazard areas and will utilize them in our test scenarios to ensure that RoboFind™ avoids these hazardous situations. We are currently surveying the Foshee farm as another location for test bed evaluation. We are particularly interested in demonstrating navigation and wayfinding in indoor locations where GPS is not accessible.
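Indoor wayfinding without GPS, as described above, relies on dead reckoning: a compass heading combined with an estimate of distance moved. A toy sketch of one dead-reckoning update (illustrative names only; step detection from the accelerometer is assumed to happen elsewhere):

```python
import math

def dr_step(x, y, step_m, heading_deg):
    """Advance a dead-reckoned position by one detected step.

    x, y        : current position in metres (x east, y north)
    step_m      : stride length, e.g. estimated from accelerometer data
    heading_deg : compass heading from the magnetometer (0 = north, 90 = east)
    """
    h = math.radians(heading_deg)
    return x + step_m * math.sin(h), y + step_m * math.cos(h)
```

Because heading and stride errors accumulate with every step, practical systems periodically re-anchor the position, which is one role the RFID location tags described above could play indoors.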

      Publications