Over the last 20 years, automation in mobile machines has steadily increased, bringing control and monitoring capabilities that offer real value to the operator. This has been fueled by the growing capability (and falling cost) of computing systems that interface with sensors for monitoring and actuators for control, with software tying the monitoring and control relationships together. This automation has brought efficiencies to off-highway applications, as seen in precision agriculture and automated construction equipment.
While this evolution of computing systems continues, a complementary path is now emerging in perception systems, which bring new capabilities for sensing the physical world and offer the potential for order-of-magnitude efficiency improvements across a wide range of off-highway applications.
Perception technologies allow automated machines to sense the physical world. They include a wide range of rapidly evolving technologies that are becoming ever more powerful, reliable, and capable of solving complex automation problems that have traditionally been well-suited only to human control. Major perception sensing technologies include LiDAR, camera vision systems, RADAR, and ultrasonic sensing. LiDAR and vision systems in particular are improving at a very fast pace, driven primarily by the automotive industry. Driver assistance and fully autonomous functionality have been major focus areas in the automotive industry over the last decade, and a key problem in both is developing a high-fidelity model of the physical world surrounding the vehicle. Billions of research dollars have poured into solving these problems, resulting in advanced vision systems that are available today at a fraction of their cost just a few years ago.
This presents an enormous opportunity to bring increased value to off-highway mobile machine applications by applying this new technology to complex problems that previously could not be solved by automated systems.
Opportunity for Application in Off-Highway Systems
While perception technologies evolving in the automotive market can be very useful in the off-highway market, the problems they must solve there are very different. Automotive perception problems involve tasks like road sign identification, vehicle and pedestrian recognition, and lane detection. These are not the problems that matter most for agriculture and construction machines. While on-highway vehicles have a primary function of transportation, off-highway vehicles typically have a primary function related to the work they perform. For example, agricultural equipment has a primary function of seeding, planting, spraying, harvesting, tilling, or baling, and construction equipment has a primary function of digging, grading, lifting, or moving earth. Most of these applications depend heavily on human operators to see the world around them and make decisions on how to properly operate the equipment in complex environments.
A tremendous amount of efficiency has been gained over the years by applying automation with GPS technologies to improve the precision of many of these applications, but the work performed often depends not only on the specific location in the world but also on the dynamic environment surrounding the machine. This is where advanced perception systems can bring new value.
These technologies can be applied to a wide range of object detection and recognition problems that have been difficult for automated systems in the past. Perception systems gather high-resolution data about the surroundings, and algorithms draw out the key features that matter to the application. Together, these sensors and advanced algorithms provide the opportunity to close the efficiency gap between novice and experienced operators by further automating machines to find the most efficient ways to accomplish the work to be performed.
Some examples of applications where perception systems can be used for improved machine efficiency are:
- Machine guidance and navigation in GPS-denied environments
- Detection of people or obstructions in work areas
- Evaluation of ground conditions (too wet, too dry), or weather conditions
- Complex alignment of independently moving machines
- Plant/crop evaluation – detection of weeds (separate from crops)
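To make the second item above concrete, a basic obstruction check can be as simple as filtering a LiDAR point cloud for returns that rise above the ground plane inside the machine's work zone. The sketch below is a minimal, hypothetical example; the function name, zone bounds, and ground-height threshold are illustrative assumptions, not taken from any specific JCA application:

```python
import numpy as np

def detect_obstruction(points, zone_min, zone_max, ground_height=0.2):
    """Flag any LiDAR returns above the ground plane inside a
    rectangular work zone.

    points: (N, 3) array of x, y, z returns in the machine frame (meters)
    zone_min/zone_max: (x, y) corners of the work zone
    ground_height: returns at or below this height are treated as ground
    """
    pts = np.asarray(points, dtype=float)
    in_zone = (
        (pts[:, 0] >= zone_min[0]) & (pts[:, 0] <= zone_max[0]) &
        (pts[:, 1] >= zone_min[1]) & (pts[:, 1] <= zone_max[1])
    )
    above_ground = pts[:, 2] > ground_height
    hits = pts[in_zone & above_ground]
    return len(hits) > 0, hits

# Synthetic frame: two ground returns plus one object 5 m ahead
cloud = [[5.0, 0.0, 0.05], [6.0, 1.0, 0.02], [5.0, 0.5, 1.1]]
blocked, hits = detect_obstruction(cloud, (0, -2), (10, 2))
```

A production system would add clustering, tracking over time, and classification (is the object a person, a bale, a swath?), but the core idea of reducing raw sensor data to an application-relevant decision is the same.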
These are just a handful of the potential applications that can be improved significantly by the use of perception systems in off-highway machines. Indeed, any application where a human detects the surroundings and makes decisions to better operate equipment is an opportunity to apply perception systems that consistently emulate the most efficient operators. So, now the question is: how?
Implementing Perception Systems
While perception technologies have now reached a point where their functionality and price open up a huge number of applications that can offer real value to the end user, the complexity of these systems can be a barrier to implementation. There is no doubt that a close understanding of both the technologies and the applications is important for applying them effectively. The figure below shows a block diagram of the major considerations in the design of a perception system.
The goal of a perception system is to create a model of the real world covering the parameters that matter for automation. The sensors (LiDAR, RADAR, cameras) provide data about the sensed world, and the processing system interprets this data to build a model that can be used to make decisions about machine operation. Choosing the right sensors and the right processing system (including algorithms) is critical to building an effective perception system.
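That sense → model → decide flow can be sketched as a simple processing loop. The structure below is purely illustrative; the class and function names are assumptions for the sketch, not a JCA API:

```python
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    """Minimal model of the surroundings built from sensor data."""
    obstacles: list = field(default_factory=list)  # (x, y) positions in meters

def process(raw_detections, world):
    """Fuse raw sensor detections into the world model,
    dropping empty (None) returns."""
    world.obstacles = [d for d in raw_detections if d is not None]
    return world

def decide(world, stop_radius=3.0):
    """Machine-level decision: stop if any modeled obstacle
    is within stop_radius of the machine origin."""
    return any(x * x + y * y < stop_radius ** 2
               for x, y in world.obstacles)

world = process([(2.0, 1.0), None, (8.0, 0.0)], WorldModel())
stop = decide(world)  # the obstacle at (2, 1) is within 3 m, so stop
```

In a real machine the "decide" step would feed the machine controls (slow, stop, steer around), but the separation of sensing, modeling, and decision-making mirrors the block diagram above.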
JCA’s Engineering team has experience applying various perception technologies across a wide range of mobile machine applications. In developing these systems, we have built an understanding of the strengths and weaknesses of each perception sensor type for different problem types and environments. We have also found open-source technologies to be very valuable for building perception processing systems quickly and reliably. One example is the use of ROS (Robot Operating System) running on the JCA Hummingbird module to interface with advanced perception sensors.
ROS (www.ros.org) is a set of open-source software libraries that allows for rapid development and deployment of complex perception systems. ROS has gained momentum over the last few years, and many manufacturers of advanced sensors now provide ROS drivers for their products to allow easy integration. JCA’s Hummingbird module is a ruggedized Linux-based computing system that can run ROS and connect several ROS nodes over Ethernet or CAN interfaces. ROS allows the inputs of several advanced sensors to be combined in a common frame of reference, so a model of the world can be created from multiple sensor inputs. Using these libraries, complex functions specific to unique applications can be implemented more simply by building on the expertise of the open-source community. In many applications, the JCA Hummingbird with ROS can serve as the processing block in the system diagram above, simplifying the implementation of perception systems and providing a clean interface to the various sensors. This allows rapid deployment of machine applications that use perception systems to sense complex environments, building the intelligence of experienced operators into the automation provided by machine controls.
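The "common frame of reference" step that ROS handles (via its transform library) comes down to a rigid-body transform: each sensor's points are rotated and translated by that sensor's mounting pose into the machine frame, after which all sensor data can be fused in one coordinate system. A minimal numpy sketch of that idea, with made-up mounting values for illustration:

```python
import numpy as np

def sensor_to_machine(points, yaw, mount_xyz):
    """Transform sensor-frame points into the machine frame using the
    sensor's mounting yaw (radians, about z) and mounting position."""
    c, s = np.cos(yaw), np.sin(yaw)
    # Rotation about the z axis, followed by translation to the mount point
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return np.asarray(points, dtype=float) @ R.T + np.asarray(mount_xyz)

# A LiDAR mounted 1.5 m forward of the machine origin, rotated 90
# degrees to the left: a point 2 m straight ahead of the sensor lands
# 2 m to the machine's left at the mount's x position.
machine_pts = sensor_to_machine([[2.0, 0.0, 0.0]],
                                yaw=np.pi / 2,
                                mount_xyz=[1.5, 0.0, 0.0])
```

In ROS this bookkeeping is done automatically once each sensor's mounting pose is published, which is a large part of why multi-sensor perception systems can be assembled so quickly on a platform like the Hummingbird.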