Tradeshow Demo Built On JCA’s Autonomous Framework


Humble beginnings to…
…fully functioning demo.

Eric Kapilik is an engineering student at the University of Manitoba who recently completed a 4-month summer term with JCA Electronics in the JCA Engineering group, where he had the opportunity to gain experience in autonomous machine technology by building a trade show demo that integrates several pieces of JCA’s autonomous framework. This article gives an overview of his experience.

As an engineering student, I look to supplement my education with relevant work experience during the summer months. This summer I took a co-op placement at JCA Electronics, and I was happy to learn that the summer project involved autonomous robotics, a field that I find extremely interesting. I was tasked with single-handedly building a transportable system to demonstrate JCA’s Autonomous Framework (AFW) in a 5 m x 5 m area at agricultural trade shows. From May to June, the goal was to have a manually controlled system that also demonstrated FlightPath for a progress show. From July to August, the goal was to demonstrate autonomous operation of performing a job.

The JCA Autonomous Framework provides the building blocks necessary to build multi-purpose autonomous off-road machines. The trade show demo design needed to preserve the generality of applications that the JCA AFW is capable of, while still making the abstract concepts concrete and the independent subsystems tangible and connected.

Since developing a system architecture for an autonomous machine from the ground up is a daunting task that requires experience with autonomous machines and agricultural machine applications, leveraging the JCA AFW is what made it possible to complete this project in the short four-month timeline. I was also able to get support from JCA’s developers who had worked directly on the subsystems of the AFW.

As a Computer Engineering student, I have a breadth of education and experience with electronics, wire harnessing, embedded development, Linux operating systems, scientific languages (such as Python), and application development. Excitingly, this project required all of these skills.

Stage 1 – Mechanical Platform for the Autonomous Machine

The first stage of this project was to build a mechanical robotic platform. The platform had to be lightweight for easy transportation, scalable so that additional robots could be added, able to operate for long periods, mechanically similar in appearance to a real agricultural machine, and able to drive like one. A robotics kit that fit these constraints was used for the body, and motors were used to drive it. Since the robotics kit did not have a very large area to mount a large JCA controller like a Falcon or Hummingbird, I used a JCA Oriole. The Oriole comes in a small enclosure and is lightweight. It has one CAN bus, Bluetooth or Wi-Fi capability, up to 15 inputs (analog, digital, and frequency), and six high-side outputs with PWM capability. I wrote a driver in C to interface with the motor driver, as well as an algorithm to convert a target speed and steering angle into individual motor speeds so that the differential-drive robot turns the way an actual vehicle would. I was able to install this new embedded code on the Oriole over Wi-Fi. Below is the first test of the Frankensteinesque robot, with exposed wiring and motors, driving in a full circle.
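The actual driver was written in C and installed on the Oriole, but the speed-mixing idea is simple enough to sketch. Below is a minimal Python illustration of converting a target speed and steering angle into left/right wheel speeds; the bicycle-model approximation, the wheelbase and track-width values, and the function name are my own assumptions for illustration, not part of the AFW.

```python
import math

TRACK_WIDTH_M = 0.30   # assumed distance between the left and right wheels
WHEELBASE_M = 0.25     # assumed distance between the front and rear axles

def mix_speeds(target_speed, steering_angle_deg):
    """Convert a target forward speed (m/s) and a steering angle (degrees,
    positive = left turn) into individual left/right wheel speeds, so the
    differential-drive robot turns the way a steered vehicle would."""
    delta = math.radians(steering_angle_deg)
    # Bicycle-model yaw rate for a steered vehicle: omega = v * tan(delta) / wheelbase
    omega = target_speed * math.tan(delta) / WHEELBASE_M
    # Differential drive: offset each wheel by half the track width times the yaw rate
    left = target_speed - omega * TRACK_WIDTH_M / 2.0
    right = target_speed + omega * TRACK_WIDTH_M / 2.0
    return left, right

# Example: 0.5 m/s forward with a 20 degree left turn
print(mix_speeds(0.5, 20.0))
```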

 

Since driving in circles and straight lines isn’t very versatile, I developed a rudimentary Android app that used two sliders for manual control. The tablet was able to connect directly via Wi-Fi to the Oriole’s configurable Wi-Fi access point. I used the JCA Parameter Protocol for communication between the two devices. Using the JCA Parameter Protocol is straightforward: parameter set and get requests are handled in callback functions that I write. I can then save the parameter update and use it in the embedded library I wrote to convert steering angles and propulsion levels into individual motor speeds.

Stage 2 – Localization and Mapping

Since the goal was to demonstrate FlightPath, a JCA library that ties together tablet and RTK GPS technology with implement control and path planning, I needed GPS data. Using an RTK GPS would be impractical indoors, so instead I investigated vision systems capable of taking a monocular camera image, detecting the robot’s position and orientation in 3-dimensional space, and translating this into an artificial GPS coordinate. Thankfully, there were plenty of libraries available as ROS packages for this purpose.

ROS is a robotics middleware that provides libraries and tools to help software developers create robot applications. It provides hardware abstraction, device drivers, libraries, visualizers, message passing, package management, and more. ROS is very easily installed and used on JCA Linux controllers such as the Hummingbird. But because the perception system processes very high-resolution, uncompressed raw images at 35 Hz, which is a very demanding task, an NVIDIA Jetson TX2 was used for the perception system.

The NVIDIA Jetson TX2 will be replaced in future revisions with JCA’s much-anticipated high-performance Eagle controller, which is capable of soaring through these types of demanding workloads and has an integrated RTK GPS. It is also able to operate in rugged agricultural conditions, surviving wide temperature swings, high vibration, vehicle electrical transients, and more.

After evaluating several options for perception systems, I settled on a fiducial-tag-detection system called ArUco. After some camera-specific configuration and calibration, it was able to give me the tag’s position in Cartesian coordinates relative to the camera lens and its rotation as a quaternion. After some matrix conversions and linear scaling, I used an algorithm to convert local coordinates into latitude and longitude. These simulated GPS coordinates were communicated to FlightPath, which ran on the Hummingbird, via a multi-machine ROS network.
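The article does not spell out the conversion itself, but for a 5 m x 5 m demo area a flat-earth approximation around a chosen reference coordinate is sufficient. The sketch below is a minimal Python illustration of that idea; the reference coordinate, scale factor, and function name are assumptions for illustration only, not the demo’s actual code.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def local_to_latlon(x_east_m, y_north_m, ref_lat_deg, ref_lon_deg):
    """Convert a local east/north offset (metres) from a chosen reference
    point into latitude/longitude using a small-area flat-earth
    approximation. Good enough for a 5 m x 5 m demo area."""
    d_lat = (y_north_m / EARTH_RADIUS_M) * (180.0 / math.pi)
    d_lon = (x_east_m / (EARTH_RADIUS_M * math.cos(math.radians(ref_lat_deg)))) * (180.0 / math.pi)
    return ref_lat_deg + d_lat, ref_lon_deg + d_lon

# Example: scale a 5 m indoor demo area up to a virtual field,
# then convert the scaled position into artificial GPS coordinates.
SCALE = 100.0  # assumed linear scaling factor
lat, lon = local_to_latlon(2.5 * SCALE, 1.0 * SCALE, 49.8951, -97.1384)
print(lat, lon)
```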

The high-resolution raw image was provided over Gigabit Ethernet from a FLIR Blackfly camera on a tripod positioned above the test area. This camera uses the latest sensors and other advanced features specifically designed for machine learning purposes. JCA Engineering already owned one from previous R&D projects – who doesn’t just have top-of-the-line machine learning cameras lying around in storage?

After the manual-controls application and the robot were refined, the first goal of the system was achieved on time: a manually controlled system that also demonstrated FlightPath for a progress show.

Side note: it was difficult to find a field that was small enough that the linear scaling didn’t look unrealistic (this tractor is travelling at 15 km/h in FlightPath) and that had a height-to-width ratio equivalent to the set-up field.

Stage 3 – Section Control Physical Representation

As can be seen in FlightPath, there are green sections turning on and off. These could represent the state of any implement (seeder, sprayer, harvester, etc.), and the green path left behind shows what area has been covered by an active implement section. The interesting feature here is that if a section passes over an area that has already been covered, it deactivates itself. This means less waste, since the sections are only active over areas not yet covered.
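FlightPath’s overlap control is part of the framework, so its internals aren’t shown here, but the behaviour can be sketched with a simple coverage grid: remember which cells have already been worked and switch a section off whenever its cell has been covered before. The class, cell size, and example values below are purely illustrative.

```python
class OverlapController:
    """Simplified illustration of overlap-based section control: keep a set
    of covered grid cells and switch a section off when the cell under it
    has already been covered. This only sketches the idea; it is not
    FlightPath's implementation."""

    def __init__(self, cell_size_m=0.5):
        self.cell_size = cell_size_m
        self.covered = set()

    def _cell(self, x_m, y_m):
        return (int(x_m // self.cell_size), int(y_m // self.cell_size))

    def update(self, section_positions):
        """section_positions: list of (x, y) points, one per section.
        Returns a list of booleans: True = section should be active."""
        states = []
        for x, y in section_positions:
            cell = self._cell(x, y)
            already_covered = cell in self.covered
            states.append(not already_covered)
            if not already_covered:
                self.covered.add(cell)
        return states

# Example: four sections; the second pass overlaps the first two cells
ctrl = OverlapController()
print(ctrl.update([(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.5, 0.0)]))  # all True
print(ctrl.update([(0.0, 0.1), (0.5, 0.1), (2.0, 0.1), (2.5, 0.1)]))  # first two False
```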

As an additional feature for the trade show system robot, I wanted to visually communicate the section-control activation in real time. I chose some green LED strips that looked very similar to the sections on the back of the tractor in FlightPath.

These were quite bright and only required a high-side output of the Oriole to turn them on.

They were mounted to the back of the robot using a custom-designed 3D-printed piece.

I again used the JCA Parameter Protocol to control the state of the LEDs remotely. This just required some setup in the embedded C code on the Oriole and a parameter definition in WindTools. You can see all the parameters I defined and used in the image of the WindTools web client below.

The parameters could be updated by hand from this WindTools web client, but I needed to control the rover from a ROS node that would send control messages. So I wrote a Python library capable of sending HTTP requests to the WindTools server to update parameters, which could be used from a ROS node written in Python.
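The WindTools HTTP API itself isn’t documented in this article, so the sketch below only illustrates the general shape of such a client; the endpoint path, JSON payload, host address, and parameter name are all assumptions made for illustration.

```python
import requests

class WindToolsClient:
    """Minimal sketch of a parameter-update client. The actual WindTools
    HTTP API is not documented here, so the endpoint path, payload format,
    and parameter names below are assumptions for illustration only."""

    def __init__(self, host="192.168.1.10", port=8080):
        self.base_url = "http://{}:{}".format(host, port)

    def set_parameter(self, name, value):
        # Assumed endpoint: POST /parameters with a JSON body.
        resp = requests.post(
            self.base_url + "/parameters",
            json={"name": name, "value": value},
            timeout=1.0,
        )
        resp.raise_for_status()
        return resp.json()

# Example: switch a section's LED bar on (parameter name is hypothetical).
client = WindToolsClient()
client.set_parameter("SECTION_1_LED_STATE", 1)
```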

In figure 5 (above), you can see that all of the sections are currently active. See video 5 (below) for a demonstration of overlap control being reflected by these LEDs.

The wiring to control the LEDs was straightforward and easy to add to my wire harnesses. The Oriole readily had many open high-side outputs, which were controlled in an embedded software module I wrote. Thankfully, JCA has a hardware control library that makes adjusting a high-side output as easy as uint8_t rslt = set_output(PIN_NUMBER, OUTPUT_STATE); so you only need to be concerned with the logic of your code.

In the above diagram, DTM06-12SB and DTM06-12SA are the two 12-pin connector ports on the Oriole, which accept industry-standard wire harness connectors.

Using ROS to control the Oriole required some networking changes from the previous version, where the Oriole was controlled directly from the manual-controls tablet. Now controls were published as a ROS topic from the manual-controls tablet to a ROS Bridge running on the NVIDIA Jetson TX2. The controls topic was subscribed to by a ROS node running on the Hummingbird, which used the Python WindTools client library I wrote to make parameter updates.
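A minimal sketch of that subscriber node (ROS 1 / rospy) is shown below. The topic name, message type, parameter names, and the windtools_client module (the illustrative HTTP client sketched earlier) are assumptions; the real demo’s interfaces may differ.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

# Hypothetical module name for the illustrative WindTools HTTP client above.
from windtools_client import WindToolsClient

class ControlsBridge(object):
    def __init__(self):
        self.client = WindToolsClient()
        # Assumed topic carrying manual (or autonomous) controls:
        # linear.x = target speed, angular.z = steering angle.
        rospy.Subscriber("/rover/controls", Twist, self.on_controls)

    def on_controls(self, msg):
        # Forward the latest controls to the Oriole by updating parameters
        # over HTTP (parameter names are assumed for illustration).
        self.client.set_parameter("TARGET_SPEED", msg.linear.x)
        self.client.set_parameter("STEERING_ANGLE", msg.angular.z)

if __name__ == "__main__":
    rospy.init_node("controls_bridge")
    ControlsBridge()
    rospy.spin()
```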

Stage 4 – Rudimentary Mission Management

This ROS-focused setup easily interfaces with JCA’s Guidance and Mission Control ROS nodes. The Mission Control node takes missions set up in the FlightPath client app and gives the Guidance node information about the mission. The Guidance node takes the mission information, as well as the current location of the robot given by the localization system explained in Stage 2, and pilots the robot autonomously.

FlightPath was able to show the rover in the chosen field and track the progress of the implement. The implement could be doing anything in reality: seeding, harvesting, spraying, etc. The only important part to us is that it is being switched on and off and has overlap control.

To create a mission in FlightPath, several components of the mission must first be defined. These components are left to the user to define in order to make FlightPath as versatile as possible.

First, a field must be defined. This is done simply by using the UI to drop pins that define the boundary of the field. For the purposes of this demonstration I used a headland width of zero, but the headland width could be increased to give the tractor an area to turn around in without seeding.

Second, a path is planned by FlightPath based on the previously defined field, the chosen start point, the width of the implement, and the turning radius of the tractor. You can then name this path to use in the mission.

Finally, an implement must be defined. This tells FlightPath the geometry of the tractor implement that is actually being used. The implement can use a 3-point hitch, and a GPS offset from the centre of the tractor can be defined if the GPS is not centred. You can also define each section of the implement; I used only four large sections to match my LED bars, but many more can be added.

After these three things have been defined, a mission can be set up that uses them. Then you just have to deploy the mission and hand the controls over to the autonomous pilot. Below you can see a screen capture of FlightPath showing the robot being autonomously controlled and sections being activated and deactivated.

The video below shows the entire system working in concert! That achieves the second goal of the project: to demonstrate autonomous operation of performing a job.

In conclusion, using the JCA Autonomous Framework, I was able in four months to create a fully autonomous robotic tractor that drove like an actual agricultural machine and displayed the tractor’s implement sections activating and deactivating.

Just imagine what you could do with the JCA Autonomous Framework.

Thank you to JCA for giving me a unique summer co-op experience that allowed me to be creative, work with a breadth of technologies as well as expert developers, and learn about the agile development methodology.

Eric Kapilik

University of Manitoba – Computer Engineering Co-Op Student

https://www.linkedin.com/in/eric-kapilik/
