Project Mercury

Precision agriculture gives seed companies a huge competitive advantage. A prominent seed company contacted Luxoft for help in this area, drawn by Luxoft's previous work with John Deere.

We had three months to develop a proof-of-concept for an in-cab harvest visualization tool and data transfer service. Modern tractors are fitted with sensors that communicate crop data to the cab: values like moisture percentage, yield, and mass flow.


Our client already owned a data analysis and visualization platform. This platform gave growers all the tools they needed to make intelligent decisions about their fields. Our problem was getting the data from the grower's tractor to the client's database. At the time, growers used a USB stick to transfer this data from their vehicles to their computers -- certainly an archaic workflow in 2017! We wanted to do it in real time via a cellular connection.
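To make the idea concrete, here is a minimal sketch of what a batched sensor upload might look like. All field names, units, and the payload shape are assumptions for illustration; the actual telemetry format was defined by the tractor's sensor bus and the client's backend.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shape of a single harvest reading; field names and
# units are illustrative, not the client's actual schema.
@dataclass
class HarvestReading:
    timestamp: str            # ISO 8601, e.g. "2017-09-14T15:02:00Z"
    latitude: float
    longitude: float
    moisture_pct: float       # grain moisture percentage
    yield_bu_per_acre: float  # yield in bushels per acre
    mass_flow_lb_per_s: float # mass flow through the combine

def to_upload_payload(readings):
    """Serialize a batch of readings into JSON for a cellular upload."""
    return json.dumps({"readings": [asdict(r) for r in readings]})

sample = HarvestReading("2017-09-14T15:02:00Z", 41.59, -93.62,
                        18.5, 172.0, 24.3)
payload = to_upload_payload([sample])
```

Batching readings like this, rather than sending each one individually, matters on a cellular link where coverage in the field can be intermittent.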

There was also a data visualization component. Showing the sensor data to growers on the spot, and allowing them to make real-time decisions about their crops, was a huge value add for our client. Thus our problem had two parts: could we build an app that let users transfer their data to the cloud, and could we visualize that data in real time?


Our first three months had a single goal: prove that we could execute on our concept. As the lone designer on the project, I didn't have much time to test or iterate. Deadlines had to be met.

Our initial meeting with the client was on-site at their headquarters in Des Moines, Iowa, where we held a two-day brainstorming session on requirements and features.

Because the timeline was so limited, I only had a few days to iterate on a low-fidelity prototype before starting to flesh out the visual design. Luckily, the client had a pre-existing style guide. My task was to interpret these branding guidelines for the iPad form factor. This was a unique challenge: the guidelines dictated that most of the UI look like Google's Material Design, so I had to find a middle ground between that and the rules in Apple's Human Interface Guidelines.

Had I been able to spend more time on the project, this is an area I certainly would have delved into more deeply.

After three months, a contract extension gave me a chance to remedy my mistakes from the first version. We had a group of test users trying the app (literally) out in the field and providing us with regular feedback. Halfway through this extension the project was put on ice, so I never saw my solutions developed and tested with real users.

There is a chance the project will continue in 2018.

More screens

Here are some more examples from the project.