Wildsense collects hundreds of thousands of images of tigers from around the internet and puts them into a citizen science project for analysis. Users experience Wildsense as a game, in which they annotate the photographs, improving as they play and competing with friends and other Wildsense users. Machine learning algorithms and data analysis are applied to users’ responses, allowing us to uncover insights into tigers’ locations and behaviours. In particular, the app uses algorithms to generate probability-based predictions about the geographical location of tigers.
Wildsense was designed and developed for iPhone and iPad use. It is available on the App Store.
The app is fun and competitive, with stat tracking, an online leaderboard, and achievement points up for grabs.
With each user, Wildsense’s machine learning algorithms get ‘smarter’, and better able to provide invaluable insights for tiger conservation.
Wildsense combines the latest technology in computer vision, crowdsourcing, gamification, machine learning, UX, and iOS design and development.
Wildsense started with an idea: that crowdsourcing wildlife image analysis might yield the sorts of insights that experts working alone cannot reach. We set out to design, develop and build a fully functional app that would source wildlife images from the internet and allow individual users to comment on them. Using individuals’ responses and machine learning algorithms, we would build a new kind of automated wildlife tracking system.
We designed the app from a carefully thought-out initial specification and built it iteratively, using user flows, wireframes and visual designs.
We were breaking new ground with Wildsense, so we had to develop a number of original algorithms for its back-end. These included an image ranking system to identify and promote images for analysis.
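The case study doesn't describe how the ranking worked, but one plausible sketch is to prioritise images that still need annotations or where annotators disagree. Everything below (the `Image` fields, the target of five annotations per image, the scoring formula) is an illustrative assumption, not the actual Wildsense algorithm.

```python
# Hypothetical annotation-priority ranking: surface images that are
# under-annotated or contested, so crowd effort goes where it helps most.
from dataclasses import dataclass

@dataclass
class Image:
    url: str
    annotations: int   # number of user annotations collected so far
    agreement: float   # fraction of annotators who agree (0.0 to 1.0)

def priority(img: Image, target: int = 5) -> float:
    """Higher score = more in need of further annotation."""
    need = max(target - img.annotations, 0) / target  # still short of target
    uncertainty = 1.0 - img.agreement                  # annotators disagree
    return need + uncertainty

images = [
    Image("tiger_001.jpg", annotations=1, agreement=1.0),  # needs more votes
    Image("tiger_002.jpg", annotations=5, agreement=0.4),  # contested
    Image("tiger_003.jpg", annotations=5, agreement=1.0),  # settled
]
queue = sorted(images, key=priority, reverse=True)
```

Under this scheme the under-annotated image is promoted first, the contested one second, and the settled one last.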
We developed an original gamification model that encouraged participation from thousands of users.
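A common pattern in crowdsourcing games of this kind, and one way the achievement points mentioned above could work, is to reward users when their annotation matches the crowd consensus. The function below is a minimal sketch of that idea; the name `score_round` and the flat ten-point reward are assumptions for illustration, not the actual Wildsense model.

```python
# Hypothetical consensus-based scoring: award points when a user's
# annotation matches the majority answer from all players.
from collections import Counter

def score_round(user_answer: str, all_answers: list[str], base_points: int = 10) -> int:
    """Return base_points if the user agreed with the crowd's majority answer."""
    consensus, _ = Counter(all_answers).most_common(1)[0]
    return base_points if user_answer == consensus else 0

# A user who tagged the photo "tiger" alongside a "tiger" majority earns points.
points = score_round("tiger", ["tiger", "tiger", "leopard", "tiger"])
```

Scoring against consensus rather than a fixed answer key is what lets a game like this work on unlabelled images: the crowd itself defines the ground truth.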
We performed extensive metadata analysis on the images, which involved semantic analysis and building networks of ‘tags’ to organise the images.
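One simple way to build such a network of tags, sketched below under the assumption that each image carries a set of metadata tags, is to count how often pairs of tags co-occur across images; strongly connected pairs suggest semantically related groups. This is an illustrative reconstruction, not the actual Wildsense pipeline.

```python
# Hypothetical tag co-occurrence network: edges count how many images
# two tags appear on together, organising images into related clusters.
from collections import Counter
from itertools import combinations

def tag_network(images: list[set[str]]) -> Counter:
    """Map each (tag_a, tag_b) pair to its co-occurrence count."""
    edges = Counter()
    for tags in images:
        for a, b in combinations(sorted(tags), 2):  # sort for a stable key
            edges[(a, b)] += 1
    return edges

net = tag_network([
    {"tiger", "zoo", "india"},
    {"tiger", "india", "wild"},
    {"tiger", "zoo"},
])
# High-count edges like (india, tiger) indicate tags worth grouping.
```

From a structure like this, clusters of frequently co-occurring tags can be used to route related images to the same analysis queue.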
We launched Wildsense in March 2015.
We were delighted to gather substantial press interest from the likes of the BBC, National Geographic, Fast Company, Good, Wired and others all over the world.
Following the success of Wildsense, we’re excited to be repurposing its configurable back-end for further wildlife projects.
We designed and developed a user interface for a system that monitors financial trading communications.
We built a system that opens up the diagnosis of eye disease to crowdsourcing, allowing users to learn as they annotate retinal scans.