How we coded the world’s coolest ocean drone

Blueye Robotics

Blueye Pioneer is the world’s first ocean drone for consumers. Serving as a digital diving mask, it allows people to explore the world beneath the surface of the sea using their smartphone or tablet, and then share high-quality videos online. As part of an integrated team of designers and developers at Blueye Robotics, my colleague Alexander Vanvik and I, Martin Rechsteiner, helped design and code the app that controls and interacts with Blueye Pioneer. Here’s how we did it...

Early phase prototyping and testing

We were involved in the development process of Blueye Pioneer from the very beginning. Our mission: to create an intuitive way of controlling it. We knew that the drone should be controlled from users' smartphones, but we weren't sure exactly how - with a hand-controller, by touch, with a head-mounted display, or something else entirely. Everything was completely open. We therefore spent a lot of time prototyping different ways of operating the drone, including many types of input mechanisms. What we learned from all this prototyping and testing formed the basis for the first iterations of the app.

I’ve lost count of how many different types of technical solutions we prototyped and tested before we landed on the solution we are now refining.
Here we are out with the Blueye team testing one of the early prototypes of the drone.

From simple apps to simulators

We tested in two ways. First, we took a boat out and experimented with the web-based prototype that Blueye had already created. This gave us an opportunity to gain experience operating the drone and to iterate on the existing prototype. Second, since we didn't always have access to a physical drone, we created a simulator in Unreal Engine (a game engine used to develop many high-end game titles). This let us prototype and test a lot from our desks at EGGS, where we could iterate on interaction principles and user interface elements as well as try out different ways of interacting with the drone. We could also see how adjusting the placement and angle of the drone's camera affected what the operator sees.

Here you can see me with a prototype for controlling the camera with head movements in the Unreal Engine simulator. To track the head movements, we taped a phone to my head that broadcast its accelerometer data to the simulator.
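To give a feel for how simple that rig was, here's a minimal sketch of the phone side, assuming a Xamarin.iOS app that reads the accelerometer with CoreMotion and pushes each sample to the simulator as a plain-text UDP message. The address, port and message format are illustrative assumptions, not the actual prototype code.

// Minimal sketch (Xamarin.iOS): broadcast accelerometer samples over UDP
// so a simulator on the same network can use them as head-tracking input.
// The endpoint and message format are illustrative assumptions.
using System.Net;
using System.Net.Sockets;
using System.Text;
using CoreMotion;
using Foundation;

public class HeadTrackingBroadcaster
{
    readonly CMMotionManager motion = new CMMotionManager();
    readonly UdpClient udp = new UdpClient();
    // Hypothetical address of the machine running the Unreal Engine simulator.
    readonly IPEndPoint simulator = new IPEndPoint(IPAddress.Parse("192.168.1.50"), 9000);

    public void Start()
    {
        motion.AccelerometerUpdateInterval = 1.0 / 30.0; // roughly 30 samples per second
        motion.StartAccelerometerUpdates(NSOperationQueue.MainQueue, (data, error) =>
        {
            if (data == null) return;
            // Send raw x/y/z so the simulator can derive a head orientation from gravity.
            var msg = $"{data.Acceleration.X};{data.Acceleration.Y};{data.Acceleration.Z}";
            var bytes = Encoding.UTF8.GetBytes(msg);
            udp.Send(bytes, bytes.Length, simulator);
        });
    }

    public void Stop() => motion.StopAccelerometerUpdates();
}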

Choosing a tech stack

When it was time to start developing the actual app, we spent many hours with Blueye's chief software developer discussing which technology stack to use. We knew we needed to support both iOS and Android, so we looked at numerous cross-platform options. We didn't want the user experience to suffer from our choice of framework, so anything that didn't allow us to use native interface elements wasn't considered an option. This basically left us with three alternatives: React Native, Xamarin, or separate native implementations for iOS and Android.

We quickly decided that React Native probably wouldn't give us the fine-grained control we needed without having to tap into native code anyway. Because Blueye already had experience using Xamarin, and we knew that a lot of the same functionality would have to be written twice if we went completely native, we decided that Xamarin was the most suitable fit.

With Xamarin you write apps in C# that compile down to native iOS and Android apps. This meant we could develop a common core shared between platforms - things like video transfer, drone communication and video uploading - while still having the fine-grained control of writing native interface elements. This turned out to be a smart move, as it saved us a lot of time by not having to do the programming work twice.
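As an illustration of that split, here's a rough sketch of the pattern - the class and method names are hypothetical, not Blueye's actual code. The drone-facing logic lives in a shared C# library that both apps consume, and only the user interface is written per platform.

// Illustrative sketch of the shared-core pattern. Names are hypothetical;
// the point is that logic like this is written once and used by both apps.
using System.Threading.Tasks;

namespace DroneApp.Core
{
    // Abstraction over the network link to the drone, independent of iOS/Android.
    public interface IDroneConnection
    {
        Task ConnectAsync(string host, int port);
        Task SendThrustAsync(double surge, double yaw, double heave);
        Task StartVideoStreamAsync();
    }

    // Shared controller used by both the iOS and Android apps.
    public class DroneController
    {
        readonly IDroneConnection connection;

        public DroneController(IDroneConnection connection) => this.connection = connection;

        // Hypothetical drone address for the sketch.
        public Task ConnectAsync() => connection.ConnectAsync("192.168.1.101", 2011);

        // Simplified example of shared behaviour: dive or surface at full heave thrust.
        public Task DiveAsync(bool down) =>
            connection.SendThrustAsync(surge: 0, yaw: 0, heave: down ? 1 : -1);
    }
}

Each platform then only implements the native views on top of this shared core, which is where the UICollectionView instruments mentioned below come in.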

Xamarin gives you full access to the native APIs. For instance, in the controller view we've implemented all the instruments using UICollectionViews with custom layouts. We're also rendering a 3D model of the drone using a native SpriteKit view.
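To make that concrete, here's a trimmed-down sketch of how an instrument panel can be wired up with a UICollectionView from Xamarin.iOS C#. The cell class, the reuse key and the dummy readings are illustrative assumptions; the real app uses custom layouts and live telemetry rather than a flat list of strings.

// Sketch (Xamarin.iOS): instrument widgets as cells in a UICollectionView.
using System;
using CoreGraphics;
using Foundation;
using UIKit;

public class InstrumentCell : UICollectionViewCell
{
    public static readonly NSString Key = new NSString("InstrumentCell");
    public UILabel ValueLabel { get; }

    [Export("initWithFrame:")]
    public InstrumentCell(CGRect frame) : base(frame)
    {
        ValueLabel = new UILabel(ContentView.Bounds) { TextAlignment = UITextAlignment.Center };
        ContentView.AddSubview(ValueLabel);
    }
}

public class InstrumentSource : UICollectionViewSource
{
    // Dummy depth, temperature and battery readings; the real app binds live telemetry.
    readonly string[] readings = { "2.4 m", "12.1 °C", "87 %" };

    public override nint GetItemsCount(UICollectionView collectionView, nint section) => readings.Length;

    public override UICollectionViewCell GetCell(UICollectionView collectionView, NSIndexPath indexPath)
    {
        var cell = (InstrumentCell)collectionView.DequeueReusableCell(InstrumentCell.Key, indexPath);
        cell.ValueLabel.Text = readings[(int)indexPath.Row];
        return cell;
    }
}

// In the hosting view controller:
// collectionView.RegisterClassForCell(typeof(InstrumentCell), InstrumentCell.Key);
// collectionView.Source = new InstrumentSource();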

Working holistically in cross-disciplinary teams

Blueye plays a significant role in the development of the drone. We've also worked as part of a tight-knit team of product designers, especially on the simulation and testing of the thrusters, camera positioning and different modes of control and steering. Because Blueye has a solid, top-level design strategy that permeates all aspects of the product, there has been a well-founded fusion of branding, product design, strategy, and development. This allows us to create a seamless experience for the user.

We’ve developed and innovated in teams, closely interacting with both product designers and digital designers in iterative processes.
Here are some of the people working on Blueye. From left: Alexander Vanvik, digital designer and developer from EGGS; Johannes Schrimpf, Real-time Control Engineer at Blueye; Andreas Viggen, Product Developer at Blueye; me; Borja Serra, Electrical Engineer at Blueye; and Jonas Follesø, Chief Software Developer at Blueye.

Creating a social experience

Today, we have a fully functioning iOS app, and thanks to the shared core much of the groundwork for the Android app is already in place. We're now focusing on the social aspect of using Blueye Pioneer. The intention is to create a Blueye user community within the app - a social forum where users can share footage, interact live via video stream, and store recorded material in the cloud. With an official worldwide launch planned for spring 2018, Blueye Pioneer is available to pre-order now.

Visit Blueye and read more about the underwater drone
