Homework 7: Mobile-Focused Tool


For this round of prototyping I chose to focus on the look and feel of the application. In past iterations, I had been bothered by the inability to animate the interface the way it would appear to the Doctor. Everything was very static, despite the ability to navigate through the screens. I tried to give all the interface elements a shiny gold metallic look that evokes the decorative elements of a gold watch.

The homing interface mainly focuses on showing a path to prearranged objects that the Doctor has had prior access to. Each item has its own button that, when touched, shows a path to the desired person or thing. The map starts as a bird’s eye view, and when a button is touched, the viewpoint swoops down to ground level and points at the start of the path. Each path type has its own color to indicate which object it leads to.

The alarm interface has three different settings: time, location, and a message to indicate what the alarm is for. The time screen has sliding rows of time units; the selected number or name sits in the slightly isolated center of the row. The location screen shows a map interface that allows the Doctor to choose a location for the alarm. The message screen offers the ability to either type a message by touching individual letter buttons, or to touch a microphone icon and speak the message. As words are recognized, the microphone pulses a bit.

Version 1 to Version 2

Since I was never a professional animator, and what skills I had with Blender were rusty, I focused on just getting basic meshes and animations working in Blender.

The alarm screens in version 1 have moving time rows and typing effects, but the map is simply static. In version 2, I discovered that Marvel allows different gestures on its hotspots. I set the hotspots to use pinch and spread gestures to simulate zooming in and out of the map. Each image was created in Blender by animating the zooms.

The homing screen in version 1 was just a borrowed image from old prototypes imported into Blender, and given a very basic animation. Mainly, I was just seeing if I could solve the problem of animating the rotation of the map. It turns out I should have animated the camera instead of the map plane. For version 2, I figured out how to tie the buttons to the camera in the scene graph, so buttons are now present. When the TARDIS button is touched, the path appears, and the camera flies in behind the Doctor and faces the start of the path.
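A camera fly-in like this can be keyframed from Blender's Python console using the bpy API. The sketch below is a minimal illustration of the two ideas above — parenting the buttons to the camera so they stay fixed in view, then animating the camera rather than the map plane. The object names ("Camera", "TardisButton") and coordinates are hypothetical placeholders, not the actual scene used in these prototypes.

```python
import bpy  # Blender's Python API; this only runs inside Blender

# Hypothetical object names -- adjust to match your own scene.
cam = bpy.data.objects["Camera"]
button = bpy.data.objects["TardisButton"]

# Parent the button to the camera so it behaves like an on-screen
# UI element, staying in place while the camera moves over the map.
button.parent = cam

# Keyframe the swoop: start at a bird's-eye view of the map...
cam.location = (0.0, -12.0, 15.0)
cam.rotation_euler = (0.9, 0.0, 0.0)
cam.keyframe_insert(data_path="location", frame=1)
cam.keyframe_insert(data_path="rotation_euler", frame=1)

# ...and end at ground level behind the Doctor, facing the
# start of the path.
cam.location = (0.0, -2.0, 1.6)
cam.rotation_euler = (1.4, 0.0, 0.0)
cam.keyframe_insert(data_path="location", frame=48)
cam.keyframe_insert(data_path="rotation_euler", frame=48)
```

Rendering frames 1–48 then yields the short movie that becomes the animated GIF for the prototype.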

Reflection On Using Marvel and Blender

I had originally started this project using InVision, but it only allows images up to 10MB in size to be uploaded. My animations are animated GIFs made from very short movies produced in Blender. Some of the images are several times larger than that. Marvel had no problem with large GIFs at all.

I really love working with Blender, and it’s about as flexible an animation tool as one can find. For situations where very specific behaviors need to be modeled, like the ones in this project, it did the job very well. Another benefit of using Blender is that premade meshes of various useful objects can easily be found, so you don’t have to go to the trouble of making them yourself. For example, by searching for “Doctor Who Blender”, I was able to find a rather nice mesh of the Doctor to use in the prototypes. The one problem with Blender is that it’s a tool made for detail, not speed. I sank a lot of time into looking up how to create animations and tweaking the prototypes. Having gone through this process, I would hesitate to use Blender for anything but the most specific scenes that couldn’t be communicated any other way.

Other Situations and Projects

One use I could see for Blender is to create prototypes of 3D spaces mostly composed of easily found meshes. I found slides from a talk on this subject where the speaker showed how to use Blender for room layouts. These rooms could be scripted to give a walkthrough of an office, or a prototype for a video game.

Of the image-based prototyping tools, I have so far liked Marvel the best, and even paid for a subscription for a few months. It’s flexible, offers touch gestures, and doesn’t complain about my ridiculously huge images. Another cool feature is that there are Android and iPhone apps that let you test out your prototypes right on the phone. InVision had a neat feature where you could text a link to a phone, but it choked on my prototypes. The Marvel app seems to be more reliable.
