The Photosynth service lets users create panoramas, spins, and walks, all with a 3D-like effect, and share their creations with their peers.
Together with the Photosynth dev team, we spent several months refining, redefining, and exploring multiple possibilities.
The project required a new UX for a whole new photo technology enabled by Photosynth. The most challenging aspect of all was creating an immersive experience that pivots through time and space, one that reflects the magical qualities of this kind of photography.
We initially created personas based on user scenarios.
An outside research company found suitable candidates that fit our requirements and conducted phone interviews based on questions we defined.
The next step was to research UX paradigms in gaming, specifically FPS and 3D games on mobile, as well as more sophisticated PS-style games. These gave us inspiration for how to display information on top of 3D environments.
Together with the product’s UX lead, I achieved this immersive feeling by using full-scale photography whenever possible, with the UI sitting on top of it. The content is king: users moved around the worlds they created simply by sliding a finger to the sides or (in future versions) moving forward seamlessly. These huge, immersive images were accompanied by big, bold fonts and smooth transitions between the worlds users created. The portals to connected synths were represented by delicate empty circles that came into focus as the user moved them toward the center of the screen.
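The portal behavior described above, circles sharpening as they approach the center of the screen, can be sketched as a simple distance-based interpolation. This is a minimal illustrative sketch, not the shipped implementation; the function name, parameters, and linear falloff are all assumptions.

```python
def portal_focus(portal_x: float, portal_y: float,
                 screen_w: float, screen_h: float,
                 focus_radius: float = 0.35) -> float:
    """Hypothetical focus value for a portal circle: 1.0 when the
    portal sits at the screen center, fading to 0.0 once its
    normalized distance exceeds `focus_radius` (a fraction of the
    half-diagonal). A renderer could map this value to the circle's
    opacity or stroke weight."""
    cx, cy = screen_w / 2, screen_h / 2
    # Normalized distance from screen center: 0 at center, 1 at a corner.
    half_diag = (cx ** 2 + cy ** 2) ** 0.5
    dist = ((portal_x - cx) ** 2 + (portal_y - cy) ** 2) ** 0.5 / half_diag
    # Linear falloff, clamped to [0, 1].
    return max(0.0, min(1.0, 1.0 - dist / focus_radius))
```

Calling this each frame as the user pans keeps the effect continuous: a portal drifting toward the center brightens smoothly rather than snapping into view.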
Adding the image’s title, description, and tags is done on top of the image itself.
The Photosynth app’s “Feed” is also immersive: flipping through image headlines changes the background accordingly.
A new way to take photos
The new photo-taking ability required us to provide plenty of guidance for the user.
Before use: Illustrations showing how to snap the photos appeared on top of the camera’s live view.
While taking photos: The app guided the user on where to snap next using highlights on top of the live view, and as they approached the correct location, the camera pixels came into focus.