Sony's Hands-Free 'N' Prototype Wearable
When a smiling Japanese man invites you inside his tent on the side of
the road at SXSW, you should take him up on his offer. Especially if
that tent has a Sony logo on the outside.
When I accepted just such an offer, I got some hands-on time with a next-generation wearable prototype designed by Sony's Future Labs Program. Project N is part Google Glass and part Amazon Echo, except that it hangs around your neck and operates completely hands-free.
The technicians at the Future Lab were clear that Project N is still
very much a prototype—it didn't respond to every voice command. But it
did well enough to give me an idea of what its potential could be in the
wearable market. Whereas Glass required a small screen to present
information, N attempts to provide the same features exclusively using
voice commands.
At a glance, Project N looks like a narrow pair of sunglasses hanging
around your neck—noticeable and somehow out of place. I don't think it
is as distracting as Google Glass, but it does stand out.
The first thing I did when I put on the N was play music. The N
creates a direct field of sound around your head. I could hear the sound
clearly and loudly, but a person standing a few feet away could barely
make out the song. The idea is that you can listen to music and interact
with the audio assistant without disturbing those around you. The
company also developed open-ear headphones that allow you to listen to
music while still taking in ambient noise.
As with Google Glass, you can use the N to take pictures of anything
directly in front of you. Just ask it to "take a picture," and a small
camera lens rolls forward, snaps the photo, and rolls back.
The N is primarily a voice and audio interface. You speak commands,
and it answers you via audio. The voice recognition relies on a
combination of local and cloud-based technologies. The prototype needed
to be on Wi-Fi to work properly. It also supports GPS, so it could
theoretically be used to track your bike routes. Sony has partnered with Yelp
and Strava to provide content for the platform, although I wasn't able
to test these out.
Sony provided no information about pricing or availability. The
exhibition was simply a way for the company to collect feedback on a new
device platform and let its engineers meet actual users. The big
question is how this device integrates with other personal assistants
like Google Now, Apple's Siri, and Amazon's Alexa.
Will it serve as a hardware front-end or a completely different
personal assistant platform? Sony representatives wouldn't comment, but
they did say they were looking for "a lot" more content partners.