Saturday, December 14, 2013

Google Glass Impressions



So I just got done trying out Google Glass at a marketing event in Austin.  Here are my impressions.

The event itself was held in a downtown art gallery.  Coffee and sodas, various snacks, gallery staff dressed in black walking around, handlers from Google ready to answer questions and swap out units that weren't working correctly (about halfway through the event, my unit lost its wifi connection).  I signed a waiver that lets Google use all the images and audio recorded at the event for marketing purposes, but I doubt any of you will be seeing me in an ad any time soon.  I think they mostly wanted to cover their asses since there were about a hundred walking video cameras in that space.

As for the hardware itself, Glass is awkward.  It responds to voice commands fairly well, even in a crowded gallery with a bunch of people talking around you, but a lot of the navigation has to be done using the touch interface on the right side of the glasses, just behind the display.  That would be fine for occasional use, but during the thirty minutes or so I spent with it, I was swiping and tapping that little touch-sensitive area every 15-20 seconds.

The display is better than I expected.  On the New York Times' website, I could see about ten lines of article text on the screen at a time, which was more than I anticipated based on the video demos I’ve seen.  But again, the awkwardness of the interface got in the way: you scroll through the page by sliding your finger along the side of the glasses.  In several places in the operating system menus, you can scroll by tilting your head instead, which would have been a lot more convenient here.

I was a little surprised by how quickly my eyes got tired looking up and to the right like that.  It's okay to glance up at the display from time to time, but trying to use it for anything sustained just isn't practical.  Another thing that caught me off guard was that I couldn't read the display without my glasses on.  I'm nearsighted, so I figured that with the display literally an inch from my eyeball I wouldn't need them, but whatever optical trick makes the display appear to float a foot or two in front of you apparently also triggers pre-existing vision problems.  The unit itself has no lenses – it’s just a headband with nose rests and a display – so I was able to put it on over my glasses, but again, it was awkward.

The bone conduction speakers work, but not well.  Initially, I couldn’t hear anything at all, and my friend had to ask one of the Google handlers how to get sound.  It turns out the solution is to put your finger in your ear.  Again – awkward.  *Really* awkward.  Once I plugged my finger in my ear, though, I could hear okay.  The sound is tinny, with very little bass, but that’s not really surprising; an iPhone’s external speaker sounds better, but not by much.  I listened to a piano piece, the THX noise, and a rock song (Strange Television by Deadboy & the Elephantmen, if anyone wants to know), all on YouTube.  It was adequate; if I’d been listening to someone talking, I’m fairly sure I would have been able to understand them.  The sound comes through the hardware mounted on the right side of the glasses, so obviously it’s not in stereo.

The map application was solid.  It was able to give me directions to nearby restaurants and movie theaters with no problems, though I had to scroll through them by swiping the touch area.  The accelerometer did an excellent job of tracking where my head was pointed and swiveling the map to match my movements.  If I were blindfolded and dropped into a strange city with Google Glass, I could get around about as well as I could with a smartphone (assuming I didn’t get mugged for wearing a $1600 piece of hardware on my face).

Taking photos and videos is pretty clearly one of the main things Glass is designed for.  It really is just as simple as saying “Okay Glass, record a video.”  The camera points more or less wherever you’re looking; I didn’t have to adjust my head position much to get things framed the way I wanted.

My friend ran into a problem, though – he has vision problems in his right eye, and they didn’t have any units with the display mounted on the left side.  He played with it for a couple of minutes before taking it off.  One of the Google handlers paired a Glass unit with his smartphone via Bluetooth and used the phone’s screen to show him what the display looks like.  Once the device hits the market, I assume there will be “lefty” models available.

And that’s about it.  Overall, I wasn’t tremendously impressed.  It’s a cool concept, but this isn’t going to replace your smartphone any time soon.  Part of the problem is the display – it’s not designed for extended use – but the biggest issue I had was the interface.  Glass relies far too heavily on the touchpad for navigation and control.  That’s a software issue, though, so it should improve over time as third-party developers get involved.  I wouldn’t buy this when it hits the market, but Glass 2.0 or 3.0 may be worth the money.

I think I covered everything, but I’m happy to answer any questions.