A colleague at Epic is part of the Glass Explorer programme, and this weekend I was pleased to be able to take a Google Glass home to learn and experiment with. We took the device up to the LT14 show last week, where it was a bit of a draw on the stand and a great way to introduce people to this new world of wearable technology.
My first impression was that this is absolutely not intuitive to set up! It’s not like getting a new iPhone that ‘just works’. This thing takes a bit of time to get to know and understand. That’s fine, I guess: most of our interactions with computers to date have been limited to using a mouse or touchscreen as input devices, so you do need to learn a new set of basic user interactions for this thing.
Unfortunately, though, there is no on-screen tutorial when it’s fired up. Bearing in mind that I’m not the first user of this particular device, there may have been a first-run walkthrough I missed, but there is no obvious tutorial in the menu system that I could find either. So I had to go to the Google Glass Help site and watch a few videos on my tablet first. This is a bit of a drag, to be honest; I really dislike the trend for 5–10 minute instructional videos when I just want to get going.
People who are used to voice input for their computers, such as Sat Nav and Siri users, will probably feel quite at home with Glass. I’ve not got any dedicated voice input devices myself; my Android phone has Google voice search, but that’s not really been of interest to me until now. So I had to get over that initial self-consciousness of using voice input for the first time. Frankly, saying “OK, Glass” out loud in a social context makes you feel like a complete dick. Extroverts and show-offs may like that, but not me. I can’t really see my opinion changing the more I use it; maybe I’m just not a fan of voice input devices, especially in social situations, where they put up significant barriers. A new name has even been coined for such users: Glassholes.