May 23rd, 2013

Last week I was given the opportunity to fulfill a dream I’ve had for almost a year. After scribbling my life away on a lengthy non-disclosure agreement and quickly pacing up the steps to the 2nd floor of the Practice Fusion offices on Taylor Street in San Francisco, I found myself in the heart of a buzzing, energetic room filled with entrepreneurs, developers, designers, technologists, and futurists, all keen on living 3–5 years ahead of the general population.
Voices filled the hall, echoing bold new ideas and concepts describing the many ways in which Google Glass, and other wearable computing technologies, will most certainly change the world: “OK Glass… show me the future.”
The meetup I attended was arranged by the SF Smart Glasses Apps and Developers group and hosted by Glass futurist Zane. The group has nearly 400 members, and this most recent meetup drew 50+.
A number of select attendees were in possession of the Google Glass device via the #GlassExplorers program; those generous enough to share became the life of the party, gifting to others a glimpse into a new reality: one with access to a wealth of the world’s information, right in your line of sight.
The evening kicked off with a presentation by Siamak “Ash” Ashrafi. Ash presented in great detail his experience of being accepted into the Glass Explorers program, selecting and customizing his device, and the close attention Google pays to style. They’re treating the adoption of Glass as a life-changing event, devising a grand plan to make you feel undoubtedly comfortable about adopting and wearing the device. (The slides from Ash’s presentation can be found here.)
During our second presentation, by Lance Nanek (Glass Explorer), we discovered an Easter egg titled “Meet The Team”. This Easter egg lets you view a panoramic image of Sergey Brin and his team of engineers. Using the device’s variety of sensors, you can pan around Google’s offices in 360°, as if you were actually there!
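The idea behind that trick is simple: the device's orientation sensor reports which way you're facing, and that heading picks out a slice of a 360° panoramic image. A minimal sketch of the mapping, with illustrative names and numbers that are not from the actual Glass API:

```python
# Hypothetical sketch: map a compass azimuth (degrees) to a pixel
# column in an equirectangular 360-degree panorama. As the wearer
# turns, the azimuth changes and the view scrolls to match.

def panorama_column(azimuth_deg: float, image_width: int) -> int:
    """Return the panorama column corresponding to a heading."""
    wrapped = azimuth_deg % 360.0          # normalize to [0, 360)
    return int(wrapped / 360.0 * image_width) % image_width

# Facing due east (90 degrees) in an 8000-px-wide panorama:
print(panorama_column(90, 8000))   # → 2000
```

A real implementation would smooth the sensor stream and blend gyroscope and magnetometer readings, but the core of "pan by turning your head" is just this wrap-and-scale step.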
With devices like Google Glass, Jawbone’s UP, and Nike’s Fuel Band being widely adopted, we’re being given the opportunity to chronicle our lives, capturing data from a variety of sensors that are readily accessible to the general public through consumer electronic devices. By capturing massive amounts of streaming, multi-structured data, we’re able to learn more about ourselves than we ever could have imagined. Google Glass has 13 known sensors [1] (some of which aren’t yet accessible through the Glass API) and a camera:
  • MPL Gyroscope
  • MPL Accelerometer
  • MPL Magnetic Field
  • MPL Orientation
  • MPL Rotation Vector
  • MPL Linear Acceleration
  • MPL Gravity
  • LTR-506ALS Light sensor
  • Rotation Vector Sensor
  • Gravity Sensor
  • Linear Acceleration Sensor
  • Orientation Sensor
  • Corrected Gyroscope Sensor
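To make the "streaming, multi-structured" point concrete, here is a toy sketch of what one tick of output from a multi-sensor wearable could look like as timestamped records. The record shape, sensor names, and values are all illustrative assumptions, not the actual Glass sensor API:

```python
# Illustrative sketch (not the real Glass API): one tick of
# multi-sensor output, each reading stamped and self-describing.
import time

def make_reading(sensor: str, values: tuple) -> dict:
    """Wrap raw sensor values in a timestamped record."""
    return {
        "sensor": sensor,
        "timestamp_ms": int(time.time() * 1000),
        "values": list(values),
    }

# A single sampling tick: each sensor reports a different shape of data.
tick = [
    make_reading("gyroscope", (0.01, -0.02, 0.00)),   # rad/s, 3 axes
    make_reading("accelerometer", (0.1, 9.8, 0.3)),   # m/s^2, 3 axes
    make_reading("light", (120.0,)),                  # lux, 1 value
]
```

Note that the readings don't share a fixed shape: a gyroscope emits three axes, a light sensor one value. That variability is exactly what the next paragraph's storage discussion is about.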
Storing and correlating all of this data has become an entirely new problem to solve; relational systems such as Oracle aren’t prepared to handle these massive volumes of data the way a next-generation NoSQL database can. The paradigm shift from relational to NoSQL parallels the shift in readily available consumer technology toward an era of machine-generated data, device self-awareness via a multitude of sensors (take the Nest thermostat, for example), and new breeds of intelligent software that help us make better decisions. There’s a lot of data being captured, analyzed, and reported on, and the amount is growing at an exponential rate.
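The appeal of the document model for this kind of data can be shown with a toy in-memory stand-in for a NoSQL store (the class and collection names here are hypothetical, not any real database's API). Each reading keeps only the fields its sensor actually produces, where a fixed relational schema would need a column for every field any sensor might ever emit:

```python
# Hedged sketch: a toy document store illustrating schemaless storage
# of multi-structured sensor data. Not a real database client.
from collections import defaultdict

class TinyDocumentStore:
    """In-memory stand-in for a NoSQL document database."""

    def __init__(self):
        self._collections = defaultdict(list)

    def insert(self, collection: str, doc: dict) -> None:
        # No schema check: any dict shape is accepted as-is.
        self._collections[collection].append(doc)

    def find(self, collection: str, **filters) -> list:
        # Return documents whose fields match every filter exactly.
        return [d for d in self._collections[collection]
                if all(d.get(k) == v for k, v in filters.items())]

store = TinyDocumentStore()
# Two readings with different shapes live in the same collection.
store.insert("readings", {"sensor": "light", "lux": 120.0})
store.insert("readings", {"sensor": "gyroscope", "xyz": [0.01, -0.02, 0.0]})
light = store.find("readings", sensor="light")
```

A real document database adds indexing, replication, and a query language on top, but the flexibility being traded on is the same: the data defines its own shape, reading by reading.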
I’m excited to see how Google Glass pans out and whether we’re ready to adopt such a technology at this time. One thing I’m certain of is that instant access to information, no longer just at our fingertips but in the blink of an eye, has the potential to be highly addictive. Google prides itself on the mentality that every millisecond counts; that mentality shines brightly with Google Glass.