Android App Ratings & Reviews

Google’s “Gaze” Patent Intended to Track Your Eye Movements

On August 13, 2013


We already know that Google tracks everything you do online: your shopping habits, where you go, what you like, where you live, whether you have pets, who your friends are. Basically, Google is everywhere in your life. The only condition is that you use their products, i.e. their omnipresent search engine (number 1 in the world by a landslide, despite the privacy issues revealed by the PRISM scandal), the Chrome browser, the Google+ social network with its Circles, not to mention the Android mobile operating system.

Now, a patent filed by Google describes monitoring your eye movements, so in the near future Google will even know what you’re looking at. Basically, your life will be an open book for the nice people from Mountain View.

The patent covers an eye-tracking technology that will initially be used in Google Glass, and it represents a new way of controlling your smart device, replacing the gestures and taps currently used on smartphones.

Obviously, controlling your Google Glass is only the initial use of the “Gaze” patent; in the future there are no limits to its applications: menu navigation or, why not, determining the best ad placement in your browser, not to mention learning your preferences when you visit an online shop, and so on.

With this new patent, Google will know how long you’ve been staring at something, and it will be very easy to determine what you were looking at. On the creepier side, the patent lets Google measure your pupil’s reaction to the item (if the pupil dilates, you’re interested, or something like that), and via the smartphone’s GPS they will know precisely where you were when viewing it.
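To make the speculation above concrete, here is a rough sketch of how dwell time and pupil dilation could be combined into a single “interest” signal. Everything here (the function name, the thresholds, the weighting formula) is invented for illustration and does not come from the patent:

```python
def interest_score(dwell_seconds: float, pupil_dilation_ratio: float) -> float:
    """Crude interest signal: longer stares and larger pupil dilation
    both raise the score (hypothetical formula, not from the patent).

    dwell_seconds: how long the gaze stayed on the item.
    pupil_dilation_ratio: pupil diameter relative to baseline (1.0 = no change).
    """
    dwell_term = min(dwell_seconds / 5.0, 1.0)            # saturate after 5 s
    dilation_term = max(pupil_dilation_ratio - 1.0, 0.0)  # only dilation counts
    return round(0.7 * dwell_term + 0.3 * min(dilation_term / 0.2, 1.0), 2)

# A 2.5-second stare with 10% pupil dilation:
print(interest_score(2.5, 1.1))  # → 0.5
```

Attaching a GPS fix to each score would then tell the server not just how interested you were, but where you were at the time.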


Let’s take a look at a small portion of the patent’s abstract:

A gaze tracking technique is implemented with a head mounted gaze tracking device that communicates with a server. The server receives scene images from the head mounted gaze tracking device which captures external scenes viewed by a user wearing the head mounted device. The server also receives gaze direction information from the head mounted gaze tracking device. The gaze direction information indicates where in the external scenes the user was gazing when viewing the external scenes. An image recognition algorithm is executed on the scene images to identify items within the external scenes viewed by the user. A gazing log tracking the identified items viewed by the user is generated.
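The pipeline the abstract describes (scene images plus gaze direction in, item recognition in the middle, a gazing log out) can be sketched roughly as follows. All names here (`GazeSample`, `GazingLog`, `identify_item`) are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class GazeSample:
    """One upload from the head-mounted device, per the abstract."""
    scene_image: bytes   # frame of the external scene viewed by the user
    gaze_x: float        # normalized gaze coordinates within the frame
    gaze_y: float
    timestamp: datetime

def identify_item(scene_image: bytes, gaze_x: float, gaze_y: float) -> str:
    """Placeholder for the image-recognition step; a real server would
    run an object detector and return the label under the gaze point."""
    return "unknown-item"

@dataclass
class GazingLog:
    """Server-side log of identified items the user has viewed."""
    entries: list = field(default_factory=list)

    def record(self, sample: GazeSample) -> None:
        item = identify_item(sample.scene_image, sample.gaze_x, sample.gaze_y)
        self.entries.append((sample.timestamp, item))

log = GazingLog()
log.record(GazeSample(b"\x89PNG...", 0.42, 0.57, datetime(2013, 8, 13, 12, 0)))
print(log.entries)  # one (timestamp, item) pair per gaze sample
```

The key point is the last step: the server keeps a persistent log of what you looked at, which is exactly what makes the patent valuable for advertising.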
