Wow, this is a cutting-edge innovation
from ING Netherlands. My Android phone shows me the ATMs in sight when I point its camera in any direction. ATM markers are overlaid on the live camera image, so I know which way to go.
The concept of processing an image of the real environment and reacting to it has been around since some MIT researchers introduced
SixthSense. This is a miniature wearable computer that can see what you see and even project a GUI onto any surface to show additional information about the real-world scene. It is as if you pick up
a book in your favourite bookstore and the device projects the book's Amazon rating (downloaded from the web) onto it.
Interesting technology indeed.
Does it only work on Android? That would be a pity...
Excellent idea otherwise!
I know someone at Nokia who's been working on something like this. This is the best site I found:
But back to the original post, I was wondering how this works. Is it based on a combination of GPS and server-side processing (i.e. figure out your location using GPS and your direction using the view from the camera, then return the relevant information)?
In that case there would be a LOT of location-specific data (360-degree photos) that would need to be set up in advance on the server end.
But it may also be feasible to integrate this with Google Street View... just thinking aloud.
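Actually, the server may not need any photo matching at all: with just the phone's GPS fix and compass heading, the app could compute the bearing to each nearby ATM and draw a marker whenever that bearing falls inside the camera's field of view. Just to sketch that speculation (this is not how ING necessarily does it), here is a minimal Python example; the ATM names and coordinates are made up:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def visible_atms(phone_lat, phone_lon, heading_deg, atms, fov_deg=60):
    """Return (name, horizontal offset in degrees) for ATMs inside the camera's field of view.

    heading_deg is the compass direction the camera is pointing;
    a negative offset means the marker sits left of screen centre.
    """
    hits = []
    for name, lat, lon in atms:
        b = bearing_deg(phone_lat, phone_lon, lat, lon)
        # Signed angular difference between camera heading and ATM bearing, in (-180, 180]
        off = (b - heading_deg + 180) % 360 - 180
        if abs(off) <= fov_deg / 2:
            hits.append((name, off))
    return hits

# Hypothetical example: phone at the origin, camera pointing due east (heading 90).
atms = [("ATM east", 0.0, 0.01), ("ATM north", 0.01, 0.0)]
print(visible_atms(0.0, 0.0, 90.0, atms))  # only the eastern ATM is in view
```

The horizontal offset maps directly to a marker's x-position on screen, which would explain why no 360-degree imagery is needed on the server, only a plain list of ATM coordinates.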
© Finextra Research 2016