Three important things happened last week – we discovered that the ceiling fan at our office can withstand hits from a ball made out of scarves, we reached 6000 likes on Facebook (self five) and our version 3.0 update came out.
Why 3.0 Is Awesome News for You
Faster detection. Rev Eye is all hell and fury now. We detect content in less time than it takes you to put one sock on really fast.
See what your peeps are up to with Stream, which shows you activity on Rev Eye.
We have become more lovable. There is more of us to love. No, that came out wrong. We are not fat, but we have made loving and sharing campaigns easier.
Snap, which adds virtual objects to the real world for you to take cool pics with, now has face detection and can also identify other body parts like hands and legs. Snap places the virtual objects smartly, making your selfies much easier to capture. With this update we have brought animations and imagery to the world around you. We’re open to requests for what you’d want to see in your world. Write to us :)
You can now express yourself to your heart’s content, because we have added a user comments feature to Motion Prints and Snap objects.
We are fixing, no, scratch that, squashing bugs!
By Amar Sameeran
In developing Rev Eye we have tackled one of the open issues in AR: finding an AR portal. Various players in the market use indicators like Blip, Scan With Layar and other marker images. AR advertising remains a tricky problem that advertisers still haven’t figured out, because of the guessing game: they can’t be sure consumers will discover and scan the relevant content. We address this with the age-old design paradigm of content discovery: we use location to surface prominent augmented locations and media around you, and we make it easy to find what’s interesting and trending by compiling popular, recent and featured media lists.
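The discovery idea above can be sketched roughly as follows. This is a minimal illustration, not our production code: the field names, the 7-day "recent" window and the crude bounding-box filter are all assumptions made for the example (a real service would use proper geo-indexing).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Media:
    """Hypothetical record for one piece of augmented media."""
    name: str
    lat: float
    lon: float
    scans: int          # how often users have scanned it
    created: datetime
    featured: bool = False

def nearby(items, lat, lon, radius_deg=0.05):
    # Crude bounding-box filter around the user's location;
    # stands in for a real geo-indexed query.
    return [m for m in items
            if abs(m.lat - lat) < radius_deg and abs(m.lon - lon) < radius_deg]

def discovery_lists(items, lat, lon, now=None):
    """Compile the popular / recent / featured lists for media near the user."""
    now = now or datetime.utcnow()
    local = nearby(items, lat, lon)
    return {
        "popular": sorted(local, key=lambda m: m.scans, reverse=True),
        "recent": [m for m in local if now - m.created < timedelta(days=7)],
        "featured": [m for m in local if m.featured],
    }
```

The point of the sketch is that all three lists are filtered by location first, so the user only ever browses content they could actually go and scan.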
We use a hybrid approach that combines the best of on-device and cloud retrieval.
Our biggest challenge was overcoming the latency issues associated with the flaky connections that plague most mobile networks; we wanted user experiences and loading times to be consistent.
a) Smarter & Faster:
We use a device-based MVC approach where views coordinate with a controller that retrieves data from a local source rather than the cloud, providing a smooth, hassle-free experience.
The controller queries the cloud behind the scenes and updates the local source as and when needed, so that the local cache always stays in sync.
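That cache-first flow can be sketched like this. The class names (`LocalCache`, `Controller`) and the refresh-on-every-read policy are assumptions for the example, not our actual implementation; the point is that views are served from the device and the cloud is only touched in the background.

```python
import threading

class LocalCache:
    """Hypothetical on-device store; stands in for a SQLite/file cache."""
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def get(self, key):
        with self._lock:
            return self._data.get(key)

    def put(self, key, value):
        with self._lock:
            self._data[key] = value

class Controller:
    """Serves views from the local cache first and refreshes from the
    cloud in the background, so the cache stays in sync."""
    def __init__(self, cache, cloud_fetch):
        self.cache = cache
        self.cloud_fetch = cloud_fetch  # callable: key -> fresh value

    def get(self, key):
        cached = self.cache.get(key)
        # Kick off a background refresh either way; when a cached copy
        # exists, the view never waits on the network.
        t = threading.Thread(target=self._refresh, args=(key,), daemon=True)
        t.start()
        if cached is not None:
            return cached
        t.join()  # cold cache: block once for the first fetch
        return self.cache.get(key)

    def _refresh(self, key):
        self.cache.put(key, self.cloud_fetch(key))
```

On a warm cache the read returns immediately with the local copy while the network round trip happens off the critical path; only a completely cold key pays the network cost.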
b) Uber Fast AR Experiences:
We also use the same hybrid cache-based retrieval for our AR campaigns, which enables us to provide sub-20ms detection speeds ( B| )