Wednesday, April 16, 2014

Please Steal My Ideas #1: The Foodometer

Sadly, my better ideas often aren’t related to any projects I’m working on. Rather than let them go to waste, I figure it’s better to unleash them to the world in hopes that someone else might benefit. So, in what I might optimistically call the first in a whole series chronicling my minor epiphanies, I present giveaway idea #1: Yelp, please steal this idea for a foodometer.

First a question: what’s the difference between a photo of food and a photo of a restaurant?

Here’s why I care: when I’m browsing for restaurants in Yelp, one of the key things I’m looking for is the atmosphere of the place. The best way to gauge this is from a photo, but browsing through the user-submitted photos is frustrating, because 99% of them seem to be of food. I really want to be able to filter for views of the surroundings.

(The answer to the question, by the way, is gravity.)

From Yelp’s perspective, they’re not going to want to force the burden of classifying photos onto the users; it’s too much friction in the submission process. If only there were a way to categorize them automatically…

Which brings us to gravity. The difference between a photo of food and a photo of a restaurant is the orientation of the phone that took it, with respect to gravity. If the phone’s pitched downwards, it’s probably a photo of food. If it’s mostly upright, you’re seeing the restaurant. How can you tell the direction of gravity? Ask the accelerometers. All smartphones have them.
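
To make that concrete, here’s a rough sketch of how an iOS app could make the call from Core Motion’s gravity vector. This is my own illustration, not anything Yelp actually does, and the 0.5 cutoff (roughly halfway between flat and upright) is an arbitrary guess:

```swift
import CoreMotion

// Rough sketch: sample the gravity vector once and guess whether the
// camera is aimed down at a plate or out at the room.
enum PhotoSubject {
    case food          // phone pitched down toward the table
    case surroundings  // phone held roughly upright
}

let motionManager = CMMotionManager()

func classifyCameraPitch(completion: @escaping (PhotoSubject?) -> Void) {
    guard motionManager.isDeviceMotionAvailable else {
        completion(nil)
        return
    }
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let gravity = motion?.gravity else { return }
        motionManager.stopDeviceMotionUpdates()
        // gravity.z is close to -1 when the phone lies flat with the screen
        // facing up (back camera pointing straight down at the food), and
        // close to 0 when the phone is held upright facing the room.
        completion(abs(gravity.z) > 0.5 ? .food : .surroundings)
    }
}
```

An app would just record that reading at the moment the shutter fires and attach it to the upload.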

There is, sadly, one big caveat. It doesn’t look like accelerometer data is saved in the EXIF metadata along with the photo, at least not in a usable way. You can learn which way the camera was angled in order to rotate the image to the correct vertical, but there’s nothing finer-grained than that. For this technique to work, you’d need to take the photo within the app and have it record the accelerometer values itself. Still, the Yelp app already lets you take photos directly. So why not?
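
For reference, here’s roughly all the standard metadata offers: the coarse EXIF Orientation tag, a value from 1 to 8 describing how to rotate or mirror the image. A quick sketch using ImageIO (the function name and usage are just my own example):

```swift
import Foundation
import ImageIO

// Sketch of what photo metadata already gives you: the EXIF Orientation
// tag is enough to right the photo, but says nothing about the actual tilt.
func exifOrientation(ofImageAt url: URL) -> Int? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
              as? [CFString: Any] else {
        return nil
    }
    return properties[kCGImagePropertyOrientation] as? Int
}
```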