More on geo-tagging photos with a time element


Some readers have written in with questions about my photo geotagging post from last week. One common question is whether the place name somehow ended up in the file's metadata. The answer is: I don't think so. I did an "export as original" on the photo in question, ran it through a bunch of EXIF dumper tools, and didn't find anything that looked like a place name.

The EXIF data looks like this:

Create Date         : 2014:06:16 13:02:15.202
Date/Time Original  : 2014:06:16 13:02:15.202
GPS Altitude        : 5.2 m Above Sea Level
GPS Latitude        : 37 deg 29' 6.13" N
GPS Longitude       : 122 deg 8' 53.30" W
Circle Of Confusion : 0.004 mm
Field Of View       : 57.2 deg
Focal Length        : 4.1 mm (35 mm equivalent: 33.0 mm)
GPS Position        : 37 deg 29' 6.13" N, 122 deg 8' 53.30" W
Hyperfocal Distance : 1.89 m
Light Value         : 15.4
Lens ID             : iPhone 5 back camera 4.12mm f/2.4
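For what it's worth, those GPS values are stored as degrees/minutes/seconds plus a hemisphere reference, so anything doing a name lookup has to convert them to decimal degrees first. A quick sketch of that conversion, using the coordinates from the dump above:

```python
# Convert EXIF-style degrees/minutes/seconds + hemisphere ref to the
# decimal degrees a reverse-geocoding lookup would actually use.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """South and West hemispheres come out negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

lat = dms_to_decimal(37, 29, 6.13, "N")
lon = dms_to_decimal(122, 8, 53.30, "W")
print(lat, lon)  # roughly 37.4850, -122.1481
```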

(Side note: "circle of confusion" is an interesting term, huh? Today I learned.)

Anyway, it's just a bunch of numbers, as you'd expect. Something in the Photos app on the Mac, and the equivalent thing on my phone, is translating those coordinates into a name.
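One way that translation could work (this is a guess, not anything I know about Apple's actual implementation) is a point-in-polygon test against a database of named regions. A minimal sketch with a standard ray-casting check and a completely made-up rectangle:

```python
# Guesswork sketch of a coordinates-to-name lookup. "Example Campus" and
# its rectangle are invented for illustration, not real map data.

def point_in_polygon(lat, lon, polygon):
    """Ray casting: toggle on each polygon edge the eastward ray crosses."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lat1 > lat) != (lat2 > lat):
            cross = (lon2 - lon1) * (lat - lat1) / (lat2 - lat1) + lon1
            if lon < cross:
                inside = not inside
    return inside

places = {
    "Example Campus": [(37.483, -122.152), (37.487, -122.152),
                       (37.487, -122.145), (37.483, -122.145)],
}

def name_for(lat, lon):
    return next((name for name, poly in places.items()
                 if point_in_polygon(lat, lon, poly)), None)

print(name_for(37.4850, -122.1481))  # "Example Campus"
```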

What's kind of nutty is that the same picture still shows "Facebook - Headquarters" when viewed on my phone. Really. Check it out:

iOS 15.something or other view

So, not only is there some mapping going on, but the phone and the computers (both of them) are looking at two different sources of data. I have to assume the phone has the old name cached, while the Macs flushed theirs and picked up the new value somewhere along the way.

Or, who knows, maybe Apple is running multiple backends with disjoint geographical data sources. It wouldn't be the first time they had terrible map data, right?

So here's another fun problem: how do you do a "fourth dimensional" geo-tag (that is, one with a time element) without revealing all of the places a person has been and when they were there? In other words, how do you do that without compromising privacy?

The best I can figure so far is that you'd send back a list of ALL of the place names for a given area and let the device figure out which times apply to which photos, discarding the rest. It should probably be "zoomed out" pretty far, too, so that only very coarse bounds ever reach the server: just return all of the mappings for all of the polygons (or whatever) inside some giant swath of space, and do the nitty-gritty matching on the device.
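To make that concrete, here's a sketch of the "zoomed out" approach. Everything here is invented for illustration (the tile size, the names, the years); the point is just that the server only ever sees a coarse tile, while the exact coordinates and the photo's timestamp stay on the device:

```python
# Coarse-tile lookup sketch. All names, tile sizes, and dates are made up.

TILE_DEG = 1.0  # very coarse: about 100 km per cell at mid latitudes

def tile_for(lat, lon):
    """Snap a precise coordinate to a coarse tile before anything leaves the device."""
    return (int(lat // TILE_DEG), int(lon // TILE_DEG))

# Pretend server database keyed by coarse tile. Each entry carries a bounding
# box plus the years the name was valid, so renames resolve by photo date.
FAKE_SERVER_DB = {
    (37, -123): [
        ("Old Company HQ", (37.483, 37.487, -122.152, -122.145), 1990, 2011),
        ("New Company HQ", (37.483, 37.487, -122.152, -122.145), 2011, 9999),
    ],
}

def places_for_tile(tile):
    # The server sees only the tile, never the exact point or the timestamp.
    return FAKE_SERVER_DB.get(tile, [])

def tag_photo(lat, lon, year):
    # On-device: fetch everything for the tile, match locally, discard the rest.
    for name, (lat0, lat1, lon0, lon1), start, end in places_for_tile(tile_for(lat, lon)):
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1 and start <= year < end:
            return name
    return None

print(tag_photo(37.4850, -122.1481, 2014))  # "New Company HQ"
```

The trade-off is bandwidth for privacy: the device downloads far more name data than it needs, but the server learns only that someone is somewhere inside a huge cell.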

Otherwise, hey, it becomes pretty easy to track people after the fact.