In 2013, Craig Mod wrote a fascinating article in the New Yorker. I particularly liked his observation that digital cameras were a stopgap between the Leica M3 and networked lenses. Apple’s WWDC last week got me thinking further: the steps towards everyday augmented reality it featured bring us closer to a major change in how we photograph and associate data, linking the image sensor with web-based data and personal information.
At last we may have come full circle. Old family albums from my parents’ generation and earlier are meticulously sorted and organised, showing the locations and dates of the pictures, and often listing the names of relatives who would otherwise be unknown.
For the last couple of decades, digital photography has dominated, and only a tiny percentage of the trillion photos taken in the last year alone will have been printed. Many, of course, have been lost in the pile of failed hard drives, and those that remain will probably contain little information of help to future generations trawling through their ancestors’ pictures. This is due to change.
Technology now automatically stores the location of each photo and, increasingly, who we are, thanks to facial recognition. Craig talked about weather information, routes and “state of mind” drawn from other social posts; now let’s add in augmented reality capabilities, where our phone “knows” exactly what it is looking at and can layer data onto images seen through our devices’ cameras. We must now be close to amazing possibilities, with rich metadata attached to our image files and to live streams in real time.
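The location data mentioned above already lives inside the image file itself: EXIF stores GPS coordinates as degree/minute/second rationals alongside a hemisphere reference tag. As a minimal sketch of how that raw metadata becomes a usable map coordinate (the function name and example values here are illustrative, not from any particular library):

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS values (degrees, minutes, seconds stored as
    rationals) to a signed decimal coordinate. `ref` is the EXIF
    GPSLatitudeRef/GPSLongitudeRef tag: 'N'/'E' positive, 'S'/'W' negative."""
    value = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    return float(value) if ref in ("N", "E") else -float(value)

# Example: 51 deg 30' 26.0" N, roughly the latitude of central London.
lat = dms_to_decimal(51, 30, Fraction(260, 10), "N")
print(round(lat, 4))  # → 51.5072
```

Once the coordinate is decimal, it can be reverse-geocoded into a place name, which is exactly the kind of annotation those old family albums recorded by hand.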
The iPhone already creates albums automatically; soon it will be a reality to have our phones build amazing travel journals in the background, tagged, annotated and shared to our social networks without any intervention. We’ll have a missing couple of decades, but ultimately we will be as good as our parents were at making wonderful family albums and heirlooms.