This update on making photos whilst walking with Maya along the coast starts to explore ways of augmenting the still photography. Large format art photography has been my way of creating artwork from these poodlewalks, which are embedded in a particular place.
Maya is now between five and six months old and she is quite comfortable walking for an hour or so with me along the coastal rocks on the southern Fleurieu Peninsula. As we are on the cusp of winter in South Australia, there is early morning cloud cover, the showers sweeping in from the south-west are more frequent, and the coastal winds are much stronger.
Whilst I’ve been on these early morning walks I thought that it would be interesting to find a way to show what Maya is hearing, smelling and seeing whilst she is with me. I have no idea how to do this, but I started wondering how Augmented Reality (AR) could add to these kinds of walks; or alternatively what could be added using generative AI, whether texts written by ChatGPT or an image made with Midjourney.
I quickly realized that generative AI is a step too far for me, as is the version of AR that overlays digital data on top of the real world consumed through a camera-and-sensor-laden headset. There is little point in the latter, as few people have the required equipment, which mediates the entire world through screens placed centimeters from the user’s corneas, making the whole world a screen.
However, there is a space for something along the lines of supplementing, augmenting, adjusting, or overlaying reality; such as supplementing the still photography with a video. A video offers sound and movement that would augment the frozen moment of the still photograph. I need to do more video, as I am not sure about podcasts or films, as is done with MAP. Nor do I have the connections to collaborate with a writer, as in the SALT project, which was commissioned by Art Walk Projects.