The camera never lies. Except, of course, it does - and seemingly more often with each passing day.
In the age of the smartphone, on-the-fly digital edits to improve photos have become commonplace, from boosting colours to tweaking light levels.
Now, a new breed of smartphone tools powered by artificial intelligence (AI) is adding to the debate about what it means to photograph reality.
Google's latest smartphones, the Pixel 8 and Pixel 8 Pro, released last week, go a step further than devices from other companies: they use AI to help alter people's expressions in photographs.
It's an experience we've all had: one person in a group shot looks away from the camera or fails to smile. Google's phones can now search through your photos to mix and match expressions, using machine learning to swap in that person's smile from a different photo. Google calls it Best Take.
The devices also let users erase, move and resize unwanted elements in a photo - from people to buildings - "filling in" the space left behind with a tool called Magic Editor. This uses deep learning, in effect an artificial intelligence algorithm that works out what textures should fill the gap by analysing the surrounding pixels, drawing on knowledge gleaned from millions of other photos.
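To give a very rough sense of the "filling in" idea, here is a toy sketch that needs no machine learning at all: it fills a masked-out region by repeatedly averaging the colours of neighbouring pixels, so surrounding detail diffuses into the hole. Everything here (the function name, the approach) is illustrative only; Google's actual system is a far more sophisticated deep-learning model trained on vast photo collections, not a simple averaging loop.

```python
import numpy as np

def naive_inpaint(img: np.ndarray, mask: np.ndarray, iterations: int = 50) -> np.ndarray:
    """Toy 'fill in the gap' for a greyscale image.

    img  - 2D array of pixel values
    mask - boolean array, True where the pixel has been erased
    Repeatedly replaces each erased pixel with the average of its four
    neighbours, so surrounding texture diffuses into the hole. A real
    deep-learning editor instead *predicts* plausible content learned
    from millions of photos.
    """
    out = img.astype(float).copy()
    known = ~mask
    out[mask] = 0.0  # start the hole empty
    for _ in range(iterations):
        # Average each pixel's four axis-aligned neighbours.
        padded = np.pad(out, 1, mode="edge")
        avg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
               padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[mask] = avg[mask]    # update only the erased region
        out[known] = img[known]  # keep the known pixels fixed
    return out
```

On a flat or smoothly varying background this converges to a seamless fill; on complex textures it produces an obvious smudge, which is exactly the gap that learned models are meant to close.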
The pictures don't have to have been taken on the device, either. Using the Pixel 8 Pro, you can apply Magic Editor or Best Take to any picture in your Google Photos library.