So NASA released this amazing picture of a spacecraft actually landing on Mars:
But the first thing I, as a photographer, think is “Damn. That’s not sharp”. Click to view the image at full size and you will see that it is indeed not sharp. Hey, it was shot from a spacecraft dangling over Mars, so this is not criticism!
But still… I can make that better.
Using the AI sharpening software I use, with only very little effort we get this sharpened image:
Again, click and view full screen to see the sharpness. Amazing, no?
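For the curious: the AI sharpeners are proprietary, but the classical technique they build on is unsharp masking – blur a copy of the image, subtract it to isolate the fine detail, and add that detail back, amplified. A minimal numpy sketch of the idea (a crude box blur stands in for the usual Gaussian; all names here are illustrative, not any particular product's code):

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    """Classical sharpening: subtract a blurred copy from the original
    and add the scaled difference (the "mask") back."""
    # Crude 5-point box blur stands in for the usual Gaussian blur.
    padded = np.pad(img, 1, mode="edge")
    blurred = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
               padded[1:-1, :-2] + padded[1:-1, 2:] +
               padded[1:-1, 1:-1]) / 5.0
    detail = img - blurred            # the high-frequency content
    return np.clip(img + amount * detail, 0, 255)

# A soft step edge, like a slightly blurry boundary in a photo.
step = np.tile(np.array([50.0, 50.0, 200.0, 200.0]), (4, 1))
sharpened = unsharp_mask(step, amount=1.5)
# Contrast across the edge increases: the dark side darkens,
# the bright side brightens.
```

The AI tools go well beyond this – they hallucinate plausible detail rather than just boosting edge contrast – but this is the traditional starting point.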
Now. Is this unethical? Am I altering, doctoring even, a NASA image?
There are two ways you can materially change a photo:
Manipulating images to make them art is OK if you say it’s art.
Distorting for nefarious purposes (like to “prove” that the earth is flat) is not OK.
But this is neither art nor distortion. This is simply bringing back a clearer picture of the reality that is actually there. Just like correcting the white balance would be.
So I think we’re good here. Enjoy the sharpened image.
A short note today about leading lines. We use those to lead the viewer into a photo and call attention to the subject. You can use a wide-angle lens, and you can look for lines naturally occurring in the environment. Like the perspective lines here, in the parking lot at Place d’Orléans mall, that seem to point to Rose:
As soon as I saw those “Alhambra-like” columns, I knew we had a photo. It’s all about opening your eyes.
This, incidentally, is one of those images that can also work very well in Black and White – here, with the super-cool grainy Tri-X film look – and you really need to see it full size to judge:
In this particular case I am not sure which one I prefer – I love both. What do you think? Let me know!
When we fix images, as we do daily in the store (www.michaelwillemsphoto.com), sometimes it is easy – and sometimes we need a lot more effort. Like in this before/after example:
White balance isn’t enough – not even close. For these colours I needed Lightroom’s white balance, extensive HSL work, and especially the excellent new “Color Grading” tool. If you haven’t needed it yet – you will. And then, quite often, a coloured local adjustment brush to add some skin colour – this is an art as much as it is a craft and a science.
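For readers who like to peek under the hood: the simplest automatic white-balance idea, and a rough cousin of what a neutral-grey eyedropper does, is the classical “gray-world” assumption – the scene should average out to neutral grey, so each channel is scaled until it does. A minimal numpy sketch (illustrative only; this is not what Lightroom actually runs):

```python
import numpy as np

def gray_world_balance(img):
    """Scale each RGB channel so its mean equals the overall mean,
    assuming the scene averages out to neutral grey."""
    img = img.astype(float)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(img * gains, 0, 255)

# A tiny 1x2 "photo" with a heavy warm (orange) cast.
warm = np.array([[[200.0, 150.0, 100.0],
                  [180.0, 130.0,  80.0]]])
balanced = gray_world_balance(warm)
# After balancing, the three channel means are equal: neutral on average.
```

Real restorations need far more than this – faded dyes shift unevenly, which is exactly why HSL and Color Grading earn their keep – but this shows why a single global correction only gets you so far.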
But then there’s also Photoshop to remove the small imperfections, and an AI-based de-noise tool to lower noise. (“AI” stands for “Artificial Intelligence” – it’s not the name “AL”…)
In the end, it is always worth it. Memories preserved. Because when the photos fade, the memory itself fades.
Google DeepDream is the neural network technology Google developed to see what a deep neural network is seeing when it looks at a given image. Now, the algorithm has become a new form of psychedelic and abstract art.
Often, the “dreams” are pretty disturbing. Like this well-known “Dog Spaghetti” image:
Scary, but then… that is the stuff dreams are made of, right?
And here’s one of mine – two images together, tulips and a frog, and how the AI, after crunching away at them for a while, makes sense of them:
As you may know, one of the things I sell in the store is Wall Art. Photos in many sizes that look great on walls, in other words. You can see some examples by clicking on https://michaelwillemsphoto.com/wall-art/. Here’s one:
And I recommend you give serious consideration to doing this as well – hang prints on your wall. Your own prints, or my prints, or anyone else’s prints: as long as you have decorative visual art that is on walls, not just on Facebook walls.
Here’s another example:
If you need help, ask. And you may, because sorting out print quality, paper types, sizes, frames, and so on can be a bit of a struggle. But we’re here to help.
One tip: as the two photos here show: don’t forget Black and White!