It’s All About the Details
Taking the Mystery Out of Sharpening and Noise Reduction
Learn how to use the latest software tools to get sharp, noise-free photographs
I get lots of questions about sharpening and noise reduction. That’s understandable, as these topics can seem shrouded in mystery, with cryptic terminology and conflicting opinions about how to get the best results.
In this webinar I’ll try to demystify this important topic, give you a clear understanding of how sharpening and noise reduction work, and show you the best ways to use the latest tools to get sharp, noise-free images. We’ll cover the robust sharpening and noise-reduction tools in Lightroom and Photoshop, but also explore and compare other tools like DxO PureRAW, Topaz Denoise, and Topaz Sharpen AI.
These are some of the topics we’ll cover:
- Sharpening workflow – input and output sharpening
- Different sharpening methods and how they work
- Finding the right settings for input sharpening in Lightroom and Camera Raw
- Refining output sharpening settings in Lightroom and Photoshop
- Rescuing blurry photographs with AI-powered tools
- Manual noise reduction tools and settings in Lightroom and Camera Raw
- AI-powered noise-reduction tools – comparing Adobe Denoise, Topaz Denoise, and DxO PureRAW
- Recommended workflow and settings for AI-powered tools
I’ve been making digital prints since the late ’90s. I’ve always tried to keep the image quality as high as possible, and make sharp, clean, noise-free prints that are also esthetically pleasing. During that time I’ve seen the software tools improve dramatically, allowing us to achieve amazing results that weren’t possible even a few years ago. But which tools should you focus on? And how do you get the best results from those tools? Join me for this webinar and learn how to get the highest-quality results from your photographs.
Click the link below to sign up!
It’s All About the Details, May 18th, 2026
$27
The virtual reality (VR) modes of The Photographer’s Ephemeris 3D and Planit Pro allow you to virtually place yourself on the ground in any place at any time, visualize how the scene will look, and see exactly where the sun, moon, or Milky Way will be in relation to the landscape. Here, the virtual-reality mode of The Photographer’s Ephemeris 3D (right, below) showed me the moon position and lighting for this scene of Half Dome from December 2024:
Here, the virtual-reality mode in Planit Pro (right, below) shows the position of the Milky Way above a Sierra peak on an August evening:
The virtual-reality feature of The Photographer’s Ephemeris 3D helped me plan this photo of a solar eclipse sequence from August 21st, 2017, in Idaho’s Sawtooth Mountains. While the peak (El Capitan) looks a bit truncated in the VR version (right, below), I could still see the sun’s path, and its position when fully eclipsed:
Augmented reality (AR) can be a fantastic tool for pre-visualizing images when you’re able to scout a location in person. Here I used PhotoPills’ Night AR mode (right, below) to see where the North Star would line up with this rock pinnacle in Death Valley, enabling me to visualize, set up, and compose this star-trail photo (left, below) before dark:
The Photographer’s Ephemeris 3D allows you to see how light will fall on the landscape. I used that feature to determine whether there was a time when the water surface of this alpine lake would be in the shade while the mountainside on the far shore would be in the sun. There was, late in the afternoon (right, below), and that allowed me to make this abstract telephoto composition of melting ice on the water, with colorful reflections of the mountainside lit by the late-afternoon sun: