When shooting with a given lens, how do you know which aperture is optimal for your creative vision?

Let’s say you want shallow depth of field. Is there a place where you can learn that the sweet spot between DOF and sharpness for a given lens model is when it’s stopped down by 1/2 stop?

Conversely, where would you go to learn that a given lens model goes soft after f/11?

Sure, you could create a test rig, take tons of pictures with each lens at every aperture, and inspect them with a magnifying glass, but that seems awfully inefficient.

Thanks and I look forward to everyone’s guidance and insights…

  • ApatheticAbsurdist@alien.topB · 1 year ago
    Old school rule of thumb (not entirely true, but a starting point): 2 stops down from wide open (e.g. on an f/1.4 lens, f/2.8 is 2 stops down) is often the “sweet spot” of a lens. More recent lenses have gotten a bit better and may only need a stop. If you’re looking for maximum sharpness, somewhere between f/5.6 and f/11 is going to be it, but the top end doesn’t depend on the lens; it depends on the sensor. The old saying is “f/8 and be there,” and in most cases that works. But with many newer lenses f/5.6 is just as sharp and gives you a slightly shallower depth of field if that’s what you’re after (and if you’re pushing 40 MP on APS-C, you might be better off at f/5.6 because of the next factor).
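
    To make the stop arithmetic concrete, here’s a minimal sketch (Python; the helper name is just for illustration): each full stop multiplies the f-number by √2, so 2 stops down from f/1.4 is 1.4 × 2 ≈ f/2.8.

    ```python
    import math

    def stopped_down(wide_open_f, stops):
        """F-number after stopping down by a given number of stops (each stop is x sqrt(2))."""
        return wide_open_f * math.sqrt(2) ** stops

    print(round(stopped_down(1.4, 2), 1))  # 2.8 -- the classic "2 stops down" sweet spot
    print(round(stopped_down(2.8, 1), 1))  # 4.0 -- a newer lens that may only need 1 stop
    ```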

    > Conversely, where would you go to learn that a given lens model goes soft after f/11?

    The lens doesn’t go soft; diffraction grows as you stop down. There is an “Airy disk” of blur that gets bigger the smaller the aperture gets. It’s a balance: most lenses are soft wide open, and stopping down a little improves their optics; stopping down further gives you more depth of field and more things in focus, but as you do that, the diffraction blur grows, and it grows purely based on the aperture, no matter what lens you use. The catch is that if the blur is smaller than a pixel on your sensor, you’ll never notice it; once it grows to more than a couple of pixels wide, you will. So on sensors with smaller pixels (small sensors with high MP) you’ll notice diffraction earlier, maybe at f/8; I’ve had some where I noticed it at anything smaller than f/6.3. On a larger sensor with fewer megapixels you’ll be able to stop down to f/11.
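
    A back-of-the-envelope way to see this (a rough sketch assuming green light at ~550 nm and the common 2.44 · λ · N approximation for the Airy disk diameter; the sensor figures are illustrative, not measurements of any specific camera):

    ```python
    # Rough sketch: Airy disk diameter vs. pixel pitch (assumptions noted above).
    WAVELENGTH_UM = 0.55  # green light, ~550 nm

    def airy_disk_um(f_number, wavelength_um=WAVELENGTH_UM):
        """Approximate Airy disk diameter in micrometres: 2.44 * wavelength * f-number."""
        return 2.44 * wavelength_um * f_number

    def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
        """Pixel pitch in micrometres from sensor width and horizontal resolution."""
        return sensor_width_mm * 1000 / horizontal_pixels

    # Illustrative sensors (approximate figures):
    aps_c_40mp = pixel_pitch_um(23.5, 7700)  # ~3.0 um pixels
    ff_24mp = pixel_pitch_um(36.0, 6000)     # ~6.0 um pixels

    for n in (5.6, 8, 11):
        disk = airy_disk_um(n)
        print(f"f/{n}: Airy disk ~{disk:.1f} um, "
              f"~{disk / aps_c_40mp:.1f} px on 40MP APS-C, "
              f"~{disk / ff_24mp:.1f} px on 24MP full frame")
    ```

    Once the disk is a couple of pixels wide, diffraction starts to show, which is why a high-MP crop sensor may hit it around f/8 while a 24 MP full-frame sensor can hold out to around f/11.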

    When you get a new lens, you need to take a bunch of photos to learn its qualities. You can do it scientifically or you can just go shoot and experiment.

    Generally if I get a new lens I go out take some shots wide open, then 2/3 of a stop down then 1 or 2 stops down from wide open, maybe some at f/5.6 and some at f/8 just to see what it’s like. I have a 50mm f/1.4 that is a bit soft at f/1.4 (if it’s really low light it’s not the end of the world, but if I really want something tack-sharp f/1.4 isn’t ideal) after using it a bit I’ve come to feel that if I stop down to f/2.2 it will be much sharper. But I had to do it for myself over time started at f/2.8 and it worked well but over time took some shots where I’d push it a little… f/2.0 one time but it was a little soft, f/2.5 was still decently sharp, eventually I came to realize intuitively that f/2.2 was the widest I wanted to go unless I was really pushing it.