>>3168947>>3168951Okay, so let's break it down like this: assume a nice lens can pass 24 stops of contrast before the shadows start to become contaminated with stray light, and that it's mounted on a camera that records just half that, 12 stops of DR. This is fine, because we don't care about the lower reaches of that range, much like we don't care about (or measure) the darkest stops a camera is technically capable of capturing, since they're entirely swamped by noise. However, the greater the difference between these two ranges, the more the captured range is protected from the effects of adverse lighting on the subject, because those effects are pushed below the visible threshold.
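Just to put numbers on that: one stop is a factor of 2 in light, so the gap between what the lens can deliver and what the sensor records translates into a multiplicative "safety margin" below recorded black. A quick sketch (the 24/12 stop figures are the assumptions from above, not measured values):

```python
# Illustrative arithmetic only; the stop counts are assumed, not measured.
# One photographic stop = a factor of 2 in light intensity.

def stops_to_ratio(stops: float) -> float:
    """Convert a range in stops to a linear contrast ratio."""
    return 2.0 ** stops

lens_stops = 24    # contrast the lens can hold before shadows wash out
camera_stops = 12  # DR the sensor actually records

headroom = lens_stops - camera_stops  # stops of protection below recorded black

print(f"lens ratio:   {stops_to_ratio(lens_stops):,.0f}:1")
print(f"camera ratio: {stops_to_ratio(camera_stops):,.0f}:1")
print(f"flare sits {headroom} stops ({stops_to_ratio(headroom):,.0f}x) "
      "below the darkest recorded tone")
```

So with this pairing, veiling flare would have to climb by a factor of ~4096 before it even reaches the bottom of what the sensor sees. Shrink the gap (better sensor, same lens) and that margin collapses.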
An interesting aspect to all of this is that the property is a product of both the lens and the medium: high-contrast/low-DR film and sensors can get away with the appearance of good micro-contrast even with lenses of a lower standard, while high-DR cameras like the Nikon D8xx or Sony A7R series are often remarked on as producing very bland images. This isn't an inherent fault in the sensor design itself; rather, the existing lenses no longer offer the same ratio of total deliverable contrast to the amount the camera is capable of seeing. Sure, they can provide enough for the image to seem "contrasty" and free of flare or veiling in a general sense, but not to the degree needed for micro-contrast to kick in.
So why do people often refer to that "famous" Leica contrast? Well, for one, Leicas were historically shot with high-contrast film, the CCD sensors they adopted come the digital age were quite high-contrast, and even their current CMOS sensors still only clock in at about 12 stops, vs. the 14~15 that high-end cameras can crank out. Even if their lenses are objectively very good at controlling light, they've had the "benefit" of a medium that's easily satiated by that level of performance.