Yes, we do see "better" in shadows. But if the image reproduction were truly faithful, we would see "better" in those shadows, too. The problem is that the reproduction is not truly faithful, for many reasons. The gamma curve is one aspect of that failure, and gamma correction is applied to compensate for that error, which is well known to the technical people.
But for reasons of history, the way it gets applied is not convenient. In television, it was the display device (the CRT) that had the error, but the correction was applied at the source (the camera's control circuits). That means the signals, and their numeric values, everywhere in between do not have a linear relationship to the displayed levels. This started in television systems and persisted into computer systems, which at first used televisions as display devices. Image files then became part of that non-linear "signal", as did digital interconnects like DVI.
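A minimal sketch of that arrangement, in Python, assuming a plain power law with the commonly quoted exponent of 2.2 (real systems varied):

```python
# The TV-era arrangement: the camera applies the inverse power curve,
# the CRT's physics applies the power curve, and everything in between
# (signals, files, links) carries the non-linear values.

def camera_encode(linear, gamma=2.2):
    """Pre-correction applied at the source, as TV cameras did."""
    return linear ** (1.0 / gamma)

def crt_display(signal, gamma=2.2):
    """The CRT electron gun's response is roughly a power law."""
    return signal ** gamma

scene = 0.18                     # mid-grey scene luminance, 0..1 scale
signal = camera_encode(scene)    # ~0.46: the value stored and transmitted
shown = crt_display(signal)      # ~0.18: linearity restored at the screen
print(signal, shown)
```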
To make things worse, CRTs did not all have the same curve, and different compensation curves were, and still are, in use.
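One well-known example: the sRGB curve that most image files carry today is piecewise, with a linear toe near black and a 2.4 exponent above it, which is close to, but not the same as, a plain 1/2.2 power curve. A quick comparison:

```python
# Two compensation curves in common use, showing that they disagree
# most in the shadows.

def srgb_encode(linear):
    # The piecewise sRGB transfer function.
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055

def simple_encode(linear, gamma=2.2):
    # A plain power-law compensation.
    return linear ** (1.0 / gamma)

for v in (0.001, 0.01, 0.18, 0.5):
    print(v, round(srgb_encode(v), 4), round(simple_encode(v), 4))
```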
Fortunately, RAW files store the linear values. Now if only we could push linear image files further along toward the destination. I know PNG supports this (set the gamma attribute to 1.0 when storing linear pixels). Maybe TIFF can do this, too.
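As an illustration, here is a sketch of storing linear pixels in a PNG with the gamma attribute set to 1.0, using Pillow (whose PNG writer takes a gamma keyword and emits the gAMA chunk). The 8-bit ramp is just a stand-in; 8 bits is really too coarse for linear data, where 16-bit files preserve shadows much better:

```python
from PIL import Image

# A hypothetical 8-bit linear-light ramp, just to have something to save.
ramp = Image.new("L", (256, 32))
ramp.putdata([x % 256 for x in range(256 * 32)])

# gamma=1.0 writes a gAMA chunk declaring the pixels are linear.
ramp.save("linear_ramp.png", gamma=1.0)

# A conforming reader can pick the declaration back up.
img = Image.open("linear_ramp.png")
print(img.info.get("gamma"))  # -> 1.0
```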
But we still have a world of BMP, GIF, JPEG, PNG, and TIFF images holding non-linear, gamma-corrected values that were meant to correct the effects of a CRT, which fewer and fewer people are using. LCD displays have characteristics different from a gamma curve. They are not linear either, but the gamma correction does not match them. They have to do a double conversion: undo the gamma correction, then apply a correction for their own response.
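A sketch of that double conversion, with an invented smoothstep curve standing in for the panel's native response (real controllers use measured per-panel lookup tables, not a formula):

```python
# Step 1: undo the CRT-era encoding to get back to linear light.
def decode_gamma(signal, gamma=2.2):
    return signal ** gamma

# Invented stand-in for a liquid-crystal cell's native drive-to-light
# response: monotone, S-shaped, and not a power law.
def cell_response(drive):
    return drive * drive * (3.0 - 2.0 * drive)

# Step 2: invert the native response with a lookup table, the way a
# display controller would, to find the drive level for a linear target.
N = 1024
table = [cell_response(i / (N - 1)) for i in range(N)]

def drive_for(linear):
    lo, hi = 0, N - 1
    while hi - lo > 1:                 # binary search the monotone table
        mid = (lo + hi) // 2
        if table[mid] < linear:
            lo = mid
        else:
            hi = mid
    span = table[hi] - table[lo] or 1.0
    return (lo + (linear - table[lo]) / span) / (N - 1)

for s in (0.25, 0.5, 0.75):            # incoming gamma-encoded values
    lin = decode_gamma(s)
    print(s, round(lin, 3), round(drive_for(lin), 3))
```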