Get ready to get your nerd on.
This is another of the many HDR compression algorithms that use clever math to control compression and attempt to minimize artifacts, such as halos. Here is the developer page:
Compressing and Companding High Dynamic Range Images with Subband Architectures
Yuanzhen Li, Lavanya Sharan, & Edward H. Adelson
Dept. of Brain and Cognitive Sciences, and Computer Science and Artificial Intelligence Laboratory
Massachusetts Institute of Technology, Cambridge, MA
http://www.mit.edu/~yzli/hdr_companding.htm
What is cool about this technique is that the developers have made the code freely available to any joe like me who wants to experiment with it. The code is written to run in a Matlab-like environment (you can use Matlab, or its open source, free cousin Octave).
The workflow I used is convoluted, but I warned you at the beginning of the thread....
I shot 9 to 11 exposures for each scene. That many is overkill, but it is easy to set up on my Promote Control, and it was about 25°F out at the time, so I did not feel like editing the settings; in fact the LCD was slow to update, just to give you an idea of the shooting conditions.
I used three of the exposures and fed them to Zero Noise. From ZN I got a 16-bit-per-channel TIFF.
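For anyone curious what Zero Noise is doing under the hood: the basic idea is to take the brightest exposure (cleanest shadows) and fall back to a scaled darker exposure wherever the bright frame has clipped. The sketch below is my own toy illustration of that idea, not ZN's actual code; the clip threshold and the assumption of linear-light values are mine.

```python
def merge_two(dark, bright, stops, clip=0.95):
    """Toy two-exposure merge in the spirit of Zero Noise.

    `dark` and `bright` are flat lists of linear-light values, with
    `bright` exposed `stops` stops above `dark`. Where the bright frame
    is clipped (>= `clip`), substitute the dark frame scaled up to match;
    elsewhere keep the bright frame, which has the least shadow noise.
    """
    gain = 2.0 ** stops  # exposure ratio between the two frames
    return [d * gain if b >= clip else b for d, b in zip(dark, bright)]
```

A real merge would also blend smoothly across the clip boundary instead of switching hard, but this shows why the merged result needs a high-bit-depth or float container.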
The compression code requires the portable float map (PFM) file format, so I took the TIFF into PS, converted it to 32 bits per channel, and saved it as a float map.
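The float map container itself is dead simple, which is why the MIT code uses it. As an illustration (not part of their code), here is a minimal PFM writer in pure Python; the function name and the flat-list pixel layout are my own assumptions.

```python
import struct

def write_pfm(path, pixels, width, height, little_endian=True):
    """Write an RGB float image as a PFM (portable float map) file.

    `pixels` is a flat list of floats, 3 per pixel, in top-to-bottom
    row order. PFM stores rows bottom-to-top, so rows are flipped on
    write, and the sign of the scale line encodes the byte order.
    """
    scale = -1.0 if little_endian else 1.0
    fmt = "<" if little_endian else ">"
    with open(path, "wb") as f:
        f.write(b"PF\n")                        # "PF" = color, "Pf" = grayscale
        f.write(f"{width} {height}\n".encode())
        f.write(f"{scale}\n".encode())
        for row in range(height - 1, -1, -1):   # bottom row first
            start = row * width * 3
            data = pixels[start:start + width * 3]
            f.write(struct.pack(fmt + "f" * len(data), *data))
```

Note there is no gamma and no tone curve in the file at all: it is raw linear floats after a three-line text header, which is exactly what a subband algorithm wants to operate on.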
Then you run the algorithm. Out pops an LDR image that has been compressed. The output from the default settings is a little too saturated; you can control the amount of desaturation by passing non-default parameters, but I am not there yet.
So, back in PS I desaturated the image and did some final edits. Here are the results. I tried a few different kinds of scenes and found the halo artifacts to be minimal, although there are some areas where they are obvious. Again, once I try adjusting the parameters away from the defaults, maybe it can be tuned for the image content.
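The desaturation step is just pulling each pixel toward its own luminance, which you can do in PS with a Hue/Saturation layer or numerically. A quick sketch of the math, assuming Rec. 709 luma weights (my choice, not anything prescribed by the MIT code):

```python
def desaturate(rgb, amount):
    """Pull an RGB triple toward its luminance by `amount` in [0, 1].

    amount = 0 leaves the color untouched; amount = 1 yields pure gray.
    Luminance is computed with Rec. 709 luma weights.
    """
    r, g, b = rgb
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(c + amount * (y - c) for c in (r, g, b))
```

Somewhere around amount = 0.2 to 0.3 undid the oversaturation for me, but that is eyeballed per image, not a magic number.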
/nerdage.
kirk
Backlit tree
Statue in deep shade
Cabin