PDA

View the full version: Noise reduction in perfectRAW



jdc
13/12/2009, 10:03
Manuel, the results are outstanding, better than mine; in particular, the clusters that appear are less colorful (I do not think that is due directly to the chroma processing).

Again, congratulations on having succeeded in synthesizing the team's work, and of course on your personal contribution.

:):):):xxlmas::xxlmas::xxlmas:

Congratulations

jdc
14/12/2009, 15:43
I made some changes to my code.

1) noise reduction before interpolation: minimal modification but important.

2) ability to process low-frequency noise in multiple passes (1 or 2...), adapting Manuel's procedure, including intervening on the "edge" and the "threshold" (for very noisy images)

3) slight change in the treatment of chromatic noise, adjusting the contrast and saturation (chromaticity) as a function of the previous treatment.

In the first case I do not increase the contrast (in Lab mode, with an S-curve that adapts to shadows and highlights), but I increase the chromaticity (Lch) uniformly by 5% for dull, pastel and saturated colors (with gamut control, of course).

In the second case (strong Gaussian treatment) I increase the contrast by 3%, and the chromaticity increase varies from 7% for dull and pastel colors to 5% for saturated colors.

4) Manuel's chromaticity-processing algorithm, based on the mean, is slightly better than the one I use (with Lassus), based on the median, but it is considerably slower. I implemented a procedure that, given the coordinates of an area, activates Manuel's procedure after the median (which in a GUI would amount to local editing)... I applied this patch precisely to the "Samuel Smith" bottle, over 300x300 pixels.

The total processing time on my old machine (3 years old) is about 1 min 40 sec (I have not added any sharpening or median).

The denoised image is better than before, but I think it does not quite reach the quality of Manuel's image. Note that my image is generally more contrasted, because the levels are adjusted automatically.

I want to clarify one important point about colorimetry. The colors we see on screen are limited by the screen's gamut, which is usually sRGB. But the colors of a modern camera go far beyond sRGB and are fairly close to ProPhoto (or WideGamut).

When you "clip" an image in RGB mode, you completely suppress certain colors (you can't see them in sRGB, but you would on an AdobeRGB or WideGamut device). Similarly, when you clip Lab values in ProPhoto, it is less significant, but colors are nevertheless removed.

When one works on the colors (chromatic noise, contrast, saturation, sharpening, shadows, ...), it is very easy to end up outside ProPhoto. The procedure I use, which works with relative colorimetry, brings the out-of-gamut colors back into the gamut while keeping the hue.
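The relative-colorimetric idea described above can be sketched as follows, assuming an sRGB target gamut; this is my own illustration (lab2rgb/inGamut/clampChroma are hypothetical names, not the actual perfectRAW code). An out-of-gamut color is pulled straight toward the neutral axis in Lch, which leaves L and the hue angle untouched:

```cpp
#include <cassert>
#include <cmath>

// Lab (D65 white) -> linear sRGB, using the standard CIE formulas.
static void lab2rgb(double L, double a, double b, double rgb[3]) {
    auto finv = [](double t) {
        return t > 6.0 / 29.0 ? t * t * t
                              : 3.0 * (6.0 / 29.0) * (6.0 / 29.0) * (t - 4.0 / 29.0);
    };
    double fy = (L + 16.0) / 116.0;
    double X = 0.95047 * finv(fy + a / 500.0);
    double Y = 1.00000 * finv(fy);
    double Z = 1.08883 * finv(fy - b / 200.0);
    rgb[0] =  3.2406 * X - 1.5372 * Y - 0.4986 * Z;
    rgb[1] = -0.9689 * X + 1.8758 * Y + 0.0415 * Z;
    rgb[2] =  0.0557 * X - 0.2040 * Y + 1.0570 * Z;
}

// A color is inside the sRGB gamut if all linear components are in [0, 1].
static bool inGamut(double L, double a, double b) {
    double rgb[3];
    lab2rgb(L, a, b, rgb);
    for (int i = 0; i < 3; ++i)
        if (rgb[i] < 0.0 || rgb[i] > 1.0) return false;
    return true;
}

// Bisect on a chroma scale factor: a and b are scaled by the same amount,
// so hue atan2(b, a) and lightness L are preserved exactly.
void clampChroma(double L, double& a, double& b) {
    if (inGamut(L, a, b)) return;
    double lo = 0.0, hi = 1.0; // lo is always a known in-gamut scale
    for (int it = 0; it < 30; ++it) {
        double mid = 0.5 * (lo + hi);
        (inGamut(L, a * mid, b * mid) ? lo : hi) = mid;
    }
    a *= lo; b *= lo;
}
```

The bisection is crude but robust; a production version would invert the gamut boundary analytically per hue slice instead of probing it.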

Best regards
:)

http://img199.imageshack.us/img199/7482/d70025614dec.jpg

jdc
15/12/2009, 13:39
I tried to optimize the processing speed by working on:
* the size of the medians (luminance and chrominance)
* the Gaussian noise treatment algorithm, reducing the "tile" for the second pass

I also wanted to validate, on the D3x image (25 Mpix): a) the quality of the treatment, b) the memory usage (before "out of memory"), c) the time spent.

For the D700 image at ISO 25600, I use a 9x9 median (chroma) and 2 Gaussian luminance passes with a reduction of 1 "TILE"; I get a processing time, with "JDDFD" as interpolation but without sharpening, of 1 min 30 sec. The image quality is comparable to that posted here. If I use a 7x7 median, the image is less good but acceptable, with a time of 1 min 15 sec.

I tried "constant-time median filtering" (ctmf()), which works in 8 bits. Certainly it's fast, about 5 seconds for any radius, but the quality is not there...

For the D3x image at ISO 6400, I use a 7x7 median (chroma) and 1 Gaussian luminance pass. The time is 2 min 40 sec, and it reaches the memory limit on my machine. I attach the D3x image at ISO 6400.

Best regards

http://img686.imageshack.us/img686/9016/d3x6415dec09.jpg

ejmartin
15/12/2009, 16:48
I tried "constant-time median filtering" (ctmf()), which works in 8 bits. Certainly it's fast, about 5 seconds for any radius, but the quality is not there...



Are you working in the gamma-corrected output space, or with linear gamma?

The dynamic range of the D3x at ISO 6400 is less than eight stops, so the use of 8-bit data should not have a dramatic effect on processing, due to the presence of noise which dithers the output. This is especially so if you work in gamma=2.0 space, where apart from deep shadows the spacing of the tonal levels is approximately in fixed units of photon shot noise. However, things may be different for median filtering, which tends to erase the dithering effect of the noise. Can you show an example of what happens when you use 8-bit data? A crop of a problem area will do, rather than a large image.
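Emil's tonal-spacing argument can be checked with a back-of-the-envelope calculation (my own illustration; the full-well figure of 65536 electrons is an assumed number, not a D3x measurement). With the encoding v = Vmax*sqrt(S/Smax), one code step corresponds to a signal interval of Smax*(2v+1)/Vmax^2, which, measured in units of the shot noise sqrt(S), is nearly constant outside deep shadows:

```cpp
#include <cassert>
#include <cmath>

// Size of one code-value step of a gamma = 2.0 encoding, expressed in units
// of the photon shot noise sqrt(S) at that signal level. v > 0 is assumed.
double stepOverShotNoise(int v, double Smax, double Vmax) {
    double S0 = Smax * (v / Vmax) * (v / Vmax);             // signal at code v
    double S1 = Smax * ((v + 1) / Vmax) * ((v + 1) / Vmax); // signal at code v+1
    return (S1 - S0) / std::sqrt(S0); // quantization step in shot-noise units
}
```

For an assumed 65536-electron full well and 8-bit output, the ratio hovers near 2 across the midtones and highlights, so the noise itself dithers the quantization; only in deep shadows does the step grow relative to the noise.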

jdc
15/12/2009, 17:30
Hello Emil.

I really tried ctmf() in every direction, with many trials, using only ctmf() or a mixture of function calls like quicksort (fq_sort())... In all cases, the image contains large artifacts.

Of course all the work is done in linear mode before the RGB conversion, so no gamma. In fact ctmf() works with "unsigned char", while the function implemented by Lassus works in the real interval -65535 to +65535.

I made a transformation so that the (float) variables lie in the interval [0..256]. I completely uninstalled this function and the calling code, so I have no pictures, but believe me: at first I shouted "waooh" at the 4 to 6 seconds needed for a 5x5 or 20x20 median, but looking at the picture, especially the D700 image at ISO 25600, there are large artifacts with large color drifts, e.g. clusters of 500 or 1000 pixels, mostly in the shadows but also present elsewhere, with shifted colors, flat areas... ugly.

In less noisy images, the result is watchable from a distance, but up close... so-so at best.

The author, Simon Perreault, says on his website that it is necessary to create another process for 16-bit images. I tried, but my skills are insufficient; I can get 16-bit images, but they are disastrous.

:)
_______________________________________________
_______________________________________________
I forgot to mention that in areas without artifacts, the denoising is very average, roughly like a 3x3 or 5x5 median, meaning that much of the chroma noise has not disappeared.
:o
_______________________________________________
_______________________________________________
After many difficulties (I'm not a programmer ...), I managed to implement OpenMP and do the sorting with "Parallel programming".

The image is similar - no differences.

The sorting time on my machine, for a D700 file at ISO 25600, sorting the two (float) data arrays (Lab channels 'a' and 'b') with a 9x9 median, dropped from 30 sec to 17 sec.

The total time is reduced to 1min 20 sec.

Best regards.:)

ejmartin
16/12/2009, 19:06
Hello Emil.

I really tried ctmf() in every direction, with many trials, using only ctmf() or a mixture of function calls like quicksort (fq_sort())... In all cases, the image contains large artifacts.

Of course all the work is done in linear mode before the RGB conversion, so no gamma. In fact ctmf() works with "unsigned char", while the function implemented by Lassus works in the real interval -65535 to +65535.

I made a transformation so that the (float) variables lie in the interval [0..256]. I completely uninstalled this function and the calling code, so I have no pictures, but believe me: at first I shouted "waooh" at the 4 to 6 seconds needed for a 5x5 or 20x20 median, but looking at the picture, especially the D700 image at ISO 25600, there are large artifacts with large color drifts, e.g. clusters of 500 or 1000 pixels, mostly in the shadows but also present elsewhere, with shifted colors, flat areas... ugly.

In less noisy images, the result is watchable from a distance, but up close... so-so at best.

The author, Simon Perreault, says on his website that it is necessary to create another process for 16-bit images. I tried, but my skills are insufficient; I can get 16-bit images, but they are disastrous.
_______________________________________________
_______________________________________________


Have you considered a fast bilateral filter rather than a fast median filter? The bilateral filter will not flatten the image as much when the window size is large. There are by now relatively fast versions of this filter which you might find helpful:

For an introduction, see
A Gentle Introduction to Bilateral Filtering and its Applications (http://people.csail.mit.edu/sparis/bf_course/)
with code:
Fast Bilateral Filter (http://people.csail.mit.edu/sparis/bf/)

and another variant:
http://www.merl.com/reports/docs/TR2008-030.pdf
with code:
Download | Imaging, Multimedia and Graphics (http://img.cs.usask.ca/img/index.php?page=download)

and another, without code:
Shell & Slate - Fast Median and Bilateral Filtering (http://www.shellandslate.com/fastmedian.html)

Apparently Photoshop's Surface Blur filter is a version of the bilateral filter, using an algorithm that is much slower for large windows.

jdc
17/12/2009, 10:58
Emil, thank you very much for these links and code.

I knew the work of Sylvain Paris, but my choice fell on ctmf(). Why? Why not...

It was while reading your links that I envisioned a fast 8-bit median (the effective bit depth of noisy images is lower than that of non-noisy ones).

The median I use does not affect all components of the image, which would give a video-like effect, but only the chroma, which forced me to develop code.

I have not reinstalled all the functions that use ctmf(), but one is still implemented. The median on B-G and R-G was designed, following Dave Coffin, to reduce color artifacts. I shall therefore use this median to show some 200% crops.

First I chose a 200 ISO image with AHD interpolation. The first image shows the artifacts in "Pure Brewed".

The second picture shows the application of a 7x7 median, which reduces these artifacts and does not change the appearance of the image.

The third picture shows the application of a 7x7 ctmf median; the "Pure Brewed" artifacts have certainly disappeared, but the colors are distorted and new artifacts appear.

http://img686.imageshack.us/img686/7305/d700200ct.jpg

I then applied the 7x7 median (which of course does not reduce the chromaticity noise) to the ISO 25600 image after the noise treatment (luminance and chrominance).

The first image is without the 7x7 median, the second with the 7x7 ctmf median.

This also shows that the color shades have disappeared and that artifacts have appeared.

http://img691.imageshack.us/img691/400/d70025600ct.jpg


Conclusion: It is desirable to work in 16 bits.

I'll look at the code from the links you gave me in detail... I think the work will be significant; the holiday season is here, and then I go to Kenya in January. Nevertheless I'll try (I am not a developer); I think it will be fast and work well if the median used is 16-bit.

Currently I work with OpenMP, which reduced by 40% the processing time of the parts of the code where parallel programming can be implemented. If we add the OpenGL work that Egon is developing... and perhaps the fast 16-bit median, then we will have an application running in almost real time.
:)


Best regards.

ejmartin
17/12/2009, 21:54
Emil, thank you very much for these links and code.

I knew the work of Sylvain Paris, but my choice fell on ctmf(). Why? Why not...

It was while reading your links that I envisioned a fast 8-bit median (the effective bit depth of noisy images is lower than that of non-noisy ones).

The median I use does not affect all components of the image, which would give a video-like effect, but only the chroma, which forced me to develop code.

I have not reinstalled all the functions that use ctmf(), but one is still implemented. The median on B-G and R-G was designed, following Dave Coffin, to reduce color artifacts. I shall therefore use this median to show some 200% crops.

First I chose a 200 ISO image with AHD interpolation. The first image shows the artifacts in "Pure Brewed".

The second picture shows the application of a 7x7 median, which reduces these artifacts and does not change the appearance of the image.

The third picture shows the application of a 7x7 ctmf median; the "Pure Brewed" artifacts have certainly disappeared, but the colors are distorted and new artifacts appear.

I then applied the 7x7 median (which of course does not reduce the chromaticity noise) to the ISO 25600 image after the noise treatment (luminance and chrominance).

The first image is without the 7x7 median, the second with the 7x7 ctmf median.

This also shows that the color shades have disappeared and that artifacts have appeared.

Conclusion: it is desirable to work in 16 bits.


It looks to me that the artifacts are all in deep shadows; if color differences fall below some value, they are set to zero. If you are doing the operations in linear gamma, this could be the cause of the problem with 8-bit manipulation; color will then be strongly posterized after transformation to output space with gamma~2. If you are using linear gamma space to do operations, I would suggest you try again with gamma=2, or even better, Lab space. The spacing of colors in 8-bit space will be closer to perceptually uniform. Although the range space may be quantized into 8-bit discretization, the output of these fast median filters is a ratio of a range average and a weight, which can assume more finely spaced values if implemented properly. In other words, use an 8-bit range, but do the calculations in 16-bit or even floating point.
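The last point above — an 8-bit range with higher-precision accumulation — can be illustrated with a naive (unaccelerated) bilateral filter; this is my own sketch, not code from any of the linked papers. The range weights are looked up from a 256-entry table indexed by the 8-bit difference, while the weighted average itself is accumulated in float, so the output is more finely spaced than the 8-bit input:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Bilateral filter with a box spatial kernel: the range kernel is quantized
// to 8 bits via a LUT, but the accumulators and the output stay in float.
std::vector<float> bilateral8bitRange(const std::vector<uint8_t>& img,
                                      int w, int h, int radius, float sigmaR) {
    float lut[256]; // one Gaussian range weight per possible |difference|
    for (int d = 0; d < 256; ++d)
        lut[d] = std::exp(-0.5f * d * d / (sigmaR * sigmaR));
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float num = 0.f, den = 0.f; // floating-point accumulators
            int c = img[y * w + x];
            for (int j = -radius; j <= radius; ++j)
                for (int i = -radius; i <= radius; ++i) {
                    int yy = y + j, xx = x + i;
                    if (yy < 0 || yy >= h || xx < 0 || xx >= w) continue;
                    int v = img[yy * w + xx];
                    int d = v - c; if (d < 0) d = -d;
                    float wgt = lut[d];
                    num += wgt * v;
                    den += wgt;
                }
            out[y * w + x] = num / den; // finely spaced despite 8-bit input
        }
    return out;
}
```

The ratio num/den can take values between the 8-bit levels, which is exactly why the quantized range space need not posterize the result.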

jdc
18/12/2009, 14:52
Hello Emil

Thank you for the tips, which are very good.
I reinstalled the ctmf() function... but this time in Lab mode. I kept OpenMP.

Depending on the level of noise, I run the treatment either before the RGB conversion and gamma (slightly noisy images) or after it (noisy images). The switch is based on the chosen median radius: up to 7x7 = before the RGB conversion; 9x9 = after gamma. It is useless to go beyond 9x9... no more noise is removed and artifacts are introduced.

I also introduced the ability for users to choose the quality level of the chroma noise reduction. By entering a quality level on the command line, one trades quality against processing time:
e.g. quality = 0: all the processing is done with the OpenMP sort = 17 seconds
e.g. quality = 3: 1/4 of the treatment uses the OpenMP sort and 3/4 uses ctmf() = 8 seconds

I enclose 2 crops, the first with quality = 0, the second with quality = 3. Note that some chroma noise remains in the grays, and there are no artifacts in the shadows (or minimal ones).

The total processing time in this case (quality = 3) is 70 seconds.

:):)
and
:xxlmas::xxlmas: for you.

Crops 1 = OpenMP

http://img13.imageshack.us/img13/8117/d7256omp.jpg

Crops 2 = 1/4 OPenMP + 3/4 ctmf()
http://img709.imageshack.us/img709/4266/d7256ompctmf.jpg
_______________________________________________
_______________________________________________
I am also attaching the D3x image at ISO 6400 treated with OpenMP and ctmf() (7x7 median and precision = 3).

Note a slight smoothing of color, probably due to the 8-bit part of the processing, but it seems acceptable to me.

The total processing time for this image is 1 min 55 sec.

:)

http://img130.imageshack.us/img130/3746/d3x6418dec09ompctmf.jpg

ejmartin
20/12/2009, 04:54
After much exploration, I have two things I am relatively happy with:

1) hot/dead pixel removal algorithm, acting on RAW data. This is a faster version of the one I implemented before, a variant of what Manuel is using on post-interpolation data.

2) line noise removal algorithm, acting on RAW data. It is better to remove pattern noise before interpolation, since otherwise it strongly corrupts the interpolation itself.

In addition, I am in the initial stages of development of a rather fast chroma NR algorithm that has the possibility that it might preserve saturated edges. It will take a while to work out the properties sufficiently well that the chroma NR is optimized.

By far the step that takes the longest is luminance NR. The algorithm I am currently using is quite computationally expensive, and I am not sure that the results are worth it. The output of AMaZE (or for that matter LMMSE) has a sufficiently nice grain structure that I suspect the image would print well without luminance NR; the main utility of the luminance NR might be to allow sharpening of detail.

Anyway, here is the latest, with a very crude version of chroma denoise:

http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_lmmse_ydn_crcbdn-PS.png

The image without luminance NR (and also without any curves/saturation in PS, which was performed on the above image) may be found here (http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_lmmse_ydn_crcbdn.png).

Edit: A version with much stronger luminance NR (and many more NR artifacts) may be found here (http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_lmmse_ydn_crcbdn-a-PS.png).

jdc
20/12/2009, 12:28
To speed up the chrominance noise processing (without using ctmf()), I added the option of bypassing the "edge" step, which for an image at very high ISO (D700 at ISO 25600) is a priori not very important.

The processing time is reduced by about 6 seconds. To process a D700 image at ISO 25600 on my machine, keeping all the quality, the overall time is just over 70 seconds.

It is activated by entering "-2" for "precision".

The command line becomes:
dcraw_98 -w -v -o 4 -4 -T -5 4 -q 18 -9 NRM 50 2 2 3 4 0 -2 D700-25600.NEF
:)

markanini
20/12/2009, 13:21
ejmartin, I can't see much removal of luma noise in the latest example.
Either this is a mistake, or you have decided to apply less aggressive filtering, which is probably wise for keeping details. Nevertheless it looks very nice for a high-ISO shot.

I have some possibly crazy ideas about brute-force removal with thresholds, to remove the color blotches that become visible after salt-and-pepper removal, since even very sophisticated non-local means denoising can't deal completely with color blotches. I'm considering:

Processing only chroma that correlates with low intensity, then desaturating, since the chroma is inaccurate there anyway; this looks like what Nikon JPEGs do.
Maybe processing in HSV or HSL would be better for setting thresholds?
Further down the line, setting thresholds for different hues according to white balance.

Tell me what you think.

Lassus
20/12/2009, 19:47
After much exploration, I have two things I am relatively happy with:

1) hot/dead pixel removal algorithm, acting on RAW data. This is a faster version of the one I implemented before, a variant of what Manuel is using on post-interpolation data.

2) line noise removal algorithm, acting on RAW data. It is better to remove pattern noise before interpolation, since otherwise it strongly corrupts the interpolation itself.

In addition, I am in the initial stages of development of a rather fast chroma NR algorithm that has the possibility that it might preserve saturated edges. It will take a while to work out the properties sufficiently well that the chroma NR is optimized.

By far the step that takes the longest is luminance NR. The algorithm I am currently using is quite computationally expensive, and I am not sure that the results are worth it. The output of AMaZE (or for that matter LMMSE) has a sufficiently nice grain structure that I suspect the image would print well without luminance NR; the main utility of the luminance NR might be to allow sharpening of detail.

I absolutely agree with your approach. From my personal experience, a dead/hot pixel removal filter before interpolation and an accurate chrominance smoothing are enough for getting a good-looking print of a high-ISO shot.

Which technique are you using for the chrominance NR? I really like the output on the green channel, but there is some loss of contrast on the red cloth. After several trials I'm getting better results with the following transformation matrix than in the Lab or chrominance-difference domains:


+1.547 -0.577 -0.577     R
-0.577 +1.547 -0.577  *  G
-0.577 -0.577 +1.547     B

Hope it helps.
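For what it's worth, the matrix above has the special form a*I + b*J (J the all-ones matrix) with a = 2.124 and b = -0.577, so it scales the gray axis R = G = B by a + 3b = 0.393 and amplifies the chroma directions by a = 2.124, and it has a simple closed-form inverse. A small C++ sketch (the function names are mine; the chroma smoothing that would sit between the two transforms is omitted):

```cpp
#include <cassert>
#include <cmath>

// Matrix is A*I + B*J: diagonal entries A + B = 1.547, off-diagonal B = -0.577.
static const double A = 2.124, B = -0.577;

// RGB -> decorrelated space of the quoted matrix.
void forwardTransform(const double rgb[3], double out[3]) {
    double s = rgb[0] + rgb[1] + rgb[2];
    for (int i = 0; i < 3; ++i) out[i] = A * rgb[i] + B * s;
}

// Closed-form inverse of A*I + B*J: (1/A)*I - B/(A*(A+3B))*J.
void inverseTransform(const double t[3], double rgb[3]) {
    double c = -B / (A * (A + 3.0 * B)); // coefficient of J in the inverse
    double s = t[0] + t[1] + t[2];
    for (int i = 0; i < 3; ++i) rgb[i] = t[i] / A + c * s;
}
```

Filtering the two chroma-like directions in this space and inverting back is then lossless apart from whatever smoothing was applied.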

markanini
21/12/2009, 23:49
Won't you post an example?

Lassus
22/12/2009, 01:11
My noise reduction filter isn't as elaborate as those of Emil, Manuel or Jacques, but since you asked...

http://img109.imageshack.us/img109/4329/d700hsli25600.jpg

The sample does not have any kind of luminance NR apart from dead/hot pixel removal before interpolation.

jdc
23/12/2009, 22:51
After much thought, planning and extensive testing, I have all but finished a "quick median" that uses neither a sorting function nor ctmf(), but works with histograms and sliding windows.

The median is not "constant-time" ...

I optimized it for "normal" use, i.e. already very noisy images, for example the D700 at ISO 6400 or the D3x at ISO 6400. However, for ISO 25600 images, the conventional treatment remains best.

The processing time for the two channels 'a' and 'b' (Lab) is about 4 to 5 seconds (without using OpenMP), compared with 15 to 20 seconds for the sort with OpenMP and 30 seconds without OpenMP. This brings the global processing of a D700 ISO 6400 image to approximately 50 seconds.

The treatment is 12-bit, which is generally sufficient given the analysis made by Emil, but there are still some problems to solve (??).

On a D700 image at ISO 6400, I see no differences between the treatment with sort + OpenMP and the "quick median".
;)
_______________________________________________
_______________________________________________
I improved the fast median treatment without using a traditional sorting method (qsort(), sort(), OpenMP), drawing inspiration from what has been done for fast 8-bit medians and from the literature on fast 16-bit medians.

With the changes, the quality of the treatment shows no visible differences (200% crops) between the treatment using the classical OpenMP sort and the fast median, on the D700-25600ISO, D3x-6400ISO and D700-6400ISO images.

It seems that an 11-bit treatment should handle virtually all very noisy images; however, I can optionally process 12 or 13 bits.

For a 12 Mpix image (D700 at ISO 25600) and a 9x9, 11-bit median (which is sufficient in all cases tested), the median processing time for the 2 channels 'a' and 'b' is 3.7 sec in total.

For a 12 Mpix image (D700 at ISO 6400) and a 7x7, 11-bit median (the baseline for normally noisy images), the median processing time for the 2 channels 'a' and 'b' is 3 sec in total.

For a 25 Mpix image (D3x at ISO 6400) and a 7x7, 11-bit median, the processing time for the 2 channels 'a' and 'b' is 5.5 sec in total.

For this same image, but with a 9x9, 13-bit median, the processing time goes up to 12 sec.

My site is updated, and you can activate these functions by:
D700-image 25600ISO: -9 NRM 50 2 2 3 4 0 -3
D700-image 6400ISO : -9 NRM 50 1 2 3 3 0 -3

:)

ManuelLlorens
27/12/2009, 22:54
I keep reading and seeing your results. No idea who has closed this thread.

Guillermo Luijk
27/12/2009, 23:45
I don't know why it was closed, but I have found out that clicking on "CERRADO" allows it to be re-opened.

jdc
28/12/2009, 14:06
One of the difficulties of noise processing is related to the use of the median to reduce chromatic noise (there are other approaches than the median: the mean, used by Manuel or Luis, a combination of both, etc.). Medians are effective from a minimum window of 5x5, but usually 7x7 or 9x9, exceptionally 11x11 (beyond 11x11 artifacts appear and there is no longer any visible gain...).

The disadvantage is the duration of the treatment. With the normal C++ procedures (sort()), the processing times are significant: for example, treating the 2 channels ('a' and 'b' in Lab mode, or R-G and B-G in RGB) of a 12 Mpix file with a 9x9 median takes about 1 to 2 minutes on my machine, and more for a 25 Mpix image with an 11x11 median (about 3 or 4 minutes).

These times are in my view unacceptable in normal use [with exceptions (testing, development...)]; the user does not want to wait that long.

The first thing that comes to mind is to accelerate sort() with a more efficient algorithm. There is the fq_sort() function, based on: a) the quicksort algorithm and b) stopping the sort as soon as the median is found. This method divides the processing time by about 2, i.e. about 30 to 40 seconds for a 9x9 median on the 2 channels of a 12 Mpix image.
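The "stop sorting once the median is found" idea is a selection algorithm (quickselect). I have not seen the fq_sort() source, but the C++ standard library offers the same behavior as std::nth_element, shown here as a stand-in; it runs in O(n) on average instead of O(n log n):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Median of an odd-sized window (e.g. the 81 values of a 9x9 window),
// found by partial selection rather than a full sort.
float windowMedian(std::vector<float> v) {
    auto mid = v.begin() + v.size() / 2;
    std::nth_element(v.begin(), mid, v.end()); // quickselect: only *mid is placed
    return *mid;
}
```

Because std::nth_element only partitions around the requested rank, the rest of the window is left unsorted, which is exactly the saving described above.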

Then, if the hardware lends itself to it (Intel Core 2 or better), parallel processing. This is possible here because the 2 channels 'a' and 'b' are independent and the processing is not vectorized. The result was a reduction of about 40%, i.e. a processing time of 16 to 22 seconds.
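Since the 'a' and 'b' channels are independent, the OpenMP split described above can look like the following sketch (illustrative only, with a trivial 3-tap stand-in for the real per-channel filter; without -fopenmp the pragmas are ignored and the code simply runs serially):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Placeholder per-channel work: a trivial 1D 3-tap median, for illustration.
static void medianFilter(std::vector<float>& chan) {
    std::vector<float> src(chan);
    for (size_t i = 1; i + 1 < src.size(); ++i) {
        float a = src[i - 1], b = src[i], c = src[i + 1];
        float lo = std::min(std::min(a, b), c), hi = std::max(std::max(a, b), c);
        chan[i] = a + b + c - lo - hi; // middle value of three
    }
}

// The two Lab chroma channels carry no data dependency between them,
// so each can be filtered in its own OpenMP section.
void denoiseChroma(std::vector<float>& aChan, std::vector<float>& bChan) {
    #pragma omp parallel sections
    {
        #pragma omp section
        medianFilter(aChan); // channel 'a'
        #pragma omp section
        medianFilter(bChan); // channel 'b'
    }
}
```

With only two sections the speedup is capped at 2x, which is consistent with the roughly 40% reduction reported above once per-thread overhead is counted.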

Of course, one could be satisfied with that...


Nevertheless, the literature is replete with articles on fast medians for image processing:

* These articles describe the 8-bit process and point out the pitfalls of going to 16 bits, but the available code (at least what I found) is always 8-bit.
* The process assumes that once the first window (e.g. 9x9) has been processed, the next median can be found by recalculating after a shift: removing the pixels of the left column of the previous window and adding the right column of the current window. In 8 bits this works, and quickly: processing the same 12 Mpix image with a 9x9 median takes about 3.5 seconds. But, and it is a significant but, the treatment is done in 8 bits; even though a noisy image contains less information, the tests performed on very noisy images are disappointing and do not validate this solution...
* The authors propose "the" 16-bit solution, whose histogram is littered with identical or zero values. These arise because the processing is done in 16 bits while the image is usually 12-bit: for a 12-bit raw image, 4096 of the 65536 levels carry good or noisy values, and about 60,000 bins are identical or zero. If nothing is done, the 8-bit algorithm loops while searching the histogram for the median +1 or -1 among identical or zero values; one soon realizes this is unsuitable, because the total time is multiplied by 10 or 15.
* This 16-bit solution therefore introduces an ordering of the pixels in the current window, in addition to the histogram values. When updating the new window (removing the left column, adding the right one), both the histogram and this ordering are updated. Of course this should work, but the code must be efficient; compared to the 8-bit case, this ordering (obtained by sorting and insertion) is costly.



My approach to the problem:

* As Emil Martinec noted, noisy and very noisy images contain less information of good quality; the 'good' pixel level is essentially of 8-bit quality, but about 4 more bits are occupied by noisy pixels, and the rest by identical values or zeros. We do not need 16-bit processing...
* Moreover, to handle chroma noise, there is no point going beyond a 9x9 median... we would only add artifacts. So we do not strictly need a constant-time algorithm.
* What we seek is to process the degraded image to a precision of about 12 bits (14-bit max), where the relevant information is 8 to 10 bits, with a 7x7 median in the majority of cases and 9x9 for very noisy images.

Hence my effort to treat only about 12 bits (or 11, or 13), with a simple treatment that does not bother ordering the pixels. I will describe below how I proceed, but first, in terms of results:

* There are no visible qualitative differences, in all the tests I could run (D700 at ISO 25600, D3x at ISO 6400, D700 at ISO 6400), between the 16-bit processing with fq_sort() and the 11-bit processing with my method.
* Scrutinizing the details of the calculation, and putting side by side an area treated in 16 bits with fq_sort() and one treated with the 12-bit "fast median", almost all the median values found are identical; the gap is about 1 to 3 per 10,000... which is very low but measurable.
* The processing times (without OpenMP, for the 2 channels) become more than acceptable:

* 3 seconds for a 7x7, 11-bit median on a 12 Mpix image (the most common case)
* 3.7 seconds for a 9x9, 11-bit median on a 12 Mpix image
* 5.3 seconds for a 9x9, 12-bit median on a 12 Mpix image
* 5.5 seconds for a 7x7, 11-bit median on a 25 Mpix image
* 12 seconds for a 9x9, 13-bit median on a 25 Mpix image







The simplified algorithm (when processing the Lab channels 'a' and 'b'):

* transform the (float) values of 'a' and 'b', which lie in the range -128..+128, into a new integer interval from 0..2048 (11-bit) to 0..8192 (13-bit)
* compute the median with fq_sort() for the window at the first row and first column
* save this value
* determine the number of pixels in the window whose value is below the median (n_Mm)
* build the 11- to 13-bit histogram of this window

for the rest of the line, from the first window:
* take the pixels of the left column of the previous window and of the right column of the current window
* if these pixels are below the current median, update n_Mm
* adjust the histogram
* if n_Mm exceeds half the window size, decrease the median and adjust n_Mm
* if n_Mm is less than half the window size, increase the median and adjust n_Mm
* save the median values computed for this line
* replace the values of the current line by the computed medians
* move on to the following lines
...
* convert the new 'a' and 'b' values back to the real (float) interval -128..+128
* if using "edge sensitive" mode, apply the desired filter
* then the usual treatment of the Lab values: increase of saturation, possibly contrast... and RGB conversion
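The sliding-window update described above is essentially Huang's classic histogram-based running median. As a minimal, self-contained C++ sketch of that scheme (my own illustration, not the actual perfectRAW code; values are assumed already quantized to `bits`, and borders are left unfiltered for brevity):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// Histogram-based running median (Huang's algorithm): the histogram and the
// count of pixels below the current median (n_Mm in the text, nBelow here)
// are updated incrementally as the window slides along each row.
std::vector<uint16_t> runningMedian(const std::vector<uint16_t>& img,
                                    int w, int h, int radius, int bits) {
    const int levels = 1 << bits;
    const int win = 2 * radius + 1;
    const int half = (win * win) / 2; // 0-based rank of the median
    std::vector<uint16_t> out(img);   // borders keep their original values
    std::vector<int> hist(levels);
    for (int y = radius; y < h - radius; ++y) {
        // build the histogram of the first window of this row
        std::fill(hist.begin(), hist.end(), 0);
        for (int j = -radius; j <= radius; ++j)
            for (int i = -radius; i <= radius; ++i)
                ++hist[img[(y + j) * w + (radius + i)]];
        int med = 0, nBelow = 0; // nBelow = count of values strictly below med
        while (nBelow + hist[med] <= half) nBelow += hist[med++];
        out[y * w + radius] = (uint16_t)med;
        for (int x = radius + 1; x < w - radius; ++x) {
            // slide: remove the left column of the previous window,
            // add the right column of the current window
            for (int j = -radius; j <= radius; ++j) {
                uint16_t rem = img[(y + j) * w + (x - radius - 1)];
                uint16_t add = img[(y + j) * w + (x + radius)];
                --hist[rem]; if (rem < med) --nBelow;
                ++hist[add]; if (add < med) ++nBelow;
            }
            // re-balance the median against the rank `half`
            while (nBelow > half) nBelow -= hist[--med];
            while (nBelow + hist[med] <= half) nBelow += hist[med++];
            out[y * w + x] = (uint16_t)med;
        }
    }
    return out;
}
```

The cost per pixel is O(radius) for the column update plus a few histogram steps, instead of O(radius^2 log radius) for a per-window sort, which is where the speedups quoted above come from.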





I will continue the comparison:
* Files with Canon "CR2"
* Examining the histograms in detail with “histogrammar” (thank’s to Guillermo)
* Examining the deltaE94
* Etc..
:)

Lassus
28/12/2009, 17:03
These times are in my view unacceptable in normal use [with exceptions (testing, development...)]; the user does not want to wait that long.
Jacques, I agree with you and keep following your improvements as always.

I just want to clarify that all my attempts at NR are intended not for common usage but for testing purposes. When I feel my NR is done, I will start the code optimization. As you know, Weiss also has a fast bilateral filter method with impressive performance:

http://img690.imageshack.us/img690/6479/bilatspeed.jpg

For a 25-pixel-radius kernel this would reduce the processing time to at least a tenth...

jdc
28/12/2009, 18:19
Hello Luis

I know the work of Weiss; the code is available only in 8-bit (?), hence my own development.

I do not think we should go too far with the median radii; beyond 11x11, artifacts appear that (slightly) deform geometric figures, and corners become colorless. My implementation nevertheless offers medians between 5x5 and 15x15; the optimum (compromise) seems to me to be 9x9.

For the 5D2 (21 Mpix) I get a processing time of 6 seconds for 9x9 and 10 seconds for 15x15, and for the D700, 9x9 = 3 seconds and 15x15 = 6.5 seconds (for the 2 channels 'a' and 'b' of Lab).

I have attached a picture where, for a 5D2 at ISO 6400, I compare the ColorChecker24:
* with a 9x9 median processed in 16 bits with fq_sort() and OpenMP
* with a 9x9 median processed in 11 bits (fast median)
* with a 15x15 median processed in 11 bits (fast median)

http://img514.imageshack.us/img514/652/5d26400.jpg


The 3 images are very close; one can see that the corners of the squares are (slightly) rounded with the 15x15.


Second, on the ISO 25600 and ISO 6400 images from the D700 and 5D2, I compared the 16-bit histograms with "histogrammar"... the differences are very small.

Small differences (near zero) appear if the "edge" step is not used, but they are very small.

I also compared the deltaE94, taking as reference the colors of the ISO 200 image with the same interpolation.

Again the differences are very small, usually deltaE94 < 1 (between the 16-bit 9x9 and the 9x9 "quick median")...
In some cases (some blues and reds), the details of noisy areas can reach a deltaE94 of about 2 between the 16-bit solution and the "quick median". (Note that the lowest deviations from the reference are obtained with the "quick median".)

But it seems necessary to test this code with other protocols ...

I send you my code.
:)

Lassus
28/12/2009, 18:49
Yes, the 8-bit limitation may be a problem. The advantage of using the distance-to-the-mean criterion is that you can use a 50x50 window for dealing with coarse noise without generating artifacts in the corners.
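A toy 1-D sketch of that distance-to-the-mean criterion (the function and threshold handling are illustrative, not Lassus's actual routine): average only the samples that stay within a threshold of the window mean, so edge and detail pixels are excluded instead of being rounded off as a large median would do.

```c
#include <math.h>

/* Average of the samples within 'thresh' of the window mean; outliers
 * (edge/detail pixels) are excluded, which is what avoids the corner
 * artifacts of a plain large-window filter.  If nothing qualifies,
 * the plain mean is returned. */
double mean_within(const double *v, int n, double thresh)
{
    double m = 0.0;
    for (int i = 0; i < n; i++)
        m += v[i];
    m /= n;                       /* window mean */
    double sum = 0.0;
    int cnt = 0;
    for (int i = 0; i < n; i++)
        if (fabs(v[i] - m) <= thresh) {
            sum += v[i];
            cnt++;
        }
    return cnt ? sum / cnt : m;
}
```

Because outliers are dropped rather than out-voted, the window can be made very large (50x50) without distorting corners.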

Anyway, the optimum would be working with wavelets as Emil does. They are much more powerful, especially for luminance NR. I need to read a lot about them before starting the new approach I have in mind; that's why I'm not investing so much time in NR right now.


I send you my code.
Thank you. The timings are impressive on my old computer. :)

fujicoly
28/12/2009, 18:50
Well I took a different approach with the NR in my test converter.

The converter processes the NEF 4 times, each time using a different interpolation process and mild NR (a different NR algorithm for each pass); the images are then blended together to recreate the details lost in each one. This takes around 10 minutes per image :o

To my eye this gives good results, see the D3x 6400ISO image below:-

http://img510.imageshack.us/img510/9672/d3xfujicoly.png

Yups
28/12/2009, 19:15
Omg this picture result is amazing. :xxlmas:

jdc
28/12/2009, 19:29
The same D3x 6400 ISO image with the "mean" treatment... It would be possible to reduce the noise further, but from my point of view we would lose a little texture in the details.

Interpolation "noname" (Luis Sanz)... no sharpening...

Processing time 1 minute 40 seconds.
http://img13.imageshack.us/img13/9777/d3xz.jpg


:)

markanini
28/12/2009, 21:14
@fujicoly: I've experimented with a few pieces of software that include NL-means denoising and waited minutes for the process to finish between parameter tweaks, because none of them implement a crop preview... Your process seems to outshine all the efforts I've seen. I personally wouldn't mind waiting 10 minutes for results like that!

@jdc: I've gathered that median filtering leaves behind Gaussian noise, which seems to correlate with the residual color blotches I see in your examples. I have no doubt your NR process will work well with medium-noise shots, and your work to improve processing time should be commended. I would also like to compliment you on the tonality and color of your examples: very natural and lifelike on my calibrated S-IPS flat panel.

Lassus
28/12/2009, 21:36
Fujicoly, your result is very good. In fact it doesn't look like a denoised sample but a natural one. :xxlmas: How about the D700@25600 file?

I guess my executable with something like "-q 0 -N 8 575 1000" can do an acceptable job (not as good as yours), but I don't have enough memory in this computer for testing it. Bear in mind my NR routine is only 50 lines long including declarations and memory allocation: there is no luminance noise reduction and the chrominance smoothing is in no way optimized...

fujicoly
28/12/2009, 22:02
Thanks, I will also try your command line below and see how it works. I am looking forward to a new build of your dcraw.

I will post the D700 results tomorrow!

Colin



jdc
29/12/2009, 15:04
Markanini thank you for your appreciation.

Color rendering is difficult to assess. It is desirable to use color-management tools, both internally (e.g. by using Lab and the relative colorimetric intent) and for the evaluation after processing, where the recognized measure is deltaE94 (or deltaE2000).

We are fortunate to have a "ColorChecker24" in these images, which allows us to verify exposure (the 4th gray column should be around L = 50/51) and white balance (the 3rd gray column should be at L = 66, a = b = ±1).

For the other colors, part of the "drift" comes from the camera (about which we can do nothing, short of building an ICC color profile, as I did for my old D200), and part is induced by the noise processing (luminance and chrominance). With the same interpolation, there should be no large drift of the second type.

I enclose the ColorChecker24 at 100 ISO and 6400 ISO; the deltaE94 differences are around 1, thus almost negligible.
http://img693.imageshack.us/img693/4407/colcheckd3x.jpg


Meanwhile, I continued my tests to assess the "quick median", examining the image with:
* 8-bit processing with ctmf()
* "quick median" processing at 10 / 11 / 12 bits
* median radii varied from 7x7 up to 25x25
* different threshold values for the "edge sensitive" setting

It appears that the 8-bit processing, apart from removing little or none of the chromatic noise, produces horizontal colored lines at some transitions.

The 10-bit processing shows the same defects, less pronounced.

The 11-bit processing still shows a few of these defects, depending on the threshold level (edge sensitive) and the median size.

The 12-bit processing solves all these problems ...

I (easily) extended my chromatic-noise algorithm to radii up to 25x25... and processed several images:
* D700 25600 ISO - a 12-bit median at 15x15 or 21x21 reduces the colored "blotches"
* D3x 6400 ISO - a 12-bit median at 13x13 gives an image with fewer "blotches" and a better overall appearance: more contrast and little loss of texture (e.g. the leaf at bottom right) (time: 1 min 40 sec)


I enclose an extract from that image D3X 6400 ISO
http://img686.imageshack.us/img686/1733/d3x3.jpg


http://img710.imageshack.us/img710/7551/d3x2.jpg

:)

ejmartin
29/12/2009, 15:14
A large-radius median filter will tend to posterize colors: by its nature, the median replaces the local pixel value with another local pixel value, and that value tends to be the same for many neighboring pixels, since the median changes little as one moves within its range. A method less prone to posterization is something akin to the bilateral filter, for which there are also fast versions (e.g. Weiss). Smart Blur in PS is similar to a bilateral filter. One could do the bilateral filter using 8-bit bins for the fast method, but carry out the additions in 16-bit accuracy for both the numerator and denominator (the blur is a weighted ratio: the sum of weights times pixel values, divided by the sum of the weights) and recover something quite smooth and not posterized.

jdc
29/12/2009, 16:30
To objectively compare color, texture, noise processing, etc., I attach the same image extract (D3x at 100 ISO) with the same interpolation, gamma, exposure, etc., but of course without any noise treatment.

It is obvious that everything is a matter of compromise.

Best regards

http://img37.imageshack.us/img37/8394/d3x100.jpg

:)
_______________________________________________
_______________________________________________
Emil

I am not in favor of using large windows for the medians. However, I wanted to check quality and processing time with large radii for other uses.

For chromatic-noise processing, depending on the image, I use 7x7 or 9x9, and in difficult cases 11x11 or 13x13. Going beyond that brings only limited gains in noise reduction, while, as you point out, the image tends to posterize.

The algorithm I use is close to that of Weiss, but simpler.

For 8-bit, I use ctmf(), developed by S. Perreault, which is close in spirit to the "bilateral filter"; he also refers to Weiss... http://vision.gel.ulaval.ca/~perreaul/publications/Id699_2007.pdf (http://vision.gel.ulaval.ca/%7Eperreaul/publications/Id699_2007.pdf)

By simpler I mean that I do not attempt the full ordinal ranking needed for 16-bit or floating-point data, but circumvent the difficulty by applying the same principle at 12 bits.

The processing times for 7x7, 9x9 or 11x11 are similar to those I could get with "optimized" 16-bit processing. Of course, if I had the 16-bit code, I would use it...

Our vision is more sensitive to variations in gray and dull tones than in saturated colors. I changed the "edge sensitive" processing (my version, not Lassus's), which takes luminance into account, to also integrate chromaticity (Lch).

Depending on the level of chromaticity, I change the "edge" value to strengthen the treatment of gray and dull tones; for pastel and saturated shades, increasing the threshold results in fewer artifacts (posterization, deformation, etc.).
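As a rough illustration of that idea (the function name, cap and parameters are hypothetical, not jdc's actual code), the edge threshold can simply grow with chromaticity, so gray/dull tones (low C) are smoothed aggressively while saturated tones are protected:

```c
/* Hypothetical chroma-dependent edge threshold.
 * chroma: C = sqrt(a*a + b*b) in Lch; base: threshold for neutral
 * tones; slope: how fast the threshold rises with saturation.
 * Capped at 4*base (an arbitrary choice for this sketch). */
double edge_threshold(double chroma, double base, double slope)
{
    double t = base + slope * chroma;
    double tmax = 4.0 * base;
    return t > tmax ? tmax : t;
}
```

With this shape, a neutral pixel gets the base threshold (strong smoothing allowed) and a saturated pixel a higher one, reducing posterization exactly where the eye would notice it.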

I send you my code for this treatment.

And Happy New Year and best wishes for 2010.
:)
_______________________________________________
_______________________________________________
I have updated my version of dcraw to take into account the changes made by Dave Coffin: support for the Canon EOS-1D Mark IV, colorimetry improvements for the 7D and Nikon D3S, etc.

I also slightly modified the code that processes the noise before interpolation.

I tried to understand why, when using Manuel's denoise() algorithm to remove the Gaussian noise, a single pass renders dark colors well enough but leaves "blotches" and an image lacking contrast, while with 2 or more passes the picture is certainly better overall but the dark areas show posterization: fewer shades of black or brown at very low rgb values.

I reviewed images processed under the same conditions from the D700 and D3x, at low and high sensitivity. These pictures were taken the same day, at the same time, under the same conditions; only the speed/sensitivity pairs change.

All values below are raw 'rgb', before any exposure correction, RGB conversion, gamma, or noise treatment.

For a D700 image at 200 ISO, very dark pixels (the "Fiddlers's" bottle) are at rgb 387, 384, 393 [0..65535]; at 6400 ISO, rgb 20, 96, 62.

For a D3x image at 100 ISO, the pixels are at rgb 245, 256, 260 [0..65535]; at 6400 ISO, rgb 21, 21, 368. These latter values are outside the AdobeRGB gamut...

First observation: for the same shot, once processed to the same overall exposure, the high-ISO images handle very low light poorly, often giving out-of-gamut colors.

There are other examples in the row of "embroidery" threads, which is highly relevant for colorimetry, as many colors are outside the sRGB and AdobeRGB gamuts but inside ProPhoto; for example, the first ("red") thread and the eleventh ("blue").

The "Gaussian" denoising works in rgb, driven by the Lab values, and the algorithm cannot handle the negative values that inevitably occur (they are clipped). This has two consequences: 1) the very low lights are posterized, with rgb values going to 0,0,0; and 2) out-of-gamut reds and blues are mangled by the clipping, creating dark patches of black in the red and a loss of texture in the blue thread.


To try to remedy the posterized areas, there are at least 4 approaches: 1) work entirely in Lab mode (without clipping L, a or b); 2) work in floating point and build a small color-management system in RGB mode; 3) adjust the starting image so that it no longer begins from an out-of-gamut situation; 4) ...

I chose the 3rd approach, slightly increasing the 3 rgb values as a function of luminance. Empirically, I add about 512 for low luminance values (below 512), with a transition fading out up to CIE L = 6000. Of course this change must be made before the Gaussian denoising.
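A sketch of that shadow lift (the 512 offset and the L = 6000 transition come from the post; the linear fade-out and the function itself are my assumption, not jdc's actual code):

```c
/* Lift deep-shadow rgb values before Gaussian denoising so clipping
 * to zero never occurs.  Full +512 offset below luminance 512, fading
 * linearly to zero at luminance 6000 (all values on a [0..65535]
 * scale, as in the post). */
unsigned short lift_shadow(unsigned short v, double luminance)
{
    double lift;
    if (luminance <= 512.0)
        lift = 512.0;
    else if (luminance < 6000.0)
        lift = 512.0 * (6000.0 - luminance) / (6000.0 - 512.0);
    else
        lift = 0.0;
    double out = v + lift;
    return out > 65535.0 ? 65535 : (unsigned short)(out + 0.5);
}
```

Applied to all three channels before denoising, this keeps the darkest pixels away from 0,0,0 so multiple denoise passes no longer posterize them; the offset would have to be compensated afterwards.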

Once this is done, when running 2 or 3 passes: in the dark areas the picture is less posterized, the blotches are reduced, the overall contrast is better... and these areas end up very close to the 200 ISO and 100 ISO reference images.

This treatment also brings a slight improvement for the reds and blues (embroideries), though work remains to be done.

Of course, the extra noise passes make the processing a little longer (about 15 seconds more), but there is always the option of choosing 1, 2 or 3 passes... by changing the settings.

I enclose 3 extracts images with the new treatment: D700 6400ISO, D3x 6400ISO and D700 25600ISO.

D700 6400 ISO
http://img694.imageshack.us/img694/9417/d76400.jpg


D3X 6400 ISO
http://img502.imageshack.us/img502/2199/d3x4.jpg


D700 25600 ISO
http://img340.imageshack.us/img340/3711/d725600a.jpg

The command line for D700 6400 and D3x 6400
dcraw_99 -w -v -o 4 -4 -T -5 4 -q 21 -W -F 1.8 4 0.1 -9 NRM 50 2 1 -3 5 2000 -9 D700-6400.NEF

and for D700 25600 ISO
dcraw_99 -w -v -o 4 -4 -T -5 4 -q 21 -W -F 1.8 4 0.1 -9 NRM 50 3 2 -3 6 1000 -9 D700-25600.NEF

I have updated my site with the dcraw_99 version.

And best wishes to all for this new year 2010.

:)

ejmartin
08/01/2010, 07:39
A fresh approach, very fast, and I think it holds some promise:

http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_lmmse_dpyr_rmoY-PS.png
_______________________________________________
_______________________________________________
A refinement of the fast approach:

http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_amaze_dpyr_rmoY-noisefn-PS.png

It seems there is a compromise that has to be made between erasing chroma noise fluctuations in low midtones, vs erasing detail in the red cloth at the upper left. This may be because of the choice of YCrCb color space to do the denoising, rather than say Lab. If we are allowed to obliterate more of the detail in the red cloth (already much of it has been sacrificed in order to do chroma NR at low frequencies in other parts of the image) then a smoother result with fewer chroma blotches can be obtained:

http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_rmo_dctline_amaze_dpyr_rmoY-noisefn-alt-PS.png



I may try a few further refinements but I think this is close to the final method.

jdc
11/01/2010, 09:37
Emil your results are good...

I have developed several series of tests to assess the colors: overall testing, measuring deltaE94 against a 468-color chart in which a significant number of colors lie outside sRGB or Adobe RGB, and local tests measuring isolated pixels or groups of pixels.

Local tests
I compared 5 situations with the D700 200 ISO image, to see the impact of each treatment: chromatic noise, luminance noise before interpolation, and luminance noise.

For the chrominance noise, I compared:
* No treatment
* Quick median 11x11 in Lab mode
* Median 11x11 16-bit in Lab mode
* Median 11x11 16-bit in RGB mode
* Deviation from the mean 31x31 in Lab mode
All values are reported by the command -9 PIX5 k x y dx dy, with k=9 pixels and dx=dy=1 pixel.
Example: 3064 -730 corresponds to the pixel x = 3064, y = 730.
The Lab and rgb values are taken before RGB conversion and without gamma, so they differ from those one could measure in Photoshop. The rgb values are 16-bit [0..65535].

Local chrominance test: Nouvelle page 1 (http://jacques.desmis.perso.neuf.fr/manuel.html#localst)

Tomorrow I'm away for 15 days.

:)

Lassus
11/01/2010, 13:16
Emil, your results are very good for a fast method. I think the first of the three samples fits better my taste.

Regarding the problem with the red cloth, have you tried spacing out the R/B multiplier constants in the YCbCr transformation matrix?



Local test chrominance :Nouvelle page 1 (http://jacques.desmis.perso.neuf.fr/manuel.html#localst)

Very instructive, Jacques. :si:

Thanks,

-Luis

jdc
11/01/2010, 17:57
I have added to my site (in French) a more general description of the testing process: Nouvelle page 1 (http://jacques.desmis.perso.neuf.fr/manuel.html#test_gam), and my point of view on PerfectRAW.


And a further local test on luminance noise: Nouvelle page 1 (http://jacques.desmis.perso.neuf.fr/manuel.html#locallum)

:)

ManuelLlorens
13/01/2010, 00:49
Emil, your results are very good for a fast method. I think the first of the three samples fits better my taste.

I agree with Lassus; congratulations, Emil. The resolution is great and you have not lost any local contrast, except in the red fabric. Really impressive for a fast method.

BR,

jdc
28/01/2010, 15:01
I just spent 15 days in Kenya on the Indian Ocean coast near Mombasa... sun, 30°C from morning to night, sea water over 30°C... but I have not forgotten PerfectRAW.

I have developed a wavelet-based procedure, entirely in Lab mode, to process the chrominance noise. Three parameters can be adjusted: a) the wavelet strength; b) the "edge"; c) the speed (or accuracy) of the treatment, ranging from 2.5 to 9 seconds depending on the settings.
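A toy 1-D version of one wavelet-shrink step (my own sketch; jdc's implementation, its decomposition levels and edge handling are certainly more elaborate): split a chroma channel into Haar averages and details, soft-threshold the details by a "strength" parameter, and reconstruct.

```c
#include <math.h>
#include <stdlib.h>

/* One level of a Haar wavelet shrink on a 1-D chroma channel
 * (an 'a' or 'b' row in Lab).  'strength' is the soft threshold
 * applied to the detail (noise) coefficients; n must be even. */
void haar_shrink_1d(double *x, int n, double strength)
{
    double *s = malloc(n * sizeof *s);
    for (int i = 0; i < n / 2; i++) {
        double avg = 0.5 * (x[2 * i] + x[2 * i + 1]);
        double det = 0.5 * (x[2 * i] - x[2 * i + 1]);
        /* soft-threshold the detail coefficient */
        double mag = fabs(det) - strength;
        det = (mag > 0.0) ? copysign(mag, det) : 0.0;
        s[2 * i]     = avg + det;   /* inverse Haar transform */
        s[2 * i + 1] = avg - det;
    }
    for (int i = 0; i < n; i++)
        x[i] = s[i];
    free(s);
}
```

Small pixel-to-pixel chroma fluctuations (details below the threshold) collapse to their local average, while larger variations are only attenuated, which is what preserves edges.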

I have attached 2 images with medium settings. Both received luminance noise reduction, with exposure and gamma matched to give a rendering substantially identical to the original: the first D700 25600 ISO, the second D700 6400 ISO.

D700 25600 ISO

http://img715.imageshack.us/img715/300/wchlab256.jpg (http://img715.imageshack.us/i/wchlab256.jpg/)

D700-6400 ISO
http://img690.imageshack.us/img690/5905/wchlab64.jpg (http://img690.imageshack.us/i/wchlab64.jpg/)

:)
_______________________________________________
_______________________________________________
The same 25600 ISO image, but with different settings (chrominance processing time: 4 seconds).

http://img641.imageshack.us/img641/9275/wchlab2562.jpg (http://img641.imageshack.us/i/wchlab2562.jpg/)

ejmartin
14/02/2010, 19:29
Back to NR efforts. I modified the fast method to apply NR prior to demosaic; this may be a good choice to replace the hot/dead pixel smoothing that I have been using. It definitely improves the demosaic. I can then back off the post-demosaic NR and preserve some more detail in reds.

Here is the latest:

http://theory.uchicago.edu/%7Eejm/pix/20d/posts/ojo/d700_cfadpyr_dctline_amml_dpyr-PS.png

jdc
15/02/2010, 12:11
I changed the post-interpolation treatment of the Gaussian noise. The current treatment is based on a luminance wavelet in Lab mode. The number of parameters we can act on is large (edge, chromaticity level, luminance level, etc.). Of course it will still need optimizing.

The crops below received the full treatment without exposure compensation. The total processing time (opening, interpolation, denoising, RGB conversion, TIFF writing) is about 40 seconds. The Gaussian denoising takes about 8 seconds and the chrominance denoising about 6 seconds (on my old computer); in both cases, preparing the edge data and the rgb => Lab and Lab => rgb conversions occupy about 50% of the time.

http://img51.imageshack.us/img51/5153/d700256wall.jpg

:)
_______________________________________________
_______________________________________________
And a D3x 6400 ISO picture with corrected exposure.
The ColorChecker24 is quite properly rendered.



http://img85.imageshack.us/img85/661/d3x64wall.jpg

markanini
18/02/2010, 18:42
Your latest NR example looks better than ever, jdc!

BTW has anyone tried this piece of software?:
Ximagic Denoiser (http://www.ximagic.com/d_index.html)

It gives you 7 different NR algorithms to process images in Lab & RGB mode.

jdc
19/02/2010, 14:02
Markanini thank you for those comments.

The improved quality is due in part to a new interpolation from Luis (JDD) dedicated to noisy images. I enclose a comparison of the same image with AHD, AMLL and JDD interpolation; of course no noise reduction has been applied.
http://img97.imageshack.us/img97/9514/compinter.jpg



I made some improvements:
1) slightly increased processing speed: the attached D700 25600 ISO image takes about 37 seconds
2) better color, contrast, etc.: the ColorChecker24 is well exposed, contrasty and saturated...
3) refined wavelet chrominance treatment
4) ...
http://img97.imageshack.us/img97/1820/d70025619022010.jpg


I did not know "Xidenoiser". It is interesting and allows comparisons; however, the algorithms are slow and you do not know what the treatment really does.

I made another change, in the spirit of "Xidenoiser", which allows the different algorithms to be compared.

With the command line "-9 NW a b c d e" you invoke all the noise processing. The values in [] are the most common ranges.
'a': impulse noise and high frequency [10 .. 70]
'b': Gaussian noise [20 .. 60]
'c': chroma noise [40 .. 90]
'd': accuracy of wavelets [4 .. 7]
'e': threshold of the edge [100 .. 2000]
eg for an image 25600ISO: -9 NW 60 50 80 5 200

There is another option, "-9 NW2 a", which lets you choose the chroma-noise processing mode:

a = -12 ==> (default) wavelet, Lab
a = 0 ==> median 16-bit, Lab
a = -9 ==> quick median 12-bit, Lab
a = -15 ==> median 16-bit, RGB, red filter
a = -16 ==> deviation from the mean in Lab mode (slow)
a > 0 (e.g. a = 20) ==> ctmf() 8-bit

My website has been updated: Nouvelle page 1 (http://desmisja.perso.cegetel.net/geraud/photo_calcul.php), and the command line to obtain the image above is

dcraw_99 -w -v -W -o 4 -4 -T -5 4 -q 4 -F 1.8 3 0.2 -9 NW 60 50 80 5 200
:)