Normalize Gradient Script taking hours upon hours...

4 replies · 193 views
Rob Calfee
Hi everyone, 

The new Normalize Gradient Script (NSG) worked well on my Pelican pic. I did a full WBPP run and created a master to compare against, and it worked very well. However, it takes FOREVER, like 5 hours after running WBPP without integration. Is anyone else seeing long processing times with NSG?

Thanks,
Rob
Mariusz Golebiewski
Yes, the processing time is long, but not 5 hours long. It takes 40-50 minutes to process about 200 subframes on my Ryzen 5900X.
mousta
It is a labor-intensive script. I have a Ryzen 5900X and it took 4 minutes 39 seconds for 34 images. Also, make sure you select a good, high-quality reference frame; results vary.
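To give a feel for why it is labor-intensive: for every subframe the script has to match the frame's scale to the reference and model the background-gradient difference before integration. The sketch below is NOT NSG's actual algorithm (the real script uses star photometry and a proper surface model), just a simplified NumPy illustration of that per-frame work, assuming registered frames loaded as 2D arrays:

```python
# Simplified sketch of scale-and-gradient normalization against a reference.
# NOT the NSG algorithm -- just an illustration of the per-frame work involved.
import numpy as np

def normalize_to_reference(frame, reference, grid=8):
    """Match a frame's scale to the reference, then subtract a coarse
    estimate of the residual background-gradient difference."""
    # Crude scale estimate (NSG uses star photometry for this step).
    scale = np.median(reference) / np.median(frame)
    scaled = frame * scale

    # Estimate the residual gradient on a coarse grid of medians,
    # then upsample it back to full resolution and subtract it.
    h, w = frame.shape
    diff = scaled - reference
    cell_h, cell_w = h // grid, w // grid
    coarse = np.array([[np.median(diff[i * cell_h:(i + 1) * cell_h,
                                       j * cell_w:(j + 1) * cell_w])
                        for j in range(grid)] for i in range(grid)])
    gradient = np.kron(coarse, np.ones((cell_h, cell_w)))
    gradient = np.pad(gradient,
                      ((0, h - gradient.shape[0]), (0, w - gradient.shape[1])),
                      mode="edge")
    return scaled - gradient
```

Everything is fitted relative to the reference frame, which is why a poor reference drags down every normalized frame.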
Benny Colyn
It takes a long time, in part because it seems to be mostly single-threaded, so it uses only one CPU core where most of us have 8 or more. A lot also depends on how many frames it needs to handle; I tried it on a 400+ frame dataset and that took a very long time.
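The frustrating part is that the per-frame work is independent, so in principle it parallelizes well across cores. Purely as an illustration (standard Python multiprocessing, not anything the script actually offers), this is the difference between looping over 400 frames on one core and farming the same loop out to a worker pool:

```python
# Hypothetical sketch: per-frame normalization is independent work, so it could
# be spread over a process pool. This is NOT a feature of the NSG script -- it
# only illustrates why a single-threaded per-frame loop scales badly.
from multiprocessing import Pool
import numpy as np

def process_one_frame(index):
    """Placeholder for the expensive per-frame step (load the frame, detect
    stars, fit scale and gradient against the reference, write the result)."""
    rng = np.random.default_rng(index)
    data = rng.normal(size=(4000, 6000))   # stand-in for a loaded subframe
    return index, float(np.median(data))   # stand-in for the per-frame result

if __name__ == "__main__":
    n_frames = 400
    # Single-threaded: total time is roughly 400 x (time per frame).
    # A pool of 8 workers finishes the same loop roughly 8x faster,
    # until disk and memory bandwidth become the bottleneck.
    with Pool(processes=8) as pool:
        results = pool.map(process_one_frame, range(n_frames))
```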
Torben van Hees
I will check again, but for me it runs rather quickly, at least compared to LocalNormalization: about 4 minutes for 120 frames (24 MP). And of course image integration is also far quicker since the normalization has already been done. I also think at least the I/O ran multithreaded. Do you use Linux too? Maybe less of PI is multithreaded on Windows? I am using a Threadripper, so single-threaded work should be slower for me, not faster.