[split] GIMP crashes randomly
#11
(02-12-2023, 10:13 AM)Ofnuts Wrote: How much swap space have you got on the system? What is your tile cache size? I ask this because you are on the low side of RAM 

I had to "wise up" a bit in order to respond to those questions, here is a screen shot made immediately followiing a stroked path.

The Gkrel monitor indicates 7875 MB of memory installed, with 6605 MB "free" and 4096 MB of swap available (none in use). As you can see from the CPU timeline, the CPU got a good workout.

On the other side, in the Gimp dashboard, you can see that the cache nearly maxed out during the operation, claiming 2.9 GB used out of the 3 GB allocated, before settling back down to 820 MB after the stroke concluded. As an experiment I set the cache to 4 GB in Gimp preferences, and the app would crash immediately upon initiating the stroke operation. Previously the cache had been set at 2.0 GB, with 1.9 GB used during the stroke operation, so perhaps bumping it up a gig will help going forward?
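
For reference, the current setting can also be read back without digging through the menus; a minimal sketch for GIMP's Python-Fu console (Filters > Python-Fu > Console), assuming the PDB procedure gimp-gimprc-query and the gimprc token tile-cache-size behave as shown:

Code:
# Run in the Python-Fu console, where "pdb" is predefined.
# Assumes the gimprc token is "tile-cache-size"; prints e.g. "3g" or "4096M".
print(pdb.gimp_gimprc_query("tile-cache-size"))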

Just as a footnote, Gimp was really swamped during the stroke operation: all the dashboard timelines froze for the duration of the stroke and then spat out the data accumulated during that time after completion (which is why I couldn't give you a meaningful screenshot during the operation, only afterwards). Gkrel did not seize up during this time; it just kept on ticking throughout the entire procedure.

[Image: oGCUhlf.png]


#12
So yes, the tile cache filled up during the operation, and Gimp started to use its swap, which is likely to slow it down significantly. Two things surprise me:
  • Your cache is only 3GB on an 8GB machine running Linux.
  • All your cores went flat out together. When I did my tests, only one CPU maxed out (despite several settings meant to allow the use of several cores), so path rendering is likely single-threaded. So what were the other cores doing? Is your filesystem encrypted?
Something you can keep an eye on is Image > Properties, which tells you how much memory your image takes. Just loading the image puts you at 1.5GB, and then undo steps are 500MB apiece. In the Undo History dialog you can clear the undo history to reclaim all that memory.
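
To put rough numbers on that yourself, the raw pixel storage of an open image can be estimated from the Python-Fu console. A minimal sketch, assuming the layer attributes shown and ignoring GEGL and undo overhead, so treat it as a lower bound:

Code:
# Python-Fu console sketch: estimate raw pixel storage of an open image.
# Each undo step that copies a drawable costs roughly that drawable again.
img = gimp.image_list()[0]   # first image in GIMP's image list
total_bytes = sum(l.width * l.height * l.bpp for l in img.layers)
print("%.1f MB across %d layer(s)" % (total_bytes / (1024.0 ** 2), len(img.layers)))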
#13
Just to be abundantly clear, my machine has only one physical CPU, dual-core and multi-threaded, and it has always shown up as a 4-core machine in various diagnostics. I just wanted to clarify that, to eliminate anything that might otherwise mislead you.

And frankly, I'm pleased to have the workload distributed across the "4 of them", rather than focused on one.

In fact, a source of irritation for me is the way Gimp uses the 4 cores when encoding a .webp animation. Look at the following: you are looking at 5 screenshots taken at successive times (8:54, 8:58, 8:59:08, 8:59:44, and 9:00 PM). Note the way the work is concentrated mostly on one core at any particular time, while the other three remain idle. My thought is that this is very poor utilization.

All 5 screenshots were taken while creating a single .webp animation, on the same machine we are talking about here.

[Image: gZvrhjD.png]

So clearly this Gimp and this OS can do multi-threading.

------------------------------------------------------------------------------------
Compare this to the way Kdenlive distributes its work across the 4 cores (following graphic); it keeps all 4 pretty loaded up. Look closely at the top of each waveform and you'll notice that each is distinct from the others.

 
[Image: um5GeWQ.png]

I'd think that's what one would hope for? The true essence of "distributed processing".

So, what tile cache setting would you recommend for me?

And is the Gimp "swap" something other than the swap file used by the OS? It seems odd that Gimp claims to have used some, while Gkrel claims not. Just curious.


#14
Intel became unable to keep increasing CPU frequencies, so it multiplied the cores instead, and now performance is up to the application code.

Not everything is easy to multi-thread. I see that when I scale, all my processors max out, but generating plasma runs on one CPU only.

Fundamentally, to be able to edit a very big image in a small amount of RAM, Gimp cuts the image into tiles and works with tiles on disk. In practice, to increase performance, some of that disk space is cached in RAM (this is the "tile cache"), and on many modern systems it is big enough to avoid putting things on disk most of the time. So this tile file and its cache are really Gimp's own swap. You could of course define a tile cache much bigger than your available RAM; Gimp would use it and the system would do the swapping. Using Gimp's own swap is normally the better solution because 1) you can hope/expect that Gimp accesses it in a manner optimized for its own processing, and 2) only Gimp is affected by that swap, so other apps in the system can continue to use all the RAM they want/need.

A good size for the tile cache is the amount of free RAM you have before you start Gimp (i.e., what is left after booting and starting your favorite apps). On a system like yours I would try 6GB, and be ready to downsize a bit (5GB, 4GB) if the OS starts swapping.
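
As a back-of-the-envelope helper, here is a minimal sketch in plain Python (Linux-only, reads /proc/meminfo; the 1 GB headroom is my own assumption) that turns "free RAM before starting Gimp" into a number for the tile cache preference:

Code:
# Minimal sketch: suggest a tile-cache size from currently available RAM.
# Linux-only; the headroom leaves room so the OS itself does not start swapping.
def suggested_tile_cache_mb(headroom_mb=1024):
    with open("/proc/meminfo") as f:
        info = dict(line.split(":", 1) for line in f)
    available_kb = int(info["MemAvailable"].split()[0])   # value is in kB
    return max(available_kb // 1024 - headroom_mb, 512)

if __name__ == "__main__":
    print("Suggested tile cache: about %d MB" % suggested_tile_cache_mb())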
#15
What I have found somewhat surprising is that the thing that really helped my "path on large image" problem, more than anything else, is to first make a selection large enough to contain the path to be plotted, copy it, and paste it as a new layer. Then plot the path on that layer rather than on the base image, and once done, merge the layer back into the base image. It's remarkable how much system responsiveness and stability improve. I'm speculating that working off the selection allows Gimp to do its business without having to base everything off pixel 1,1 of the large base image?

For example, note in the following that I made a 3200 x 2000 floating layer (shaded red just for emphasis) and then plotted the path on that layer. This same strategy also speeds up bucket-fill operations on a large base image.

[Image: WCftVIt.png]


#16
(02-20-2023, 04:32 AM)rickk Wrote: ...the thing that really helped my "path on large image" problem, more than anything else, is to first make a selection large enough to contain the path to be plotted, copy it, and paste it as a new layer. Then plot the path on that layer rather than on the base image...

So, I've got news for you. You don't even need to use a new layer; just make a selection around the path. This path in a 20000*20000 image:
[Attached image]

With the selection shown, rendering is instantaneous, and it hardly makes a blip in the dashboard, while rendering is a lot more visible and lengthy without a selection:
[Attached image]

Even with something like this:

[Attached image]

Making a selection (first blip) makes a difference:

[Attached image]
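
For anyone who wants to script this, here is a hedged Python-Fu sketch of the same idea: select a rectangle around the path's control points, then stroke. The function name and margin parameter are my own, and it assumes the GIMP 2.10 PDB calls behave as shown:

Code:
# Python-Fu sketch: restrict rendering to a rectangle around a path before
# stroking it, so GIMP only touches a small region of a very large image.
from gimpfu import *

def stroke_path_in_selection(image, drawable, vectors, margin=20):
    # Gather control-point coordinates from every stroke of the path.
    xs, ys = [], []
    _num, stroke_ids = pdb.gimp_vectors_get_strokes(vectors)
    for sid in stroke_ids:
        _t, _n, points, _closed = pdb.gimp_vectors_stroke_get_points(vectors, sid)
        xs.extend(points[0::2])
        ys.extend(points[1::2])
    x0 = max(min(xs) - margin, 0)
    y0 = max(min(ys) - margin, 0)
    w = min(max(xs) + margin, image.width) - x0
    h = min(max(ys) + margin, image.height) - y0
    # Limit the work area, stroke with the current tool settings, clean up.
    pdb.gimp_image_select_rectangle(image, CHANNEL_OP_REPLACE, x0, y0, w, h)
    pdb.gimp_edit_stroke_vectors(drawable, vectors)
    pdb.gimp_selection_none(image)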
#17
Ofnuts said: "So, I've got news for you. You don't even need to use a new layer; just make a selection around the path."


Yeah, that's how the strategy first became apparent to me. I was experiencing really long waits trying to bucket-fill a small area on a really large canvas, and found that using a selection box to contain the work area did away with the long wait. So I tried it with paths and was very impressed with the improvement. It's amazing (to me) the impact this has on Gimp's "grabbiness" for resources.

Going a step further (with the floating layer as a work area) made the process seem easier (to me) to explain here.
Plus, in the way I've been using it most, I've got this really large city map that features rivers running through it. I've been plotting the rivers on a transparent floating layer, erasing the path in the areas where the river flows "under" a bridge, and then pasting the path layer onto the base image.
I've been watching the claimed image "size" as reported at the bottom of the Gimp main window, and what I've typically observed is that an 80 MB *.png file loads claiming a 2.2 GB initial memory size. But after plotting only one path (directly) on that image, the claimed value skyrockets to 3.2 GB. If I instead plot the path as outlined earlier, the claimed image size only goes up by about a hundred MB.
The disparity is striking. Gimp must be extremely aggressive in reserving resources for certain operations?
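
The jump from 80 MB on disk to a couple of gigabytes in memory is mostly just decompression; a minimal back-of-the-envelope sketch (the dimensions below are illustrative, not taken from your map):

Code:
# Why a compressed PNG balloons in RAM: uncompressed size is simply
# width * height * channels * bytes_per_channel, before any undo copies.
width, height = 24000, 16000            # hypothetical large city map
channels, bytes_per_channel = 4, 1      # RGBA at 8-bit precision
size_gib = width * height * channels * bytes_per_channel / 1024.0 ** 3
print("%.2f GiB uncompressed" % size_gib)   # about 1.43 GiB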


#18
(02-20-2023, 07:44 AM)Ofnuts Wrote: ...You don't even need to use a new layer; just make a selection around the path... With the selection shown, rendering is instantaneous, and it hardly makes a blip in the dashboard...

@Ofnuts, thanks for the tip. I'll try this out, because Gimp sometimes crashes when I use many paths on a big picture.
#19
In my experience with Gimp, there are three primary ways that Gimp can "crash randomly":

1. Operating just fine until you perform some procedure that overloads the system, demands unavailable resources, or breaks some rule, and Gimp suddenly goes **poof**, completely collapsing and taking any unsaved work with it.

2. Locking up, but not actually collapsing. All controls and menu selections become inoperative, and your most intuitive course of action is to close the application, at which point you get a pop-up reminding you that you will lose your work if you proceed. Sometimes in this scenario other options work better: I've occasionally gotten lucky and been able to use the task manager to kill a hung Script-Fu or Python-Fu process and restore Gimp to full functionality (a happy ending). FWIW, a problem with 32-bit G'MIC and Qt is a very prominent (and recoverable) example of this. But sometimes you run out of ancillary processes to kill and realize you're going to have to kill Gimp too, and lose it all.

3. But in a third crash scenario, Gimp does something truly extraordinary. Somehow, on its way down, it manages to capture a snapshot of your work at the most recent successful operation prior to the crash, and then, when you reopen Gimp, it offers to restore the lost image.

Why/how can Gimp do this? For what type of operation can it anticipate a looming crash, know to take the snapshot, and know to prompt you that the saved work is available the next time you start Gimp? This latter part has me especially curious whether such a snapshot might be recoverable in the other types of crashes, provided we know where to look for it.

Whether this is possible, and if so where to look, are my questions.

I recall that my most recent experience with #3 was when I stupidly began a text input by inserting a [tab] keystroke before entering the text, and didn't remove it before committing the text to the canvas. Gimp popped up a brief warning about an illegal operation and then went bye-bye without further ado, but happily offered to restore my work on the very next start-up.

But yeah, is there a directory I could look in for a pre-crash snapshot in these other crash instances?


Reply
#20
Here is a copy of the "image salvaged from crash" message.


Attached Files Image(s)
[Attached image]



