[split] GIMP crashes randomly
#1
(02-06-2023, 06:15 PM)cjsmall Wrote: My question is: are others experiencing this as well?  

What I have noticed while working with a 16,000 x 11,000 canvas is that some operations take an especially long time to conclude, making Gimp appear "locked", although it's actually chuffing away.

For example, a bucket fill operation, even within a relatively small target area of the larger picture, say only 150 x 200 pixels, can take 45 seconds to actually paint the target. My guess is that's because Gimp has to calculate its task relative to pixel 1,1 of the graphic, so there is a lot of heavy lifting going on under the covers, despite the appearance of a simple task. Paths on really large canvases take a long time to process as well.

In fact, I've been getting a surprising number of outright crashes when stroking a path on large canvases, more than I've ever experienced with Gimp before, which is heartbreaking when considerable work in progress just goes "poof" along with it.
What I've done is modify my workflow: I perform all the other "non-path" tasks as a group, save that as interim work, and only then embark on any extensive path work. Saving paths for last has eliminated a lot of frustration.


I've been knitting together "tiles" of an old map that I found online in a multi-page PDF, importing them into Gimp at 600 PPI and touching up the join areas. It was going so well that I decided to expand my work, ultimately enlarging the canvas to 16,000 x 14,000, whereupon Gimp got REALLY crash-prone: it crashed half a dozen times just this afternoon. But having learned the hard way what to expect, I now save a copy before plotting any path, just to be on the safe side.

Probably need to buy a new laptop; my current machine is maxed out at 8 GB RAM.


#2
Created a 16,000 x 11,000 image and used my truchet path script to generate a path on a 40px grid (so that's about 6.6M linear pixels long). Rendering runs for 8'42" with one processor maxing out at 4.2 GHz (with the nice little "burn" symbol over the processor temperature indicator :D ), while it doesn't seem to be using much memory.
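As a quick sanity check on that "6.6M linear pixels" figure, here is a back-of-envelope sketch. The two-arcs-per-tile geometry is my assumption about a typical truchet pattern, not something stated about the script, so treat the result as ballpark only:

```python
# Back-of-envelope check of the path length quoted above. Assumption
# (mine, not from the post): each 40px truchet tile carries two
# quarter-circle arcs of radius 20px (half the grid spacing).
import math

width, height, grid = 16_000, 11_000, 40
tiles = (width // grid) * (height // grid)    # 400 * 275 = 110,000 tiles
arc_per_tile = 2 * (math.pi * grid / 4)       # two quarter circles, r = grid/2
total_px = tiles * arc_per_tile               # roughly 6.9M linear pixels
print(f"{tiles} tiles, ~{total_px / 1e6:.1f}M linear pixels")
```

The estimate lands in the same ballpark as the quoted 6.6M, which is as close as this kind of arithmetic can be expected to get.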

Ran the thing several times (tried to see if OpenCL would change something) and Gimp survived.

I do have random crashes, but they seem related to clipboard or drag operations.
#3
That's what is so maddening about it. It lulls you into a false sense of security with dozens of flawless operations, and then suddenly, when you least expect it... *poof*.

I lost 6 hours of work in one such instance a few days ago (only myself to blame, but I was still livid), so I started saving in incremental steps, just to be proactive. In one such instance yesterday I had about 3 hours of work in progress and was just about to plot a tiny little path when I thought, "whoops, you'd better save first just in case," so I saved a worktemp.xcf out of caution. Sure enough, on the very next step, as I plotted the tiny little path, everything went *poof*, and I felt vindicated for having thought to save first.
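That "save a worktemp.xcf before risky path work" habit can be mechanized. Here is a small illustrative helper; the file names and numbering scheme are mine, not anything from the thread:

```python
# Hypothetical helper in the spirit of the incremental-save habit
# described above: copy the current .xcf to the next free numbered
# backup before any risky path work. Names are illustrative.
import shutil
from pathlib import Path

def backup_before_paths(xcf_path: str) -> Path:
    """Copy xcf_path to the next free <stem>_backupNNN.xcf and return it."""
    src = Path(xcf_path)
    n = 1
    while (dst := src.with_name(f"{src.stem}_backup{n:03d}{src.suffix}")).exists():
        n += 1
    shutil.copy2(src, dst)
    return dst
```

Calling it before each path stroke leaves a numbered trail (work_backup001.xcf, work_backup002.xcf, ...) so a crash only costs the steps since the last call.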

It's really rare that I have Gimp either seize up or go "poof", and that's over years of use.

Which makes my experiences this past week, and particularly yesterday, all the more noteworthy. I was beside myself in negative awe, wondering what had gone south.

FWIW, it was a 126 MB XCF file that ultimately exported as a 65 MB PNG, a 28 MB GIF, and a 20 MB JPG, the finished work being a composite of numerous 4,000 x 2,000 "tiles", each approx 5 MB in size.

And there was just something about expanding the canvas from 16,000 x 11,000 to 16,000 x 14,000 that made work performed after the enlargement especially precarious.

Bear in mind, this is not a "complaint" per se, just an observation made particularly relevant after coming here and reading this thread title.

Just as a benchmark, plotting the following path, 224 pixels wide, took 28 seconds before the results were painted on the canvas. I could have been performing other tasks on my PC at the time, but Gimp itself was pretty much preoccupied with the procedure (FWIW).

[Image: Q7KCD3t.png]


#4
(02-11-2023, 04:15 PM)rickk Wrote: Just as a benchmark, plotting the following path, 224 pixels wide, took 28 seconds before the results were painted on the canvas. I could have been performing other tasks on my PC at the time, but Gimp itself was pretty much preoccupied with the procedure (FWIW).

[Image: Q7KCD3t.png]

8.75 s for me. Possibly a better-spec'ed PC (Core i5-9400H), but not outlandish either.
#5
Just meandering a bit, but that listing on the bottom rail of the application reads "Background (2.8 GB)". I assume that means Gimp has reserved 2.8 GB of RAM for the project?

And if I subsequently upsize the canvas to 20,000 x 20,000, that listing grows to "Background (3.4 GB)".

So, what I'm wondering: if I start out with a 16,000 x 14,000 (2.8 GB) canvas, put a couple hundred megabytes of content on it (plus any accompanying "undo" data Gimp maintains in the background), then expand the canvas to 20,000 x 20,000 (3.4 GB) and continue adding more content... could I be forcing Gimp's memory management into a "fragmented" state?

Similar to how one gets fragmented files on a hard drive, you know, with the "flag" bit? Could Gimp be forced into some unusual acrobatics to manage fragmented memory content? (Just an ignorant question, but that is what really got me started on this tangent.)
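For a sense of scale behind those status-bar numbers, a rough sketch of the raw pixel-buffer arithmetic. The 4 bytes/pixel (8-bit RGBA) figure is my assumption; the status-bar readout clearly covers more than one full-size buffer (projection, undo copies, etc.), so the numbers below are only lower bounds:

```python
# Rough arithmetic behind the "Background (2.8 GB)" readout discussed
# above, assuming 8-bit RGBA (4 bytes per pixel). The readout covers
# more than one full-size buffer, so this is a lower bound per copy.
def raw_buffer_gb(width: int, height: int, bytes_per_px: int = 4) -> float:
    """Size of one uncompressed full-canvas buffer, in GiB."""
    return width * height * bytes_per_px / 1024**3

for w, h in [(16_000, 14_000), (20_000, 20_000)]:
    print(f"{w:>6} x {h:<6}: {raw_buffer_gb(w, h):.2f} GiB per buffer")
```

A single 16,000 x 14,000 buffer comes out around 0.83 GiB, so the 2.8 GB readout is consistent with Gimp holding roughly three such copies rather than with any fragmentation effect.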


#6
I don't think fragmented memory is a problem, at least as long as everything fits in RAM. Your problem is possibly that Gimp starts overflowing its tile cache and starts using its own swap, or makes your computer swap, and then you have disk I/O added to the picture. If you have an SSD this may not be too obvious (but it will wear out the disk in the long run). If you have an HDD the slow-down is more noticeable.
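The overflow condition described above can be sketched as a simple inequality. The buffer count and cache sizes below are illustrative guesses, not measurements from anyone's machine:

```python
# Sketch of the tile-cache overflow scenario described above: once the
# working set (all full-size buffers: layers, projection, undo copies)
# exceeds the tile-cache size, GIMP spills tiles to its swap file and
# disk I/O enters the picture. Buffer count and cache size are guesses.
def spills_to_swap(width: int, height: int, buffers: int,
                   cache_gib: float, bytes_per_px: int = 4) -> bool:
    working_set = width * height * bytes_per_px * buffers
    return working_set > cache_gib * 1024**3

# A 16,000 x 14,000 canvas with four full-size buffers (~3.3 GiB)
# overflows a 2 GiB cache but fits in an 8 GiB one:
print(spills_to_swap(16_000, 14_000, buffers=4, cache_gib=2))   # True
print(spills_to_swap(16_000, 14_000, buffers=4, cache_gib=8))   # False
```

Which is why the same operation can be instant on a machine with a generous tile cache and crawl (or thrash) on an 8 GB laptop.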
#7
Anything's possible, I suppose, although I consider myself to be fairly "swapper" aware. Any time I feel the system bog down, I intuitively look at the drive activity light to see if memory is being swapped out.

Sometimes I think that just managing my workflow with some semblance of intelligence is the biggest help. Or put another way, sometimes I dig myself into a hole, and only once trapped do I realize, "what else could you expect?"

A robust new laptop would be a big help too.


#8
For reference, just now, on a fresh boot with no other applications running, I opened a 10,800 x 14,400 *.png file, 93 MB in size.

Very first operation, I attempted to stroke a path 42 pixels wide by 3,600 pixels long... and *poof*, all gone!

Doubtful it was a "memory depleted" scenario, and this is on a machine that otherwise runs stable all day long.

I then reopened Gimp and successfully completed the very same operation without incident. So while my experience fails the "reproducible" litmus test, it does appear there is some "paths on large images" stability issue.

The purpose of this post is to document, not to gripe, so please don't misinterpret my intent. :)

But there is something to this.


Asus laptop
Intel i3 processor, dual core, 2.3 GHz
8 GB RAM
64-bit Knoppix OS
32-bit Gimp

No other stability issues apparent


#9
(02-12-2023, 04:48 AM)rickk Wrote: For reference, just now, on a fresh boot with no other applications running, I opened a 10,800 x 14,400 *.png file, 93 MB in size.

Very first operation, I attempted to stroke a path 42 pixels wide by 3,600 pixels long... and *poof*, all gone!

[...]

Asus laptop
Intel i3 processor, dual core, 2.3 GHz
8 GB RAM
64-bit Knoppix OS
32-bit Gimp

How much swap space have you got on the system? What is your tile cache size? I ask because you are on the low side of RAM, and modern OSes use a lazy allocation strategy: allocation requests get a positive answer, and actual allocation happens when the process actually tries to use that memory. If there is no memory available, the OS kills a process (often, but not always, the requester) to recover some memory (the "OOM killer").

So your sudden deaths could have more to do with low RAM than with stray pointers. Running Gimp in a console and looking for the messages when it dies could provide some info (as would looking at the system logs).

Edit: Something that escaped me: you are running 32-bit Gimp... so you are normally limited to 4 GB of RAM in the process (but then, why 32-bit Gimp?)
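To put that 4 GiB limit in canvas terms, a small sketch. The 4 bytes/pixel assumption is mine; the point is that even the theoretical ceiling for a single buffer is not far above the canvas sizes in this thread, and Gimp needs several copies:

```python
# What the 4 GiB address-space limit of a 32-bit process means in
# canvas terms, assuming 4 bytes per pixel (8-bit RGBA): one full-size
# buffer could in theory reach a 32,768 px square, but since GIMP keeps
# several copies (projection, undo), the practical ceiling is far lower.
import math

addr_space = 4 * 1024**3              # 32-bit process limit, in bytes
max_pixels = addr_space // 4          # pixels in ONE RGBA buffer
side = math.isqrt(max_pixels)         # side of the largest square canvas
print(side)                           # 32768
```

A 16,000 x 14,000 image with three or four live buffers is therefore already brushing against what a 32-bit process can address, which fits the low-RAM / OOM explanation above.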
#10
(02-12-2023, 10:13 AM)Ofnuts Wrote: Edit: Something that escaped me: you are running 32-bit Gimp... so you are normally limited to 4 GB of RAM in the process (but then, why 32-bit Gimp?)

Boy, that would be quite a tangent; let's just say that I run a dual-boot rig and have 64-bit Gimp on my Windows partition. So any time I expect to work with an image larger than 4 GB, I boot Windows. :)

32-bit Gimp seems perfectly adequate 99.9% of the time for the types of things I usually do with Gimp.

I really don't think I was anywhere near 4 GB of active RAM in my most recent experience. It's not really that big a deal, and I'm not looking for conflict; I just thought it was worth a mention.

I just plan my work so that I don't have a lot of unsaved work whenever I intend to plot paths on large images, and everything is fine. It's a very small subset of a subset of my overall experience with Gimp, so I'm not losing any sleep over it.


Not nearly as much sleep as some appear to be losing trying to get legacy applications working with the latest releases of Linux, anyway. (<----- only a small amount of sarcasm) ;)

