clothify granularity
#1
I tried using Python-fu clothify(). It worked the first time and works great.

First I tried it on a smaller image, 1,000 x 1,000 px (viewed in a 6 x 6 in area on my monitor). Then I tried it on a larger image, 2,000 x 2,000 px (still viewed in a 6 x 6 in area on my monitor).

The resulting 'larger' image looks different when viewed in a 6 x 6 in area, but if I zoom in, the 'clothify' effect looks the same. So what I see is that clothify() is image-size agnostic: the effect is applied at a fixed scale per unit of area, not relative to the image size. Is there a way to adjust clothify() so that its effect is spread out over a larger area for a larger image?
#2
(01-25-2018, 06:45 PM)sbhairava72 Wrote: [...] Is there a way to adjust clothify() so that its effect is spread out over a larger area for a larger image?

Clothify is mostly built from random noise, so it has no notion of size. From what I can see, it essentially builds a random bump map and applies it to the picture. The "cloth" aspect is in the eye of the beholder...
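
If you just want the effect to look coarser on a bigger image, one option is to scale the blur radii clothify uses in proportion to the image size. Below is a minimal Python-Fu sketch for the GIMP 2.x console. It assumes the script is exposed in the PDB as pdb.script_fu_clothify(image, drawable, blur_x, blur_y, azimuth, elevation, depth) with defaults of 9, 9, 135, 45, 3 (check the Procedure Browser on your install); the clothify_scaled helper and the 1000 px reference size are just illustrative choices.

# Minimal sketch for the GIMP 2.x Python-Fu console.
# Assumption: clothify is exposed in the PDB as
# pdb.script_fu_clothify(image, drawable, blur_x, blur_y, azimuth, elevation, depth).
from gimpfu import *   # not needed inside the Python-Fu console

def clothify_scaled(image, drawable, reference_size=1000.0, base_blur=9.0):
    # Scale the blur radii with the image, so a 2,000 px image gets roughly
    # the same per-image texture as a 1,000 px image does at the defaults.
    scale = max(image.width, image.height) / reference_size
    blur = base_blur * scale
    pdb.script_fu_clothify(image, drawable, blur, blur, 135, 45, 3)
    gimp.displays_flush()

# Usage (Python-Fu console):
# image = gimp.image_list()[0]
# clothify_scaled(image, image.active_drawable)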

If you want a real cloth texture, see https://www.gimp-forum.net/Thread-Cloth-...easy-steps (make it in black and white and use it as a bump map).
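
For that second route, here is a hedged Python-Fu sketch: load a grayscale cloth texture (made however you like, e.g. with the linked tutorial), scale it to the target image, and apply it with the stock plug_in_bump_map procedure. The apply_cloth_bumpmap helper and the texture path are made-up names, and the parameter order follows the plug_in_bump_map entry in the GIMP 2.x Procedure Browser, so verify it on your install before relying on it.

# Minimal sketch for the GIMP 2.x Python-Fu console. Bump-map arguments
# follow the plug_in_bump_map PDB entry: azimuth, elevation, depth,
# x/y offset, waterlevel, ambient, compensate, invert, map type (0 = linear).
from gimpfu import *   # not needed inside the Python-Fu console

def apply_cloth_bumpmap(image, drawable, texture_path):
    # Load the grayscale cloth texture and fit it to the target image.
    tex_image = pdb.gimp_file_load(texture_path, texture_path)
    pdb.gimp_image_scale(tex_image, image.width, image.height)

    # Use the texture as a bump map on the target drawable.
    pdb.plug_in_bump_map(image, drawable, tex_image.active_drawable,
                         135, 45, 3, 0, 0, 0, 0, True, False, 0)

    # Clean up the temporary texture image.
    pdb.gimp_image_delete(tex_image)
    gimp.displays_flush()

# Usage (Python-Fu console):
# image = gimp.image_list()[0]
# apply_cloth_bumpmap(image, image.active_drawable, "/path/to/cloth-texture.png")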