About

The splotch filter decomposes an image into averaged square regions of varying size. Large clumps receive fewer iterations, producing broad color washes. Small clumps receive disproportionately more iterations, preserving localized detail. The distribution follows count(s) = min + (max − min) × t^p, where t is a normalized position between 0 and 1. The exponent p controls curve steepness. At p = 1 the distribution is linear. At p = 3 (default) the smallest clumps dominate heavily, creating a characteristic organic texture.
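
A minimal sketch of the per-level iteration count, assuming the formula above (the function name and rounding are illustrative, not the tool's actual source):

    // Iteration count for clump size s, per count(s) = min + (max - min) * t^p.
    // t runs from 0 at the largest clump (s = size) to 1 at the smallest (s = 1).
    function iterationCount(s: number, size: number,
                            min: number, max: number, p: number): number {
      const t = size === 1 ? 1 : (size - s) / (size - 1);
      return Math.round(min + (max - min) * Math.pow(t, p));
    }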

Getting the parameters wrong produces either a blurred mess (too few small iterations) or an unchanged image (too few large iterations). This tool exposes the full parameter space with real-time preview so you can dial in the exact abstraction level. Processing runs in a Web Worker so large images (up to 4096×4096 px) do not freeze the browser. Note: results are non-deterministic. Each run produces a unique output due to random region placement.
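
The sketch below shows one way the worker hand-off could look; the worker file name and message shape are assumptions for illustration, not the tool's actual API:

    // Offload filtering to a Web Worker and transfer the pixel buffer instead of
    // copying it, which matters at 4096×4096. "splotch-worker.js" is hypothetical.
    const canvas = document.querySelector("canvas")!;
    const ctx = canvas.getContext("2d")!;
    const img = ctx.getImageData(0, 0, canvas.width, canvas.height);
    const worker = new Worker("splotch-worker.js");
    worker.postMessage(
      { pixels: img.data.buffer, width: canvas.width, height: canvas.height },
      [img.data.buffer],                       // transfer list: zero-copy hand-off
    );
    worker.onmessage = (e: MessageEvent<ArrayBuffer>) => {
      const out = new ImageData(new Uint8ClampedArray(e.data), canvas.width, canvas.height);
      ctx.putImageData(out, 0, 0);             // draw the filtered result
    };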

Formulas

For each clump size level s starting from size and stepping down by step to a minimum of 1, the iteration count is computed as:

count(s) = min + (max − min) × t^p

where the normalized parameter t is:

t = (size − s) / (size − 1)

At each iteration, a random position (x, y) is chosen uniformly. All pixels within the s × s square anchored at that position are averaged:

R̄ = (1 / s²) × Σ Rᵢ,  i = 1 … s²

The same averaging is performed independently for each channel (R, G, B). Alpha is preserved. Every pixel in the square is then set to the averaged color, producing the characteristic flat-patch appearance.

Where: size = maximum clump side length, min = iterations at largest clump, max = iterations at smallest clump, p = exponent controlling curve shape, step = size decrement between levels, t = normalized position in [0, 1].
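
Putting the pieces together, a sketch of the whole pass might look like this (operating in place on RGBA pixel data; all names are illustrative, and per the About section the real implementation runs inside a Web Worker):

    // One full splotch pass: for each clump size level, average and flatten
    // count(s) randomly placed s×s squares. Alpha (px[k + 3]) is left untouched.
    function splotch(px: Uint8ClampedArray, w: number, h: number,
                     size: number, min: number, max: number, p: number, step: number): void {
      for (let s = size; s >= 1; s -= step) {
        const t = size === 1 ? 1 : (size - s) / (size - 1);
        const count = Math.round(min + (max - min) * Math.pow(t, p));
        for (let n = 0; n < count; n++) {
          // Uniform random anchor, chosen so the square stays inside the image.
          const x = Math.floor(Math.random() * (w - s + 1));
          const y = Math.floor(Math.random() * (h - s + 1));
          let r = 0, g = 0, b = 0;
          for (let j = 0; j < s; j++)
            for (let i = 0; i < s; i++) {
              const k = 4 * ((y + j) * w + (x + i));
              r += px[k]; g += px[k + 1]; b += px[k + 2];
            }
          const area = s * s;
          r = Math.round(r / area); g = Math.round(g / area); b = Math.round(b / area);
          for (let j = 0; j < s; j++)
            for (let i = 0; i < s; i++) {
              const k = 4 * ((y + j) * w + (x + i));
              px[k] = r; px[k + 1] = g; px[k + 2] = b;   // flat averaged patch
            }
        }
      }
    }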

Reference Data

Parameter        | Symbol | Default | Range       | Effect
-----------------|--------|---------|-------------|----------------------------------------------------------------
Max Clump Size   | size   | 4       | 2 - 64      | Side length of the largest averaged square
Min Iterations   | min    | 1000    | 100 - 50000 | Iterations at the largest clump size
Max Iterations   | max    | 9000    | 100 - 50000 | Iterations at the smallest clump size (1×1 or step-determined)
Exponent         | p      | 3       | 0.1 - 10    | Curve steepness. Higher = more weight on small clumps
Step             | step   | 1       | 1 - 16      | Decrement between clump sizes
Exponent 1       | -      | -       | -           | Linear distribution: equal weight all sizes
Exponent 2       | -      | -       | -           | Quadratic: moderate small-clump bias
Exponent 3       | -      | -       | -           | Cubic: strong small-clump bias (recommended)
Exponent 5       | -      | -       | -           | Quintic: extreme detail preservation
Exponent 0.5     | -      | -       | -           | Square root: large-clump bias, very abstract
Step 1           | -      | -       | -           | Every size level processed: smoothest gradient
Step 2           | -      | -       | -           | Skip every other level: faster, blockier
Size 2           | -      | -       | -           | Minimal abstraction: subtle mosaic
Size 8           | -      | -       | -           | Moderate abstraction: painterly feel
Size 16          | -      | -       | -           | Heavy abstraction: large color blocks
Size 32          | -      | -       | -           | Extreme: nearly impressionist blobs
Size 64          | -      | -       | -           | Maximum: coarse abstract tiles
Total Iterations | -      | -       | -           | Sum of all size-level iterations. Higher = slower
Image Limit      | -      | 4096px  | -           | Max dimension. Larger images auto-scaled down
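
As a worked example, plugging the defaults (size = 4, min = 1000, max = 9000, p = 3, step = 1) into the count formula gives roughly 1000, 1296, 3370, and 9000 iterations for sizes 4, 3, 2, and 1, about 14,700 in total; exact values depend on rounding.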

Frequently Asked Questions

What does the exponent p actually control?
The exponent p controls the distribution curve between large-clump and small-clump iterations. At p = 1, iteration counts increase linearly from min to max, producing a balanced abstraction. At p = 3, the curve is cubic, meaning most iterations concentrate on the smallest clumps. This preserves fine detail while still allowing large washes. Values below 1 (e.g., 0.5) invert the bias toward large clumps, creating highly abstract outputs with minimal fine structure.
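
To make the curve concrete: halfway through the size range (t = 0.5), with min = 1000 and max = 9000, the formula gives 5000 iterations at p = 1 but only 2000 at p = 3 (1000 + 8000 × 0.125), pushing most of the iteration budget toward the smallest clumps.
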
Why does each run produce a different output?
Each iteration selects a random (x, y) position using a uniform pseudorandom generator. Since the positions differ between runs, different pixel regions get averaged, producing unique outputs. This is inherent to the stochastic nature of the algorithm. If you need reproducibility, save the output image rather than relying on re-running the same parameters.
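
For illustration only: reproducible placement would require a seeded generator in place of Math.random(). The tool does not expose a seed, but a sketch using the well-known mulberry32 PRNG looks like this:

    // mulberry32: a tiny seeded PRNG. The same seed yields the same sequence of
    // positions, and therefore the same splotches. Hypothetical; not a tool feature.
    function mulberry32(seed: number): () => number {
      return () => {
        seed = (seed + 0x6d2b79f5) | 0;
        let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
        t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
        return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
      };
    }
    const rand = mulberry32(42);           // same seed, same sequence
    const width = 1024;                    // example image width
    const x = Math.floor(rand() * width);  // deterministic "random" position
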
What happens if the size range is not evenly divisible by the step?
If size − 1 is not evenly divisible by step, the algorithm simply stops when the next decrement would go below 1, so the smallest processed clump may be larger than 1×1. For example, size = 7 with step = 3 processes sizes 7, 4, 1, but size = 6 with step = 4 processes only 6 and 2, producing coarser output. Use step = 1 for maximum smoothness.
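
A small sketch (illustrative names) makes the level selection easy to check:

    // Enumerate the clump sizes a given (size, step) pair actually processes.
    function levels(size: number, step: number): number[] {
      const out: number[] = [];
      for (let s = size; s >= 1; s -= step) out.push(s);
      return out;
    }
    levels(7, 3);  // [7, 4, 1]: reaches the 1×1 level
    levels(6, 4);  // [6, 2]:    stops at 2×2; the next decrement would go below 1
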
How long does processing take on large images?
Processing time scales with total iterations multiplied by the average clump area. A 4096×4096 image with 50000 max iterations at size 32 involves billions of pixel writes. The Web Worker prevents UI freezing, but expect processing times of 10-60 seconds for large images with high iteration counts. Smaller images (512×512) complete in under a second. Resolution does not affect quality per se, but more pixels require more iterations to achieve equivalent coverage.
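
A rough cost model, assuming the count formula above (real timings also depend on the JavaScript engine and memory traffic):

    // Estimated pixel writes for one pass: sum over levels of count(s) × s².
    function estimatedWrites(size: number, min: number, max: number,
                             p: number, step: number): number {
      let total = 0;
      for (let s = size; s >= 1; s -= step) {
        const t = size === 1 ? 1 : (size - s) / (size - 1);
        total += Math.round(min + (max - min) * Math.pow(t, p)) * s * s;
      }
      return total;
    }
    estimatedWrites(32, 1000, 50000, 3, 1);  // grows quickly with size and max
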
Can min iterations be set higher than max iterations?
Yes. If min > max, the distribution inverts: the largest clumps receive the most iterations and the smallest clumps receive the fewest. This is mathematically valid and produces an unusual effect where broad areas are heavily blended but fine structure is largely untouched. It creates a distinctive look different from simply lowering the exponent.

Is the averaging done in sRGB or in linear light?
Averaging operates in sRGB space directly on the 8-bit channel values. This is computationally fast but not perceptually linear. In practice, sRGB averaging produces slightly darker midtones compared to linear-light averaging. For artistic purposes, the difference is negligible and the sRGB approach matches the behavior of the original clumpomatic implementation.
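
The difference is easy to demonstrate with the standard sRGB transfer functions (illustrative code, not the tool's source):

    // Average black and white directly in sRGB vs. in linear light.
    const toLinear = (c: number): number => {
      const v = c / 255;
      return v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
    };
    const toSrgb = (v: number): number =>
      Math.round(255 * (v <= 0.0031308 ? 12.92 * v : 1.055 * Math.pow(v, 1 / 2.4) - 0.055));

    const srgbAvg = (0 + 255) / 2;                               // 127.5 (what the tool does)
    const linearAvg = toSrgb((toLinear(0) + toLinear(255)) / 2); // ≈ 188 (perceptually brighter)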