By Meghna Gangwar and Vinayak Baranwal

You have a 4MB photo that a web form rejects because it enforces a 500KB limit, or a folder of 300 PNGs that is bloating a deployment package. On Linux, you fix both in the terminal without touching a GUI. The standard tools are ImageMagick (convert, mogrify), jpegoptim (JPEG), pngcrush and optipng (PNG), and cwebp (WebP). GIMP covers the same goals through its export dialog when you prefer a visual workflow.
Install the packages from your distribution once, then run commands on copies or versioned trees. Later sections separate lossy compression from lossless optimization and document batch workflows for compressing images on Linux.
The core commands at a glance:

- jpegoptim --max=85 --strip-all applies lossy JPEG compression and removes EXIF, IPTC, and ICC metadata in one pass, which often shrinks files sharply for web use.
- jpegoptim --size=100k targets an approximate output size in kilobytes (for example 50k for about 50KB), useful when an upload form or API enforces a hard limit and you accept more quality loss.
- optipng -o5 performs lossless PNG optimization and does not change pixel values, which makes it safe for icons, screenshots, and brand assets.
- pngcrush -brute input.png output.png tries many filter/compression combinations and writes the smallest output; it is slower than optipng but can squeeze extra space from stubborn PNGs.
- mogrify -quality 85 *.jpg batch-rewrites JPEGs in place in the current directory, so copies or version control are mandatory before you run it.
- cwebp -q 80 converts JPEG or PNG input to WebP; lossy WebP often beats JPEG size at similar appearance, and -lossless preserves PNG-like fidelity.
- jpegoptim and optipng are usually faster for bulk JPEG or PNG jobs.

When a file is too large for an upload form, too slow to serve over mobile, or bloating a repository, the goal is the same: get the same usable image into fewer bytes. On Linux, you do that with purpose-built CLI tools, shell automation, or a GUI such as GIMP, depending on whether you are working on a single file or an entire directory of assets.
Resizing changes pixel dimensions (width and height). Smaller width and height mean fewer pixels, so file size usually drops even before you tune compression. Use resizing when the display size is fixed (thumbnails, fixed-width layouts) or when the source is far larger than you will ever show.
Compression keeps the same dimensions while reducing bytes by changing how pixel data is stored (for example stronger JPEG quantization), by stripping metadata, or by using a more efficient codec. When people ask how to compress an image from the Linux command line, they usually mean this kind of change, not a geometry change.
The two goals are easy to blur. If the goal is “smaller file, same pixel dimensions,” focus on compression and metadata. If the goal is “smaller file because we only need a 400px preview,” resize first, then compress.
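For example, a resize-then-compress pass for a 400px preview might look like this (a sketch assuming ImageMagick is installed; photo.jpg and preview.jpg are placeholder names):

```shell
# Resize to 400px wide first (height follows the aspect ratio), then
# apply JPEG quality 80 to the much smaller pixel grid.
convert photo.jpg -resize 400x -quality 80 preview.jpg

# Compare byte counts of the original and the preview.
stat -c %s photo.jpg preview.jpg
```

Resizing before compressing usually dominates the savings, because the encoder has far fewer pixels to store in the first place.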
Lossy compression discards information people may not notice at normal viewing size. JPEG is inherently lossy; tools such as jpegoptim --max=… or convert -quality … trade visual fidelity for smaller files.
Lossless compression preserves every pixel exactly. PNG optimizers (optipng, pngcrush) re-pack the same pixels into a smaller stream. Stripping metadata with convert -strip on PNG is also lossless with respect to pixels.
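One way to confirm that an operation really was lossless is to decode both files to raw pixels and compare checksums (a sketch assuming ImageMagick; input.png and output.png are placeholder names):

```shell
# Decode each PNG to raw RGBA bytes and hash them; identical digests
# mean identical pixels even when the files differ in size.
convert input.png rgba:- | md5sum
convert output.png rgba:- | md5sum
```

If the two digests match, the optimizer changed only the container, not the image.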
| Method | Tools | Trade-off |
|---|---|---|
| Lossy | jpegoptim --max=…, convert -quality …, cwebp -q …, ffmpeg with AV1/AVIF | Much smaller files; quality can suffer if you push settings too far. |
| Lossless | optipng, pngcrush -brute, jpegoptim without quality reduction (optimize Huffman only), cwebp -lossless | Smaller files with identical pixels; PNG gains are often modest compared with aggressive lossy JPEG. |
You need terminal access on a Linux system and permission to install packages. If you are new to the shell, see How To Work With the Linux Shell for navigation and session basics. For general Linux concepts, An Introduction to Linux Basics is a useful companion.
On Ubuntu and Debian-based systems, update the package index, then install ImageMagick, JPEG and PNG optimizers, WebP utilities, and GIMP in one step.
# Refresh package lists, then install image tools and GIMP (GUI).
sudo apt update
sudo apt install imagemagick jpegoptim pngcrush optipng webp gimp
You should see Setting up … lines for each package and no errors. A short excerpt of typical apt output looks like this:
Setting up imagemagick (8:6.9.x-xx) ...
Setting up jpegoptim (1.4.x-xx) ...
Setting up pngcrush (1.8.x-xx) ...
Setting up optipng (0.7.x-xx) ...
Setting up webp (x.x.x-xx) ...
Setting up gimp (2.10.xx-xx) ...
On Fedora, RHEL, and related distributions, use dnf:
# Install parallel packages: ImageMagick, optimizers, WebP CLI tools, GIMP.
sudo dnf install ImageMagick jpegoptim pngcrush optipng libwebp-tools gimp
On Arch Linux:
# Install ImageMagick, optimizers, libwebp (includes cwebp), and GIMP.
sudo pacman -S imagemagick jpegoptim pngcrush optipng libwebp gimp
These packages install the current stable builds from each distribution’s repositories, not necessarily the latest upstream release. After installing, confirm versions with each tool’s --version flag (for example convert -version, jpegoptim --version).
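A quick way to confirm that everything landed on PATH is to loop over the expected binaries (a sketch; the list mirrors the packages installed above):

```shell
# Report any tool that is not on PATH after installation.
for t in convert mogrify jpegoptim pngcrush optipng cwebp gimp; do
  command -v "$t" >/dev/null 2>&1 || echo "missing: $t"
done
```

No output means every tool resolved; any "missing:" line tells you which package to revisit.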
For Python-centric workflows, How To Process Images in Python with Pillow covers resizing and saving with quality settings from application code.
JPEG compression tools handle single files, strict size targets, and metadata removal. The sections below cover jpegoptim, which accepts a kilobyte cap directly, and ImageMagick’s convert command.
jpegoptim recompresses JPEGs and can strip metadata. Install it via the Prerequisites section (Debian/Ubuntu: included in jpegoptim package).
# --max=85 caps quality at 85 (lossy). --strip-all removes EXIF/IPTC/ICC.
# photo.jpg is overwritten in place unless you use --dest (see note below).
jpegoptim --max=85 --strip-all photo.jpg
- --max=85: upper bound on quality; lower values yield smaller files and more visible artifacts. The value 85 is a practical default for photographic web content because most viewers cannot distinguish it from the original at normal screen sizes. For profile photos or thumbnails where fine detail matters less, 75 is a reasonable floor. For print-quality archival output, stay at 90 or above.
- --strip-all: drops EXIF, IPTC, and ICC metadata blocks. On a photo taken with a smartphone, metadata alone can account for 200KB or more of the total file size, so stripping it is often the highest-return action before any quality reduction.

One important behavior to understand: if the source JPEG is already encoded at a quality lower than your --max value, jpegoptim will not re-encode it upward. Running jpegoptim --max=85 on a file that was saved at quality 70 leaves the file unchanged. This protects already-compressed files from accidental re-encoding but means you should verify the source quality before expecting results.
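To check that source quality first, ImageMagick's identify can report the quality a JPEG appears to have been saved at; a guarded sketch (photo.jpg is a placeholder name):

```shell
# %Q prints ImageMagick's estimate of the JPEG's saved quality.
q=$(identify -format '%Q' photo.jpg)
echo "estimated source quality: $q"

# Re-encode only when the source quality is above the target cap.
if [ "$q" -gt 85 ]; then
  jpegoptim --max=85 --strip-all photo.jpg
fi
```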
Check size before and after:
# List human-readable file size before optimization (example).
ls -lh photo.jpg
-rw-r--r-- 1 user user 1.2M photo.jpg # before
# Re-run list after jpegoptim.
ls -lh photo.jpg
-rw-r--r-- 1 user user 312K photo.jpg # after jpegoptim --max=85 --strip-all
Typical jpegoptim completion lines look like:
photo.jpg 3120x4208 24bit EXIF ICC APP12 [OK] 1248473 --> 319204 bytes (74.43%), optimized.
jpegoptim overwrites the original file by default. Keep backups, or write outputs elsewhere with --dest=dir (see the batch section).
To compress a JPEG to a specific target size such as 50KB or 100KB, use the --size flag:
# Target about 100KB; jpegoptim adjusts quality to approach the limit.
jpegoptim --size=100k photo.jpg
# Target about 50KB; expect stronger loss on detailed photos.
jpegoptim --size=50k photo.jpg
--size accepts a trailing k for kilobytes or a plain integer for bytes. On photographic images, targets far below ~20KB often show obvious blockiness because JPEG must drop high-frequency detail.
Example status line:
photo.jpg 2048x1365 24bit [OK] 2159834 --> 51234 bytes (97.63%), optimized.
ImageMagick’s convert command reads an input, applies options, and writes a new file. For JPEG compression workflows with ImageMagick:
# -quality sets JPEG quality roughly 1--100 (85 is a common starting point).
convert input.jpg -quality 85 output.jpg
# -strip removes profiles and comments; combine with -quality for smaller files.
convert input.jpg -strip -quality 85 output.jpg
- -quality: higher values preserve more detail and produce larger files.
- -strip: removes metadata and color profiles embedded in the source.

Before and after:
ls -lh input.jpg output.jpg
-rw-r--r-- 1 user user 1.1M input.jpg
-rw-r--r-- 1 user user 380K output.jpg
For a dedicated install walkthrough, see How To Install ImageMagick on Ubuntu.
PNG optimizers keep pixel data intact while reducing file size through lossless encoding and metadata removal.
Install pngcrush with the system package manager. On Debian/Ubuntu the command is apt install (or apt-get install with a hyphen), not apt get install:
# Install pngcrush on Debian/Ubuntu.
sudo apt install pngcrush
# -brute tries many zlib/filter strategies; output.png is the smallest found.
pngcrush -brute input.png output.png
-brute exercises many combinations; runs take longer than a single-pass optimizer but often yield the smallest lossless file.
Sample log lines:
Trying filter 0, level 9, strategy 1 ... IDAT length = 982341
Trying filter 5, level 9, strategy 0 ... IDAT length = 871204
Best PNG: IDAT length = 871204
Compare sizes:
ls -lh input.png output.png
pngcrush vs optipng. Use optipng by default. It is faster than pngcrush and produces comparable results for most PNG files. Use pngcrush -brute when optipng has already run and you still need to squeeze more bytes out, or when you are optimizing a small set of files where CPU time is not a constraint. On a typical web PNG, optipng -o5 reduces file size by 10 to 20 percent. pngcrush -brute on the same file may find an additional 2 to 5 percent. The tradeoff is that pngcrush -brute can take 10 to 30 times longer per file than optipng -o5.
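A two-stage pass that keeps the pngcrush result only when it actually wins might look like this (image.png is a placeholder name; assumes both tools are installed):

```shell
# Fast lossless pass first, in place.
optipng -o5 image.png

# Brute-force trial into a separate file.
pngcrush -brute image.png image.crushed.png

# Keep the crushed copy only if it beat the optipng result.
if [ "$(stat -c %s image.crushed.png)" -lt "$(stat -c %s image.png)" ]; then
  mv image.crushed.png image.png
else
  rm image.crushed.png
fi
```

This ordering spends the expensive -brute run only on files where a few extra percent matters.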
Install optipng from your distribution (optipng package on Debian/Ubuntu, already included in the prerequisite install list).
# Install on Debian/Ubuntu if not already present.
sudo apt install optipng
# -o5 raises effort (0--7). Higher levels try more trials; 5 is a practical max for many jobs.
optipng -o5 input.png
-o5 increases CPU time versus the default -o2 but can improve lossless DEFLATE packing for web PNGs.
Example summary output:
Optimizing input.png
input.png: 1024x768, 3x8 bits/pixel, RGB
Input IDAT size = 512384 bytes
Input file size = 519883 bytes (3.61 bpp)
Trying:
zc = 9 zm = 8 zs = 1 f = 5 IDAT size = 441203
...
Output IDAT size = 441203 bytes (3.11 bpp)
Total reduction: 16.5% --> output file size is 448512 bytes.
By default optipng replaces input.png in place. To keep the original path intact, send output elsewhere: optipng -o5 input.png -out output.png.
PNG compression in ImageMagick remains lossless at the pixel level when you do not change depth or colors.
# Remove metadata only; pixels unchanged.
convert input.png -strip output.png
# -strip plus maximum zlib-style PNG compression level (0--9).
convert input.png -strip -define png:compression-level=9 output.png
- -strip: removes ancillary chunks and profiles.
- png:compression-level=9: asks for the strongest DEFLATE effort ImageMagick will apply for PNG.

Compare sizes:
ls -lh input.png output.png
Folder-scale jobs benefit from glob patterns, mogrify, or find so you can compress an entire directory without editing each file manually.
# All .jpg in the working directory: lossy quality cap and metadata strip.
jpegoptim --max=85 --strip-all *.jpg
# Preserve originals: write optimized copies under compressed/.
mkdir -p compressed
jpegoptim --max=85 --strip-all --dest=compressed *.jpg
With --dest, jpegoptim leaves the sources untouched and places new files under the target directory.
mogrify applies the same operation to many files in place; prefer convert when you need distinct output names.
# Rewrite every JPEG in the current directory with quality 85 (destructive).
mogrify -quality 85 *.jpg
# Resize: width 1200px, height scales to keep aspect ratio ('1200x' syntax).
# Typical terminal resize pattern for uniform thumbnails or blog widths.
mogrify -resize 1200x *.jpg
# Convert each PNG to a JPEG at quality 85; writes new .jpg files alongside the .png sources.
mogrify -format jpg -quality 85 *.png
mogrify vs convert: convert input output always writes a separate file; mogrify edits filenames you pass on the command line directly.
mogrify overwrites originals permanently. Work on copies, or test with a single file before expanding the glob.
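A simple safety pattern is to snapshot the directory first (paths are examples), so a bad run is one copy away from recovery:

```shell
# Archive copy preserves timestamps and permissions.
cp -a images/ images.bak/

# Destructive batch rewrite runs on the working copy only.
mogrify -quality 85 images/*.jpg
```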
find walks directories and runs a command per match. Pair it with JPEG and PNG tools for trees of assets.
# All .jpg/.jpeg under a tree: optimize in place with jpegoptim.
# {} + passes multiple files per invocation, which is faster than \; on
# large directories because it reduces process spawning overhead.
find /path/to/images -type f \( -name "*.jpg" -o -name "*.jpeg" \) \
-exec jpegoptim --max=85 --strip-all {} +
# All .png files: lossless optipng -o5 in place.
find /path/to/images -type f -name "*.png" \
-exec optipng -o5 {} +
- find: locates files by path and name.
- -exec … {} +: passes all matched paths to the tool in batches, which is significantly faster than \; on directories with hundreds of files. Use \; only when the tool cannot accept multiple file arguments, which is uncommon for the tools covered in this article.

find recurses into subdirectories by default. To limit it to the top-level directory only, add -maxdepth 1 after the path argument:

find /path/to/images -maxdepth 1 -type f -name "*.jpg" -exec jpegoptim --max=85 --strip-all {} +
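On multi-core machines you can parallelize the same job with xargs (a sketch assuming GNU findutils and coreutils; /path/to/images is a placeholder):

```shell
# -print0/-0 handle spaces in filenames; -P runs one jpegoptim per
# CPU core and -n caps how many files each invocation receives.
find /path/to/images -type f -name '*.jpg' -print0 \
  | xargs -0 -P "$(nproc)" -n 16 jpegoptim --max=85 --strip-all
```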
For more find patterns, read How To Use Find and Locate to Search for Files on Linux.
Reducing file size without visible quality loss is possible on both JPEG and PNG, but the meaning of “without losing quality” differs by format. For PNG, lossless means pixel-identical output. For JPEG, it means staying above the quality threshold where compression artifacts become visible to a normal viewer at normal screen sizes.
For PNG, lossless compression is straightforward: the pixel values do not change. File size drops because the DEFLATE encoder finds a more efficient way to pack the same data, or because metadata chunks are removed. Use optipng -o5 for full lossless re-packing or convert input.png -strip output.png to strip metadata only, as covered in the PNG sections above.
For JPEG, true lossless optimization is limited. Running jpegoptim --strip-all without a --max flag strips metadata and optimizes Huffman tables without forcing a lower quality level. However, any JPEG re-save is technically lossy because JPEG encoding is a destructive process. If you need a byte-for-byte identical JPEG with only metadata removed, use exiftool -all= photo.jpg instead, which strips EXIF data without touching the compressed image stream.
# Install exiftool on Debian/Ubuntu.
sudo apt install libimage-exiftool-perl
# Strip all metadata without re-encoding the JPEG stream.
exiftool -all= photo.jpg
This is the only method that guarantees the JPEG pixel data is not touched during the operation.
When lossless limits are not enough and you need a smaller JPEG, the question becomes: at what quality level do artifacts become visible?
For photographic content with continuous tones (portraits, landscapes, product photos), quality 82 to 85 in jpegoptim or ImageMagick is consistently indistinguishable from the original at 1x screen display. Quality 75 to 80 is often acceptable for thumbnails and preview images. Quality below 70 typically shows visible artifacts on smooth gradients and edges.
For images with sharp edges, text, or flat color areas (screenshots, diagrams, UI mockups), JPEG compression produces ringing artifacts at any quality level below 90. For these asset types, PNG lossless compression is the correct choice regardless of file size goals.
# Photographic content: quality 85 is visually safe for most web use.
jpegoptim --max=85 --strip-all photo.jpg
# Thumbnails and previews: quality 75 balances size and appearance.
jpegoptim --max=75 --strip-all thumbnail.jpg
For UI assets, screenshots, and diagrams, use optipng -o5 as covered in the PNG compression section above. Avoid JPEG for these asset types regardless of quality setting.
GIMP handles image size reduction through its interactive export dialog and optional non-interactive batch Script-Fu.
For the interactive workflow, open the image, choose File → Export As, and change the file extension (.jpg or .png) to match the format you need. Non-interactive mode runs a Scheme script without opening windows. Before running the script below, verify your GIMP version:
gimp --version
The file-jpeg-save procedure used in the script below works on GIMP 2.10. It is not available in GIMP 3.0, which ships as the default on Arch Linux and current Fedora releases. On GIMP 3.0, use its replacement, gimp-file-overwrite, or export manually via File → Export As.
# -i: no GUI. -b: Script-Fu batch script; adjust /path/to/images/*.jpg.
# file-jpeg-save uses 0.85 on a 0--1 scale (about 85%). gimp-image-delete frees each image after save.
gimp -i -b '(let* ((filelist (cadr (file-glob "/path/to/images/*.jpg" 1))))
(while (not (null? filelist))
(let* ((filename (car filelist))
(image (car (gimp-file-load RUN-NONINTERACTIVE filename filename)))
(drawable (car (gimp-image-get-active-drawable image))))
(file-jpeg-save RUN-NONINTERACTIVE image drawable filename filename
0.85 0 0 0 "" 0 1 0 2 0)
(gimp-image-delete image))
(set! filelist (cdr filelist))))
(gimp-quit 0)'
- -i: batch mode without a GUI.
- -b: passes the Script-Fu program; file-glob collects matching paths; file-jpeg-save writes JPEG at the given quality ratio.

For large batches, jpegoptim or optipng usually finishes faster than GIMP batch mode. Prefer Script-Fu when you chain edits (resize, flatten, watermark, export) in one runnable script.
If you serve images over a CDN or to mobile users and your target environments support it, converting JPEG and PNG assets to WebP or AVIF can cut transfer size by 25 to 60 percent without a visible quality change. Both formats are worth evaluating before committing to a delivery pipeline.
The webp package supplies cwebp and related tools on Debian/Ubuntu.
# Install WebP command-line utilities on Debian/Ubuntu.
sudo apt install webp
# Lossy WebP at quality 80 (0--100 scale).
cwebp -q 80 input.jpg -o output.webp
# Lossless WebP from PNG (keeps pixel fidelity, still often smaller than naive PNG).
cwebp -lossless input.png -o output.webp
- -q: lossy quality; higher numbers preserve more detail.
- -lossless: PNG-equivalent fidelity in WebP form.

WebP often lands about 25–35% smaller than JPEG at similar viewing conditions for photographs.
ls -lh input.jpg output.webp
-rw-r--r-- 1 user user 1.2M input.jpg
-rw-r--r-- 1 user user 820K output.webp
Confirm WebP support in every client that must consume the asset. Current releases of major browsers support WebP broadly, but legacy systems may not.
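To convert a whole directory while keeping base filenames, a small loop works (assumes cwebp from the webp package; run it in the directory that holds the PNGs):

```shell
# Produce logo.webp next to logo.png, and so on for every PNG here;
# the originals are left untouched.
for f in *.png; do
  cwebp -q 80 "$f" -o "${f%.png}.webp"
done
```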
AVIF (AV1 still image) targets maximum compression for web stills; encoding costs more CPU time than JPEG.
# Install ffmpeg on Debian/Ubuntu.
sudo apt install ffmpeg
Before running the conversion command, verify that your ffmpeg build includes the AV1 encoder:
ffmpeg -encoders 2>/dev/null | grep av1
You should see a line containing libaom-av1 in the output:
V..... libaom-av1 libaom AV1 (codec av1)
If the encoder is not listed, your ffmpeg package was compiled without libaom support. On Ubuntu, install the full build:
sudo apt install ffmpeg libavcodec-extra
The ImageMagick convert command can write .avif output only if ImageMagick was compiled with libheif and libavif support, which is not the default in standard Ubuntu or Debian packages. Running convert input.jpg -quality 80 output.avif on a standard install will either produce an error or write a zero-byte file silently. Verify AVIF support in your ImageMagick build with convert -list format | grep -i avif. If AVIF does not appear in the output, use the ffmpeg method instead.
# libaom-av1 AVIF: -crf lower is higher quality (typical range explored 18--40).
# -b:v 0 selects constrained-quality mode with libaom when paired with CRF.
ffmpeg -i input.jpg -c:v libaom-av1 -crf 30 -b:v 0 output.avif
# If your ImageMagick build includes AVIF, convert may write .avif directly.
convert input.jpg -quality 80 output.avif
- libaom-av1: AV1 encoder used inside AVIF containers.
- -crf 30: quality control for the AV1 encoder. Lower values (around 20) give better quality and larger files. Higher values (around 40) give smaller files with more visible compression. A range of 25 to 35 works for most photographic content.
- -b:v 0: enables constrained-quality mode when paired with -crf using libaom.

ffmpeg AVIF encoding can be slow on large frames. If your distro packages avifenc from libavif, benchmark it for batch jobs against ffmpeg and pick the faster option for your hardware.
AVIF adoption in browsers and CDNs is uneven compared with JPEG and WebP. Validate targets before you standardize on AVIF for production.
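Once the libaom-av1 encoder is confirmed present, a directory-wide AVIF sketch looks like this (assumes ffmpeg with libaom-av1; filenames are placeholders):

```shell
# -y overwrites existing outputs; -loglevel error keeps the terminal
# readable across a long batch. Originals are left in place.
for f in *.jpg; do
  ffmpeg -y -loglevel error -i "$f" -c:v libaom-av1 -crf 30 -b:v 0 "${f%.jpg}.avif"
done
```

Expect this to run much slower than the equivalent cwebp loop; AV1 encoding trades CPU time for smaller files.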
| Tool | Format Support | Compression Type | Typical Size Reduction | Best Use Case |
|---|---|---|---|---|
| jpegoptim | JPEG only | Lossy / lossless metadata strip | ~20–80% lossy depending on source; ~5–15% strip-only | Per-file or batch JPEG with optional size targets |
| optipng | PNG only | Lossless | ~5–25% | Production lossless PNG shrinking |
| pngcrush | PNG only | Lossless | ~5–40% depending on image | Aggressive lossless trials when CPU time is acceptable |
| ImageMagick (convert) | JPEG, PNG, WebP, GIF, TIFF, and more | Lossy or lossless by format | Varies | Single-output conversions and quality experiments |
| ImageMagick (mogrify) | Same as convert | Lossy or lossless | Varies | In-place batch edits when you accept overwrite risk |
| cwebp | WebP output (JPEG/PNG input) | Lossy or lossless | ~25–35% vs JPEG at similar appearance (lossy WebP) | Web delivery when WebP is allowed |
| ffmpeg | AVIF, WebP, many codecs | Lossy (typical AVIF pipeline) | ~40–60% vs baseline JPEG in many photographic tests | AVIF or transcoding chains already centered on ffmpeg |
| GIMP | JPEG, PNG, WebP, and more | Lossy or lossless | Varies | GUI editing plus export control |
| Use Case | Recommended Tool | Command Pattern |
|---|---|---|
| Compress a single JPEG quickly | jpegoptim | jpegoptim --max=85 --strip-all photo.jpg |
| Compress JPEG to approximate KB target | jpegoptim | jpegoptim --size=100k photo.jpg |
| Lossless PNG optimization | optipng | optipng -o5 input.png |
| Aggressive PNG size reduction | pngcrush | pngcrush -brute input.png output.png |
| Batch compress all JPEGs in a folder | jpegoptim or mogrify | jpegoptim --max=85 *.jpg |
| Batch mixed formats | find + jpegoptim + optipng | find … -exec jpegoptim … {} + plus PNG branch |
| Convert to WebP for web | cwebp | cwebp -q 80 input.jpg -o output.webp |
| Convert to AVIF | ffmpeg | ffmpeg -i input.jpg -c:v libaom-av1 -crf 30 -b:v 0 output.avif |
| GUI single image | GIMP | File → Export As |
| Edit + compress together | GIMP Script-Fu | gimp -i -b '(…)' |
The sections above cover the main CLI and GUI tools available on Linux for reducing image file size, from single-file operations to batch workflows and modern format conversion. The following questions address common edge cases and decision points that come up when applying these tools in practice.
Q: How do I reduce image file size in Linux without losing quality?
Use lossless PNG optimizers such as optipng -o5 or pngcrush -brute, which keep pixels identical while shrinking the file. For JPEG, run jpegoptim --strip-all without a --max quality cap to strip metadata and apply Huffman optimization without forcing a harsher lossy setting. You can also remove metadata from many formats using convert input -strip output when you do not need embedded profiles.
Q: How do I compress a JPEG image to 100KB or 50KB in Linux?
Run jpegoptim --size=100k yourfile.jpg for an approximate 100KB result, or --size=50k for about 50KB. The optimizer lowers effective quality to approach the cap; busy photos may show artifacts at very low targets. Inspect the printed byte totals to confirm the new size.
Q: What is the difference between resizing and compressing an image in Linux?
Resizing changes width and height in pixels, so the image has fewer samples and usually a smaller file even before codec tweaks. Compressing keeps the same dimensions while changing how those pixels are encoded, stripping metadata, or switching formats. Resizing uses geometry operators such as mogrify -resize; compressing tunes codecs, quality settings, and containers.
Q: Which Linux tool is best for batch compressing images?
For directories of JPEGs, jpegoptim --max=85 --strip-all *.jpg or jpegoptim … --dest=backup *.jpg scales well. For destructive batch recompression across formats, mogrify is powerful but dangerous because it writes in place. Mixed trees benefit from find … -exec jpegoptim … {} + for JPEG and a parallel find for PNG with optipng.
Q: How do I compress PNG images in Linux from the command line?
Install optipng and run optipng -o5 input.png for a strong default lossless pass, or install pngcrush and run pngcrush -brute input.png output.png when you want the smallest trial-based result. Always compare ls -lh sizes and keep originals until you trust the command flags.
Q: Can I convert images to WebP in Linux to reduce file size?
Yes. Install the webp package, then run cwebp -q 80 input.jpg -o output.webp for lossy WebP, or cwebp -lossless input.png -o output.webp for lossless mode. WebP frequently beats JPEG on bytes at similar visible quality, but you must verify compatibility with every consumer of the file.
Q: How does ImageMagick reduce image file size in Linux?
convert input.jpg -strip -quality 85 output.jpg lowers JPEG size through quality control and removes metadata. For PNG, convert input.png -strip output.png deletes ancillary chunks and may shrink the container without altering pixels. ImageMagick can also set PNG zlib effort with -define png:compression-level=9 while remaining lossless at the pixel level.
Q: Is GIMP a viable option for reducing image file size in Linux?
Yes. Use File → Export As, pick JPEG or PNG, and tune the quality or compression slider while watching the estimated size. For many files at once, Script-Fu batch mode or gimp -i -b can automate exports, though dedicated CLI tools such as jpegoptim and optipng are usually faster when you only need compression.
Linux gives you purpose-built tools for every image compression scenario. Use jpegoptim for fast JPEG optimization with target size control, and optipng or pngcrush for lossless PNG reduction. For batch jobs across directories, combine find with either tool. When file size is the priority for web delivery, convert to WebP with cwebp or AVIF with ffmpeg. For one-off edits that combine compression with other changes, GIMP covers the workflow without leaving the desktop. Always run these tools on copies or version-controlled files until you trust the output.