10 Hidden-Gem Python Automation Libraries That Quietly Replace Your Daily Shell Scripts



Key Takeaways

  • For complex automation, Python offers a more robust, readable, and testable alternative to shell scripts, which can become brittle and error-prone.
  • Libraries like Plumbum, Invoke, and Fabric allow you to run shell commands, organize tasks, and manage remote servers with the power and clarity of Python.
  • Python's ecosystem provides superior tools for common scripting needs, including file manipulation (pathlib), API calls (requests), CLI creation (Typer), and creating beautiful terminal UIs (Rich).

I once deleted half a staging environment with a single line of Bash. The script was supposed to clean up old log directories, something like rm -rf /path/to/logs/$DEPLOYMENT_ID/*. Except on one fateful Tuesday, the $DEPLOYMENT_ID variable was empty. The command became rm -rf /path/to/logs//*, which my shell happily interpreted as a request to wipe everything from the root of that logs directory downwards.

That painful lesson sent me on a quest. I love the command line, but I realized that as soon as a script involves variables, loops, or any real logic, Bash becomes a minefield of foot-guns. It's brittle, hard to test, and the syntax is, let's be honest, arcane.

So, I started replacing my go-to shell scripts with Python. I'm talking about a whole class of incredible libraries that let you write robust, readable, and powerful automation without ever leaving the comfort of Python. These are the hidden gems that quietly do the heavy lifting.

Here are 10 of my absolute favorites.

1. Plumbum – Shell Commands as Python Objects

This one blew my mind when I first saw it. Plumbum makes shell commands feel like native Python functions. You can pipe, redirect, and chain commands with an elegance that subprocess.run can only dream of.

  • What it gives you: Local and remote (SSH) command execution, rich path objects, and a way to build complex pipelines that are still readable.
  • Killer feature: The pipe | operator works just like in shell. It's pure magic.

From Shell to Python: Let's replace a classic grep | sort | head pipeline.

The old way (Shell):

grep -R "ERROR" logs/ | sort | head -n 10

The new way (Plumbum):

from plumbum import local

# Look up local commands and pre-bind their arguments
grep = local["grep"]["-R", "ERROR", "logs/"]
sort_ = local["sort"]
head = local["head"]["-n", "10"]

# Pipe them together!
pipeline = grep | sort_ | head
print(pipeline())

Look at that! It’s clean, explicit, and far less prone to quoting errors.
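
Plumbum doesn't stop at pipes, either: it also supports shell-style redirection and foregrounding. Here's a minimal sketch (the output file name is just an example):

from plumbum import local, FG

grep = local["grep"]["-R", "ERROR", "logs/"]
head = local["head"]["-n", "10"]

# Redirect the pipeline's output to a file, like > top_errors.txt in shell
((grep | head) > "top_errors.txt")()

# Or run it in the foreground with FG, streaming output to your terminal
(grep | head) & FG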

2. Delegator.py – The No-Fuss Subprocess Wrapper

Sometimes Plumbum is overkill. If you just want to run a command and grab its output without the verbose boilerplate of the built-in subprocess module, Delegator.py is your best friend. It's what subprocess should have been. (The project is no longer actively developed, but it's tiny and still does the job.)

  • What it gives you: A super simple API for running commands.
  • Killer feature: Captures stdout, stderr, and the return code in a single, convenient object.

From Shell to Python: Checking for files and then doing something.

The old way (Shell):

ls *.log && echo "found logs"

The new way (Delegator.py):

import delegator

c = delegator.run("ls *.log")
# The command's result is an object with useful properties
if c.return_code == 0 and c.out:
    print("found logs")

3. Invoke – For When Your Script Grows Up

Your deploy.sh script started as three lines. Now it’s 150 lines of unreadable spaghetti. Invoke lets you organize your automation tasks into clean, documented Python functions that you can run from the command line.

  • What it gives you: A powerful task execution tool, like make but for Python.
  • Killer feature: Automatically generates a command-line interface from your Python functions.

From Shell to Python: Turning a simple deployment script into a proper task.

The old way (deploy.sh):

#!/bin/bash
git pull
pip install -r requirements.txt
systemctl restart myapp

The new way (tasks.py):

from invoke import task

@task
def deploy(c):
    """
    Pull latest code, install deps, and restart the app.
    """
    print("Pulling latest code...")
    c.run("git pull")
    print("Installing dependencies...")
    c.run("pip install -r requirements.txt")
    print("Restarting application...")
    c.run("sudo systemctl restart myapp")

Now you just run invoke deploy and get a clean, repeatable process.
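
Invoke also maps function parameters onto command-line flags, so tasks can take arguments with zero parsing code. A hedged sketch extending the task above (the branch parameter is my own addition):

from invoke import task

@task
def deploy(c, branch="main"):
    """Deploy a specific branch: invoke deploy --branch staging"""
    c.run(f"git pull origin {branch}")
    c.run("pip install -r requirements.txt")
    c.run("sudo systemctl restart myapp")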

4. Fabric 2+ – Your SSH for-Loop Killer

If you find yourself writing for host in hosts; do ssh $host "command"; done, stop what you're doing and install Fabric. Built on top of Invoke, it's designed for running commands across one or many remote servers over SSH.

  • What it gives you: A high-level API for SSH automation.
  • Killer feature: Effortlessly running the same command on a list of hosts, with proper connection handling.

From Shell to Python: Checking disk space on multiple servers.

The old way (Shell):

for h in host1 host2; do
  ssh "$h" "df -h /"
done

The new way (Fabric):

from fabric import Connection

hosts = ["host1", "host2"]

for host in hosts:
    with Connection(host) as c:
        result = c.run("df -h /", hide=True)
        print(f"--- {c.host} ---")
        print(result.stdout)
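
Fabric's Group helpers can even absorb the Python loop itself. A minimal sketch using SerialGroup (swap in ThreadingGroup to hit the hosts in parallel):

from fabric import SerialGroup

# One call runs the command on every host and returns per-host results
results = SerialGroup("host1", "host2").run("df -h /", hide=True)
for connection, result in results.items():
    print(f"--- {connection.host} ---")
    print(result.stdout)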

5. Pathlib + Shutil – The Sane Way to Handle Files

This one is already in the standard library, so you have no excuse! Manually concatenating strings to build file paths is a recipe for disaster. pathlib provides beautiful, object-oriented filesystem paths, and shutil handles the heavy lifting of copying and moving.

  • What it gives you: A cross-platform, object-oriented way to interact with files and directories.
  • Killer feature: Using the / operator to join paths. It’s so intuitive.

From Shell to Python: Finding all .txt files and moving them.

The old way (Shell):

find src -name "*.txt" -exec mv {} dest/ \;

The new way (Pathlib + Shutil):

from pathlib import Path
import shutil

src_dir = Path("src")
dest_dir = Path("dest")
dest_dir.mkdir(exist_ok=True)

for path in src_dir.rglob("*.txt"):
    shutil.move(path, dest_dir / path.name)
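
pathlib also replaces a surprising number of everyday one-liners like mkdir -p, cat, and echo >. A quick sketch (the myapp config path is just an example):

from pathlib import Path

# mkdir -p equivalent
config_dir = Path.home() / ".config" / "myapp"
config_dir.mkdir(parents=True, exist_ok=True)

# echo "..." > file and cat file equivalents
settings = config_dir / "settings.ini"
settings.write_text("[app]\ndebug = false\n")
print(settings.read_text())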

6. Watchdog – Stop Polling, Start Reacting

Need to run a script whenever a file changes? Watchdog replaces ugly while true loops with an efficient, event-driven approach that hooks directly into your OS's filesystem events.

  • What it gives you: A cross-platform API for monitoring filesystem events.
  • Killer feature: Lets you trigger Python functions on file creation, modification, or deletion without busy-waiting.

From Shell to Python: Running a job whenever a specific file is written to.

The old way (Shell with inotifywait):

while inotifywait -e close_write myfile.txt; do
  ./run_job.sh
done

The new way (Watchdog):

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
import time
import subprocess

class MyHandler(FileSystemEventHandler):
    def on_modified(self, event):
        if event.src_path.endswith("myfile.txt"):
            print("File modified! Running job...")
            subprocess.run(["./run_job.sh"])

observer = Observer()
observer.schedule(MyHandler(), path=".", recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    pass  # Ctrl+C is the expected way to stop watching
finally:
    observer.stop()
    observer.join()
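
If you'd rather not string-match paths yourself, watchdog ships a PatternMatchingEventHandler that does the filtering for you. A minimal sketch (the *.txt pattern is illustrative); it drops into observer.schedule exactly like the handler above:

from watchdog.events import PatternMatchingEventHandler

class TextFileHandler(PatternMatchingEventHandler):
    def __init__(self):
        # Only fire for *.txt files, and skip directory events entirely
        super().__init__(patterns=["*.txt"], ignore_directories=True)

    def on_modified(self, event):
        print(f"{event.src_path} changed")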

7. Rich / Textual – Make Your Scripts Beautiful

Let's face it, the output of most shell scripts is just plain text. Rich lets you effortlessly add color, styling, tables, progress bars, and more to your terminal output; its sibling project Textual goes further and builds full interactive terminal apps. Both make your tools a pleasure to use.

  • What it gives you: Beautiful terminal output with minimal effort.
  • Killer feature: The track function for instant, beautiful progress bars.

From Shell to Python: A simple progress loop.

The old way (Shell):

for i in $(seq 1 100); do
  # This is always finicky
  echo -ne "Progress: $i%\r"
  sleep 0.1
done
echo

The new way (Rich):

from time import sleep
from rich.progress import track

for _ in track(range(100), description="[green]Processing data"):
    sleep(0.05)
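
Progress bars are just the start. Rich renders full tables, too. A small sketch (the hosts and statuses are made up):

from rich.console import Console
from rich.table import Table

table = Table(title="Deployment Status")
table.add_column("Host", style="cyan")
table.add_column("Status", style="green")
table.add_row("host1", "OK")
table.add_row("host2", "OK")

Console().print(table)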

8. Typer – Create CLIs with Type Hints

If your script takes arguments, you're probably doing some awkward parsing in shell. Typer lets you build robust command-line interfaces by just writing a normal Python function with type hints. It’s absurdly easy.

  • What it gives you: Automatic CLI generation from function signatures.
  • Killer feature: Free --help messages, type validation, and error handling.

From Shell to Python: A script that copies a source file to a destination.

The old way (Shell with getopts):

#!/usr/bin/env bash
while getopts "s:d:" opt; do
  case $opt in
    s) SRC="$OPTARG" ;;
    d) DST="$OPTARG" ;;
  esac
done
echo "Copying $SRC to $DST"
cp "$SRC" "$DST"

The new way (Typer):

import typer
from pathlib import Path
import shutil

app = typer.Typer()

@app.command()
def copy(src: Path = typer.Option(..., help="Source file path"), 
         dst: Path = typer.Option(..., help="Destination file path")):
    """Copies a source file to a destination."""
    print(f"Copying {src} to {dst}")
    shutil.copy(src, dst)

if __name__ == "__main__":
    app()
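
Because the parameters are typed as Path, Typer can even validate them before your function runs. A hedged sketch of the same src option with built-in checks turned on:

import typer
from pathlib import Path

app = typer.Typer()

@app.command()
def check(src: Path = typer.Option(..., exists=True, readable=True,
                                   help="Source file path")):
    """Typer rejects missing or unreadable paths before this body runs."""
    print(f"{src} exists and is readable")

if __name__ == "__main__":
    app()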

9. Pendulum – Date and Time Without the Headache

I have wasted hours of my life fighting with the date command and its weird formatting flags. Pendulum is a drop-in replacement for Python's native datetime that makes handling timezones a breeze.

  • What it gives you: A sane API for all things date and time.
  • Killer feature: Human-friendly duration handling (.subtract(days=3)) and timezone awareness by default.

From Shell to Python: Getting the date for three days ago.

The old way (Shell):

# And good luck remembering the format flags...
# (GNU date shown; BSD/macOS date needs something like: date -v-3d +"%Y-%m-%d")
date -d "3 days ago" +"%Y-%m-%d"

The new way (Pendulum):

import pendulum

# It's just so clear what's happening.
three_days_ago = pendulum.now("UTC").subtract(days=3)
print(three_days_ago.to_date_string())
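
Two more Pendulum niceties that replace gnarly date incantations, in a quick sketch:

import pendulum

now = pendulum.now("UTC")

# Convert between timezones without manual offset math
print(now.in_timezone("America/New_York"))

# Human-friendly relative descriptions, e.g. "3 days ago"
print(now.subtract(days=3).diff_for_humans())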

10. Requests + HTTPX – The curl and jq Killers

Piping curl to jq is a rite of passage, but it's also clunky. Handling headers, errors, and complex JSON is much cleaner in Python. Requests is the gold standard for synchronous HTTP calls, and its modern sibling HTTPX adds async support.

  • What it gives you: A powerful, human-friendly HTTP client.
  • Killer feature: Automatic JSON decoding and robust status/error handling.

From Shell to Python: Fetching user names from a JSON API.

The old way (Shell):

curl -s "https://api.github.com/users" | jq '.[] | .login'

The new way (Requests):

import requests

try:
    resp = requests.get("https://api.github.com/users", timeout=5)
    resp.raise_for_status()  # This will raise an error for 4xx/5xx responses
    for user in resp.json():
        print(user["login"])
except requests.exceptions.RequestException as e:
    print(f"An error occurred: {e}")

Conclusion: How to Know When It's Time to Switch from Bash to Python

I'm not saying you should abandon shell entirely. For a quick git push alias or a simple ls -la, it's perfect. But the moment a script becomes more than a one-liner, you should pause and ask if Python is a better tool for the job.

A simple rule of thumb

Here's my personal rule: if your shell script needs more than one if statement, or even a single for loop, it's time to rewrite it in Python.

Once you introduce that level of logic, the benefits of Python's readability, error handling, and data structures vastly outweigh the brevity of shell.

Embracing the Python ecosystem for robust automation

When you switch to Python, you're not just getting a better language; you're gaining an entire ecosystem. You get access to testing frameworks like pytest, powerful IDEs with debuggers, and the millions of packages on PyPI. Your simple script can now evolve into a robust, maintainable, and shareable tool.

What's your favorite shell-killer library?

I'm always on the hunt for new tools to streamline my workflow. These ten are my current go-tos, but I know there are more gems out there.

What did I miss? Drop your favorite shell-killing Python library in the comments below.


