Note: This guide combines personal experience from writing deployment scripts across macOS, Linux, and Windows environments with patterns documented in the Python pathlib documentation, Git documentation, PowerShell cross-platform guidance, Python subprocess module, and GitHub Actions runner images.


A deployment script works fine on macOS. A colleague runs it on Windows. It fails immediately.

The culprit? A hardcoded forward slash in a file path. I’ve seen this exact scenario play out multiple times over the past five years, and it’s almost always preventable.

Cross-platform scripting isn’t hard once you know where the landmines are buried. This post covers the common problems and practical solutions I’ve learned from writing scripts that need to work everywhere.

The Big Three: Path Separators, Line Endings, and Shell Differences

These three issues cause the majority of cross-platform scripting headaches. Master these and you’ll avoid most problems.

Path Separators

Windows uses backslashes (\). Everyone else uses forward slashes (/). This sounds simple until you’re debugging at 2 AM.

The problem:

# Works on macOS/Linux
config_path="$HOME/config/settings.json"

# Fails in native Windows CMD or PowerShell (unless using Git Bash or WSL)

The solutions:

In Python 3.4+, use pathlib or os.path.join():

from pathlib import Path

# This works everywhere (Python 3.4+)
config_path = Path.home() / "config" / "settings.json"

# Or the older approach
import os
config_path = os.path.join(os.path.expanduser("~"), "config", "settings.json")

In shell scripts, use forward slashes and let the environment translate. Most modern Windows environments (Git Bash, WSL, and PowerShell itself) accept forward slashes in most paths.

If you must support native Windows CMD, consider PowerShell instead:

# PowerShell handles both separators
$configPath = Join-Path $env:USERPROFILE "config\settings.json"

Line Endings

Windows uses \r\n (CRLF). Unix systems use \n (LF). This invisible difference corrupts scripts, breaks parsers, and wastes debugging time.

The symptoms:

  • Scripts fail with bizarre errors like $'\r': command not found
  • CSV parsing breaks mysteriously
  • Git shows entire files as changed when only one line was modified

The solutions:

Configure Git to handle this automatically:

# On Windows, convert to CRLF on checkout, LF on commit
git config --global core.autocrlf true

# On macOS/Linux, only convert CRLF to LF on commit
git config --global core.autocrlf input

Or use a .gitattributes file in your repo (the better approach for teams):

# Force LF for scripts
*.sh text eol=lf
*.py text eol=lf
*.js text eol=lf

# Force CRLF for Windows-specific files
*.bat text eol=crlf
*.ps1 text eol=crlf

When reading files in code, know what Python does with newlines:

# Python's text mode uses universal newlines by default:
# \r\n, \r, and \n are all translated to \n on read
with open(filename, 'r') as f:
    content = f.read()

# Exception: when handing a file to the csv module, open it with
# newline='' and let csv handle line endings itself

Shell Differences

Neither Bash nor PowerShell is available everywhere by default, and each shell has its own syntax, built-in commands, and quirks.

The reality check:

Feature                | Bash | PowerShell | CMD
Default on macOS/Linux | Yes  | No         | No
Default on Windows     | No   | Yes        | Yes
Scripting power        | High | High       | Low
Object pipelines       | No   | Yes        | No

The strategies:

Strategy 1: Pick one shell and require it

If your team can standardize, pick Bash (install Git Bash on Windows) or PowerShell (install on macOS/Linux). Then write all scripts for that shell.

#!/usr/bin/env bash
# This shebang finds bash wherever it's installed

Strategy 2: Use a cross-platform language

Python (3.6+), Node.js (14+), and Ruby run everywhere with minimal differences. For anything beyond simple file operations, these are often better choices than shell scripts. If you’re using AI to help write these scripts, remember that AI suggestions need verification since models sometimes generate platform-specific code without mentioning the limitation.

#!/usr/bin/env python3
import subprocess
import platform

def run_command(cmd):
    """Run a command, handling platform differences."""
    if platform.system() == "Windows":
        # shell=True lets Windows resolve .bat/.cmd wrappers and shell built-ins
        return subprocess.run(cmd, shell=True, capture_output=True, text=True)
    else:
        return subprocess.run(cmd, capture_output=True, text=True)

Strategy 3: Write parallel implementations

For simple scripts, maintain both versions:

scripts/
  deploy.sh      # Bash version
  deploy.ps1     # PowerShell version

This is more maintenance but sometimes the cleanest solution.

Environment Variables and Configuration

Environment variables behave differently across platforms. Knowing the differences prevents surprises.

Accessing Variables

Platform   | Syntax            | Example
Bash       | $VAR or ${VAR}    | echo $HOME
PowerShell | $env:VAR          | echo $env:USERPROFILE
CMD        | %VAR%             | echo %USERPROFILE%
Python     | os.environ['VAR'] | os.environ['HOME']

Common Variables That Differ

The home directory variable is the classic example:

Platform    | Variable    | Typical Value
macOS/Linux | HOME        | /Users/scott or /home/scott
Windows     | USERPROFILE | C:\Users\scott
Windows     | HOME        | Often not set (unless Git Bash)

Cross-platform solution in Python:

from pathlib import Path

# Works everywhere
home = Path.home()

In Bash (with fallback):

# Use HOME if set, fall back to USERPROFILE for Windows
home_dir="${HOME:-$USERPROFILE}"

Temporary Directories

Another variable that differs:

Platform | Variable       | Typical Value
macOS    | TMPDIR         | /var/folders/xx/.../T/
Linux    | TMPDIR or /tmp | /tmp
Windows  | TEMP or TMP    | C:\Users\scott\AppData\Local\Temp

Cross-platform solution:

import tempfile

# Works everywhere
temp_dir = tempfile.gettempdir()
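
If you only need a scratch directory for the lifetime of a task, tempfile can also create and clean it up for you; a minimal sketch using only the standard library (the file name is made up for illustration):

import tempfile
from pathlib import Path

# Create a private temp directory that is removed automatically,
# whichever platform-specific temp location sits underneath
with tempfile.TemporaryDirectory() as tmp:
    scratch = Path(tmp) / "build-output.txt"   # hypothetical scratch file
    scratch.write_text("intermediate results\n")
    # ... use scratch here; the whole directory is deleted when the block exits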

Command Differences That Bite

Some commands that exist on both platforms work differently. Others don’t exist at all.

Commands That Work Differently

find vs Windows find

On Unix, find searches for files. On Windows, find searches text within files (like grep). Windows has where for finding executables.

# Unix: find all Python files
find . -name "*.py"

# PowerShell equivalent
Get-ChildItem -Recurse -Filter "*.py"

which vs where

# Unix: find command location
which python

# Windows CMD
where python

# PowerShell
Get-Command python
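
Inside a script, Python's standard library has a cross-platform equivalent of which/where; a minimal sketch:

import shutil

# shutil.which searches PATH (and honors PATHEXT on Windows),
# so the same call finds python, python.exe, or python3
python_path = shutil.which("python") or shutil.which("python3")

if python_path is None:
    raise SystemExit("No Python interpreter found on PATH")

print(f"Using interpreter at {python_path}")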

curl availability

curl is standard on macOS and most Linux distros. On Windows, it’s available in recent versions but may be an alias to Invoke-WebRequest in PowerShell.

# Check if you're getting real curl or the alias
Get-Command curl

# Use curl.exe explicitly on Windows if needed
curl.exe https://example.com/file

Commands That Don’t Exist

Unix Command | Windows Alternative
grep         | findstr (CMD) or Select-String (PowerShell)
sed          | PowerShell -replace or Python
awk          | PowerShell or Python
chmod        | icacls (different model entirely)
ln -s        | mklink (requires admin on older Windows)

For anything beyond basic operations, Python or another cross-platform language is usually cleaner than trying to translate complex Unix pipelines.
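
As an illustration, here is a hedged Python sketch of a typical grep-plus-sed pipeline (the log file name and the secret-redacting pattern are made up for this example):

import re
from pathlib import Path

# Roughly: grep "ERROR" app.log | sed 's/secret=[^ ]*/secret=***/'
log_file = Path("app.log")  # hypothetical input file

for line in log_file.read_text(encoding="utf-8").splitlines():
    if "ERROR" in line:
        # Redact anything that looks like a secret before printing
        print(re.sub(r"secret=\S+", "secret=***", line))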

File System Gotchas

Beyond path separators, file systems have behavioral differences that can break scripts.

Case Sensitivity

File System          | Case-Sensitive?
macOS (APFS default) | No (case-preserving)
Linux (ext4)         | Yes
Windows (NTFS)       | No (case-preserving)

This means Config.json and config.json are the same file on macOS and Windows, but different files on Linux.

Best practice: Always use consistent casing. Treat file names as case-sensitive even when the file system doesn’t enforce it.
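
One way to catch casing problems early is to look for paths that differ only by case before they reach a case-insensitive file system; a minimal sketch that scans the current directory tree:

from collections import defaultdict
from pathlib import Path

# Group files by their lowercased path; any group with more than one entry
# will collide on macOS/Windows even though Linux keeps them separate
by_lower = defaultdict(list)
for path in Path(".").rglob("*"):
    if path.is_file():
        by_lower[str(path).lower()].append(path)

for paths in by_lower.values():
    if len(paths) > 1:
        print("Case collision:", ", ".join(str(p) for p in paths))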

Hidden Files

Unix hides files starting with . (like .gitignore). Windows uses a file attribute.

# Unix: list hidden files
ls -la

# PowerShell: include hidden files
Get-ChildItem -Force
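
If a script has to decide whether a file counts as hidden on any platform, it needs to check both conventions; a minimal sketch (the Windows branch relies on os.stat's st_file_attributes, which only exists on Windows):

import os
import platform
import stat
from pathlib import Path

def is_hidden(path: Path) -> bool:
    """Return True if the file is hidden by either platform's convention."""
    if path.name.startswith("."):        # Unix convention (dotfiles)
        return True
    if platform.system() == "Windows":
        # On Windows, "hidden" is a file attribute rather than a naming rule
        attrs = os.stat(path).st_file_attributes
        return bool(attrs & stat.FILE_ATTRIBUTE_HIDDEN)
    return False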

File Locking

Windows locks files more aggressively than Unix systems. A file open for reading in one process may not be deletable or movable by another process.

Symptoms: “File in use” errors when trying to modify or delete files.

Solutions:

  • Close file handles explicitly and promptly
  • Use with statements in Python to ensure files are closed
  • Consider retry logic for file operations

import time
import os

def safe_delete(filepath, retries=3, delay=1):
    """Delete a file with retry logic for Windows locking."""
    for attempt in range(retries):
        try:
            os.remove(filepath)
            return True
        except PermissionError:
            if attempt < retries - 1:
                time.sleep(delay)
    return False

Practical Patterns

Here are patterns I use regularly for cross-platform scripts.

Detect the Platform

import platform
import os
from pathlib import Path

system = platform.system()  # 'Windows', 'Darwin' (macOS), or 'Linux'

if system == "Windows":
    config_dir = Path(os.environ.get("APPDATA", ""))
elif system == "Darwin":
    config_dir = Path.home() / "Library" / "Application Support"
else:
    config_dir = Path.home() / ".config"

In Bash:

case "$(uname -s)" in
    Darwin)
        echo "macOS"
        ;;
    Linux)
        echo "Linux"
        ;;
    MINGW*|CYGWIN*|MSYS*)
        echo "Windows (Git Bash or similar)"
        ;;
esac

Portable Shebang

#!/usr/bin/env bash

Using env finds the interpreter in the PATH rather than hardcoding a location. This handles systems where bash is in /bin/bash vs /usr/local/bin/bash vs elsewhere.

Normalize Paths Early

Convert paths to a canonical form at the start of your script:

from pathlib import Path

def normalize_path(path_string):
    """Convert any path to a resolved, absolute Path object."""
    return Path(path_string).expanduser().resolve()

# Now use Path objects throughout your script
config_path = normalize_path("~/config/settings.json")

Use Cross-Platform Libraries

Don’t reinvent the wheel. Use libraries designed for cross-platform work:

Task             | Python Library
File operations  | pathlib, shutil
Subprocesses     | subprocess
HTTP requests    | requests, httpx
JSON/YAML config | json, pyyaml
CLI arguments    | argparse, click
Environment vars | python-dotenv
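
Everything in that table except requests, httpx, pyyaml, click, and python-dotenv ships with Python, so a script that sticks to the standard library stays portable with no extra installs. A minimal sketch combining pathlib and shutil for a copy-and-clean step (all paths are hypothetical):

import shutil
from pathlib import Path

# Copy a config template into place, creating parent directories as needed
src = Path("templates") / "settings.json"                    # hypothetical source
dest = Path.home() / ".config" / "myapp" / "settings.json"   # hypothetical target
dest.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(src, dest)

# Remove a stale build directory if it exists, on any platform
stale = Path("build")
if stale.exists():
    shutil.rmtree(stale)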

Test on All Platforms

This sounds obvious, but many cross-platform bugs come from never testing on the other platforms. I’ve found that automated testing catches issues I’d never think to check manually.

Options:

  • Virtual machines (VirtualBox 7.x, Parallels, VMware)
  • Docker containers (for Linux testing on Mac/Windows)
  • Cloud CI/CD (GitHub Actions runs on all three platforms; this is my preferred approach for any automation workflow)

A simple GitHub Actions workflow (as of January 2026):

name: Test Cross-Platform
on: [push, pull_request]

jobs:
  test:
    strategy:
      matrix:
        os: [ubuntu-24.04, macos-14, windows-2022]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: python my_script.py --test

When to Give Up on Cross-Platform

Sometimes the honest answer is: don’t try.

If your script does deep system administration, interacts with OS-specific APIs, or relies heavily on platform-specific tools, write separate scripts. Forcing cross-platform compatibility adds complexity that may not be worth it.

Signs you should write platform-specific scripts:

  • Heavy use of system services (systemd, launchd, Windows services)
  • File permissions beyond basic read/write
  • Registry access (Windows) or plist manipulation (macOS)
  • Complex process management
  • Network configuration

For these cases, maintain separate implementations and document clearly which script is for which platform.

The Checklist

Before calling a script “cross-platform,” verify:

  • Path separators handled (use pathlib or os.path.join)
  • Line endings configured in Git (.gitattributes)
  • Environment variables use cross-platform fallbacks
  • External commands exist on all target platforms
  • File operations handle locking (especially on Windows)
  • Case sensitivity assumptions are explicit
  • Shebang uses #!/usr/bin/env
  • Actually tested on all target platforms

Cross-platform scripting is less about clever tricks and more about awareness. Know where the differences are, use the right tools, and test everywhere you need to support.

The investment pays off. Scripts that work everywhere are scripts you don’t have to debug at 2 AM when someone runs them on a different machine.


What cross-platform issues have bitten you? I’d love to hear your war stories. Find me on X or LinkedIn.