pypyr logo

task runner for automation pipelines

script sequential task workflow steps in yaml

conditional execution, loops, error handling & retries

For when your shell scripts get out of hand. Less tricky than a makefile.

Simple variable substitution & configuration file management.

Automate anything by combining commands, scripts in different languages & applications into one pipeline process.

If you’re new, get started here.

$ pip install pypyr

pypyr is free & open-source.

You can show your support by ⭐ starring the pypyr repo on github ⭐!

pypyr runs on Linux, MacOS & Windows - anywhere with a Python runtime.

# ./show-me-what-you-got.yaml
context_parser: pypyr.parser.keyvaluepairs
steps:
  - name: pypyr.steps.echo
    in:
      echoMe: o hai!
  - name: pypyr.steps.cmd
    in:
      cmd: echo any cmd you like
  - name: pypyr.steps.shell
    in:
      cmd: echo ninja shell power | grep '^ninja.*r$' 
  - name: pypyr.steps.py
    in:
      py: print('any python you like')
  - name: pypyr.steps.cmd
    while:
      max: 3
    in:
      cmd: echo gimme a {whileCounter}
  - name: pypyr.steps.cmd
    foreach: [once, twice, thrice]
    in:
      cmd: echo say {i}
  - name: pypyr.steps.default
    in:
      defaults:
        sayBye: False
  - name: pypyr.steps.echo
    run: '{sayBye}'
    in:
      echoMe: k bye!
$ pypyr show-me-what-you-got
o hai!
any cmd you like
ninja shell power
any python you like
gimme a 1
gimme a 2
gimme a 3
say once
say twice
say thrice

$ pypyr show-me-what-you-got sayBye=true  
o hai!
any cmd you like
ninja shell power
any python you like
gimme a 1
gimme a 2
gimme a 3
say once
say twice
say thrice
k bye!

combine commands, applications & scripts into a repeatable pipeline

You can run any combination of commands, shell statements, external executables, callables, external scripts & inline code in the same pipeline, alongside composable built-in steps that do useful things like read & write files, format json, toml & yaml, and manipulate data structures.
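As a rough sketch of mixing built-in steps with your own commands - the paths, context key & script name here are just illustrative - a pipeline could fetch a yaml file, write the same data back out as json, then hand the result to your own script:

# ./prep-config.yaml
steps:
  - name: pypyr.steps.fetchyaml
    comment: read yaml file into context under appSettings
    in:
      fetchYaml:
        path: ./src/app-settings.yaml
        key: appSettings
  - name: pypyr.steps.filewritejson
    comment: write that same data structure back out as json
    in:
      fileWriteJson:
        path: ./out/app-settings.json
        payload: '{appSettings}'
  - name: pypyr.steps.cmd
    comment: hand the result to your own script
    in:
      cmd: ./package-the-thing.sh ./out/app-settings.json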

human-friendly pipelines

The pipeline yaml format is human readable, human editable, human mergeable. You’re meant to author pipelines by hand - because it’s easy, and deliberately so! Definitely source-control your pipelines & get easy text-based diffs, so you’re not spending time deciphering opaque machine-generated syntax & worrying about weird xml or json. Author your pipelines in whatever text editor makes you happy.

control flow with conditional execution, branching & looping

Run your own code, scripts and commands in foreach and while loops. Conditionally run or skip your custom code based upon switches you control. You apply pypyr’s flow control to your commands & scripts without having to do any coding. No more trying to remember how a bash IF statement works!
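For example, a sketch like this uses the run, skip & foreach step decorators to branch & loop over your own scripts - the script names are placeholders, and isProd & regions are keys you’d set from cli args or an earlier step:

# ./deploy.yaml
steps:
  - name: pypyr.steps.cmd
    comment: only runs when isProd is truthy
    run: '{isProd}'
    in:
      cmd: ./deploy-to-prod.sh
  - name: pypyr.steps.cmd
    comment: skipped when isProd is truthy
    skip: '{isProd}'
    in:
      cmd: ./deploy-to-dev.sh
  - name: pypyr.steps.cmd
    comment: loop over a list you set earlier
    foreach: '{regions}'
    in:
      cmd: ./smoke-test.sh {i}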

parallel execution

Spawn asynchronous concurrent subprocesses without writing any code! Run programs and shell commands in parallel. You can also define serial sequences inside parallel workloads.
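A minimal sketch, assuming a recent pypyr release that has the cmds built-in step and using placeholder script names - the top-level items run concurrently, while the nested list runs as a serial sequence inside the parallel workload:

# ./build-all.yaml
steps:
  - name: pypyr.steps.cmds
    comment: top-level items run concurrently; the nested list runs in serial
    in:
      cmds:
        - ./build-frontend.sh
        - ./build-backend.sh
        - - ./start-test-db.sh
          - ./run-migrations.sh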

pass your own custom cli args

You can use pypyr as a framework to write your own console applications without having to write code to capture, parse & validate custom input cli args. So you can avoid all that repetitive plumbing when you’re trying to automate something useful in a script.
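For instance, with a hypothetical lint-these pipeline, the built-in pypyr.parser.list context parser drops all your cli args into argList so you can loop over them without writing any arg-parsing code:

# ./lint-these.yaml
context_parser: pypyr.parser.list
steps:
  - name: pypyr.steps.cmd
    comment: argList holds all the positional cli args
    foreach: '{argList}'
    in:
      cmd: echo linting {i}

$ pypyr lint-these file-one.py file-two.py
linting file-one.py
linting file-two.py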

automatic retries, error handling & compensations

Automatically retry your own commands & scripts when they fail, and keep retrying until they succeed or until they hit a retry limit you set. Selectively choose which errors should stop your pipeline execution. Use failure handlers with multiple steps to catch exceptions and to encapsulate more complex error handling logic.
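A minimal sketch - the scripts are placeholders - using the retry decorator plus an on_failure handler group that runs when the pipeline fails:

# ./sturdy.yaml
steps:
  - name: pypyr.steps.cmd
    comment: retry up to 5 times, sleeping 2s between attempts
    retry:
      max: 5
      sleep: 2
    in:
      cmd: ./poke-flaky-endpoint.sh

on_failure:
  - name: pypyr.steps.echo
    in:
      echoMe: still failing after the retries, cleaning up...
  - name: pypyr.steps.cmd
    in:
      cmd: ./clean-up.sh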

variable interpolation & substitution

Use string interpolation or variable substitution to replace a placeholder {token} with variable values. This works for strings, obviously, but also any complex type! You can replace a string {placeholder} with an entire data structure like a list or a map, or other simple types like int or bool.
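A small sketch with illustrative key names, assuming a pypyr version that has the set built-in step (older releases use contextsetf) - note how deployTargets becomes the actual list, not a string, because the placeholder is the entire value:

# ./substitute.yaml
steps:
  - name: pypyr.steps.set
    in:
      set:
        regions:
          - eu-west-1
          - us-east-1
  - name: pypyr.steps.set
    comment: deployTargets becomes the actual list object, not a string
    in:
      set:
        deployTargets: '{regions}'
  - name: pypyr.steps.echo
    in:
      echoMe: deploying to {regions}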

configuration file tokenization

Load, merge, format & interpolate values to-and-from text-based configuration files such as json, toml & yaml. pypyr is very useful for preparing the configuration files & templates you need to bootstrap bigger systems like Terraform, CloudFormation & Heat, where you might need to inject your own variable values into the configuration file on the fly rather than duplicate configuration per environment.
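A quick sketch with the fileformat built-in step - the template & output paths are placeholders - that substitutes placeholder tokens in a template with values you pass on the cli:

# ./prep-infra.yaml
context_parser: pypyr.parser.keyvaluepairs
steps:
  - name: pypyr.steps.fileformat
    comment: substitute placeholder tokens in the template with context values
    in:
      fileFormat:
        in: ./templates/infra-config.template
        out: ./out/infra-config-{env}.yml

$ pypyr prep-infra env=staging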

cli & api

The pypyr CLI favors minimal typing and sensible defaults. You can also invoke your pipelines from code using the Python pipeline API from a single, simple entry-point function.
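As a minimal sketch of the api, assuming a recent pypyr release, you can run the earlier show-me-what-you-got pipeline from Python like this:

# ./run-pipeline.py
from pypyr import pipelinerunner

# equivalent to: $ pypyr show-me-what-you-got sayBye=true
context = pipelinerunner.run(
    pipeline_name='show-me-what-you-got',
    dict_in={'sayBye': True})

# the returned context holds whatever the pipeline set
print(context['sayBye'])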

lotsa built-in steps & your own custom tasks

pypyr has >30 ready-made built-in steps that you can use as you see fit in your own pipelines. Coding your own step is as simple as a bit of Python in a single function definition. Your own custom steps co-exist with built-in steps and get exactly the same power and functionality for loops, retries & flow control without you having to write any extra code.
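A custom step is just a Python module with a run_step function - this sketch uses illustrative module & key names. You then reference it in a pipeline by its module name (e.g. - name: mysteps.greet) and the usual decorators like foreach, while & retry apply to it exactly as they do to built-in steps:

# ./mysteps/greet.py
def run_step(context):
    """pypyr calls run_step & passes the shared dict-like context."""
    name = context.get('greetName', 'world')
    context['greeting'] = f'o hai, {name}!'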

compose your tasks with modular step sequences

Pipelines can call other pipelines. Organize your tasks into repeatable sequences using step groups within a pipeline so you can modularize & isolate more complex task sequences.
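A small sketch - the group, pipeline & script names are placeholders - using pypyr.steps.call to run a step group in the same pipeline and pypyr.steps.pype to call a separate pipeline:

# ./ship-it.yaml
steps:
  - name: pypyr.steps.call
    comment: jump to the build step group in this same pipeline
    in:
      call: build
  - name: pypyr.steps.pype
    comment: call a separate pipeline
    in:
      pype:
        name: publish-release-notes

build:
  - name: pypyr.steps.cmd
    in:
      cmd: ./build.sh
  - name: pypyr.steps.cmd
    in:
      cmd: ./test.sh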

lots of documentation

Plenty of clear help documentation with examples. You’re reading it right now. If you’re new, let’s start at the very beginning.

devops, ci & cd automation tool

You can automate any task sequence you want with pypyr. It so happens that pypyr is particularly strong at consolidating the ad hoc scripts that you tend to accrete over time for your CI/CD devops functions. Run the exact same CI/CD process locally as you would on your cloud provider. No more commits like “ci build test 5 please work this time”.

agentless pipeline execution

Quick install via pip. pypyr runs on Linux, MacOS & Windows - anywhere with a Python runtime. With bigger devops & workflow automation platforms, providing the execution environment is itself not a trivial task. pypyr is a lightweight Python application with no further dependencies. You can even run pypyr from its ready-made docker container.