**Important:** While this tool aims to install Python as required, we're still working through issues producing flexible Python binaries. For now, assume you must have the version of Python you wish to use installed and available on the path.
This tool manages Python installations and dependencies. It implements PEP 582 -- Python local packages directory, and PEP 518 -- Specifying Minimum Build System Requirements for Python Projects.
It keeps the latter isolated in the project directory, and runs Python in an environment which uses this directory. Per PEP 582, dependencies are stored in the project directory, under `__pypackages__/3.7/lib` (with the Python version varying as appropriate).
Goal: Make using and publishing Python projects as simple as possible. Understanding Python environments shouldn't be required to use dependencies safely.
Only works with Python ≥ 3.4. You don't need Python installed to use it; it will install the specified version of Python if not already installed.
There are two ways to install:

- Download a binary from the releases page. Installers are available for Debian/Ubuntu and Windows: on Debian or Ubuntu, download and run the `.deb`; on Windows, download and run the installer. Alternatively, download the appropriate binary (ie `pyflow.exe` or `pyflow`) and place it somewhere accessible by the system path, for example `/usr/bin` on Linux, or `~\AppData\Local\Programs\Python\Python37\bin` on Windows.
- If you have Rust installed, run `cargo install pyflow`.
- Run `pyflow init` in an existing project folder, or `pyflow new projname` to create a new project folder. `init` imports data from `requirements.txt` or `Pipfile`; `new` creates a folder with the basics.
- Run `pyflow install` to set up Python and sync dependencies with `pyproject.toml`, or to add dependencies to it.
- Run `pyflow python` to run Python.

To run a standalone script that isn't part of a project, add `__requires__ = [numpy, requests]` somewhere in the script, where `numpy` and `requests` are dependencies the script requires. Then run `pyflow script myscript.py`, where `myscript.py` is the name of your script. This will automatically set up an isolated environment for the script and install dependencies as required, without altering any other environment. This is a safe way to run one-off Python files that aren't attached to a project, but have dependencies.
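For example, a one-off script might look like the sketch below (the script body is hypothetical; the `__requires__` line is what pyflow reads):

```python
# myscript.py -- a hypothetical one-off script, not part of any project
import numpy
import requests

# pyflow parses this line to determine which packages the script needs
__requires__ = [numpy, requests]

print(numpy.arange(3))
print(requests.__version__)
```

Running `pyflow script myscript.py` creates an isolated environment containing `numpy` and `requests`, then executes the script in it.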
`Pipenv` and `Poetry` both address part of this problem.
Some reasons why this tool is different:
- It manages Python installations - it lets you choose which Python version (≥ 3.4) to use, selected in `pyproject.toml`. If that version is installed, it uses it; if not, it downloads a binary, stores it in `~/python-installs`, and uses that.
- By not using Python to install or run, it remains installation-agnostic and environment-agnostic. This is important for making setup and use as simple and decision-free as possible. It's especially important on Linux, where there may be several Python installations, with different versions and access levels. This avoids complications, especially for new users. It's common for Python-based CLI tools to not run properly when installed from `pip`, due to the `PATH` not being configured in the expected way.
- Its dependency resolution and locking are faster, due to using a cached database of dependencies, vice downloading and checking each package, or relying on the incomplete data available on the PyPi warehouse.
- It keeps dependencies in the project directory, in `__pypackages__`. This is subtle, but reinforces the idea that there's no hidden state to be concerned with.
- It will always use the specified version of Python. This is a notable issue, for example, with `Poetry`; it may pick the wrong installation (eg Python2 vice Python3), with no obvious way to change it.
- Multiple versions of a dependency can be installed, allowing resolution of conflicting sub-dependencies. (ie: Your package requires `Dep A>=1.0` and `Dep B`. `Dep B` requires `Dep A==0.9`.) There are many cases where `Poetry` and `Pipenv` will fail to resolve dependencies, but this approach lets us do so. Try it for yourself with a few random dependencies from pypi; there's a good chance you'll hit this problem using `Poetry` or `Pipenv`. Limitations: This will not work for some compiled dependencies, and attempting to package something using this will trigger an error.
Hopefully we're not replacing one problem with another.
Some people like the virtual-environment workflow - it requires only tools included with Python, and uses few console commands to create and activate environments. However, it may be tedious depending on your workflow: the commands may be long depending on the paths of virtual envs and projects, and it requires modifying the state of the terminal for each project, each time you use it, which you may find inconvenient or inelegant.
If you're satisfied with an existing flow, there may be no reason to change, but I think we can do better. This is especially relevant for new Python users who haven't grokked venvs, or are unaware of the hazards of working with a system Python.
`Pipenv` improves the workflow by automating environment use, and allowing reproducible dependency resolution. `Poetry` improves upon `Pipenv`'s API, speed, and dependency resolution, as well as improving the packaging and distribution process by using a consolidated project config. Both are sensitive to the Python environment used to run them. This tool attempts to improve upon both in the areas listed in the section above. Its goal is to be as intuitive as possible.
Conda
addresses these problems elegantly, but maintains a separate repository
of binaries from PyPi
. If all packages you need are available on Conda
, it may
be the best solution. If not, it requires falling back to Pip
, which means
using two separate package managers.
When building and deploying packages, a set of disparate files is traditionally used: `setup.py`, `setup.cfg`, `requirements.txt` and `MANIFEST.in`. We use `pyproject.toml` as the single source of project info required to build and publish.
These tools have different scopes and purposes:
| Name | Pip + venv | Pipenv | Poetry | pyenv | pythonloc | Conda | this |
|------|------------|--------|--------|-------|-----------|-------|-----|
| Manages dependencies | ✓ | ✓ | ✓ | | | ✓ | ✓|
| Manages Python installations | | | | ✓ | | ✓ | ✓ |
| Py-environment-agnostic | | | | ✓ | | ✓ | ✓ |
| Included with Python | ✓ | | | | | | |
| Stores packages with project | | | | | ✓ | | ✓|
| Locks dependencies | | ✓ | ✓ | | | ✓ | ✓|
| Requires changing session state | ✓ | | | ✓ | | | |
| Slow | | ✓ | | | | | |
| Clean build/publish flow | | | ✓ | | | | ✓ |
| Buggy | | | | | | | ✓ |
| Supports old Python versions | with virtualenv | ✓ | ✓ | ✓ | ✓ | ✓ | |
Optionally, create a `pyproject.toml` file in your project directory; otherwise, this file will be created automatically. You may wish to use `pyflow new` to create a basic project folder (with a `.gitignore`, source directory etc), or `pyflow init` to populate info from `requirements.txt` or `Pipfile`. See PEP 518 for details.

Example contents:

```toml
[tool.pyflow]
py_version = "3.7"
name = "runcible"
version = "0.1.0"
author = "John Hackworth"

[tool.pyflow.dependencies]
numpy = "^1.16.4"
diffeqpy = "1.1.0"
```
The `[tool.pyflow]` section is used for metadata. The only required item in it is `py_version`, unless building and distributing a package. The `[tool.pyflow.dependencies]` section contains all dependencies, and is an analog to `requirements.txt`.
You can specify extra
dependencies, which will only be installed when passing
explicit flags to pyflow install
, or when included in another project with the appropriate
flag enabled. Ie packages requiring this one can enable with
pip install -e
etc.
```toml
[tool.pyflow.extras]
test = ["pytest", "nose"]
secure = ["crypto"]
```
If you'd like to install a dependency with extras, use syntax like this:
```toml
[tool.pyflow.dependencies]
ipython = { version = "^7.7.0", extras = ["qtconsole"] }
```
For details on how to specify dependencies in this `Cargo.toml`-inspired semver format, reference this guide.
We also attempt to parse metadata and dependencies from tool.poetry
sections of pyproject.toml
, so there's no need to modify the format
if you're using that.
- `pyflow install` - Install all packages in `pyproject.toml`, and remove ones not (recursively) specified
- `pyflow install requests` - If you specify one or more packages after `install`, those packages will be added to `pyproject.toml` and installed
- `pyflow install numpy==1.16.4 matplotlib>=3.1` - Example with multiple dependencies, and specified versions
- `pyflow uninstall requests` - Remove one or more dependencies
- `pyflow python` - Run a Python REPL
- `pyflow python main.py` - Run a Python file
- `pyflow ipython`, `pyflow black` etc - Run a CLI script like `ipython`. This can either have been installed by a dependency, or specified under `[tool.pyflow]`, `scripts`
- `pyflow run ipython` - Alternate syntax for the above
- `pyflow script myscript.py` - Run a one-off script, outside a project directory, with per-file package management
- `pyflow package` - Package for distribution (uses setuptools internally, and builds both source and wheel if applicable.)
- `pyflow package --features "test all"` - Package for distribution with features enabled, as defined in `pyproject.toml`
- `pyflow publish` - Upload to PyPi (Repo specified in `pyproject.toml`. Uses `Twine` internally.)
- `pyflow list` - Display all installed packages and console scripts
- `pyflow new projname` - Create a directory containing the basics for a project: a readme, `pyproject.toml`, `.gitignore`, and a directory for code
- `pyflow init` - Create a `pyproject.toml` file in an existing project directory. Pull info from `requirements.txt` and `Pipfile` as required
- `pyflow reset` - Remove the environment, and uninstall all packages
- `pyflow clear` - Clear the global cache of downloaded packages, in `~/python-installs/dependency-cache`, and the global cache of one-off script environments, in `~/python-installs/script-envs`
- `pyflow -V` - Get the current version of this tool
- `pyflow help` - Get help, including a list of available commands

Running `pyflow install` syncs the project's installed dependencies with those specified in `pyproject.toml`. It generates `pyflow.lock`, which on subsequent runs keeps each dependency at a fixed version, as long as it continues to meet the constraints specified in `pyproject.toml`. Adding a package name via the CLI, eg `pyflow install matplotlib`, simply adds that requirement before proceeding. `pyflow.lock` isn't meant to be edited directly.
Each dependency listed in `pyproject.toml` is checked for a compatible match in `pyflow.lock`. If a constraint is met by something in the lock file, the version we'll sync will match that listed in the lock file. If not met, a new entry is added to the lock file, containing the highest version allowed by `pyproject.toml`.
Once complete, packages are installed and removed in order to exactly meet those listed
in the updated lock file.
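Roughly, the sync rule described above works like this (pyflow itself is written in Rust; the Python sketch below, and the helper names in it, are purely illustrative and not pyflow's actual API):

```python
# Illustrative sketch only, not pyflow's actual (Rust) implementation.
# `lock` maps package names to versions already pinned in pyflow.lock;
# `is_compatible` is the constraint from pyproject.toml for this package.
def version_to_sync(name, is_compatible, newest_allowed, lock):
    for pinned in lock.get(name, []):
        if is_compatible(pinned):       # constraint met by the lock file?
            return pinned               # keep the pinned version
    lock.setdefault(name, []).append(newest_allowed)  # add a new lock entry
    return newest_allowed


# Example: a "numpy 1.16.x" constraint with 1.16.4 already locked stays at 1.16.4.
lock = {"numpy": ["1.16.4"]}
print(version_to_sync("numpy", lambda v: v.startswith("1.16."), "1.17.0", lock))
```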
This tool downloads and unpacks wheels from pypi
, or builds
wheels from source if none are available. It verifies the integrity of the downloaded file
against that listed on pypi
using SHA256
, and the exact
versions used are stored in a lock file.
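As an illustration of that integrity check (again, illustrative Python rather than pyflow's Rust internals; the function and parameter names are made up), verifying a downloaded wheel against the SHA256 digest published on PyPi looks roughly like this:

```python
import hashlib

# Illustrative only: compare a downloaded wheel's SHA256 digest against
# the hash listed on PyPi for that exact file.
def wheel_matches_hash(wheel_path: str, expected_sha256: str) -> bool:
    with open(wheel_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest == expected_sha256
```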
When a dependency is removed from `pyproject.toml`, it and its subdependencies not also required by other packages are removed from the `__pypackages__` folder.
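The "not also required by other packages" part is essentially a reachability check over the dependency graph. A rough, illustrative Python sketch (hypothetical data shapes, not pyflow's Rust code):

```python
# Which installed packages are still needed after removing `removed`?
# `graph` maps each package to the packages it depends on.
def still_needed(graph, top_level, removed):
    keep, stack = set(), [p for p in top_level if p != removed]
    while stack:
        pkg = stack.pop()
        if pkg not in keep:
            keep.add(pkg)
            stack.extend(graph.get(pkg, []))
    return keep  # everything else under __pypackages__ can be deleted


graph = {"requests": ["urllib3", "idna"], "flask": ["click"], "urllib3": []}
print(still_needed(graph, ["requests", "flask"], "requests"))
# keeps flask and click; urllib3 and idna are no longer needed
```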
Compatible versions of dependencies are determined using info from
the PyPi Warehouse (available versions, and hash info),
and the pydeps
database. We use pydeps
, which is built specifically for this project,
due to inconsistent dependency information stored on pypi
. A dependency graph is built
using this cached database. We attempt to use the newest compatible version of each package.
If all packages are either only specified once, or specified multiple times with the same newest-compatible version, we're done resolving, and ready to install and sync.
If a package is included more than once with different newest-compatible versions, but one of those versions is compatible with all requirements, we install that one. If not, we search all versions to find one that's compatible.
If still unable to find a version of a package that satisfies all requirements, we install multiple versions of it as-required, store them in separate directories, and modify their parents' imports as required.
Note that it may be possible to resolve dependencies in cases not listed above, instead of installing multiple versions. Ie we could try different combinations of top-level packages, check for resolutions, then vary children as-required down the hierarchy. We don't do this because it's slow, has no guarantee of success, and involves installing older versions of packages.
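The selection step above can be sketched roughly as follows (illustrative Python, not pyflow's Rust internals; `requirements` here is a list of constraint predicates gathered from every package that needs the dependency):

```python
# Illustrative sketch only, not pyflow's actual (Rust) implementation.
def select_version(requirements, available_versions):
    """Return one version satisfying every requirement, or None if the
    caller must fall back to installing multiple versions side by side."""
    for version in available_versions:            # assumed newest-first
        if all(ok(version) for ok in requirements):
            return version
    return None


# Example from above: Dep A is required as >=1.0 by your package, ==0.9 by Dep B.
reqs = [lambda v: v >= (1, 0), lambda v: v == (0, 9)]
print(select_version(reqs, [(1, 2), (1, 0), (0, 9)]))  # prints None -> install both
```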
In order to build and publish your project, additional info is needed in `pyproject.toml`, mimicking what would be in `setup.py`. Example:
```toml
[tool.pyflow]
name = "everythingkiller"
py_version = "3.6"
version = "0.1.0"
author = "Fraa Erasmas"
author_email = "raz@edhar.math"
description = "Small, but packs a punch!"
homepage = "https://everything.math"
repository = "https://github.com/raz/everythingkiller"
license = "MIT"
classifiers = [
"Topic :: System :: Hardware",
"Topic :: Scientific/Engineering :: Human Machine Interfaces",
]
scripts = { activate = "jeejah:activate" }

[tool.pyflow.dependencies]
numpy = "^1.16.4"
manim = "0.1.8"
ipython = {version = "^7.7.0", extras=["qtconsole"]}
```
If you’d like to build from source, download and install Rust,
clone the repo, and in the repo directory, run cargo build --release
.
Ie on Linux:
```bash
curl https://sh.rustup.rs -sSf | sh
git clone https://github.com/david-oconnor/pyflow.git
cd pyflow
cargo build --release
```
If installed via Cargo, update by running `cargo install pyflow --force`.
If you notice unexpected behavior or missing features, please post an issue or submit a PR. If a dependency doesn't install correctly, it's probably a bug; post an issue listing the dependencies that did not install correctly.
Example calls to the `pydeps` API:

https://pydeps.herokuapp.com/requests
https://pydeps.herokuapp.com/requests/2.21.0

These pull all top-level dependencies for the `requests` package, and the dependencies for version `2.21.0` of it, respectively.
There is also a POST
API for pulling info on specified versions.
The first time a given package/version combo is queried, it may be slow. Subsequent calls, by anyone, should be fast. This is because the server must download and install each package to properly determine its dependencies, owing to unreliable information on the PyPi warehouse.
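For instance, the two GET endpoints above can be queried with any HTTP client; here is a minimal sketch using the `requests` library (the raw response is printed without assuming a particular format):

```python
import requests  # any HTTP client works; requests is just convenient here

# Top-level dependencies of the `requests` package, and of version 2.21.0 of it.
for url in (
    "https://pydeps.herokuapp.com/requests",
    "https://pydeps.herokuapp.com/requests/2.21.0",
):
    print(requests.get(url, timeout=30).text)
```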
A couple of gotchas:

- Make sure the `pyflow` binary is accessible in your path. If installing via a `deb`, `msi`, or Cargo, this should be set up automatically.
- Make sure `__pypackages__` and `.venv` are in your `.gitignore` file.