„ionel” is read like „yonel”, @ionelmc, blog.ionelmc.ro
setup.py: it's a really nasty archiver.
setuptools adds very useful improvements (detailed later on). There's no reason not to use it. Even pip depends on it nowadays.
Boils down to having a file setup.py with:
from setuptools import setup

setup(name="mypackage", packages=["mypackage"], **lots_of_kwargs)
And running python setup.py sdist bdist_wheel.
Mandatory clarifications: packages vs distributions
importable packages:
├── package1
│   ├── __init__.py
│   ├── module.py
│   └── subpackage
│       └── __init__.py
└── package2
    ├── __init__.py
    ├── module.py
    └── subpackage
        └── __init__.py
distribution packages:
lazy-object-proxy-1.2.0.tar.gz
lazy_object_proxy-1.2.0-cp27-none-win32.whl
lazy_object_proxy-1.2.0-cp27-none-win_amd64.whl
lazy_object_proxy-1.2.0-cp34-none-win32.whl
lazy_object_proxy-1.2.0-cp34-none-win_amd64.whl
They are actually called distributions.
packaging.python.org calls them distribution packages to avoid some of the confusion.
Two kinds: source distributions (sdist) and built distributions (wheels).
They have different rules for gathering the files because they generally contain different files.
├── foo
│   ├── __init__.py
│   ├── utils.py
│   └── bar
│       └── __init__.py
└── other
    ├── __init__.py
    ├── module.py
    └── subpackage
        └── __init__.py
Packages for that tree: foo, foo.bar, other, other.subpackage.
Don't hard-code the list of packages; use setuptools.find_packages().
Don't:
setup(
    ...
    # everything is fine and dandy until one day someone
    # converts foo/utils.py to a package
    # and forgets to add `foo.utils`
    packages=['foo', 'foo.bar', 'other', 'other.subpackage']
)
Do:
setup(
    ...
    packages=find_packages()
)
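If the repository also contains test packages you don't want to ship, find_packages() accepts an exclude argument. A minimal sketch (the tests/ layout here is an assumption, not part of the example above):

from setuptools import setup, find_packages

setup(
    name='mypackage',
    # ship everything except the test packages
    packages=find_packages(exclude=['tests', 'tests.*']),
)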
Data files (templates, static assets, docs) are not picked up automatically; you list them in MANIFEST.in. Example project:

├── docs
│   ├── changelog.rst
│   ├── conf.py
│   ├── index.rst
│   ├── installation.rst
│   └── usage.rst
└── mypackage
    ├── __init__.py
    ├── static
    │   ├── button.png
    │   └── style.css
    ├── templates
    │   └── base.html
    └── views.py
Too fine-grained, prone to missing files:
recursive-include mypackage *.html *.css *.png *.xml *.py
include docs/changelog.rst
Just take whatever you have on the filesystem:
graft mypackage
graft docs
global-exclude *.py[cod] __pycache__ *.so
When choosing the MANIFEST.in commands, consider that dirty releases are better than unusable releases.
A couple of harmless stray files are less bad than missing required files.
Use Git/Mercurial, don't release with untracked files.
Use check-manifest (integrate it in your CI or test suite; see the sketch below).
Consider using the setuptools_scm extension instead of MANIFEST.in (it takes all the files from Git/Mercurial).
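A minimal sketch of wiring check-manifest into the test suite, assuming pytest and that the check-manifest CLI is installed and on PATH (the test module and function names are made up):

# test_manifest.py -- fails the build when MANIFEST.in misses tracked files
import subprocess

def test_manifest_is_complete():
    # check-manifest exits with a non-zero status if the sdist would miss files,
    # which makes check_call raise and the test fail
    subprocess.check_call(["check-manifest"])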
With distutils you'd have to use package_data to include data files in packages.
However, setuptools adds the include_package_data option:
if it is True, files from MANIFEST.in get included, provided they are inside a package.
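Combined with the graft mypackage line above, a setup.py along these lines ships the templates and static files with the package (a sketch; the metadata is made up):

from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="1.0.0",
    packages=find_packages(),
    include_package_data=True,  # pull in MANIFEST.in files that live inside packages
)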
Do not use package_data. Don't use both MANIFEST.in and package_data; pick the easiest option (MANIFEST.in + include_package_data=True).
Why is MANIFEST.in better? Because there's less code in your setup.py.
Less code, more configuration.
Then there's data_files, for installing files outside the package:

data_files=[
    ('config', ['cfg/data.cfg']),
    ('/etc/init.d', ['init-script']),
]
Avoid it like the plague. It's too inconsistent to be of any general use. If you need to install files to system locations, use a proper installer or system package instead:
- deb (dh-virtualenv, py2deb)
- rpm
- pynsist (Windows)
- or your own CustomThing™ (NSIS, makeself etc)
Declare runtime dependencies with install_requires:

setup(
    ...
    install_requires=[
        'Jinja2',
        ...
    ]
)
Don't do this (importing your own package from setup.py breaks when its dependencies aren't installed yet):
from setuptools import setup
from mypackage import __version__

setup(
    name='mypackage',
    version=__version__,
    ...
)
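A common alternative (a sketch, not the only option): read the version string from the source file with a regex instead of importing the package, so setup.py works even before the dependencies are installed:

import re
from setuptools import setup

with open('mypackage/__init__.py') as fh:
    # expects a line like: __version__ = "1.2.3"
    version = re.search(r'__version__\s*=\s*["\']([^"\']+)["\']', fh.read()).group(1)

setup(
    name='mypackage',
    version=version,
)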
Running a package with python -m mypackage has been supported since Python 2.7; it just needs a __main__.py:
mypackage
├── __init__.py
├── cli.py
└── __main__.py
In __main__.py you'd have something like:
from mypackage.cli import main

if __name__ == "__main__":
    main()
You should never import anything from __main__ because python -m mypackage will run it as a script (thus creating double execution issues).
Then in setup.py:
setup(
    ...
    entry_points={
        'console_scripts': [
            'mytool = mypackage.cli:main',
        ]
    }
)
Advantages over using setup(scripts=['mytool']): you get real .exe launchers on Windows, and the actual code stays in an importable (and testable) module instead of a standalone script file.
Optional features go in extras_require:

setup(
    ...
    extras_require={
        'pdf': ['reportlab'],
    },
)
Then you run pip install "mypackage[pdf]" to get support for PDF output.
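At runtime the package has to cope with the extra being absent; a minimal sketch (the module path and HAVE_PDF flag are made up):

# mypackage/pdf.py -- degrade gracefully when the 'pdf' extra isn't installed
try:
    import reportlab  # only present after `pip install "mypackage[pdf]"`
    HAVE_PDF = True
except ImportError:
    reportlab = None
    HAVE_PDF = False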
Some people abuse this feature for development/test dependencies.
It works but you entangle your setup.py with development concerns.
Tox is a good solution for development environments.
# content of: tox.ini , put in same dir as setup.py
[tox]
envlist = py26,py27

[testenv]
deps=pytest       # install pytest in the venvs
commands=py.test  # or 'nosetests' or ...
An underused feature. Declarative conditional dependencies:
setup(
    ...
    extras_require={
        ':python_version=="2.6"': ['argparse'],
        ':sys_platform=="win32"': ['colorama'],
    },
)
Why: you can build universal wheels that have conditional dependencies.
Environment markers are supported since setuptools 0.7.
More reading: wheel docs, PEP-426.
Measuring coverage of C extensions is easy to do on Linux (build with gcc's -coverage flag):
export CFLAGS=-coverage
python setup.py clean --all build_ext --force --inplace
# run tests
Example on Coveralls:
Twine - secure upload to PyPI:
twine upload dist/*
Interesting recent change: PyPI doesn't allow reuploading distributions anymore. You can only delete.
There's a cookiecutter template that bakes in a lot of the ideas presented here:
Thank you!