[10:39:11] <t4nk606> Hi! I'm working on a build pipeline for my org. I want to build a wheel, with dependencies on a custom devpi server, upload it to another custom devpi server, and then run tests using tox. I'm a bit baffled by the range of possible commands to build wheels using `pip wheel`, `easy_install`, `python setup.py bdist_wheel`. Are any of these considered best practice?
[10:43:12] <dstufft> t4nk606: well easy_install doesn't build wheels
[10:43:26] <dstufft> and pip wheel just calls python setup.py bdist_wheel under the covers
[10:45:19] <t4nk606> Fair enough. Is there any reason to use 'pip wheel` in that case?
[10:46:58] <dstufft> t4nk606: setup.py bdist_wheel will only build a wheel for the project whose setup.py it is; it also won't go out and fetch the tarball, so you'll have to do that yourself
[10:47:35] <dstufft> pip wheel adds the normal dependency resolution and fetching of tarballs to build wheels from, on top of that
[10:48:15] <dstufft> if you only want to build wheels for things you already have sitting on your desk, I'd probably just use setup.py bdist_wheel; if you want to build a bunch of wheels for something from an index and all of its dependencies, I'd use pip wheel
[10:49:18] <t4nk606> dstufft: When you say "won't go out and fetch the tarball", do you mean that setuptools won't fetch all of the dependencies?
[10:50:01] <t4nk606> I'm trying to do this to create a local devpi server to distribute apps across my organisation.
[10:50:58] <t4nk606> So it sounds like `pip wheel` is the way to go?
[10:51:15] <dstufft> t4nk606: python setup.py bdist_wheel won't fetch any dependencies, it'll build a wheel for just that project
[10:51:23] <dstufft> t4nk606: you might be interested in a different project actually
[10:55:20] <t4nk606> I'm trying to set up a CI pipeline, but this is my first time looking at python dependency management. I'm still confused by the flurry of standards and names that seem to be out there.
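A minimal sketch of the two approaches dstufft describes above (the devpi index URL, wheel directory, and package name are placeholders, not taken from this conversation):

    # build a wheel for just the local project (no dependency fetching)
    python setup.py bdist_wheel

    # build wheels for a package and all of its dependencies,
    # resolving them against a custom devpi index
    pip wheel --index-url https://devpi.example.org/root/prod/+simple/ \
        --wheel-dir ./wheelhouse myapp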
[12:40:05] <HolyGoat> Hi. It appears that a virtualenv is not entirely self-contained: when I pick up the virtualenv I created and copy it to a fresh machine (a docker image in my case), it cannot find any of the system packages. Is this intended behavior?
[12:40:32] <doismellburning> HolyGoat: this has been the cause of much sadness for me for a long time
[12:42:01] <mgedmin> for fast deployments you'll want to build wheels with pip wheel, then create a new virtualenv in your new docker image and pip install them
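A rough sketch of what mgedmin is suggesting (the paths and requirements file name are illustrative):

    # on the build host: collect wheels for the app and all of its dependencies
    pip wheel --wheel-dir /tmp/wheelhouse -r requirements.txt

    # inside the fresh docker image: make a clean virtualenv and install from that wheelhouse
    virtualenv /opt/app/env
    /opt/app/env/bin/pip install --no-index --find-links /tmp/wheelhouse -r requirements.txt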
[12:42:18] <HolyGoat> If I wanted to ship the Python interpreter, I'd have to compile it for my target platform and use that binary to run my virtualenv, right?
[12:42:33] <HolyGoat> I see, I need to look into wheels
[12:42:52] <mgedmin> a virtualenv contains a copy of the python interpreter, but no standard library modules that link against it
[12:43:06] <HolyGoat> I'm not actually looking to ship docker containers; rather, I'm using docker containers to build OS packages from my Python projects
[12:43:11] <mgedmin> you can't use a virtualenv with a different python interpreter from the one you used to create the virtualenv
[12:43:23] <mgedmin> (this also means if you upgrade your system python, you have to fix all your virtualenvs)
[12:43:29] <HolyGoat> Ah, got it. Thanks. That's important information for me
[12:44:12] <mgedmin> also, a virtualenv is not relocatable (by default; there's a flag but it's not 100% reliable): you can't copy a virtualenv to a different filesystem path and expect it to work
[12:44:16] <HolyGoat> So that essentially means that if I want to create for example a 2.7 virtualenv on an OS that depends on 2.6, it cannot be done?
[12:44:34] <HolyGoat> I know about this (and about --relocatable)
[12:45:45] <HolyGoat> that's an option, I suppose... and then put that interpreter first in the PATH when executing things in my virtualenv, you mean?
[12:46:06] <mgedmin> no: every script in a virtualenv's bin/ has the correct #! line and uses the right interpreter
[12:46:19] <mgedmin> just create a virtualenv with -p /path/to/your/python2.x
[12:46:28] <HolyGoat> so I'll need to update all scripts in the virtualenv instead
[12:46:42] <HolyGoat> Ah, right. Of course I'd be creating that virtualenv with the newly installed interpreter
[12:46:58] <mgedmin> uh, binary ABI problems will sabotage any desire to "convert" a Python 2.x virtualenv into a Python 2.y virtualenv
[12:48:48] <HolyGoat> mgedmin: So, say I fire up a docker machine, compile a new Python 2.7 there (alongside the system's 2.6), git clone my project, create a virtualenv from it, run python setup.py install and wrap up all this to deploy on a fresh container, that should work?
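Roughly that workflow as a sketch, given the constraints mentioned above (the deployment container needs the same interpreter available, and the virtualenv has to end up at the same filesystem path); the paths, version, and clone URL are illustrative:

    # build a private Python 2.7 alongside the system 2.6
    cd Python-2.7.9 && ./configure --prefix=/opt/python2.7 && make && make install

    # create the virtualenv against that interpreter and install the project into it
    virtualenv -p /opt/python2.7/bin/python2.7 /opt/app/env
    git clone https://example.org/myproject.git
    cd myproject && /opt/app/env/bin/python setup.py install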
[12:52:48] <doismellburning> I seem to be on 1.11.6
[12:52:54] <HolyGoat> mgedmin: thanks a lot for your help!
[12:53:12] <mgedmin> actually I can't reproduce the error on 12.0.5
[12:53:30] <mgedmin> https://github.com/pypa/virtualenv/issues/671 is still bugging me, but I can work around that by pip uninstall pies2overrides
[13:39:56] <ionelmc> mgedmin: this seems to work for me on windows https://github.com/ionelmc/virtualenv/commit/aff2ca08509be795650e38c94f1f4442d7b227fa
[14:06:20] <ionelmc> Ivo: it seems to work fine with mgedmin's fix: https://github.com/ionelmc/virtualenv/commit/aff2ca08509be795650e38c94f1f4442d7b227fa
[14:06:20] <mgedmin> and it doesn't fix #671 because the error happens on line 8, during "import base64", before sys.path[0] is removed
[14:06:53] <Ivo> there was yet another pull to put the import base64 before the sys.path[0] removal
[14:07:04] <ionelmc> well anyway, I don't really want to push for a fix on this, I can live with my patched version
[14:07:25] <ionelmc> hopefully I'll dogfood the virtualenv rewrite soon :-)
[14:07:34] <ionelmc> dstufft: how's the review going?
[14:07:40] <Ivo> I'm just wondering if there was any safe pull amongst the reverted ones that wasn't part of the crash-everything ones
[14:08:49] <dstufft> ionelmc: I've poked at it a little bit; I don't know if I'll get to it today, I'm probably going to need to leave sooner or later. I slipped on some ice and hurt my ankle and I'm probably going to need to get it looked at
[14:09:40] <ionelmc> dstufft: well anyway, for 3.2/pypy support ...
[14:09:49] <ionelmc> looks like I'll need a real module locator
[14:10:06] <ionelmc> à la the legacy virtualenv, but without triggering imports
[14:10:11] <dstufft> what do you mean by module locator?
[16:29:10] <mgedmin> there isn't one; you can use https://warehouse.python.org/project/pyOpenSSL/ instead ("All Versions" in the sidebar on the right) or
[18:03:09] <mgedmin> "setuptools-11.3.1-py2.6.egg/setuptools/command/egg_info.py:171: DeprecationWarning: Parameters to load are deprecated. Call .resolve and .require separately."
[18:03:30] <mgedmin> this shows up on Python 2.6 only
[18:30:56] <indygreg> I'm trying to get Review Board to switch from easy_install/eggs to pip/wheels and I could use some technical expertise to answer some of the maintainer's concerns. https://groups.google.com/forum/#!topic/reviewboard-dev/5_H9jULhYUs
[18:32:08] <indygreg> they have a use case where commercial extensions are currently distributed in .pyc-only eggs (so source code doesn't leak). what's the non-egg story to support this use case? afaik wheels purposefully do not do .pyc distribution
[18:33:39] <Wooble> distributing .pyc files for security reasons is just giving yourself a false sense of security.
[18:34:21] <tomprince> In any case, distributing a wheel for the base package doesn't prevent extensions from distributing other things.
[18:34:59] <indygreg> indeed. but some people believe in security through obscurity :/
[18:35:45] <indygreg> what about pkg_resources in wheels?
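For what it's worth, pkg_resources' resource and metadata APIs aren't egg-specific; they also work for packages installed from wheels, which unpack their files and dist-info metadata onto the filesystem. A minimal sketch (the package and data file names are hypothetical):

    # read a data file bundled inside an installed package, however it was installed
    import pkg_resources
    data = pkg_resources.resource_string('myextension', 'templates/config.tmpl')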
[19:46:40] <wsanchez> Given my project's setup.py, is it possible to get pip to compute the dependency graph for me and print it out?
[19:48:29] <tdsmith> mostly no afaict but the tl.eggdeps project can tell you dependency graphs for installed packages
[19:50:56] <doismellburning> or a few short lines of networkx code
[19:53:01] <wsanchez> I want to have a requirements-stable.txt file that lists every dependency with a specific version, which is "this is what we tested".
[19:53:33] <wsanchez> And then various versions of that, so buildbots can test different combos in virtualenvs, etc.
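One common way to produce that kind of pinned list is to install the project into a clean virtualenv and freeze it (a sketch; the file name is whatever you choose, and the project itself will also show up in the output):

    pip install .
    pip freeze > requirements-stable.txt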
[19:55:08] <tdsmith> doismellburning: what api knows about dependencies?
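One API that does know about installed packages' dependencies is pkg_resources, which is presumably what those "few short lines of networkx code" would be fed from; a rough sketch, starting from a hypothetical project name:

    import pkg_resources

    def print_dep_tree(name, seen=None):
        """Recursively print dependency edges for an installed distribution."""
        if seen is None:
            seen = set()
        seen.add(name)
        dist = pkg_resources.get_distribution(name)
        for req in dist.requires():
            print("%s -> %s" % (dist.project_name, req.project_name))
            if req.project_name not in seen:
                print_dep_tree(req.project_name, seen)

    print_dep_tree('myproject')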