PMXBOT Log file Viewer

#pypa logs for Wednesday the 7th of January, 2015

[00:10:28] <prologic> wrt to installed wheels vs eggs the difference is .dist-info vs .egg-info in site-packages right?
[00:19:30] <ionelmc> prologic: eggs can also be installed as zip files, and you can end up with multiple versions of the same package in site-packages
[00:19:55] <ionelmc> pretty sure there are other differences
[00:27:49] <prologic> sure
[00:27:56] <prologic> I'm working in clean virtualenvs
[00:28:00] <prologic> so no risk of multiple versions hopefully
[00:28:01] <prologic> :)
[01:05:43] <prologic> Cleaning up...
[01:05:43] <prologic> pip install --no-index -r requirements.txt 9.92s user 2.95s system 66% cpu 19.369 total
[01:05:43] <prologic> :)
[01:06:02] <prologic> wheels are great
[10:39:11] <t4nk606> Hi! I'm working on a build pipeline for my org. I want to build a wheel, with dependencies on a custom devpi server, upload it to another custom devpi server, and then run tests using tox. I'm a bit baffled by the range of possible commands to build wheels using `pip wheel`, `easy_install`, `python setup.py bdist_wheel`. Are any of these considered to be best practice?
[10:42:44] <t4nk606> s/easy_install//
[10:43:12] <dstufft> t4nk606: well easy_install doesn't build wheels
[10:43:26] <dstufft> and pip wheel just calls python setup.py bdist_wheel under the covers
[10:45:19] <t4nk606> Fair enough. Is there any reason to use 'pip wheel` in that case?
[10:46:58] <dstufft> t4nk606: setup.py bdist_wheel will only build a wheel for the project whose setup.py it is, it also won't go out and fetch the tarball so you'll have to do that yourself
[10:47:35] <dstufft> pip wheel adds the normal dependency resolution and tarball fetching on top of that, building wheels for everything it resolves
[10:48:15] <dstufft> if you only want to build wheels for things you already have sitting on your desk, I'd probably just use setup.py bdist_wheel; if you want to build a bunch of wheels for something from an index and all of its dependencies, I'd use pip wheel
[10:49:18] <t4nk606> dstufft: When you say "won't go out and fetch the tarball", do you mean that setuptools won't fetch all of the dependencies?
[10:50:01] <t4nk606> I'm trying to do this to create a local devpi server to distribute apps across my organisation.
[10:50:58] <t4nk606> So it sounds like `pip wheel` is the way to go?
[10:51:15] <dstufft> t4nk606: python setup.py bdist_wheel won't fetch any dependencies, it'll build a wheel for just that project
[10:51:23] <dstufft> t4nk606: you might be interested in a different project actually
[10:51:23] <dstufft> sec
[10:51:37] <dstufft> t4nk606: https://warehouse.python.org/project/devpi-builder/
[10:52:18] <t4nk606> Interesting.
[10:55:20] <t4nk606> I'm trying to set up a CI pipeline, but this is my first time looking at python dependency management. I'm still confused by the flurry of standards and names that seem to be out there.
[10:55:25] <t4nk606> Cheers for your help.
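To make dstufft's distinction concrete, the two build paths look roughly like this (the index URL and project name are placeholders, not anything from the discussion):

    # Build a wheel for just the project whose setup.py you have checked out;
    # no dependencies are fetched or built.
    python setup.py bdist_wheel

    # Resolve the project and all of its dependencies from an index (e.g. a
    # devpi server), downloading tarballs and building a wheel for each one
    # into ./wheelhouse.
    pip wheel --index-url https://devpi.example.org/root/pypi/+simple/ --wheel-dir ./wheelhouse myproject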
[12:40:05] <HolyGoat> Hi. It appears that a virtualenv is not entirely self-contained: when I copy a virtualenv I created onto a fresh machine (a docker image in my case), it cannot find any of the system packages. Is this intended behavior?
[12:40:32] <doismellburning> HolyGoat: this has been the cause of much sadness for me for a long time
[12:41:34] <mgedmin> it's intended
[12:42:01] <mgedmin> for fast deployments you'll want to build wheels with pip wheel and then create a new virtualenv in your new docker image then pip install them
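A minimal sketch of the recipe mgedmin describes (paths are placeholders; this shows the shape of the workflow, not an exact Dockerfile):

    # On the build host: compile wheels for the app and every dependency.
    pip wheel -r requirements.txt --wheel-dir ./wheelhouse

    # In the fresh docker image: make a new virtualenv there, then install
    # only from the pre-built wheels, with no network access needed.
    virtualenv /srv/app/env
    /srv/app/env/bin/pip install --no-index --find-links=./wheelhouse -r requirements.txt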
[12:42:18] <HolyGoat> If I would want to ship the python interpreter, I'd have to compile it for my target platform, and use this binary to run my virtualenv, right?
[12:42:33] <HolyGoat> I see, I need to look into wheels
[12:42:52] <mgedmin> a virtualenv contains a copy of the python interpreter, but no standard library modules that link against it
[12:43:06] <HolyGoat> I'm not actually looking to ship docker containers; rather, I'm using docker containers to build OS packages from my Python projects
[12:43:11] <mgedmin> you can't use a virtualenv with a different python interpreter from the one you used to create the virtualenv
[12:43:23] <mgedmin> (this also means if you upgrade your system python, you have to fix all your virtualenvs)
[12:43:29] <HolyGoat> Ah, got it. Thanks. That's important information for me
[12:44:12] <mgedmin> also, a virtualenv is not relocatable (by default; there's a flag but it's not 100% reliable): you can't copy a virtualenv to a different filesystem path and expect it to work
[12:44:16] <HolyGoat> So that essentially means that if I want to create for example a 2.7 virtualenv on an OS that depends on 2.6, it cannot be done?
[12:44:34] <HolyGoat> I know about this (and about --relocatable)
[12:44:40] <mgedmin> essentially yes
[12:44:53] <HolyGoat> I see.. that's a shame
[12:45:09] <mgedmin> an OS depending on Python 2.6 doesn't preclude you from installing Python 2.7 side-by-side next to it
[12:45:22] <mgedmin> e.g. for Ubuntu there's a PPA that has all the pythons from 2.6 to 3.4
[12:45:26] <mgedmin> (excluding 3.0, iirc)
[12:45:45] <HolyGoat> that's an option I suppose.. and then put that interpreter as first in the PATH when executing things in my virtualenv, you mean?
[12:46:06] <mgedmin> no: every script in a virtualenv's bin/ has the correct #! line and uses the right interpreter
[12:46:19] <mgedmin> just create a virtualenv with -p /path/to/your/python2.x
[12:46:28] <HolyGoat> so I'll need to update all scripts in the virtualenv instead
[12:46:42] <HolyGoat> Ah, right. Of course I'd be creating that virtualenv with the newly installed interpreter
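(In other words, something like `virtualenv -p /opt/python2.7/bin/python2.7 /srv/app/env`, where the interpreter path is a placeholder for wherever the side-by-side 2.7 was installed; every script in the resulting bin/ then carries that interpreter in its #! line.)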
[12:46:58] <mgedmin> uh, binary ABI problems will sabotage any desire to "convert" a Python 2.x virtualenv into a Python 2.y virtualenv
[12:47:40] <mgedmin> doismellburning, https://github.com/pypa/virtualenv/issues/673 perhaps?
[12:48:25] <doismellburning> oooh perhaps, thanks!
[12:48:48] <HolyGoat> mgedmin: So, say I fire up a docker machine, compile a new Python 2.7 there (alongside the system's 2.6), git clone my project, create a virtualenv from it, run python setup.py install and wrap up all this to deploy on a fresh container, that should work?
[12:49:10] <mgedmin> HolyGoat, yes, afaics
[12:52:48] <doismellburning> I seem to be on 1.11.6
[12:52:54] <HolyGoat> mgedmin: thanks a lot for your help!
[12:53:12] <mgedmin> actually I can't reproduce the error on 12.0.5
[12:53:30] <mgedmin> https://github.com/pypa/virtualenv/issues/671 is still bugging me, but I can work around that by pip uninstall pies2overrides
[13:39:56] <ionelmc> mgedmin: this seems to work for me on windows https://github.com/ionelmc/virtualenv/commit/aff2ca08509be795650e38c94f1f4442d7b227fa
[13:40:10] <ionelmc> on top of latest virtualenv
[13:43:47] <ionelmc> wadda ya mean you can't repro it
[13:45:30] <mgedmin> I mean virtualenv -p python3.4 /tmp/py34 works fine for me on ubuntu 12.10
[13:45:45] <mgedmin> if I pip upgrade virtualenv to 12.0.5
[13:46:11] <mgedmin> I'm talking about the error in https://github.com/pypa/virtualenv/issues/673
[13:49:40] <ionelmc> mgedmin: ah right, it's fine for linux
[13:49:46] <ionelmc> on windows it's broken
[13:50:33] <mgedmin> oh, interesting
[13:51:58] <Ivo> ionelmc: wat, a subprocess'ed python can pollute the parent's sys.path?
[13:52:14] <Ivo> ...only in windows?
[13:52:39] <Ivo> (relating to the commit you linked)
[13:54:04] <mgedmin> Ivo, os.path.dirname(sys.argv[0]) ends up on sys.path of the child process
[13:54:53] <mgedmin> so when you run python3.4 /usr/lib/python2.7/site-packages/virtualenv.py, you end up with the 2.7 site-packages on the sys.path
[13:55:23] <Ivo> mgedmin: think about just using the cwd= argument to Popen?
[13:55:34] <mgedmin> it's not cwd, it's the directory of the script, iirc
[13:55:36] <mgedmin> am I mistaken?
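The behaviour mgedmin is describing is standard CPython startup: when the interpreter runs a script, it prepends the script's directory, not the current working directory, to sys.path. A tiny hypothetical script makes it visible:

    # showpath.py -- placed, for the sake of the example, in
    # /usr/lib/python2.7/site-packages/
    import sys

    # sys.path[0] is the directory containing this script, regardless of any
    # cwd= argument the parent passed to subprocess.Popen.
    print(sys.path[0])

Running `python3.4 /usr/lib/python2.7/site-packages/showpath.py` prints the 2.7 site-packages directory, which is exactly the pollution virtualenv's -p re-exec runs into.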
[13:56:22] <Ivo> oh, wasn't there another pull to try and fix that
[13:56:48] <mgedmin> I created two alternative fixes (#672, #674), one of them got merged, then reverted
[13:57:34] <ionelmc> i've added some examples here https://github.com/pypa/virtualenv/issues/705
[13:58:07] <mgedmin> ionelmc, looks like https://github.com/pypa/virtualenv/issues/671
[13:58:16] <ionelmc> yup
[13:58:30] <mgedmin> do you have pies2overrides installed?
[13:58:43] <Ivo> mgedmin: you know what it broke that made it need reverting?
[13:58:58] <Ivo> or did dstufft just revert all that en masse
[13:59:04] <mgedmin> Ivo, no, dstufft linked to an etherpad with some vague notes, but I wasn't able to reproduce :(
[13:59:16] <Ivo> I wish I got to tell him to just do some more 1.11 releases
[13:59:27] <dstufft> I was able to reproduce it
[13:59:38] <Ivo> virtualenv develop really wasn't in a solid state when it was pushed to 12.0
[14:00:33] <ionelmc> that sys.path pollution from script name is very weird
[14:01:43] <mgedmin> ionelmc, do you have pies2overrides installed?
[14:01:58] <dstufft> I tried everything I could think of and it wasn't working, so I reverted us back to 1.11 state
[14:02:18] <Ivo> so https://github.com/pypa/virtualenv/pull/648 was one direct fix that got pulled
[14:02:19] <dstufft> and started on https://github.com/pypa/virtualenv/pull/697 to remove the terrible -p behavior.
[14:02:41] <ionelmc> mgedmin: nope
[14:02:43] <ionelmc> https://www.irccloud.com/pastebin/GcJyRo03
[14:02:53] <Ivo> dstufft: did you check for brokenness with each different pull or not enough time to at the time?
[14:03:00] <mgedmin> ionelmc, it's not importable; check with pip list
[14:03:14] <dstufft> If I recall, all I had to do to reproduce it was spin up an ubuntu 14.04 box and attempt to create a virtualenv with -p
[14:03:23] <ionelmc> mgedmin: shiet.... it's installed
[14:03:26] <ionelmc> what the hell :)
[14:03:27] <mgedmin> ah, 14.04!
[14:03:42] <mgedmin> ionelmc, some packages use it instead of 'six' for python2-and-3 compatibility in a single codebase
[14:03:46] <dstufft> one fix would make it so -ppython2 would work but -ppython3 would break, and the other would reverse it
[14:03:48] <mgedmin> 'isort' is one such
[14:04:00] <dstufft> if I remember correctly
[14:04:11] <dstufft> I believe the error also occurred in ubuntu 12.04, but I didn't personally try it
[14:04:45] <dstufft> Ivo: I believe I tried each pull and tried some in combination
[14:05:00] <Ivo> ionelmc: you could try adding in https://github.com/pypa/virtualenv/pull/648 again for the sys.path thing you were working on
[14:05:33] <mgedmin> Ivo, that commit is buggy: it causes https://github.com/pypa/virtualenv/issues/673
[14:05:41] <ionelmc> Ivo: was that reverted in 12?
[14:05:46] <mgedmin> because sys.path[0] gets removed twice!
[14:05:52] <mgedmin> once at the top level, and once inside a function
[14:05:55] <Ivo> it was reverted in a 12.0.x
[14:06:06] <Ivo> =_= lol
[14:06:20] <ionelmc> Ivo: it seems to work fine with mgedmin's fix: https://github.com/ionelmc/virtualenv/commit/aff2ca08509be795650e38c94f1f4442d7b227fa
[14:06:20] <mgedmin> and it doesn't fix #671 because the error happens on line 8, during "import base64", before sys.path[0] is removed
[14:06:53] <Ivo> there was yet another pull to put the import base64 after the sys.path[0] removal
[14:07:00] <mgedmin> that was #674
[14:07:04] <ionelmc> well anyway, i don't wanna really push for a fix on this, i can live with my patched version
[14:07:25] <ionelmc> hopefully i'll dogfood the virtualenv rewrite soon :-)
[14:07:34] <ionelmc> dstufft: how's the review going?
[14:07:40] <Ivo> I'm just wondering if there was any safe pull amongst the reverted ones that wasn't part of the crash-everything ones
[14:08:49] <dstufft> ionelmc: I've poked at it a little bit, I don't know if I'll get to it today, I'm probably going to need to leave sooner or later. I slipped on some ice and hurt my ankle and I'm probably going to need to get it looked at
[14:09:40] <ionelmc> dstufft: well anyway, for 3.2/pypy support ...
[14:09:49] <ionelmc> looks like i'll need a real module locator
[14:10:06] <ionelmc> a-la the legacy virtualenv, but without triggering imports
[14:10:11] <dstufft> what do you mean by module locator
[14:10:27] <ionelmc> ok
[14:10:42] <ionelmc> so, remember this https://github.com/pypa/virtualenv/commit/a54138b1721baa353f4445e0c418c802828d4af0 ?
[14:10:57] <ionelmc> the bootstrap modules can be all over the place
[14:11:05] <ionelmc> so i'd need to have something like that
[14:11:19] <dstufft> ah, I see
[14:11:21] <ionelmc> i think i can just pull out sys.path from the subinterpreter
[14:11:37] <ionelmc> and then just look up all those for the bootstrap stuff
[14:11:42] <ionelmc> copy the first match
[14:11:49] <ionelmc> no imports :)
[14:12:08] <ionelmc> that's my plan for 3.2/pypy support
[14:12:38] <ionelmc> that way i don't need to code for any implementation-specific lib paths
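Roughly, the plan ionelmc is sketching could look like the following (the function name and the example interpreter path are made up for illustration; this is not virtualenv code):

    import json
    import os
    import subprocess

    def find_bootstrap_file(python_exe, filename):
        """Locate `filename` (e.g. 'os.py') on the target interpreter's own
        sys.path without importing anything in the current process."""
        # Ask the subinterpreter for its sys.path.
        out = subprocess.check_output(
            [python_exe, "-c", "import sys, json; print(json.dumps(sys.path))"])
        # Walk the entries in order and return the first match.
        for directory in json.loads(out.decode()):
            candidate = os.path.join(directory, filename)
            if directory and os.path.isfile(candidate):
                return candidate
        return None

    # e.g. find_bootstrap_file("/usr/bin/pypy", "os.py")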
[14:16:09] <dstufft> there might be something in importlib that we can steal
[14:24:56] <ionelmc> dstufft: it also needs to copy dirs like config, which are not packages
[14:25:29] <ionelmc> so i think it can be very simple (dumb), no fancy package handling and checks
[14:43:25] <mgedmin> did setuptools 11.x change the way ez_setup works?
[14:43:40] <mgedmin> https://travis-ci.org/buildout/buildout/jobs/46070031#L98 is a new failure in buildout's tests
[14:44:01] <mgedmin> this is what it's doing and failing: https://github.com/buildout/buildout/blob/master/dev.py#L50-L61
[14:46:49] <mgedmin> I think it's assuming zipped eggs for setuptools and pkg_resources
[14:50:17] <mgedmin> ah no, it's assuming a single zipped egg for both setuptools and pkg_resources
[14:50:20] <mgedmin> should be easy to fix
[14:52:32] <mgedmin> wait, no, it's assuming pkg_resources.py is a module and not a package?
[14:53:35] <mgedmin> I just can't today
[16:18:50] <mgedmin> ok I think I have a fix for buildout
[16:23:51] <Ivo> pkg_resources is a module?
[16:23:59] <mgedmin> used to be, before setuptools 8.3
[16:26:08] <doismellburning> am I being dense? where's the link to "all versions" on https://pypi.python.org/pypi/pyOpenSSL etc. ?
[16:29:05] <ionelmc> doismellburning:
[16:29:06] <ionelmc> https://pypi.python.org/pypi/pyOpenSSL/json
[16:29:10] <mgedmin> there isn't one; you can use https://warehouse.python.org/project/pyOpenSSL/ instead ("All Versions" in the sidebar on the right) or
[16:29:16] <doismellburning> ah, thanks
[16:29:20] <mgedmin> https://pypi.python.org/simple/pyopenssl/
[16:29:25] <doismellburning> (also, _ah_, there's _only_ 0.14)
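For scripting the same lookup, the JSON endpoint ionelmc pointed at returns a "releases" mapping keyed by version string; a small sketch (Python 3 stdlib; on Python 2 use urllib2.urlopen instead):

    import json
    from urllib.request import urlopen

    data = json.loads(urlopen("https://pypi.python.org/pypi/pyOpenSSL/json").read().decode("utf-8"))
    # Note: this is a plain string sort, not a proper version ordering.
    for version in sorted(data["releases"]):
        print(version)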
[18:03:09] <mgedmin> "setuptools-11.3.1-py2.6.egg/setuptools/command/egg_info.py:171: DeprecationWarning: Parameters to load are deprecated. Call .resolve and .require separately."
[18:03:30] <mgedmin> this shows up on Python 2.6 only
[18:30:56] <indygreg> i'm trying to get review board to switch from easy_install/eggs to pip/wheels and i could use some technical expertise to answer some of the maintainer's concerns. https://groups.google.com/forum/#!topic/reviewboard-dev/5_H9jULhYUs
[18:32:08] <indygreg> they have a use case where commercial extensions are currently distributed in .pyc-only eggs (so source code doesn't leak). what's the non-egg story to support this use case? afaik wheels purposefully do not do .pyc distribution
[18:33:39] <Wooble> distributing .pyc files for security reasons is just giving yourself a false sense of security.
[18:34:21] <tomprince> In any case, distributing a wheel for the base package doesn't prevent extensions from distributing other things.
[18:34:59] <indygreg> indeed. but some people believe in security through obscurity :/
[18:35:45] <indygreg> what about pkg_resources in wheels?
[18:35:49] <indygreg> does that "just work"?
[18:36:08] <indygreg> (this is unclear from the Packaging User Guide and the pkg_resources docs)
[18:36:16] <tomprince> Well, wheels are expanded on install.
[18:36:28] <tomprince> Running them zipped isn't supported.
[18:38:01] <indygreg> but pkg_resources.resource_filename and pkg_resources.resource_string etc will continue to work with wheels?
[18:38:20] <tomprince> Yes.
[18:38:26] <indygreg> thanks
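For context, the pkg_resources calls in question look like this (the package and resource names are made up for illustration); because pip unpacks wheels into site-packages at install time, both resolve against real files on disk:

    import pkg_resources

    # Absolute filesystem path to a data file shipped inside the installed package.
    path = pkg_resources.resource_filename("mypackage", "templates/default.cfg")

    # The same resource's contents as a byte string.
    data = pkg_resources.resource_string("mypackage", "templates/default.cfg")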
[19:46:40] <wsanchez> Given my project's setup.py, is it possible to get pip to compute the dependency graph for me and print it out?
[19:48:29] <tdsmith> mostly no afaict but the tl.eggdeps project can tell you dependency graphs for installed packages
[19:50:56] <doismellburning> or a few short lines of networkx code
[19:53:01] <wsanchez> I want to have a requirements-stable.txt file that lists every dependency with a specific version, which is "this is what we tested".
[19:53:33] <wsanchez> And then various versions of that, so buildbots can test different combos in virtualenvs, etc.
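(For the pinned file itself, running `pip freeze > requirements-stable.txt` inside a virtualenv that has the tested versions installed produces exactly that kind of list, although it records what is installed rather than computing the graph from setup.py.)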
[19:55:08] <tdsmith> doismellburning: what api knows about dependencies?
[19:55:23] <doismellburning> tdsmith: ?
[19:55:43] <tdsmith> trying to parse "or a few short lines of network code"
[19:55:52] <tdsmith> oh, networkx is a thing?
[19:56:12] <doismellburning> yes
[19:56:16] <tdsmith> so it is
[19:56:32] <doismellburning> and fair, I assumed already-installed
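A sketch of the "few short lines of networkx" idea for already-installed packages, reading each distribution's declared requirements via pkg_resources (one possible approach, not a quoted snippet):

    import networkx as nx
    import pkg_resources

    # Directed graph with an edge package -> dependency for every requirement
    # declared by every installed distribution.
    graph = nx.DiGraph()
    for dist in pkg_resources.working_set:
        for req in dist.requires():
            graph.add_edge(dist.project_name, req.project_name)

    # e.g. everything one installed project pulls in (the name must match an
    # installed distribution exactly):
    print(sorted(nx.descendants(graph, "pyOpenSSL")))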
[22:58:41] <wsanchez> tdsmith: eggdeps is totally good for what I need right now, thanks.
[23:00:31] <tdsmith> cool!