[00:05:17] <TRManderson> I have fully PEP 440-compliant versions (e.g. project-1.0.0.dev0+hash) for my packages, and pip won't recognise packages with versions similar to my example when using -i or --find-links
[00:05:21] <TRManderson> Is this intentional? Can I get around this while retaining my version structure somehow?
[00:06:14] <TRManderson> It was suggested I cc dstufft
[00:06:19] <dstufft> TRManderson: what version of pip?
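    (For context: a version string like the one above, with a "+hash" local segment, does parse as valid PEP 440. A minimal sketch, assuming the third-party "packaging" library and a hypothetical local label:)

        # Verify the version string parses as PEP 440 compliant, including
        # the local version segment after "+" (the label here is hypothetical).
        from packaging.version import Version

        v = Version("1.0.0.dev0+abc123")
        print(v.is_prerelease)  # True -- it is a .dev pre-release
        print(v.local)          # "abc123" -- the local version label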
[10:01:29] <apollo13> pip is already complex enough, having users muck around with its internals would hardly be helpful
[10:04:21] <zmo> apollo13 - well, offering a public API that sits just between the CLI interface and the real internal code would be nice to have, so that we could call the same commands but get Python objects and iterators back instead!
[10:04:38] <apollo13> zmo: that is easier said than done
[10:04:48] <apollo13> what about handling the interactive choices pip provides etc etc…
[10:04:51] <zmo> it just feels wrong to go through the shell to call another Python thing
[10:05:07] <apollo13> I wish I had your problems (no offense) :D
[10:05:26] <apollo13> at the end of the day, using a shell here is probably the easiest and nicest way
[10:05:49] <apollo13> zmo: it would not just be some python objects and iterators, but also callback functions and whatnot
[10:06:00] <zmo> yup, sure, that's not a horrible thing
[10:06:19] <zmo> and even for stuff that requires interactivity, I'm pretty sure there's a way to impose a strategy
[11:17:02] <dstufft> zmo: One of the key problems is that invoking pip introspects the state of a Python environment, and the mechanisms it uses to do that cache their results -- so once pip modifies that environment, if you don't start up a new process it's possible (and likely, tbh) that pip will be operating on a stale view of the world
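    (For context: the approach apollo13 and dstufft are pointing at is to run pip in a fresh process rather than importing its internals. A minimal sketch; the package name is a placeholder:)

        # Invoke pip as a subprocess so each run starts from a clean,
        # un-cached view of the Python environment.
        import subprocess
        import sys

        subprocess.check_call([sys.executable, "-m", "pip", "install", "some-package"])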
[12:46:43] <carlesc> hi there, I was wondering, is it possible to create a pip package that contains pre-built shared libraries for multiple platforms? I want to avoid the user having to compile them since they require a lot of dependencies to build
[12:54:12] <carlesc> I read the following thread on SO, but it seems that there are still quite a few loose threads: http://stackoverflow.com/questions/31380578/how-to-avoid-building-c-library-with-my-python-package
[12:54:35] <carlesc> like anyone else, I would package both 32- and 64-bit shared libraries for all 3 platforms (Win, Linux, OS X)
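    (For context: the usual answer is to ship one binary wheel per platform/architecture rather than bundling every shared library into a single package. A minimal setup.py sketch, with hypothetical package names and C source path:)

        # Building this with "python setup.py bdist_wheel" on each target
        # platform produces a platform-tagged wheel (win32, win_amd64,
        # manylinux1_x86_64, macosx_*, ...) containing the pre-built
        # extension, so end users never need a compiler.
        from setuptools import setup, Extension

        setup(
            name="mypkg",
            version="1.0.0",
            packages=["mypkg"],
            ext_modules=[Extension("mypkg._native", sources=["src/native.c"])],
        )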
[14:56:11] <count> i.e.: pypi XMLRPC seems to be broken on at least one endpoint
[14:56:29] <count> the problem is that this is hardcoded in &%$@!! Puppet with a 10s timeout, and no way to change it elegantly
[14:56:48] <count> this means it's currently broken for everyone installing pip packages via Puppet via that endpoint
[14:59:23] <count> (can't change the puppet code for anyone else to point to https://, nor mine :()
[16:16:55] <count> how/where do I report this correctly?
[16:20:47] <count> seems like the XMLRPC interface is lacking proper monitoring :)
[16:21:26] <count> at least this was apparently fixed by the Puppet guys in their very very latest release 13 days ago: https://github.com/puppetlabs/puppet/commit/152299cc859fc74343c697841848086d4e41b6f8
[16:50:34] <dstufft> count: we've been under a bit of a DoS, apparently from people using puppet on EC2, and we're working to rate limit/block the IP that's degrading our service ATM
[16:50:41] <dstufft> the XMLRPC api is really old, and really crappy
[16:55:55] <count> dstufft: yeah, I figured the bit about old and crappy. do you have an IP (range) for the AWS hosts? hope it's none of the services from our organization :D
[16:56:43] <count> dstufft: I've no idea who decided when and why to integrate freaking puppet with hardcoded _EVERYTHING_ against it
[17:16:15] <count> $ time python -c 'import xmlrpclib ; server = xmlrpclib.ServerProxy("http://pypi.python.org/pypi") ; print server.package_releases("pip")'
[17:18:29] <count> EWDurbin: not sure whether 200 OK is the right thing to check for, or </methodResponse> .. a changing version might not be clever
[17:19:15] <count> EWDurbin: taking up dstufft on the extra XMLRPC call (like 'health', which does some internal sanity check etc) might be clever, but it should exercise the endpoint end-to-end, I'd say
[17:19:39] <EWDurbin> for sure, i think making a real call will be sufficient to determine health
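    (For context: a minimal sketch of the kind of health check being discussed, making a real XML-RPC call and treating an empty or failed response as unhealthy. Python 2, to match the one-liner above; the function name is hypothetical:)

        import xmlrpclib

        def pypi_xmlrpc_healthy(url="https://pypi.python.org/pypi"):
            # Exercise the endpoint end-to-end with a real method call
            # instead of only checking for a 200 OK on the transport.
            try:
                releases = xmlrpclib.ServerProxy(url).package_releases("pip")
            except Exception:
                return False
            return bool(releases)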
[17:35:31] <count> okay, thanks for the help everyone :) I'll fall back into silence and shall eventually drop out of the channel
[20:19:09] <harijay> I am using pip 8.1.2 from /usr/local/lib/python2.7/dist-packages (python 2.7). Some of my packages ("argparse", "wheel", "wsgiref") don't get listed with pip 8.1.2 but were listed fine with the OS (Ubuntu 14.04) supplied pip 1.5.4
[20:19:58] <dstufft> harijay: do you mean pip list or pip freeze?
[20:22:53] <harijay> dstufft: That revealed wheel, but "wsgiref" and "argparse" are still hidden from pip freeze --all
[20:23:09] <dstufft> harijay: oh, wsgiref and argparse are hidden because they're part of the stdlib
[20:24:28] <harijay> Oh I see… because argparse wasn't in the stdlib before… so then why does wheel get revealed only after --all? Also if I say pip install argparse it still says it's installed, which I assumed meant it was a separate package
[20:28:27] <dstufft> harijay: --all exposes stuff hidden b/c it is installed by default in a virtualenv
[20:28:44] <dstufft> and yeah, you can still install argparse, but the stdlib argparse shadows it
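    (For context: one way to see which argparse actually gets imported, the stdlib copy or a pip-installed one, is to check the module's file path. A minimal sketch:)

        # The stdlib copy resolves to the interpreter's lib directory; a
        # pip-installed copy would resolve under site-packages/dist-packages.
        import argparse
        print(argparse.__file__)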