#pypa logs for Friday the 22nd of April, 2016

[03:31:48] <kwlzn> what's up with pypi?
[03:32:15] <kwlzn> https://bitbucket.org/pypa/pypi/issues/442/500-server-error-trying-to-register
[03:32:53] <kwlzn> we're also seeing this
[08:59:45] <ssc> hi all
[08:59:56] <ssc> I have some weird installation problems under OS X
[09:00:33] <ssc> I can directly install packages with pip, but when it tries to download a dependency, I get an error "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed"
[09:01:17] <ssc> e.g., if I do "pip install pkgconfig", it complains that it cannot find/install nose and fails. But I can install nose manually via "pip install nose" and don't get any errors.
[09:01:59] <ssc> Using pip 8.1.1 on Python 3.5.1
[09:11:00] <ssc> The error seems to be raised when "python setup.py egg_info" is called to get the dependencies.
[09:11:23] <ssc> Maybe Python is using other certificates / looking for them at another place than pip?
[10:04:11] <puiterwijk> dstufft: Looks like I'm not the only one having issues, and they weren't fixed yet: https://bitbucket.org/pypa/pypi/issues/442/500-server-error-trying-to-register
[10:10:41] <mgedmin> ssc, is pkgconfig using setup_requires?
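
mgedmin's question points at the likely mechanism: packages listed in setup_requires are fetched by setuptools/easy_install while "python setup.py egg_info" runs, before pip is involved, so they do not go through pip's certificate handling. A minimal sketch of the kind of setup.py that triggers this (the names are illustrative; pkgconfig's actual setup.py is not shown in the log):

    # Illustrative setup.py only: anything in setup_requires is downloaded by
    # setuptools/easy_install during "setup.py egg_info", not by pip, which is
    # one way to hit a certificate error that a plain "pip install nose" avoids.
    from setuptools import setup

    setup(
        name="example-package",       # hypothetical project name
        version="0.1",
        setup_requires=["nose"],      # fetched at egg_info time by easy_install
        install_requires=["nose"],
    )
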
[10:11:22] <mgedmin> wheee pypi is down
[10:11:25] <mgedmin> and warehouse is down too
[13:36:48] <elarson> dstufft: I was thinking of pushing out a new cachecontrol release but wanted to check in with you as it includes a fix for streaming responses. are you able to test in pip CI with the latest cachecontrol ref?
[13:37:39] <dstufft> elarson: I can do that yea, or if you want you can just submit a PR with cache control revendored and just say do-not-merge and that'll test it too :]
[13:38:15] <elarson> ah cool. I'll take a look then
[13:47:08] <dstufft> elarson: the only part that's not completely trivial (and even that isn't very hard) is just making sure you adjust the imports so dependencies are imported from pip._vendor too
[13:47:33] <dstufft> mostly I just do cd pip/_vendor && rm -rf cachecontrol && pip install -t . cachecontrol==whatever
[13:47:40] <dstufft> and then make sure the imports are right
[13:47:43] <elarson> dstufft: I'm starting by updating vendor.txt and using the Makefile in _vendor
[13:48:04] <dstufft> I haven't used that makefile in a while so it may or may not still work
[13:48:11] <elarson> ah ok
[13:48:25] <elarson> if it's broken, maybe I can fix ;)
[14:20:41] <elarson> dstufft: do I add `do-not-merge` in the commit message?
[14:20:56] <dstufft> elarson: just a comment in the PR is good enough
[14:25:41] <elarson> got it
[14:30:59] <elarson> https://github.com/pypa/pip/pull/3621
[14:32:12] <dstufft> elarson: your imports aren't right
[14:32:21] <dstufft> look at the diff on https://github.com/pypa/pip/pull/3621/files
[14:32:37] <dstufft> you'll need to manually patch cachecontrol to do imports from pip._vendor
[14:32:41] <elarson> ah ok
[14:32:48] <elarson> I misunderstood
[14:34:40] <dstufft> one of these days I'm going to make a proper vendoring tool that will automatically fix up imports like that
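
The manual patching dstufft describes amounts to rewriting the vendored package's top-level imports so they resolve against pip's bundled copies rather than whatever is installed system-wide. A hedged before/after sketch (the exact lines touched in cachecontrol are illustrative, not taken from PR 3621):

    # Before: upstream cachecontrol imports its dependency directly.
    import requests
    from requests.adapters import HTTPAdapter

    # After: the copy vendored into pip imports it from pip._vendor instead.
    from pip._vendor import requests
    from pip._vendor.requests.adapters import HTTPAdapter
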
[15:30:11] <vincentll> why does "python setup.py bdist_wheel" add ".dev1" to my package version? (0.1.1 -> 0.1.1.dev1)
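
Nobody answers vincentll in this log, but one common cause of exactly that renaming (an assumption, not confirmed here) is an egg_info tag in setup.cfg: setuptools appends tag_build to the declared version whenever egg_info runs, and bdist_wheel runs egg_info. The kind of configuration to check for:

    ; hypothetical setup.cfg snippet: with this present, setuptools' egg_info
    ; step appends ".dev1" to the version, turning 0.1.1 into 0.1.1.dev1
    [egg_info]
    tag_build = .dev1
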
[15:33:54] <elarson> I broke the build!
[15:33:57] <elarson> https://travis-ci.org/pypa/pip/jobs/125036026
[15:50:42] <elarson> dstufft: that failure looks incorrect. Inspecting the test and the output (based on the Travis log), both strings exist.
[15:51:35] <elarson> that said, it does look like some other output is getting mixed in
[20:20:29] <dbrecht> dstufft: spent some time evaluating pex yesterday. unfortunately (unless i'm missing something), it doesn't /quite/ satisfy all of the requirements i'm after. the one thing it's missing is independence from the system python. pex files don't seem to contain a standalone interpreter as virtualenvs do.
[20:21:10] <dbrecht> gotta do some investigation on my end, hoping that it's not a hard requirement, because it's a pretty slick packaging system otherwise
[20:42:32] <dstufft> dbrecht: ah, yea it's still dependent on system python :( (well virtualenv is too really, for most of the stdlib)
[20:43:08] <dstufft> I have some ~plans~ to make it so CPython can run in a manner where you compile it with some flag and then you just concat a zip file to it
[20:43:12] <dstufft> but that's a down the road thing
[20:44:30] <dbrecht> virtualenv is only dependent on system python at build time, not runtime tho no?
[20:55:30] <dstufft> dbrecht: No, it doesn't copy in the entire stdlib
[20:55:44] <dstufft> only enough to bootstrap it before it can modify sys.path to point to the system stdlib
[20:57:51] <dbrecht> dstufft: hrmph. guess that throws a wrench into one of the ideas i was going to investigate then. no option to force full stdlib inclusion at build time?
[20:58:13] <dstufft> dbrecht: at that point you don't have a virtual environment, you just have a full build of CPython :)
[20:58:20] <dbrecht> heh
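
A small illustration of the point dstufft is making above (a hedged sketch; the printed paths will vary by machine): inside a virtualenv only a bootstrap slice of the stdlib is copied in, and most modules still resolve to the system Python's installation.

    # Run inside an activated virtualenv. Classic virtualenv copies only enough
    # of the stdlib to bootstrap, then points sys.path back at the system copy,
    # so modules like json typically still live under the system prefix.
    import json
    import sys

    print("virtualenv prefix:", sys.prefix)
    # virtualenv records the underlying interpreter's prefix as sys.real_prefix;
    # the stdlib venv module uses sys.base_prefix instead, so fall back to that.
    print("system prefix:", getattr(sys, "real_prefix", sys.base_prefix))
    print("json module lives at:", json.__file__)
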
[20:59:32] <dbrecht> fair enough, but unfortunately that's kinda what i'm looking for. in a nutshell, we're currently using cx_freeze to satisfy all of our production requirements. there have been a few cases where it hasn't panned out as we were hoping it would, so i'm evaluating workarounds.
[21:00:14] <dbrecht> i'm gonna dig into the "decoupled from system python" requirement tho, hopefully it's a requirement that can be removed, and then that opens the doors for a couple options
[21:01:50] <dstufft> dbrecht: Why not just compile a copy of Python if you want the whole thing?
[21:02:50] <dbrecht> dstufft: that's pretty much what i'm thinking at this point
[21:03:30] <dbrecht> optimally that requirement goes away and i don't have to roll something new, but will add that to the list of solutions to evaluate should it remain a requirement.
[21:03:31] <dbrecht> thanks
[21:08:33] <dbrecht> dstufft: you don't have a PEP written for that idea by any chance, do you? (compiling CPython with the zip file concat'd to it)
[21:08:41] <dstufft> dbrecht: nope
[21:15:17] <FRidh> dstufft: I'm building a script that retrieves the URL and sha256 for packages listed in a requirements.txt file. This is for the Nix package manager. Do you have any recommendations? pypi.python.org has only md5, but warehouse.python.org does have sha256. Is it possible to use the pip PackageFinder in combination with warehouse to extract the sha256?
[21:18:19] <tdsmith> FRidh: https://github.com/tdsmith/homebrew-pypi-poet is some prior art that may or may not be helpful, although it deals with hashes by just downloading the tarballs and hashing them since pypi doesn't serve them
[21:19:56] <tdsmith> pip doesn't have a python API so importing things from pip may be fragile
[21:20:21] <tdsmith> !logs
[21:20:21] <pmxbot> http://chat-logs.dcpython.org/channel/pypa
[21:20:54] <FRidh> tdsmith: thanks I'll have a look at it.
[21:21:26] <tdsmith> this is a weird dns response:
[21:21:29] <tdsmith> chat-logs.dcpython.org has address 92.242.140.2
[21:21:29] <tdsmith> Host chat-logs.dcpython.org not found: 3(NXDOMAIN)
[21:21:40] <FRidh> I was using the APIs before as well, but now I would like to use the requirements.txt parsing that pip has
[21:24:03] <FRidh> ohh, I see warehouse is now at pypi.io, and the simple API gives sha256. The only issue seems to be lack of newlines.
[21:24:21] <FRidh> guess that's causing the parsing to fail
[21:48:05] <dstufft> FRidh: pypi.io/pypi/<whatever>/json might be friendlier
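
A hedged sketch of how FRidh's script could use the JSON view dstufft suggests to get per-file sha256 digests. The URL pattern and the "digests" key follow the Warehouse JSON API as generally documented; the project name and version used here are only examples.

    # Sketch only: look up sha256 digests for one release via the JSON view
    # dstufft mentions. Endpoint shape and response keys are assumptions based
    # on the Warehouse JSON API, not confirmed in this log.
    import json
    from urllib.request import urlopen

    def release_sha256s(name, version):
        url = "https://pypi.io/pypi/{}/{}/json".format(name, version)
        with urlopen(url) as resp:
            data = json.loads(resp.read().decode("utf-8"))
        # "urls" lists the files uploaded for this release, each with digests
        return {f["filename"]: f["digests"]["sha256"] for f in data["urls"]}

    if __name__ == "__main__":
        for filename, digest in sorted(release_sha256s("pkgconfig", "1.1.0").items()):
            print(digest, filename)
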
[21:50:12] <gchristensen> dstufft: looks like nix is double-teaming you (hi FRidh) :)
[21:53:38] <dstufft> I can also add blake2b w/ 32 byte digest to pypi.io if that's more your style
[21:53:57] <dstufft> to the json view that is
[21:58:40] <FRidh> dstufft: just the sha256 hash and filename is fine. It's what the json view now shows so that is good. I was experimenting with using PackageFinder and parse_requirements, since I would like to use a requirements.txt file as input. I've used https://pypi.io/simple as the URL but unfortunately I get a DistributionNotFound error
[22:01:03] <dstufft> FRidh: PackageFinder isn't a great API :/ Warehouse should be compatible with it though
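
For completeness, a fragile sketch of the requirements.txt side FRidh describes, written against pip 8 internals (pip.req.parse_requirements and pip.download.PipSession). As tdsmith warns above, pip has no supported Python API, so the exact names here are assumptions tied to that pip version.

    # Fragile sketch against pip 8 internals; these modules are not a public API
    # and the names differ in later pip versions.
    from pip.download import PipSession
    from pip.req import parse_requirements

    session = PipSession()
    for install_req in parse_requirements("requirements.txt", session=session):
        # each item is an InstallRequirement; .req holds the parsed name/specifier
        print(install_req.req)
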