PMXBOT Log file Viewer


#pypa logs for Wednesday the 4th of April, 2018

[09:34:11] <lazka> If I push a release with a dev version number like "1.0.0.dev0" to PyPI will it implicitly show up as a dev release, or do I have to do something different on the upload side of things?
[09:43:47] <pradyunsg> lazka: it'll show up as a dev release.
[09:44:05] <lazka> pradyunsg, ok, thanks!
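The behavior pradyunsg describes can be checked locally with the `packaging` library (the same PEP 440 version logic pip relies on); the version string below is the one from the question:

```python
# PEP 440 semantics: a ".devN" suffix marks a dev release, which is also a
# pre-release, so PyPI and pip treat it as non-final by default.
from packaging.version import Version

v = Version("1.0.0.dev0")
print(v.is_devrelease)   # → True
print(v.is_prerelease)   # → True
```

pip will skip such a release during a normal `pip install` unless `--pre` is passed or the dev version is pinned explicitly.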
[13:20:42] <tryingtogetitrig> My google fu has failed me and I was wondering if someone here could help:
[13:21:24] <tryingtogetitrig> I'm trying to push code to a private pypiserver over https. The cert was generated by my company's domain controller and is trusted by my OSes (both mac and windows) but I can't get the pip tooling to trust it.
[13:23:55] <mgedmin> pip --cert /path/to/cert.pem ? and/or same thing in a pip config file
[13:41:27] <tryingtogetitrig> I'd been using setup.py to attempt to upload. But I don't even see an upload command when I run `pip --help`
[13:42:13] <mgedmin> oh, sorry, brainfart
[13:42:27] <mgedmin> twine is the tool for uploading, pip is the tool for downloading, my suggestion was wrong
[13:42:48] <mgedmin> twine supports --cert
[13:54:49] <tryingtogetitrig> I try that, but I get Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
[13:55:02] <tryingtogetitrig> Is there a way to get more info about what went wrong?
[13:55:10] <tryingtogetitrig> There doesn't seem to be a --verbose flag for twine.
[13:57:06] <tos9> tryingtogetitrig: that error is saying that hostname verification failed I believe.
[13:57:16] <tos9> your openssl doesn't trust the certificate it got back
[13:57:34] <tos9> check first with `openssl s_client`, see if you get the same error
[14:01:52] <mgedmin> is the cert you're passing to twine your internal CA cert?
[14:02:26] <mgedmin> does the cert served by your internal pypiserver contain a correct subjectAltName that matches the hostname of your internal pypiserver?
[14:02:30] <tryingtogetitrig> It's the cert.pem that's printed out when I run print(ssl.get_default_verify_paths())
[14:02:38] <tryingtogetitrig> I added the cert to that file.
[14:02:55] <tryingtogetitrig> The cert I added is the internal CA cert.
[14:04:45] <tryingtogetitrig> Chrome is happy with the cert served by my internal pypiserver
[14:08:23] <tos9> twine possibly uses requests, which I am pretty sure vendors certifi these days or something
[14:08:43] <tos9> I don't remember any of these details thankfully, but basically check to make sure that the bundle you just looked in is the one twine is using
[14:09:01] <tos9> and in general you should use your OS's mechanism to manage those trusted certs, though that possibly won't have helped here
[14:09:58] <tryingtogetitrig> openssl s_client -connect isn't very clear. It says both that "verify error:num=2:unable to get issuer certificate" and that the verify return code is 0 (ok)
[14:10:13] <tryingtogetitrig> So I'm not sure whether it verified or not.
[14:10:43] <lukasa> It did not
[14:11:45] <tryingtogetitrig> Output at https://pastebin.com/nj10x1ym if anyone's interested.
[14:12:19] <tryingtogetitrig> I'm not sure how to debug this, so if anyone has advice, I'm all ears/eyes.
[14:13:16] <lukasa> What's the content of /private/etc/ssl/cert.pem?
[14:16:26] <tryingtogetitrig> I get the same behavior from openssl when I use a .pem with just the internal CA
[14:16:34] <tryingtogetitrig> Just the internal pem can be found at https://pastebin.com/3AWYaMpw
[14:18:01] <mgedmin> openssl s_client is a terrible thing
[14:18:27] <mgedmin> you have to pass -CApath /etc/ssl/certs or something so it'll be able to verify certs
[14:18:53] <mgedmin> and I'm not sure it'll be useful, given that we already know chrome accepts the certificate
[14:19:22] <tryingtogetitrig> I ran openssl s_client -CAfile /tmp/justourca.pem -connect p-sec-sbx:443
[14:19:55] <mgedmin> ah, hm, yes, ugh
[14:20:02] <mgedmin> I don't know how to interpret that output either
[14:20:32] <mgedmin> wait, maybe that cert you gave is just an intermediate?
[14:20:52] <mgedmin> do you have a self-signed root cert (the one shown as CN=IT-CERTROOT01-CA in the issuer field)?
[14:21:13] <mgedmin> ... is the root cert necessary for verification if you have the intermediate marked as trusted?
[14:23:54] <tryingtogetitrig> Here's where my knowledge of certs gets hazy. I think what I have is right. Internal pypiserver has a cert that's signed by our internal CA. Internal CA is self-signed, but is the thing in /tmp/justourca.pem
[14:24:10] <tryingtogetitrig> no other certs in the chain.
[14:24:34] <tryingtogetitrig> Not certain that answers your question, though.
[14:50:32] <aruna1234> hey
[14:52:57] <aruna1234> I have multiple python files and I have imported one file into another, however one particular variable that is in this other file isn't getting imported
[14:53:08] <aruna1234> and is showing NameError
[16:28:19] <tryingtodotherig> super-exciting resolution to my pypiserver/openssl problems from earlier -- I'd configured pypiserver w/a cert that had 3 certs in the full trust chain: 1) organization's root, 2) intermediate issuing cert, 3) pypiserver cert.
[16:28:34] <tryingtodotherig> My pypiserver only served up the bottom 2.
[16:29:41] <tryingtodotherig> So when I had the public cert for the issuing CA, openssl wasn't happy because it only got as far up the chain as the issuer, not the root.
[16:30:04] <tryingtodotherig> That is, when I specified -CAfile and pointed to the public cert for the issuing CA...
[16:30:23] <sumanah> tryingtodotherig: thanks for the resolution report!
[16:30:34] <sumanah> tryingtodotherig: you've told pypiserver? (bug report or similar?)
[16:30:42] <sumanah> (assuming you think they ought to know/fix anything)
[16:30:43] <tryingtodotherig> What I needed to do was to put both the issuing CA and the root CA in the file pointed to by --cert, and then it all worked.
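The fix described here can be sketched as follows. The PEM bodies and the bundle filename are placeholders, and the twine/openssl invocations are shown only as comments since they need the live internal server:

```python
# Build one CA bundle containing every certificate above the server's own
# leaf cert: the issuing (intermediate) CA first, then the self-signed root.
ISSUING_CA = (
    "-----BEGIN CERTIFICATE-----\n"
    "...issuing CA certificate body (placeholder)...\n"
    "-----END CERTIFICATE-----\n"
)
ROOT_CA = (
    "-----BEGIN CERTIFICATE-----\n"
    "...root CA certificate body (placeholder)...\n"
    "-----END CERTIFICATE-----\n"
)

with open("company-ca-bundle.pem", "w") as f:
    f.write(ISSUING_CA + ROOT_CA)

# Then point the client tools at the bundle, e.g.:
#   twine upload --cert company-ca-bundle.pem dist/*
#   openssl s_client -CAfile company-ca-bundle.pem -connect p-sec-sbx:443
with open("company-ca-bundle.pem") as f:
    print(f.read().count("BEGIN CERTIFICATE"))  # → 2
```

If the server itself serves the intermediate in its chain (as this pypiserver did), only the root strictly needs to be trusted locally; shipping both in one bundle sidesteps the question.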
[16:30:50] <sumanah> (or add docs)
[16:30:55] <tryingtodotherig> Not sure they really need to fix anything. I think it was on my side.
[16:31:00] <sumanah> ah ok
[16:31:15] <tryingtodotherig> An extra paragraph to docs could probably save people in the future a bunch of time.
[16:31:46] <tryingtodotherig> Who would I talk to if I had something to add to general docs/troubleshooting docs?
[16:31:57] <Wooble> aruna1234: #python is probably a better place to ask, but use a pastebin with code and output in any case.
[16:32:11] <aruna1234> yeah
[16:32:12] <aruna1234> okay
[16:32:16] <Wooble> (uh, also I totally didn't notice that was 2 hours ago)
[16:32:25] <aruna1234> hahahh
[16:32:28] <aruna1234> thats okay
[16:32:36] <aruna1234> got it solved
[16:34:00] <sumanah> tryingtodotherig: my experience with https://github.com/pypiserver/pypiserver has been good for talking to mplanchard -- a "here's what I ran into, what docs ought it go into?" would probably go fine IMO
[16:34:27] <sumanah> tryingtodotherig: what docs did you look at when you were trying to fix it? http://packaging.python.org/ or something else?
[16:36:42] <tryingtodotherig> sumanah: github and https://python-packaging-user-guide.readthedocs.io/ and a bunch of random stackoverflow posts
[16:37:01] <tryingtodotherig> sumanah: What's the best way to get in touch with mplanchard?
[16:37:29] <sumanah> tryingtodotherig: posting an issue on https://github.com/pypiserver/pypiserver , IMO, @-mention @mplanchard and you can say that @brainwane suggested you do so :)
[16:38:11] <sumanah> tryingtodotherig: if you prefer you could open an issue at https://github.com/pypa/python-packaging-user-guide/issues/ which is about the Python Packaging User Guide. Again, feel free to @-mention me (@brainwane) to say I suggested it was a reasonable idea
[16:38:44] <sumanah> either/both (PyPUG & pypiserver) seem, to me, reasonable places to say: I had this experience, what can we do to point others in the right direction?
[16:50:21] <jellycode_> hi all, new to packaging
[16:50:34] <sumanah> Hi jellycode_. Welcome.
[16:50:47] <jellycode_> Can't seem to get my python packages to upload to my private pypi repository... it was working the other day
[16:51:13] <sumanah> jellycode_: please tell us more specifics :)
[16:51:15] <jellycode_> The error is: HTTPError: 400 Client Error: Bad Request
[16:51:30] <jellycode_> I've spent about 3 hours testing, I think i've narrowed it down
[16:51:39] <sumanah> jellycode_: ok. when was the last time it was working, and when did it stop working? and what specific software is your private repo running?
[16:51:49] <jellycode_> Working on typing it out :)
[16:51:55] <sumanah> sure :)
[16:52:16] <sumanah> jellycode_: if you need to paste a log, I find http://hastebin.com/ useful
[16:52:28] <jellycode_> Long story short, I think it might be related to this line in PKG-INFO: Metadata-Version: 2.1
[16:52:52] <jellycode_> If i download a package i uploaded the other day, and re-upload it, it works.
[16:53:00] <sumanah> jellycode_: oh maybe the private repo software you're running doesn't know how to deal with https://www.python.org/dev/peps/pep-0566/ which is new
[16:53:08] <jellycode_> Yes, i think that must be it
[16:53:37] <jellycode_> I just tried downgrading twine to 1.10.0, but then i realized that file is created during the command `python setup.py sdist`
[16:53:48] <sumanah> jellycode_: twine, pkginfo, setuptools, wheel, and other related Python packaging tools for PyPI.org now have support for this new metadata
[16:54:03] <sumanah> jellycode_: I suggest you talk to your vendor or developer to get them to follow suit.
[16:54:21] <jellycode_> Currently, what I would need is a flag to pass to `python setup.py` to force it to use the old metadata
[16:54:24] <sumanah> jellycode_: Here are the Core Metadata Specifications https://packaging.python.org/specifications/core-metadata/
[16:54:47] <sumanah> jellycode_: this sounds like a feature request for setuptools http://setuptools.readthedocs.io/ to me
[16:54:50] <di_codes> jellycode_: I think that can be done by omitting any 2.1 metadata from your `setup.py`
[16:54:56] <sumanah> oh is that so?
[16:55:06] <di_codes> which would be `project_urls` or `long_description_content_type`
[16:55:12] <sumanah> jellycode_: are you making an sdist or a wheel? or both?
[16:55:32] <jellycode_> sdist
[16:55:37] <jellycode_> don't have either of those fields currently
[16:55:49] <sumanah> di_codes: I could be wrong but it could be that jellycode_'s internal pypi chokes specifically on anything other than "Metadata-Version: 2.0"
[16:56:09] <di_codes> i’m pretty sure setuptools will only write out the 2.1 version if either of those fields are present
[16:56:15] <sumanah> Oh! ok
[16:56:39] <di_codes> jellycode_: what version of setuptools are you using?
[16:56:39] <sumanah> jellycode_: I defer to di_codes on this because he literally wrote many of these improvements.
[16:56:40] <jellycode_> that theory doesn't appear to agree with experiment, but let me try more
[16:57:20] <jellycode_> Version: 39.0.1
[16:57:22] <sumanah> jellycode_: is it possible for you to tell us the name of your vendor for your private pypi repo?
[16:58:05] <jellycode_> Yeah it's JFrog's Artifactory, latest version, 5.10.1.
[16:58:21] <jellycode_> They're awesome, so I'm sure they're working on a patch
[16:58:39] <di_codes> jellycode_: oh actually it’s `long_description_content_type` and `provides_extras`, do you have the latter?
[16:58:40] <di_codes> <https://github.com/pypa/setuptools/blob/a0723a66bf7950ee470971ac9931d751a7dd76f3/setuptools/dist.py#L41-L42>
[16:58:41] <jellycode_> but, even if it's released tomorrow, will take some time to get it applied, so need the workaround for now
[16:58:47] <jellycode_> yeah, provides_extras
[16:59:08] <jellycode_> oh no, sorry
[16:59:11] <di_codes> yeah, so that’s a metadata 2.1 field: <https://www.python.org/dev/peps/pep-0566/#provides-extra-optional-multiple-use>
[16:59:14] <jellycode_> extras_require
[16:59:51] <jellycode_> so, i'm using somebody else's setup.py template
[17:00:10] <jellycode_> let me just start hacking and slashing and see if i can get it to produce the older version
[17:00:46] <sumanah> jellycode_: also are you aware of https://github.com/pypa/warehouse/issues/3275 and https://pyfound.blogspot.com/2018/03/warehouse-all-new-pypi-is-now-in-beta.html ?
[17:01:00] <di_codes> jellycode_: you could also just downgrade your setuptools: `pip install setuptools==38.5.2` should do the trick I think
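A quick way to confirm which metadata version an sdist will carry is to parse its PKG-INFO, which uses RFC 822-style headers. The file contents below are a fabricated minimal example; a real one comes out of `python setup.py sdist`:

```python
from email.parser import HeaderParser

# Fabricated minimal PKG-INFO. Provides-Extra (written by setuptools when
# extras_require is used) is one of the fields that forces Metadata-Version 2.1.
pkg_info = """\
Metadata-Version: 2.1
Name: example-pkg
Version: 0.1.0
Provides-Extra: docs
"""

meta = HeaderParser().parsestr(pkg_info)
print(meta["Metadata-Version"])  # → 2.1
```

If the repository rejects 2.1, seeing that header in the generated sdist's PKG-INFO confirms the diagnosis before trying the setuptools downgrade.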
[17:02:13] <jellycode_> wasn't aware of either
[17:02:26] <sumanah> jellycode_: if your local Artifactory instance also pulls from pypi.python.org, it may be affected (https://wiki.python.org/psf/WarehouseRoadmap is our rollout roadmap). I believe the current plans mostly ensure you will have uninterrupted service as long as JFrog solves their redirect issue by the end of this month.
[17:02:38] <sumanah> jellycode_: you may wish to spread the word among other Artifactory users.
[17:02:54] <jellycode_> sweet mother of god
[17:02:59] <jellycode_> thank you so much di_codes
[17:03:23] <jellycode_> setting up CI for this package was supposed to be a 5 minute detour, just spent 4 hours on it
[17:03:31] <sumanah> jellycode_: di_codes has been volunteering, for free, to support this work (PEP 566)
[17:03:32] <di_codes> jellycode_: no problem!
[17:03:53] <sumanah> jellycode_: for the improved pypi.org, di_codes and I are two of the contractors on a grant-funded project https://pyfound.blogspot.com/2017/11/the-psf-awarded-moss-grant-pypi.html
[17:04:02] <jellycode_> thanks, i'll let artifactory team know
[17:05:40] <jellycode_> wow, would love to get such a grant for my team, we're all volunteer OSS packagers as well: https://github.com/bincrafters
[17:05:52] <di_codes> jellycode_: *also note that if you’re building a wheel you’ll probably need to do the same: `pip install wheel==0.30.0`)
[17:05:55] <jellycode_> We get some small donations and sponsorship from JFrog
[17:06:01] <jellycode_> but, it only pays for the CI
[17:06:32] <sumanah> jellycode_: are any of you coming to PyCon North America or EuroPython? https://wiki.python.org/psf/PackagingSprints we'd love to see you there
[17:06:35] <jellycode_> If you would be willing to talk about how you arrived at such a grant, I would love to learn about that process (offline)
[17:07:21] <sumanah> jellycode_: I can briefly say that the MOSS application process is fairly straightforward! https://wiki.python.org/psf/PackagingWG looked at https://wiki.mozilla.org/MOSS and put together a proposal and submitted it
[17:09:42] <sumanah> jellycode_: I don't have personal experience getting a grant from these, but you may also be interested in https://www.opentech.fund/ which has a lot of different requests for proposals, or https://foundation.mozilla.org/fellowships/apply/
[17:09:51] <sumanah> jellycode_: and I presume you've heard about http://tidelift.com/ which just launched a thing
[17:11:05] <ngoldbaum> the PSF also does grants
[17:11:10] <ngoldbaum> smaller scale usually
[17:11:17] <jellycode_> thank you so much, definitely applying for this. Also, thank you for the invitation, I might actually try to come to pycon just to meet you guys and talk a bit since we all work on packaging. I live in MI, so it's a short trip. The only challenge is that I'm already committed to https://swampup.jfrog.com/
[17:11:31] <jellycode_> And, as insane a coincidence as it might be, the dates actually overlap :(
[17:11:31] <ngoldbaum> jellycode_: the sad fact is that many of these funding opportunities are undersubscribed because people don't apply for them!
[17:12:20] <jellycode_> Thanks for the info ngoldbaum
[17:12:21] <sumanah> jellycode_: May is a hard month for conference overlap. :)
[17:12:42] <jellycode_> well, since you guys are leading a charge on OSS packaging for python, i think JFrog would love to have you come to swampup
[17:12:50] <sumanah> jellycode_: if you don't already, I recommend you subscribe to http://pyfound.blogspot.com/ which has info about grants, etc
[17:12:53] <jellycode_> obviously, it looks unlikely at the moment
[17:13:11] <jellycode_> but, if you weren't aware of it, you are now :)
[17:14:52] <jellycode_> also, in case you're not aware, Conan is a newer package manager for C and C++ which is built on Python. we have a pretty healthy community seeding the central repository now, and the core devs are obviously python developers. so, there's probably a lot of overlap in challenges and solutions we could discuss in time
[17:15:06] <jellycode_> i'll make this a favorite on my irc channels, and stay in touch
[17:15:10] <jellycode_> thanks again for all the help
[17:16:47] <sumanah> jellycode_: thanks for the heads-ups and I do encourage you to let JFrog Artifactory users know about about the redirect issue so they can stay apprised
[17:16:54] <sumanah> just in case something goes wonky
[17:21:25] <davidlloyd> can anyone confirm if https://pypi.python.org/pypi/pip is meant to point at v10.0.0b2? that sounds like a beta, and is causing bugs for my company
[17:23:03] <ngoldbaum> davidlloyd: that's the old interface for pypi, which doesn't know about prereleases
[17:23:14] <ngoldbaum> davidlloyd: see https://pypi.org/project/pip/
[17:23:33] <ngoldbaum> davidlloyd: if you are upgrading and getting the prerelease pip 10 that's either a bug or you are updating from a very old pip
[17:24:02] <davidlloyd> i'm updating using virtualenv/pypy and it is putting me on 10.0.0b2
[17:24:15] <ngoldbaum> someone in here mentioned a bug in virtualenv that caused similar behavior
[17:24:18] <ngoldbaum> the other day
[17:24:29] <ngoldbaum> try updating virtualenv?
[17:24:35] <ngoldbaum> it shouldn't be installing prerelease packages
[17:24:48] <ngoldbaum> ofc pip 10 will come out soon, so you might want to prepare for that
[17:25:25] <davidlloyd> i'm currently weighing up working out why pip 10 is broken vs why we are installing pip 10 in the first place
[17:26:39] <davidlloyd> using pip 10 with pypy 5.10 and virtualenv we get errors installing Cython
[17:28:18] <davidlloyd> thanks @ngoldbaum, that at least allays my initial suspicion
[17:29:57] <ngoldbaum> you're using pypy with cython? interesting
[17:40:33] <jellycode_> the metadata issue would affect wheels also yes?
[17:45:22] <di_codes> jellycode_: yeah, if you’re building a wheel you’ll probably need to do `pip install wheel==0.30.0`
[17:47:24] <jellycode_> thx, also, i'm seeing conflicting examples with underscores and dashes in package names
[17:47:35] <jellycode_> can someone clear up for me what i should be doing?
[17:48:23] <ngoldbaum> it's legal to put dashes in package names but not in the name of a module
[17:48:29] <jellycode_> I think i tried to use underscores everywhere, but somewhere still has dashes...
[17:48:39] <ngoldbaum> IMO naming your package something different from the modules it installs is a bad idea
[17:49:00] <jellycode_> so, underscores everywhere i can then?
[17:49:02] <ngoldbaum> so, IMO, no dashes in package names; if you really need the separators use underscores or whatever
[17:49:14] <ngoldbaum> that's just my opinion though
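The dash/underscore split comes down to two different rule sets: distribution names on an index are normalized (PEP 503) so `-`, `_`, and `.` compare equal, while importable module names must be valid Python identifiers, which rules out dashes. A small sketch:

```python
import re

def normalize(name: str) -> str:
    """PEP 503 normalization used when comparing distribution names."""
    return re.sub(r"[-_.]+", "-", name).lower()

# "My_Package" and "my-package" are the same distribution on an index...
print(normalize("My_Package") == normalize("my-package"))  # → True

# ...but only the underscore form can be an importable module name.
print("my-package".isidentifier())  # → False
print("my_package".isidentifier())  # → True
```

Hence the advice above: using underscores everywhere keeps the distribution name and the module name identical.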
[17:52:14] <jellycode_> thx
[18:29:57] <sumanah> also jellycode_ thea is very receptive to suggestions for improvements in the http://packaging.python.org/ tutorials, guides, and discussions
[18:35:03] <jellycode_> thanks
[18:35:38] <jellycode_> Question, for our CI, any tips on the best way to add our private repository as an additional repo?
[18:35:45] <jellycode_> i really like the TWINE environment variables
[18:40:09] <jellycode_> maybe this: PIP_INDEX_URL=https://myserverurl ?
[18:46:16] <di_codes> jellycode_: you probably want EXTRA_INDEX_URL
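For the record, the environment variable pip actually reads carries a `PIP_` prefix (a bare `EXTRA_INDEX_URL` is not consulted). A sketch, with a placeholder URL:

```python
import os

# The spelling pip reads from the environment; mirrors the --extra-index-url
# command-line flag. The URL here is a hypothetical internal index.
os.environ["PIP_EXTRA_INDEX_URL"] = "https://pypi.internal.example/simple/"

# Equivalent pip.conf (~/.config/pip/pip.conf) setting:
#   [global]
#   extra-index-url = https://pypi.internal.example/simple/
print(os.environ["PIP_EXTRA_INDEX_URL"])  # → https://pypi.internal.example/simple/
```

`PIP_INDEX_URL` replaces PyPI outright, while `PIP_EXTRA_INDEX_URL` adds an index alongside the default; for a pass-through proxy like Artifactory's virtual repos, either can work, as the experiment below suggests.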
[18:47:16] <jellycode_> i believe the artifactory "virtual" pypi repos do pass-through, where it tries to resolve packages locally, and then falls back on the central repo if not found
[18:47:32] <jellycode_> seems to be working that way in my experiment
[18:49:14] <jellycode_> Also, in my CI, i want to say "pip install my_package --upgrade --force-reinstall". However, i only want to force-reinstall my package. I'm thinking of doing these two commands to satisfy... is there a better way? 'pip install mypackage' + 'pip install mypackage --no-deps --upgrade --force-reinstall'
[18:49:49] <di_codes> jellycode_: separate commands are the way to go
[18:49:54] <jellycode_> thx
[19:49:17] <eric97477> I'm here.
[19:49:59] <sumanah> Hi eric97477
[19:50:10] <sumanah> eric97477: you said: 'pip download Sphinx --no-binary :all:' results in an error message on macOS running 10.13 for me.
[19:50:15] <sumanah> what error message?
[19:50:16] <eric97477> https://gist.github.com/EricG-Personal/4d7cfe6fdb2cdda15ac48f41157a0f83 --
[19:50:34] <eric97477> I can provide the complete output if needed.
[19:50:44] <eric97477> but, that part seemed to be the most relevant.
[19:51:11] <sumanah> eric97477: thank you. Is this happening for other packages you download via pip also?
[19:52:05] <eric97477> I have not found it happening with other packages, but my tests have not been exhaustive. I simply needed a solution for the issue mentioned in #distutils and that is how I came across the problem with Sphinx.
[19:52:11] <sumanah> ah ok
[19:52:31] <sumanah> I hadn't used "--no-binary" before and I see that the usage help for it notes:
[19:52:31] <sumanah> Note that some packages are tricky to compile and may fail to install when this option is used on them.
[19:52:46] <sumanah> so maybe the particular Sphinx component there is one of those.
[19:53:07] <ngoldbaum> it looks like it's crashing while setuptools is importing itself
[19:53:14] <ngoldbaum> so your setuptools is broken?
[19:53:37] <di_codes> eric97477: <https://github.com/pypa/setuptools/issues/885>
[19:53:40] <ngoldbaum> or, more properly, when setuptools tries to use pkg_resources while it imports itself
[19:53:53] <eric97477> Understood and that is perfectly fine. I am simply looking for the best solution for distributing packages to offline machines. It would have been nice if 'pip download' was that generic solution, but as you mention, some packages are tricky and there is no easy answer for that.
[19:56:04] <eric97477> So, I just skimmed through the bug report...seems like it could be related to python 3.5, which I am currently using, and fixed for 3.6?
[19:56:39] <eric97477> I can confirm it works with python 3.6.
[19:58:32] <ngoldbaum> ah, namespace packages are the worst, confirmed ;)
[19:58:38] <eric97477> Now, if anyone here is a python packaging and distribution guru, I've got a question for you that I have been asking around....but seems off topic for this channel.
[19:58:56] <sumanah> maybe #pypa-dev?
[20:01:15] <eric97477> ngoldbaum: I have a situation where I needed a namespace package solution...I just selected the "easy button" and prefixed my package's name with <namespace_name>_<package_name>. Working so far. I looked at the various namespace solutions and they all seem a bit gnarly.
[20:01:35] <eric97477> thanks sumanah. I will try that channel.
[20:04:00] <ngoldbaum> eric97477: IMO namespace packages are more trouble than they're worth
[20:05:27] <eric97477> I would agree. They don't seem to buy you much...or at least I don't see the compelling need over just prefixing package names. But, if someone wants to prove me wrong, I always like to learn something new.