[09:34:11] <lazka> If I push a release with a dev version number like "1.0.0.dev0" to PyPI will it implicitly show up as a dev release, or do I have to do something different on the upload side of things?
[09:43:47] <pradyunsg> lazka: it'll show up as a dev release.
[13:20:42] <tryingtogetitrig> My google fu has failed me and I was wondering if someone here could help:
[13:21:24] <tryingtogetitrig> I'm trying to push code to a private pypiserver over https. The cert was generated by my company's domain controller and is trusted by my OSes (both mac and windows) but I can't get the pip tooling to trust it.
[13:23:55] <mgedmin> pip --cert /path/to/cert.pem ? and/or same thing in a pip config file
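[editor's note: a concrete sketch of mgedmin's two suggestions — the index URL and CA-bundle path below are hypothetical placeholders:]

```shell
# One-off: pass the internal CA bundle on the command line
pip install --cert /tmp/justourca.pem \
    --index-url https://pypi.internal.example/simple/ somepackage

# Persistent: put the same setting in a pip config file,
# e.g. ~/.config/pip/pip.conf (Linux/macOS):
#   [global]
#   cert = /tmp/justourca.pem
```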
[13:41:27] <tryingtogetitrig> I'd been using setup.py to attempt to upload. But I don't even see an upload command when I run `pip --help`
[13:54:49] <tryingtogetitrig> I tried that, but I get Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
[13:55:02] <tryingtogetitrig> Is there a way to get more info about what went wrong?
[13:55:10] <tryingtogetitrig> There doesn't seem to be a --verbose flag for twine.
[13:57:06] <tos9> tryingtogetitrig: that error is saying that hostname verification failed I believe.
[13:57:16] <tos9> your openssl doesn't trust the certificate it got back
[13:57:34] <tos9> check first with `openssl s_client`, see if you get the same error
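[editor's note: what tos9's check looks like in practice — hostname and CA path are hypothetical; the second command is an offline equivalent that verifies a saved server certificate against your CA bundle:]

```shell
# Reproduce the TLS handshake outside pip/twine and read the verify output
openssl s_client -connect pypi.internal.example:443 \
    -CAfile /tmp/justourca.pem </dev/null

# Offline equivalent: verify a PEM copy of the served cert against the bundle
openssl verify -CAfile /tmp/justourca.pem server.pem
```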
[14:01:52] <mgedmin> is the cert you're passing to twine your internal CA cert?
[14:02:26] <mgedmin> does the cert served by your internal pypiserver contain a correct subjectAltName that matches the hostname of your internal pypiserver?
[14:02:30] <tryingtogetitrig> It's the cert.pem that's printed out when I run print(ssl.get_default_verify_paths())
[14:02:38] <tryingtogetitrig> I added the cert to that file.
[14:02:55] <tryingtogetitrig> The cert I added is the internal CA cert.
[14:04:45] <tryingtogetitrig> Chrome is happy with the cert served by my internal pypiserver
[14:08:23] <tos9> twine possibly uses requests, which I am pretty sure vendors certifi these days or something
[14:08:43] <tos9> I don't remember any of these details thankfully, but basically check to make sure that the bundle you just looked in is the one twine is using
[14:09:01] <tos9> and in general you should use your OS's mechanism to manage those trusted certs, though that possibly won't have helped here
[14:09:58] <tryingtogetitrig> openssl s_client -connect isn't very clear. It says both that "verify error:num=2:unable to get issuer certificate" and that the verify return code is 0 (ok)
[14:10:13] <tryingtogetitrig> So I'm not sure whether it verified or not.
[14:20:02] <mgedmin> I don't know how to interpret that output either
[14:20:32] <mgedmin> wait, maybe that cert you gave is just an intermediate?
[14:20:52] <mgedmin> do you have a self-signed root cert (the one shown as CN=IT-CERTROOT01-CA in the issuer field)?
[14:21:13] <mgedmin> ... is the root cert necessary for verification if you have the intermediate marked as trusted?
[14:23:54] <tryingtogetitrig> Here's where my knowledge of certs gets hazy. I think what I have is right. Internal pypiserver has a cert that's signed by our internal CA. Internal CA is self-signed, but is the thing in /tmp/justourca.pem
[14:24:10] <tryingtogetitrig> no other certs in the chain.
[14:24:34] <tryingtogetitrig> Not certain that answers your question, though.
[14:52:57] <aruna1234> I have multiple python files and I have imported one file into another, however one particular variable that is in this other file isn't getting imported
[16:28:19] <tryingtodotherig> super-exciting resolution to my pypiserver/openssl problems from earlier -- I'd configured pypiserver w/a cert that had 3 certs in the full trust chain. 1) organization's root 2) intermediate issuing cert 3) pypiserver cert.
[16:28:34] <tryingtodotherig> My pypiserver only served up the bottom 2.
[16:29:41] <tryingtodotherig> So when I had the public cert for the issuing CA, openssl wasn't happy because it only got as far up the chain as the issuer, not the root.
[16:30:04] <tryingtodotherig> That is, when I specified -CAfile and pointed to the public cert for the issuing CA...
[16:30:23] <sumanah> tryingtodotherig: thanks for the resolution report!
[16:30:34] <sumanah> tryingtodotherig: you've told pypiserver? (bug report or similar?)
[16:30:42] <sumanah> (assuming you think they ought to know/fix anything)
[16:30:43] <tryingtodotherig> What I needed to do was to put both the issuing CA and the root CA in the file pointed to by -cert and then it all worked.
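[editor's note: the fix described above, as commands — filenames and the repository URL are hypothetical; the point is that the bundle file handed to twine must contain both the intermediate and the root, since the server only serves the bottom of the chain:]

```shell
# Build one CA bundle containing the issuing (intermediate) CA and the root CA
cat issuing-ca.pem root-ca.pem > ca-bundle.pem

# Hand the full bundle to twine when uploading
twine upload --cert ca-bundle.pem \
    --repository-url https://pypi.internal.example/ dist/*
```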
[16:34:00] <sumanah> tryingtodotherig: my experience with https://github.com/pypiserver/pypiserver has been good for talking to mplanchard -- a "here's what I ran into, what docs ought it go into?" would probably go fine IMO
[16:34:27] <sumanah> tryingtodotherig: what docs did you look at when you were trying to fix it? http://packaging.python.org/ or something else?
[16:36:42] <tryingtodotherig> sumanah: github and https://python-packaging-user-guide.readthedocs.io/ and a bunch of random stackoverflow posts
[16:37:01] <tryingtodotherig> sumanah: What's the best way to get in touch with mplanchard?
[16:37:29] <sumanah> tryingtodotherig: posting an issue on https://github.com/pypiserver/pypiserver , IMO, @-mention @mplanchard and you can say that @brainwane suggested you do so :)
[16:38:11] <sumanah> tryingtodotherig: if you prefer you could open an issue at https://github.com/pypa/python-packaging-user-guide/issues/ which is about the Python Packaging User Guide. Again, feel free to @-mention me (@brainwane) to say I suggested it was a reasonable idea
[16:38:44] <sumanah> either/both (PyPUG & pypiserver) seem, to me, reasonable places to say: I had this experience, what can we do to point others in the right direction?
[16:50:47] <jellycode_> Can't seem to get my python packages to upload to my private pypi repository... it was working the other day
[16:51:13] <sumanah> jellycode_: please tell us more specifics :)
[16:51:15] <jellycode_> The error is: HTTPError: 400 Client Error: Bad Request
[16:51:30] <jellycode_> I've spent about 3 hours testing, I think i've narrowed it down
[16:51:39] <sumanah> jellycode_: ok. when was the last time it was working, and when did it stop working? and what specific software is your private repo running?
[16:51:49] <jellycode_> Working on typing it out :)
[16:52:16] <sumanah> jellycode_: if you need to paste a log, I find http://hastebin.com/ useful
[16:52:28] <jellycode_> Long-story-short, I think it might be related to this line in PKG-INFO: Metadata-Version: 2.1
[16:52:52] <jellycode_> If i download a package i uploaded the other day, and re-upload it, it works.
[16:53:00] <sumanah> jellycode_: oh maybe the private repo software you're running doesn't know how to deal with https://www.python.org/dev/peps/pep-0566/ which is new
[16:53:08] <jellycode_> Yes, i think that must be it
[16:53:37] <jellycode_> I just tried downgrading twine to 1.10.0, but then i realized that file is created during the command `python setup.py sdist`
[16:53:48] <sumanah> jellycode_: twine, pkginfo, setuptools, wheel, and other related Python packaging tools for PyPI.org now have support for this new metadata
[16:54:03] <sumanah> jellycode_: I suggest you talk to your vendor or developer to get them to follow suit.
[16:54:21] <jellycode_> Currently, what I would need is a flag to pass to `python setup.py` to force it to use the old metadata
[16:54:24] <sumanah> jellycode_: Here are the Core Metadata Specifications https://packaging.python.org/specifications/core-metadata/
[16:54:47] <sumanah> jellycode_: this sounds like a feature request for setuptools http://setuptools.readthedocs.io/ to me
[16:54:50] <di_codes> jellycode_: I think that can be done by omitting any 2.1 metadata from your `setup.py`
[16:55:37] <jellycode_> don't have either of those fields currently
[16:55:49] <sumanah> di_codes: I could be wrong but it could be that jellycode_'s internal pypi chokes specifically on anything other than "Metadata-Version: 2.0"
[16:56:09] <di_codes> i’m pretty sure setuptools will only write out the 2.1 version if either of those fields are present
[16:59:51] <jellycode_> so, i'm using somebody elses setup.py template
[17:00:10] <jellycode_> let me just start hacking and slashing and see if i can get it to produce the older version
[17:00:46] <sumanah> jellycode_: also are you aware of https://github.com/pypa/warehouse/issues/3275 and https://pyfound.blogspot.com/2018/03/warehouse-all-new-pypi-is-now-in-beta.html ?
[17:01:00] <di_codes> jellycode_: you could also just downgrade your setuptools: `pip install setuptools==38.5.2` should do the trick I think
[17:02:26] <sumanah> jellycode_: if your local Artifactory instance also pulls from pypi.python.org, it may be affected (https://wiki.python.org/psf/WarehouseRoadmap is our rollout roadmap). I believe the current plans mostly ensure you will have uninterrupted service as long as JFrog solves their redirect issue by the end of this month.
[17:02:38] <sumanah> jellycode_: you may wish to spread the word among other Artifactory users.
[17:03:53] <sumanah> jellycode_: for the improved pypi.org, di_codes and I are two of the contractors on a grant-funded project https://pyfound.blogspot.com/2017/11/the-psf-awarded-moss-grant-pypi.html
[17:04:02] <jellycode_> thanks, i'll let artifactory team know
[17:05:40] <jellycode_> wow, would love to get such a grant for my team, we're all volunteer OSS packagers as well: https://github.com/bincrafters
[17:05:52] <di_codes> jellycode_: *also note that if you’re building a wheel you’ll probably need to do the same: `pip install wheel==0.30.0`)
[17:05:55] <jellycode_> We get some small donations and sponsorship from JFrog
[17:06:01] <jellycode_> but, it only pays for the CI
[17:06:32] <sumanah> jellycode_: are any of you coming to PyCon North America or EuroPython? https://wiki.python.org/psf/PackagingSprints we'd love to see you there
[17:06:35] <jellycode_> If you would be willing to talk about how you arrived at such a grant, I would love to learn about that process (offline)
[17:07:21] <sumanah> jellycode_: I can briefly say that the MOSS application process is fairly straightforward! https://wiki.python.org/psf/PackagingWG looked at https://wiki.mozilla.org/MOSS and put together a proposal and submitted it
[17:09:42] <sumanah> jellycode_: I don't have personal experience getting a grant from these, but you may also be interested in https://www.opentech.fund/ which has a lot of different requests for proposals, or https://foundation.mozilla.org/fellowships/apply/
[17:09:51] <sumanah> jellycode_: and I presume you've heard about http://tidelift.com/ which just launched a thing
[17:11:17] <jellycode_> thank you so much, definitely applying for this. Also, thank you for the invitation, I might actually try to come to pycon just to meet you guys and talk a bit since we all work on packaging. I live in MI, so it's a short trip. The only challenge is that I'm already committed to https://swampup.jfrog.com/
[17:11:31] <jellycode_> And, as insane a coincidence as it might be, the dates actually overlap :(
[17:11:31] <ngoldbaum> jellycode_: the sad fact is that many of these funding opportunities are undersubscribed because people don't apply for them!
[17:12:20] <jellycode_> Thanks for the info ngoldbaum
[17:12:21] <sumanah> jellycode_: May is a hard month for conference overlap. :)
[17:12:42] <jellycode_> well, since you guys are leading a charge on OSS packaging for python, i think JFrog would love to have you come to swampup
[17:12:50] <sumanah> jellycode_: if you don't already, I recommend you subscribe to http://pyfound.blogspot.com/ which has info about grants, etc
[17:12:53] <jellycode_> obviously, it looks unlikely at the moment
[17:13:11] <jellycode_> but, if you weren't aware of it, you are now :)
[17:14:52] <jellycode_> also, in case you're not aware, Conan is a newer package manager for C and C++ which is built on Python. we have a pretty healthy community seeding the central repository now, and the core devs are obviously python developers. so, there's probably a lot of overlap in challenges and solutions we could discuss in time
[17:15:06] <jellycode_> i'll make this a favorite in my irc channels, and stay in touch
[17:15:10] <jellycode_> thanks again for all the help
[17:16:47] <sumanah> jellycode_: thanks for the heads-ups and I do encourage you to let JFrog Artifactory users know about the redirect issue so they can stay apprised
[17:16:54] <sumanah> just in case something goes wonky
[17:21:25] <davidlloyd> can anyone confirm if https://pypi.python.org/pypi/pip is meant to point at v10.0.0b2? that sounds like a beta, and is causing bugs for my company
[17:23:03] <ngoldbaum> davidlloyd: that's the old interface for pypi, which doesn't know about prereleases
[17:23:14] <ngoldbaum> davidlloyd: see https://pypi.org/project/pip/
[17:23:33] <ngoldbaum> davidlloyd: if you are upgrading and getting the prerelease pip 10 that's either a bug or you are updating from a very old pip
[17:24:02] <davidlloyd> i'm updating using virtualenv/pypy and it is putting me on 10.0.0b2
[17:24:15] <ngoldbaum> someone in here mentioned a bug in virtualenv that caused similar behavior
[18:29:57] <sumanah> also jellycode_ thea is very receptive to suggestions for improvements in the http://packaging.python.org/ tutorials, guides, and discussions
[18:46:16] <di_codes> jellycode_: you probably want EXTRA_INDEX_URL
[18:47:16] <jellycode_> i believe the artifactory "virtual" pypi repos do pass-through, where it tries to resolve packages locally, and then falls back on the central repo if not
[18:47:32] <jellycode_> seems to be working that way in my experiment
[18:49:14] <jellycode_> Also, in my CI, i want to say "pip install my_package --upgrade --force-reinstall". However, i only want to force-reinstall my package. I'm thinking of doing these two commands to satisfy... is there a better way? 'pip install mypackage' + 'pip install mypackage --no-deps --upgrade --force-reinstall'
[18:49:49] <di_codes> jellycode_: separate commands are the way to go
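[editor's note: the two-command approach jellycode_ proposed and di_codes endorsed, spelled out — `mypackage` is a placeholder; the first install resolves dependencies normally, the second force-reinstalls only the package itself:]

```shell
# Step 1: install/upgrade the package and its dependencies as usual
pip install mypackage

# Step 2: force-reinstall just this package, leaving dependencies alone
pip install --no-deps --upgrade --force-reinstall mypackage
```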
[19:50:34] <eric97477> I can provide the complete output if needed.
[19:50:44] <eric97477> but, that part seemed to be the most relevant.
[19:51:11] <sumanah> eric97477: thank you. Is this happening for other packages you download via pip also?
[19:52:05] <eric97477> I have not found it happening with other packages, but my tests have not been exhaustive. I simply needed a solution for the issue mentioned in #distutils and that is how I came across the problem with Sphinx.
[19:53:40] <ngoldbaum> or, more properly, when setuptools tries to use pkg_resources while it imports itself
[19:53:53] <eric97477> Understood and that is perfectly fine. I am simply looking for the best solution for distributing packages to offline machines. It would have been nice if 'pip download' was that generic solution, but as you mention, some packages are tricky and there is no easy answer for that.
[19:56:04] <eric97477> So, I just skimmed through the bug report...seems like it could be related to python 3.5, which I am currently using, and fixed for 3.6?
[19:56:39] <eric97477> I can confirm it works with python 3.6.
[19:58:32] <ngoldbaum> ah, namespace packages are the worst confirmed ;)
[19:58:38] <eric97477> Now, if anyone here is a python packaging and distribution guru, I've got a question for you that I have been asking around....but seems off topic for this channel.
[20:01:15] <eric97477> ngoldbaum: I have a situation where I needed a namespace package solution...I just selected the "easy button" and prefixed my package's name with <namespace_name>_<package_name>. Working so far. I looked at the various namespace solutions and they all seem a bit gnarly.
[20:01:35] <eric97477> thanks sumanah. I will try that channel.
[20:04:00] <ngoldbaum> eric97477: IMO namespace packages are more trouble than they're worth
[20:05:27] <eric97477> I would agree. They don't seem to buy you much...or at least I don't see the compelling need over just prefixing package names. But, if someone wants to prove me wrong, I always like to learn something new.
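[editor's note: for readers weighing the prefix trick against real namespace packages — a minimal sketch of the PEP 420 "native" variant (Python 3.3+), the least gnarly of the options discussed; all directory and package names here are made up:]

```shell
# Two independent source trees both contribute to the 'acme' namespace.
# Key detail: NEITHER tree contains an acme/__init__.py — that absence is
# what makes 'acme' a native namespace package.
mkdir -p site1/acme/pkg_a site2/acme/pkg_b
printf 'VALUE = "a"\n' > site1/acme/pkg_a/__init__.py
printf 'VALUE = "b"\n' > site2/acme/pkg_b/__init__.py

# Both halves import under the shared namespace:
PYTHONPATH="site1:site2" python3 -c \
    'import acme.pkg_a, acme.pkg_b; print(acme.pkg_a.VALUE, acme.pkg_b.VALUE)'
# prints: a b
```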