[01:46:23] <prometheanfire> did pypi change its url generation for source urls?
[01:46:26] <prometheanfire> https://pypi.python.org/packages/b3/0e/5f3ee8884866a3d5e3b8ba86e9caa85ecdec75adabac8924b1c122339e7f/ansible-2.0.2.0.tar.gz vs https://pypi.python.org/packages/source/a/ansible/ansible-2.0.2.0.tar.gz
[01:46:31] <prometheanfire> the second way is the standard way we (gentoo) access python source for packaging
[01:51:58] <njs`> ngoldbaum: oh haha don't ask. In theory I am supposed to be figuring out how to convince UC that contributing to FSF projects is okay. Actually I am ignoring it and hoping it goes away, because UC seems to be under the impression that GPLv3 is the devil and that if they stick their heads under a rock it might go away.
[01:54:28] <prometheanfire> Basically, did pypi change the url pattern for files? If so, we need to know so we can change our package manager to compensate.
[01:57:04] <gchristensen> prometheanfire: heh, us too ... great question.
[02:03:25] <Arfrever> prometheanfire: "b3/0e/5f3ee8884866a3d5e3b8ba86e9caa85ecdec75adabac8924b1c122339e7f" is so unpredictable that there would be no way to use it.
[02:04:05] <prometheanfire> Arfrever: we'd be able to look up the link from https://pypi.python.org/simple/ansible
[02:07:15] <Arfrever> Hardcoding such strings in Gentoo ebuilds would be rather unacceptable.
[02:08:24] <Arfrever> dstufft: Is the https://pypi.python.org/packages/source/${first_character_of_package_name}/${package_name}/${file} scheme still supported?
[02:08:45] <prometheanfire> it would mean that the pypi url helper would do a url lookup via the /simple page first; not perfect by any means
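A rough sketch of the kind of /simple lookup being floated here, assuming the desired filename is already known (the helper name is illustrative; real /simple pages often use relative hrefs and append #md5= fragments, which the sketch strips):

    # Illustrative only: scrape the /simple page for a package and return the
    # absolute URL of a known sdist filename.
    import re
    from urllib.parse import urljoin
    from urllib.request import urlopen

    def find_sdist_url(package, filename):
        index = "https://pypi.python.org/simple/{}/".format(package)
        html = urlopen(index).read().decode("utf-8", "replace")
        for href in re.findall(r'href="([^"]+)"', html):
            if filename in href:
                # resolve relative links and drop any #md5=... fragment
                return urljoin(index, href.split("#")[0])
        return None

    print(find_sdist_url("ansible", "ansible-2.0.2.0.tar.gz"))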
[02:08:52] <prometheanfire> Arfrever: ya, that's what I want answered
[02:14:16] <njs`> Arfrever: ...don't you need to get the URL from some independent index anyway? How do you know that you want "2.0.2.0" unless you've looked in some index?
[02:15:09] <njs`> (and I assume that URL component is a hash; I'd be surprised if gentoo considers putting hashes into ebuilds unacceptable; I would have guessed it was mandatory :-))
[02:15:48] <prometheanfire> that's the source uri to just about every python package I package
[02:16:02] <njs`> prometheanfire: ah, so the issue is just that right now some maintainer has to look up the latest version and plug it into the file, but with the new urls they'd have to look up both the latest version and the latest version's hash?
[02:16:05] <gchristensen> Arfrever: is it a hash of some sort? or just... random?
[02:18:56] <njs`> (to be clear I can see how this might be annoying, esp since it breaks stuff that works, and have no idea if it was intentional. just trying to understand the issue)
[02:27:00] <tdsmith> otoh: python -c 'from requests import get; r = get("https://pypi.python.org/pypi/ansible/2.0.2.0/json"); print([u for u in r.json()["urls"] if u["packagetype"] == "sdist"][0]["url"])'
[04:56:29] <dstufft> it's a blake2b hash of the file contents with digest_size set to 32 bytes
[04:57:03] <dstufft> it solves a few bugs that i'm not awake enough to go into details right now, but if you're still around tomorrow I can explain better
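For context, the new-style path in the example URL above appears to follow a hash[:2]/hash[2:4]/hash[4:]/filename layout. A minimal sketch of computing it under that assumption (hashlib.blake2b needs Python 3.6+; the pyblake2 package exposes the same API on older interpreters):

    # Sketch: derive the hashed path for a file, assuming the
    # hash[:2]/hash[2:4]/hash[4:]/filename layout seen in the example URL.
    import hashlib

    def blake2b_path(filename):
        h = hashlib.blake2b(digest_size=32)  # 32-byte digest, per dstufft
        with open(filename, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        digest = h.hexdigest()  # 64 hex characters
        return "packages/{}/{}/{}/{}".format(digest[:2], digest[2:4], digest[4:], filename)

    print(blake2b_path("ansible-2.0.2.0.tar.gz"))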
[04:57:38] <prometheanfire> dstufft: I'll be around, this heavily alters the workflow of our packaging system
[11:33:46] <paultjuh> ionelmc: Skipping file:///tmp/wheelhouse/scipy-0.17.0-cp34-cp34m-manylinux1_x86_64.whl because it is not compatible with this Python
[11:34:37] <paultjuh> but a similar package, numpy, installs fine
[11:35:11] <paultjuh> installing numpy with the wheel numpy-1.11.0-cp34-cp34m-manylinux1_x86_64.whl (I think it is actually built by the pip wheel step)
[11:37:00] <paultjuh> ionelmc: are there any more details you are missing?
[11:37:25] <ionelmc> paultjuh: are you installing the wheel in a different place?
[11:37:38] <ionelmc> like somewhere where the pip/wheel packages are too old
[11:38:03] <paultjuh> ionelmc: no it is actually the same virtual env
[11:38:09] <ionelmc> the manylinux1 tag is fairly new, you need recent tools to install it
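For anyone hitting the same "not compatible with this Python" message: one way to confirm the tooling is the problem is to check the pip doing the installing (manylinux1 support arrived in pip 8.1). pep425tags is an internal pip API, so this is only a debugging sketch and may move between pip versions:

    # Debugging sketch: does this environment's pip know about manylinux1?
    import pip
    print(pip.__version__)  # manylinux1 support landed in pip 8.1

    from pip import pep425tags  # internal, unsupported API
    print(any("manylinux1" in str(tag) for tag in pep425tags.get_supported()))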
[11:49:16] <paultjuh> ionelmc: thanks. The Python was the same, but it was a different virtualenv; the wheel-builder virtualenv had an upgraded pip, and the installing virtualenv didn't
[11:49:57] <paultjuh> it is the pip coming with some version of ubuntu :)
[15:18:27] <prometheanfire> dstufft: thanks for the link
[15:30:34] <prometheanfire> dstufft: well, just so you know the impact, it's going to block any updates sourced from pypi for our distro (gentoo), at least until we can fix it and distribute the fix to all users
[15:30:48] <dstufft> prometheanfire: gchristensen no problem :) Hopefully it's not too much of a burden to y'all. I just assumed that it wouldn't be too hard for folks. Unfortunately I can't keep the constraints that every single consumer uses in my head :( and Debian had switched to a translator previously (and it worked out well for them) so I just assumed that anyone else could adjust without much trouble.
[15:31:09] <gchristensen> dstufft: yeah, you do you :)
[15:31:26] <prometheanfire> ya, I think the second option will be best
[15:31:47] <gchristensen> dstufft: NixOS uses sha256s internally to verify (as a requirement for all packages) so it won't be that terrible. will old packages continue being at the same URL?
[15:32:25] <prometheanfire> ya, we use a few different sha types
[15:32:36] <prometheanfire> you'd have to break them all at the same time
[15:32:56] <prometheanfire> sha256 sha512 and whirlpool
[15:33:54] <prometheanfire> dstufft: made https://bugs.gentoo.org/show_bug.cgi?id=580648 if you are interested
[15:37:09] <dstufft> gchristensen: Currently I'm migrating the old packages so they are available at both the new and the old URL (which involves making a copy of all ~267GB of PyPI). So as far as PyPI itself is concerned the old URLs will be gone, but since we serve /packages/ directly from S3 the copies at the old paths will still resolve. When I did the migration on Test PyPI I just deleted the old files once the migration was done, so they were only at the new URL. I can probably think of something that will keep
[15:37:10] <dstufft> the old URLs working without needing to keep a huge copy, though. But if you're using a mirror (like one made by bandersnatch) instead of PyPI you won't get the old URLs
[15:38:40] <dstufft> So for now I'll make sure the old URLs still work, and I'll see what I can come up with to keep them working forever.
[15:39:23] <dstufft> probably I can just write a tiny placeholder file to S3 with a Location header and have Fastly redirect to the new URL if it finds one of those files
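A hypothetical sketch of such a placeholder written with boto3 (the bucket name is made up, and whether Fastly keys off this particular piece of metadata is an assumption; the log only describes the general idea):

    # Hypothetical: empty object at the old key whose redirect metadata points
    # at the new hashed URL; the edge layer can turn this into an HTTP redirect.
    import boto3

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket="pypi-packages",  # assumed bucket name
        Key="packages/source/a/ansible/ansible-2.0.2.0.tar.gz",  # old-style path
        Body=b"",
        WebsiteRedirectLocation=(
            "https://pypi.python.org/packages/"
            "b3/0e/5f3ee8884866a3d5e3b8ba86e9caa85ecdec75adabac8924b1c122339e7f/"
            "ansible-2.0.2.0.tar.gz"
        ),
    )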
[15:40:03] <prometheanfire> Arfrever: dunno if here or -python is better for us
[15:40:44] <gchristensen> ah, ok. thank you for the clarification, dstufft. What is the best way to get notified about changes to the existing URLs, should they no longer work?
[15:40:45] <Arfrever> dstufft: Maybe use symlinks for old URLs.
[15:40:58] <gchristensen> S3 doesn't have symlinks
[15:41:00] <prometheanfire> dstufft: so the new method would be canonical and the old method would be legacy, more or less
[15:41:29] <Arfrever> dstufft: Any type of automatic lookup on the Gentoo side is not possible.
[15:42:54] <dstufft> Yea, the small placeholder file will basically be a fake symlink that gets turned into a HTTP redirect by Fastly
[15:42:56] <dstufft> since we front everything with Fastly
[15:44:10] <dstufft> gchristensen: I'm pretty sure I'll figure out a way to make them work. I wasn't planning on it, but to be honest that's mostly because it slipped my mind that some downstreams didn't rehost things in their own repositories and instead just pointed to PyPI with some hashes to ensure integrity (and maybe with some patch files)
[15:44:25] <dstufft> I don't want to break the world of existing packages that point to those urls like that
[15:45:02] <dstufft> idc if I break the world on TestPyPI though, if you use that you get what you deserve :P
[15:45:36] <gchristensen> dstufft: would you prefer that we rehost more actively? we do rehost things we've built, but (like gentoo) we do some source-based packages, so users might look to the origin.
[15:46:43] <dstufft> gchristensen: Nah, it doesn't matter to me. It's a perfectly valid way to use PyPI and I want to support it.
[15:46:49] <prometheanfire> Arfrever: it'd be nice if we could do auto-lookups since we are still protected by shas/signing
[15:50:42] <dstufft> I don't know how you generate those files like that. Is it by hand or is it done by a tool?
[15:51:26] <dstufft> THere is a b2sum utility which will let you compute the blake2 hash of a file on disk
[15:51:44] <prometheanfire> I manually went and did it
[15:52:06] <gchristensen> dstufft: while I've got you on the horn here, I think the way pypi packages are packaged in nixos is a bit brittle and could be improved. is there someone from pypa I could collaborate with who would provide guidance and advice on making it more robust and in sync with how pypa expects things to work?
[15:53:01] <dstufft> gchristensen: I'm probably your best bet. I think I'm the person who has the largest amount of the entire stack in their head.
[15:53:20] <dstufft> I need to go put something together for my wife real quick though :]
[15:53:49] <gchristensen> dstufft: I'm not ready to do this either, could we schedule something for later today, or even another day?
[15:54:13] <dstufft> gchristensen: sure, I'm typically around though tomorrow isn't good for me
[15:55:48] <gchristensen> dstufft: yay! I'll try and ping you later today, if I can reschedule some things.
[16:00:55] <tdsmith> dstufft: thanks for keeping existing URLs live! i would be interested to know if that's a transition state that will go away or something that'll be permanently supported
[16:01:06] <tdsmith> net impact for homebrew is small in the former case and zero in the latter case
[16:27:04] <dstufft> tdsmith: I'll figure out a way to keep the existing urls "permanently" (as in, no plan to remove them now. If in some time I look at the logs and I see nobody actually using them anymore because it's been X years and everyone's since upgraded to pointing at software that's been released since the new urls went live maybe I'll remove them then)
[16:28:02] <dstufft> tdsmith: that being said, there's a non zero chance they end up broken without me noticing because the support for them won't be in warehouse, it'll be in Fastly which we don't have any testing for and I probably won't remember to test them manually when I make VCL changes
[16:28:43] <dstufft> if that happens and someone notices and tells me, I'll fix it, but just as an FYI, there's still a benefit to switching older packages to the new URLs
[16:46:09] <gchristensen> dstufft: curious that pypi is continuing with the md5 in the URL, do you have plans to update that?
[16:59:24] <dstufft> gchristensen: I'm planning to replace the legacy pypi.python.org code base with the code base that is currently running at warehouse.python.org
[16:59:28] <dstufft> which includes sha256 in the URL :]
[18:59:07] <dstufft> I just mean that I need to care a bit more about making sure the thing is reliable and scales than I would if only a few build machines for various distros were going to be using it
[18:59:54] <dstufft> it's not hard though, it's a stateless HTTP process and I can throw it behind Fastly too
[20:07:10] <floppym> Sorry, I was in a meeting. Any questions for me?
[20:07:28] <floppym> Yes, portage just calls wget, which should follow HTTP redirects.
[20:16:27] <floppym> And yes, the hostname can be different, so long as we can grab both old and new tarballs from the new hostname with a path scheme matching */f/foo/foo-1.tar.gz.
[20:17:05] <floppym> If we need to change the path, that means updating ~2000 files.
[20:17:54] <gchristensen> floppym: how do those versions get updated currently?
[20:18:51] <floppym> So say we have a package called setuptools in Gentoo.
[20:18:57] <floppym> We have a directory of "ebuilds":
[20:19:47] <floppym> portage parses those into PN = setuptools and PV = 1, PV = 2 ...
[20:20:25] <floppym> In the ebuild, we have something like SRC_URI="mirror://pypi/s/${PN}/${PN}-${PV}.tar.gz"
[20:20:48] <floppym> We can easily adjust mirror://pypi/ to point somewhere else.
[20:22:13] <floppym> If nothing changes from right now, we would need to make a manual adjustment to SRC_URI when the next setuptools version is released.
[20:22:35] <floppym> Versus just copying setuptools-2.ebuild to setuptools-3.ebuild.
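To make that expansion concrete: with PN=setuptools and PV=2, the SRC_URI above resolves to

    mirror://pypi/s/setuptools/setuptools-2.tar.gz

so only the mirror://pypi/ prefix can be repointed centrally; everything after it is baked into each of the ~2000 ebuilds.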
[22:05:07] <dbrecht> has anyone ever brought up adding a venv distribution type in addition to wheels and [sb]dist? i have a use case where it would be optimal to package up and deploy a self contained venv, along with all dependencies. currently using cx_freeze to facilitate this kind of thing, but seems like it would be a nice addition to the official packaging framework
[22:06:51] <dbrecht> i now remember coming across it some time ago before i started looking into this for my current environment. will spend some time reading up on it. thanks
[22:07:59] <dbrecht> ah yeah.. one of the things that i'd /like/ to be able to keep is the ability to debug a system on a target host in prod. yeah yeah.. hand bombing is bad, but there are /some/ cases where debugging on a remote host is invaluable
[22:08:08] <dstufft> To be fair, I haven't actually used it myself, but I think it is in the vein of what you're talking about
[22:08:19] <dstufft> dbrecht: you mean modify the running files?
[22:09:02] <dbrecht> not necessarily (although you could, say, stick a pdb breakpoint in a file and hope that apache or whatever webserver you're running doesn't reload while it's there ;))
[22:09:23] <dstufft> Not sure if pex has that capability :/
[22:09:23] <dbrecht> more likely just using -m pdb to start something
[22:13:53] <njs> dstufft: I guess my one question after all of that stuff about the URLs is much less important than the others. I'm just curious why blake2b :-)
[22:14:33] <njs> IIUC pex files are basically just a zip file containing a venv, minus the actual python executable and stdlib
[22:14:53] <dstufft> njs: faster than sha256 with better security properties
[22:18:11] <dstufft> njs: it is at least as secure as sha3, but is faster.
[22:19:01] <njs> ngoldbaum: annoyingly, even py35 doesn't ship anything newer than sha2 :-/
[22:19:25] <ngoldbaum> njs: sure, but I can always use shasum -a 256 or sha256sum
[22:19:46] <njs> dstufft: huh, cool. I have tried to look up hash comparisons a few times recently and not found any good references
[22:20:09] <dstufft> py35 doesn't ship sha3 because sha3 wasn't officially finalized before py35 was released
[22:20:33] <dstufft> would have been real crummy to have a sha3 that wasn't actually sha3 if they adjusted the parameters or anything
[22:20:46] <njs> ngoldbaum: sha256 is one of the sha-2 varieties; at least python 2.7 and 3.4+ always have sha256 (don't have older versions handy to test)
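For example, sha256 via hashlib works unchanged on any of the interpreters mentioned:

    # sha256 has shipped in hashlib on every Python discussed here (2.7, 3.4+)
    import hashlib

    with open("ansible-2.0.2.0.tar.gz", "rb") as f:
        print(hashlib.sha256(f.read()).hexdigest())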
[22:21:32] <njs> oh huh, I missed that there were 3 years between when they announced the contest winner and when they actually finalized it
[22:21:33] <dstufft> for what it's worth, blake2 isn't going to replace sha2 for what pip and such actually verify or what is on the /simple/ page. I just used it for what's in the URL mostly because I could :]