[02:46:00] <tdsmith> New tool to help package Python applications and their dependencies into Homebrew formulas: https://github.com/tdsmith/homebrew-pypi-poet
[02:59:17] <dstufft> tdsmith: neat, I thought homebrew didn't package stuff that could be pip installed, is that wrong?
[03:04:52] <dstufft> tdsmith: btw, pip 6.0.8 and virtualenv 12.0.7 released, just an fyi!
[03:17:28] <tdsmith> current thinking is we don't like packaging libraries unless they're difficult to pip-install but we don't discriminate against command-line apps that happen to be written in python, dstufft
[03:19:51] <tdsmith> we bundle apps and their python dependencies together and keep everything out of the global site-packages which is actually kinda nice
[03:25:36] <dstufft|laptop> tdsmith: are you using something like pipsi for that or doing it manually?
[03:31:55] <tos9> pipsi has some fairly annoying parts
[03:32:02] <tos9> probably I should send mitsuhiko a PR at some point to remove some of them
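A minimal sketch of the per-app-virtualenv idea that pipsi automates and that the Homebrew approach above resembles; the helper name, install location, and example package are illustrative assumptions, not taken from the log:

    from pathlib import Path
    import subprocess
    import sys

    def install_isolated(package, home=Path.home() / ".local" / "venvs"):
        """Create a dedicated virtualenv for one CLI app and install the app into it."""
        env_dir = home / package
        subprocess.check_call([sys.executable, "-m", "venv", str(env_dir)])
        subprocess.check_call([str(env_dir / "bin" / "pip"), "install", package])
        # The console script lands in env_dir/bin; global site-packages stays untouched.
        return env_dir / "bin" / package

    # install_isolated("httpie")  # hypothetical example invocation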
[10:46:22] <mgedmin> ok, I think I know what the problem is: xargs runs ['pip', 'install', '-e git+https://...'] instead of ['pip', 'install', '-e', 'git+https://...']
[10:46:51] <mgedmin> fixup with sed 's/^-e /-e/' maybe?
[10:48:56] <kevc> that seems to work, although not quite clear why. Now it gets ['pip', 'install', '-egit+https://...'] and it works
[10:51:01] <mgedmin> the why is clear to me: pip thinks the URL you specify in '-e git+https://...' is ' git+https://...' with a leading space
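A small sketch of the same workaround without xargs, assuming a list of editable requirements (the repository URL is a placeholder): splitting "-e" and the URL into separate argv entries from Python avoids the word-splitting problem entirely.

    import subprocess
    import sys

    # Placeholder editable requirement; in the log these lines came from a file piped to xargs.
    requirements = ["-e git+https://example.com/org/repo.git#egg=repo"]

    for req in requirements:
        if req.startswith("-e "):
            argv = ["-e", req[3:]]  # keep the flag and the URL as two separate arguments
        else:
            argv = [req]
        subprocess.check_call([sys.executable, "-m", "pip", "install"] + argv)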
[13:26:48] <theuni2> I uploaded a file to pypi with setup.py upload. The upload failed (Upload failed (503): backend read error) but the file shows up.
[13:27:04] <theuni2> Also, the file is a valid zip file with no errors, but it has a different md5sum than my local copy.
[13:27:04] <mgedmin> see also: the byzantine generals problem
[13:27:25] <theuni2> And I'm not sure what triggered the 503
[13:27:32] <mgedmin> if you cmp it with your copy, what happens? is it an exact prefix?
[13:27:53] <theuni2> as i can't delete files any longer on pypi (yay) i wonder whether i will keep triggering errors on the server doing new releases ...
[13:33:46] <mgedmin> http://mina.naguib.ca/blog/2012/10/22/the-little-ssh-that-sometimes-couldnt.html is an amazing story about in-flight data corruption
[13:33:50] <theuni2> it always changed a byte value of 216 to 132
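A rough sketch of the comparison mgedmin suggested: check whether the copy fetched back from PyPI is an exact prefix of the local archive or differs mid-stream, and show both checksums (file paths are placeholders).

    import hashlib

    def compare_files(local_path, remote_path):
        with open(local_path, "rb") as f:
            local = f.read()
        with open(remote_path, "rb") as f:
            remote = f.read()
        print("local md5 :", hashlib.md5(local).hexdigest())
        print("remote md5:", hashlib.md5(remote).hexdigest())
        for offset, (a, b) in enumerate(zip(local, remote)):
            if a != b:
                print(f"first difference at byte {offset}: {a} != {b}")
                break
        else:
            if len(local) != len(remote):
                print("one file is an exact prefix of the other")
            else:
                print("files are identical")

    # compare_files("dist/pkg-1.0.zip", "downloaded/pkg-1.0.zip")  # placeholder paths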
[13:47:58] <dstufft> mgedmin: mirrors are useful in some situations; public mirrors are probably not generally useful to the average person, though in some cases, like China, they can make a lot of sense
[13:48:20] <dstufft> (I don't think it's a coincidence that most of the mirrors are in CN)
[13:49:42] <dstufft> CN is an interesting problem, because they have decent bandwidth inside of CN, but the pipes in and out are heavily congested
[13:50:16] <dstufft> and Fastly doesn't have a CDN POP in China because of the way China's laws are set up: Fastly, the company, can become liable for what all of their customers do on the CDN, as if Fastly itself were doing it
[13:50:50] <dstufft> Fastly does offer a thing where they can set up a custom Fastly POP for the customer on customer-owned hardware inside China, but that's $$$
[13:52:48] <theuni2> also i like the control i get with a really local mirror in my datacenter :)
[13:53:01] <theuni2> i think fastly does a more than decent job
[13:53:08] <theuni2> but i really hate transparent middleboxes
[13:53:22] <theuni2> we had a couple of incidents in our data center where Fastly problems looked like our problems
[13:53:59] <theuni2> also, the mirrors are a nice insurance that does not require a central authority
[13:54:12] <theuni2> fastly itself may be distributed, but then again it's a single commercial entity we all start relying upon
[13:54:25] <theuni2> adding a bit of self-sustainability is nice in itself
[13:55:01] <theuni2> i should have just done it and added a self-updater right into bandersnatch
[13:55:29] <theuni2> i mean. it is mirroring newer versions of itself anyway
[13:56:43] <dstufft> the mirroring protocol is an important feature, it's really just the public mirrors that for the average person aren't generally worth it (though it's also not hard to make a mirror public if you're running one in your DC)
[13:57:37] <dstufft> which is why PEP whatever just ditched the mirror discovery protocol and not mirroring altogether :D
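For reference, a minimal sketch of how a client uses such a private mirror once bandersnatch is serving its simple index: point pip at the mirror instead of pypi.python.org (the mirror URL below is a placeholder, not from the log).

    import subprocess
    import sys

    MIRROR_INDEX = "https://pypi.mirror.example.internal/simple/"  # placeholder host

    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--index-url", MIRROR_INDEX,
        "requests",
    ])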
[13:59:03] <theuni2> the weird thing is that bandersnatch was my way to respond to the previous version that was _really_ hard to keep running
[13:59:08] <dstufft> The current pypi-mirrors.org is a pretty good example of why Fastly is a better solution than mirroring for the common case
[13:59:12] <theuni2> now you still need to update the software every now and then
[13:59:34] <theuni2> yeah, a good commercial entity usually has better follow-through
[13:59:50] <theuni2> especially if they give you something for free that they depend upon with their core business, not a side-thing
[14:00:16] <dstufft> (I <3 Bandersnatch though, and pypi-mirrors.org, between the two of them we can pretty much detect whenever we have a broken purge somewhere)
[14:00:35] <dstufft> I know we discovered a bunch of bugs in Fastly in the beginning from it
[14:00:50] <theuni2> heterogeneity keeps everyone on their toes :)
[14:01:19] <dstufft> I know that Openstack loves their bandersnatch mirrors too
[14:01:26] <dstufft> they run an insane number of test jobs in a day
[14:01:37] <theuni2> yeah, got quite some good feedback and really exotic edge cases from them
[14:01:42] <theuni2> i was hoping we wouldn't have those
[14:01:53] <theuni2> but luckily we're better at dealing with them than the old client
[14:02:05] <dstufft> a non-trivial number of them would fail if they relied on the CDN, not because the CDN is bad but because the internet doesn't actually work
[14:03:28] <theuni2> reminds me of the reasoning from one of the py core guys about why google runs the python core test suite the way they do
[14:03:35] <theuni2> no tolerance for false positives
[17:54:19] <tomprince> .egg-info or .dist-info directories in site-packages.
[17:57:04] <famille> I don't find them... Is it true on windows also?
[17:58:31] <famille> ah, ok, independent files you mean. thanks!
[17:59:50] <famille> I hoped it was centralised in a file, apparently not
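A short sketch of how that metadata is read in practice: each installed package drops its own .egg-info or .dist-info directory into site-packages, and tools walk those directories rather than consulting one central registry file (illustrative only).

    import pkg_resources

    # Enumerate every installed distribution by walking the per-package
    # .egg-info / .dist-info metadata on sys.path.
    for dist in pkg_resources.working_set:
        print(dist.project_name, dist.version, dist.location)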
[18:33:35] <ggherdov`> Hello. I made my virtualenv with "virtualenv --no-site-packages foo", but when I run "activate" and run some python program, it looks like my "imports" still go to the global site-packages.
[18:33:35] <ggherdov`> Running "yolk -l" from inside the env confirms that I have duplicate packages, and the global ones are "active". How do I isolate my env from the external world?
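A quick diagnostic sketch for questions like this one: run it with the environment's python; if sys.prefix still points at the system installation, or global site-packages directories show up on sys.path, the virtualenv is not actually the interpreter being used (for example because activate was executed in a subshell instead of being sourced).

    import sys

    print("executable:", sys.executable)
    print("prefix    :", sys.prefix)
    for entry in sys.path:
        print("path entry:", entry)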