[07:12:23] <kennethreitz> always been that way though, and it's gotten a lot better
[07:12:34] <kennethreitz> but the better it gets, the higher the expectations get
[07:12:43] <apollo13> yeah, but first things first
[07:15:06] <kennethreitz> agreed. my first thing's mental health, so i ain't going near any of that :P
[07:45:42] <ionelmc> kennethreitz: for packaging arbitrary files see https://github.com/pytest-dev/pytest-cov for an example (it adds that special ".pth" file)
[07:46:18] <ionelmc> it handles more than just the sdist of course
[16:13:59] <dstufft> kennethreitz: pip itself caches /simple/*/ for ~10 minutes
[16:14:51] <dstufft> We do instantly purge on the Fastly side, though if you're uploading to PyPI instead of Warehouse that sometimes fails and doesn't get retried
[16:15:12] <dstufft> and if you're using the latest twine to upload, you're uploading to Warehouse by default ;)
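On the client side, the practical consequence of that ~10 minute cache: if a just-uploaded release doesn't show up, you can bypass pip's local HTTP cache of the /simple/ index pages (standard pip flags, nothing project-specific assumed):

```shell
# Skip pip's local cache so the freshly purged /simple/<project>/ page
# is fetched from the index rather than served from ~/.cache/pip
pip install --no-cache-dir --upgrade yourpackage
```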
[16:22:49] <xafer> hello dstufft, regarding https://github.com/pypa/setuptools/pull/631 , I don't see why we wouldn't want to implement Requires-External ? I agree it won't be generally useful but it has some usecases.
[16:24:24] <dstufft> xafer: it's the same level of attractive nuisance as ``requires`` is. It's something that sort of sounds like the thing you want, but because it's completely advisory it will likely just end up confusing people
[16:27:22] <xafer> Well I did not push Requires-External because at the time, I thought I could easily piggyback on ``requires`` instead, but ``requires`` performs some data validation
[16:27:51] <xafer> I would not push for it on public packages
[16:28:29] <xafer> But in the context of private packages where you can control the metadata, it would be the perfect place to define system dependencies
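For context, Requires-External is already specified in the metadata PEPs (PEP 314/345) as a purely advisory field listing non-Python system dependencies; the PR under discussion is about setuptools actually emitting it into PKG-INFO. A fragment in the PEP 345 style (project names illustrative):

```
Metadata-Version: 1.2
Name: mypkg
Version: 1.0
Requires-External: C
Requires-External: libpng (>=1.5)
```

Nothing consumes the field automatically, which is exactly dstufft's "attractive nuisance" point — but in a controlled private index, tooling you write yourself could read it.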
[21:02:00] <[Tritium]> how are packages counted for the from page of legacy pypi?
[21:02:24] <[Tritium]> is that individual releases? each file in a release?
[21:03:15] <dstufft> [Tritium]: what is the "from page"
[21:04:28] <[Tritium]> https://pypi.python.org/pypi ...the front page
[21:04:49] <[Tritium]> "The Python Package Index is a repository of software for the Python programming language. There are currently 85194 packages here."
[21:05:15] <dstufft> [Tritium]: oh, that's "projects" in Warehouse terms
[21:05:28] <dstufft> e.g. top level names like "Django", "requests", etc
[21:05:46] <[Tritium]> ok, that is the final nail in the coffin of a bad idea
[21:06:19] <dstufft> there are 525,563 releases (e.g., name + version tuples like ("Django", "1.0")) and 640,927 files.
[21:06:27] <[Tritium]> it was suggested in -offtopic that pypi should... provide a torrent tracker for packages.
[21:07:42] <[Tritium]> average file size, as far as i can tell then, is 524K
[21:10:42] <[Tritium]> Is that information in the public data?
[21:11:38] <dstufft> [Tritium]: um, well you could compute it by making 85194 HTTP requests
[21:11:55] <[Tritium]> I am not going to subject my pipe to that heh
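Those per-project requests would go against PyPI's JSON API (`https://pypi.org/pypi/<name>/json`), whose `releases` mapping lists every uploaded file with its `size`. A hedged sketch of the per-project computation — the aggregation over all ~85k projects is left out deliberately, since that is exactly the request flood being joked about:

```python
# Sketch: per-project total file size from PyPI's JSON API.
# The endpoint and the releases -> files -> 'size' layout are real;
# fetch_project_json/total_file_size are names made up for this example.
import json
from urllib.request import urlopen


def fetch_project_json(project):
    """Fetch one project's metadata from PyPI's JSON API."""
    url = 'https://pypi.org/pypi/%s/json' % project
    with urlopen(url) as resp:
        return json.load(resp)


def total_file_size(project_json):
    """Sum the sizes of every file across every release of one project."""
    return sum(f['size']
               for files in project_json['releases'].values()
               for f in files)
```

Usage would be e.g. `total_file_size(fetch_project_json('requests'))`; summing that over the full project list is the 85,194-request job.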
[21:15:19] <[Tritium]> A robust statistical api is something i might suggest/request/...pr ...in a year or two
[21:24:26] <dstufft> [Tritium]: sure! :) As far as stats go, a lot of that largely comes down to me not being very good at extracting what's really useful out of some data so not really knowing what to expose
[21:24:35] <dstufft> that's why the BigQuery stuff just exposes more or less raw data
[21:25:33] <[Tritium]> aye. that makes sense. The fire-hose method is usually the right way to go unless you have a reason not to
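The BigQuery firehose dstufft mentions is the public PyPI download-log dataset. A sketch of the kind of query it supports — the table path is the one currently published (`bigquery-public-data.pypi.file_downloads`); at the time of this log the data lived under the older `the-psf:pypi` dataset, so treat the exact names as an assumption:

```sql
-- Top 10 projects by download count for one day, from the raw log rows
SELECT file.project, COUNT(*) AS downloads
FROM `bigquery-public-data.pypi.file_downloads`
WHERE DATE(timestamp) = "2016-07-01"
GROUP BY file.project
ORDER BY downloads DESC
LIMIT 10;
```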