[18:02:30] <dstufft> <1MB for compress/decompress of gzip, 2-7MB for compress of bzip2, 1-4MB for decompress of bzip2, 2-311MB for compress of lzma, 1-33MB for decompress of lzma
[18:02:53] <dstufft> but yea, I think lzma is wholly better than bzip2 (and gzip unless you care about speed)
[18:03:10] <Ivo> dstufft: that site does not say differently, it agrees with me
[18:03:18] <Ivo> lzma takes longer to compress than bzip2
[18:03:40] <dstufft> oh I'm stupid, I was looking at the wrong column
[18:04:07] <dstufft> I still think it's wholly better for packaging, you only compress them once :D
[18:04:13] <Ivo> but anyway compression time is the thing you don't look at for packaging
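Those trade-offs are easy to reproduce with nothing but the stdlib. A rough sketch; the input file is an assumption (any large compressible file will do), and the exact figures depend on it and on the compression presets:

    import bz2, gzip, lzma, time

    # Assumed input path; substitute whatever large compressible file you have.
    data = open("/usr/share/dict/words", "rb").read()

    for name, compress in [("gzip", gzip.compress),
                           ("bzip2", bz2.compress),
                           ("lzma", lzma.compress)]:
        start = time.time()
        out = compress(data)
        print("%-5s %9d bytes in %.2fs" % (name, len(out), time.time() - start))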
[18:05:15] <Ivo> but dstufft if wheels are lzma how will we break our own advice and magically import and run them with virtualenv and get-pip.py
[18:05:50] <dstufft> well probably we'd have to continue to use ZIP_DEFLATED for a long time
[18:06:04] <Ivo> could always write the 10 more lines to extract it first
[18:06:05] <dstufft> esp since LZMA is only on Python 3.3+
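Ivo's "10 more lines" would look roughly like this: unpack the wheel first, whatever its compression, then import from the extracted directory instead of from the zip itself. A sketch; the wheel filename is just an example:

    import sys, tempfile, zipfile

    # Example wheel name; if its members were ZIP_LZMA this would need
    # Python 3.3+, where the stdlib lzma module first appears.
    wheel = "pip-1.5.4-py2.py3-none-any.whl"

    tmp = tempfile.mkdtemp()
    zipfile.ZipFile(wheel).extractall(tmp)   # decompression happens here
    sys.path.insert(0, tmp)

    import pip   # now imported from the extracted tree, not the zip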
[18:06:31] <Ivo> and if it's still a .whl, how will a pip on python2 know not to grab it
[18:10:57] <dstufft> besides, putting the compression into a filename isn't unreasonable
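No such naming scheme exists; purely as an illustration of the idea, dispatching on an invented extension could be as small as:

    # Entirely hypothetical: if wheel 2.0 put its compression in the
    # extension, an old pip would not recognize ".whlz" and would skip it.
    def wheel_compression(filename):
        if filename.endswith(".whl"):
            return "deflate"    # wheel 1.0, safe for any pip
        if filename.endswith(".whlz"):
            return "lzma"       # invented extension for an lzma wheel
        raise ValueError("not a wheel: %r" % filename)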
[18:13:50] <dstufft> Ivo: oh and why use zipfile_with_lzma instead of just xz: because zipfile is a container format, while xz isn't, it only holds a single compressed stream
[18:13:56] <dstufft> so you have to pair xz up with something like tar
[18:15:44] <dstufft> FWIW zipfile compresses each individual file on its own, whereas .tar.xz compresses all of the files at once. The difference there is that I'm pretty sure .tar.xz will get better compression, but zipfile makes it easier to reach in and decompress just one file
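That difference is straightforward to demonstrate with the stdlib (Python 3.3+ for ZIP_LZMA); the file names here are placeholders:

    import tarfile, zipfile

    members = ["METADATA", "module.py"]   # placeholder file names

    # zipfile: every member is compressed independently...
    with zipfile.ZipFile("pkg.whl", "w", compression=zipfile.ZIP_LZMA) as zf:
        for name in members:
            zf.write(name)

    # ...so a single member can be read back without touching the rest.
    with zipfile.ZipFile("pkg.whl") as zf:
        metadata = zf.read("METADATA")

    # tar.xz: one xz stream over the whole archive; usually smaller, but
    # getting one member means decompressing everything stored before it.
    with tarfile.open("pkg.tar.xz", "w:xz") as tf:
        for name in members:
            tf.add(name)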
[18:16:38] <Ivo> conventionally the overarching purpose of wheels would be to unpack them entirely
[18:17:48] <Ivo> I kinda want to think about an sdist 2.0 before a wheel 2.0
[18:17:51] <dstufft> sure, I'm not saying that switching to tar wouldn't be reasonable, although it's not unreasonable to think that Python might at some point support importing xz/bz2 members from a zipfile, since that should be as easy as supporting the other compression types, vs needing to write a whole compressed-tar importer
[18:18:07] <Ivo> because the former could inform the latter
[18:18:12] <dstufft> and keeping it with zipfile makes it easy to keep the same code for handling wheels
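For context on why ZIP_DEFLATED is the safe choice here: zipimport, the machinery that makes zips on sys.path importable, only handles stored and deflated members. A minimal demonstration of the trick get-pip.py leans on, with a throwaway bundle.zip:

    import sys, zipfile

    # zipimport only understands stored and deflated members, which is why
    # wheels stay on ZIP_DEFLATED as long as from-zip importing matters.
    with zipfile.ZipFile("bundle.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("hello.py", "GREETING = 'hi'\n")

    sys.path.insert(0, "bundle.zip")   # the same trick get-pip.py relies on
    import hello
    print(hello.GREETING)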
[21:12:38] <Ivo> dstufft: if you want dependency_links to kick the bucket, have you had any thought about enabling people to host interdependent packages in source control through any other means/method?
[21:34:37] <tomprince> Put up an index with vcs links.
[21:47:49] <Ivo> tomprince: does everybody really have to host their own http server, together with some tool for compiling links, for a couple of packages?
[21:49:36] <tomprince> It could be a gist or in a repo (accessed raw)
[21:50:48] <Ivo> and because there is no md5 pip will constantly spit warnings that you need to use --allow-unverified
[21:51:56] <tomprince> Not for vcs hosted packages, which is the only real use case I can see for dependency_links.
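The kind of static page tomprince describes is just an HTML file of links that pip can be pointed at with --find-links; it can live anywhere, a raw gist included. A sketch that generates one, with all project names and URLs invented:

    # Hypothetical projects kept in source control; archive URLs such as
    # GitHub's /archive/<tag>.tar.gz hand pip an ordinary sdist to download.
    links = {
        "mypkg-1.0": "https://github.com/example/mypkg/archive/1.0.tar.gz",
        "myplugin-1.0": "https://github.com/example/myplugin/archive/1.0.tar.gz",
    }

    with open("links.html", "w") as f:
        f.write("<html><body>\n")
        for egg, url in links.items():
            f.write('<a href="%s#egg=%s">%s</a><br>\n' % (url, egg, egg))
        f.write("</body></html>\n")

    # then: pip install --find-links https://example.com/links.html mypkg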
[22:03:20] <dstufft> Metadata 2.0 includes the idea of direct references
[22:04:48] <dstufft> which is like dependency links except not dumb
[22:04:49] <dstufft> also they'll be blocked on uploads to PyPI
[22:04:49] <dstufft> so you can't use a direct reference on something you publish to PyPI
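For what it's worth, direct references eventually shipped as the PEP 440/508 "name @ URL" syntax. A sketch using the later packaging library, with the URL invented:

    from packaging.requirements import Requirement

    # Invented URL; PyPI rejects uploads whose metadata carries a direct
    # reference like this, exactly as described above.
    req = Requirement("mypkg @ git+https://github.com/example/mypkg")
    print(req.name)   # mypkg
    print(req.url)    # git+https://github.com/example/mypkg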