[16:25:54] <ronny> dstufft: i reached a point where i need to wire up source distributions for gumby-made python packages, and the more i wade into things, the more i want to avoid creating setuptools-based sdists
[19:14:01] <ronny> dstufft: btw, how does this copr thing work?
[19:30:00] <dstufft> ok, so that's an 8+ feature, so you'll probably just need to do the rebase, sorry :/ I was going to say if it was a 7 feature I could rebase mine on top of yours
[19:30:18] <dstufft> I'm going through the pending issues for 7 and kicking them out of 7 or doing them
[19:31:33] <dstufft> lifeless: re: using ~= in openstack, fwiw there is a corner case where people have an old setuptools but a new pip, it'll install fine but bomb out at runtime
[19:32:09] <dstufft> it's not as uncommon as you'd think to be in that scenario, because setuptools is a dependency in apt/rpm/whatever for a fair number of things, while pip isn't
[19:58:49] <lifeless> it will allow not running setup.py egg_info to discover deps
[19:58:55] <lifeless> it won't avoid the backtracking costs
[19:59:36] <lifeless> but, 300 deps should be big enough for most python apps
[20:00:21] <lifeless> and I am moderately confident that incremental installs will be snappy as long as nothing conflicts, since the current version will be probed first
[20:02:03] <dstufft> dependency_links have next to zero tests
[20:02:10] <dstufft> I think when I tried to remove them, they had 1 test?
[20:03:14] <lifeless> ok they are back in, in principle
[20:03:45] <lifeless> I still think they're a bad idea :)
[20:06:54] <lifeless> dstufft: can we drop the __PKG_NAME__ thing in egg_info's setup.py call ?
[20:07:14] <lifeless> dstufft: it only works for name distributions anyway, and ends up as None for path ones
[20:07:38] <lifeless> hah, in fact its not referenced in the template anymore anyway
[20:47:19] <ronny> dstufft: btw, what is the general consensus on multi version installs?
[20:48:14] <ronny> dstufft: the more i look into pkg-resources/setuptools, the more it seems pip and virtualenv are just plainly wrong and add to the problem by looking simple
[20:50:36] <dstufft> multiversion installs don't allow you to have more than one version in a particular process, so you're still limited to 1 version per process. So a multi version installs only really gives you something that virtualenv doesn't in one case: You have a tool that _must_ be installed into the same environment as your application but _does not_ import your application
[20:50:56] <dstufft> off the top of my head... I think only packaging tools falls into that category.
[20:51:31] <ronny> dstufft: i think the more than one version per process is fixable (by adding something that does not use sys.modules/sys.path)
[20:52:03] <ronny> dstufft: i have a direct use case for that as well, testing libs using a different version of a plugin lib than the app under test
[20:52:30] <ronny> (py.test has extracted its plugin system, and devpi/tox are starting to use that)
[20:52:52] <dstufft> I'm not sure I grok that sentence, what is a "different version of a plugin lib than the app under test"
[20:53:12] <ronny> dstufft: pluggy is the pytest plugin framework extracted
[20:53:40] <dstufft> armin ronacher had a terrible hack that makes multi version imports in the same process work, but it's a terrible hack and is unlikely to work in a ton of use cases
[20:53:44] <ronny> dstufft: now tox for example uses it, and it's impossible to test a tox version that depends on a different version
[20:54:26] <ronny> dstufft: dash from the #python channel created a toolset for doing multi version imports in a different way, unfortunately i dont recall the name
[20:54:51] <ronny> but basically sys.modules and sys.path are an utter killer of such systems from one side
[20:55:05] <ronny> and tracebacks lacking object metadata from the other side
[20:55:20] <dstufft> ronny: sorry, I don't grok the tox case here, why does tox need two different versions of pluggy
[20:55:55] <ronny> dstufft: imagine a tox version requiring one version being tested by a pytest version requiring a different version
[20:56:34] <dstufft> ronny: ok, and if pytest imports version X of pluggy, you can't import version Y inside that same process so you're limited to only tests via subprocess then
[20:57:00] <ronny> dstufft: i think its entirely possible to have different loader contexts for module sets
[20:57:39] <ronny> but it requires a system without filling sys.modules
[20:58:57] <dstufft> ronny: ok, so until something that does that sanely exists, multi version installs don't enable much that virtualenv doesn't enable (and they come with their own set of problems). I'd argue that focusing on multi version installs before you have multi version import is putting the cart before the horse
[20:59:38] <ronny> dstufft: without multi version installs you cant de-vendorize things like setuptools and pip
[21:00:47] <dstufft> I would be opposed to de-vendoring pip even if we had multi version installs
[21:01:14] <ronny> for what reason? it wouldn't be possible to break it any more
[21:01:50] <tomprince> It would be a cool idea, if it worked, but the semantics of python aren't really compatible with what it wants to do.
[21:01:52] <ronny> tomprince: what actually breaks it?
[21:03:12] <tomprince> python allows you to do imports at any time. exocet tries to work by wrapping import to do special stuff while importing something via exocet, but if what you import does any lazy imports, things break.
[21:03:37] <tomprince> That's the one I ran into, trying to use it. I'm sure there are other gotchas.
[21:03:40] <dstufft> ronny: it wouldn't be possible to break it by installing an incompatible version, but it's still more fragile, for instance if you trigger something that causes one of pip's dependencies to be reinstalled and pip gets interrupted you'll end up without one of the dependencies installed. It's impossible to remove that situation completely (since you can interrupt during a reinstall of pip itself) but each dependency is another window of chance
[21:04:19] <dstufft> There's also an increased surface area for bugs, since we don't rely on just basic import semantics anymore, but we now also rely on the multi version import hook thing to be working
[21:04:56] <dstufft> we also end up having to ship all of our dependencies around anyways because a lot of cases we have pip installing itself from a wheel by adding itself to sys.path, so any of our dependencies end up needing to be available anytime pip is around anyways
[21:05:08] <lifeless> ronny: import X is affected by global state
[21:05:38] <lifeless> ronny: the only way to do multi-version imports is to fix the global state, which means the import statement needs to change: its a language change, not a library change
[21:05:49] <dstufft> vendoring is kind of yucky, but for pip it really is the least worst option
[21:05:53] <ronny> dstufft: tools like pip could pretty much set the requirements in stone in the scripts, and only successful installs of the requirements would be allowed to rewrite the scripts
[21:06:15] <tomprince> ronny: I think multi-version installation would be interesting, in a language structured in a way that can deal with that (JS is like this). But python *isn't*, so pretending otherwise doesn't add any value.
[21:06:16] <ronny> dstufft: there is a reason why setuptools put all requirements and versions into the scripts
[21:06:42] <dstufft> multi version installation (and multi version inside one process) comes with its own problems too
[21:08:32] <dstufft> As far as I can tell, we've had zero bugs as a result of pip bundling software, we've had a good number from attempts to make it possible to bundle software
[21:09:04] <ronny> dstufft: without multi version installs its impossible to unbundle to begin with
[21:09:48] <ronny> (there is a reason why tools pin versions, people make messes, without pinning things break)
[21:09:59] <dstufft> Debian has a hack for multi version installs just for pip, where it installs the things pip needs to a special directory and then modifies sys.path to make sure the things we need come before anything else people install
[21:11:05] <ronny> dstufft: interesting, what about that broke? distribute vs setuptools?
[21:12:02] <dstufft> I'm pretty sure it was related to pkg_resources trying to handle multi version installs too :V
[21:12:35] <dstufft> I didn't do the root cause analysis on it, because once I figured out it was debian's fault I passed it onto them and forgot about it
[21:12:48] <dstufft> well stopped thinking about it
[21:13:47] <ronny> dstufft: currently it neither correctly handles setuptools vs distribute nor does it handle packages shadowing packages using dist-info
[21:14:08] <ronny> basically the distribute to setuptools merge opened a whole can of worms that never was closed
[21:14:21] <ronny> and it unfolds in unexpected ways
[21:14:23] <dstufft> ronny: I guess a more accurate representation of my view points are, if someone invents a multi version install / multi version import that actually works, I would investigate using it, but given what I know about Python semantics I don't think such a thing is actually possible without some fairly major changes to the language or by essentially creating a second import system
[21:14:50] <dstufft> and I'm not personally interested in trying to work on such a thing because I don't think it's workable in Python
[21:14:58] <ronny> dstufft: thats for certain, one has to leave out sys.modules
[21:15:13] <lifeless> dstufft: I find it useful to know the version being installed
[21:15:31] <lifeless> dstufft: any objection to the 'installing packages' line including it - e.g. foo(1.2) ?
[21:15:52] <tomprince> I'd say it partially closed a can of worms, but that we are still feeling some of the repercussions of the split.
[21:15:53] <ronny> dstufft: multi version install is manageable sanely, multi version in a single process is something solvable but not in the near future
[21:16:12] <tomprince> But the first doesn't have any significant value without the second.
[21:16:17] <lifeless> ronny: I think we have different definitions of sanely.
[21:16:35] <dstufft> lifeless: No objections, we already include what version is installed at the end of the process, adding it in the middle seems reasonable too
[21:16:42] <ronny> lifeless: i think we are talking about different things
[21:16:49] <tomprince> lifeless: It'd be nice if it printed something that could be copy-pasted as a specifier.
[21:17:05] <lifeless> tomprince: I could do foo===version
[21:17:22] <dstufft> lifeless: there's a bug around the "what version just got installed" code, so if you're poking at it, looking at that wouldn't be unwelcome :)
[21:17:35] <ronny> lifeless: but ever since i understood setuptools encapsulated distributions i can't help but think that pip+virtualenv is a massive step back
[21:18:43] <lifeless> ronny: one name one thing in memory
[21:18:45] <dstufft> there was a reason I was against doing foo==version when we added it to the end of the pip install, but I don't remember what it was
[21:19:25] <tomprince> It is a step back in terms of features, but not in terms of *useful* features, and a step forward in terms of simplicity and comprehensibility.
[21:19:27] <lifeless> dstufft: oh thats an interesting bug
[21:19:37] <lifeless> dstufft: I saw that too, pkg_resources.WorkingSet returning cached info
[21:20:02] <lifeless> dstufft: I think there is a hidden cache somewhere. I worked around it by ignoring pkg_resources.WorkingSet :)
[21:20:08] <dstufft> we started out with easy_install and multi version installs
[21:20:13] <ronny> lifeless: current setuptools uses that, and its a major prevention of different contexts
[21:20:21] <dstufft> then pip and virtualenv came along and people preferred them
[21:20:47] <lifeless> ronny: but its the fundamental thing right? Changing anything that manipulates modules etc requires talking about that with e.g Guido
[21:21:08] <ronny> dstufft: of course it's easy to prefer pip + virtualenv, they seem simpler, but they make so many details a mess
[21:21:43] <ronny> dstufft: the fact that you cant possibly devendor pip because it breaks things is a massive code smell that shows there is a problem
[21:22:32] <ronny> because vendoring is the practical equivalent of 'sorry, copy&paste was the only solution we could handle'
[21:22:57] <dstufft> ronny: if the only problem is pip I feel pretty good about that, because pip is an oddball
[21:23:09] <tomprince> ronny: I don't think that is really a code smell, because pip is an exceptional case that needs to work even if everything else is broken.
[21:23:40] <dstufft> because pip has a bootstrapping problem, and pip has a case where pip needs to try really hard to keep working if someone mutates their environment because if you break pip you break their ability to fix their environment in many ways
[21:23:44] <ronny> dstufft: but well, i have a major problem there: if multi version install is killed and utterly unsupported, i can't ever hope to work on a working multi version in single-process
[21:23:59] <lifeless> dstufft: ronny you can't ever hope to work on that anyway
[21:25:45] <dstufft> ronny: well I mean pip and virtualenv made a decision, setuptools made a different decision. You can still do multi version installs to this day, you just use easy_install instead of pip
[21:26:28] <dstufft> pip isn't going to add support for a feature that might someday be useful if a mess of other problems gets sorted out first
[21:26:44] <dstufft> the right order is to prove it's useful before it goes inside pip
[21:26:46] <ronny> dstufft: but eggs are DEAD, so doing anything that way is just riding a dead horse
[21:27:13] <lifeless> ronny: thats rather the point isn't it?
[21:27:14] <tomprince> So are multiple versions installs.
[21:28:06] <ronny> my basic impression is that if we don't make multi version installs possible, and allow people to experiment with multi version in process, then the packaging system will drag python down
[21:28:22] <lifeless> ronny: they can experiment without pip support.
[21:28:38] <lifeless> ronny: its not a dependency on that.
[21:28:56] <tomprince> And, the packaging system was dragging python down, but the work of the pypa team has turned that around.
[21:29:01] <dstufft> lifeless: seems OK to me, the PR I linked was adding it to the final statement, the "installed prettytable-0.7.2 pystache-0.5.4" logging statement that happens at the very end of a pip install
[21:29:25] <ronny> lifeless: it's a major complication, and not fixing it leaves python way behind
[21:29:30] <lifeless> dstufft: ok. so that one I would like to change too, because its not copy-pastable
[21:30:22] <dstufft> lifeless: I remember why i was against foo==1.0, it was just "maybe we'll want to change it someday and people will rely on it and then yell at me for changing it". Probably I don't care so much about that anymore (though I think foo (1.0) is a nicer looking output so either way)
[21:30:51] <lifeless> dstufft: ok, so I'll do something I like and you and tomprince and whoever else can bikeshed it out during review :)
[21:32:31] <dstufft> lifeless: pip: everything is broken
[21:32:34] <lifeless> I don't mean my code broke it (though it may have :P)
[21:32:40] <lifeless> just it's conceptually incomplete
[21:32:42] <dstufft> somehow it installs things sometimes though
[21:33:33] <ronny> tomprince: basically i need a reasonable path to making multiple versions of a lib within the same python process possible as well as a way not to break tools on package installs/upgrades
[21:33:54] <ronny> tomprince: the single version install model makes each of those completely impossible
[21:34:10] <dstufft> ronny: To be clear, I'm just one guy. I'm not Guido and I'm not a BDFL or anything. We generally have a policy in pip now that we don't add new major concepts (like packaging formats, metadata formats, etc) without a PEP backing it. Wheels didn't land without a PEP, PEP 440 didn't land without a PEP, metadata 2.0 won't land without a PEP, etc
[21:34:21] <dstufft> if you really feel strongly about this, write a PEP and propose it on distutils-sig
[21:34:39] <dstufft> we'll have a discussion and the community will find out the way we want to go
[21:35:02] <dstufft> if a PEP gets accepted, then pip can implement it
[21:35:29] <lifeless> Personally I'd resolve the testing issue by env-per-context
[21:35:47] <dstufft> We went far too long where packaging was essentially implementation defined, we're not going back to that :)
[21:35:49] <lifeless> to avoid subtle things like N-th tier deps that put global state on disk in different formats
[21:35:59] <ronny> dstufft: that practically makes it utterly impossible, i'm not paid to work full time on that - and it needs at least 2 months of fulltime work to make it proper
[21:36:12] <ronny> and thats a very optimistic estimate
[21:36:19] <lifeless> ronny: it would need that work to do it in pip either way right?
[21:36:31] <lifeless> ronny: so are you asking someone else to do it?
[21:37:21] <ronny> lifeless: experimental solutions have much different time constraints than working out a fully fledged pep, getting it accepted and then implementing
[21:37:39] <tomprince> What's stopping you from experimenting?
[21:37:42] <ronny> we are talking about the difference between prototyping and publishing products
[21:37:46] <lifeless> ronny: PEP's are implementation
[21:37:56] <dstufft> ronny: I'm sorry, I totally recognize I have a privilege where I could dedicate my free time (before I got paid to work on things) and then an even greater privilege where I get paid to work on it. Unfortunately OSS is built by people who are privileged enough to have that time, if you don't personally have time then find someone who does :) Or find someone who wants to sponsor you :)
[21:37:58] <ronny> tomprince: the whole ecosystem being counter to the solution i'm working on
[21:38:03] <lifeless> ronny: and they're not polished products - see PEP-426 for instance. it's 90% scifi still.
[21:38:41] <tomprince> pip can install wherever you want (using --target), or you can use easy_install. And then you can experiment with your in-process work.
[21:38:59] <lifeless> ronny: ok so let's back this up a bit. The ecosystem *had* the chance with easy_install and multiple egg versions etc and has nearly unanimously moved away from that.
[21:39:18] <tomprince> Once you have a demonstration of something workable, it would be reasonable to talk about adding support to the installation tools to support easily.
[21:39:21] <lifeless> ronny: they moved away because of reliability, robustness and cognitive overhead issues
[21:39:36] <lifeless> ronny: to turn that around you *need* to have good answers for the problems it caused.
[21:41:12] <lifeless> ronny: the root cause for most of the problems is that Python defines global state around package names.
[21:41:12] <ronny> the current situation is only robust because of vendoring; all it killed was cognitive overhead and disk space
[21:41:32] <dstufft> most of those don't really need to vendor
[21:41:39] <lifeless> ronny: no, vendoring is almost completely orthogonal to the reasons folk moved away from eggs.
[21:41:51] <ronny> dstufft: i regularly run into version conflicts
[21:41:59] <dstufft> pip and setuptools are the only ones who really need to do that, and if setuptools gets rid of easy_install and becomes just a build tool, then it won't need to vendor either
[21:42:08] <lifeless> ronny: so for instance, if you wanted to convince me to enthusiastically support multiple-version installs
[21:42:25] <lifeless> you'd need to have Guido on board for an eventual multi-version-import system
[21:42:32] <ronny> dstufft: jinja 2.x needing to use the package name jinja2 instead of jinja pretty much shows the extent of the problem
[21:43:06] <lifeless> ronny: thats not a multi-install problem though, thats multi-import at the heart of it.
[21:43:46] <dstufft> there, you got multi version installs in pip
[21:44:14] <dstufft> you lose the pip dependency resolution and a bunch of other things and you might as well be using easy_install, but hey it's there
[21:45:15] <ronny> if it's half broken in various ways, what's the practical difference from having it and just saying 'fuck you'
[21:45:31] <lifeless> ronny: because it actively harms folk running Python in production.
[21:45:51] <lifeless> ronny: *they* are the folk universally choosing to have only one version of a thing installed at once.
[21:46:06] <lifeless> ronny: buildout does this. virtualenv does this. distro packages do this.
[21:46:33] <lifeless> ronny: because troubleshooting multi-version installs is a nasty experience and no-one wants to do it twice.
[21:48:33] <dstufft> (this is a problem with the node ecosystem btw, yea you remove version conflicts at install time, but you just push it down into runtime where you have "weird" errors because thing A passed an object from thing Z 1.0 into thing B, and thing B was expecting Z 2.0)
[21:49:36] <dstufft> I think rust (is it rust? I think it is) has a decent solution to this, you can have multiple versions of something, but you can't expose any of its objects in your crate's public interface
[21:50:10] <dstufft> e.g. you can have multiple versions, but only if those multiple versions are entirely an implementation detail of the crate; if any of them are part of the public interface of the crate, it's a public dependency and they get resolved to a single set of dependencies
[21:50:12] <ronny> not annotating tracebacks with version import contexts is a major blocking debugger bug
[21:50:31] <dstufft> but that relies on a typing system and interfaces and a lot of things that python doesn't have and are in many ways anti-python
[21:51:03] <lifeless> ronny: and again we're back to the core Python being a core limit
[21:51:33] <tomprince> ronny: It isn't a fuck-you. It is just that the core python packaging tool isn't going to support a hack for a single person interested in experimenting with a new feature.
[21:52:00] <lifeless> ronny: but its more than that; break-on-install is arguably better than break-at-runtime, and ops reallllly prefer that
[21:52:15] <tomprince> Which doesn't in any way stop you from experimenting (and we pointed out several ways to leverage the tool to help with that experimentation).
[21:52:19] <lifeless> ronny: so as tomprince and dstufft say, this isn't the pip folk saying no.
[21:52:42] <tomprince> You could even have a fork of pip that supports what you want (either a custom local version, or with a different name).
[21:52:50] <lifeless> ronny: its them saying they are *following* not *deciding*, and if you want something in pip itself, you need to *lead* it in the right forum
[21:53:19] <ronny> unfortunately pip turns off setuptools encapsulated installs, thus making site-packages a problem
[21:53:21] <lifeless> and with that, I am EOF this discussion, since it seems to be thoroughly in circle mode.
[21:55:26] <tomprince> ronny: We've given you at least three ways to work around it. Which seems plenty sufficient, since it is an experiment. Once you have something useful using it, a discussion can be had on whether it is a useful addition.
[21:55:30] <ronny> i could work perfectly fine on top of an all-setuptools world, because it doesn't force stuff into site directly
[21:55:30] <lifeless> [My part in it anyhow - I don't care if you folk keep chatting :)]
[21:56:01] <dstufft> Yea, I think we are in circle mode here, I don't have anything of value to add either other than if you think you can beat the multi version import problem, the on disk layout is the least interesting part of that, it can be changed after there's a working prototype for multi version imports, and that anything else in pip needs PEPs and distutils-sig discussions for major new concepts. and that this is really not a fuck you to you personally,
[21:56:02] <dstufft> we just can't add in ad-hoc experimental concepts just in case they might be useful
[21:56:12] <dstufft> and with that, I'm EOF on discussion too :)
[22:46:19] <lifeless> dstufft: no, it's fine. It's in my branch that it gets confuddled
[22:47:17] <lifeless> and I think I see why, we consult the cached wheel for dep data in develop
[22:47:56] <lifeless> which is fine, but the RequirementCache doesn't use InstallRequirement as it traverses everything, so that lookup doth not happen
[22:49:00] <lifeless> we probably want it to, since static is faster to evaluate than dynamic