#pypa logs for Monday the 15th of June, 2020

[01:33:45] <elibrokeit> is there some decent way to check if a packaging requirement is satisfied in the current environment, other than running pip install? ideally with as few dependencies as possible. AFAICT the packaging module has lots of stuff for describing one, but nothing for checking or comparing them...
[01:34:11] <tos9> elibrokeit: Not in general, considering installing packages runs arbitrary code.
[01:35:19] <elibrokeit> all I want to do is some sort of is_satisfied('requirement_string')
[01:35:33] <elibrokeit> I don't want to install things
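A minimal sketch of the kind of helper being asked for here, built on `packaging` plus the stdlib's `importlib.metadata`; the name `is_satisfied` is hypothetical, and extras and environment markers are ignored for brevity:

    # Hypothetical is_satisfied(): check a single requirement string against
    # the current environment. Extras and markers are not handled here.
    from importlib import metadata
    from packaging.requirements import Requirement

    def is_satisfied(requirement_string: str) -> bool:
        req = Requirement(requirement_string)
        try:
            installed = metadata.version(req.name)
        except metadata.PackageNotFoundError:
            return False
        # An empty specifier matches any installed version.
        return req.specifier.contains(installed, prereleases=True)

    print(is_satisfied("packaging>=20.0"))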
[01:35:51] <tos9> elibrokeit: hm, how come?
[01:35:59] <tos9> What are you going to do with the result if not install it?
[01:37:06] <elibrokeit> my immediate interest is the belief that it would be nice if python -m pep517.build could detect that the requirements for building a wheel are already installed, and NOT try to run pip to install it, since pip itself might not, in fact, be installed
[01:37:54] <elibrokeit> as the linux distribution bootstrap chain can always be aided by not requiring "literally all of pip" for every gosh-darned little thing
[01:39:34] <elibrokeit> and pep517.build is of limited use as a blessed "make me a wheel" tool, if I have built a *virtual machine* containing all the build requirements for a module, but not containing pip, and want to build a wheel, which would work fine except that it errors out in pip
[01:40:14] <tos9> it's not yet a blessed wheel tool, well, at least not universally
[01:40:21] <tos9> the only such tool so far is still pip
[01:40:39] <tos9> but I know generally in this conversation that's the thing being avoided :) -- not sure what to suggest
[01:42:17] <elibrokeit> well I also know the movement is going towards "pip is a great tool but we want to avoid locking people into it", as demonstrated by things like pep517.build which is now the actual backend which pip itself uses, and other such blessed PyPA modules like packaging, distlib, and so on
[01:42:40] <tos9> yep, definitely
[01:43:05] <elibrokeit> so I was wondering if maybe there's something I'm missing and it's actually possible to check "is this requirement string currently satisfied within the current env"
[01:43:55] <elibrokeit> and not do so as a side effect of "pip install it and if pip exits successfully, all is well" :D
[01:46:47] <tos9> so probably if you have a concrete set of requirements and want to check each one I imagine there is such a way to check, but I don't know where it lives
[01:46:56] <tos9> pip presumably goes off looking for .dist-info dirs I guess
[01:47:02] <tos9> so maybe that lives in some factored out place, dunno
[01:47:09] <tos9> probably walking through the pip source code is a good way to find out
[01:47:15] <tos9> look for the "Requirement already satisfied" messages
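For reference, the lookup being described mostly amounts to reading installed distribution metadata, which the stdlib exposes via `importlib.metadata` (it scans `*.dist-info` and `*.egg-info` directories on `sys.path`); a rough sketch, not pip's actual code:

    # Rough sketch (not pip's implementation): report whether a project is
    # installed, similar in spirit to pip's "Requirement already satisfied".
    from importlib import metadata

    name = "requests"  # example project name
    try:
        print(f"Requirement already satisfied: {name} {metadata.version(name)}")
    except metadata.PackageNotFoundError:
        print(f"{name} is not installed")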
[05:16:51] <McSinyx[m]> elibrokeit: you might want to check out https://discuss.python.org/t/moving-python-build-to-pypa/4390
[05:18:10] <elibrokeit> I'm aware of it and I think we could have one tool instead of two...
[06:42:32] <McSinyx[m]> I don't think pep517 is supposed to be a tool, but rather a proof of concept/mockup
[09:04:08] <pradyunsg> elibrokeit: well, pep517 wasn't supposed to become a CLI tool that exposes the build logic and all.
[09:04:33] <pradyunsg> so, the `pep517.build` command was added as a PoC.
[09:05:04] <pradyunsg> at least, afaik.
[11:17:17] <toad_polo> elibrokeit: Regardless of all the stuff other people said (only some of which I agree with), it is never true that the build backend requirements are "already installed"
[11:19:03] <toad_polo> pep517.build, as recommended by the PEPs, does all the builds in an isolated build environment.
[11:19:42] <toad_polo> Having something that sometimes uses your current build environment would really hurt reproducibility.
[13:52:37] <elibrokeit> toad_polo: try commenting out all the self.pip_install() calls and installing build-system['requires'] globally; pep517.build works perfectly fine. You're wrong, it isn't an isolated build environment: it merely installs an additional copy of the requirements and prepends this to PYTHONPATH; unstated dependencies which you happen to have installed will still cause just as many problems as they ever do
[13:53:04] <elibrokeit> I'm not a complete idiot :) there's a reason I care about this.
[13:53:57] <elibrokeit> pradyunsg: and yet, it is nearly but not entirely perfect :)
[13:55:55] <toad_polo> Huh, that should... definitely be fixed.
[13:56:07] <elibrokeit> there's nothing to fix, it's performing to spec
[13:56:31] <toad_polo> Well, other than being a terrible behavior and in direct contradiction of the normative requirements in the PEP.
[13:56:32] <elibrokeit> maybe you were thinking of venvs though
[13:56:46] <toad_polo> I was not.
[13:57:01] <toad_polo> I probably never noticed because I do everything in clean isolated environments anyway.
[13:57:13] <toad_polo> And I assumed it would have the recommended behavior.
[13:57:40] <toad_polo> In any case, it won't matter for long, we can shift all recommendations over to using `python-build`, which should have the desired behavior.
[13:58:58] <elibrokeit> I can assure you with the utmost exactitude that python-build is developed by a colleague of mine with the explicit intention of creating a tool which respects the global environment and doesn't install build deps on its own
[13:59:09] <elibrokeit> also, I think you should reread the PEP: https://www.python.org/dev/peps/pep-0517/#build-environment
[13:59:36] <toad_polo> elibrokeit: As a committer to `python-build`, I can assure you that uh.. that will not be the default behavior.
[13:59:41] <elibrokeit> > We do not require that any particular "virtual environment" mechanism be used; a build frontend might use virtualenv, or venv, or no special mechanism at all. But whatever mechanism is used MUST meet the following criteria:
[13:59:41] <elibrokeit> > All requirements specified by the project's build-requirements must be available for import from Python. In particular:
[13:59:41] <elibrokeit> [..]
[14:00:01] <toad_polo> Yeah, look in the "normative recommendations" section.
[14:00:12] <toad_polo> I didn't say it violates the spec, I said it violates the recommendations
[14:00:39] <elibrokeit> > in direct contradiction of the normative requirements
[14:00:57] <elibrokeit> and now you're recanting, and saying it violates the "non-normative recommendations"
[14:01:02] <elibrokeit> not normative, non-normative
[14:01:10] <toad_polo> Sorry, I mistyped that.
[14:01:26] <toad_polo> OK, whatever.
[14:01:46] <elibrokeit> well, I was understandably confused, since you first referred to "normative" "requirements"
[14:02:03] <elibrokeit> and it turned out to be non-normative and not requirements, but merely recommendations :)
[14:02:27] <toad_polo> Regardless, YOLO-ing the requirements from whatever random environment you execute it from is the wrong behavior.
[14:02:38] <toad_polo> For a user-facing tool of that nature.
[14:02:39] <elibrokeit> it's... not?
[14:02:48] <toad_polo> It definitely is.
[14:03:28] <toad_polo> Otherwise you wouldn't have a separate makedepends and optdepends in PKGBUILD.
[14:03:34] <elibrokeit> pep517 correctly guarantees that any requirements are met in the build environment, which overrides the random environment you execute it from
[14:04:30] <toad_polo> It's a flagrant violation of the spirit of PEP 518 to not use an isolated build environment by default.
[14:05:26] <elibrokeit> it's not guaranteed to *also* be a fuzzing tool which detects errors in your build-system requires
[14:05:47] <elibrokeit> and what is this about PKGBUILD optdepends o_0
[14:06:54] <toad_polo> I don't see how it would be a "fuzzing tool"
[14:08:02] <toad_polo> It's just not a good default behavior, which is similarly why many package managers have a clear separation of build time dependencies and runtime dependencies.
[14:08:24] <elibrokeit> I assure you as the upstream comaintainer of pacman/makepkg and the PKGBUILD spec, optdepends aren't about any sort of clean builds at all
[14:08:49] <toad_polo> OK, well this has been fun.
[14:08:52] <elibrokeit> in fact, makepkg considers all runtime dependencies to be build-time dependencies too
[14:10:36] <toad_polo> Hm, that is a strange decision.
[14:11:00] <elibrokeit> well, imagine you depend on libfoo, libbar, libbaz
[14:11:01] <toad_polo> I used to be such a fan of Arch Linux packaging, but the more I learn about it the more disappointed I am. ☹
[14:11:05] <toad_polo> Never meet your heros I guess.
[14:11:18] <elibrokeit> in order to *build* your thing, you need them installed to link to them
[14:11:27] <elibrokeit> in order to run it, you *also* need them installed
[14:11:50] <elibrokeit> you can do runtime-only dependencies, by moving the dependency specifiers into the package() function
[14:11:55] <toad_polo> That's not usually true in Python.
[14:13:25] <elibrokeit> that's the case for building sdist/wheel, but not the case for running tests, so running tests in a PKGBUILD would still require you to install all runtime dependencies so you could exercise the code and its tests :)
[14:13:55] <elibrokeit> (and it's also not the case if setup.py imports the project, and thus all runtime deps. RIP)
[14:15:24] <elibrokeit> but all this is neither here nor there. Having additional packages in the build environment isn't directly harmful. The only danger it can have is "I didn't realize my build-system requires was missing XXXX, because I had it installed already"
[14:16:31] <elibrokeit> makepkg can be enhanced with an additional layer to discover such cases: build in clean chroot containers using makechrootpkg. Essentially venv for operating systems.
[14:17:32] <elibrokeit> but that doesn't mean it's wrong to build without this layer... if it works in the isolated build environment, it should always also work in a working environment
[14:18:03] <toad_polo> This would really hurt reproducibility.
[14:18:58] <elibrokeit> How? It produces the same artifacts no matter what.
[14:19:40] <toad_polo> It would mean people have to take extra steps to make their builds reproducible in many cases — a lot of code has opportunistic dependencies, for example, even if the build doesn't just straight up break when installed the wrong way.
[14:20:12] <toad_polo> Which could lead to different artifacts being produced if you have whatever random version of a setuptools plugin installed or something.
[14:20:13] <elibrokeit> opportunistic dependencies in python code? Intriguing...
[14:21:33] <toad_polo> The most common failure mode will be people who install stuff and fail to declare it, then make manual releases in such a way that `pip` will fail to install them from `sdist`. If you release a wheel you may not even notice that your `sdist` is broken for a while.
[14:21:55] <toad_polo> That alone makes it a bad default.
[14:23:01] <elibrokeit> that's not an opportunistic dependency, that's just missing requires
[14:23:12] <elibrokeit> I thought you were going to tell me python suffered from https://wiki.gentoo.org/wiki/Project:Quality_Assurance/Automagic_dependencies
[14:23:35] <toad_polo> It does.
[14:23:55] <toad_polo> It's very common, it's just going to be less common than "An error that could have been caught at build time is instead caught at runtime".
[14:24:03] <elibrokeit> (an autotools problem which is solved by specifying all optional buildtime features as requirements, or using USE flags that explicitly --enable-* or --disable-* every feature)
[14:30:18] <elibrokeit> I cannot specifically recall having seen opportunistic dependencies in action in the python packaging ecosystem, but perhaps this could benefit from something like mesonbuild's --auto-features=[auto|disabled|enabled] which allows you to elevate all opportunistic dependencies to being completely disabled by default, or else enabled as hard requirements
[14:30:36] <elibrokeit> but python packaging doesn't even have the concept of --enable-foo though, so really, how does this work o_0
[14:30:43] <toad_polo> There's no way to do this sort of thing.
[14:31:14] <toad_polo> Stuff can install random stuff in `site.py` or do any sort of magical bullshit to completely change the way that Python works.
[14:32:15] <toad_polo> And that's not including the fact of build-backend-specific plugin systems that can install hooks, or something in your dependencies that is trying to be helpful with a `try / except ImportError` clause.
[14:33:40] <toad_polo> The safest thing to do is to default to actual isolated builds, where the environment only contains what you've told it to contain.
[14:48:21] <elibrokeit> try/except ImportError in the build backend or in runtime dependencies??? In the latter, I don't see how that changes the way sdist/wheel generation works. In the former, I'm... not sure what that means.
[14:48:21] <elibrokeit> stuff which does "magical bullshit to completely change the way that python works in site.py" is bad, but if someone is screwing up their wheel generation on that level I wouldn't trust them to refrain from also screwing up the tool which creates the isolated env. It seems like more of an academic concern.
[14:49:02] <toad_polo> Regardless it doesn't matter.
[14:49:36] <toad_polo> Proper build isolation is a good idea, and I don't think we should recommend any tool that doesn't do it by default.
[14:49:43] <toad_polo> I doubt I'll get much pushback on that.
[15:11:57] <elibrokeit> ¯\_(ツ)_/¯ that's not currently the case, so I thought it would be nice if pep517 had fewer sharp edges in the process. I still think there's value in making isolation an enhancement layer instead of something you cannot disentangle from the underlying build tool.
[15:21:27] <toad_polo> The current plan for `python-build` is to isolate by default and have a `--no-isolation` mode.
[15:22:16] <toad_polo> There are reasonable use cases for disabling build isolation, but it's generally preferable not to do so.
[15:22:45] <toad_polo> Or, put another way, if you don't know whether or not you want an isolated build, you want an isolated build.
[15:24:20] <toad_polo> I think originally `python-build` was going to only cater to that use case, but by expanding the scope it will reach a lot more people and also solve a serious missing feature from the current PEP 517 landscape.
[15:56:08] <pradyunsg> elibrokeit: have you seen https://discuss.python.org/t/building-distributions-and-drawing-the-platypus/2062?
[15:56:39] <pradyunsg> FWIW, the Python-build idea is basically to do the "independent tool" approach for the build systems problem we have.
[15:57:39] <pradyunsg> Both use cases are important, and I like the idea of having a single tool for this task.
[17:13:25] <elibrokeit> pradyunsg: an interesting read...
[17:15:18] <elibrokeit> I'd personally be in favor of a bunch of separate tools grouped together under one meta-tool like git
[17:16:44] <elibrokeit> ... anyway, I'm still curious about the initial thought which started this whole discussion. Regardless of whichever tool you're going to use to set up environments, it would be kind of nice to have a tool that checks if your environment currently matches some requirements. But there doesn't seem to be anything, really.
[17:19:59] <elibrokeit> it could be useful to verify that a --no-isolate environment is usable for the build, it could also be usable for things like verifying that a package isn't broken due to incompatible upgrades, and just generally seems like a missing part of the "inspect my current environment" landscape
[17:36:37] <pradyunsg> 🤷🏻‍♂️
[20:30:29] <FFY00> elibrokeit, that should be easily done with https://github.com/pypa/packaging
[20:30:46] <FFY00> + importlib.metadata
[20:31:12] <FFY00> https://github.com/FFY00/python-build/blob/master/build/__init__.py#L36
[20:32:38] <FFY00> but if you want to use it for pep517.build, I think that will be removed soon
[20:34:40] <FFY00> https://github.com/pypa/pep517/pull/83
[20:35:42] <FFY00> anyway, build.check_version is public API, you can use it if you want
[20:36:04] <FFY00> I should rename it check_requirement tho
[22:02:28] <elibrokeit> FFY00: easily done, you say?
[22:02:30] <elibrokeit> for extra in req.extras:
[22:02:30] <elibrokeit> if extra not in (metadata.get_all('Provides-Extra') or []):
[22:02:30] <elibrokeit> return False
[22:03:19] <FFY00> yes, that's if you need extras
[22:03:34] <elibrokeit> but provides-extra doesn't say whether it is currently satisfied by the distribution object's own dependencies
[22:04:03] <FFY00> what do you mean?
[22:04:58] <elibrokeit> for example, requests[security] depends on pyopenssl and cryptography
[22:05:48] <elibrokeit> r = importlib.metadata.distribution('requests')
[22:05:48] <elibrokeit> r.requires -> contains ['pyOpenSSL>=0.14; extra == "security"', 'cryptography>=1.3.4; extra == "security"']
[22:05:57] <elibrokeit> but I don't have pyopenssl installed
[22:06:50] <elibrokeit> >>> r.metadata.get_all('Provides-Extra')
[22:06:50] <elibrokeit> ['security', 'socks']
[22:07:01] <elibrokeit> for that matter, I don't have pysocks installed either
[22:07:46] <elibrokeit> so importlib.metadata only shows you that there is a key called provides-extra: security
[22:08:16] <elibrokeit> which means that requests has a known extra, called "security"... but not that requests[security] is installed
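To make that limitation concrete, here is a hedged sketch of checking whether `requests[security]` is actually satisfied by walking the distribution's own requirements and evaluating their markers with `extra == "security"`, rather than relying on `Provides-Extra`. It assumes `packaging` and `importlib.metadata`, checks versions only, does not recurse into sub-dependencies, and is not the code from python-build:

    # Sketch: is dist_name[extra] satisfied in the current environment?
    # Walks the distribution's requirements and evaluates markers with the
    # given extra; version checks only, no recursion into sub-dependencies.
    from importlib import metadata
    from packaging.requirements import Requirement

    def extra_is_satisfied(dist_name: str, extra: str) -> bool:
        dist = metadata.distribution(dist_name)
        for req_string in dist.requires or []:
            req = Requirement(req_string)
            # Skip requirements whose marker does not apply to this extra
            # (marker.evaluate() merges in the default environment).
            if req.marker is not None and not req.marker.evaluate({"extra": extra}):
                continue
            try:
                installed = metadata.version(req.name)
            except metadata.PackageNotFoundError:
                return False
            if not req.specifier.contains(installed, prereleases=True):
                return False
        return True

    print(extra_is_satisfied("requests", "security"))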
[22:10:22] <FFY00> okay, we need to read requires.txt then
[22:10:35] <FFY00> I am not sure if this is the same for wheels
[22:10:40] <FFY00> or just eggs
[22:13:11] <FFY00> just eggs
[22:13:23] <FFY00> wheels have this in METADATA
[22:13:31] <FFY00> Requires-Dist: pyOpenSSL (>=0.14) ; extra == 'security'
[22:13:38] <FFY00> Requires-Dist: cryptography (>=1.3.4) ; extra == 'security'
[22:14:13] <FFY00> if you want you can open a PR in python-build, otherwise I'll do it later
[22:15:25] <FFY00> but I am thinking a wrapper for this should probably be added to pypa/packaging
[22:15:42] <FFY00> pradyunsg, what do you think?
[22:16:52] <elibrokeit> Requires-Dist is not reliable, the version in pacman for example doesn't have it
[22:17:10] <FFY00> I don't follow
[22:17:31] <elibrokeit> r = importlib.metadata.distribution('requests')
[22:17:44] <elibrokeit> r.metadata -> reads PKG-INFO, the information may not be there
[22:17:55] <FFY00> python-requests in arch is an eggs, it doesn't have Requires-Dist
[22:18:00] <elibrokeit> r.requires -> reads requires.txt, the information is there
[22:18:08] <FFY00> for eggs you need to read requires.txt
[22:18:16] <FFY00> wheels have Requires-Dist
[22:18:42] <FFY00> *python-requests in arch is an egg
[22:20:08] <elibrokeit> yes, I'm just saying... why are you using .metadata.get_all() for this instead of .requires
[22:20:40] <elibrokeit> since the latter provides a unified way of getting the information across both eggs and wheels
[22:33:12] <FFY00> I just confirmed that: requires works in both eggs and wheels
[22:33:16] <FFY00> we should use it instead
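A quick way to see the point above: `Distribution.requires` in `importlib.metadata` yields requirement strings whether the metadata came from a wheel's `METADATA` (`Requires-Dist`) or an egg-info's `requires.txt`:

    # Distribution.requires gives requirement strings for both .dist-info
    # and .egg-info installs, so a checker can rely on it alone.
    from importlib import metadata

    dist = metadata.distribution("requests")
    for req_string in dist.requires or []:
        print(req_string)  # e.g. pyOpenSSL>=0.14; extra == "security"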
[22:39:11] <toad_polo> Can you fix the fact that it's an egg? That seems like it should be a priority...
[22:48:36] <FFY00> what do you mean?
[23:01:01] <elibrokeit> toad_polo: that's not a priority, it's a completely unrelated topic
[23:01:01] <elibrokeit> But since you asked, the entire problem here is that there's no way to install a wheel except with pip, and hence the "state of the art" in Linux distributions is to use python setup.py install. This produces eggs, because that's what the setuptools project, and distutils before it, determined would happen.
[23:01:01] <elibrokeit> This will be fixed precisely once PyPA recommended tooling can cope with the needs of "install a wheel without using pip", and... that's exactly how we got here and ended up discussing python-build and pradyunsg/installer
[23:02:22] <elibrokeit> I would be more than delighted to give a big huge push to one particular Linux distro to ensure everything is a dist-info. Just as soon as a tool can do it. :) :) :)