[02:20:42] <toad_polo> gp: What does "use a token" mean in this context?
[02:21:00] <toad_polo> Oh, to authenticate to gitlab?
[06:33:04] <agronholm> about the newly coming pip dependency resolver: is it intended to take already installed packages into account?
[06:33:28] <agronholm> I remember back in the day when this detail was heavily debated because it would result in a lot of I/O
[14:42:39] <toad_polo> agronholm: I don't know for sure but I think so. I know that keeping track of things like installed "extras" is on the road map.
[14:42:55] <toad_polo> And I think that that is related to the solver.
[15:44:11] <techalchemy> toad_polo, I think we still need to write a PEP for extras
[15:44:55] <techalchemy> agronholm, detecting already installed packages is fairly straightforward, pip already does that when it gives you warnings about conflicts
[15:45:18] <techalchemy> I wasn't around for the last debates but I assume everyone has determined the IO cost is worth it
[16:32:05] <gp> I am trying to reuse the requirements listed in requirements.txt for my setup.py. The requirements.txt uses -r to point to individual app requirements. Is there a method exposed in the API to return a list of requirements given a requirements file?
[16:39:54] <gp> or if you could point me towards a 3rd party tool that strips egg names out of requirements.txt that would be awesome. couldn't find one
[16:54:04] <toad_polo> gp: It is preferable not to do that.
[16:54:54] <toad_polo> Requirements.txt and install_requires serve two different purposes.
[16:56:29] <gp> toad_polo: hrm. If my requirements.txt lists https://gitlab.some.long.link.com/foo/project/archive.tar.gz?ref=master#egg=foo
[16:56:29] <gp> why is it bad to parse the requirements.txt so that install_requires is ['foo']?
[17:04:55] <mcepl> I know it is not exactly the right channel to ask, but could somebody suggest a better channel to discuss poetry, particularly how to build it without using poetry itself (a problem of reproducibility and bootstrapping)?
[17:08:15] <tos9> gp: There is no such method because pip has no such public API -- requirements.txt is not a public format (not public to other tooling, that is; it's of course public to its own users), it's pip's internal format for its command line
[17:08:43] <tos9> gp: But one cannot be generated from the other, because as toad_polo mentioned they have different purposes
[17:09:00] <tos9> gp: https://caremad.io/posts/2013/07/setup-vs-requirement/ is one post explaining the two purposes
[17:10:30] <tos9> gp: you *can* use install_requires to generate a requirements.txt, or in your case since you have a git repo, install_requires + a requirements.in file, and the 2 of those can generate you a requirements.txt
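A sketch of that two-file workflow, assuming pip-tools (tos9 doesn't name a tool) and a pip-compile version that understands local editables; requirements.in holds the unpinned inputs and pip-compile writes the pinned requirements.txt:

    # requirements.in -- unpinned top-level inputs
    -e .   # pulls install_requires from the project itself
    https://gitlab.some.long.link.com/foo/project/archive.tar.gz?ref=master#egg=foo

    $ pip-compile requirements.in --output-file requirements.txt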
[17:13:46] <gp> tos9: I've read that. It seems like extracting all "egg" names from a requirements.txt (one that links to others with -r) is what the article recommends? There being no public API makes that impossible, but it seems like it would be the ideal way to do it
[17:14:37] <gp> the nice thing about requirements.txt is that it can be broken down into separate parts so you know which parts of the app require what. bundling all requirements into one list doesn't offer that level of detail
[17:14:43] <tos9> gp: where in the article does it recommend that
[17:15:26] <tos9> gp: what part of requirements allows you to do that? (breaking down the app into separate parts)
[17:15:42] <tos9> gp: nowhere in the article does it recommend parsing that from a requirements.txt
[17:16:04] <tos9> gp: the key point of the article is "you must have at least two places, though there are ways to share some information between them"
[17:17:56] <toad_polo> Yeah, generally if you want them to share information, you use `install_requires=...` and then generate a requirements.txt from there.
[17:18:31] <gp> tos9: a requirements file like this allows you to associate which parts of the app need what - when the apps are unusable without each other but it is helpful to know what they use - it just reads well https://dpaste.de/bqCK
[17:18:38] <toad_polo> Though sometimes / usually a better option is to have a `constraints.txt`, and call `pip install -c constraints.txt .`
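For illustration (pins invented), a constraints file only caps versions; it never installs anything by itself, so it can safely list more than the project actually needs:

    # constraints.txt -- version pins only, installs nothing on its own
    requests==2.22.0
    urllib3==1.25.3

    $ pip install -c constraints.txt .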
[17:18:57] <tos9> gp: install_requires supports the same kind of thing
[17:19:06] <tos9> gp: they're called setuptools extras
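A minimal sketch of setuptools extras; the package and group names here are made up, but each group plays the role of one of the nested requirements files:

    from setuptools import setup

    setup(
        name="myapp",
        version="1.0",
        install_requires=["requests"],  # needed by every part of the app
        extras_require={
            # one group per subcomponent, keeping the per-part context
            "api": ["flask"],
            "worker": ["celery"],
        },
    )

Installing with `pip install ".[api,worker]"` then pulls in the chosen groups.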
[17:21:58] <gp> tos9: thanks. refactoring to be installable via setup.py. Can make that work but I guess I just like the requirements.txt layout.
[17:22:05] <gp> appreciate the right direction tho
[17:23:24] <tos9> gp: it's certainly a common thing (though personally I don't really get why, I think requirements.txt is horrid and can't wait for it to be replaced)
[17:23:35] <tos9> gp: but yeah you're very much fighting the grain when you try to parse them back and forth
[17:23:43] <tos9> a common thing to *want* I meant
[17:26:11] <gp> tos9: it's more explicit and anyone can read it and understand it. the sole purpose is for requirements. setup.py gets ugly and then someone thinks it's awesome to write convoluted functions inside the setup.py for compatibility checks. Anyways, rambling =)
[17:36:14] <tos9> gp: who said anything about setup.py :)
[17:36:57] <tos9> gp: (setup.cfg is how you should package a project in 2019 if you are going to use setuptools, which is what I still use/recommend for now)
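The same declaration moved into declarative setup.cfg, roughly (a sketch reusing the made-up names from above; in 2019 a stub setup.py that just calls setuptools.setup() is still needed alongside it):

    [metadata]
    name = myapp
    version = 1.0

    [options]
    install_requires =
        requests

    [options.extras_require]
    api =
        flask
    worker =
        celery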
[18:24:51] <techalchemy> requirements.txt is bad because it is used in multiple ways for multiple things, one of which is an unconstrained, unpinned set of top-level dependencies, which tends to be kind of useless, since with a requirements.txt you're usually deploying an app on the web or whatever
[18:26:18] <techalchemy> that means you're likely making adjustments and deploying to a virtualenv, and with an unconstrained, unpinned list you'll never know what is actually required to run your app; you might accidentally use something in your code that was just incidentally installed but isn't listed, then you wind up having to track down all of the transitive dependencies
[18:26:41] <techalchemy> people tend not to update that kind of a file because it's scary / hard / has unintended consequences due to the lack of dependency conflict resolution
[18:27:24] <techalchemy> but mainly it's just not useful on its own
[18:28:10] <techalchemy> and the alternative is the transitive closure, aka 'lockfile', type of requirements file, which has strict pins / versions for each dependency and is a snapshot in time -- great if you just want to install exactly what you had in the first place
[18:29:02] <techalchemy> not so great if you ever want to update or install something new since it's basically impossible to figure out by hand which of the pinned dependencies will be impacted and what they now need to be updated to, etc etc
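To make the contrast concrete (all versions invented for the example), the two styles of requirements.txt look like this:

    # style 1: unpinned top-level deps -- readable, not reproducible
    requests

    # style 2: transitive closure with strict pins -- reproducible,
    # but hard to update by hand
    certifi==2019.6.16
    chardet==3.0.4
    idna==2.8
    requests==2.22.0
    urllib3==1.25.3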
[18:40:22] <gp> techalchemy: thank you for the detailed response. makes sense
[18:42:45] <gp> tos9: It doesn't look like extras_require is analogous to my nested requirements.txt, because it doesn't seem like you can reference a group as both "required" and listed as an "extra". For example, package foo's install_requires doesn't seem to be able to list 'foo[sub1]'. That has to be listed by the project trying to install foo =/
[18:43:11] <gp> but a dirty hack building the install requires by list concat is the lesser of two evils i guess
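The concatenation hack gp describes, sketched with hypothetical subcomponent names -- the grouping survives in setup.py even though pip only ever sees the flat list:

    from setuptools import setup

    # per-subcomponent lists keep the context that the nested
    # requirements files used to provide
    CORE = ["requests"]
    API = ["flask"]
    WORKER = ["celery"]

    setup(
        name="myapp",
        version="1.0",
        install_requires=CORE + API + WORKER,
    )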
[18:45:50] <techalchemy> gp, there is no good solution to requiring a lot of things
[18:46:57] <techalchemy> depending on what your software does (is it a deployable thing you share at a company/small group or a reusable set of utilities for example)
[18:48:24] <gp> trying to clean up the final "deployable"
[18:48:29] <techalchemy> i typically try to reconsider what is actually 'required' for the core functionality of the software in question when I start getting large lists of dependencies in install_requires, then i move the rest into extras so people can decide whether to install the other bits, and add error handling for them
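The "add error handling" part might look like this sketch, guarding a hypothetical 'reports' extra at import time:

    try:
        import pandas  # only available when installed as myapp[reports]
    except ImportError:
        pandas = None

    def make_report(rows):
        if pandas is None:
            raise RuntimeError(
                "report support is not installed; "
                "run: pip install 'myapp[reports]'"
            )
        return pandas.DataFrame(rows)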
[18:49:02] <techalchemy> for a deployable thing it's less important because you can still just `pip install -e .[extras1,extras2]` and then `pip freeze > requirements.txt`
[18:50:54] <techalchemy> then you have both things I mentioned earlier -- the 'what did I need to run this in the first place/how did I generate the environment' -- so you can regenerate it anytime via `pip install -e .[extras1,extras2]` -- and an answer to 'what is actually installed here' from the `pip freeze`
[18:51:19] <techalchemy> that way when you finish testing you can use the requirements.txt for the production deployment
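Put together, the deployable workflow techalchemy describes (the extras names are placeholders):

    $ pip install -e ".[extras1,extras2]"  # regenerate the env from setup.py
    $ pip freeze > requirements.txt        # snapshot what actually got installed
    # (pip freeze also emits a -e line for the project itself)
    # production then installs the exact snapshot:
    $ pip install -r requirements.txt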
[18:54:30] <gp> techalchemy: we have that step in CI. I am struggling with keeping track of which subcomponent each requirement is for, so when things change it is easier to know what is or isn't still required -- at the review step, more than actually having a final deployable that runs properly
[18:54:53] <gp> breaking it all out, but the list is confusing without the context of where each requirement comes from
[18:55:14] <gp> but I think it's just the "state of things", as you sorta said earlier, with no good solution yet
[18:55:59] <techalchemy> gp, well I maintain pipenv which is dramatically overdue for a release (I am working on it) but it does have this functionality built in
[18:56:50] <gp> techalchemy: sweet - i'll check it out! haven't heard of it
[18:57:15] <techalchemy> it does dependency resolution etc
[18:57:29] <techalchemy> but `pipenv graph` shows all of the top-level dependencies and what they depend on
[18:58:36] <techalchemy> graph --reverse shows what brings each dependency in (i.e. how it got there) -- so starting from the children in that case and traversing up
[18:58:56] <gp> techalchemy: wow I've been missing out. You've made my afternoon haha
[18:58:59] <techalchemy> ah 'pipdeptree' is the library we are using
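pipdeptree also works standalone in any environment; a sketch of the two directions described above:

    $ pip install pipdeptree
    $ pipdeptree            # top-level packages down to what they require
    $ pipdeptree --reverse  # bottom-up: shows what pulls each package in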
[18:59:45] <techalchemy> hopefully it actually helps, python dependency management is a bit messy obviously :p