#pypa-dev logs for Tuesday the 21st of April, 2015

[00:04:56] <lifeless> dstufft: would you like me to start breaking out the lower non-contentious bits of this blacklist patch?
[00:30:42] <lifeless> ok I think we're good to review
[00:30:43] <lifeless> green
[00:30:44] <lifeless> yay
[03:57:29] <lifeless> qwcode: https://github.com/pypa/pip/pull/2708 failed on a network glitch on one job, rest is ok
[04:00:03] <lifeless> 2709 is green
[04:01:12] <lifeless> 2710 network glitch
[05:54:51] <r1chardj0n3s> nice blog post dstufft
[05:55:19] <r1chardj0n3s> and of course sigmavirus24_awa will love that final graph showing the share of python 2.6 downloading requests is *increasing* (WTF)
[06:32:35] <ronny> oh, url?
[06:37:17] <ronny> ah, i see, indeed
[08:36:06] <lifeless> r1chardj0n3s: indeed, brilliant :)
[11:51:43] <ronny> r1chardj0n3s: newer things probably use local caches more, thus it gets screwed
[14:06:06] <sigmavirus24> r1chardj0n3s: yeah that really really bothers me
[19:13:48] <lifeless> o/
[19:49:13] <lifeless> dstufft: had a chance to look at my patchset ?
[20:54:28] <lifeless> hmm
[20:54:31] <lifeless> travis on go-slow
[21:06:14] <dstufft> lifeless: not yet, my sleep is all kinds of screwed up past couple of days :(
[21:06:20] <dstufft> more so than normal
[21:07:01] <lifeless> dstufft: :(
[21:07:19] <lifeless> dstufft: travis seems to be off in lalalala land anyhow :/
[21:08:03] <lifeless> https://github.com/pypa/pip/pull/2708 then 2709 and 2710 then finally 2699
[21:08:10] <lifeless> are *in principle* all ready
[21:08:23] <lifeless> dstufft: got any advice for the next release-blocker I should tackle?
[21:11:31] <dstufft> lifeless: I think with https://github.com/pypa/pip/issues/2677 we should probably have --install-option and --global-option disable building/using wheels (using your new mechanism makes sense I think), and I'll open another ticket up to just remove those options altogether because I don't think they completely make sense in the new world of building wheels for all installs
[21:12:25] <lifeless> double checking: disable *using* all wheels?
[21:13:14] <lifeless> For the wheel command as well? e.g. pip wheel --install-option=xxx foo # this would not use wheels for deps, just sdists ?
[21:14:46] <dstufft> I think that --install-option applies to everything being installed and right now we just silently ignore it if we're installing from a wheel
[21:14:58] <dstufft> so i think it makes sense to fix that silently ignoring bit
[21:16:38] <dstufft> pip wheel doesn't support --install-option, it does have --build-option which functions similarly, and hrm, it should probably imply that too
[21:17:11] <dstufft> and a non-blocker issue should be opened to sort out the entire mess between those 3 options and figure out how to get rid of them
[21:20:43] <lifeless> ok so
[21:21:14] <lifeless> let me summarise for clarity: --install-option, --global-option, and --build-option will all disable the consumption of wheels, and any autobuilding of wheels.
[21:21:20] <lifeless> across all commands
[21:21:28] <lifeless> [that have any of those options]
[21:23:19] <dstufft> lifeless: yea, that sounds right to me, kinda sucks but those options in general kinda suck
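A minimal sketch of the rule being summarised here, assuming a hypothetical helper name rather than actual pip internals:

    # Hypothetical helper (not pip code): the presence of any setup.py-style
    # option means wheels are neither consumed nor auto-built for the install.
    def wheels_allowed(install_options, global_options, build_options):
        return not (install_options or global_options or build_options)

    # wheels_allowed([], [], [])                -> True
    # wheels_allowed(["--prefix=/opt"], [], []) -> False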
[21:23:59] <lifeless> I'll do that in the issue about it not being honoured
[21:27:44] <lifeless> ugh, the accepting of those options in requirements files makes this uglier
[21:28:36] <lifeless> yay travis is back
[21:29:00] <lifeless> https://github.com/pypa/pip/pull/2708 should be green soon
[21:29:27] <lifeless> there's one typo in a docstring in it that a later commit fixes, I'm going to ignore that
[21:41:13] <dstufft> lifeless: is there any order I need to look at your 3 PRs in or is any order fine
[21:53:57] <lifeless> dstufft: there are four :)
[21:54:11] <lifeless> dstufft: order is 2708 2709 2710 2699
[21:55:31] <lifeless> dstufft: (if only to avoid seeing the contents of one in the next one)
[21:55:53] <dstufft> lifeless: ok
[21:59:01] <qwcode> dstufft, what would the longer term plan for install option support for wheels be? new options? that don't come from setuptools?
[22:00:31] <jdunck> I'm presently wishing req_file.parse_requirements had decoupled the file open from the parsing - and I find other sorts of inconvenient coupling in various places in the code. (Another example is operations.freeze doing parse-y stuff as well.) How do folks view internal refactorings?
[22:00:40] <jdunck> (I haven't contributed to pypa before.)
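A rough sketch of the decoupling jdunck is describing, with hypothetical function names rather than pip's actual req_file API:

    # Hypothetical refactoring sketch: keep file I/O separate from line-level
    # parsing so the parser can be fed lines from any source (file, string, test).
    def parse_requirement_lines(lines):
        reqs = []
        for line in lines:
            line = line.strip()
            if line and not line.startswith("#"):
                reqs.append(line)
        return reqs

    def parse_requirements_file(path):
        with open(path) as fh:
            return parse_requirement_lines(fh)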
[22:01:34] <jdunck> As a general rule, is any of the non-CLI API stable?
[22:02:38] <lifeless> jdunck: it's not
[22:02:45] <lifeless> jdunck: folk do poke in it
[22:02:55] <lifeless> jdunck: and we do put shims in to help where we know
[22:03:02] <lifeless> but it's not a fixed API today
[22:03:19] <jdunck> I'm working on a thing that I'd hope eventually would be part of pip, but for now I'm developing separately.
[22:03:31] <dstufft> qwcode: writing a comment
[22:03:31] <jdunck> And finding it rather couple-y.
[22:03:35] <dstufft> qwcode: on the thread
[22:03:39] <qwcode> ok
[22:04:10] <jdunck> Basically I'm trying to remove these warts from pip-freeze:
[22:04:18] <jdunck> # * includes -r lines in the output of pip-freeze, which is redundant
[22:04:18] <jdunck> # because it also includes the packages that were specified in the -r
[22:04:18] <jdunck> # * translates archive URLs to (incorrect) requirements specifiers
[22:04:18] <jdunck> # * Leaves requirements file in semi-random, unstable order.
[22:04:36] <jdunck> (sorry for a bit of paste spam. I'm done :P)
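A small, hypothetical post-processing sketch for the warts listed above (not part of pip): drop "-r" lines and sort the remainder for a stable order; the archive-URL wart is not addressed here.

    import subprocess

    def stable_freeze():
        out = subprocess.run(["pip", "freeze"], capture_output=True,
                             text=True, check=True).stdout
        lines = [line for line in out.splitlines()
                 if line and not line.startswith("-r ")]
        return sorted(lines, key=str.lower)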
[22:04:50] <lifeless> jdunck: what thing ?
[22:05:04] <jdunck> wart removal &
[22:05:09] <dstufft> jdunck: refactors are totally fine fwiw
[22:05:10] <jdunck> err, ^
[22:05:15] <lifeless> jdunck: no, I mean, what's the thing you're working on
[22:05:35] <jdunck> What I'm trying to get to is roughly gemfile.lock
[22:05:38] <dstufft> we don't have a public API so we move things around as we see fit, sometimes with some compat shim based on how gross the shim is
[22:05:49] <jdunck> I have a hacky thing already, but pip-freeze has warts that require post-processing.
[22:11:42] <jdunck> dstufft: is it fair to just do a rough PR and get specific feedback on how to better shim it?
[22:12:35] <dstufft> jdunck: sure
[22:18:54] <qwcode> dstufft, thanks for the response. I responded back. I'm ok with the current plan, just wanted clarity and to get my 2c in.
[22:19:11] <qwcode> dstufft, one other thing, if you have the head space.... with all the talk of SAT solving, it seems the quicker win is not being talked about... that is to start ANDing sub-requirements, and detecting conflicts when ANDing
[22:20:03] <dstufft> qwcode: I thought about that, the problem is dealing with backtracking
[22:20:15] <dstufft> e.g.
[22:24:27] <dstufft> qwcode: when in the process of resolving deps, we download X version Y and add its dependencies, and we continue parsing the requirements, and we end up adding a specifier for X that excludes version Y (but doesn't exclude other valid versions)
[22:24:48] <dstufft> we need to undo all of the additional dependencies we added from X version Y because it's no longer a valid satisfier
[22:25:03] <dstufft> so we have to backtrack and resolve that part of the dep tree
[22:25:17] <dstufft> otherwise we generate conflict errors when we otherwise could have worked
[22:26:18] <qwcode> excludes version Y, meaning the dep for Y is gone, or conflicts?
[22:26:39] <qwcode> sorry, misspoke
[22:27:44] <qwcode> sorry, can't seem to parse "excludes version Y (but doesn't exclude other valid versions)"
[22:28:54] <dstufft> qwcode: say the first specifier is just "X", we resolve that to X 1.0 and download it and discover dependencies, later on we AND in another specifier that says X<1 which excludes X 1.0, but the results of ``X`` & ``X<1`` is satisfiable, we'd just need to install 0.9 or something instead
[22:30:19] <dstufft> qwcode: we need to backtrack (or undo) the things we discovered because we originally downloaded X 1.0 because now that we've added X<1 X 1.0 is no longer valid (and thus anything we added to the specifier list from it is no longer valid)
[22:31:05] <qwcode> yea, ok
[22:31:07] <dstufft> we _can_ just ignore that problem, (we're already ignoring it today) but we switch the error case from "we might install versions that don't actually satisfy the constraints" to "we might claim there is nothing that satisfies the constraints when there are in fact things that do"
[22:31:36] <dstufft> once you start dealing with backtracking you're already most of the way there to a brute force SAT solver
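An illustration of the scenario dstufft describes, using packaging's real SpecifierSet API (the backtracking itself is left out):

    from packaging.specifiers import SpecifierSet

    spec = SpecifierSet("")        # first requirement is just "X"; we pick X 1.0
    # ... X 1.0 is downloaded and its dependencies are recorded ...
    spec &= SpecifierSet("<1")     # later, another requirement ANDs in X<1
    print(spec.contains("1.0"))    # False: X 1.0 (and what it pulled in) must be undone
    print(spec.contains("0.9"))    # True: the combined set is still satisfiable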
[22:35:27] <jdunck> dumb q: what's dependency_links.txt do?
[22:35:37] <jdunck> I have a bunch of them, mostly empty
[22:36:40] <qwcode> it's installation metadata. it holds setuptools "dependency links" if there are any. you can search the setuptools docs for it
[22:36:40] <ronny> jdunck: it's an egg-info detail storing the content of the setup argument dependency_links
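For context, dependency_links.txt is generated from the setuptools dependency_links keyword; an illustrative (not real) setup.py:

    from setuptools import setup

    setup(
        name="example",
        version="0.1",
        install_requires=["somepkg"],
        # this list is what ends up in the egg-info's dependency_links.txt;
        # the file is empty when the argument is unused
        dependency_links=["https://example.com/somepkg-1.0.tar.gz#egg=somepkg-1.0"],
    )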
[22:37:42] <jdunck> gotcha, thanks
[22:37:51] <ronny> dstufft: any opinion on having a user/global repo of unpacked wheels and a 'virtualenv' replacement that picks some of those via an optimized meta import hook for toplevels?
[22:38:52] <dstufft> ronny: it's an interesting idea, I'd need to experiment with an implementation of it to have a more fully formed opinion about the viability of it
[22:39:54] <ronny> dstufft: will sprint with a friend of mine on this next weekend if his health allows
[22:40:51] <ronny> dstufft: it would also allow some funky details for pip, because each "env" could automatically always get the latest one including the requirements
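A very rough sketch of the idea ronny floats, simplified to sys.path manipulation rather than a true meta import hook; the store layout and names are hypothetical:

    import sys
    from pathlib import Path

    WHEEL_STORE = Path.home() / ".unpacked-wheels"   # hypothetical shared store

    def activate(project_names):
        """Expose the newest unpacked wheel dir for each project to imports."""
        for name in project_names:
            dirs = sorted(WHEEL_STORE.glob(f"{name}-*"))
            if dirs:
                sys.path.insert(0, str(dirs[-1]))    # naive "latest wins" pick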
[22:44:21] <lifeless> sometimes I think Python just wants to tease with functional idioms
[22:45:48] <qwcode> dstufft, it seems the backtracking you describe is a bit different than what justincappos was talking about in the issue. would require a longer email to break it down, so I won't try here : )
[22:46:35] <qwcode> what you describe sounds easier than sat solving... but I could certainly be naive
[22:46:43] <lifeless> dstufft: I'm working up a backtracker btw
[22:46:55] <lifeless> dstufft: it's all part of my need to fix setup_requires :)
[22:47:01] <qwcode> hah
[22:48:10] <lifeless> backtracking needs better management of build dirs and InstallRequirements
[22:48:30] <lifeless> since we'll re-encounter the same stuff multiple times with different constraints
[22:48:41] <lifeless> so I want a cache
[22:48:48] <lifeless> I need a cache of installable things for setup_requires too
[22:49:15] <lifeless> since we'll often need the same thing to setup something and then install that thing
[22:50:21] <dstufft> qwcode: a brute force SAT solver isn't very hard really, you just try different stuff until you solve the equation, the hard parts in SAT solvers come when you want to optimize it so you don't have to redo all your guessing
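A toy example of the brute-force approach dstufft mentions: enumerate candidate version combinations until one satisfies every specifier (data and names are made up):

    from itertools import product
    from packaging.specifiers import SpecifierSet

    candidates = {"X": ["0.9", "1.0"], "Y": ["2.0", "2.1"]}
    constraints = {"X": SpecifierSet("<1"), "Y": SpecifierSet(">=2.0")}

    for combo in product(*candidates.values()):
        picks = dict(zip(candidates, combo))
        if all(constraints[n].contains(v) for n, v in picks.items()):
            print("solution:", picks)   # {'X': '0.9', 'Y': '2.0'}
            break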
[22:50:34] <qwcode> lifeless, but your backtracking implementation involves ANDing duplicate specifiers and detecting conflicts?
[22:50:53] <lifeless> qwcode: it will
[22:51:03] <qwcode> and "backtracking" any time the ANDing occurs?
[22:51:05] <lifeless> I don't have one -ready- and if someone does, for use in pip that is - great
[22:51:27] <qwcode> I was looking at how to do the conflict resolution in packaging...
[22:51:39] <qwcode> conflict detection I mean
[22:51:43] <lifeless> ah
[22:51:46] <lifeless> so
[22:52:05] <lifeless> my intent is to union the requirements together
[22:52:27] <lifeless> if something is unsatisfiable (e.g. ~=2.0 and ~=3.0) then error and backtrack
[22:52:53] <qwcode> packaging has SpecifierSets or whatever for ANDed specifiers... it seems the right place to add new logic for this
[22:53:07] <lifeless> yeah
[22:53:12] <lifeless> we'll want the glue in there to support it
[22:53:37] <lifeless> but the generation of possible combinations and the backtracking or annealing or whatever approach is used - thats not packaging internals IMO
[22:54:01] <qwcode> the backtracking stuff would be in pip I assume
[22:54:09] <lifeless> yah
[22:54:21] <lifeless> what we need from the data model for a constraint is
[22:54:39] <lifeless> the ability to combine a new constraint with an existing one
[22:54:56] <lifeless> have that either error at that point, or have a predicate, to check if a constraint *can* be satisfied at all
[22:55:25] <lifeless> and either that needs to return new objects (e.g. like frozenset.union(anotherfrozenset))
[22:55:30] <lifeless> or have copy/deepcopy support
[22:55:51] <lifeless> (because otherwise backtracking is eeek)
[22:56:17] <lifeless> SpecifierSet AND may be sufficient already
[22:56:21] <qwcode> I was imagining all the individual specifiers supporting a method that knows with what other specifiers, and under what conditions, it would conflict
[22:57:04] <qwcode> _conflict(other) # look at other, and see if I conflict with it
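A rough approximation of the pairwise check qwcode is imagining; packaging has no such method, and this version cheats by testing against a list of known candidate versions:

    from packaging.specifiers import Specifier

    def conflicts(a, b, candidate_versions):
        # True if no candidate version satisfies both specifiers
        return not any(v in a and v in b for v in candidate_versions)

    print(conflicts(Specifier("~=2.0"), Specifier("~=3.0"),
                    ["2.0", "2.5", "3.0", "3.1"]))   # True: nothing satisfies both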
[22:57:23] <lifeless> I'll keep my nose out of packaging internals :)
[22:57:54] <lifeless> but - if I may observe - that's O(pyramid)
[22:58:23] <lifeless> with 4 specs, 6 checks; with 5 specs, 10 checks; with 6 specs, 15 checks
[22:58:27] <lifeless> it grows pretty fast
[22:59:06] <qwcode> you test the unique combinations within the AND set
[22:59:07] <lifeless> mmm, perhaps not an issue if its only applied at AND time
[23:00:01] <lifeless> yes, unique combinations - n!/((n-r)! * r!)
[23:00:03] <lifeless> with r=2
[23:00:09] <lifeless> and n=specs
[23:01:01] <lifeless> so for 100 references to a package, that's 100!/(98! * 2!) = (100*99)/2 => 4950
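Checking that arithmetic (math.comb is Python 3.8+):

    import math

    print(math.comb(4, 2), math.comb(5, 2), math.comb(6, 2))   # 6 10 15
    print(math.comb(100, 2))                                    # 4950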
[23:01:30] <lifeless> if you check only the new edges when you're doing the AND, the aggregate cost is the above, but the cost at each AND is linear
[23:01:40] <lifeless> we're going to make a -lot- of these checks
[23:01:55] <lifeless> the other concern is whether there can be 3+ conflicts
[23:02:03] <lifeless> where any 2 don't conflict
[23:02:05] <qwcode> to be clear, the _conflict method I'm talking about is on the Specifier class... like the class for the "==" specifier....
[23:02:17] <lifeless> yes, got that
[23:02:37] <lifeless> one thing to note
[23:02:47] <lifeless> is that this is entirely optional - just an optimisation
[23:03:10] <qwcode> ?
[23:04:20] <dstufft> lifeless: the SpecifierSet is immutable, & on it returns a new object
[23:04:24] <lifeless> dstufft: cool
[23:04:33] <dstufft> it doesn't do conflict detection
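A quick demonstration of what dstufft just described with packaging's SpecifierSet: & returns a new combined set and does no conflict detection.

    from packaging.specifiers import SpecifierSet

    a = SpecifierSet("~=2.0")
    b = a & SpecifierSet("<2.5")
    print(a, b)                                    # a is unchanged; b is a new object
    print(b.contains("2.4"), b.contains("2.6"))    # True False
    print(SpecifierSet("~=2.0") & SpecifierSet("~=3.0"))   # combines happily, no error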
[23:04:37] <lifeless> qwcode: if we can't predict 'this set is unsatisfiable'
[23:04:48] <lifeless> qwcode: then we'll still find out when we try all the versions against the set
[23:04:54] <qwcode> dstufft, we were talking about adding it
[23:05:03] <lifeless> so we don't need conflict detection from the sets to be able to solve the bug
[23:05:44] <qwcode> lifeless, you mean don't tell the users there's a conflict, just report "no distribution found"?
[23:06:04] <lifeless> qwcode: 'no distribution found for [union of constraints]'
[23:06:25] <lifeless> qwcode: we'd log that to debug I suspect
[23:06:38] <lifeless> qwcode: remember that we're going to try all possible combinations of package releases
[23:06:59] <qwcode> I wasn't thinking that
[23:07:00] <lifeless> qwcode: the failure to find something that works at all can't be characterised by the failure of a particular combination
[23:07:33] <lifeless> making this stuff debuggable is nasty-hard, I'm not aware of any examples of it done well in the distro packaging space
[23:07:54] <lifeless> e.g. aptitude will say 'here are 40000 combinations that might work, which one do you want'
[23:09:50] <qwcode> I was imagining just failing upon a conflicting constraint
[23:10:20] <dstufft> the problem is that when you conflict for one combination, it might work for another combination
[23:10:25] <lifeless> qwcode: ^
[23:10:36] <lifeless> qwcode: a first iteration of 988
[23:10:50] <lifeless> qwcode: and 2687 too
[23:10:53] <dstufft> there might be 40000 total combinations of project + versions you can install
[23:10:53] <qwcode> right, but I wasn't trying to save Rome, just make it better
[23:10:59] <lifeless> qwcode: would be to just error
[23:11:03] <lifeless> it would be an improvement
[23:11:23] <lifeless> but we can do those without packaging detecting conflicts
[23:11:23] <qwcode> most of the cases I've seen would have been solved with a simple AND
[23:11:32] <lifeless> all we have to do is union the requirements and try
[23:12:00] <lifeless> qwcode: the downside is that without something considering the full set of possibilities many installs that work today will stop working
[23:12:17] <dstufft> in particular, many installs that work today, and that could continue to work
[23:12:21] <lifeless> e.g. the feature of requirements.txt where it overrides package dep rules
[23:12:29] <qwcode> realize when you say union, it's historically confusing, because until PEP 440, unions were odd things, not simple ANDs
[23:12:34] <dstufft> e.g. these aren't ones where the constraints are not satisfiable
[23:12:44] <lifeless> qwcode: thanks, will say AND
[23:13:02] <lifeless> I think we're going to need a deprecation period
[23:13:18] <lifeless> where user specified versions continue to win even if additional constraints aren't satisfied
[23:13:21] <lifeless> but warn on it
[23:13:36] <lifeless> e.g. A i-r's B==1
[23:13:44] <lifeless> pip install B==1.2 A
[23:13:48] <qwcode> oh, I was thinking the top-level override feature would stay regardless
[23:14:31] <lifeless> it's dangerous in some ways
[23:14:42] <lifeless> I think it's important to have an escape mechanism
[23:15:03] <lifeless> anyhow - thats future work
[23:15:11] <dstufft> I agree with having an escape mechanism, I think the current one is problematic because people don't think of it as an escape mechanism, they just think of it as another thing to install
[23:15:21] <lifeless> yup
[23:17:05] <qwcode> lifeless, so where is your backtracker thing in your queue right now : )
[23:17:50] <lifeless> qwcode: https://mail.python.org/pipermail/distutils-sig/2015-April/026185.html
[23:18:24] <qwcode> thanks
[23:18:25] <lifeless> wth travis
[23:18:27] <lifeless> https://travis-ci.org/pypa/pip/builds/59447376
[23:55:35] <lifeless> dstufft: ok, next blocker?