#pypa-dev logs for Thursday the 17th of April, 2014

[00:00:41] <dstufft> see ya!
[17:08:17] <agronholm> finally back home
[17:08:40] <agronholm> I'll get that rebase conflict resolved as soon as my brain starts working again (which will require some sleep)
[18:45:29] <yusuket> dstufft: I’m making the changes to https://github.com/pypa/warehouse/pull/299 that you suggested, and I had a couple questions
[18:45:53] <yusuket> 1: I’m wondering what kind of normalization method you were looking for to normalize package naming
[18:46:42] <yusuket> I copied the implementation directly from pypi, so if we change the behaviour in warehouse it’s going to act differently (just something to be aware of)
[18:47:13] <yusuket> and if we do change it, it doesn’t look like there’s much more required than the method “normalize” you wrote
[18:47:24] <yusuket> Looking at pep426 I found this: http://legacy.python.org/dev/peps/pep-0426/#name
[18:48:14] <yusuket> which makes me think a package name validator would be the only additional piece of work we need
[18:48:47] <yusuket> mainly because right now, pypi just normalizes out characters that are invalid in the new spec
[18:49:12] <yusuket> thoughts? Also my second question had to do with the merging of normalize and normalize_package_name
[18:49:29] <yusuket> I noticed FastlyFormatter normalizes all keys, just not those specific to package names
[18:49:47] <yusuket> so I’m not sure if it should share the same normalizer as the package name normalizer
[19:22:06] <dstufft> yusuket: sorry was away for a bit
[19:22:14] <dstufft> the FastlyFormatter only normalizes keys that have a !n
[19:22:26] <dstufft> like "{foo!n}"
[19:23:04] <yusuket> dstufft: ok got it. so passing a !n explicitly asks for normalization?
[19:23:12] <dstufft> ya
[19:23:18] <dstufft> otherwise it doesn't normalize
[19:23:30] <dstufft> it's like !r and !s which are other formatting things
[19:23:34] <dstufft> for repr and str
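For illustration, a minimal sketch of a formatter with a custom !n conversion of the kind described here (the class name and the exact normalization rule are assumptions, not Warehouse's actual code):

    import re
    import string

    class NormalizingFormatter(string.Formatter):
        """Handle a "!n" conversion the way the built-in "!r"/"!s" ones are handled."""
        def convert_field(self, value, conversion):
            if conversion == "n":
                # assumed normalization: lowercase and collapse runs of ".", "_", "-"
                return re.sub(r"[-_.]+", "-", value).lower()
            return super().convert_field(value, conversion)

    # NormalizingFormatter().format("purge/{0!n}", "Foo_Bar")  ->  "purge/foo-bar"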
[19:23:47] <yusuket> ok, as long as it’s ok to use the same scheme for package names and also for fastly keys
[19:24:08] <dstufft> yea, the !n in fastly keys are designed for package names
[19:24:21] <yusuket> ahh I see
[19:24:26] <yusuket> n = name?
[19:24:30] <dstufft> normalize
[19:24:50] <dstufft> are you familiar with varnish at all?
[19:25:01] <yusuket> oh ok. so not explicitly for package names, but it’s ok to use the same scheme
[19:25:05] <yusuket> Only that it’s a cache
[19:25:07] <dstufft> ya
[19:25:33] <dstufft> The Fastly keys are not the cache keys, but they are just additional metadata so that you can purge multiple things at once
[19:25:58] <dstufft> e.g. all pages that have content dependent on the "warehouse" database entries, are tagged with the key "project/warehouse"
[19:26:02] <yusuket> ahhhh I see
[19:26:28] <yusuket> so a varnish cache allows you to add metadata to a cache item
[19:26:36] <yusuket> and that’s basically what the fastly keys are
[19:26:37] <dstufft> not normally, Fastly uses a customized cache
[19:26:43] <dstufft> a customized varnish
[19:27:07] <dstufft> But that's what it does yea, basically just a one-to-many mapping that lets you purge a whole bunch of things
[19:27:13] <dstufft> normal varnish only lets you purge one url at a time
[19:27:22] <dstufft> which sucks when you have a whole bunch of urls you need to purge
[19:27:42] <yusuket> ok got it
[19:27:59] <dstufft> (like for Warehouse, /simple/foo/ is case insensitive on the foo part, so if you update foo, you also want to purge /simple/Foo/ and /simple/FOO/ and /simple/FoO/ etc)
[19:28:06] <dstufft> so tagging makes it way easier
[19:28:22] <yusuket> ahhh yeah that makes sense
[19:28:33] <dstufft> I should probably document that
[19:28:49] <yusuket> although couldn’t you also have those three routes map to the same cache object? or is that tricky to do?
[19:30:31] <dstufft> So Varnish by default uses the url of the request, and eventually yes we'll use some custom varnish tricks so that it recognizes that the <foo> part is case insensitive so that it only caches one object for all the various spellings
[19:30:44] <dstufft> but it's not just /simple/Foo/, it's also /project/Foo/
[19:31:25] <yusuket> sorry got disconnected for some reason
[19:31:35] <dstufft> np
[19:31:40] <dstufft> [15:30:10] <dstufft> So Varnish by default uses the url of the request, and eventually yes we'll use some custom varnish tricks so that it recognizes that the <foo> part is case insensitive so that it only caches one object for all the various spellings
[19:31:40] <dstufft> [15:30:23] <dstufft> but it's not just /simple/Foo/, it's also /project/Foo/
[19:32:06] <yusuket> yeah right, definitely see the need for a more flexible cache removal method
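For context, the tagging described above works like Fastly surrogate keys: each cached response carries one or more keys, and purging a key invalidates everything tagged with it in one request. A rough sketch (the Surrogate-Key header is Fastly's; the key format and helper are illustrative, not Warehouse's code):

    def tag_response(headers, project_name):
        # One purge of "project/<name>" then invalidates /simple/<name>/,
        # /simple/<Name>/, /project/<name>/, etc. all at once.
        headers["Surrogate-Key"] = "project/{}".format(project_name.lower())
        return headers

    # tag_response({}, "Warehouse")  ->  {"Surrogate-Key": "project/warehouse"}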
[19:32:28] <yusuket> so here’s my other question: what did you want to specifically see different about the normalization method?
[19:32:54] <yusuket> I read through the PEP and it looks like the normalization already does what it needs to (lowercase, ‘_’ -> ‘-‘)
[19:33:20] <yusuket> the only thing that I see we don’t have now is validation that the name doesn’t have illegal characters
[19:33:22] <dstufft> Hmm, the PEP also talks about confusable characters, and I thought I implemented that on PyPI already
[19:33:37] <yusuket> Here’s what I was looking at:
[19:33:38] <yusuket> http://legacy.python.org/dev/peps/pep-0426/#name
[19:33:58] <dstufft> Index servers MAY consider "confusable" characters (as defined by the Unicode Consortium in TR39: Unicode Security Mechanisms) to be equivalent.
[19:34:16] <yusuket> oops, missed that
[19:34:46] <dstufft> So normalize should probably bail if there are invalid characters
[19:34:51] <dstufft> ValueError or whatever
[19:34:55] <yusuket> got it
[19:35:17] <dstufft> the one trick for confusables is that it needs to take into account that we're case insensitive too
[19:35:33] <dstufft> so even though L and 1 aren't considered confusable, they are for us
[19:35:41] <dstufft> or o and 0
[19:36:14] <yusuket> ok, let me read through that spec
[19:36:14] <dstufft> ah I guess I never implemented that on PyPI yet
[19:36:28] <yusuket> I’m not familiar with this concept so I’ll read up on it
[19:36:34] <dstufft> I did implement this though -> ALTER TABLE packages ADD CONSTRAINT packages_valid_name CHECK (name ~* '^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$'::text);
[19:36:45] <dstufft> to ensure that a package begins and ends with an alphanumeric
[19:37:05] <yusuket> ok nice
[19:37:36] <yusuket> do you think it’s worth it to re-implement that on the server side? My reasoning is it makes it clearer for someone reading the code to understand what the whole spec compliance looks like
[19:38:05] <dstufft> yes I think it is
[19:38:11] <yusuket> ok will do
[19:38:21] <dstufft> we can give better error messages in the code too
[19:38:47] <dstufft> I generally look at database constraints as "make sure that even if the app is buggy we don't get bad data"
[19:38:54] <dstufft> but the app itself should still validate data
[19:39:01] <yusuket> yeah, that makes sense
[19:39:14] <yusuket> Ok, I’ll make sure the ValueError messages actually correlate to statements in the PEP
[19:39:53] <yusuket> how about something like: “Distribution names MUST start and end with an ASCII letter or digit. (see PEP-426)”
[19:40:17] <dstufft> works for me
[19:40:17] <yusuket> and then we can raise that message directly to a user when submitting packages
[19:40:33] <yusuket> ok, I’ll do that
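For illustration, a sketch of the validating normalizer being agreed on here, mirroring the database constraint and the PEP 426 wording quoted above (assumed code, not Warehouse's actual implementation):

    import re

    # Same pattern as the packages_valid_name CHECK constraint above.
    _VALID_NAME = re.compile(r"^([A-Za-z0-9]|[A-Za-z0-9][A-Za-z0-9._-]*[A-Za-z0-9])$")

    def normalize(name):
        if not _VALID_NAME.match(name):
            raise ValueError(
                "Distribution names MUST start and end with an ASCII letter "
                "or digit (see PEP 426)."
            )
        # assumed normalization: lowercase, collapse runs of ".", "_", "-"
        return re.sub(r"[-_.]+", "-", name).lower()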
[19:40:42] <yusuket> thanks!
[19:40:50] <dstufft> no problem, thank you!
[19:41:16] <yusuket> glad to help :)
[19:43:07] <dstufft> It's been mostly me up till now :] Richard has done some but he's real busy with his job, and my job gives me ~1/2 my time to work on packaging stuff
[19:43:22] <dstufft> If there's anything that confuses you or anything like that, make sure to open up issues to document stuff
[19:43:41] <dstufft> I wrote the bulk of it, so it all makes perfect sense to me :V
[19:44:11] <yusuket> nice, yeah I noticed the bulk of the code comes from you :)
[19:44:29] <yusuket> I actually haven’t encountered anything difficult to understand yet, but I’ve only touched the db code so far
[19:44:53] <yusuket> I’d like to take a crack at implementing the legacy pypi register/package submission apis after this though
[19:46:30] <dstufft> I'm changing the DB stuff around a little bit, the actual classes don't change much, but instead of app.db.packaging.get_project(name) it'll be app.query(Packaging).get_project(name)
[19:48:30] <yusuket_> I see, less magic since you’re explicitly passing in the class instead of instantiating it beforehand or instantiating via reflection somewhere?
[19:49:57] <yusuket> man this chat client :(
[19:50:22] <dstufft> ya
[19:50:53] <dstufft> basically app is an instance of Warehouse, and it just has a list of classes that it instantiates and sticks on app.db in its __init__
[19:51:26] <dstufft> but you just kind of have to know it does that, and you have to kind of know that app.db.packaging is an instance of warehouse.packaging.db.Database
[19:52:17] <yusuket> yeah, and you want to make it clear that query methods are obtained via a class right? and make it clear as to where you can find that class?
[19:52:48] <yusuket> because right now you just know from heuristics that packaging = packaging/db.py
[19:53:55] <dstufft> ya
[19:54:06] <dstufft> it also makes testing a lot easier
[19:54:44] <dstufft> https://gist.github.com/dstufft/8562f47203349856f480 vs https://github.com/pypa/warehouse/blob/master/tests/accounts/test_views.py#L45-L85
[19:57:53] <yusuket> ahhh I see
[19:58:03] <yusuket> you basically make it easier to pass in dummy classes
[19:58:22] <yusuket> because instantiation on app initialization is a bit more annoying to mock
[20:02:08] <dstufft> yea
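A rough illustration of the testing point above (class names are assumptions): when a view asks the app for the query class explicitly, a test can hand in a stub instead of mocking whatever Warehouse wires onto app.db in its __init__.

    class StubPackaging:
        def get_project(self, name):
            return {"name": name}

    class StubApp:
        def query(self, db_class):
            # the real app would bind the class to its database engine here
            return db_class()

    # Code written against app.query(Packaging) can then be exercised like this:
    app = StubApp()
    assert app.query(StubPackaging).get_project("warehouse") == {"name": "warehouse"}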
[20:05:43] <ErikRose> jezdez: dstufft: Doing some reqs.txt 2.0 sketches. Findings so far: YAML is the only off-the-shelf format I like, and it's super complicated. I don't think I'll end up recommending it, but research continues.
[20:06:08] <dstufft> ErikRose: oh I should write down some ideas I had too
[20:06:17] <ErikRose> dstufft: Please do.
[20:06:52] <ErikRose> My current germ of a sketch looks like "somepackage>=3.0" for simple cases but then has indented key-value pairs underneath it for more complicated ones.
[20:06:59] <dstufft> I was thinking of a Python file for requirements 2.0 which "compiles" down to a json lock file. which made some stuff real nice, like implementing groups and such
[20:07:30] <ErikRose> I read all your comments on https://github.com/pypa/pip/issues/1175, if that's what you're repeating.
[20:07:51] <ErikRose> Ah, you're not.
[20:08:05] <ErikRose> That's an interesting idea.
[20:08:07] <dstufft> trying to find my old examples
[20:08:14] <dstufft> they are buried deep in my gists somewhere
[20:09:05] <ErikRose> I have a preference to make people deal with as few languages as possible.
[20:09:11] <ErikRose> Though that softens if Python is one of them.
[20:10:26] <ErikRose> You were talking about a separate reqsfile (supporting > and <) and lockfile (supporting only ==) in issue 1175. I would very much like one language that handles both, for mental and codebase simplicity.
[20:12:32] <ErikRose> Also, what do you mean by "groups"? Like "stuff for prod" and "stuff for dev"?
[20:12:33] <dstufft> ErikRose: well a human should never touch the lockfile directly
[20:12:59] <dstufft> it's sort of like a cache
[20:13:27] <ErikRose> You must also have some ideas, then, about additional pip commands and such to create such files. I would be eager to hear them. :-)
[20:13:38] <Ivo> does RTD read pip's docs from master or develop, or?
[20:13:45] <ErikRose> I'm so sorry you couldn't make it to PyCon!
[20:14:31] <dstufft> man this stuff is 2 years old heh
[20:14:36] <dstufft> also I sucked a lot back then
[20:14:37] <dstufft> https://gist.github.com/dstufft/0314b191adf05e3f965e lol
[20:14:46] <Ivo> we really should support ~ for version specifiers tbh
[20:14:56] <ErikRose> We both suck a lot now, too. It'll just take us 2 years to realize it.
[20:15:08] <Ivo> there's good reason it's useful and used in Ruby
[20:16:00] <dstufft> Ivo: PEP440 has that
[20:16:03] <Ivo> ErikRose: you should look up toml
[20:16:04] <dstufft> we just don't implement it yet
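For reference, the PEP 440 counterpart of Ruby's "~>" is the compatible-release operator "~=", which lets only the final stated version component move forward, for example:

    requests ~= 2.3.1    # equivalent to: requests >= 2.3.1, == 2.3.*
    requests ~= 2.3      # equivalent to: requests >= 2.3, == 2.*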
[20:16:06] <dstufft> https://gist.github.com/dstufft/2904d2e663461f010bbf
[20:16:08] <dstufft> there we go
[20:16:10] <ErikRose> dstufft: I don't see anything in that gist that applies; is that just background?
[20:16:12] <dstufft> there's one of my things from before
[20:16:15] <ErikRose> Ivo: Will do.
[20:16:17] <dstufft> ErikRose: oh no that was just unrelated
[20:16:21] <dstufft> just something else I found back then
[20:16:24] <ErikRose> hehe, random stuff from dstufft's past
[20:16:35] <Ivo> ErikRose: it's still pretty powerful, but you don't need a freaking huge c-compiled library to parse it ({lib,py}yaml)
[20:16:46] <ErikRose> Oh, I remember toml.
[20:16:46] <Ivo> there are some quite small pure python ones
[20:17:11] <Ivo> that's really the biggest problem i have with yaml :/
[20:17:23] <Ivo> that and the number of code execution exploits it's enabled, lol
[20:17:28] <dstufft> older https://gist.github.com/dstufft/e4683141ceec9a93890b
[20:18:38] <dstufft> ErikRose: so the thought about groups is they are things like "production" and "development"; you can opt to install them (or not) but they always affect the dependency resolution
[20:19:04] <ErikRose> Is somebody working on real dependency resolution?
[20:19:23] <dstufft> it's on my todo list, and there are a few libraries that implement something, though I'm not sure whether I like those libraries or not
[20:19:25] <Ivo> not actively that I know of
[20:19:31] <dstufft> but i'm not actively doing it yet
[20:19:43] <dstufft> my primary goal at the moment is getting Warehouse deployed
[20:19:44] <Ivo> I'd be quite interested to do so as well
[20:20:17] <Ivo> I've narrowed down that the best implementation would be a pure python boolean sat solver
[20:20:35] <dstufft> yea I agree
[20:20:41] <dstufft> here's another thing with some comments on it https://gist.github.com/dstufft/e61c97ee30192e575140
[20:20:41] <Ivo> people tend to implement those in C though, not python
[20:21:15] <dstufft> yea
[20:21:22] <dstufft> I have some half broken SAT solvers implemented for it
[20:21:34] <Ivo> dstufft: you should give these names other than 1.py and 2.py, is this for req2.0?
[20:21:36] <dstufft> where I tried to learn how to SAT solve like MiniSAT
[20:21:54] <dstufft> Ivo: they were my ideas 2 years ago before I was a pip developer when I was going to make my own installer
[20:22:06] <Ivo> there is actually one complete example of it, 0install implemented one in pure python
[20:22:16] <Ivo> oic
[20:22:33] <Ivo> and the guy who wrote that actually had a lot of good notes on it
[20:23:09] <Ivo> well not a lot, but at least some good emails xD
[20:23:11] <dstufft> something makes me think I looked at that but couldn't use it
[20:23:14] <dstufft> was it GPL?
[20:23:25] <ErikRose> Well, jacobian's format doesn't support nesting >1 level: https://gist.github.com/dstufft/e61c97ee30192e575140/#comment-266748
[20:23:46] <ErikRose> That's why I ruled out ini. Also, it has a Windows flavor, but that's a secondary repulsiveness.
[20:24:03] <ErikRose> Plus side, of course, is that it has stdlib support.
[20:24:03] <Ivo> toml has arbitrary nesting IIRC
[20:24:05] <dstufft> yea I don't really like ini
[20:24:09] <Ivo> it's like, a better ini
[20:24:15] <dstufft> what's the use case for nesting again
[20:24:18] <ErikRose> It's verbose for nesting, though.
[20:24:39] <Ivo> do you really want too much nesting though
[20:25:00] <ErikRose> dstufft: Attributes on things, like peep-style hashes, the -e flag, or use cases like https://github.com/pypa/pip/issues/1433
[20:25:24] <ErikRose> Ivo: The motivation is to be extensible so we don't have to do reqs.txt 3.0 for awhile. :-)
[20:25:41] <Ivo> ya 0install is gpl2
[20:25:54] <dstufft> I think https://gist.github.com/dstufft/2904d2e663461f010bbf is my favorite
[20:26:13] <Ivo> But, you just need to get the overall idea from it, it is designed for its own packaging system
[20:26:49] <Ivo> dstufft: still interpreting arbitrary python?
[20:27:03] <dstufft> Ivo: arbitrary Python for a requirements file isn't that big of a deal
[20:27:12] <ErikRose> dstufft: Yes, I like your API, at a glance.
[20:27:36] <ErikRose> I'd have to work through the implications of execing arbitrary Python at install time (again).
[20:28:01] <dstufft> So the idea was it'd work sort of like Gemfile and Gemfile.lock
[20:28:08] <dstufft> if a .lock file exists, pip just uses that
[20:28:31] <dstufft> if a lockfile doesn't exist, it does the dependency resolution using the python file, and then generates a .lock file
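Purely as an illustration of that workflow (this is not the format in dstufft's gists): a Python requirements file that the tool would execute to collect abstract requirements and groups, then resolve and freeze into a JSON lock file.

    REQUIREMENTS = {"default": [], "development": []}

    def requires(spec, group="default"):
        """Record an abstract requirement, optionally under a named group."""
        REQUIREMENTS[group].append(spec)

    requires("requests>=2.2,<3.0")
    requires("Django>=1.6,<1.7")
    requires("pytest>=2.5", group="development")

    # A resolver would then turn REQUIREMENTS into exact pins and write them as
    # JSON (e.g. {"default": ["requests==2.2.1", "Django==1.6.2"]}) into a
    # .lock file that pip uses directly whenever it is present.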
[20:28:35] <Ivo> IMO you should only need to execute code for building shit, not ever just specifying requirements
[20:28:43] <ErikRose> I know nothing at all about Gemfiles.
[20:28:51] <Ivo> they're pretty simple
[20:29:11] <dstufft> you could implement it using an AST parser if you really wanted, but I don't think it's a big deal
[20:29:18] <Ivo> in many cases
[20:29:34] <dstufft> this isn't something where you're going to execute random untrusted requirements files like you do with setup.py
[20:29:48] <dstufft> and this isn't metadata, this is the instructions to create an environment
[20:30:27] <ErikRose> This is a somewhat deeper rabbit hole than I expected when I sat down to merge in peep-like functionality. :-)
[20:31:10] <Ivo> I haven't heard about node needing anything more than json for lockfiles
[20:31:32] <Ivo> s/heard about/heard anybody ranting about/
[20:31:40] <dstufft> editing json, or writing a new one by hand is obnoxious
[20:31:51] <ErikRose> Agreed, JSON isn't human-editable.
[20:31:52] <dstufft> I get mad at package.json constantly
[20:32:06] <dstufft> "whoops added an extra comma LOL ERROR"
[20:32:13] <dstufft> and no support for comments is shitty
[20:32:19] <ErikRose> Not to mention it has nothing to do with Python except that it happens to be in the stdlib
[20:32:48] <dstufft> ErikRose: quick overview of Gemfile // Gemfile.lock, you typically specify ranges in a Gemfile, like "ok this library uses semver and the current version is 1.4.2, so >=1.4.2,<1.5" whereas a Gemfile.lock says "1.4.6 exactly",
[20:32:55] <dstufft> this is powerful because you can do stuff like
[20:33:17] <dstufft> "update my lockfile" and it'll update it to the latest version of whatever you specify in your Gemfile
[20:33:32] <dstufft> so if you are currently on 1.4.6, and 1.4.7 is released, but also 1.5, you'll only update to 1.4.7
[20:33:53] <ErikRose> Sure, sounds like a great feature to aim for.
[20:34:01] <dstufft> it makes your deploys 100% reproducible via the lockfile, without having to manually manage the lockfile ala requirements.txt
[20:34:21] <ErikRose> (well, it would if somebody added hash-checking to the lockfile :-))
[20:34:29] <dstufft> plus in a Gemfile you only specify your top-level stuff, but lockfiles list the entire dependency tree
[20:34:36] <dstufft> so circling back to peep
[20:34:40] <ErikRose> riiiight
[20:35:28] <ErikRose> At the risk of replaying the whole conversation from https://github.com/pypa/pip/issues/1175 again, why not dispense with the non-pinning reqs file and favor setup.py or the upcoming declarative equivalent?
[20:35:32] <dstufft> I see two ways to do it, one is "add it onto the existing stuff" in which case i'd just shoehorn it into the existing "#md5=<foo>&sha256=<bar>" stuff that we already support (we support it for #egg and #subdirectory)
[20:35:55] <dstufft> or "get a req 2.0 format figured out"
[20:35:59] <ErikRose> Granted, it would require a real dependency solver to obsolete the "I'm smarter than the resolver, so I did it by hand" use case, but I'm not in a hurry.
[20:36:17] <dstufft> peep isn't really related to the req 2.0 format other than it'll provide a nicer way of specifying that info
[20:36:24] <ErikRose> Yes.
[20:36:42] <ErikRose> But. Do we support Django==1.4#sha256=abcdef ?
[20:36:49] <ErikRose> I posit that we do not.
[20:37:34] <dstufft> ErikRose: setup.py includes metadata, requirements files do not, setup.py are "abstract" dependencies (I depend on requests, but I don't know where it's coming from because this is a package not an environment file) whereas requirements are concrete (I depend on requests, and I give an explicit source url of PyPI)
[20:37:35] <ErikRose> Very seldom do I yank something out of version control in a reqs file. Very often do I want to pin the hashes nonetheless.
[20:37:58] <dstufft> ErikRose: I'm pretty sure we don't support that at the moment, but I don't think there's any reason we couldn't is what I mean
[20:38:06] <dstufft> there's an open ticket to support hashes like that already
[20:38:17] <ErikRose> It's probably that one I keep pasting.
[20:38:22] <Ivo> current requirements is just too simple, and bolting on things makes it unwieldy; already a lot of people complain and get confused by needing various --options for different things in req files, and it's not even handled properly everywhere https://github.com/pypa/pip/pull/1720
[20:38:43] <dstufft> nah there's an older one
[20:38:45] <dstufft> i think I opened it
[20:38:52] <ErikRose> I haven't seen it; I'd like to.
[20:39:02] <dstufft> I don't think it has any info except "it would be nice to do this"
[20:39:06] <dstufft> let me see if I can find it
[20:39:10] <ErikRose> Anyway, that's a valid point about reqs.txt specifying sources.
[20:39:26] <ErikRose> It's too bad we have to repeat info from setup.py, though, like package names.
[20:39:40] <ErikRose> I guess you've got to link them somehow.
[20:39:52] <dstufft> ErikRose: well you don't have to
[20:39:59] <dstufft> in the current format you just do "."
[20:40:28] <dstufft> now for peep-like functionality you have to do that because pip lacks a lockfile :)
[20:40:59] <dstufft> ah my ticket was specifically about supporting hashes on urls in a requirements.txt or on the cli
[20:41:14] <dstufft> not for Django==1.4#sha256=, but for http:///.....#sha256=
[20:41:15] <dstufft> https://github.com/pypa/pip/issues/468
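Concretely, the two shapes being discussed would look roughly like this in a requirements file (host and digests are abbreviated placeholders, purely illustrative):

    # fragment hashes on a URL, the form issue 468 asks for:
    https://example.com/Django-1.6.2.tar.gz#md5=abc123...&sha256=def456...
    # the hypothetical "shoehorn" form for named requirements (not supported today):
    Django==1.4#sha256=def456...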
[20:41:42] <ErikRose> Right. And, for the record, I would never propose the syntax Django==1.4#sha256=.... ;-)
[20:42:20] <dstufft> well I would, for shoehorning peep into requirements.txt 1.0 :]
[20:42:45] <ErikRose> It's quasi-consistent with the URL format. It's got that going for it. But weird? Oh boy.
[20:43:01] <dstufft> the whole requirements.txt format is weird and inconsistent
[20:43:11] <dstufft> it's consistently inconsistent!
[20:43:17] <ErikRose> lulz
[20:43:40] <ErikRose> Alright, this is overwhelming, and I am tired. I think I've got to sleep on this and let it all settle.
[20:43:41] <Ivo> basically just stuffing the pip command line syntax into lines in a file
[20:43:42] <dstufft> so I guess it depends on how much work you want to do and/or how long you want to wait
[20:43:53] <dstufft> shoehorning gets it in faster, and with less work :)
[20:44:01] <dstufft> ErikRose: have fun!
[20:44:03] <ErikRose> I like the less work part. :-)
[20:44:07] <ErikRose> Good night!
[20:44:39] <Ivo> dstufft: should pip docs changes be merged to develop or master?
[20:45:11] <qwcode> dstufft, I actually did a PR for your #468 long ago https://github.com/pypa/pip/pull/735
[20:45:17] <qwcode> never merged it
[20:45:42] <dstufft> Ivo: are they updates that are useful to the current version?
[20:46:08] <Ivo> just typo fixes https://github.com/pypa/pip/pull/1730
[20:46:20] <dstufft> those can go to master
[20:46:22] <qwcode> I think that's already fixed in develop
[20:46:55] <Ivo> what is the workflow of taking things from develop to master? what's master's purpose atm? :S:S
[20:47:30] <qwcode> Ivo master = what's released
[20:47:51] <Ivo> so master should be just 1.5.4
[20:48:00] <Ivo> atm
[20:48:03] <qwcode> the last phase of the release process is usually to merge to master
[20:49:00] <Ivo> so you don't want things going willy nilly into master then?
[20:49:02] <qwcode> I occasionally cherry-pick docs stuff to master, only if it's critical, but usually after merging to develop first
[20:52:52] <dstufft> yea the same
[20:53:00] <dstufft> I will update master if it's a change to something non-release
[20:53:11] <dstufft> like the docs, or the release scripts
[20:53:43] <Ivo> Maybe I'll leave merging in docs fixes to you guys for now then ^_^
[20:54:49] <qwcode> it would be harmless to merge this really, but for me, I would close it, and just say it's already fixed in develop
[20:56:15] <Ivo> soo... anyone wanna review my PRs for pip? There are actually some that should make testing a lot faster :D
[20:57:03] <Ivo> That Guy dude thought we could cut the time by at least half, so that's exciting..
[20:58:18] <Ivo> dstufft: I take it you're trying to keep your headspace mostly in warehouse atm
[20:58:20] <qwcode> Ivo, for testing (or any "small" changes), feel free to merge yourself if you feel comfortable/confident with it. (my opinion at least)
[20:59:34] <dstufft> Ivo: more or less, though I context switch if I need to :]
[21:01:41] <Ivo> qwcode: new guy nerves...
[21:02:11] <Ivo> bam
[21:04:13] <qwcode> we had conversations in the past about trying to enforce a 2 vote rule or something... but the informal rule now seems to be solo merges are ok for "small" things, and it's a judgement call to bring in votes when you're not sure, or when it's obvious it's a major decision or controversial
[21:04:51] <Ivo> cryptography.io has a gentleman's rule that you can't merge your own PR
[21:05:26] <qwcode> good luck with that here...
[21:05:28] <Ivo> I think a good middle ground would be to just look for one other dev's LGTM (i.e. code review) before merging
[21:06:29] <Ivo> e.g like https://github.com/pypa/pip/pull/1725
[21:09:45] <dstufft> I think it's hard to get reviews here partially because it's really hard to actually grok what any one change does without a bunch of untangling
[21:09:59] <dstufft> and folks generally don't have time to do that for all the PRs :(
[21:11:07] <Ivo> sad that's the case... :/
[21:11:42] <dstufft> I think if we actually manage to refactor pip so that it makes sense, reviews would be a lot easier
[21:11:50] <qwcode> looking for old pypa-dev thread from a year ago...
[21:12:24] <dstufft> but part of the problem there is refactoring is hard, our test suite isn't the greatest, and I think I'm the only one who gets time in their day job to do this, so time is short :/
[21:12:28] <qwcode> yea, people are scared to approve stuff they didn't analyze for 3 hours...
[21:12:54] <qwcode> and it's usually only the author who spent 3 hours
[21:13:08] <Ivo> this is also most definitely the case for virtualenv as well, except 1000 times worse
[21:13:10] <dstufft> I think this is also why we are really bad at merging PRs from non contributors too
[21:13:35] <dstufft> yea well
[21:13:43] <dstufft> virtualenv is even more hacky than pip
[21:13:48] <dstufft> and gets less attention
[21:14:00] <pf_moore> yeah, I try to be a bit more willing to merge PRs from non-contributors than I would be for my own ones, just because it's good to give encouragement
[21:14:42] <dstufft> if I ever finish virtualenv 2.0 it should be way cleaner
[21:14:55] <dstufft> and the hacks should be isolated into one danger area
[21:14:55] <qwcode> back when I was young... https://groups.google.com/forum/#!topic/python-virtualenv/dPDkQWaBXts
[21:15:05] <Ivo> could we merge a whole lot of stuff for virtualenv, and release a beta for a month? I know you were sad that not very many people have tried your pip RCs dstufft but I can't see any other way and there are some fixes that would really help some ppl atm
[21:15:32] <qwcode> wow, I said "and wait for at least 3 votes IMO"
[21:15:36] <dstufft> qwcode: lol
[21:15:45] <Ivo> what's a free cycle?
[21:16:00] <dstufft> Ivo: we can merge stuff sure :]
[21:16:08] <dstufft> generally we release virtualenv and pip together
[21:16:15] <dstufft> but I don't really think it'd be bad to decouple those
[21:16:31] <Ivo> isn't it more of a one way relationship?
[21:16:51] <Ivo> if pip comes out there's no reason not to release a patch version for virtualenv as well
[21:17:12] <qwcode> "I agree everyone can solo merge/push simple bugs, fixes and docs"
[21:17:55] <pf_moore> what stuff for virtualenv? A lot of the recent churn seems to be around the activate scripts. I'd be tempted to say they are OK to merge just because they are (somewhat) optional.
[21:18:03] <dstufft> Ivo: I'm just saying we've historically had the virtualenv and pip release cycles aligned
[21:18:17] <dstufft> but I don't really think it'd be a bad idea to say we don't care about that
[21:18:31] <qwcode> hah, "the point of this, is so nobody has to be a super hero solo merger."
[21:18:48] <Ivo> pf_moore: just *fixing* activate scripts for shells, and also there's one that stops virtualenv working on arm atm; off the top of my head
[21:19:20] <pf_moore> dstufft: I think the main thing is not to release virtualenv with pip included then release a new pip a couple of weeks later. Otherwise, it's not a big deal
[21:19:56] <pf_moore> Ivo: yeah, I generally don't comment on those because as a non-Unix guy I don't really have a view
[21:20:09] <Ivo> pf_moore: you need to run a virtualbox more often :)
[21:20:24] <pf_moore> :-)
[21:20:52] <Ivo> I need an rpi/beagleboard to tinker with so I've got an arm thing to test on
[21:21:01] <dstufft> sooner or later we'll convert pf_moore to the dark side
[21:21:02] <pf_moore> I *can* run Unix, it's just that when I do I spend more time getting annoyed at how hard it is to set up Ubuntu or whatever the way I like it than actually testing anything ;-)
[21:21:22] <dstufft> (and then pip will never work on Windows ever again because i'm pretty sure the only reason it does is pf_moore)
[21:21:25] <qwcode> gotta run, just when it was getting exciting...
[21:21:36] <pf_moore> lol
[21:22:14] <dstufft> like when I broke the entire test suite on windows and nobody noticed until pf_moore tried to run the tests :V
[21:22:17] <Ivo> he did single-handedly procure a whole python patch release to fix windows
[21:22:42] <Ivo> that was fun
[21:22:58] <pf_moore> what I like about all this is that all I actually do is moan, then other people do the work and I get the credit. Total win :-)
[21:23:04] <Ivo> although I do feel sorry for the poor guy's patch that had to get reverted
[21:23:09] <dstufft> pf_moore: ha
[21:23:12] <dstufft> pf_moore: I know that feel
[21:23:17] <pf_moore> But yeah, it was fun to be the cause of a brown-bag Python release...
[21:23:19] <dstufft> people just associate me with anything packaging
[21:23:45] <dstufft> like when people thank me for wheel support and i'm all like, "well actually I had nothing to do with that :V"
[21:24:18] <pf_moore> yeah, dholth must be getting pretty fed up with everyone else getting the credit...
[21:24:48] <dstufft> on the flip side whenever anything packaging breaks I get told about it :V
[21:25:05] <pf_moore> I even tried to create a bdist_simple format. Got nowhere, then Daniel came up with wheels and saved the world.
[21:25:22] <pf_moore> dstufft: yeah, but you then go and fix it. You're your own worst enemy
[21:25:23] <Ivo> i was gonna say, you must sometimes feel like a mother for 100 kids to tell their problems to, dstufft
[21:25:56] <dstufft> Ivo: heh, sometimes yes
[21:26:02] <dstufft> I have a problem saying no
[21:26:08] <dstufft> or not sticking my nose in things
[21:26:08] <Ivo> including yours truly
[21:26:25] <dstufft> like now i'm on the fedora and debian mailing lists for python
[21:26:30] <dstufft> and the relevant irc channels
[21:26:39] <dstufft> because they were screwing up ensurepip and I had to stop them :[
[21:26:57] <Ivo> sticking your nose in stuff is great fun! You just need an iron nose...
[21:28:09] <Ivo> heh, they seemed to have a "kill it with fire" attitude the first whiff they got of ensurepip :(
[21:28:24] <dstufft> oh this is better than before
[21:28:38] <dstufft> they were _really_ nasty on their mailing lists when PEP453 was being discussed
[21:28:51] <dstufft> some people were even trying to get pip removed from the distros all together
[21:29:13] <Ivo> ...yoikes
[21:29:42] <pf_moore> this is what, as an outsider, I don't understand with Linux
[21:29:53] <pf_moore> It's my PC, why can't I screw it up if I want?
[21:30:18] <dstufft> pf_moore: you can! you just can't install it from the built in installer
[21:30:24] <Ivo> pf_moore: it all comes from the fact they provide all the software for the people that use their OS (where windows gives you a browser and tells you good luck)
[21:30:42] <dstufft> if you call IE a browser :D
[21:30:43] <dstufft> ;P
[21:30:49] <dstufft> actually I'm told the latest IEs are quite good
[21:30:57] <pf_moore> IE is the installer for Chrome...
[21:31:05] <dstufft> lolol
[21:31:06] <dstufft> basically
[21:31:18] <dstufft> IE: Because microsoft doesn't have wget
[21:31:21] <pf_moore> (Although yes, IE isn't bad these days)
[21:31:59] <Ivo> so when pip comes along and also purports to give people software, and (OMGOSH) from an uncurated index, tempers get frayed
[21:32:15] <dstufft> especially when it touches the sacred system files
[21:32:25] <Ivo> I think they'll like us a lot more if we manage to get --user by default working sometime
[21:32:27] <dstufft> because they get bug reports saying apt-get <whatever> failed
[21:32:43] <dstufft> because someone did something with pip and broke their own system
[21:32:51] <dstufft> Ivo: yes I think so
[21:33:11] <pf_moore> yeah, but I sort of lose the thread when they complain about /usr/local - I thought that was specifically *for* the local sysadmin to use...
[21:33:32] <dstufft> pf_moore: pip installs to /usr/lib/python/X by default on most Linux systems
[21:33:45] <pf_moore> yeah
[21:33:51] <dstufft> it's on Debian derivatives that they install to /usr/local by default because they patched their python
[21:34:04] <dstufft> (and even then, pip will happily uninstall something apt-get installed)
[21:34:14] <Ivo> (when run under sudo, and it doesn't help that it just outright fails when run without sudo, by default)
[21:34:14] <dstufft> but yea I think if we make --user the default, the distros will hate us far less
[21:34:54] <pf_moore> I see the point about pip and --user.
[21:35:29] <dstufft> i'm perfectly happy for that to be a linux only thing btw
[21:35:31] <dstufft> well a *nix
[21:35:45] <pf_moore> What I don't get is understanding where, on my Linux PC, I'm allowed to install "global" stuff that isn't OS-owned
[21:36:09] <dstufft> oops gotta go take my rental car back
[21:36:11] <pf_moore> I thought that was /usr/local, but the distros seem to expect to own that (or is that just debian/ubuntu?)
[21:36:11] <dstufft> they are almost closed
[21:36:19] <dstufft> pf_moore: no, /usr/local is for you
[21:36:20] <Ivo> you install global stuff that isn't distro-owned by running `sudo make install` with the source of some software
[21:36:41] <Ivo> but people discourage that a lot for the relevant reasons
[21:36:48] <dstufft> but people install stuff to /usr/local, break their OS, and then complain to Debian about it
[21:36:51] <dstufft> or Fedora, or whoever
[21:36:58] <dstufft> and that annoys debian/fedora/whoever
[21:37:09] <dstufft> and it so happens that pip guides you to doing that
[21:37:18] <dstufft> (sudo pip install is more obvious to do than pip install --user)
[21:37:32] <Ivo> maybe we should change that dstufft
[21:37:39] <pf_moore> well, I'd say that if /usr/local is for me, and installing stuff there breaks the OS, then that's an OS bug (just not the one you'd first think it is)
[21:37:44] <Ivo> make pip install --user more obvious than sudo pip install
[21:37:53] <dstufft> pf_moore: it's complicated :)
[21:38:01] <dstufft> because sometimes you want that stuff to take precedence
[21:38:06] <dstufft> but I gotta go for real
[21:38:07] <dstufft> see ya
[21:38:13] <pf_moore> lol exactly - the dark side always is. All those greys...
[21:38:13] <Ivo> cccccatch
[21:38:15] <pf_moore> see ya
[21:46:02] <Ivo> yoikes, the ukrainian intelligence hq has been captured by protesters
[21:49:10] <Ivo> whoever this guy is he does interesting reporting https://www.youtube.com/watch?v=CUNVwbp_JSA