PMXBOT Log file Viewer


#pypa logs for Thursday the 16th of July, 2015

[09:56:31] <mgedmin> new virtualenv creates 'local' as a directory tree full of symlinks, why?
[09:57:00] <mgedmin> why do I care: virtualenv . && bin/pip install pytest && bin/py.test now runs all the tests twice: once from src/, once from local/src/
[10:06:29] <mgedmin> amazingly it even creates local/.git, which makes 'git clean -dfx' leave the local/ subdir alone
[10:21:53] <jmurphyau> using pip -t and PYTHONPATH as an alternative to virtual env: http://blog.zoomeranalytics.com/pip-install-t/
[10:22:07] <jmurphyau> I'm just wondering - why don't people do that?
[10:22:19] <jmurphyau> does virtualenv give additional features/functions that that approach doesn't?
[10:22:49] <[Tritium]> full isolation from system packages, which PYTHONPATH does not give
[10:25:24] <mgedmin> an interesting approach for importable modules; how would it handle scripts?
[10:25:45] <mgedmin> e.g. venv/bin/pip install coverage; venv/bin/coverage run ... becomes ... ?
[10:27:19] <[Tritium]> https://bpaste.net/show/c870b693c6b9 << note that the stdlib and platform-specific locations are in sys.path, but all other paths, including all instances of site-packages (the important bit), are prefixed by the venv location. There is no pollution of site-packages from the system python
[10:30:17] <jmurphyau> @[Tritium]: doesn't python -S or python -s provide that functionality?
[10:31:12] <jmurphyau> mgedmin: I got that to work using this: export PYTHONPATH=libs/; python -S -m coverage
[10:31:23] <jmurphyau> -m runs the module
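A minimal sketch of the pip -t / PYTHONPATH approach described above; libs/ and myscript.py are illustrative names, and the relative PYTHONPATH assumes you run from the project root:

    # install dependencies into a local directory instead of site-packages
    pip install -t libs/ coverage

    # make libs/ importable; -S skips site.py, so the system site-packages is never added to sys.path
    export PYTHONPATH=libs/
    python -S -m coverage run myscript.py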
[10:31:31] <mgedmin> not all console scripts can be invoked using the python -m syntax
[10:32:06] <ronny> jmurphyau: also note that pip will ignore what is already installed in the target and will install all deps anyway
[10:32:06] <jmurphyau> ok cool.. so I started using that method and ran into that problem initially
[10:32:30] <jmurphyau> and there are also other scripts/bins that virtualenv works with - that pip -t and PYTHONPATH won't, or will have trouble with
[10:32:32] <jmurphyau> is that the case?
[10:33:33] <jmurphyau> I'm just trying to understand why virtualenv is so widespread and trying to sell it to myself since I now have to use it.. it just seems like extra layers/complexity for something that shouldn't be that hard
[10:34:14] <jmurphyau> ronny: didn't know that
[10:35:05] <[Tritium]> virtualenvs actually get out of the way. They do not sit between you and python, they make a copy of python in such a way that python thinks the virtualenv is the real environment.
[10:35:41] <[Tritium]> it's not really abstraction in the sense of a wrapper around python, if that's what you were thinking
[10:51:04] <jmurphyau> I think the fact you feel like you're in a different environment (which I guess you are) when you type 'activate' is the big thing for me.. having to type 'activate' to transform my shell into a python virtual environment - it's almost like you need a dedicated shell session for each python virtual environment.. and the activate_this.py file as well.. it all just seems a bit much
[10:51:33] <jmurphyau> this is based on comparing/expecting something similar to npm/node
[10:51:50] <mgedmin> I don't like activate either
[10:52:16] <mgedmin> which is why I used to do 'virtualenv .' at the project root and then use bin/python ... or, more often, bin/somescript ...
[10:52:40] <mgedmin> this is becoming untenable as virtualenv (& pip) create all kinds of junk in the virtualenv root
[10:52:51] <mgedmin> so I'm switching to 'virtualenv .env && ln -sf .env/bin bin'
[10:53:39] <mgedmin> anyway, if you run python (or scripts) using the full (relative) pathname, then you never need to "activate" the venv
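Roughly what mgedmin's activate-free setup looks like; pytest is just an example package:

    # create the virtualenv in a hidden directory and expose its bin/ at the project root
    virtualenv .env
    ln -sf .env/bin bin

    # invoke the interpreter and console scripts by relative path; no activate needed
    bin/pip install pytest
    bin/py.test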
[10:55:30] <doismellburning> .env is for envdirs, simple ;)
[10:55:34] <doismellburning> .tox is for virtualenvs ;)
[10:55:57] <mgedmin> what are "envdirs"?
[10:58:37] <doismellburning> http://cr.yp.to/daemontools/envdir.html
[10:59:49] <mgedmin> omg that document is terrible about (not) explaining the purpose and conventions
[11:00:10] <mgedmin> anyway I see https://github.com/jpadilla/django-dotenv also uses .env for a different purpose
[11:00:35] <doismellburning> mgedmin: er, aiui entirely the same purpose
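For reference, a minimal envdir sketch (envdir is part of daemontools); the variable name and command are made up for illustration:

    # each file in the directory sets one variable: the filename is the variable name,
    # the first line of the file is its value
    mkdir env
    echo 'postgres://localhost/mydb' > env/DATABASE_URL

    # envdir runs the given command with those variables added to its environment
    envdir ./env python manage.py runserver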
[11:06:39] <[Tritium]> I don't use activate, I use .venv/bin/python. .venv is in my gitignore
[11:12:22] <jmurphyau> ok.. [some path starting with . ]/bin/python, and .path/bin/pip
[11:12:39] <jmurphyau> that makes it more appealing to use
[11:15:00] <doismellburning> [Tritium]: similarly
[11:31:05] <ronny> hmm
[12:37:53] <Tarffull> Hi! I have a problem with a virtual environment and django. When installing third party tools for django (django-nose, django-bootstrap3) with pip install -e git.. everything works great. But when I install it with just pip install django-nose I get a DistributionNotFound error on a completely different app when trying to run my application. What is the difference between pip install and pip install -e, and how can I avoid this problem?
[12:48:23] <mgedmin> Tarffull, it might help us answer if you pastebin the two shell sessions (where you use -e and where you don't)
[12:48:48] <Tarffull> mgedmin: https://dpaste.de/iuKo
[12:49:20] <Tarffull> mgedmin: I get that exception when running my app after installing django-nose w/o -e.
[12:50:07] <mgedmin> strange
[12:50:15] <mgedmin> AFAICT django-nose has nothing to do with measurements
[12:50:51] <mgedmin> https://pypi.python.org/pypi/measurements doesn't exist, is it a private package?
[12:52:59] <Tarffull> Yes it's a private package and has nothing to do with django-nose. That's why it's so strange.
[12:54:09] <mgedmin> did you install it with pip install -e?
[12:54:41] <Tarffull> No, without.
[12:54:48] <mgedmin> setup.py install?
[12:54:52] <mgedmin> manual cp of files?
[12:54:59] <mgedmin> pip install without -e?
[12:55:25] <mgedmin> (silly question, -e wouldn't've copied the files inside .venv/lib/)
[12:55:25] <Tarffull> pip install w/o -e
[12:55:30] <mgedmin> right
[12:55:42] <mgedmin> and what are you executing when you get that traceback?
[12:56:45] <Tarffull> python portal/manage.py test portal --settings=portal.settings_test --liveserver=localhost:5000-6000
[12:56:54] <Tarffull> (It's a django site)
[12:58:43] <mgedmin> have you activated the virtualenv?
[12:58:54] <mgedmin> do you see 'requirements' when you run pip list (or pip freeze)?
[12:59:21] <mgedmin> stepping back: in order for pkg_resources.get_distribution('measurements') to work, the package needs to be installed
[12:59:45] <mgedmin> I'm fuzzy on the details on what that means exactly
[13:00:16] <mgedmin> a directory called measurements-{version}.egg-info (or .dist-info in the brave new world?) somewhere on sys.path
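A quick way to check what mgedmin is describing, run from the affected virtualenv (the .venv path is assumed):

    # raises DistributionNotFound unless a measurements-*.egg-info/.dist-info
    # directory is visible somewhere on sys.path
    .venv/bin/python -c "import pkg_resources; print(pkg_resources.get_distribution('measurements'))"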
[13:00:30] <Tarffull> Yes, the virtual environment is activated (I have regenerated it multiple times). I guess you are asking if measurements is in pip freeze? Yes it is.
[13:00:38] <nedbat> I tried to build a wheel with 3.5b3, and it failed, and I don't know whether to file a bug against 3.5, or wheel, or what: https://gist.github.com/nedbat/f0beb65a55e05599a931
[13:01:09] <mgedmin> Tarffull, is there a measurements*.egg-info in /home/epontpe/ivy-portal/.venv/lib/python3.2/site-packages/ ?
[13:01:20] <Tarffull> mgedmin: Yes.
[13:01:57] <Tarffull> mgedmin: And it is no problem to import it from a python shell.
[13:03:29] <mgedmin> approaching from the other side: why does django-nose affect this?
[13:04:02] <mgedmin> either it's because you get a different version (pypi has 1.4.1, if google's cache is to be trusted -- can't be bothered to check pypi itself, it 503s me)
[13:04:10] <mgedmin> or because pip install -e differs from a regular pip install
[13:04:19] <mgedmin> changes since the last release: https://github.com/django-nose/django-nose/compare/v1.4.1...master
[13:04:38] <mgedmin> maaaybe one of those triggers the decorator that tries to take measurements
[13:04:49] <mgedmin> i.e. maybe the problem is always there, just older django-nose fails to trigger it
[13:05:15] <mgedmin> so, can you import pkg_resources and pkg_resources.get_distribution('measurements') in a python shell?
[13:05:23] <Tarffull> Yep!
[13:05:38] <Tarffull> PyPI and git have the same version of django-nose.
[13:05:42] <mgedmin> I'd be tempted to pprint(sys.path) in the django shell
[13:05:51] <mgedmin> then run the tests again with --pdb, pprint sys.path again, and compare
[13:05:57] <mgedmin> maybe something will pop up
[13:06:18] <Tarffull> Already done! No difference between them.
[13:06:45] <Tarffull> sys.path stays the same, but w/o -e no distributions can be found.
[13:07:51] <ionelmc> nedbat: maybe you need to force a certain tag (workaround)
[13:07:58] <Tarffull> I've tried printing ','.join(pkg_resources.Environment()) from the test and it is empty when django-nose is installed w/o -e.
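A sketch of the comparison being suggested here: run the same one-liner once from a plain shell and once from inside the failing test run (e.g. at the --pdb prompt, pasting the imports), then diff the output:

    .venv/bin/python -c "import sys, pprint, pkg_resources; pprint.pprint(sys.path); pprint.pprint(sorted(d.project_name for d in pkg_resources.working_set))"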
[13:08:00] <nedbat> ionelmc: I don't know what a tag is
[13:08:26] <ionelmc> nedbat: dunno what values those "tag" variables have
[13:08:28] <ionelmc> hard to say
[13:08:37] <mgedmin> nedbat, an assertion error is always a bug in the package that raises it, imho, so I'd file it
[13:08:47] <nedbat> mgedmin: good point
[13:08:52] <ionelmc> nedbat: you should either patch up the wheel code to see what's going on or use hunter (it has a fancy variable printer)
[13:08:59] <mgedmin> user-caused errors shouldn't be assertion errors, they should be something specific and with a clear error message
[13:15:23] <Tarffull> mgedmin: https://dpaste.de/igUW
[13:16:05] <mgedmin> that is a lot of badly-formatted text
[13:16:21] <Tarffull> Yes, I'll fix that. Sorry.
[13:16:54] <mgedmin> I'm not sure I could help you either way sorry -- not really familiar with pkg_resources internals :/
[13:17:35] <Tarffull> mgedmin: Ok, thank you for your time :)
[13:20:44] <ionelmc> woot
[13:20:51] <ionelmc> you got distribute installed? and 3.2?
[13:21:46] <Tarffull> Yes.
[13:27:30] <ionelmc> Tarffull: quite odd
[13:27:52] <Tarffull> ionelmc: Is it?
[13:28:09] <ionelmc> this prolly doesn't help at all right now but maybe don't use unmaintained software like distribute or 3.2
[13:28:32] <ionelmc> pretty hard to get help with those as few people use them nowadays
[13:33:12] <Tarffull> I know, it's old because of other reasons beyond my control.
[13:36:51] <ronny> Tarffull: elevate it as a critical impediment that will cost a few thousand if you have to work around it instead of an admin fixing it
[13:52:15] <Tarffull> Just installed a fresh virtual environment with python 3.4 and setuptools instead of distribute. Worked like a charm! :D
[13:54:17] <ionelmc> Feels like a new house
[16:06:14] <benjaoming> Hi all! Would anyone in here have any experience collecting statistics for their PyPI releases? I'm releasing an open source project, but the organization behind it is a bit skeptical about releasing on PyPI because the download counts get polluted by bots and mirrors :/
[16:06:58] <benjaoming> Also, is there a replacement for http://pypi.python.org/stats/ ?
[16:09:46] <dstufft> benjaoming: no replacement for that; better stats are on the roadmap for PyPI but not in the near term. Also, unless you make your thing completely uninstallable via pip, you're going to get some mirrors and bots downloading it
[16:15:13] <benjaoming> dstufft: Thanks! I'm working with a charitable organization and their support relies partly on being able to prove the usage of the software; I can imagine that other projects face the same challenge. Would there happen to be a "honey pot" package available that can be used for correcting download counts? That would be an okay quick fix for us.
[16:16:12] <dstufft> benjaoming: not really, and it's hard for something like that to be made because you'll get a couple hundred downloads, basically from mirrors, anytime you make a release
[16:16:52] <Wooble> sticking google analytics on my documentation page was my solution, but I don't have to prove anything to anyone.
[16:18:09] <benjaoming> dstufft: yeah, I was thinking that the challenge could be solved because we know the release dates of both our own package and a given honey pot package. So we can use the latest release stats from the honey pot (1-2 days after release) to correct the first 1-2 days of stats at any given release of the real package.
[16:18:27] <tdsmith> the fake downloads are a nice ego boost
[16:23:19] <dstufft> benjaoming: I mean, the download counts can only really be taken with a grain of salt anyway. A mirror only counts as one download although it may really be masking thousands of downloads, or it could just be inflating the count. And pip caches downloads by default now, so you'll only get 1 download per machine unless they clear their caches
[16:28:02] <benjaoming> dstufft: The pip cache is welcome, I guess the most interesting question is: How many users does our software have? We will surely have the stats polluted by upgrades as well, so there's a lot of uncertainty in the puzzle... making it an interesting task to have bestowed :)
[16:31:51] <dstufft> benjaoming: yea, and CI can often be a big influence as well. If someone is running pip with the cache disabled or in throw-away environments in CI, you can get a ton of downloads from the same source :/
[16:31:59] <benjaoming> It might also be that we shouldn't be thinking too much about "absolute" numbers in the counts, but more about relative counts.. the ups and downs over time. Like economics, there's a general inflation (the mirrors), purchasing parity (the upgrades and cache), financial speculation (the bots, those leeches) and then growth should be adjusted accordingly :)
[16:32:42] <dstufft> I've often thought that we should get rid of numbers on the pages and just put something that's relative or fuzzier than pure numbers
[16:32:58] <dstufft> I also think sometimes about how we could get something more like Debian's popcon
[16:33:27] <dstufft> but you have to decide what you mean by an installation then too :V is it 1 per machine? 1 per python version? 1 per virtual environment? :V
[16:34:19] <dstufft> benjaoming: not that it helps you in the short term, but if you have any particular insights you'd like to be able to see, you can open issues up on github.com/pypa/warehouse
[16:34:27] <dstufft> that's where PyPI 2.0 is happening
[16:48:07] <benjaoming> dstufft: thanks for the pointer, I'll have a look and see if I can contribute -- but maybe the responsibility would be for pip and other clients to be more explicit about "who they are". E.g. pip could say "I'm being run by a normal user" in its user agent header. Pip could also tell PyPI that it's being run with an --upgrade flag.
[16:50:08] <dstufft> benjaoming: it's hard to do that though, because you don't know that the --upgrade flag really applies here, for instance: I do pip install --upgrade foo, and I upgrade from foo 1.0 to foo 2.0; foo 2.0 includes a _new_ dependency on bar, so this is the first time bar was installed. Is that an "upgrade" or is it a new installation?
[16:55:34] <tdsmith> popularity percentiles would be interesting but potentially depressing
[17:01:17] <benjaoming> tdsmith: if you need more downloads for your package, you can try convincing other popular packages to depend on it.. kind of like how the academic citation system works :)
[17:05:48] <benjaoming> dstufft: yes, I think the "most optimal way" (since there's no possibility of a truly correct one) could be more information from pip, for instance reporting that the --upgrade flag was applied and to which of the downloads (dependencies) it was relevant.. then leave the interpretation of the statistics to the package owners. Pip could add a couple of extra HTTP headers every time it downloads?
[17:06:10] <benjaoming> ...just brainstorming
[18:08:56] <wsanchez> Is there a way to convince virtualenv to copy all of the system python into the environment? i.e. so it's totally self-contained
[18:09:33] <dstufft> wsanchez: No, but if you want the entire Python you probably just want to compile your own Python with --prefix?
[18:10:13] <dstufft> one of the core ideas of a virtual environment is that it's lighter weight than a complete copy
[18:10:27] <wsanchez> Yeah… was hoping for a way to do that without building python for every app
[18:10:42] <wsanchez> ok
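Roughly what the --prefix suggestion amounts to; the prefix path and package name are illustrative, and pip is assumed to be available via ensurepip on a recent Python 3 build:

    # build a private copy of Python under its own prefix
    ./configure --prefix=/opt/myapp/python
    make
    make install

    # then install the application into that copy instead of into a virtualenv
    /opt/myapp/python/bin/python3 -m pip install myapp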
[18:10:45] <dstufft> wsanchez: you might be looking for something like PyInstaller
[18:12:48] <wsanchez> ah, yeah maybe that. Thanks!
[18:14:12] <dstufft> wsanchez: no problem :)
[18:15:14] <dstufft> wsanchez: there's also pex if you are OK with depending on Python itself being installed, but you don't want to install things into that thing or modify that global environment
[18:16:13] <Yasumoto> wsanchez: https://github.com/pantsbuild/pex/blob/master/docs/whatispex.rst
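A small pex sketch for the curious; the package and entry point names are made up, and the resulting .pex still needs a Python interpreter on the target machine:

    pip install pex

    # bundle the app and its dependencies into a single executable zip
    pex myapp -e myapp.cli:main -o myapp.pex
    ./myapp.pex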