#pypa logs for Friday the 12th of February, 2016

[06:09:54] <ngaio> for a python end user app, what is the preferred way to create a man page these days? pod2man?
[06:11:03] <Ivo> ngaio, the sphinx project has a manual page builder
[06:11:23] <Ivo> if you construct your documentation as a sphinx rst project
[06:11:48] <ngaio> Ivo, is sphinx for more than just module developers?
[06:12:04] <Ivo> yes
[06:12:42] <Ivo> people use it to document apps, apis, tutorials / guides, etc
[06:13:16] <ngaio> can I just write my man page as an rst file and it will convert it for me?
[06:13:40] <Ivo> that's the basic gist, yes
[06:14:12] <Ivo> http://www.sphinx-doc.org/en/stable/config.html#confval-man_pages
[06:14:12] <ngaio> cool, thanks :-)
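
For reference, Ivo's link boils down to a single conf.py value: man_pages is a list of tuples naming the source document, the page name, a description, the authors, and the manual section. A minimal sketch, with an invented project name "myapp":

    # conf.py -- man page configuration (project name and author are illustrative)
    man_pages = [
        ('index',                        # master document, without the .rst extension
         'myapp',                        # name of the generated man page
         'My App command line tool',     # one-line description
         ['Jane Doe'],                   # author list
         1),                             # manual section (1 = user commands)
    ]

Running "sphinx-build -b man docs/ build/man" then writes build/man/myapp.1.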
[06:36:59] <dw> hey. i've a package which uses stupid version numbering (0.nn), i'd like to fix that version numbering in v1.n.n, but in the meantime, i'd like to keep it until something worthy of v1.n.n happens. will distutils & friends handle version numbers like v0.nnn and treat them higher than v0.nn?
[06:37:24] <dw> currently at 0.89, which means this'll probably break by december :)
[06:38:44] <njs> yeah, version numbers are integers separated by dots
[06:47:22] <dw> that much should be obvious, but with packaging i've learned to find the obvious surprising ;)
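
For the record, the comparison dw is asking about can be checked directly against the machinery pip and setuptools use; each dot-separated component is compared as an integer, so 0.100 sorts above 0.99:

    >>> from pkg_resources import parse_version
    >>> parse_version("0.100") > parse_version("0.99") > parse_version("0.9")
    True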
[10:50:49] <linovia> Does the setup's long_desc override the README file when uploading to PyPI? I'd like to keep the Django REST framework README.md and have a better PyPI page. I'm considering using pandoc to convert the Markdown to reST in the long_desc, but I can't find a reference on how PyPI displays the long_desc vs the README
[11:03:21] <wiggy> linovia: no, but only because the README file is completely ignored :)
[11:03:45] <wiggy> normally people read README in setup.py and use the contents as long_description
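
The pattern wiggy describes is a few lines of setup.py; the project name and file names here are illustrative:

    # setup.py -- reuse the README as the PyPI long_description
    from setuptools import setup, find_packages

    with open('README.rst') as f:
        readme = f.read()

    setup(
        name='myproject',            # illustrative
        version='1.0',
        packages=find_packages(),
        long_description=readme,     # PyPI renders this on the project page
    )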
[11:09:13] <linovia> Strange, we don't have a long_description and yet the README.md is displayed
[11:12:31] <Ivo> linovia, which package
[11:15:13] <Ivo> linovia, I'm guessing PyPI will try to find and display a README.* if there is no long_description, but PyPI only renders reStructuredText files as HTML.
[11:16:56] <linovia> Ivo: djangorestframework
[11:17:30] <linovia> Fine if it goes with long desc first
[11:17:40] <linovia> Perfect even
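
The pandoc conversion linovia floated earlier is usually wired into setup.py via pypandoc, falling back to the raw Markdown when pandoc isn't available; a sketch, assuming the pypandoc package is what's used:

    # setup.py fragment: convert README.md to reST for the PyPI page
    try:
        import pypandoc
        long_description = pypandoc.convert('README.md', 'rst')
    except (ImportError, OSError):
        # pypandoc or pandoc missing: fall back to the unconverted Markdown
        with open('README.md') as f:
            long_description = f.read()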
[11:18:22] <Ivo> Highly suggest learning rst so you can use sphinx as a documentation tool :)
[11:18:31] <Ivo> because sphinx is freakin' awesome as that
[11:19:14] <wiggy> linovia: why not use find_packages instead of writing your own get_packages?
[11:20:14] <linovia> wiggy: no idea, might be historical
[11:20:51] <Ivo> Also weird that the project "djangorestframework" doesn't have django as an install dependency...
[11:20:52] <Ivo> lol
[11:21:30] <linovia> Ivo: that's the integration job
[11:21:50] <Ivo> eh?
[11:22:41] <linovia> And in some cases I just use non-Django classes too
[11:24:25] <Ivo> just need to have `install_requires=['django>=1.7'],`
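
That is, inside the setup() call; everything but the dependency pin is illustrative here:

    # setup.py fragment: declare Django as a runtime dependency
    from setuptools import setup

    setup(
        name='djangorestframework',
        install_requires=['django>=1.7'],
        # ... remaining arguments unchanged
    )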
[11:42:33] <neredsenvy> How can I put this line: $ pip install -e git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery
[11:42:41] <neredsenvy> Into my requirements.txt
[11:42:42] <neredsenvy> file
[11:43:00] <neredsenvy> if I add only mezzanine_instagram_gallery it won't find it
[11:43:57] <neredsenvy> ok just adding -e git+https://.. to requirements worked : /
[11:44:03] <neredsenvy> not sure if everything will be ok
[11:54:35] <nedbat> neredsenvy: why do you need it to be -e ?
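
Both spellings are valid requirements.txt lines; the difference nedbat is probing is that -e keeps an editable source checkout around, while the plain VCS URL installs a one-shot snapshot:

    # editable: pip keeps a git checkout and installs it in develop mode
    -e git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery
    # non-editable: clone, build, and install like a normal package
    git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery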
[22:22:52] <wh3ko19> I asked this over in #python already, but maybe someone here knows better - I'd like to build a .egg for numpy, scipy, scikit-learn and pandas that includes all required .so files. I can't use a wheel for this.
[22:22:53] <wh3ko19> Any ideas?
[22:30:26] <tdsmith> i think you should use a wheel
[22:30:43] <wh3ko19> Good idea.
[22:31:20] <tdsmith> why are wheels bad?
[22:31:46] <wh3ko19> They're great, actually, but they're not supported by pyspark's --py-files flag.
[22:32:04] <tdsmith> gotcha
[22:32:20] <ngoldbaum> add support for it to pyspark?
[22:32:22] <wh3ko19> And it seems that's the only way I have to distribute additional dependencies to this Spark cluster, given that I don't have the ability to install anything on the clusters themselves.
[22:32:36] <ngoldbaum> although i guess you're constrained to a release pyspark version
[22:32:42] <wh3ko19> ngoldbaum: Probably not feasible, since I'd have to upgrade Spark on the cluster.
[22:33:02] <wh3ko19> And if I could do that, I could just install scipy directly :P
[22:33:10] <nanonyme> It probably still makes sense to help them implement it if it does not yet exist in trunk
[22:33:28] <nanonyme> May end up thanking yourself in a couple of years
[22:33:44] <wh3ko19> I have my own neglected open source libraries to deal with already :P
[22:34:54] <ngoldbaum> so i guess https://pythonhosted.org/setuptools/formats.html would be helpful here
[22:35:44] <wh3ko19> Hm... So like patch the egg you mean?
[22:36:52] <ngoldbaum> i mean construct the egg yourself following that spec
[22:37:06] <ngoldbaum> why does the egg need to combine all the deps?
[22:37:13] <ngoldbaum> it's not possible to ship multiple eggs?
[22:37:32] <wh3ko19> ngoldbaum: No no, they can be multiple eggs.
[22:37:52] <wh3ko19> Just numpy for whatever reason doesn't bundle up libblas.so or libatlas.so or whatever - has some system dependencies.
[22:38:11] <ngoldbaum> ah, sure
[22:38:22] <ngoldbaum> i think it's possible to statically link BLAS when you build numpy
[22:38:43] <wh3ko19> Yeah, that's what I need. Didn't think to phrase it that way.
[22:38:46] <ngoldbaum> your spark server doesn't see the system BLAS installation?
[22:39:08] <ngoldbaum> http://stackoverflow.com/questions/16093910/numpy-and-scipy-static-vs-dynamic-loading
[22:39:10] <wh3ko19> Haven't deployed to spark yet, but I've got a build machine with similar problems that is not seeing it.
[22:39:18] <wh3ko19> Using that as a testbed.
[22:39:42] <wh3ko19> I'll dig into this.
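
For reference, numpy's build reads BLAS/LAPACK locations from a site.cfg next to its setup.py; pointing it at a directory containing only static ATLAS libraries (paths below are assumptions) forces the linker to embed BLAS, after which "python setup.py bdist_egg" yields an egg with no runtime dependency on the system BLAS:

    # site.cfg alongside numpy's setup.py (paths are illustrative)
    [atlas]
    library_dirs = /opt/atlas/lib
    include_dirs = /opt/atlas/include
    atlas_libs = lapack, f77blas, cblas, atlas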
[22:46:48] <njs> wh3ko19: there is no standard tool for doing what you want. Given that you're stuck writing a tool anyway, I would suggest spending that effort on fixing the problem properly instead of spending a similar amount of time hacking together something fragile and unreusable
[22:47:18] <wh3ko19> njs: Fix it properly how?
[22:47:25] <njs> by teaching spark about wheels :-)
[22:47:35] <wh3ko19> Well, that would solve the problem for other people but not for me.
[22:47:59] <wh3ko19> Not to mention it seems like there may in fact be a standard way to do it using static linking.
[22:49:36] <njs> (I assume you're on linux from it being spark and from the mention of .so files -- if so then there is a standard way to do what you want for wheels; in fact those projects are currently working on providing exactly the wheels you want as a standard download. Relevant search terms would be "PEP 513" and "auditwheel". For eggs though, uh. I guess you could try emailing David Cournapeau; he's very busy but might know :-).)
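
The wheel-side workflow njs is pointing at, very roughly and assuming a PEP 513 era manylinux toolchain, looks like:

    # build a linux wheel, then graft its external .so dependencies into it
    pip wheel numpy -w dist/
    auditwheel repair dist/numpy-*.whl -w wheelhouse/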
[22:51:20] <njs> well, good luck :-)
[22:51:25] <wh3ko19> Yeah, it's on linux, but I can't update spark. If I could update spark I could also install them directly and not have the overhead of shipping out the scipy stack to every node in the cluster.
[22:51:30] <wh3ko19> On every run.
[22:51:42] <njs> fair enough
[22:52:23] <njs> I guess you could try downloading Enthought Canopy. They distribute everything using eggs. I suspect it will be a prohibitive amount of work to make them actually work for you though -- they have a really screwy installation method.
[22:53:25] <wh3ko19> Yeah, I think I'll just set up a static build.