[06:36:59] <dw> hey. i've a package which uses stupid version numbering (0.nn), i'd like to fix that version numbering in v1.n.n, but in the meantime, i'd like to keep it until something worthy of v1.n.n happens. will distutils & friends handle version numbers like v0.nnn and treat them higher than v0.nn?
[06:37:24] <dw> currently at 0.89, which means this'll probably break by december :)
[06:38:44] <njs> yeah, version numbers are integers separated by dots
[06:47:22] <dw> that much should be obvious, but with packaging i've learned to find the obvious surprising ;)
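njs's point can be sanity-checked in a few lines: dot-separated version components compare as integers, not strings, so 0.100 sorts after 0.99. A minimal sketch (pip and setuptools actually apply the full PEP 440 rules via the `packaging` library; this hand-rolled parser only illustrates the numeric comparison):

```python
# Illustrative only: each dot-separated component compares as an
# integer, so "0.100" outranks "0.99" (real tools use PEP 440 rules).
def vparse(version):
    """Split a simple dotted version string into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))

assert vparse("0.100") > vparse("0.99")   # 100 > 99, so no December breakage
assert vparse("0.9") < vparse("0.89")     # but beware: 9 < 89 numerically
assert vparse("1.0.0") > vparse("0.999")  # any 1.x outranks every 0.x
print(vparse("0.89"))                     # -> (0, 89)
```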
[10:50:49] <linovia> Does the setup's long_description override the README file when uploading to PyPI? I'd like to keep the Django REST framework README.md and have a better PyPI page. I'm considering using pandoc to convert the markdown to reST in the long_description, but can't find any reference on how PyPI displays long_description vs the README
[11:03:21] <wiggy> linovia: no, but only because the README file is completely ignored :)
[11:03:45] <wiggy> normally people read README in setup.py and use the contents as long_description
[11:09:13] <linovia> Strange, we don't have a long_description and yet the README.md is displayed
[11:15:13] <Ivo> linovia, I'm guessing PyPI will try to find and display a README.* if there is no long_description, but PyPI only renders reStructuredText files to HTML.
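The pattern wiggy describes is usually a few lines in setup.py: read the README yourself and pass its contents as long_description. A sketch (the filename and project name are placeholders; since PyPI renders reST only, a Markdown README would need converting with pandoc/pypandoc first):

```python
# Sketch of the usual workaround: feed README contents to
# long_description so PyPI has something to render.
import io
from setuptools import setup

def read_long_description(path="README.rst"):
    with io.open(path, encoding="utf-8") as f:
        return f.read()

if __name__ == "__main__":
    setup(
        name="example-package",            # hypothetical project name
        version="0.1",
        long_description=read_long_description(),
    )
```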
[11:22:41] <linovia> And in some cases I just use non-Django classes too
[11:24:25] <Ivo> just need to have `install_requires=['django>=1.7'],`
[11:42:33] <neredsenvy> How can I put this line in requirements.txt: $ pip install -e git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery
[11:43:00] <neredsenvy> if I add only mezzanine_instagram_gallery it won't find it
[11:43:57] <neredsenvy> ok just adding -e git+https://.. to requirements worked : /
[11:44:03] <neredsenvy> not sure if everything will be ok
[11:54:35] <nedbat> neredsenvy: why do you need it to be -e ?
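For the record, the requirements.txt form that worked looks like this; the `-e` prefix requests an editable install (pip keeps a source checkout), and dropping it, as nedbat hints, usually also works for a VCS URL:

```text
# a VCS dependency needs the full URL, since the package is not on PyPI;
# -e makes it an editable source checkout
-e git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery

# without -e, pip installs it like a normal package
git+https://github.com/georgeyk/mezzanine-instagram-gallery.git#egg=mezzanine_instagram_gallery
```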
[22:22:52] <wh3ko19> I asked this over in #python already, but maybe someone here knows better - I'd like to build a .egg for numpy, scipy, scikit-learn and pandas that includes all required .so files. I can't use a wheel for this.
[22:32:20] <ngoldbaum> add support for it to pyspark?
[22:32:22] <wh3ko19> And it seems that's the only way I have to distribute additional dependencies to this Spark cluster, given that I don't have the ability to install anything on the clusters themselves.
[22:32:36] <ngoldbaum> although i guess you're constrained to a release pyspark version
[22:32:42] <wh3ko19> ngoldbaum: Probably not feasible, since I'd have to upgrade Spark on the cluster.
[22:33:02] <wh3ko19> And if I could do that, I could just install scipy directly :P
[22:33:10] <nanonyme> It probably still makes sense to help them implement it if it does not yet exist in trunk
[22:33:28] <nanonyme> May end up thanking yourself in a couple of years
[22:33:44] <wh3ko19> I have my own neglected open source libraries to deal with already :P
[22:34:54] <ngoldbaum> so i guess https://pythonhosted.org/setuptools/formats.html would be helpful here
[22:35:44] <wh3ko19> Hm... So like patch the egg you mean?
[22:36:52] <ngoldbaum> i mean construct the egg yourself following that spec
[22:37:06] <ngoldbaum> why does the egg need to combine all the deps?
[22:37:13] <ngoldbaum> it's not possible to ship multiple eggs?
[22:37:32] <wh3ko19> ngoldbaum: No no, they can be multiple eggs.
[22:37:52] <wh3ko19> Just numpy for whatever reason doesn't bundle up libblas.so or libatlas.so or whatever - has some system dependencies.
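The spec ngoldbaum points at boils down to: an .egg is a zip archive with an `EGG-INFO/` directory inside. A toy sketch of that shape (real eggs carry much more metadata; names and versions here are made up):

```python
# Illustrative only: assemble a minimal egg-shaped zip by hand.
import zipfile

def build_minimal_egg(path, name, version):
    pkg_info = "Metadata-Version: 1.0\nName: %s\nVersion: %s\n" % (name, version)
    with zipfile.ZipFile(path, "w") as egg:
        egg.writestr("EGG-INFO/PKG-INFO", pkg_info)      # egg metadata
        egg.writestr("%s/__init__.py" % name, "")        # the package itself
    return path

egg_path = build_minimal_egg("demo-0.1-py2.7.egg", "demo", "0.1")
print(zipfile.ZipFile(egg_path).namelist())
# -> ['EGG-INFO/PKG-INFO', 'demo/__init__.py']
```

Bundling the compiled .so dependencies would mean copying them into the archive next to the extension modules and fixing up their load paths, which is exactly the part no standard egg tool automates.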
[22:46:48] <njs> wh3ko19: there is no standard tool for doing what you want. Given that you're stuck writing a tool anyway, I would suggest spending that effort on fixing the problem properly instead of spending a similar amount of time hacking together something fragile and unreusable
[22:47:25] <njs> by teaching spark about wheels :-)
[22:47:35] <wh3ko19> Well, that would solve the problem for other people but not for me.
[22:47:59] <wh3ko19> Not to mention it seems like there may in fact be a standard way to do it using static linking.
[22:49:36] <njs> (I assume you're on linux from it being spark and from the mention of .so files -- if so then there is a standard way to do what you want for wheels; in fact those projects are currently working on providing exactly the wheels you want as a standard download. Relevant search terms would be "PEP 513" and "auditwheel". For eggs though, uh. I guess you could try emailing David Cournapeau; he's very busy but might know :-).)
[22:51:25] <wh3ko19> Yeah, it's on linux, but I can't update spark. If I could update spark I could also install them directly and not have the overhead of shipping out the scipy stack to every node in the cluster.
[22:52:23] <njs> I guess you could try downloading Enthought Canopy. They distribute everything using eggs. I suspect it will be a prohibitive amount of work to make them actually work for you though -- they have a really screwy installation method.
[22:53:25] <wh3ko19> Yeah, I think I'll just set up a static build.
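For anyone landing here via search: the wheel-side fix njs describes is roughly the following recipe (assumes a manylinux-style Linux build box with `auditwheel` installed; there is no equivalent tool for eggs):

```shell
# Build a wheel, then let auditwheel copy the external shared
# libraries (libblas, libatlas, ...) into the wheel and patch the
# rpaths so it is self-contained, per PEP 513.
pip wheel numpy -w dist/
auditwheel repair dist/numpy-*.whl -w wheelhouse/
```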