PMXBOT Log file Viewer

#mongodb logs for Wednesday the 28th of March, 2018

[00:53:02] <Jonno_FTW> sivy: that entirely depends on the aggregation you are doing and how it's handled by the query optimiser, do some tests first to see how long a typical request takes
[00:54:45] <sivy> Jonno_FTW: hello! yes, I’ve spent several days now constructing tests, and the aggregation - as simple as i can make it - takes anywhere from 150+s to over 500s
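A minimal sketch of the kind of timing test being discussed, using the Node.js driver; the collection name and pipeline are hypothetical, the point is only to measure one representative aggregation end to end.

```typescript
import { Db } from "mongodb";

// Hypothetical collection and pipeline; times a single representative
// aggregation so typical request latency can be judged.
async function timeAggregation(db: Db): Promise<void> {
  const pipeline = [
    { $match: { status: "active" } },
    { $group: { _id: "$accountId", total: { $sum: "$amount" } } },
  ];

  const start = Date.now();
  await db.collection("events").aggregate(pipeline).toArray();
  console.log(`aggregation took ${Date.now() - start} ms`);
}
```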
[00:54:58] <sivy> it’s just not a good use case for mongo
[00:55:13] <Jonno_FTW> sivy: you could always email the results to the user
[00:55:15] <sivy> (it’s handled our non-aggregation work excellently, no complaints)
[00:55:26] <Jonno_FTW> (I have the same problem)
[00:55:29] <sivy> Jonno_FTW: lol not as an API backend I’m not :)
[00:55:47] <Jonno_FTW> you can always buy more processing power
[00:56:02] <sivy> the more I bang on it, the more I see that the data simply fits the relational model much better
[00:56:06] <Jonno_FTW> I just ended up writing the data processing part in python and using multiprocessing
[00:56:31] <Jonno_FTW> along with numpy, much easier than writing aggregations
[00:56:48] <sivy> no slam on mongodb at all - but it’s looking like our need for extensible fields (dicts, lists) can be met nearly as well in (mysql+postgres)+json
[00:57:05] <sivy> ie, 95% of the top-level fields are static
[00:57:16] <Jonno_FTW> hm
[00:57:21] <Jonno_FTW> most of my data is non-relational
[00:57:23] <sivy> and the ones that typically take deeper dicts are limited
[00:57:33] <sivy> i hear you
[00:57:44] <Jonno_FTW> works well for my 2M+ rows
[00:57:50] <sivy> that’s great
[00:57:59] <Jonno_FTW> mongo can be a pain if you don't have indexes set up correctly
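As a rough illustration of the index point (field names are hypothetical): an aggregation whose $match or $sort stage isn't covered by an index has to scan the whole collection.

```typescript
import { Db } from "mongodb";

// Hypothetical fields: a compound index covering the usual $match / $sort keys.
async function ensureIndexes(db: Db): Promise<void> {
  await db.collection("events").createIndex({ accountId: 1, createdAt: -1 });
}
```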
[00:58:01] <sivy> as long as i don’t have to cross collections, it’s working well for us
[00:58:14] <Jonno_FTW> you can denormalise your data
[00:58:37] <Jonno_FTW> but that takes time and might break things
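A sketch of what denormalising could look like here, with hypothetical document shapes: the fields that would otherwise require a second collection are copied into each document at write time.

```typescript
// Hypothetical shapes: instead of joining "orders" against "customers" at
// read time, the commonly needed customer fields are duplicated into each
// order document when it is written.
interface DenormalisedOrder {
  _id: string;
  amount: number;
  customer: {
    customerId: string; // kept so the copy can be refreshed later
    name: string;       // duplicated from the customers collection
    region: string;     // duplicated from the customers collection
  };
}
```

Reads then need only one query, but every customer update has to be propagated into the orders that embed it, which is the "takes time and might break things" part.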
[01:01:37] <sivy> in a nutshell - requirements dictate being able to query each collection separately, and across them
[01:01:54] <sivy> i cannot integrate one into the other and satisfy those reqs
[01:02:07] <Jonno_FTW> you need relational then
[01:02:14] <sivy> i’m not complaining about mongo - but an aggregation isn’t the solution :)
[01:02:25] <Jonno_FTW> you can look into mongoose but the db isn't designed for going across collections
[01:02:43] <sivy> I know, and I don’t fault it
[01:02:56] <sivy> we have other services using it to great affect
[01:03:02] <sivy> s/affect/effect/
[01:03:05] <sivy> (dammit)
[01:04:47] <Jonno_FTW> well you can always write a tool in your language of choice that does the joins
[01:05:05] <Jonno_FTW> make it run in parallel and you have a decent product
[01:06:48] <sivy> I do have some code that does it in two queries, which performance-wise lands much closer to acceptable, but still falls well short of SQL performance, even on unoptimized code/hardware
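A rough sketch of that two-query, application-side join with the Node.js driver; collection and field names are hypothetical.

```typescript
import { Db, ObjectId } from "mongodb";

// First query pulls the primary documents, the second fetches only the
// referenced documents in one round trip, and the join happens in memory.
async function ordersWithCustomers(db: Db) {
  const orders = await db.collection("orders").find().toArray();

  const customerIds = [...new Set(orders.map(o => String(o.customerId)))];
  const customers = await db
    .collection("customers")
    .find({ _id: { $in: customerIds.map(id => new ObjectId(id)) } })
    .toArray();

  const byId = new Map<string, (typeof customers)[number]>();
  for (const c of customers) byId.set(String(c._id), c);

  return orders.map(o => ({ ...o, customer: byId.get(String(o.customerId)) }));
}
```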
[01:09:09] <sivy> thanks
[07:56:35] <Derick> moin
[13:49:01] <Bin4ry> Hi guys, I currently have an app on a remote server that points the mongodb engine to localhost:27017. But when I access the website, it's actually reading data from my own localhost. How is that possible?
[13:50:10] <Derick> does it run in the browser?
[13:50:55] <Bin4ry> @Derick, yes it is an SSR website.
[13:51:02] <Derick> I don't know what SSR is
[13:51:12] <Bin4ry> @Derick: server-side rendered
[13:51:30] <Derick> sorry, what app is this, which language?
[13:51:47] <Bin4ry> A JavaScript web app, Vue.js, Nuxt.js
[13:52:01] <Derick> so what's the server side part of it?
[13:52:25] <Bin4ry> I'm using a framework called FeathersJS, it's isomorphic
[13:53:16] <Derick> if your app runs in the browser, then from the browser's perspective, localhost is the machine it runs on
[13:53:43] <Bin4ry> I see, so I should set the IP to remote IP?
[13:54:07] <Derick> yes, and make damned sure you use SSL and use authentication
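A minimal sketch of the server-side connection along those lines; host, database, and credential names are placeholders, and the TLS and auth options assume the server has them enabled.

```typescript
import { MongoClient } from "mongodb";

// Placeholder credentials and host; tls=true and authSource=admin follow the
// "use SSL and use authentication" advice above.
const uri =
  "mongodb://appUser:changeMe@db.example.com:27017/appdb?tls=true&authSource=admin";

async function getDb() {
  const client = new MongoClient(uri);
  await client.connect();
  return client.db("appdb");
}
```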
[13:55:43] <Bin4ry> @Derick: so I should create new credentials for my collections?
[14:02:26] <Derick> new? you just need to make sure not everybody on the internet has access to your data
[14:09:50] <Bin4ry> @Derick: I'm not sure if the credentials will leak or not. Is it safe to create a user that has readWrite on all the collections and then point my web app at it?
[14:10:32] <Derick> can you read the code in your web app as a random user?
[14:14:16] <Bin4ry> @Derick: I suppose not.
[15:04:45] <Bin4ry> One question: if I createUser in the admin database with the root privilege, can that admin account read/write to other collections?
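On the credentials question: a root user created in the admin database can read and write every database, but for the web app a user scoped to readWrite on its own database is the safer choice. A sketch via the createUser command, with placeholder names:

```typescript
import { MongoClient } from "mongodb";

// Run once over an administrative connection; the app then connects as
// "appUser", which can read and write only the "appdb" database.
async function createAppUser(client: MongoClient): Promise<void> {
  await client.db("appdb").command({
    createUser: "appUser",
    pwd: "changeMe",
    roles: [{ role: "readWrite", db: "appdb" }],
  });
}
```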
[16:59:37] <SilentByte> I'm using the node.js MongoDB driver and trying to pull an ObjectId from an array in a subdocument like so: update = {$pull: {"scene.viewerIds": viewerId}} where viewerId is an ObjectId. This isn't working. Nothing gets pulled. Running the same exact update by hand in the MongoDB interpreter works fine. What gives?
[17:00:30] <SilentByte> Using findOneAndUpdate if that matters...
[17:00:54] <SilentByte> Also tried changing the $pull to a $push and the $push works
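A common cause of exactly this symptom in the Node.js driver is that the value is actually a hex string (for example, parsed straight out of a request) rather than an ObjectId instance, even though it prints identically; $pull then compares strings against stored ObjectIds and removes nothing, while the hand-typed shell test works because the value is an ObjectId there. A hedged sketch of that fix, with hypothetical names:

```typescript
import { Collection, ObjectId } from "mongodb";

// If viewerId arrives as a plain string, the $pull silently matches nothing;
// converting it to an ObjectId makes the comparison match the stored values.
async function removeViewer(scenes: Collection, sceneId: string, viewerId: string) {
  return scenes.findOneAndUpdate(
    { _id: new ObjectId(sceneId) },
    { $pull: { "scene.viewerIds": new ObjectId(viewerId) } }
  );
}
```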