PMXBOT Log file Viewer


#mongodb logs for Friday the 15th of January, 2021

[07:32:02] <micw> Hi
[07:35:22] <micw> I have a (growing) dataset of about 150 million documents. A document has ~10 fields, and all documents have a similar structure. I need to sort by one column (timestamp in most cases) while I filter on 0..7 of the other fields - requests are paged to a maximum of 100 per page. (How) can this be done efficiently with mongodb? Or is mongo not a good solution for that kind of workload?
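The workload micw describes (equality filters on a few fields, a sort on the timestamp, pages of 100) would look roughly like the following mongo shell query. This is only a sketch; the collection name `events` and the field names `ts`, `host`, and `level` are hypothetical, since the log never names the actual schema:

```javascript
// Hypothetical paged query: filter on two fields, sort by timestamp,
// return the third page of 100 documents.
db.events.find({ host: "web-01", level: "error" })
  .sort({ ts: -1 })    // newest first
  .skip(2 * 100)       // pages 1 and 2 already served
  .limit(100)
```

Note that skip-based paging degrades on deep pages; range-based paging (filtering on `ts` less than the last value seen) is the usual alternative for large collections.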
[16:35:26] <mahmoudajawad> micw: are you getting bad performance, or are you looking for better performance? I'm no expert on either; I'm just curious what the background of your question is.
[17:48:36] <d4rkp1r4t3> micw: might be a good use case for wildcard indexes. https://docs.mongodb.com/manual/core/index-wildcard/
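The wildcard indexes d4rkp1r4t3 links to index every field (or a chosen subset) under a single index definition, which fits the "filter on any of 0..7 fields" pattern. A minimal sketch, with the collection name `events` assumed for illustration:

```javascript
// Wildcard index over all document fields, per the linked MongoDB docs.
db.events.createIndex({ "$**": 1 })

// Or restrict it to a subtree of the document:
db.events.createIndex({ "attributes.$**": 1 })
```

A caveat worth checking in the docs: a wildcard index serves predicates on one field at a time, so it may not cover a multi-field filter combined with a sort on a different field as efficiently as a purpose-built compound index.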
[19:49:03] <micw> I've not made a final decision for mongodb yet, I just dump my data into it. I'm going to write a UI in a few weeks; until then I need to decide if I stay with mongo or search for something else.
[19:49:26] <micw> currently I have an index on the timestamp and on each column I'd like to filter on
[19:49:52] <micw> but that gives bad performance when I filter on 2 columns and sort by date
[19:50:30] <micw> before I dig deeper into multi-column indices or something, I'd just like to ask if that's a good kind of workload / use case for MongoDB