PMXBOT Log file Viewer

#mongodb logs for Wednesday the 11th of September, 2019

[05:46:51] <Wulf> I've got a collection with rows like {"src": 10, "seq": 4819203, "keys": "values"}. There are ~ 100 different values for "src". Each (src, seq) is unique. How can I do a query like "Give me the last 10 rows for each src"? "last" would here mean the rows with the largest seq values.
[08:07:59] <_aeris_> hello here!
[08:08:26] <_aeris_> i need mongodb on an arm64 machine, but there is no debian package for this arch, so i'm trying to build the package myself
[08:08:38] <_aeris_> but there is no official documentation or readme about the process
[08:08:58] <_aeris_> the standard "dpkg-buildpackage -us -uc -b" doesn't work :(
[08:09:12] <_aeris_> what's the official way to build the deb?
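Not answered in the log; as a hedged aside, MongoDB's server is built with SCons (driven by buildscripts/scons.py in the source tree) rather than a plain debian/rules flow, which is why dpkg-buildpackage alone fails, and the distro packaging is handled by scripts under buildscripts/ (packager.py and friends). The invocation below is only an illustration; targets and required flags differ between releases, so check docs/building.md in the exact 4.x tree being built.

    # Illustrative only: build the core server binaries from a source checkout.
    # arm64 may need additional toolchain/architecture options on top of this.
    python3 buildscripts/scons.py --disable-warnings-as-errors mongod mongos mongo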
[12:56:55] <Wulf> I've got a collection with rows like {"src": 10, "seq": 4819203, "keys": "values"}. There are ~ 100 different values for "src". Each (src, seq) is unique. How can I do a query like "Give me the last 10 rows for each src"? "last" would here mean the rows with the largest seq values.
[13:00:03] <GothAlice> Wulf: I think that’s the fourth time I’ve seen that question copy/pasted? A very brief DDG/Google search using the name of the operation you are trying to perform instantly found: https://docs.mongodb.com/manual/reference/operator/projection/slice/
[13:03:50] <Wulf> GothAlice: should've been third time. Thanks, will have a look at slice.
[13:05:22] <GothAlice> Specifically, combine $slice as the final $project stage after an aggregate $group. (Grouping on your “src” field.)
[13:05:44] <GothAlice> Noting that there is no definition of “last”; you have not specified a sort.
[13:06:10] <GothAlice> (So do that, too. ;)
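A minimal sketch of the pipeline GothAlice describes, written for the mongo shell: sort to define "last", $group on src while pushing whole documents, then $slice inside a final $project (here $slice is the aggregation array expression rather than the query-projection operator the link points to, but the idea is the same). The collection name "mycoll" is invented; the field names and the limit of 10 come from the question.

    db.mycoll.aggregate([
      // Define "last": within each src, put the largest seq values first.
      { $sort: { src: 1, seq: -1 } },
      // Collect the already-sorted documents per src.
      { $group: { _id: "$src", rows: { $push: "$$ROOT" } } },
      // Keep only the first 10 per group, i.e. the 10 largest seq values.
      { $project: { rows: { $slice: ["$rows", 10] } } }
    ])

Pushing $$ROOT buffers whole documents per group, so on a large collection it may be necessary to project down to the needed fields before grouping, or to pass allowDiskUse: true.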
[20:23:50] <cthulchu> hey folks! I'm using the Adobe Launch API to speed up my work with large containers. The API returns complex json objects with a complex yet standard structure. Currently I parse them into a csv, open it with excel, modify, save the csv, parse it back to json and bulk upload. It's a totally temporary solution. Also, the complexity of these objects has to be increased beyond the capabilities of a table.
[20:24:19] <cthulchu> so what I'm thinking of doing is having a local mongo instance that would accept the whole json and give me a comfortable interface to do magic with it
[20:24:36] <cthulchu> and then I could get the whole json out of it and upload it back to Launch
[20:25:38] <cthulchu> sounds reasonable?
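A hedged sketch of the round trip cthulchu describes, using the stock import/export tools; the database name "launchwork", the collection "rules", and the file names are made up for illustration, and the Adobe Launch API side is left out.

    # Load the JSON returned by the Launch API into a local collection.
    mongoimport --db launchwork --collection rules --jsonArray --file container.json

    # ... edit the documents with the mongo shell, Compass, or any driver ...

    # Dump the edited documents back out as JSON for the bulk upload.
    mongoexport --db launchwork --collection rules --jsonArray --out container.edited.json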
[21:12:00] <anamok> hi
[21:13:11] <anamok> On Manjaro I installed mongodb-bin 4.2.0-1 and mongodb-tools-bin 4.2.0-1, but when I launch "mongo", the MongoDB server version is still 3.6.12.
[21:13:43] <anamok> How could I remove the old server? How to update to 4.2 properly?
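No answer appears in the log; as a hedged diagnostic sketch, a version mismatch like this usually means an older mongod is still the one running, so it helps to check which binary the shell is actually talking to (the service name on Manjaro may differ from the one shown).

    mongod --version                 # version of the mongod binary on PATH
    mongo --eval 'db.version()'      # version of the server the shell connects to
    systemctl status mongodb         # which unit, and therefore which binary, is running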
[22:17:37] <neoromantique> aloha, I wonder if there are some tools that would be of help before I start scripting it
[22:18:06] <neoromantique> I have hourly backups of mongodb (basically mongodumps), and I want to be able to run queries against them from time to time when I'm searching for something
[22:18:51] <neoromantique> for ex: I know an object was modified on the 24th of April, and I have 24 dumps from that time, so I want to run a query against different backups from that day
[22:19:19] <neoromantique> How would one tackle it, other than the obvious but tedious mongorestore -d %date%?
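No replies appear in the log; as a hedged sketch of automating the "tedious" route, each hourly dump can be restored into its own throwaway database (mongorestore can remap namespaces with --nsFrom/--nsTo) and then queried in one loop from the shell. The paths, the source database name "mydb", the collection "things", and the query are invented for illustration.

    # Restore every hourly dump from the day into a database named after it.
    for dump in /backups/2019-04-24/*; do
      stamp=$(basename "$dump")
      mongorestore --nsFrom 'mydb.*' --nsTo "restore_${stamp//[^A-Za-z0-9]/_}.*" "$dump"
    done

    # Run the same query against every restored snapshot.
    mongo --quiet --eval '
      db.getMongo().getDBNames().forEach(function (name) {
        if (name.indexOf("restore_") !== 0) return;
        printjson({ snapshot: name,
                    doc: db.getSiblingDB(name).things.findOne({ name: "example" }) });
      })
    '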