[17:50:04] <hackeron_> hey, question, I have this pipeline: [{"$project"=>{"name"=>1, :starts_at=>1, "year"=>{"$year"=>"$starts_at"}, "month"=>{"$month"=>"$starts_at"}, "day"=>{"$dayOfMonth"=>"$starts_at"}}}, {"$group"=>{"_id"=>{"year"=>"$year", "month"=>"$month", "day"=>"$day"}, "count"=>{"$sum"=>1}}}, {"$sort"=>{"count"=>-1}}, {"$match"=>{"timeline_id"=>"5cdbfd6c4caa410beb179b9a"}}] -- I'm trying to run it on a collection
[17:50:10] <hackeron_> of 9000 or so items like this: Event.unscoped.collection.aggregate(pipeline).entries (using Mongoid with Rails), but it has been running for minutes now. Any ideas why it is so slow, or how to speed it up?
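The likely culprit is stage order: the $match comes last, but by that point $project (and then $group) has already discarded timeline_id, so the match filters nothing and the server projects and groups the entire collection without using an index. A minimal sketch of the reordered pipeline, assuming timeline_id is a stored, indexed field and is an ObjectId rather than a string:

    pipeline = [
      # Filter first, so an index on timeline_id can be used and the
      # later stages only see this timeline's events.
      { '$match' => { 'timeline_id' => BSON::ObjectId.from_string('5cdbfd6c4caa410beb179b9a') } },
      { '$project' => {
          'name'      => 1,
          'starts_at' => 1,
          'year'      => { '$year'       => '$starts_at' },
          'month'     => { '$month'      => '$starts_at' },
          'day'       => { '$dayOfMonth' => '$starts_at' }
      } },
      { '$group' => {
          '_id'   => { 'year' => '$year', 'month' => '$month', 'day' => '$day' },
          'count' => { '$sum' => 1 }
      } },
      { '$sort' => { 'count' => -1 } }
    ]

    Event.unscoped.collection.aggregate(pipeline).entries

If timeline_id is actually stored as a plain string, keep the string form from the original; matching an ObjectId against a string field finds nothing.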
[18:16:37] <dino82> Quick question -- I have a replica set of 3 mongo nodes (version 4.0), how can I replicate the admin db to all nodes so users can authenticate against any of them?
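For the record: in a replica set the admin database replicates like any other, so a user created through the primary can authenticate against every member; no extra configuration is needed. A sketch with placeholder hosts, replica set name, and credentials:

    require 'mongo'

    client = Mongo::Client.new(
      ['mongo1:27017', 'mongo2:27017', 'mongo3:27017'],
      replica_set: 'rs0',     # placeholder replica set name
      database:    'admin'
    )

    # createUser is an ordinary server command: it writes on the primary
    # and the user document replicates to the secondaries on its own.
    client.database.command(
      createUser: 'appuser',    # placeholder user
      pwd:        'changeme',   # placeholder password
      roles:      [{ role: 'readWriteAnyDatabase', db: 'admin' }]
    )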
[18:27:57] <bgilb> my aggregate returns an array of objects that each have 2 arrays; how can i combine them into 1 object with just those 2 arrays?
[18:43:46] <bgilb> i got a little closer using unwind
[20:11:29] <GothAlice> bgilb: $unwind takes an array and emits a copy of the record for each value of the array. The opposite is $group: take multiple records and combine them into one based on _id criteria and the operations defined in the rest of the $group stage document.
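A sketch of that round trip with hypothetical field names (tags, category): $unwind fans each document out per array element, and $group with $push folds them back together:

    pipeline = [
      { '$unwind' => '$tags' },                  # one document per tag
      { '$group'  => {
          '_id'  => '$category',                 # the combine-by criteria
          'tags' => { '$push' => '$tags' }       # rebuild the array
      } }
    ]

    Event.collection.aggregate(pipeline).entries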
[20:12:17] <GothAlice> A bizarre example I often end up having to do: $lookup + $unwind because the $lookup will literally only ever match one foreign value.
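That pattern, sketched with hypothetical collection and field names: $lookup always yields an array, so when the join can only ever match one foreign document, an immediate $unwind turns the one-element array back into a plain sub-document:

    pipeline = [
      { '$lookup' => {
          'from'         => 'timelines',    # hypothetical foreign collection
          'localField'   => 'timeline_id',
          'foreignField' => '_id',
          'as'           => 'timeline'      # always an array, here 0 or 1 long
      } },
      { '$unwind' => '$timeline' }          # one-element array -> embedded doc
    ]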
[20:14:29] <bgilb> i ended up just handling it programmatically. the farthest i could get was an array that looked like [{field1: {}}, {field1: {}}, {field2: {}}]
[20:14:46] <bgilb> so i just loop through and check if it contains field1 or field2; if so, add the BsonDocument value to another list
[20:14:55] <bgilb> mine was more complicated because i ended up with 2 arrays
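In rough Ruby terms (bgilb was on the C# driver, hence BsonDocument; results and the field names here are assumptions), the workaround partitions the result documents by which key each one carries:

    field1_docs = []
    field2_docs = []

    results.each do |doc|                     # results: the aggregate output
      field1_docs << doc['field1'] if doc.key?('field1')
      field2_docs << doc['field2'] if doc.key?('field2')
    end

    combined = { 'field1' => field1_docs, 'field2' => field2_docs }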
[20:15:30] <GothAlice> Reminds me of: https://www.javaworld.com/article/2088406/how-to-screw-up-your-mongodb-schema-design.html
[20:15:50] <GothAlice> “Identifying documents is key.”