[10:18:39] <weeb1e> Can anyone tell me how I can group by every X number of documents?
[10:20:16] <weeb1e> I have the following query and I would like to group every X number of results and reduce each group to either an average or a maximum: db.server_stats.aggregate([{$match: {ip: ip, port: port}}, {$sort: {at: -1}}, {$limit: 60}, {$project: {at: 1, count: 1}}])
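The aggregation framework of that era has no direct "every N documents" grouping, but if the stats are sampled at a regular interval, grouping by a fixed time window gives roughly the same buckets. A minimal sketch, assuming `at` is a numeric epoch-milliseconds timestamp and that a 5-minute window stands in for "every X results":

    // width of each bucket in milliseconds (hypothetical: 5 one-minute samples per group)
    var bucketMs = 5 * 60 * 1000;
    db.server_stats.aggregate([
        { $match: { ip: ip, port: port } },
        { $sort: { at: -1 } },
        { $limit: 60 },
        { $project: { at: 1, count: 1 } },
        // bucket key = at rounded down to the nearest bucketMs
        { $group: {
            _id: { $subtract: [ "$at", { $mod: [ "$at", bucketMs ] } ] },
            avgCount: { $avg: "$count" },
            maxCount: { $max: "$count" }
        } },
        { $sort: { _id: -1 } }
    ]);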
[14:12:44] <evilC> Hi, can anyone explain to me what the deal is with mongodb (I am using the version that came with Meteor) and non-zero indexed arrays?
[14:13:32] <evilC> i.e. I have myvar[2000]...[2014], and when it's stored in mongo, [0]...[1999] are shown as well
[14:13:51] <evilC> or at least they are in my browser debug window
[14:14:04] <evilC> (but not in the array before i stored it)
[14:18:05] <evilC> ok, so after some googling, it looks like this is called a "sparse array".
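The underlying issue is most likely that BSON arrays are dense: a JavaScript array that only has indexes 2000–2014 set gets its holes (0–1999) filled in when it is serialized. A minimal sketch of the behaviour and a common workaround, using made-up collection and field names:

    // Sparse JS array: only indexes 2000..2014 exist in memory,
    // but BSON arrays are dense, so 0..1999 come back as null after a round trip.
    var byYearArray = [];
    for (var y = 2000; y <= 2014; y++) {
        byYearArray[y] = { total: 0 };
    }
    db.stats.insert({ _id: "array-version", years: byYearArray });

    // Workaround: store an object keyed by the index instead of an array.
    var byYearObject = {};
    for (var y = 2000; y <= 2014; y++) {
        byYearObject[y] = { total: 0 };
    }
    db.stats.insert({ _id: "object-version", years: byYearObject });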
[16:20:20] <murosai> meh, a simple query in a very small database is taking almost 200ms :/ i wonder if i could speed up things somehow
[16:20:43] <murosai> granted i'm running this on a Raspberry Pi, but it still seems kind of slow
[16:26:08] <kali> murosai: show us the query and an explain(), but Pis are good at playing video and lighting bulbs. not sure about big data :)
[16:28:49] <murosai> hold on let me first try to measure the query time more accurately
[16:30:09] <murosai> can mongodb help me in this respect somehow?
[16:30:34] <murosai> ..or is that what the explain() does?
[16:37:37] <murosai> okay hm, i think it's not mongodb that's slow..
[16:39:36] <kali> explain() will give you an execution plan and some stats
[16:40:03] <kali> you can tail the log to see the slow (>100ms by default) queries
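For reference, a minimal sketch of both suggestions in the mongo shell (collection and field names here are made up):

    // Execution plan and basic stats for a specific query
    db.mycollection.find({ someField: "value" }).explain()

    // Profile (and log) operations slower than 100 ms; level 1 = slow ops only
    db.setProfilingLevel(1, 100)

    // Most recent slow operations captured by the profiler
    db.system.profile.find().sort({ ts: -1 }).limit(5).pretty()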
[17:21:55] <raminnoodle> Can someone take a look at my query and results and tell me how I can adjust my query to get the results I'm looking for? It's all in the example: http://pastebin.com/B9jqtACM
[18:55:00] <frostyfrog> Hello, does anyone know how (if it's possible) to connect to a mongo database with mongoclient using a javascript object only?
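If "a javascript object only" means avoiding the mongodb:// connection-string, the Node.js native driver of that era also accepted Server/Db objects. A rough sketch under that assumption (database and collection names are made up):

    var mongodb = require('mongodb');

    // Build the connection from objects instead of a URI string
    var db = new mongodb.Db('mydb', new mongodb.Server('localhost', 27017), { w: 1 });

    db.open(function (err, db) {
        if (err) throw err;
        db.collection('test').findOne({}, function (err, doc) {
            console.log(doc);
            db.close();
        });
    });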
[18:58:57] <gx> anyone cool wanna help me out? I'm trying to get mongo working on my local IIS server for development. I have mongod running, i enabled php_mongo.dll and everything appears to be fine...
[18:59:11] <gx> but when i hit my website, i get Fatal error: Class 'MongoClient' not found in C:\Users\willi_000\Documents\repos\jobminx\app\libraries\src\JobMinx\Bootstrap.php on line 37
[18:59:38] <gx> it works fine on the deployment server (lamp)
[18:59:46] <frostyfrog> did you install the mongodb php module?
[18:59:56] <gx> yeah, i added the dll to php.ini and reset
[19:00:04] <gx> jumped through hoops to get that working but it's installed
[19:00:20] <gx> let me find php.ini output actually
[19:01:37] <frostyfrog> That's where I'd start looking then :)
[19:01:50] <gx> ok so then... i've been having this other issue. i downloaded a ton of the dll files for diff versions of mongo, put them in php/ext directory
[19:02:02] <gx> for like 99% of them, when i enable them, IIS just restarts over and over
[19:02:21] <gx> one version DIDN'T cause the reset, so i assumed it was installed, but i guess not
[19:02:44] <gx> i've googled extensively and i'm not sure why, when i add the extension, IIS just continuously resets and i get a 500 error
[19:03:06] <frostyfrog> Create a simple PHP script that says "Hello world!", then run it through PHP on the commandline
[19:03:38] <frostyfrog> php should spit out any problems that are causing IIS to restart. (At least in theory)
[19:05:14] <gx> it runs, but i'm getting another error related to morph.phar
[19:05:29] <gx> include(phar://Morph/Storage.php): failed to open stream: phar error: Cannot open temporary file for decompressing phar archive "C:/Users/willi_000/Documents/repos/jobminx/app/libraries/src/Morph/Morph.phar" file "Storage.php"
[19:05:44] <gx> so i'm going to try to run make_phar.php. which i think i did already
[19:12:59] <frostyfrog> Glad to have been able to help as much as I could :)
[21:09:13] <overburn> hey guys, i have a small question
[21:10:02] <overburn> if i need, for example, a company document that contains multiple users, what is the performance improvement over MySQL having a companies table and a users table with a foreign key?
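To make the comparison concrete, a sketch of the two layouts in the mongo shell (names are made up). With the embedded form, a single read returns the company together with its users and there is no join to perform; the trade-off is that the document grows as users are added and the users are harder to query on their own:

    // Embedded: one read fetches the company and all of its users
    db.companies.insert({
        _id: "acme",
        name: "Acme Corp",
        users: [
            { name: "alice", role: "admin" },
            { name: "bob",   role: "member" }
        ]
    });
    db.companies.findOne({ _id: "acme" });

    // Referenced (closer to the MySQL layout): two queries, no server-side join
    db.users.insert({ name: "alice", company_id: "acme" });
    db.users.find({ company_id: "acme" });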