[04:54:42] <jrdn> anyway, as for the schema, it works well if you want the document in line.. but if you need to transform anything it becomes a pain in the ass
[05:01:10] <mrpro> maybe u can feed graphite and then extract from graphite for charts
[05:01:26] <jrdn> so if a partner's site is down, we'd basically know because an aggregation would show $0 within a 5 minute period, so to speak (although we take the average $ per minute over the last 7 days and gear alerts / metrics off that)
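A rough sketch of the kind of per-partner revenue check jrdn is describing, using the aggregation framework in the mongo shell; the collection and field names (transactions, partner, ts, amount) are made up, since the real schema never appears in the log:

    // sum revenue per partner over the last 5 minutes
    var fiveMinAgo = new Date(Date.now() - 5 * 60 * 1000);
    db.transactions.aggregate([
        { $match: { ts: { $gte: fiveMinAgo } } },
        { $group: { _id: "$partner", total: { $sum: "$amount" } } }
    ]);
    // a partner missing from the result (or showing a total of 0) when its
    // 7-day per-minute average says it should have revenue would trigger the alert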
[05:01:37] <jrdn> some stuff is affiliate based, yes
[05:03:20] <jrdn> it just comes down to how much money do you have to initially waste vs how good quality you can produce :P
[05:03:21] <mrpro> so from what i understand, if i charge for my product and an affiliate partner gets me a signup, i have to pay x% per signup coming from them
[05:08:10] <mrpro> btw, is your replica in a different physical location?
[05:08:11] <jrdn> we just hired someone who works for EC2 to manage our servers… we were starting to do our own puppet management and stuff, but moved it over to someone else so we can focus on products and good code
[05:09:18] <jrdn> but… i don't know how your app works
[05:10:02] <jrdn> but what we're going to do next is run mongods on every app server, which are kind of like replicas (not set up as actual replica set members… AMQP is going to send new data from one main replica to each web server)
[05:10:15] <jrdn> so all reads will actually be local; if the data doesn't exist locally, then the read goes to the replica
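A sketch of that read-local-first pattern in node.js, assuming the driver's callback-style MongoClient API and a hypothetical "events" collection; none of these names or addresses come from the log:

    var MongoClient = require('mongodb').MongoClient;

    // localDb: the mongod running on this app server, fed new data over AMQP
    // remoteDb: the main replica, used only when the local copy misses
    function findEvent(localDb, remoteDb, query, callback) {
        localDb.collection('events').findOne(query, function (err, doc) {
            if (err) return callback(err);
            if (doc) return callback(null, doc);                     // local hit
            remoteDb.collection('events').findOne(query, callback);  // fall back to the replica
        });
    }

    MongoClient.connect('mongodb://localhost:27017/app', function (err, localDb) {
        if (err) throw err;
        MongoClient.connect('mongodb://replica.example.com:27017/app', function (err, remoteDb) {
            if (err) throw err;
            findEvent(localDb, remoteDb, { _id: 'some-id' }, function (err, doc) {
                console.log(err || doc);
            });
        });
    });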
[05:19:22] <jrdn> but anyway, we're moving our environment so that our apps can run even if we completely lost all our mongos, which won't happen… we're going to have half our servers on EC2 and half on Rackspace eventually
[05:33:13] <mrpro> so it's a pain with all the safe mode crap and stuff like that, plus it takes time to come up with some clever stuff
[05:33:35] <mrpro> but i think so far we're doing ok with that… if all that works out, we should get reliability and also the ability to shard later on
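For reference, the "safe mode" mrpro mentions is per-write acknowledgement (write concern), which drivers of that era did not wait for by default. A minimal sketch with the node.js driver, using a hypothetical "signups" collection:

    var MongoClient = require('mongodb').MongoClient;

    MongoClient.connect('mongodb://localhost:27017/app', function (err, db) {
        if (err) throw err;
        db.collection('signups').insert(
            { user: 'example', ts: new Date() },
            { safe: true },            // wait for the server to ack (newer spelling: { w: 1 })
            function (err, result) {
                if (err) console.error('write failed:', err);
                db.close();
            }
        );
    });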
[09:23:54] <Dantas> hi all ! I'm on a new project using mongoose and mongodb, but when I try to query all documents ( 200k documents ) through mongoose it is too slow ( average response 5 seconds ). But when using the mongo repl, the response is immediate. What am I doing wrong?
[09:30:34] <remonvv> You're doing two different things most likely.
[09:30:44] <remonvv> Post your code and the shell query in a pastie.
[09:33:34] <Dantas> remonvv: Ok , i will do it right now
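A guess at the likely cause rather than a diagnosis of Dantas' actual code: the shell only prints the first batch of results, while Mongoose fetches all 200k documents and hydrates each one into a full model. Two things that usually help, sketched with a hypothetical Item model:

    var mongoose = require('mongoose');
    mongoose.connect('mongodb://localhost/test');
    var Item = mongoose.model('Item', new mongoose.Schema({ name: String }));

    // 1. skip model hydration with lean(): plain objects instead of 200k Mongoose documents
    Item.find({}).lean().exec(function (err, docs) {
        // docs are plain JS objects here
    });

    // 2. or, if your Mongoose version has Query#stream(), process documents
    //    one at a time instead of buffering the whole result set
    var stream = Item.find({}).stream();
    stream.on('data', function (doc) { /* handle one doc */ });
    stream.on('close', function () { /* done */ });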
[10:48:17] <Guest52366> I am seeing a lot of mongo "down/slow to respond" messages in the "rs" logs on our live AWS replica servers; the load is just normal, no network issues, and the flip pattern looks very random... any suggestions what to look at?
[16:31:11] <emehrkay> how is mongo's performance when searching for something like a tag against millions of documents?
[16:33:49] <Derick> emehrkay: depends hugely on how it's indexed
[16:39:55] <emehrkay> What we're doing now is breaking documents down into keywords (and their properties; date, weight, etc.) and then searching against them by using an IN clause (mysql). We have millions of rows in mysql, but i feel that the number of entries could be drastically reduced using something like mongodb. I'd still need to search against all documents. I want to explore this a bit more, just figured I'd ask if anyone has done anything similar
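A sketch of the tag-per-document approach emehrkay is considering, in the mongo shell; the collection and field names are hypothetical:

    // multikey index: one index entry per element of the tags array
    db.docs.ensureIndex({ tags: 1 });

    db.docs.insert({ title: "example", tags: ["mongodb", "search", "indexing"], created: new Date() });

    // rough equivalent of the MySQL IN clause: matches documents whose
    // tags array contains at least one of these values
    db.docs.find({ tags: { $in: ["mongodb", "search"] } });

    // verify the index is actually used
    db.docs.find({ tags: "mongodb" }).explain();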
[17:42:24] <hdon> hi all :) been a little while since i used scons. how can i tell it to take advantage of multiple cpu cores? i hit ^c and tried adding -j5 (i have 4 cpu cores) to my command (scons .) but ---- oh, there it goes. it's using them all now
[17:43:02] <hdon> new question: why does it look like scons is configuring build parameters when i run my scons command again, even after the build was already underway? is that normal scons behavior, or mongodb behavior?
[17:54:14] <hdon> why does scons check for pcap when building mongodb?
[19:16:14] <ArturoVM> and yeah, that's where I got the driver
[19:16:22] <hdon> what npm packages did you install?
[19:16:57] <hdon> also... do you know how to use -g in npm? when i use -g, i never get the dependencies installed. and when i don't use -g, the dependencies are installed in subdirectories of each package :\
[19:18:24] <ArturoVM> Generally npm installs global pkgs to folders where you need sudo to make changes.
[19:19:45] <ArturoVM> hdon, I'm sorry if that seems dumb to ask, but you never know :)
[19:19:47] <hdon> ArturoVM, yeah i use root when i use -g
[19:20:02] <hdon> you'd be even more right to ask if you knew i was on ubuntu :|
[19:21:40] <ArturoVM> So after you've installed a package with -g, you can't import it? Is that it?
[19:26:39] <hdon> what is the mongodb shell made of? is it a completely custom repl? it seems to be javascript. but i don't have Object.keys()
[19:27:07] <hdon> ArturoVM, i can require() it, but it throws an error trying to require() its dependencies
[19:27:14] <hdon> ArturoVM, so for now i've given up on globally installed modules
[19:32:19] <ArturoVM> hdon, that is really strange. But it does install dependencies when you run npm install?
[19:33:30] <hdon> ArturoVM, yes, but when a dependency is installed, it isn't installed in the same directory as its dependent module, it's installed in a subdirectory. let me see if i have an example dir laying around..
[19:34:43] <hdon> ArturoVM, actually, i see this is happening even without -g. *pasting*
[19:36:21] <hdon> so if i "npm install supermodule" then it ends up with ./node_modules/supermodule but its dependency ultramodule doesn't end up in ./node_modules/ultramodule it ends up in ./node_modules/supermodule/node_modules/ultramodule
[19:36:48] <hdon> like, wth. am i gonna end up with multiple copies of modules? will the copy that gets loaded depend on the order i load my modules in? many questions...
[19:36:59] <hdon> i'm not a big fan of npm but i find myself using it a lot
[19:38:33] <ArturoVM> Ugh :/ yeah, dependency management in npm is weird. But yep i think that every module installs its own dependencies, no matter if you have them already. I could be wrong, though. Not much of an npm expert :P
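What hdon is seeing is node's module resolution working as designed: require() only walks up the node_modules directories from the calling file, and npm nests each package's dependencies so every package loads the version it declared. Using the hypothetical names from his example:

    //   ./node_modules/supermodule/
    //   ./node_modules/supermodule/node_modules/ultramodule/

    // from the application:
    var supermodule = require('supermodule');   // found in ./node_modules
    var ultramodule = require('ultramodule');   // fails: not in ./node_modules, and
                                                // require() never looks *down* the tree

    // inside supermodule's own code, require('ultramodule') resolves to its
    // private nested copy, so duplicate copies can exist, but which copy a
    // module loads never depends on load order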
[19:39:35] <ArturoVM> Well, I've got to go. See you later. FTR, this is the question I was going to ask: http://stackoverflow.com/questions/11799953/whats-the-best-practice-for-mongodb-connections-on-node-js
[19:59:56] <storrgie> I got asked by someone to help out with their project; they are 'adding me to their github', but while I was waiting I scanned their project host and noticed that they had both mongodb and the mongodb console open to the outside. Is this common? I've only ever worked with mysql
[20:00:21] <storrgie> with the mongo client I can connect to their database without any authentication and actually browse around
[20:00:28] <storrgie> this seems like a bad thing to me...
[20:03:14] <storrgie> I want to be able to give them some pointers on how to lock this down... but maybe this is the way you host a mongodb and I'm just new/ignorant
[20:06:14] <crudson> storrgie: set the bind_ip option to restrict addresses to listen on
[20:06:50] <storrgie> crudson, so this is not common to expose it to the whole world?
[20:07:10] <crudson> well, the machine firewall may prevent access, I'd look there too
[20:07:20] <crudson> but it depends on how you want it deployed
[20:07:20] <storrgie> I installed the mongo client and I was able to get in without auth... I'm guessing from there I could try to elevate access
[20:07:35] <storrgie> well, it seems like bad practice to allow the entire DB to be viewable by anyone, right?
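The usual lockdown along the lines of crudson's suggestion, in the old-style mongod config format; the address and password are placeholders:

    # /etc/mongodb.conf
    bind_ip = 127.0.0.1       # listen only on localhost (or an internal interface)
    auth = true               # require authentication

    # then, from a mongo shell on the box itself:
    #   use admin
    #   db.addUser("admin", "a-strong-password")
    #
    # and firewall port 27017 (and 28017, the HTTP console) from the outside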
[22:50:49] <Aartsie> hi all, i have a collection created with the name user-log but now mongo thinks i want to read log ?
[23:17:02] <ArturoVM> Aartsie: I think it's standard practice (regardless of language/OS/environment) that when you want to name something and they're two separate words, you should either a) camel-case them or b) use underscores.
[23:17:38] <Aartsie> ArturoVM: yeah i think so :) i use debian :)
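The hyphen is what trips Aartsie up: in the shell, db.user-log parses as "db.user minus log". Renaming works, but the existing collection can also be reached as-is:

    db.getCollection('user-log').find()
    db['user-log'].find()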
[23:18:29] <Aartsie> when i'm in the console and want to show all the records of a collection, i get the first 10 and then it says 'has more'. how can i see them all ?
[23:38:56] <crudson> of course you don't want to do this for a mega query
[23:40:42] <crudson> but say you know you want 50 and get them all printed without having to type 'it' many times, this could be useful: .find().limit(50).map(function(e){return e}).forEach(function(d) { printjson(d) })
[23:41:22] <crudson> actually you can get rid of the .map() bit totally
[23:41:37] <crudson> (getting late in the day, sorry)
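The same idea without the redundant .map() step crudson mentions, plus the shell setting that controls how many documents print per batch; the collection name is a placeholder:

    db.mycollection.find().limit(50).forEach(function (d) { printjson(d); });

    // or raise the shell's default batch size so a plain find() prints more
    // before asking you to type 'it'
    DBQuery.shellBatchSize = 50;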