[00:30:27] <garbagecollectio> for example, diet: 'low fat', diet: 'high energy'
[00:33:40] <boterock> hello, i'm rather new to nodejs and mongodb, i am using mongolian, how do I tell mongo to find in a collection where id equals any value of an array?
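A minimal sketch of the usual $in query for this, shown in pymongo since the actual mongolian call isn't in the log; the filter document itself carries over to other drivers. Collection name and ids below are made up:

    from pymongo import MongoClient
    from bson import ObjectId

    coll = MongoClient()["test"]["items"]                     # assumed names
    ids = [ObjectId("51f0d6e5e4b0c8a5d3b1a001"),
           ObjectId("51f0d6e5e4b0c8a5d3b1a002")]

    # match any document whose _id is one of the values in the array
    for doc in coll.find({"_id": {"$in": ids}}):
        print(doc)

The same filter document, {"_id": {"$in": [...]}}, should work unchanged in mongolian's find().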
[00:39:12] <garbagecollectio> is mongodb good for doing calculations
[01:06:57] <Aartsie> boterock: i think it is better to use mongoose
[01:07:53] <Aartsie> boterock: http://mongoosejs.com/ is created by 10gen
[04:30:21] <zzing> Is there something for mongo like phpmyadmin is for mysql?
[13:15:14] <Nodex> will be interesting to see what they do with it
[14:16:32] <moe> hi i have a question regarding mongodb replication + sharding
[14:18:08] <moe> I have 3 machines : A, B, C . I have replication turned on. So A is primary, and B and C are secondary. Suppose I shard the db. I believe all writes will still go to A. So what is the advantage
[14:28:01] <moe> a shard only contains part of the data. But with replication turned on, a secondary is still lagging behind the primary. So writes will still have to be done on the primary
[14:28:21] <moe> So how does that help in write scalability
[14:30:53] <kali> if you want 4 shards, you'll have 12 machines: 4 replica sets of 3 servers each. each shard has a quarter of the data.
[14:31:27] <kali> each shard has a primary, so each primary will handle a quarter of the whole write load
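A rough sketch of what that looks like from the client side, assuming a mongos router and made-up database/collection names; sharding is enabled once, then each shard's primary handles the writes for its own chunk ranges:

    from pymongo import MongoClient

    # connect to a mongos router (host is an assumption)
    client = MongoClient("mongodb://mongos0.example.net:27017")

    # enable sharding on the database, then shard a collection on a
    # hashed key so inserts spread across all four shard primaries
    client.admin.command("enableSharding", "mydb")
    client.admin.command("shardCollection", "mydb.events", key={"_id": "hashed"})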
[14:39:38] <touilltouill> Hi everyone, i'm trying to connect to my replicaset with php but every time i try to connect i get a php fatal error: Uncaught exception 'MongoConnectionException' with message 'No candidate servers found'. Can someone help me solve my problem? :)
[14:58:59] <Nodex> please pastebin your connection string
[15:07:36] <touilltouill> here is my php connection line
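The connection line itself never made it into the log. As a rough illustration of the shape a replica-set connection string usually takes (every member listed plus an explicit replicaSet name), here is a sketch in pymongo rather than PHP, with hosts and set name made up; a set name that doesn't match the servers, or members the client can't reach, is a common way to end up with 'No candidate servers found':

    from pymongo import MongoClient

    # list the replica-set members and name the set explicitly
    client = MongoClient(
        "mongodb://db1.example.net:27017,db2.example.net:27017,db3.example.net:27017"
        "/?replicaSet=rs0"
    )
    print(client.test.command("ping"))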
[15:21:46] <balboah> how do you properly set a $comment with pymongo? it seems cursor.count() stops working if you do .find({"$query": …., "$comment": ...})
[15:28:50] <balboah> why aren't these giving the same result? db.users.find({"$query": {"gender": "female"}}).count() vs db.users.find({"gender": "female"}).count()
[15:49:28] <ehershey> balboah: I think using the $query operator changes how find() return values work
[15:49:38] <ehershey> Note: Do not mix query forms. If you use the $query format, do not append cursor methods to the find(). To modify the query use the meta-query operators, such as $explain.
[15:50:55] <balboah> ehershey: thanks. That means I have to know in advance whether my code will want to count later on, and it breaks adding a $comment :(
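A hedged pymongo sketch of one way to live with that restriction: keep the plain filter around, attach the $comment only to the $query-wrapped find(), and run count() on an unwrapped find(). Collection and field names are assumptions, and count() is the old cursor API from that era:

    from pymongo import MongoClient

    db = MongoClient()["test"]
    flt = {"gender": "female"}

    # $query form so the $comment shows up in the profiler / logs
    cursor = db.users.find({"$query": flt, "$comment": "weekly gender report"})

    # count() on a plain find(), since cursor methods don't mix with $query
    total = db.users.find(flt).count()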
[16:13:45] <grouch-dev> hi all. I am running a purge query. 5 members of replica-set, 1 of which is arb. On the primary I run a db.collection.remove(...) and it fails after a few minutes because the Primary "can't see a majority".
[16:14:22] <grouch-dev> Nothing else is running during this time
[16:15:18] <grouch-dev> At first I see messages "is down (or slow to respond)"
[16:40:53] <bee_keeper> hi, i'm using the twitter streaming api to get constant info from twitter, some of which i will store. in your opinions, is mongodb a good use case for this scenario?
[17:42:29] <grouch-dev> seems like whenever mongo has a heavy load, it can't keep a primary
[17:44:47] <grouch-dev> In a 3 server (1pri, 1sec, 1arb) environment it works, but with 2 more secondaries, the primary loses majority and the query fails
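One common workaround for purges like this (a sketch only, not something suggested in the channel) is to remove in small batches so the oplog and the secondaries can keep up between rounds; the names, purge criteria, batch size, and the old remove() API below are all assumptions:

    from datetime import datetime, timedelta
    from pymongo import MongoClient

    coll = MongoClient()["mydb"]["events"]                    # placeholder names
    query = {"created": {"$lt": datetime.utcnow() - timedelta(days=90)}}

    # purge in small batches instead of one big remove(), so the
    # secondaries get a chance to replicate between rounds
    while True:
        ids = [d["_id"] for d in coll.find(query, {"_id": 1}).limit(1000)]
        if not ids:
            break
        coll.remove({"_id": {"$in": ids}})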
[17:45:24] <jaraco> What's the planned schedule for 2.4.5? We're holding off an upgrade for a fix included in that build.
[17:51:09] <orngchkn> Is anyone around that could help me figure out how to get Mongo to stop filling up logs with "warning: ClientCursor::yield can't unlock b/c of recursive lock ns" (hundreds of megs of logs per minute) while doing a findAndModify query?
[18:06:02] <thismax> Anyone know how to query for a document with an array that does NOT contain a certain element?
[18:07:39] <thismax> like if I had two documents: { features: ['a', 'b'] } and { features: ['c', 'b'] }, and I wanted to find any document that didn't have feature 'a'?
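A small pymongo sketch of the usual answer: on an array field, $ne (or $nin for several excluded values) matches documents where no element equals the given value. The collection name and the old insert() call are assumptions:

    from pymongo import MongoClient

    coll = MongoClient()["test"]["docs"]
    coll.insert({"features": ["a", "b"]})
    coll.insert({"features": ["c", "b"]})

    # $ne on an array field matches documents where no element equals "a"
    print(list(coll.find({"features": {"$ne": "a"}})))        # only the c/b doc

    # $nin does the same for a list of excluded values
    print(list(coll.find({"features": {"$nin": ["a"]}})))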
[18:28:25] <orngchkn> Can anyone explain how a query with $explain: true can be running for 30 seconds or more?
[22:23:56] <awpti> Howdy folks. I used this block ( http://pastie.org/8104889 ) to create a boatload of records to fiddle with mongodb, yet test_collection appears empty -- db.test_collection.find() returns nothing and 'it' says there's no cursor. What am I missing here?
[22:36:56] <awpti> I'll check that out. I'm pretty familiar with Python.
[22:37:09] <awpti> Also, thanks leku. Re-running it worked.
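The pastie itself isn't in the log; as a generic sketch of that kind of test-data loop in pymongo (database and collection names assumed, old bulk insert() API of that era), followed by the checks awpti ran:

    from pymongo import MongoClient

    db = MongoClient()["test"]

    # insert a pile of throwaway documents, then verify they landed
    db.test_collection.insert([{"x": i} for i in range(1000)])
    print(db.test_collection.count())        # expect 1000
    print(db.test_collection.find_one())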
[23:57:07] <kurtis> Hey guys, I'm trying to come up with a good solution for a "big data" problem. On my web-end, I use several million documents to perform various computations. This same set of documents is used for all of the computations. Is there a smart way to cache those documents for a short period of time? I thought about storing all of the ObjectIDs in Redis or something similar (for 15 minutes) but I'm not sure if that will help much
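A hedged sketch of the idea kurtis floats: stash the ObjectIds in Redis with a 15-minute TTL and re-query with $in. Every name, key, and the selection filter below are made up, and for millions of ids the $in itself gets heavy, so this only outlines the caching pattern rather than promising a speedup:

    import redis
    from bson import ObjectId
    from pymongo import MongoClient

    coll = MongoClient()["analytics"]["documents"]            # placeholder names
    r = redis.StrictRedis(decode_responses=True)

    CACHE_KEY = "computation:doc_ids"                         # hypothetical key
    TTL = 15 * 60                                             # 15 minutes

    def get_working_set():
        cached = r.lrange(CACHE_KEY, 0, -1)
        if cached:
            ids = [ObjectId(x) for x in cached]
        else:
            # the selection query is an assumption
            ids = [d["_id"] for d in coll.find({"active": True}, {"_id": 1})]
            if ids:
                r.rpush(CACHE_KEY, *[str(i) for i in ids])
                r.expire(CACHE_KEY, TTL)
        return coll.find({"_id": {"$in": ids}})

Whether this actually helps depends on where the time goes: caching only the ids skips the selection query but not the cost of fetching the documents themselves.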