mongodb - Does Mongo provide functionality for deconstructing document arrays for large datasets?


Something like map/reduce in reverse: does Mongo have a way of reformatting data? I have a collection in the following format.

    {
        "token-id": "lkj8_lkjsd",
        "data": [
            { "views": 100, "date": "2015-01-01" },
            { "views": 200, "date": "2015-01-02" },
            { "views": 300, "date": "2015-01-03" }
        ]
    }

I want to process the entire collection into a new format, where every time-series data point becomes its own document mapped to the token id, using inherent Mongo functionality similar to map/reduce. If there isn't such functionality, I'd appreciate a strategy for how I can do this.

    { "token-id": "lkj8_lkjsd", "views": 100, "date": "2015-01-01" }
    { "token-id": "lkj8_lkjsd", "views": 200, "date": "2015-01-02" }
    { "token-id": "lkj8_lkjsd", "views": 300, "date": "2015-01-03" }
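Conceptually, the transformation being asked for takes each element of the `data` array and emits a flat document carrying the parent's `token-id`. A minimal plain-JavaScript sketch of that per-document step (the helper name `unwindData` is hypothetical, not a Mongo API):

```javascript
// Hypothetical helper: flatten one document's "data" array into
// one document per time-series point, copying the parent token id.
function unwindData(doc) {
    return doc.data.map(function (point) {
        return { "token-id": doc["token-id"], views: point.views, date: point.date };
    });
}

var doc = {
    "token-id": "lkj8_lkjsd",
    "data": [
        { views: 100, date: "2015-01-01" },
        { views: 200, date: "2015-01-02" },
        { views: 300, date: "2015-01-03" }
    ]
};

console.log(unwindData(doc));
```

This is exactly what the `$unwind` aggregation stage does server-side, so there is no need to implement it by hand.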

The aggregate command can return its results as a cursor or store them in a collection, neither of which is subject to the BSON document size limit. db.collection.aggregate() returns a cursor and can therefore return result sets of any size.

    var result = db.test.aggregate([
        { $unwind: "$data" },
        { $project: { _id: 0, "token-id": 1, "views": "$data.views", "date": "$data.date" } }
    ]);

    while (result.hasNext()) {
        db.collection.insert(result.next());   // insert into the target collection
    }
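If the transformed documents should land in a collection without round-tripping through the client at all, the same pipeline can end in a `$out` stage instead of the insert loop. A sketch (the target collection name `flattened` is illustrative):

```javascript
db.test.aggregate([
    { $unwind: "$data" },
    { $project: { _id: 0, "token-id": 1, "views": "$data.views", "date": "$data.date" } },
    // $out writes the result set server-side, replacing "flattened" if it already exists
    { $out: "flattened" }
]);
```

Note that `$out` replaces the target collection's contents on each run, so it suits full rebuilds rather than incremental appends.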
