mongoose distinct and populate with documents

Solution 1

Would this be the right way, considering the current lack of support in Mongoose?

followerModel
  .find({ id_follower: { $in: followerIds } })
  .distinct('id_post', function (error, ids) {
    // ids is a plain array of ObjectIds, so load the actual posts in a second query
    Posts.find({ _id: { $in: ids } }, function (err, result) {
      console.log(result);
    });
  });
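
If you want to avoid the callbacks, here is a minimal sketch of the same two-step idea using async/await (assuming a promise-enabled Mongoose version; findFollowedPosts is just a hypothetical helper name):

// Sketch only: the same two-step approach, without callbacks
async function findFollowedPosts(followerIds) {
  // distinct() resolves to a plain array of ObjectIds, not documents
  const ids = await followerModel
    .find({ id_follower: { $in: followerIds } })
    .distinct('id_post');

  // a second query loads (and can sort) the actual post documents,
  // here by the creationDate field mentioned in the question
  return Posts.find({ _id: { $in: ids } }).sort({ creationDate: 1 });
}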

Solution 2

You can use aggregate to group the collection and then populate it with $lookup; that gives us the desired result:

db.<your collection name>.aggregate([
  {
    $match: { <match your fields here> }
  },
  {
    $group: { _id: <the field to group the collection by> }
  },
  {
    $lookup: {
      from: "<the collection of the populated / referenced field>",
      localField: "<the field you want to populate, taken from the documents matched above>",
      foreignField: "_id", // references the _id of the documents in the "from" collection that should match localField
      as: 'arrayName' // a name for the returned documents; this is an array, typically holding a single document
    }
  },
  {
    $project: {
      // you usually don't want the whole populated document, so select only the fields you need
      <field name>: <1 or 0>, // 1 to include the field, 0 to exclude it
      // you can add multiple fields here
      // to select fields returned by the previous stage, use dot notation
      "arrayName._id": <1 or 0>
    }
  }
])
// finally, consume the returned data
.then((data) => {
  console.log(data);
});

We get distinct() via $group, populate() via $lookup, and select() via $project.
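
As a rough illustration (assuming the follower/post model from the question, and that the Posts model is backed by a "posts" collection), the template above might be filled in like this:

followerModel.aggregate([
  { $match: { id_follower: { $in: followerIds } } },
  // group on id_post so each post id appears only once (the "distinct" step)
  { $group: { _id: '$id_post' } },
  // join the matching post documents (the "populate" step)
  {
    $lookup: {
      from: 'posts', // assumption: the Posts model maps to the "posts" collection
      localField: '_id',
      foreignField: '_id',
      as: 'post'
    }
  },
  // keep only the fields we need (the "select" step)
  { $project: { 'post._id': 1, 'post.creationDate': 1 } }
])
.then((data) => {
  console.log(data);
});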

Comments

  • Andries Heylen over 1 year

    I have the following model:

    var followerSchema = new Schema({
        id_follower: {type: Schema.Types.ObjectId, ref: 'Users'},
        id_post: {type: Schema.Types.ObjectId, ref: 'Posts'}
    });
    

    I want to be able to find all posts for a list of followers. When I use find, it of course returns the same post multiple times, since multiple users can follow the same post.

    So I tried to use distinct, but I have the feeling that populate does not work afterwards.

    Here is my code:

    followerModel
        .distinct('id_post',{id_follower:{$in:followerIds}})
        .populate('id_post')
        .sort({'id_post.creationDate':1})
        .exec(function (err, postFollowers) {
            console.log(postFollowers);
        })
    

    It only returns the array of posts, and it is not populated.

    I am new to MongoDB, but according to the Mongoose documentation, the "distinct" method should return a query, just like the "find" method. And on a query you can call the "populate" method, so I don't see what I am doing wrong.

    I also tried to use the .distinct() method of the query, so then my code was like this:

    followerModel
        .find({id_follower:{$in:followerIds}})
        .populate('id_post')
        .distinct('id_post')
        .sort({'id_post.creationDate':1})
        .exec(function (err, postFollowers) {
            console.log(postFollowers);
        })
    

    In that case it works, but according to the Mongoose documentation you need to provide a callback function when you use the distinct method on a query, so my logs fill up with errors. A workaround would be to pass a dummy callback function, but I want to avoid that...

    Does anybody have an idea why the first attempt is not working? And whether the second approach, with a dummy callback, is acceptable?

  • Jamie Hutber over 8 years
    That is how I believe I would do it also.
  • cubefox over 5 years
    But it's not really scalable, is it? Posts is huge and a currentUser only needs a tiny percentage of them, all of which are referenced through id_follower.id_post.
  • PravyNandas over 3 years
    @cubefox Since it's filtered on ids with Posts.find({'_id': {$in: ids}}), the collection size is not an issue anymore. It scales even to millions of records.