Soft Delete 🗑 – Node.js, MongoDB

In this article, we will show you how to implement soft delete in your Node.js app with MongoDB as the database.

We also created a tutorial on how to do soft delete in Laravel; you can check that tutorial here.

Soft delete 🗑 is useful when you do not want to actually delete a record from the database, but only mark it as deleted.

You might have seen “This message has been deleted” on WhatsApp 💬 when you delete a message. That is soft delete. If the message were actually deleted from the database, there would be no reference that a message ever existed at that time.

It is also helpful in cases where you want to allow the user to delete some data, but still want the administrator 👨🏻‍💼 to see it. When the user deletes the data, it is not actually removed from the database. It is simply hidden from the user, but the administrator can still see it. Only the administrator can actually delete the data.

We will use “users” as the entity for the sake of this tutorial.

In this article, you will learn:

  1. Insert users in MongoDB
  2. Fetch non-trashed users
  3. Soft delete a user
  4. Fetch trashed users
  5. Restore a user
  6. Permanently delete a user

1. Insert users in MongoDB ➕

The following API will insert a new document into the MongoDB “users” collection using Node.js:

// API to insert data
app.post("/insert", async function (request, result) {
    // Get input fields
    const name = request.fields.name || "";

    // Insert data in database
    await db.collection("users")
        .insertOne({
            name: name,
            deletedAt: null // This will save the time when user was deleted
        });

    // Return response to the client
    result.json({
        status: "success",
        message: "User has been inserted."
    });
});

It gets the user’s name from the client app and inserts a new document into the “users” collection. Note that it creates another field, “deletedAt”, and sets its value to null. A null value means that the user is not deleted.

2. Fetch non-trashed users

Then we are going to fetch all the users that are not deleted.

// API to fetch non-trashed users
app.post("/fetch", async function (request, result) {
    const users = await db.collection("users")
        .find({
            deletedAt: null
        }).toArray();

    result.json({
        status: "success",
        message: "Users has been fetched.",
        users: users
    });
});

Here we are using a filter to fetch all the records where the deletedAt field is null.

3. Soft delete a user ␡

Now we will create an API that will soft delete a user.

// API to move the user to trash can
app.post("/delete", async function (request, result) {
    // Get user ID
    const _id = request.fields._id || "";

    // Don't delete the document,
    // just update the deletedAt field
    await db.collection("users")
        .updateOne({
            _id: ObjectId.createFromHexString(_id)
        }, {
            $set: {
                deletedAt: new Date()
            }
        });

    result.json({
        status: "success",
        message: "User has been moved to trash can."
    });
});

Here, we first get the user ID from the client app. Then we set the deletedAt field to the current date.

If you run the /fetch API again, you will not see that user’s data in the response. Even though the record exists in the database, it will not be returned.

4. Fetch trashed users

Now that we have trashed the user, we need a way to see all those trashed users. So we will create an API that returns all the users that have been soft-deleted.

// API to see all trashed users
app.post("/trashed", async function (request, result) {
    const users = await db.collection("users")
        .find({
            deletedAt: {
                $ne: null
            }
        }).toArray();

    result.json({
        status: "success",
        message: "Trashed users has been fetched.",
        users: users
    });
});

At this point, you can give the administrator 2 options:

  1. Restore the user
  2. Delete the user permanently

5. Restore a user ↩️

Restoring the user will simply set the user’s deletedAt field back to null, like it was when the user was created.

// API to restore a user
app.post("/restore", async function (request, result) {
    const _id = request.fields._id || "";

    await db.collection("users")
        .updateOne({
            _id: ObjectId.createFromHexString(_id)
        }, {
            $set: {
                deletedAt: null
            }
        });

    result.json({
        status: "success",
        message: "User has been restored."
    });
});

This API also receives the ID of the user that needs to be restored. If you call the /fetch API again, you will see that user in the response.

6. Permanently delete a user 🗑

Permanently deleting a user will actually delete the user from the database. There will be no way to restore it once done (unless you have a backup 🎒 of your database).

// API to actually delete the user from database
app.post("/delete-permanently", async function (request, result) {
    const _id = request.fields._id || "";

    await db.collection("users")
        .deleteOne({
            $and: [{
                _id: ObjectId.createFromHexString(_id)
            }, {
                deletedAt: {
                    $ne: null
                }
            }]
        });

    result.json({
        status: "success",
        message: "User has been permanently deleted."
    });
});

Usually, this access is given to the super admin only.
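
How you enforce that depends on your authentication setup, which this tutorial does not cover. As a rough sketch only, assuming some auth middleware has already attached the logged-in user to the request as a hypothetical request.user object (not something we built above), the check at the top of the /delete-permanently handler could look like this:

    // Hypothetical guard: request.user is assumed to be set by your auth middleware
    if (!request.user || request.user.role !== "superAdmin") {
        result.json({
            status: "error",
            message: "Only the super admin can permanently delete users."
        });
        return;
    }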

Bonus ⭐️

One more thing you can do: if you have saved the user ID in other collections as well, you can copy the user’s data into those dependent collections before removing the user.

For example, if you have an “orders” collection, where you are saving user’s orders in the following format:

{
    "_id": ObjectId("1234567890"),
    "userId": ObjectId("0987654321")
}

If you permanently delete the user, then the user information for this order will be lost as well. So the best practice is to:

  1. Fetch the user.
  2. Copy the user data into the orders collection.
  3. Delete the user.

Following is the code for that:

const user = await db.collection("users")
    .findOne({
        _id: ObjectId.createFromHexString(_id)
    });

if (user == null) {
    result.json({
        status: "error",
        message: "User not found."
    });
    return;
}

await db.collection("orders")
    .updateMany({
        userId: user._id
    }, {
        $set: {
            user: {
                _id: user._id,
                name: user.name
            }
        }
    });
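
The code above covers steps 1 and 2. For step 3, you can then delete the user itself, for example with the same kind of deleteOne query used in the /delete-permanently API:

// Step 3: finally, remove the user document
await db.collection("users")
    .deleteOne({
        _id: user._id
    });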

Now, when you delete the user, your “orders” collection will save the user data for historical use.

{
    "_id": ObjectId("1234567890"),
    "userId": ObjectId("0987654321"),
    "user": {
        "_id": ObjectId("0987654321"),
        "name": "Adnan"
    }
}

Conclusion 📝

So that’s it. That’s how you can create a soft delete feature in your Node.js application with MongoDB as the backend. It is helpful in almost every project and I would recommend applying it. Whenever a user deletes something, soft delete it, so that if the user did it by mistake, he will be able to recover it.

Pagination in React, Node.js and MongoDB

In this tutorial, we will show you how to do pagination in your React app with Node.js and MongoDB as the backend. Even if you use another backend technology, you can still take the algorithm and apply it in your language.

React

Following is our React component that sends an AJAX request and fetches the data from the API. After fetching the data, it renders it along with the pagination links. Whenever any of the pagination links is clicked, it refetches the data based on the clicked page number.

<script src="js/babel.min.js"></script>
<script src="js/react.development.js"></script>
<script src="js/react-dom.development.js"></script> 

<div id="app"></div>

<script type="text/babel">

    function App() {

        // Set current page
        const [page, setPage] = React.useState(1);

        // Number of pages on pagination
        const [pages, setPages] = React.useState(0);

        // Whenever its value is updated, the data will be re-fetched
        const [refetch, setRefetch] = React.useState(0);

        // Data array
        const [users, setUsers] = React.useState([]);

        // Check if there are more pages
        const [hasMorePages, setHasMorePages] = React.useState(false);

        async function loadMore() {
            // Creating a built-in AJAX object
            var ajax = new XMLHttpRequest();

            // Set the method and URL of the request and make it asynchronous
            ajax.open("POST", "http://localhost:3000/fetch-users", true);

            // Detecting request state change
            ajax.onreadystatechange = function () {

                // Called when the response is successfully received
                if (this.readyState == 4) {

                    if (this.status == 200) {
                        // For debugging purpose only
                        // console.log(this.responseText);

                        // Converting JSON string to a JavaScript object
                        var response = JSON.parse(this.responseText);

                        // Update state variable
                        setUsers(response.users);

                        // Set if there are more pages
                        setHasMorePages(response.hasMorePages);

                        // Set pagination
                        setPages(response.pages);
                    }
                }
            };

            // Create form data object
            const formData = new FormData();

            // Attach current page number
            formData.append("page", page);

            // Send the request
            ajax.send(formData);
        }

        // Run function when page value updates
        React.useEffect(function () {
            loadMore();
        }, [refetch]);

        return (
            <>
                {/* Display data */}

                { users.map(function (user) {
                    return (
                        <p key={ `user-${ user._id }` }>{ user.name }</p>
                    );
                }) }

                {/* Show pagination if required */}
                
                { pages > 1 && (
                    <nav>
                        <ul>
                            { Array.from({ length: pages }, function (_, index) { return index + 1 } ).map(function (p) {
                                return (
                                    <li key={ `pagination-${ p }` } className={ `page-item ${ p == page ? "active" : "" }` }>
                                        <a href="#" onClick={ function (event) {
                                            event.preventDefault();
                                            setPage(p);
                                            setRefetch(refetch + 1);
                                        } }>{ p }</a>
                                    </li>
                                );
                            }) }
                        </ul>
                    </nav>
                ) }
            </>
        );

    }

    ReactDOM.createRoot(
        document.getElementById("app")
    ).render(<App />);

</script>

Node.js and MongoDB

Now in our Node.js server, we will create an API that will return all the records based on the current page from client side. It will also return the number of pagination links.

const express = require("express");
const app = express();
const http = require("http").createServer(app);

const mongodb = require("mongodb");
const ObjectId = mongodb.ObjectId;
const MongoClient = mongodb.MongoClient;

const cors = require("cors");
app.use(cors("http://localhost/pagination-in-react-node-js-and-mongodb"));

const expressFormidable = require("express-formidable");
app.use(expressFormidable({
    multiples: true
}));

const port = process.env.PORT || 3000;
const databaseName = "test"
const MONGO_URI = process.env.MONGO_URI || "mongodb://127.0.0.1:27017";

http.listen(port, async function () {
    console.log("Server started");

    const client = new MongoClient(MONGO_URI);
    try {
        await client.connect();
        const db = client.db(databaseName);
        console.log("Database connected.");

        app.post("/fetch-users", async function (request, result) {
            const limit = 2;
            const page = parseInt(request.fields.page || 1);
            const skip = (page - 1) * limit;

            const users = await db.collection("users")
                .find({})
                .sort({
                    _id: 1
                })
                .skip(skip)
                .limit(limit)
                .toArray();

            const usersArr = [];
            for (let a = 0; a < users.length; a++) {
                const obj = {
                    _id: users[a]._id || "",
                    name: users[a].name || ""
                };

                usersArr.push(obj);
            }

            const totalCount = await db.collection("users").countDocuments({});
            const hasMorePages = (page * limit) < totalCount;
            const pages = Math.ceil(totalCount / limit);

            result.json({
                status: "success",
                message: "Data has been fetched.",
                users: usersArr,
                hasMorePages: hasMorePages,
                pages: pages
            });
        });

    } catch (exp) {
        console.log(exp.message);
    }
});

That’s how you can create pagination links in your React app with Node.js and MongoDB as backend. If you face any problem in following this, feel free to contact me.

Load more in React, Node.js & MongoDB

Previously, we wrote 2 articles on how to add a “load more” button. The first article uses Node.js, MongoDB and EJS for the frontend, and the second article uses PHP and MySQL. In this article, we will show you how to create a “load more” button in React with Node.js and MongoDB as your backend. This is a refined version of the previous load more tutorial in Node.js and MongoDB.

First, let me show you all the code. Then I will explain everything.

React

function App() {

    // Set current page
    const [page, setPage] = React.useState(1);

    // Data array
    const [users, setUsers] = React.useState([]);

    // Check if there are more pages
    const [hasMorePages, setHasMorePages] = React.useState(false);

    async function loadMore() {
        // Creating a built-in AJAX object
        var ajax = new XMLHttpRequest();

        // Set the method and URL of the request and make it asynchronous
        ajax.open("POST", "http://localhost:3000/fetch-users", true);

        // Detecting request state change
        ajax.onreadystatechange = function () {

            // Called when the response is successfully received
            if (this.readyState == 4) {

                if (this.status == 200) {
                    // For debugging purpose only
                    // console.log(this.responseText);

                    // Converting JSON string to a JavaScript object
                    var response = JSON.parse(this.responseText);
            
                    // Get data from API
                    const newArr = response.users;

                    // ❌ Less efficient, more readable
                    // Get current data
                    /*const tempArr = [ ...users ];

                    // Loop through all new records
                    for (let a = 0; a < newArr.length; a++) {
                        // And add them in existing array
                        tempArr.push(newArr[a]);
                    }

                    // Update state variable
                    setUsers(tempArr);*/

                    // ✅ Efficient way
                    setUsers([...users, ...newArr]); 

                    // Set if there are more pages
                    setHasMorePages(response.hasMorePages);
                }
            }
        };

        // Create form data object
        const formData = new FormData();

        // Attach current page number
        formData.append("page", page);

        // Send the request
        ajax.send(formData);
    }

    // Run function when page value updates
    React.useEffect(function () {
        loadMore();
    }, [page]);

    return (
        <>
            {/* Display data */}

            { users.map(function (user) {
                return (
                    <p key={ `user-${ user._id }` }>{ user.name }</p>  
                );
            }) }

            {/* Show load more button if there are more pages */}

            { hasMorePages && (
                <button type="button"
                    onClick={ function () {
                        // Increment page value
                        // It will automatically call the loadMore function from useEffect hook
                        setPage(page + 1);
                    } }>Load more</button>
            ) }
        </>
    );
}

Here, we are displaying all the data we get from the API. We display the “load more” button only if more pages are available. We created a React.useEffect hook that calls the loadMore function every time the page value gets updated. It automatically gets called the first time the page loads. After that, every time you click the “Load more” button, we simply increment the page value by 1. That triggers the React.useEffect hook, thus calling the loadMore function again.

From the API, we receive the data as an array, along with a boolean variable that tells whether more pages are available. Based on the latter, we show or hide the “Load more” button. As long as there are more pages, you can keep pressing the load more button. When there are no more records in the database, the load more button is removed.

When the response from the API is received, we append the new data to our existing data array. The way to do that in React is a little different. First, we create a shallow copy of our existing array using the spread operator: const tempArr = [...users]; Then we loop through all the new data received from the API and push it into the shallow copy. Finally, we update our state variable, which renders the new data. The commented-out block above shows this step by step, while setUsers([...users, ...newArr]) does the same thing in one line.
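
As a side note, React state setters also accept a function of the previous state. The code above does not use this form, but it avoids relying on the users value captured in the closure:

// Append the new records based on the latest state value
setUsers(function (prevUsers) {
    return [...prevUsers, ...newArr];
});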

For rendering, we are using the JavaScript map function to iterate over the array. Each paragraph should have a unique key value so React can identify each element. To create a unique key, we are using the user ID, and we are displaying the user name in each paragraph. This creates the “load more” button in our React component. In the next step, we will create its API.

Node.js & MongoDB

// Run command in your terminal:
// npm install express http mongodb cors express-formidable
const express = require("express");
const app = express();
const http = require("http").createServer(app);

const mongodb = require("mongodb");
const ObjectId = mongodb.ObjectId;
const MongoClient = mongodb.MongoClient;

const cors = require("cors");
app.use(cors("http://localhost/load-more.html"));

// Native way to allow CORS
/*// Add headers before the routes are defined
app.use(function (req, res, next) {
    // Website you wish to allow to connect
    res.setHeader("Access-Control-Allow-Origin", "*");
    // Request methods you wish to allow
    res.setHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS, PUT, PATCH, DELETE");
    // Request headers you wish to allow
    res.setHeader("Access-Control-Allow-Headers", "X-Requested-With,Content-Type,Authorization");
    // Set to true if you need the website to include cookies in the requests sent
    // to the API (e.g. in case you use sessions)
    res.setHeader("Access-Control-Allow-Credentials", true);
    // Pass to next layer of middleware
    next();
});*/

const expressFormidable = require("express-formidable");
app.use(expressFormidable({
    multiples: true
}));

const port = process.env.PORT || 3000;
const databaseName = "test"
const MONGO_URI = process.env.MONGO_URI || "mongodb://127.0.0.1:27017";

http.listen(port, async function () {
    console.log("Server started");
    
    const client = new MongoClient(MONGO_URI);
    try {
        await client.connect();
        const db = client.db(databaseName);
        console.log("Database connected.");

        // Seeding the data in database
        /*const usersCount = await db.collection("users").countDocuments({});
        if (usersCount == 0) {
            await db.collection("users")
                .insertMany([
                    { name: "Adnan" },
                    { name: "Afzal" },
                    { name: "Ahmad" },
                    { name: "Khalid" },
                    { name: "Tariq" },
                ]);
        }*/

        app.post("/fetch-users", async function (request, result) {
            const limit = 1;
            const page = parseInt(request.fields.page || 1);
            const skip = (page - 1) * limit;

            const users = await db.collection("users")
                .find({})
                .sort({
                    _id: 1
                })
                .skip(skip)
                .limit(limit)
                .toArray();

            const usersArr = [];
            for (let a = 0; a < users.length; a++) {
                const obj = {
                    _id: users[a]._id || "",
                    name: users[a].name || ""
                };

                usersArr.push(obj);
            }

            const totalCount = await db.collection("users").countDocuments({});
            const hasMorePages = (page * limit) < totalCount;

            result.json({
                status: "success",
                message: "Data has been fetched.",
                users: usersArr,
                hasMorePages: hasMorePages
            });
        });
    } catch (exp) {
        console.log(exp.message);
    }
});

Here, we first start our server and connect with the MongoDB database. We created an API that accepts the page number as a parameter. If the page number is not provided, its default value will be 1. We are setting the limit to 1 for testing. We subtract 1 from the current page value and multiply it by the limit; that gives us the number of records we need to skip.

Then we fetch the data from the database using skip and limit. Your documents might have a lot of fields, so we created an obj to return only those fields that are needed for this API. Finally, in order to check whether more pages are needed, we first find the total number of records in our collection. We then check if it is greater than the product of the limit and page variables. If it is greater, it means there are still more records in the database.
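
For example, with the limit of 1 used above and the 5 example users from the commented seeding block, the values work out like this:

// page = 1 → skip = (1 - 1) * 1 = 0, hasMorePages = (1 * 1) < 5 → true
// page = 3 → skip = (3 - 1) * 1 = 2, hasMorePages = (3 * 1) < 5 → true
// page = 5 → skip = (5 - 1) * 1 = 4, hasMorePages = (5 * 1) < 5 → false, the button disappears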

That’s how you can create a “load more” button in React as your frontend and Node.js and MongoDB as your backend. If you face any problem in following this, kindly do let me know.

Joins in Mongo DB

If you are working in MySQL, you can use joins to match rows between multiple tables. In Mongo DB, however, you work with documents instead of rows, but you can still use joins to combine data from 2 different collections.

Here is how you can do it. Let’s say you have 2 collections, users and posts. You want to reference the user in each post document so you know which user created that post.

const userObj = {
    name: "Adnan"
};

await db.collection("users")
    .insertOne(userObj);

const postObj = {
    title: "My first post",
    userId: userObj._id
};

await db.collection("posts")
    .insertOne(postObj);

After that, you will have one document in the users collection.

And you will have one document in the posts collection. That document will have the same user ID as in the users collection.

Now in order to fetch all the posts along with the users data, you can run the following query:

const posts = await db.collection("posts")
    .aggregate([
        {
            $lookup: {
                from: "users",
                localField: "userId",
                foreignField: "_id",
                as: "user"
            }
        },
        {
            $unwind: "$user"
        }
    ])
    .toArray();

console.log(posts);

While you normally use the find({}) function, for complex queries like joins you need to use the aggregate function. The $lookup operator is used to perform a join across collections.

  • from: The name of the collection to join with.
  • localField: The name of the field in the current collection; in this case, the current collection is “posts”.
  • foreignField: The field in the other collection that you are joining on. In this case, it is the “_id” field in the “users” collection.
  • as: The alias of the field that will hold the result of the join.

By default, the join will return “user” as an array. To turn each array element into a separate document, you can use the $unwind operator, which deconstructs the array. In this case, the array contains only 1 user, so it will be deconstructed into 1 document only.
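
With the sample data inserted above, the joined result printed by console.log(posts) will look roughly like this (the ObjectId values will differ in your database):

[{
    "_id": ObjectId("..."),
    "title": "My first post",
    "userId": ObjectId("..."),
    "user": {
        "_id": ObjectId("..."),
        "name": "Adnan"
    }
}]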


That’s how you can perform joins in Mongo DB using aggregate and lookup. If you face any issue in performing a join, feel free to contact me.

Multi-purpose platform in Node.js and MongoDB

A multi-purpose platform has been created in Node.js and MongoDB. Developing an API in Node.js and MongoDB allows you to create real-time, high-performance and scalable applications. When you are building a website or a mobile app, you might have a different design than what is already built, but your backend will be very similar to your competitor’s. Let’s say you want to build a social network like Facebook. You will pick an HTML template or design a UI specific to your business, but you will still want modules to create users, posts, pages, groups, friends, chats and much more.

So we are providing you with a multi-purpose platform that gives you a social network, a job portal, media file management and much more. It is a self-hosted script, so all the data will be stored on your own server; no third-party storage service is used here.

Tech stack 💻

  • Bootstrap 5+
  • React 18+
  • Node.js 20+
  • MongoDB 4+

User Registration and Profile 🙋🏻‍♂️

This API allows you to register an account. Passwords are encrypted before being saved in the database. The user can then log in using the email and password he entered during registration.

On successful login, a JSON Web Token (JWT) is generated and returned in the API response. The client application needs to save that token in its local storage, because that token will be sent in the headers of other APIs to identify the user. The token is also stored on the server, so if you want to log out a user manually, you can simply remove his token from the database.
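
As an illustration only (not the platform’s actual code), a token check like the one described could look roughly like this. It assumes the jsonwebtoken package, a hypothetical /me route, and a tokens array stored on the user document; none of these names are taken from the platform itself:

// Hypothetical sketch: verify the JWT sent in the request headers
const jwt = require("jsonwebtoken"); // npm install jsonwebtoken

app.post("/me", async function (request, result) {
    const token = request.headers.authorization || "";

    try {
        // decode and verify the token (the secret key is an assumption)
        const decoded = jwt.verify(token, "your-secret-key");

        // the token must also still exist in the database,
        // otherwise the user has been logged out manually
        const user = await db.collection("users").findOne({
            _id: ObjectId.createFromHexString(decoded._id),
            tokens: token
        });

        if (user == null) {
            return result.json({ status: "error", message: "Unauthorized." });
        }

        result.json({ status: "success", user: { _id: user._id, name: user.name } });
    } catch (exp) {
        result.json({ status: "error", message: "Invalid token." });
    }
});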

The user can edit his name and profile image. The email field is read-only; it cannot be changed. The user can also change his password. For that, he needs to provide his current password and enter his new password 2 times for confirmation.

Social Network Posts 📝

If you are building a social network, then this module allows you to create and manage posts. You can also share others’ posts on your timeline. You can edit or remove your own posts. Other users will be able to like, comment on or share your posts. Users will be able to edit or delete their own comments. You will be able to reply to the comments on your posts, and you can also edit or remove your reply if you want. Users can also see the list of all users who have liked or shared a post.

Pages 📃

If you are running a business, you can create a page for your business and start posting about your products or services on that page. In order to create a page, you need to set its name, tell a little about the page, and provide a cover photo for the page. Users can follow your page and start seeing your page’s posts in their newsfeed. Only the creator of a page can create posts on that page.

You can see a list of all users who have followed your page. A user can also see a list of all pages he has followed. The owner of the page can edit or delete the page. Once a page is deleted, all the posts inside it will be deleted as well.

Groups 🥳

You can create groups to build a community of like-minded people. In order to create a group, you need to enter the name of the group, a little description about the group and its cover photo. You can see a list of all groups created on the platform. There are separate links from where you can see only the groups you have created or the groups you have joined. You can join or leave a group whenever you want. However, the admin of the group can also remove you from the group if he wants.

Only the admin or group members can post in a group. Posts uploaded by the admin will be immediately displayed on the group’s newsfeed. However, posts uploaded by group members will be held pending approval from the admin. The admin can see a list of all posts uploaded by group members and can accept or decline any of them.

Admin of the group can update the name or description of the group, and he can also change the cover photo of the group. Admin can delete the group as well. Upon deleting, all the posts uploaded in that group will be deleted as well.

Media 📂

This API allows you to upload media files (images, videos) on the platform. Each media file will have a title, alt attribute, and caption. These will help you in your Search Engine Optimization (SEO). You can edit or delete your own media files whenever you want.

Friends 🙏🏻

You can make friends by sending them a friend request, just like you do on Facebook. The other user can see a list of all friend requests he has received, and he can accept or decline your request. If accepted, you both become friends and can chat with each other. You can see a list of all of your friends. At any point, you can remove a user from your friend list.

Realtime Chat 💬

You can have a realtime chat with your friends. Chats are end-to-end encrypted, which means the messages are encrypted before being sent to the server and decrypted only after the response is received from the server. Your messages remain secure in transit.

For realtime communication, we are using Socket.IO. When you send a message, an event is emitted. The receiver listens to that event and displays a notification alert that a new message has been received. If the chat is already open, the new message is automatically appended at the bottom of the chat.
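
In plain Socket.IO terms, the event flow described above looks roughly like the sketch below; the "newMessage" event name and receiverId field are only examples, not necessarily what the platform uses:

// Server (sketch): relay a chat message to the receiver
io.on("connection", function (socket) {
    socket.on("newMessage", function (data) {
        // forward the (already encrypted) message to the receiver's room
        io.to(data.receiverId).emit("newMessage", data);
    });
});

// Client (sketch): listen for incoming messages
socket.on("newMessage", function (data) {
    // show a notification, or append the message if the chat is open
});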

Job Portal 👨🏻‍🔧 👨🏻‍🍳 👩🏻‍🏫 👩🏻‍🎨

Since this is a multi-purpose platform developed in scalable technologies (Node.js & MongoDB), a job portal is also included that allows recruiters to post jobs and candidates to apply for them. The recruiter can see all the applications received for a job and can change the status of an applicant to shortlisted, interviewing, rejected, selected, etc. Candidates can upload multiple CVs and choose the relevant CV while applying for a job. The recruiter can update or delete the job at any time.

You can also filter job applications by status. If you are a recruiter, you will post a job and then start receiving applications from candidates. The recruiter can shortlist candidates and then change their status to interviewing. The recruiter can also reject a candidate, but he needs to provide a reason for the rejection. Finally, the recruiter can set the candidate as “Hired”. So there must be a way for the recruiter to filter job applications by their status.

We also have a PHP and MySQL version of Job Portal website. You can get it from here.

Admin Panel 👨🏻‍💼

An admin panel is created where you can manage all the users, social network posts and jobs created on the platform. It is a single page application created in React JS and can be deployed as a sub-domain to your main website.

The super admin can see the list of all users and can add a new user if he wants. An email with the password set by the admin will be sent to the new user. The admin can also update the user’s password, and he can delete the user; all the posts uploaded by that user will be deleted as well.

The super admin can see all the posts uploaded on the social network platform. Just like users, he can delete any post he does not feel is appropriate.

Admin can also see all the jobs posted by recruiter and if any job post goes against the standards, super admin can remove it.

Blogs 📜

Since this is a multi-purpose platform, I added a blog module that allows you to write articles on your website. The admin can create blog posts from the admin panel and they will be displayed on the user side. Users can post a comment if they are logged in. Other users or the admin can reply to their comments. The admin can manage all posts and comments from the admin panel and can delete any post or comment he does not find suitable.

We are using sockets to update the blogs in realtime. So if a user is reading a post and the admin changes something from the admin panel, it immediately gets updated on the user side. The user does not need to refresh the page.

Page Builder 🏗

We are using page builder to create blog posts. It allows you to create paragraphs, headings, images, videos and buttons.

You can set the attributes like id and class. You can also set CSS properties of each tag.

If you are using image or video, you can add alt and caption attributes as well.

Freelance Platform 👨🏻‍💻 🤑

This multi-purpose platform project also has a freelance platform where you can hire experts to get your job done. Or, if you are a skilled person, you can find work there.

There are 2 entities in freelance platform: Buyer 💰 and Seller 🤑. Buyer will create a task to be done, he will mention his budget and deadline. Buyer must have sufficient balance in their account. They can add balance using Stripe. 💳

Sellers will start bidding on that task. Seller will also send a proposal mentioning how he will get the work done, what is his offer and in how many days he will do it. 🌚

Buyer will see all the bids from sellers. He will also see a freelance profile of each bidder, telling the orders done by each seller, their ratings and reviews etc. ⭐️

Buyer can accept the bid of any seller he seems fit for the job. The seller will be notified and their order will be started. ⏰

On their order detail page, they can chat with each other. They can send messages 💬 with attachments 🧷. At any point, buyer or seller can cancel the order.

After the work is done, buyer can complete the order ✅. Once it is completed, the amount that was offered in the bid will be deducted from buyer’s account. 5 percent will be deducted as a “platform fee”, and the remaining 95% of the amount will be added in the seller’s account 🤑.

The buyer can also give a rating ⭐️ to the seller and write a review about his experience. This will help the seller build his freelance profile.

The seller can withdraw the money by entering his bank account details 🏦. These details will be sent to the admin panel 👨🏻‍💼. The admin can transfer the funds, and once the transfer is completed, the seller’s balance will be reset to 0.

Notifications 🔔

User will receive a notification when:

  • His freelance order gets completed/cancelled.
  • A new message has been received on his order.
  • His bid got accepted on freelance project.
  • His withdrawal request has been updated.
  • Someone commented on his post.
  • Someone replied to his comment.
  • Someone likes his post on social network.
  • His post has been shared by someone.

These notifications are shown under a bell icon 🔔 in the header when the user is logged in. If the user has any unread notifications, this is indicated on the bell icon.

Unread notifications have a gray background. On clicking a notification, it is marked as read.

Installer 🤹🏻‍♂️

Run the following commands in your “admin” folder:

npm install
npm start

In order to create a super admin, you need to run the following command in your “premium-api” folder once:

node installer.js

This will create a super admin in your database. You can set the email and password of your choice from “installer.js” file.

Paste the “multi-purpose-platform-nodejs-mongodb” folder in your “htdocs” folder if you are using XAMPP or MAMP, or in “www” folder if you are using WAMP.

Run the following commands in the “premium-api” folder:

npm update
npm install -g nodemon
nodemon index.js

Project can be accessed from:

http://localhost/multi-purpose-platform-nodejs-mongodb/web/index.html

Demos 🎯

1. Social network – Posts

2. Pages

3. Groups

4. Friends & chat

5. Job portal

6. Admin panel

7. Blogs

8. Freelance Platform

More features can also be added on-demand in this multi-purpose platform. If you do not want all the features, we can remove or change features as per your needs.

Files included 🗄

  • api
    • index.js
    • installer.js
    • modules
      • admin
        • admin.js
        • auth.js
      • auth-optional.js
      • auth.js
      • freelance.js
      • banks.js
      • payments.js
      • blogs.js
      • categories.js
      • chats.js
      • files.js
      • job-portal
        • cvs.js
        • gigs.js
        • jobs.js
      • mails.js
      • media.js
      • notifications.js
      • sn
        • friends.js
        • groups.js
        • pages.js
        • posts.js
      • users.js
    • package.json
    • uploads
      • private
      • public
  • web
    • freelance
      • buyer.html
      • order-detail.html
      • orders.html
      • seller.html
      • task-detail.html
    • blogs
      • index.html
      • detail.html
    • change-password.html
    • index.html
    • balance.html
    • withdraw.html
    • job-portal
      • applied.html
      • create.html
      • cv-manager.html
      • detail.html
      • edit.html
      • index.html
      • my.html
    • login.html
    • media
      • index.html
      • edit.html
    • profile.html
    • public
      • css
      • js
      • img
    • register.html
    • sn
      • chat.html
      • edit-comment.html
      • edit-post.html
      • edit-reply.html
      • friends.html
      • groups
        • create.html
        • detail.html
        • edit.html
        • index.html
        • members.html
        • my-joined.html
        • my.html
        • pending-posts.html
      • index.html
      • notifications.html
      • pages
        • create.html
        • detail.html
        • edit.html
        • index.html
        • followers.html
        • my-followed.html
        • my.html
      • post.html
      • profile.html
      • search.html
      • send-reply.html
  • admin (SPA)
    • package.json
    • public
    • src
      • App.css
      • App.js
      • index.js
      • components
        • blogs
          • AddPost.js
          • Comments.js
          • EditPost.js
          • Posts.js
        • categories
          • AddCategory.js
          • Categories.js
          • EditCategory.js
        • Dashboard.js
        • ecommerce
          • AddProduct.js
          • EditProduct.js
          • Orders.js
          • Products.js
        • files
          • Files.js
        • jobs
          • Jobs.js
        • layouts
          • Header.js
          • Footer.js
          • Sidebar.js
        • Login.js
        • sn
          • Posts.js
        • users
          • AddUser.js
          • EditUser.js
          • Users.js

Save and display images in Binary – NodeJS

In this tutorial, you will learn how to save and display images in Binary in NodeJS and MongoDB.

We will also create an API that will return a binary image as a response.

Saving images in the database has many advantages over saving images in file storage.

  1. First, if you are deploying on Heroku, they do not provide persistent file storage on their free tier, so files uploaded to the server’s filesystem are lost when the dyno restarts (for example, after 30 minutes of inactivity).
  2. Second, migrating from one deployment platform to another is easy, since you do not have to move all the uploaded files too. You can use mongodb.com for your MongoDB database and use it from any platform.
  3. Third, we will be saving images in Binary format, so they will take less space than saving them in Base64.


The following route will save the user-uploaded image as Binary in MongoDB using NodeJS.

// import the built-in file system module (no npm install needed)
const fs = require("fs")

app.post("/upload", async function (request, result) {
    // get user-uploaded file
    const image = request.files.image
  
    // reading file data
    const fileData = fs.readFileSync(image.path) // readFileSync is synchronous, no await needed
    
    // converting to binary
    const binary = Buffer.from(fileData)
    
    // saving in database
    await db.collection("images").insertOne({
        path: binary
    })
    
    // sending response back to client
    result.send("Done")
})

Check out this tutorial if you want to know how to connect with MongoDB.

Now that the image has been saved, we will create a GET route that will return the image as a base64 string.

// set EJS as templating engine
app.set("view engine", "ejs")

app.get("/", async function (request, result) {
    // get image from collection
    const image = await db.collection("images")
        .findOne({})
        
    // variable to get base64 string
    let imageString = ""
    
    // check if document exists
    if (image != null) {
        // image.path holds the BSON Binary object
        // its "buffer" property is the underlying Node.js Buffer
        imageString = "data:image/png;base64," + image.path.buffer.toString("base64")
    }
    
    // sending data to file "views/index.ejs"
    result.render("index", {
        image: imageString
    })
})

After that, we need to create a folder named “views” and inside it a file named “index.ejs” and write the following code in it:

<img src="<%= image %>" style="width: 100%;" />

That’s how you can save and display images in Binary in NodeJS and MongoDB.

MongoDB GridFS

In this tutorial, you will learn how to upload, retrieve, and delete files using MongoDB GridFS.


Upload the file to MongoDB GridFS

First, in your Node JS file, you need to create an instance of your GridFS bucket. You can create as many buckets as you want.

// include MongoDB
const mongodb = require("mongodb")

// get MongoDB client
const mongoClient = mongodb.MongoClient

// connect with MongoDB server
const client = await mongoClient.connect("mongodb://localhost:27017")

const db = client.db("mongodb_gridfs")

// create GridFS bucket instance
const bucket = new mongodb.GridFSBucket(db)

  1. First, it includes the MongoDB module.
  2. Then it gets a Mongo client object that helps in connecting with the database.
  3. Then we connect to the server.
  4. After that, we set the database.
  5. And finally, we are creating an instance of a bucket.

You can also give your bucket a name to identify.

// create GridFS bucket instance named "myBucketName"
const bucket = new mongodb.GridFSBucket(db, {
  bucketName: "myBucketName"
})

Following POST route will save the file.

// fs is a built-in Node module, no install needed
const fs = require("fs")

// npm install ejs
app.set("view engine", "ejs")

// npm install express-formidable
const expressFormidable = require("express-formidable")
app.use(expressFormidable())

app.post("/upload", function (request, result) {
  // get input name="file" from client side
  const file = request.files.file
  
  // set file path in MongoDB GridFS
  // this will be saved as "filename" in "fs.files" collection
  const filePath = (new Date().getTime()) + "-" + file.name
  
  // read user uploaded file stream
  fs.createReadStream(file.path)
  
    // add GridFS bucket stream to the pipe
    // it will keep reading and saving file
    .pipe(
      bucket.openUploadStream(filePath, {
        // maximum size for each chunk (in bytes)
        chunkSizeBytes: 1048576, // 1048576 = 1 MB
        // metadata of the file
        metadata: {
          name: file.name, // file name
          size: file.size, // file size (in bytes)
          type: file.type // type of file
        }
      })
    )
    // this callback will be called when the file is done saving
    .on("finish", function () {
      result.send("File saved.")
    })
})

Now if you check in your “mongodb_gridfs” database, you will see 2 new collections.

  1. fs.files
    • This stores the metadata of every uploaded file (the filename, length, upload date, and the metadata object we set).
  2. fs.chunks
    • This stores the actual file content, split into chunks that are linked to the file by its ID.
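
For reference, a document in fs.files looks roughly like this; the values below are only an example of the shape you will see:

{
    "_id": ObjectId("..."),
    "length": 245760,
    "chunkSize": 1048576,
    "uploadDate": ISODate("2024-01-01T10:00:00Z"),
    "filename": "1704103200000-photo.png",
    "metadata": {
        "name": "photo.png",
        "size": 245760,
        "type": "image/png"
    }
}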

Fetch all files from MongoDB GridFS

The following GET route will fetch all files uploaded to MongoDB GridFS.

app.get("/", async function (request, result) {
  const files = await bucket.find({
    // filename: "name of file" // optional filter by file name
  })
    .sort({
      uploadDate: -1
    })
    .toArray()
  result.render("index", {
    files: files
  })
})

Now you need to create a folder named “views” and inside that folder create a file named “index.ejs”.

Then you can loop through all files and display them in the image tag.

<% if (files) { %>
  <% files.forEach(function (file) { %>
    <p><%= file.filename %></p>
    <img src="image/<%= file.filename %>" style="width: 200px;" />
  <% }) %>
<% } %>

Right now, you will see a broken image. Now we need to create an API that will return the image as a response.

Return image as API response

app.get("/image/:filename", async function (request, result) {
  // get file name from URL
  const filename = request.params.filename
  
  // get file from GridFS bucket
  const files = await bucket.find({
    filename: filename
  })
  .toArray()
  
  // return error if file not found
  if (!files || files.length == 0) {
    return result.status(404).json({
      error: "File does not exists."
    })
  }
  
  // it will fetch the file from bucket and add it to pipe
  // result response is added in the pipe so it will keep
  // returning data to the client
  bucket.openDownloadStreamByName(filename)
    .pipe(result)
})

Now you will be able to view the image.

Delete file from MongoDB GridFS

First, you need to create a button after each file.

<% if (files) { %>
  <% files.forEach(function (file) { %>
    <p><%= file.filename %></p>
    <img src="image/<%= file.filename %>" style="width: 200px;" />
    
    <form action="/files/del" method="POST">
      <input type="hidden" name="_id" value="<%= file._id %>" />
      <button type="submit" class="btn btn-danger">Delete</button>
    </form>
  <% }) %>
<% } %>

Then you need to create an API in your Node JS file.

// get the ObjectId class from the MongoDB module
const ObjectId = mongodb.ObjectId

app.post("/files/del", async function (request, result) {
  // get ID from data
  const _id = request.fields._id
  
  // delete file from bucket
  await bucket.delete(new ObjectId(_id))
  
  // return response
  result.send("File has been deleted.")
})

This will delete the file and all its chunks from the database. So that’s all for now; if you face any problem in following this, kindly do let me know.

You can learn more about the grid file system from Mongo DB’s official website.

MongoDB and MySQL equivalent queries

Hello. In this article, we are going to show you some MongoDB and MySQL equivalent queries. This will help you greatly if you want to convert a MySQL project into MongoDB or vice-versa.


Introduction

First, let’s give a small introduction to both of these databases.

MongoDB

MongoDB has a schema-less database architecture, meaning that the schema or structure of the database does not need to be defined up front. The schema automatically gets created as data comes in. It is used where data needs to be loosely coupled.

MySQL

MySQL is a relational database that uses SQL (Structured Query Language). The structure of the database and its tables needs to be defined before data can be inserted. It is used where data needs to be tightly coupled.

Here is a quick comparison between MongoDB and MySQL:

  • MongoDB is schema-less; in MySQL, the schema needs to be defined.
  • MongoDB is a non-relational database; MySQL is a relational database.
  • MongoDB has collections; MySQL has tables.
  • MongoDB has documents; MySQL has rows.
  • MongoDB is used for loosely-coupled data; MySQL is used for tightly-coupled data.
  • MongoDB scales horizontally; MySQL scales vertically.
  • In MongoDB, each document can have a different structure; in MySQL, each row must have the same structure.
  • In MongoDB, data is not dependent on other collections; in MySQL, data might be dependent on other tables.
  • MongoDB uses nested objects and arrays; MySQL uses separate tables and joins them using foreign keys.

1. Creating collections/tables

MongoDB

As mentioned above, MongoDB collections do not need to be created explicitly. They are created automatically once a document is inserted into them.

MySQL

To create a table in MySQL database, you can run the following query:

CREATE TABLE IF NOT EXISTS users(
    id INTEGER NOT NULL PRIMARY KEY AUTO_INCREMENT,
    name TEXT NOT NULL,
    age INTEGER DEFAULT 0
);

This will create a table named “users” having 3 columns: “id”, “name” and “age”. “id” is a unique key that is automatically incremented when a new row is inserted, while “name” is a text field and “age” is an integer.

2. Inserting documents/rows

MongoDB

To insert a document in a MongoDB collection, you can run the following query:

db.users.insertOne({
    name: "Adnan",
    age: 30
});

This will insert a new document into the “users” collection. MongoDB will automatically assign a unique ObjectId to it, named “_id”.

{
	"_id": ObjectId("6474680ef3c486d92597e787"),
	"name": "Adnan",
	"age": 30
}

MySQL

To insert a row into a MySQL table, you would do:

INSERT INTO users(name, age) VALUES ("Adnan", 30);

3. Fetching data

MongoDB

To fetch multiple records from MongoDB collection, you can run the following query:

db.users.find({
    name: "Adnan"
}).toArray();

This will return all the documents where “name” is “Adnan”.

MySQL

In MySQL, you can run the following SQL query:

SELECT * FROM users WHERE name = "Adnan";

3.1 AND clause

You can use the $and operator in MongoDB to fetch data where all conditions must be met.

db.users.find({
    $and: [
        {
            name: "Adnan"
        },
        {
            age: 30
        }
    ]
}).toArray();

This will return all the users whose name is “Adnan” and their age is 30.

Same filter can be applied on MySQL using the following query:

SELECT * FROM users WHERE name = "Adnan" AND age = 30;

3.2 OR clause

You can use the $or operator in MongoDB to fetch data where any of the conditions is met.

db.users.find({
    $or: [
        {
            name: "Adnan"
        },
        {
            age: 30
        }
    ]
}).toArray();

This will return all the users whose name is “Adnan” or whose age is 30.

In MySQL, we would apply the above filter like this:

SELECT * FROM users WHERE name = "Adnan" OR age = 30;

3.3 Limiting, sorting and skipping data

To limit the number of records fetched, order them by name in descending order, and skip 1 document, in MongoDB you would do:

db.users.find({
        age: 30
    })
    .sort({
        name: -1
    })
    .skip(1)
    .limit(1)
    .toArray()

This will sort the users documents by name in descending order, skip 1 record and return only 1 record.

A similar query can be run in MySQL in the following way:

SELECT * FROM users WHERE age = 30 ORDER BY name DESC LIMIT 1, 1

LIMIT {skip}, {limit} is the syntax of the LIMIT clause in MySQL, so LIMIT 1, 1 skips 1 row and returns 1 row.

3.4 Less or greater than

In MongoDB, you can use $lt (less than) and $gt (greater than) operators like this:

db.users.find({
    age: {
        $lt: 30
    }
}).toArray()

This will return the users whose age is less than 30. The following query will return the users whose age is greater than 30:

db.users.find({
    age: {
        $gt: 30
    }
}).toArray()

You can also use the $lte and $gte operators for “less than or equal to” and “greater than or equal to” conditions respectively.
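
For example, to fetch the users whose age is greater than or equal to 30:

db.users.find({
    age: {
        $gte: 30
    }
}).toArray()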

Above are the MongoDB queries, following are their equivalent MySQL queries:

/* less than */
SELECT * FROM users WHERE age < 30;

/* less than and equal to */
SELECT * FROM users WHERE age <= 30;

/* greater than */
SELECT * FROM users WHERE age > 30;

/* greater than and equal to */
SELECT * FROM users WHERE age >= 30;

4. Updating data

To update a document in MongoDB collection, you would do the following:

db.users.updateMany({
    name: "Adnan"
}, {
    $set: {
        age: 31
    }
})

This will set the age to 31 for all users having the name “Adnan”. If you want to update only one user, you can use the updateOne() function instead.

In MySQL, you can do:

UPDATE users SET age = 31 WHERE name = "Adnan" LIMIT 1

4.1 Incrementing/decrementing values

To increment the value, in MongoDB you can do:

db.users.updateOne({
    name: "Adnan"
}, {
    $inc: {
        age: 3
    }
})

This will increment the value of age by 3 where the name is “Adnan”. The same operator can be used for decrementing; you just set a negative value.

db.users.updateOne({
    name: "Adnan"
}, {
    $inc: {
        age: -3
    }
})

Its equivalent MySQL query would be:

UPDATE users SET age = age + 3 WHERE name = "Adnan"

5. Delete data

To delete a document from MongoDB collection, you can run the following query:

db.users.deleteOne({
    name: "Adnan"
})

This will delete one document from users collection whose name is “Adnan”. To delete multiple, you can use deleteMany() function instead.

In MySQL, you can do:

DELETE FROM users WHERE name = "Adnan" LIMIT 1

6. Relationships

MongoDB

MongoDB is not a relational database. Data saved in one collection is not dependent on another collection’s data.

For example, suppose you want to save the job history of each user. You do not have to create a separate collection for that. You can simply push a new job into an array inside the user’s document.

db.users.findOneAndUpdate({
    name: "Adnan"
}, {
    $push: {
        "jobs": {
            _id: ObjectId(),
            title: "Developer",
            company: "adnan-tech.com",
            period: "3 years"
        }
    }
})

This will create an array “jobs” if not already created and insert a new element in that array.

{
	"_id" : ObjectId("64748227f3c486d92597e78a"),
	"name" : "Adnan",
	"age" : 30,
	"jobs" : [
		{
			"_id" : ObjectId("647490a4f3c486d92597e78e"),
			"title" : "Developer",
			"company" : "adnan-tech.com",
			"period" : "3 years"
		}
	]
}

MySQL

MySQL, on the other hand, is a relational database. Data saved in one table might be dependent on another table’s data.

If you want to achieve the above in MySQL, you would have to create a separate table with a user_id foreign key that references your “users” table.

CREATE TABLE IF NOT EXISTS jobs(
    id INTEGER NOT NULL PRIMARY KEY AUTO_INCREMENT,
    title TEXT NOT NULL,
    company TEXT NOT NULL,
    period TEXT NOT NULL,
    user_id INTEGER NOT NULL,

    CONSTRAINT fk_user_id_jobs FOREIGN KEY (user_id) REFERENCES users(id)
)

After that, you can insert a new job of that user using the following query:

INSERT INTO jobs (title, company, period, user_id) VALUES ("Developer", "adnan-tech.com", "3 years", 1)

This will insert a new row in “jobs” table and link it with “users” table using its foreign ID “user_id”.

(Screenshot: the “jobs” table)

To fetch the data both from users and jobs table, you have to perform a join operation.

SELECT * FROM users INNER JOIN jobs ON users.id = jobs.user_id

This will return all the users along with their jobs.

(Screenshot: the “jobs” table joined with the “users” table)

So you have learned how to convert many MongoDB queries to their equivalent MySQL queries and vice-versa. If you have any question related to this, feel free to post it in the comments section below. You can also check out more of our MongoDB tutorials to learn more.

Securely upload and view image in Node JS and Mongo DB

Learn how to securely upload and view images in Node JS and Mongo DB. Most of the time, when we create a website where users can upload images, we face the problem of securing them from unauthorized access. For example, you are creating a website where users can upload pictures and share them with only selected people. You do not want any other person to view those pictures.

This can be done easily with Node JS. First, we are going to create an empty project. So run the following commands in your terminal.

> npm init
> npm install express express-formidable ejs mongodb
> npm install -g nodemon
> nodemon server.js

Upload image

After that, create a file named server.js and write the following code in it.

// server.js

const express = require("express")
const app = express()
app.set("view engine", "ejs")

const port = process.env.PORT || 3000
app.listen(port, function () {
    console.log("Server started.")

    app.get("/", function (request, result) {
      result.render("index")
    })
})

Now, we will create a folder named views and inside this folder, create a file named index.ejs. Following will be the content of this file.

<!-- views/index.ejs -->

<form method="POST" action="/upload" enctype="multipart/form-data">
  <input type="file" name="file" />
  <input type="submit" value="Upload" />
</form>

This will create a form where user can select image. You can access this from the URL:

http://localhost:3000

Now in our server.js file, we will include the file system module, express formidable module and connect Mongo DB.

// server.js

const fs = require("fs")
const mongodb = require("mongodb")
const ObjectId = mongodb.ObjectId
const mongoClient = new mongodb.MongoClient("mongodb://localhost:27017")

const expressFormidable = require("express-formidable")
app.use(expressFormidable())

// Connect the client to the server (optional starting in v4.7)
await mongoClient.connect()

// Establish and verify the connection
// (db() is synchronous, so no await is needed here)
const db = mongoClient.db("upload_view")
await db.command({ ping: 1 })
console.log("Database connected")

After that, we will create a route that will handle this request.

// server.js

app.post("/upload", async function (request, result) {
  const file = request.files.file

  const fileData = fs.readFileSync(file.path) // synchronous read, no await needed
  if (!fileData) {
    console.error(fileData)
    return
  }

  const filePath = "uploads/" + (new Date().getTime()) + "-" + file.name
  fs.writeFileSync(filePath, fileData)

  await db.collection("files").insertOne({
    path: filePath,
    name: file.name,
    size: file.size
  })

  result.send("File has been uploaded.")
})

This will save the file in your uploads folder. If you do not have that folder, you need to create it. It will also save the uploaded file’s metadata in Mongo DB.

You will not be able to view the images directly from their file path in the browser, since the uploads folder is not served publicly. So we need to create an API that allows users to view the image.

View image

First, we will fetch all the images from Mongo DB. So change your main route to the following:

// server.js

app.get("/", async function (request, result) {
  const files = await db.collection("files").find({}).toArray()
  result.render("index", {
    files: files
  })
})

Now, in your views/index.ejs, we will loop through all images and display their name and a link to view.

<!-- views/index.ejs -->

<% files.forEach(function (file) { %>
  <p>
    <a href="image/<%= file._id %>">
      <%= file.name %>
    </a>

    <img src="image/<%= file._id %>" style="width: 300px;" />
  </p>
<% }) %>

Finally, we need to create an API that will return the content of the image. This will allow the image to be viewed on the user side.

// server.js

app.get("/image/:_id", async function (request, result) {
  const _id = request.params._id
  const file = await db.collection("files").findOne({
    _id: new ObjectId(_id)
  })

  if (file == null) {
    result.json({
      status: "error",
      message: "File not found."
    })
    return
  }

  const fileData = fs.readFileSync(file.path)
  if (!fileData) {
    console.error(fileData)
    return
  }
  result.writeHead(200, {
    "Content-Type": "image/png",
    "Content-Length": fileData.length
  })
  result.end(fileData)
})

Now you will be able to view the image as well.

Pagination on arrays – MongoDB

In this tutorial, we will show you how to do pagination on arrays inside MongoDB documents.

If you have a very large database, fetching all the documents in one query might be very slow. But we have a solution: Pagination. It allows you to fetch a few records in one query and the next few in another query. For example, if you have 1000 users, the first query will fetch users from 0 to 100, the second query will fetch users from 101 to 200, and so on.

Problem

Pagination comes with a problem when data is stored in arrays. In Mongo DB, the data is stored in JSON format, and you might be saving data in arrays. For example, take the following document:

db.collection("users").insertOne({
	name: "Adnan",
	workouts: [{
		name: "Push ups",
		sets: 10,
		reps: 30
	}, {
		name: "Pull ups",
		sets: 7,
		reps: 10
	}, {
		name: "Crunches",
		sets: 5,
		reps: 50
	}, {
		name: "Deadlifts",
		sets: 3,
		reps: 15
	}, {
		name: "Shoulder press",
		sets: 8,
		reps: 8
	}]
})

Here, one document has a “workouts” array, and it might hold a lot more data. For example, the user we created above has 5 elements in the “workouts” array, but you want to show 2 workouts per query. If you run a regular query to fetch the user’s document, it will return all 5 workouts:

// not a solution

// return the whole object
const data = await db.collection("users").find({
	name: "Adnan"
}).toArray()
console.log(data[0])

But it will return the whole object with all 5 “workouts” elements, which is not the desired outcome.

So how will you paginate on that?

Solution

This is where the Mongo DB $slice operator comes in. It works similarly to the JavaScript slice() function: it cuts the array based on the provided skip and limit values. So we will use the following query to get the user’s document, but with only a few workouts:

// solution

// return the first 2 workouts
const data = await db.collection("users").find({
	name: "Adnan"
}, {
	projection: {
		workouts: {
			$slice: [0, 2]
		}
	}
}).toArray()
console.log(data[0])

The first $slice value is an index, so 0 means the first element of the array, and the second value is the number of elements to fetch. So it will fetch 2 array elements starting at index 0.

The second parameter to the find() function is the options object with the projection, which tells MongoDB which keys you want to fetch. You write the key names inside the “projection” object. Now if you want to fetch the next 2 records, you can simply do the following:

// return the workouts from 2 to 4 
const data = await db.collection("users").find({
	name: "Adnan"
}, {
	projection: {
		workouts: {
			$slice: [2, 2]
		}
	}
}).toArray()
console.log(data[0])

The skip value is set to 2, which means the slice starts from index 2, the 3rd element of the array, because the first and second elements were already fetched in the previous query. The limit is again set to 2, so we are again fetching 2 records from the array.

That’s how you can keep going and implement pagination on your documents’ arrays in MongoDB. If you face any problem in following this, kindly do let me know.
