To connect to MongoDB from Express and Node, you will first need to install the official Mongo driver:
$ npm install mongodb
Once the official Mongo driver is installed, you can connect to the local MongoDB server and issue commands from your JavaScript program as follows:
// import MongoDB driver
const MongoClient = require('mongodb').MongoClient;
const options = { useUnifiedTopology: true, writeConcern: { j: true } };
// connection URL
const url = 'mongodb://localhost:27017';
// create a new MongoClient
const client = new MongoClient(url, options);
// use connect method to connect to the server
client.connect((err) => {
  // obtain dogs collection in animals db
  const dogs = client.db('animals').collection('dogs');
  // insert document {breed: 'poodle', size: 'small'}
  dogs.insertOne({ breed: 'poodle', size: 'small' }, (err, result) => {
    // retrieve any document with breed == 'poodle'
    dogs.find({ breed: 'poodle' }).toArray((err, docs) => {
      // print retrieved documents in console
      console.log(docs);
      client.close();
    });
  });
});
The above code connects to MongoDB, saves a document { breed: 'poodle', size: 'small' } to the dogs collection of the animals database, and retrieves it. Save the code to a file, say animals.js, and run it as follows:
$ node animals.js
If everything is set up properly, you will get a result similar to the following:
[ { _id: 5eb46d42dad26b0087a166be, breed: 'poodle', size: 'small' } ]
Congratulations! You just issued your first MongoDB commands in Node.js using the official Mongo driver.
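As a side note, the driver also exposes a promise-based API, so the same sequence can be written with async/await instead of nested callbacks. Here is a sketch, assuming the same local server, database, and collection as above:
const { MongoClient } = require('mongodb');

async function run() {
  const client = new MongoClient('mongodb://localhost:27017',
    { useUnifiedTopology: true, writeConcern: { j: true } });
  // connect() returns a promise when no callback is given
  await client.connect();
  const dogs = client.db('animals').collection('dogs');
  await dogs.insertOne({ breed: 'poodle', size: 'small' });
  const docs = await dogs.find({ breed: 'poodle' }).toArray();
  console.log(docs);
  await client.close();
}

run().catch(console.error);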
Now that we have learned the basics, let us consider a slightly more advanced Express app with the following structure:
app/
  app.js
  routes/
    dogs.js
    cats.js
  views/
    dogs.ejs
    cats.ejs
In the file structure above, there are two route handlers (or middleware): dogs.js and cats.js. The dogs.js middleware displays information on dogs, and cats.js on cats. Both of them need to access the MongoDB database. If you just use the code from the previous example inside each of the two middleware, it will work, but it will be highly inefficient: it will create a new connection to the MongoDB server for every request. In a high-throughput application this can result in a constant flood of new connection requests to your database, which will adversely affect the performance of both your database and your application.
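To make the problem concrete, the naive version of the dogs route would look roughly like this (a sketch of the pattern to avoid, assuming the same local server used above):
// routes/dogs.js -- the inefficient approach: a brand-new client on every request
let express = require('express');
let router = express.Router();
const MongoClient = require('mongodb').MongoClient;

router.get('/list', (req, res) => {
  // a new connection is opened (and torn down) for each incoming request
  const client = new MongoClient('mongodb://localhost:27017',
    { useUnifiedTopology: true, writeConcern: { j: true } });
  client.connect((err) => {
    client.db('animals').collection('dogs').find().toArray((err, docs) => {
      res.render('dogs', { animals: docs });
      client.close();
    });
  });
});

module.exports = router;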
To avoid this inefficiency, let’s create a separate db.js module with the following content.
const MongoClient = require('mongodb').MongoClient;
const options = { useUnifiedTopology: true, writeConcern: { j: true } };

let client = null;

// create a connection to url and call callback()
function connect(url, callback) {
  if (client == null) {
    // create a mongodb client
    client = new MongoClient(url, options);
    // establish a new connection
    client.connect((err) => {
      if (err) {
        // error occurred during connection
        client = null;
        callback(err);
      } else {
        // all done
        callback();
      }
    });
  } else {
    // connection was established earlier. just call callback()
    callback();
  }
}

// get database using pre-established connection
function db(dbName) {
  return client.db(dbName);
}

// close open connection
function close() {
  if (client) {
    client.close();
    client = null;
  }
}

// export connect(), db() and close() from the module
module.exports = {
  connect,
  db,
  close
};
The above db.js module will help our app connect to MongoDB once when it starts, so that all middleware can share the same pre-established connection. To see how, here is an example of how app.js may use the db.js module to establish the initial connection:
let express = require('express');
let app = express();
let client = require('./db');
let dogs = require('./routes/dogs');
let cats = require('./routes/cats');

app.set('view engine', 'ejs');
app.set('views', './views');
app.use('/dogs', dogs);
app.use('/cats', cats);

// connect to Mongo on start
client.connect('mongodb://localhost:27017/', (err) => {
  if (err) {
    console.log('Unable to connect to Mongo.');
    process.exit(1);
  } else {
    app.listen(3000, () => {
      console.log('Listening on port 3000...');
    });
  }
});
When the app starts, it connects to the MongoDB server using the connect() method of our db.js module and listens on port 3000.
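The db.js module also exports a close() function, which the tutorial itself never calls. One plausible use for it (our assumption, not part of the original setup) is to release the shared connection when the app is asked to shut down:
// hypothetical addition to app.js; `client` here is our db.js module
process.on('SIGINT', () => {
  client.close();
  process.exit(0);
});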
Later, when a request is routed to the dogs.js middleware, it may retrieve all the dogs data from MongoDB as follows:
let express = require('express');
let router = express.Router();
let client = require('../db');

router.get('/list', (req, res) => {
  let dogs = client.db('animals').collection('dogs');
  dogs.find().toArray((err, docs) => {
    res.render('dogs', { animals: docs });
  });
});

module.exports = router;
Note that dogs.js does not make any connect() call. The connection was already established by app.js during the initial setup. The dogs.js middleware can therefore directly retrieve all data from the dogs collection of the animals database over this pre-established connection by calling client.db('animals').collection('dogs'). Sharing the same connection for all requests, neat!
Note: Even though our code creates a database connection only once during the app’s initial setup, the MongoDB driver quietly creates multiple connections (a “connection pool”) behind the scenes, so that it can issue multiple database queries simultaneously even when previous queries have not yet completed. Most MongoDB drivers support a parameter that sets the maximum number of connections (the pool size) available to your application. The connection pool size can be thought of as the maximum number of concurrent requests that your driver can service. The default pool size for the Node driver is 5. If you anticipate your application receiving many concurrent or long-running requests, we recommend increasing your pool size; adjust accordingly!
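For example, with the 3.x callback-style driver used throughout this tutorial, the pool size can be raised through the connection options. A sketch (the option is called poolSize in the 3.x driver and maxPoolSize in 4.x and later):
const MongoClient = require('mongodb').MongoClient;

// raise the pool size from the default of 5 to 20 connections
// (20 is an arbitrary example; pick a value that suits your workload)
const options = {
  useUnifiedTopology: true,
  writeConcern: { j: true },
  poolSize: 20
};

const client = new MongoClient('mongodb://localhost:27017', options);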
Sharing the same pre-established connection for all requests is neat, but we need one more important refactoring of our code.
With our current setup, the dogs.js middleware accesses the dogs data roughly as follows:
let client = require('../db');
let dogs = client.db('animals').collection('dogs');
dogs.find().toArray((err, docs) => {
  // show result
});
That is, the specific details of where and how our data is stored are exposed to our middleware, which goes against the well-established Model-View-Controller pattern. This has many negative consequences down the road. For example, if we need to migrate our data to a different storage engine, we will have to modify all our middleware, even though there is no fundamental change in our data itself.
So let us factor out the data-handling code into separate “models” like the following:
app/
  app.js
  db.js
  routes/
    dogs.js
    cats.js
  views/
    dogs.ejs
    cats.ejs
  models/
    dogs.js
    cats.js
As you can see above, we now have two additional files in the models folder.
The dogs.js model will look like the following:
let client = require('../db');

function getAll(callback) {
  let dogs = client.db('animals').collection('dogs');
  dogs.find().toArray((err, docs) => {
    callback(err, docs);
  });
}

module.exports = {
  getAll
};
Given this dogs.js model, our dogs.js middleware can now display all dogs as follows:
let express = require('express');
let router = express.Router();
let dogs = require('../models/dogs');

router.get('/list', (req, res) => {
  dogs.getAll((err, docs) => {
    res.render('dogs', { animals: docs });
  });
});

module.exports = router;
Note that the dogs.js middleware now has no knowledge of how and where the dogs data is stored. All it knows is that the dogs data can be obtained simply by importing the dogs.js model and calling dogs.getAll(). Our models are the only ones that know that our data is in fact stored and managed by MongoDB. Also remember that all our models share the same database connection through db.js. Twice neat!
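For completeness, the cats.js model, which the tutorial does not show, would presumably mirror its dogs counterpart. A minimal sketch under that assumption:
// models/cats.js -- same pattern as models/dogs.js
let client = require('../db');

// retrieve every document from the cats collection of the animals database
function getAll(callback) {
  let cats = client.db('animals').collection('cats');
  cats.find().toArray((err, docs) => {
    callback(err, docs);
  });
}

module.exports = {
  getAll
};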
To make this tutorial simple, we did not include any error handling logic in our sample code. In reality, errors are bound to happen, and you will have to carefully identify places where errors may occur and add the appropriate logic to handle them. This unfortunately is a messy and laborious process for which we cannot provide much help. You just have to do it.
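That said, as a single illustration, the /list route from above could at least check the error returned by the model and report a failure to the client. This is a sketch; responding with a 500 status is our assumption, not part of the original tutorial:
let express = require('express');
let router = express.Router();
let dogs = require('../models/dogs');

router.get('/list', (req, res) => {
  dogs.getAll((err, docs) => {
    if (err) {
      // the query failed; report a server error instead of rendering a broken page
      res.status(500).send('Unable to retrieve dogs data.');
    } else {
      res.render('dogs', { animals: docs });
    }
  });
});

module.exports = router;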