Hello there! I hope you're interested in a modern approach to building APIs with Node.js, because that's the main theme of this tutorial.
Let's take a quick look at what's hidden in this article:
- setting up Koa 2 server
- creating basic API folder structure
- transpiling async/await to generator functions
- implementing basic Model with Data Abstraction Layer in Mind
- wiring up basic API CRUD
It's the first tutorial in the "Node.js Heaven" series - the full tutorial lives on bugless.me.
The first thing everyone should do when building an API is to describe its dependencies in a package.json file. So let's create yet another (or your first, if you're a novice) package.json with the following CLI command:

```bash
npm init --yes
```
The --yes flag answers yes to all of npm's questions for you. After this step, you'll have a basic package.json.
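For reference, the generated file will look roughly like this (the name comes from your folder, so treat the values as a sketch):

```json
{
  "name": "koa-tutorial",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
```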
Now we can install all the dependencies and save them to package.json:

```bash
npm i koa@2 koa-router@next koa-bodyparser@next gulp gulp-babel babel-plugin-syntax-async-functions config --save
```

So, let's look at them closely:
- koa@2 - the next generation of the well-known Koa framework, with support for async/await functions
- koa-router@next - routing with support for Koa 2 and async/await
- koa-bodyparser@next - parses JSON in requests for us
- gulp - an efficient, easy-to-use build system, you'll love it
- gulp-babel - we'd like to set up transpiling with Babel
- babel-plugin-syntax-async-functions - lets Babel parse async/await syntax (the actual transpiling to generators is done by a transform plugin we'll add later)
- config - there is no API without config, trust me!
Now let's create app.js at the root of the project:

```js
const Koa = require('koa');
const Router = require('koa-router');
const bodyParser = require('koa-bodyparser');
const config = require('config');

const app = new Koa();
const router = new Router();

app.use(bodyParser());

router.get('/', (ctx) => ctx.body = {hello: 'world'});

app.use(router.routes());

app.listen(config.port, () => {
  console.info(`Listening to http://localhost:${config.port}`);
});
```

Also, you'll need to create config/default.js at the root of your project with a simple export:

```js
module.exports = {
  port: 1234
};
```
Great news: we can now run node app.js to launch the server and get our beautiful {hello: 'world'} response!
Node.js is considered hard because of its asynchronous nature, and that's where the async/await approach comes to the rescue! It tames the complexity of callbacks and handles promises just like good old synchronous code.
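To make that concrete, here is a tiny self-contained sketch (fetchUser and fetchPosts are hypothetical helpers, not part of our API):

```js
// Hypothetical helpers returning promises, just for illustration
const fetchUser = (id) => Promise.resolve({id, name: 'Ann'});
const fetchPosts = (user) => Promise.resolve([`post by ${user.name}`]);

// Promise-chain style: nesting grows with every dependent step
function loadProfileChained(id) {
  return fetchUser(id)
    .then((user) => fetchPosts(user)
      .then((posts) => ({user, posts})));
}

// The same flow with async/await reads like synchronous code
async function loadProfile(id) {
  const user = await fetchUser(id);
  const posts = await fetchPosts(user);
  return {user, posts};
}
```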
The next thing we want is error-handling middleware, but before implementing it, we must keep in mind that the latest Node.js version at the time of writing (6.5.0) still doesn't support async/await functions natively. That's exactly where Babel comes in handy!
Let's set it up! First, install the required Babel plugin:
```bash
npm install babel-plugin-transform-async-to-generator --save
```
This plugin will manage all the hard things for us. To use it, create gulpfile.js with the following default task:

```js
const gulp = require('gulp');
const babel = require('gulp-babel');

gulp.task('default', () => {
  return gulp.src(['app.js', 'src/**/*.js'])
    .pipe(babel({
      plugins: ['transform-async-to-generator']
    }))
    .pipe(gulp.dest('dist'));
});
```

Yep, that's how we do it! The last small step of our preparations is to update the scripts section of package.json:
"scripts": { "start": "gulp && node dist/app.js", "test": "echo \"Error: no test specified\" && exit 1" }
Now that every async function can be transpiled, let's create a src folder, where we'll put all the source code, and add the next few lines of code to src/middlewares/handleErrors.js:
```js
module.exports = async (ctx, next) => {
  try {
    await next();
  } catch (e) {
    const resError = {
      code: 500,
      message: e.message,
      errors: e.errors
    };
    if (e instanceof Error) {
      Object.assign(resError, {stack: e.stack});
    }
    Object.assign(ctx, {body: resError, status: e.status || 500});
  }
};
```
So here is our first async function, which will help us catch every error and send it in the response instead of silently writing it to the console. As the last step, require the middleware in app.js, register it with app.use(handleErrors), and add one more endpoint, which will surely be async:
```js
router.get('/error/test', async () => {
  throw Error('Error handling works!');
});
```
Now, when we run npm start and open http://localhost:1234/error/test in the browser, we get the error with a detailed stack trace. Victory!
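The JSON body of that response should look roughly like this (the stack trace will of course differ on your machine):

```json
{
  "code": 500,
  "message": "Error handling works!",
  "stack": "Error: Error handling works!\n    at ..."
}
```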
Okay, our next challenge is simplifying work with the DB. There is a well-known solution called a data abstraction/access layer, but its implementation may depend on a lot of different factors. In our case a very simple DAL will be enough, and it will still show us all the benefits.
So, here is our roadmap:
- install the mongodb npm module
- set up a connection to the DB on startup
- create a simple model with CRUD (create, read, update and delete) methods
- pass the db instance, received in the previous step, to the model we created
- actually use it :)
Before we go further, please install and run MongoDB, as it will be our toy today (https://docs.mongodb.com/manual/installation/)
1. npm i mongodb --save - now you have that awesome mongodb driver
2. Setting up a basic connection is quick and easy. Let's create a db folder inside src and place an index.js file there with the following simple code:

```js
const MongoClient = require('mongodb').MongoClient;
const config = require('config');

module.exports = {
  connect: async () => {
    await MongoClient.connect(config.db.url);
  }
};
```

The main point here is that the connect method of MongoClient can be used both ways: with a callback or as a promise. As you can see, we prefer the second way and combine it with async/await control flow. That's simple! Don't forget to update your config/default.js to look like this:
```js
module.exports = {
  port: 1234,
  db: {
    url: 'mongodb://localhost:27017/koatutor'
  }
};
```
Our finishing touches change app.js a little (the new db require and the startup sequence at the bottom):
```js
const Koa = require('koa');
const Router = require('koa-router');
const bodyParser = require('koa-bodyparser');
const config = require('config');
const handleErrors = require('./middlewares/handleErrors');
const db = require('./db');

const app = new Koa();
const router = new Router();

app.use(handleErrors);
app.use(bodyParser());

router.get('/error/test', async () => {
  throw Error('Error handling works!');
});
router.get('/', (ctx) => ctx.body = {hello: 'world'});

app.use(router.routes());

db
  .connect()
  .then(() => {
    app.listen(config.port, () => {
      console.info(`Listening to http://localhost:${config.port}`);
    });
  })
  .catch((err) => {
    console.error('ERR:', err);
  });
```
As we can see here, an async function can also be consumed through its promise interface, which is fantastic!
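A tiny sketch of that fact (nothing project-specific here):

```js
async function answer() {
  return 42;
}

// An async function always returns a promise, so the usual .then()/.catch() work
answer().then((value) => console.log(value)); // logs 42
```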
3. It's time for our first model. Let's imagine that we need to organize all the birds all over the world and store them in our DB. That's where the Bird model comes to the rescue!
A few changes touched src/db/index.js; here are its connect method and export now:
```js
const MongoClient = require('mongodb').MongoClient;
const config = require('config');
const Model = require('./model');

let db;

class Db {
  async connect() {
    if (!db) {
      db = await MongoClient.connect(config.db.url);
      this.Bird = new Model(db, 'birds');
    }
  }
}

module.exports = new Db();
```
Using a class is absolutely necessary here, because it allows us to map our models onto the db instance. This is the moment when we implement step 4 of our roadmap - pass db as the first argument to the Model. The Model itself looks very simple for now (src/db/model.js):
```js
class Model {
  constructor(db, collectionName) {
    this.name = collectionName;
    this.db = db;
  }
}

module.exports = Model;
```
That's how we do it!
So what's next? Let's start with the create operation:
```js
class Model {
  constructor(db, collectionName) {
    this.name = collectionName;
    this.db = db;
  }

  async insertOne(data) {
    const operation = await this.db.collection(this.name).insertOne(data);
    if (operation.result.ok !== 1 || operation.ops.length !== 1) {
      throw new Error('Db insertOne error');
    }
    return operation.ops[0];
  }
}

module.exports = Model;
```
The reason we named the create operation insertOne is simply consistency with the MongoDB API.
We also verify that the insert succeeded by checking the result.ok and ops.length properties. In the return statement we just take the one and only inserted document and show it to the world.
As a final touch, add a route in app.js so requests can arrive:
```js
...
router.get('/', (ctx) => ctx.body = {hello: 'world'});

router.post('/birds', async (ctx, next) => {
  const data = ctx.request.body;
  ctx.body = await db.Bird.insertOne(data);
});

app.use(router.routes());

db
  .connect()
...
```
And now we can use our API the right way:
curl -H "Content-Type: application/json" -X POST -d '{"bird":"seagull","age":"3"}' http://localhost:1234/birds
It would be nice to get the stored bird back, so the read operation of our Model's CRUD is next:
```js
const ObjectId = require('mongodb').ObjectID;

class Model {
  constructor(db, collectionName) {
    this.name = collectionName;
    this.db = db;
  }

  async insertOne(data) {
    const operation = await this.db.collection(this.name).insertOne(data);
    if (operation.result.ok !== 1 || operation.ops.length !== 1) {
      throw new Error('Db insertOne error');
    }
    return operation.ops[0];
  }

  async findOneById(id) {
    const query = {_id: ObjectId(id)};
    const result = await this.db.collection(this.name).findOne(query);
    if (!result) {
      throw new Error('Db findOneById error');
    }
    return result;
  }
}

module.exports = Model;
```
What's new, what's old? We now have the ObjectId wrapper, which helps us fetch documents from the db. If we passed the id received in the request as a raw string, MongoDB would return nothing. Notice that findOne returns the document directly, so we just check that it was found and pass it back to the route.
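One extra caveat worth knowing (not part of the tutorial code, just a defensive variant): ObjectId throws on strings that are not valid 24-character hex ids, so you could guard the lookup with ObjectId.isValid:

```js
const ObjectId = require('mongodb').ObjectID;

// Sketch of a defensive lookup; `collection` is assumed to be a MongoDB
// collection object, e.g. db.collection('birds').
async function findOneByIdSafe(collection, id) {
  if (!ObjectId.isValid(id)) {
    throw new Error('Invalid id'); // reject malformed ids before querying
  }
  const result = await collection.findOne({_id: ObjectId(id)});
  if (!result) {
    throw new Error('Document not found');
  }
  return result;
}
```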
Don't forget to add the route in app.js:
```js
...
router.get('/', (ctx) => ctx.body = {hello: 'world'});

router.post('/birds', async (ctx, next) => {
  const data = ctx.request.body;
  ctx.body = await db.Bird.insertOne(data);
});

router.get('/birds/:id', async (ctx, next) => {
  const id = ctx.params.id;
  ctx.body = await db.Bird.findOneById(id);
});

app.use(router.routes());
...
```
Here we see good old params - the id from the route can be accessed via ctx.params, and that's great. So, npm start the API and send a request (change 57def7f270e422085ca61d28 to an id from your DB!):

```bash
curl -X GET http://localhost:1234/birds/57def7f270e422085ca61d28
```
Now we can happily access every stored document!
There are only two operations left - update and delete. Here is the self-describing code:
```js
const ObjectId = require('mongodb').ObjectID;

class Model {
  constructor(db, collectionName) {
    this.name = collectionName;
    this.db = db;
  }

  ...

  async findOneAndUpdate(id, data) {
    const query = {_id: ObjectId(id)};
    const modifier = {$set: data};
    const options = {returnOriginal: false};
    const operation = await this.db
      .collection(this.name)
      .findOneAndUpdate(query, modifier, options);
    if (!operation.value) {
      throw new Error('Db findOneAndUpdate error');
    }
    return operation.value;
  }

  async removeOne(id) {
    const query = {_id: ObjectId(id)};
    const operation = await this.db.collection(this.name).remove(query);
    if (operation.result.n !== 1) {
      throw new Error('Db remove error');
    }
    return {success: true};
  }
}

module.exports = Model;
```
Take a look at the $set operator, which lets us partially update a document in the db, touching only the properties specified in data. The options object contains the returnOriginal property, set to false because we want the updated document as the result of the operation. Error handling is done by checking value in findOneAndUpdate and result.n in removeOne.
Last but not least - bring our new functionality to the routes in app.js:
```js
...
router.get('/birds/:id', async (ctx, next) => {
  const id = ctx.params.id;
  ctx.body = await db.Bird.findOneById(id);
});

router.put('/birds/:id', async (ctx, next) => {
  const id = ctx.params.id;
  const data = ctx.request.body;
  ctx.body = await db.Bird.findOneAndUpdate(id, data);
});

router.del('/birds/:id', async (ctx, next) => {
  const id = ctx.params.id;
  ctx.body = await db.Bird.removeOne(id);
});

app.use(router.routes());
...
```
Now we can update our document with the following request:
curl -H "Content-Type: application/json" -X PUT -d '{"appearance":"white","age":"5"}' http://localhost:1234/birds/57def7f270e422085ca61d28
This update will $set (rewrite) the age property to "5" and add a new one, appearance, setting it to "white".
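The response should be the already-updated document, roughly like this:

```json
{
  "_id": "57def7f270e422085ca61d28",
  "bird": "seagull",
  "age": "5",
  "appearance": "white"
}
```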
Be careful while updating documents: doing it without $set will overwrite the whole document, which can lead to data loss.
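Here is a minimal sketch of that pitfall, assuming the driver version used in this tutorial (which treats an update document without operators as a full replacement); overwriteBird is a hypothetical helper just for illustration:

```js
const ObjectId = require('mongodb').ObjectID;

// `db` is a connected MongoDB database instance, `id` the bird's id string.
async function overwriteBird(db, id) {
  // Starting document: { _id: ..., bird: 'seagull', age: '3' }
  await db.collection('birds').findOneAndUpdate(
    {_id: ObjectId(id)},
    {appearance: 'white'} // no $set - treated as a replacement document
  );
  // The stored document is now just { _id: ..., appearance: 'white' }:
  // bird and age are gone, because the whole document was replaced.
}
```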
Okay, let's remove the gull:

```bash
curl -H "Content-Type: application/json" -X DELETE http://localhost:1234/birds/57def7f270e422085ca61d28
```
Oh yes, it works perfectly! That wasn't so hard after all.
A few remarks for the future:
- there should also be a service layer between the routes and the DAL; it was left out of this tutorial for the sake of simplicity
- there should also be a findAll method (implemented with streams) in the DAL, and we'll talk about it in the next article ;)
- there should also be at least some simple tests; Mocha allows painless testing, and the next article will show you how
- there should also be basic eslint checks - look into it
Thanks for your attention and patience, hope you enjoyed our journey!