The purpose of this article is to demonstrate how quickly you can build complete applications with Node.js. This adventure came about whilst working on an engagement. The customer wanted to build a request broker between two systems that needed to be integrated, both of which had their own REST APIs. The customer in this case opted to develop their own connector in Java, and estimated a six-month development turnaround.
I actually started with a script written in CoffeeScript that I was given to act as a mock of the target system. The script uses Express to create a set of REST endpoints which return a fixed response.
I would have liked to have taken a more TDD approach and written unit tests but for this application there isn’t really a good way of doing this. However, since we have a clear line of separation between the three components, testing at the boundaries of each is really easy. I can write integration tests for each component, and a complete end to end test quite easily. My client test script is simply a bash script which uses curl to send a pre-canned JSON request. I can also test the target system in isolation, in the same way.
In my project directory I did the following to get the folder structure I wanted:

```shell
mkdir mock
mkdir node-client
mkdir node-connector
mkdir coffee-client
```
I put the existing CoffeeScript file ‘index.coffee’ in the ‘mock’ folder, then did the following:
```shell
cd mock
npm init            # answer the questions in the text wizard
npm install express
```
The ‘init’ command sets up the folder as a Node application. The ‘install’ command downloads and installs Express and its dependencies into the current project folder. It also updates ‘package.json’ to include Express (depending on your npm version, you may need the ‘--save’ flag for this).
```shell
cd ../coffee-client
```
I then created a new script file and supporting files with the following content:
send_single.sh:

```shell
curl -H "Accept: application/json" -H "Content-Type: application/json" -vX POST -d @single.json http://localhost:8081/automationrequests
```
single.json:

```json
[
  {
    "action": "Do Something 1",
    "application": "API",
    "created": "15-OCT-2015 17:17:35",
    "description": "This is a test of the REST API 1",
    "inboundPayload": {
      "Domain": ".uk.sombank.com",
      "Host Name": "LNRBAEST01",
      "IP Address": "10.192.1.123",
      "OS Version": "Blah Blah windows 2003"
    },
    "location": "UNITED KINGDOM",
    "sourceRequestId": "ID00001"
  }
]
```
get_status.sh:

```shell
curl -H "Accept: application/json" -H "Content-Type: application/json" -vX GET http://localhost:8081/automationrequests/status/$1
```
../node-client/send_single.sh:

```shell
curl -H "Accept: application/json" -H "Content-Type: application/json" -vX POST -d @single.json http://localhost:8079/automationrequests/
```
../node-client/send_batch.sh:

```shell
curl -H "Accept: application/json" -H "Content-Type: application/json" -vX POST -d @batch.json http://localhost:8079/automationrequests/
```
and ../node-client/batch.json, which just contains an array of five mock request objects.
So very quickly I had a mock of the target system and test wrappers for both the source and target systems. This left me free to concentrate on developing the connector. I knew I wanted to use Express for receiving the client requests and actioning them, so I did the following:
```shell
cd node-connector
npm init
```
Modified the package.json to reflect the correct information about the project.
```shell
npm install express
```
I also knew I wanted to handle JSON in the request body, so the body-parser module for Express was the next thing to get:
```shell
npm install body-parser
```
At this point I wrote the core of the Express endpoints and tested that I could handle requests from the client script.
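A minimal sketch of what those endpoints might look like, assuming Express and body-parser are installed. The `validateBatch` helper, the status codes and the response shape are my own illustration, not the original code; the route path matches the client test scripts above:

```javascript
var express, bodyParser;
try {
  // npm install express body-parser
  express = require('express');
  bodyParser = require('body-parser');
} catch (e) {
  // modules not installed in this environment; skip the wiring below
}

// Pure helper (hypothetical): sanity-check an incoming batch of requests
function validateBatch(batch) {
  if (!Array.isArray(batch)) return false;
  return batch.every(function (r) {
    return typeof r.sourceRequestId === 'string' &&
           typeof r.action === 'string';
  });
}

if (express) {
  var app = express();
  app.use(bodyParser.json());  // parse JSON request bodies

  // same path the client test scripts POST to
  app.post('/automationrequests', function (req, res) {
    if (!validateBatch(req.body)) {
      return res.status(400).json({ error: 'expected an array of requests' });
    }
    // hand the batch off to the queue here
    res.status(202).json({ accepted: req.body.length });
  });

  // app.listen(8079);  // left commented so the sketch exits cleanly
}
```

Returning 202 Accepted fits the broker model: the request is queued for later processing rather than completed inline.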
Eventually I will need to ship these logs and index them using Logstash and Elasticsearch. However, for now I just wanted a simple logging system that I knew I could ship to Logstash in the future. Luckily, as is often the case with Node, an appropriate module already exists:
```shell
npm install bunyan
```
So I added the initial logging code into my main.js script. I will revisit logging and ELK properly in a future post.
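The initial logging setup amounts to very little code. A sketch, assuming Bunyan is installed (the logger name is my own choice):

```javascript
var bunyan;
try {
  bunyan = require('bunyan');  // npm install bunyan
} catch (e) {
  // module not installed in this environment
}

// Bunyan emits newline-delimited JSON records, which is exactly the
// kind of output Logstash can ingest later.
var log = bunyan
  ? bunyan.createLogger({ name: 'node-connector' })
  : console;  // fall back so the sketch still runs without the module

log.info({ port: 8079 }, 'connector starting');
```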
Now this is where things get a little complex. Node has a heavy preference for asynchronous actions which makes sense for a lot of its use cases in web applications. However, sometimes we need to do things sequentially – so having the ability to make web requests synchronously is a necessity. Unfortunately Node and other Javascript toolkits try to discourage this sort of usage by moving the functionality out to plugins rather than keeping it in the core. Fortunately, there are a number of npm modules which provide the ability to make synchronous HTTP calls. After a little bit of googling I settled on http-sync.
I added a number of methods that deal with making synchronous requests to the target system’s REST API.
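To give a flavour, here is a hedged sketch of one such method. The option names are how I recall http-sync's API; check them against the module's README before relying on this. The host, port and path match the mock target used by the test scripts:

```javascript
// POST a JSON payload to the target system and block until it responds.
function postSync(path, payload) {
  var httpSync = require('http-sync');  // npm install http-sync
  var request = httpSync.request({
    method: 'POST',
    protocol: 'http',
    host: 'localhost',
    port: 8081,
    path: path,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  });
  var response = request.end();  // blocks the event loop until done
  return JSON.parse(response.body.toString());
}
```

Blocking the event loop like this is normally a Node anti-pattern, which is why it lives in the worker rather than the Express process.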
So now I needed a data structure to manage the lifecycle of the outbound requests. I don’t need to hold much state information about the initial batch of requests from the client system. The natural choice would be a queue. Rather than implementing my own queueing system I decided to try a module called ‘Bull’, which is a Redis-backed stateful queue.
Installing this was straightforward. Just install Redis using your favourite package manager:

```shell
brew install redis
```
Then install Bull:

```shell
npm install bull
```
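Wiring the queue up is then only a few lines. A sketch under the assumption that Redis is running locally; the queue name, the job shape and the `toJob` helper are my own illustration:

```javascript
// Pure helper (hypothetical): turn an incoming request into job data
function toJob(request) {
  return {
    sourceRequestId: request.sourceRequestId,
    action: request.action,
    status: 'PENDING'
  };
}

// Not called at the top level, so this file loads even without Redis
function buildQueue() {
  var Queue = require('bull');  // npm install bull
  var requests = new Queue('automation-requests');  // Redis on localhost:6379

  // consumer side (later moved into the worker process)
  requests.process(function (job, done) {
    // call the target system here, then mark the job complete
    done();
  });

  return requests;
}

// producer side (main process): buildQueue().add(toJob(incoming));
```

Because the queue lives in Redis, state survives a restart of either process.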
I started by adding the method hooks for the queue. Of course I was adapting and extending my test cases so I could ensure I was not breaking things.
I soon realised that I would need to separate the queue processing from the main Node process running Express. So, I started researching the best way of separating the client request handling from the queue processing. Luckily enough, Node’s process model allows for some very simple process separation. Right now all I need is a child process which processes the queue. Node.js’s built-in ‘child_process’ module allows you to spawn another process. There is no IPC code to be written, since the main process writes items into the queue and the child process reads and updates the queue. All I needed to add were some exit handlers to deal with shutdown conditions. This was a good time to refactor my code into three files: common.js for logging, queue access and the http-sync request methods; worker.js for the queue processing operations; and main.js for the Express endpoint handlers.
For this phase of work, the solution is functionally complete for short-running jobs. However, it will still need a bit of work to become a complete solution. So far there is no reverse communication, and long-running jobs will not be handled correctly. The queue configuration and processing also need some refinement. However, if you are new to Node.js, I hope this article will inspire you to give it a proper go. Node gets a lot of hype at the moment, but it’s fair to say that it does make working with web APIs a breeze. The time to get up and running and productive with Node is comparatively small, allowing you to focus on what your app does as opposed to worrying about the low-level details. Of course, other languages such as Python would also have been good for this example; Flask provides similar advantages to Express.
If you want to see the code that goes with this example it’s in the following git repo:
```shell
git clone https://bitbucket.org/automationlogic/nodeapp.git
```