Using the prom-client Library for Node.js Application Monitoring
Installing and Requiring the Library
- Move into the `forethought` directory: `cd forethought`
- Install `prom-client` via `npm`, Node.js's package manager: `npm install prom-client --save`
- Open the `index.js` file, where we'll be adding all of our metrics code: `vim index.js`
- Require `prom-client` by adding it to our variable list: `var express = require('express'); var bodyParser = require('body-parser'); var app = express(); const prom = require('prom-client');` Here, `prom` is the name we'll use when calling the client library.
- Enable default metrics collection: `const collectDefaultMetrics = prom.collectDefaultMetrics; collectDefaultMetrics({ prefix: 'forethought' });`
- Use Express to create the `/metrics` endpoint and serve the Prometheus data: `app.get('/metrics', function (req, res) { res.set('Content-Type', prom.register.contentType); res.end(prom.register.metrics()); });` A consolidated sketch of these changes follows this list.
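Taken together, the steps above might produce an `index.js` along the lines of the sketch below. This is a minimal sketch, not the full lab file: the to-do routes and body parsing are omitted, and port 8080 is assumed from the lab environment. One version caveat: in prom-client v13 and later, `register.metrics()` returns a Promise, so the handler below awaits it (awaiting the plain string returned by older versions works too).

```javascript
// Minimal sketch of the metrics wiring described above (to-do routes omitted).
var express = require('express');
var bodyParser = require('body-parser'); // used by the to-do routes, not shown here
var app = express();
const prom = require('prom-client');

// Collect the default Node.js/process metrics, prefixed with "forethought".
const collectDefaultMetrics = prom.collectDefaultMetrics;
collectDefaultMetrics({ prefix: 'forethought' });

// Expose everything in the default registry at /metrics.
// In prom-client v13+, register.metrics() returns a Promise; awaiting it
// also works with the synchronous string returned by older versions.
app.get('/metrics', async function (req, res) {
  res.set('Content-Type', prom.register.contentType);
  res.end(await prom.register.metrics());
});

// Port assumed from the lab environment (MYLABSERVER:8080).
app.listen(8080);
```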
Counters
- Open up the `index.js` file: `cd forethought` and `$EDITOR index.js`
- Define a new metric called `forethought_number_of_todos_total` that works as a counter: `// Prometheus metric definitions const todocounter = new prom.Counter({ name: 'forethought_number_of_todos_total', help: 'The number of items added to the to-do list, total' });`
- Call the new metric in the `addtask` post function so it increments by one every time a task is added: `// add a task app.post("/addtask", function(req, res) { var newTask = req.body.newtask; task.push(newTask); res.redirect("/"); todocounter.inc(); });`
- Save and exit.
- Test the application: `node index.js`
- While the application is running, visit MYLABSERVER:8080 and add a few tasks to the to-do list.
- Visit `MYLABSERVER:8080/metrics` to view your newly created metric! A standalone sketch of the counter pattern follows this list.
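To see what the counter actually produces, here is a small standalone sketch that defines the same counter outside the app, increments it a few times, and prints the exposition text the `/metrics` endpoint serves. The `counter-demo.js` file name is just illustrative; it assumes only that `prom-client` is installed.

```javascript
// counter-demo.js -- standalone look at the counter defined above.
// Run with: node counter-demo.js
const prom = require('prom-client');

const todocounter = new prom.Counter({
  name: 'forethought_number_of_todos_total',
  help: 'The number of items added to the to-do list, total'
});

// Simulate three tasks being added.
todocounter.inc();
todocounter.inc();
todocounter.inc();

// Print what Prometheus would scrape. Promise.resolve() covers both the
// older synchronous and the newer Promise-based register.metrics().
Promise.resolve(prom.register.metrics()).then(function (output) {
  console.log(output);
});
```

The output ends with a `forethought_number_of_todos_total 3` sample line, preceded by its `# HELP` and `# TYPE` lines.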
Gauges
- Define the new gauge metric for tracking tasks added and completed: `const todogauge = new prom.Gauge({ name: 'forethought_current_todos', help: 'Amount of incomplete tasks' });`
- Add a gauge `.inc()` to the `/addtask` method: `// add a task app.post("/addtask", function(req, res) { var newTask = req.body.newtask; task.push(newTask); res.redirect("/"); todocounter.inc(); todogauge.inc(); });`
- Add a gauge `.dec()` to the `/removetask` method, so the gauge goes down whether one task or several are completed: `// remove a task app.post("/removetask", function(req, res) { var completeTask = req.body.check; if (typeof completeTask === "string") { complete.push(completeTask); task.splice(task.indexOf(completeTask), 1); todogauge.dec(); } else if (typeof completeTask === "object") { for (var i = 0; i < completeTask.length; i++) { complete.push(completeTask[i]); task.splice(task.indexOf(completeTask[i]), 1); todogauge.dec(); } } res.redirect("/"); });`
- Save and exit the file.
- Test the application: `node index.js`
- While the application is running, visit MYLABSERVER:8080 and add a few tasks to the to-do list.
- Visit `MYLABSERVER:8080/metrics` to view your newly created metric! A standalone sketch of the gauge pattern follows this list.
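Unlike the counter, a gauge can move in both directions, which is what makes it suitable for tracking the number of currently incomplete tasks. The standalone sketch below (again, the `gauge-demo.js` name is illustrative and only `prom-client` is assumed) simulates adding three tasks and completing two of them:

```javascript
// gauge-demo.js -- standalone look at the gauge defined above.
// Run with: node gauge-demo.js
const prom = require('prom-client');

const todogauge = new prom.Gauge({
  name: 'forethought_current_todos',
  help: 'Amount of incomplete tasks'
});

// Simulate adding three tasks, then completing two of them.
todogauge.inc();
todogauge.inc();
todogauge.inc();
todogauge.dec();
todogauge.dec();

// The printed output should contain "forethought_current_todos 1".
Promise.resolve(prom.register.metrics()).then(function (output) {
  console.log(output);
});
```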
Summaries and Histograms
- Move into the `forethought` directory: `cd forethought`
- Install the `response-time` Node.js module: `npm install response-time --save`
- Open the `index.js` file: `$EDITOR index.js`
- Define both the summary and the histogram metrics: `const tasksumm = new prom.Summary({ name: 'forethought_requests_summ', help: 'Latency in percentiles' }); const taskhisto = new prom.Histogram({ name: 'forethought_requests_hist', help: 'Latency in history form' });`
- Require the `response-time` module alongside the other variables: `var responseTime = require('response-time');`
- Around where we define our website code, add the `response-time` middleware, passing its `time` value to each metric's `.observe()`: `app.use(responseTime(function (req, res, time) { tasksumm.observe(time); taskhisto.observe(time); }));`
- Save and exit the file.
- Run the demo application: `node index.js`
- View the demo application on port 8080, and add a few tasks to generate metrics.
- View the `/metrics` endpoint. Notice how our response times are automatically sorted into percentiles for our summary. Also notice how we're not using all of our buckets in the histogram.
- Return to the command line and use CTRL+C to close the demo application.
- Reopen the `index.js` file: `$EDITOR index.js`
- Add the `buckets` parameter to the histogram definition. We're going to adjust our buckets based on the response times collected: `const taskhisto = new prom.Histogram({ name: 'forethought_requests_hist', help: 'Latency in history form', buckets: [0.1, 0.25, 0.5, 1, 2.5, 5, 10] });`
- Save and exit. Run `node index.js` again to test. A consolidated sketch of the latency instrumentation follows this list.
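Putting this section together, the sketch below shows one way the `response-time` middleware and the two latency metrics fit into an Express app. It assumes `express`, `prom-client`, and `response-time` are installed; the port and the throwaway route are illustrative. Note that `response-time` reports elapsed time in milliseconds, so the summary percentiles and the histogram buckets are in milliseconds as well.

```javascript
// latency-demo.js -- sketch of the response-time instrumentation from this section.
var express = require('express');
var responseTime = require('response-time');
var app = express();
const prom = require('prom-client');

// Summary: prom-client calculates percentiles on the client side.
const tasksumm = new prom.Summary({
  name: 'forethought_requests_summ',
  help: 'Latency in percentiles'
});

// Histogram: each observation is counted into the buckets listed here.
// response-time reports elapsed time in milliseconds, so these bucket
// boundaries are milliseconds as well.
const taskhisto = new prom.Histogram({
  name: 'forethought_requests_hist',
  help: 'Latency in history form',
  buckets: [0.1, 0.25, 0.5, 1, 2.5, 5, 10]
});

// Register the middleware before any routes so every request is timed
// and recorded in both metrics.
app.use(responseTime(function (req, res, time) {
  tasksumm.observe(time);
  taskhisto.observe(time);
}));

// A throwaway route to generate some observations.
app.get('/', function (req, res) {
  res.send('ok');
});

// Expose the metrics; Promise.resolve() covers both older (synchronous)
// and newer (Promise-based) versions of register.metrics().
app.get('/metrics', function (req, res) {
  Promise.resolve(prom.register.metrics()).then(function (output) {
    res.set('Content-Type', prom.register.contentType);
    res.end(output);
  });
});

app.listen(8080);
```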