Node.js | Load Testing Tools

Here are some popular Node.js load testing tools:

  1. Artillery
  2. k6
  3. Autocannon
  4. Apache Bench (ab)

Artillery

Artillery is an open-source load testing and performance benchmarking tool for APIs, microservices, and websites. It lets you define HTTP load testing scenarios in YAML and run them against your target system to see how it behaves under different traffic patterns. Artillery provides flexible scenario definitions, multiple load phases for ramping traffic up and down, and periodic metrics reporting while a test runs, which makes it a popular choice for performance testing and optimization.

Here’s an example of how you can use Artillery to load test a Node.js server:

  1. Install Artillery:
npm install -g artillery

2. Create a file named test.yml with the following content:

config:
  target: "http://localhost:3000"
  phases:
    - duration: 60
      arrivalRate: 5

scenarios:
  - flow:
    - get:
        url: "/"

This scenario defines a single flow that sends a GET request to the root URL of the target system. The arrivalRate of 5 tells Artillery to start 5 new virtual users per second for 60 seconds, and each virtual user runs the flow once.

3. Start the test:

artillery run test.yml

4. View the results:

Artillery will generate a report with the performance metrics of the test, including requests per second, response times, and error rates.
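
If you want to keep the raw data for later analysis, Artillery can also write the results to a JSON file (the exact flags may vary between Artillery versions):

artillery run --output report.json test.yml

Versions that still bundle the artillery report command can then turn that JSON file into an HTML report with artillery report report.json.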

More Examples:

config:
  target: "http://localhost:3000"
  phases:
    - duration: 30
      arrivalRate: 5
    - duration: 30
      arrivalRate: 10
scenarios:
  - flow:
      - post:
          url: "/api/login"
          json:
            username: "testuser"
            password: "testpassword"
          capture:
            json: "$.token"
            as: "token"
      - get:
          url: "/api/user"
          headers:
            Authorization: "Bearer {{token}}"

In this example, Artillery simulates a login request to the /api/login endpoint, captures the token from the JSON response, and then uses it to call the /api/user endpoint with an Authorization header. The test runs for 60 seconds in two phases: 5 new virtual users per second for the first 30 seconds, then 10 new virtual users per second for the next 30 seconds.
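
Neither this scenario nor the k6 example later in this article shows the server being tested; both assume an API that exposes /api/login (returning a JSON body with a token) and /api/user (requiring a Bearer token). As a point of reference, here is one hypothetical implementation using Express; the framework, credentials, and token handling are assumptions, not part of the original examples:

// Hypothetical target server for the login scenarios (assumes: npm install express)
const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.json());

const validTokens = new Set();

// Issue a token for the hard-coded test credentials used in the scenarios.
app.post('/api/login', (req, res) => {
  const { username, password } = req.body || {};
  if (username === 'testuser' && password === 'testpassword') {
    const token = crypto.randomBytes(16).toString('hex');
    validTokens.add(token);
    return res.json({ success: true, token });
  }
  return res.status(401).json({ success: false });
});

// Require the Bearer token issued by /api/login.
app.get('/api/user', (req, res) => {
  const token = (req.headers.authorization || '').replace('Bearer ', '');
  if (!validTokens.has(token)) {
    return res.status(401).json({ error: 'invalid token' });
  }
  return res.json({ username: 'testuser' });
});

app.listen(3000, () => console.log('Test server listening on http://localhost:3000'));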


k6

k6 is an open-source load testing tool designed to help developers and performance engineers test the performance and scalability of their web applications and APIs. k6 is written in Go but uses JavaScript as its scripting language, which makes it easy to pick up while still offering a high degree of flexibility and customization.

With k6, you can define complex testing scenarios and run them against your target system to measure performance metrics such as response time, requests per second, and error rates. k6 reports progress while a test runs and prints a detailed end-of-test summary, and its results can be streamed to external dashboards, making it easy to analyze test results and identify performance bottlenecks.

k6 is a popular choice for performance testing due to its ease of use, versatility, and the ability to integrate with other tools and services such as CI/CD pipelines, monitoring systems, and cloud-based load testing platforms.

Here’s an example of how you can use k6 to load test a Node.js server:

  1. Install k6:

k6 is distributed as a standalone binary rather than an npm package, so install it with your operating system's package manager or download it from the official k6 website. For example, on macOS:

brew install k6

2. Create a script file:

Create a file named test.js with the following content:

import http from "k6/http";

export default function() {
  http.get("http://localhost:3000/");
};

This script defines a single HTTP GET request to the root URL of the target system.

3. Start the test:

k6 run test.js

4. View the results:

k6 will print an end-of-test summary with the performance metrics of the test, including requests per second, response times, and error rates. The results can also be exported for further processing or visualization.
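
For example, the raw metrics can be written to a file with k6 run --out json=results.json test.js, and recent k6 versions also let the script shape its own end-of-test summary by exporting a handleSummary function. A minimal sketch (the file name is illustrative):

export function handleSummary(data) {
  // Write the full metrics object to a file and print a short message
  // instead of the default summary table.
  return {
    "summary.json": JSON.stringify(data, null, 2),
    stdout: "Test finished. Full metrics written to summary.json\n",
  };
}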

More Examples:

import http from "k6/http";
import { check, sleep } from "k6";

export let options = {
  stages: [
    { duration: "30s", target: 5 },  // ramp up to 5 virtual users over the first 30 seconds
    { duration: "30s", target: 10 }, // then ramp up to 10 virtual users over the next 30 seconds
  ],
};

export default function() {
  let res = http.post("http://localhost:3000/api/login", JSON.stringify({
    username: "testuser",
    password: "testpassword",
  }), {
    headers: {
      "Content-Type": "application/json",
    },
  });

  check(res, {
    "status is 200": (r) => r.status === 200,
    "login successful": (r) => JSON.parse(r.body).success === true,
  });

  let token = JSON.parse(res.body).token; // extract the token returned by the login endpoint

  res = http.get("http://localhost:3000/api/user", {
    headers: {
      Authorization: `Bearer ${token}`,
    },
  });

  check(res, {
    "status is 200": (r) => r.status === 200,
  });

  sleep(1);
}

In this example, k6 runs a test scenario that simulates a login request to the /api/login endpoint, parses the JSON response to extract the token, and then uses the token to call the /api/user endpoint with an Authorization header. The stages ramp the number of virtual users up to 5 over the first 30 seconds and then up to 10 over the next 30 seconds, for a total test duration of 60 seconds.
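
k6 can also turn these metrics into automated pass/fail criteria through thresholds. As an illustration (the limits below are examples, not part of the original scenario), the options export in the script above could be extended like this:

export let options = {
  stages: [
    { duration: "30s", target: 5 },
    { duration: "30s", target: 10 },
  ],
  thresholds: {
    http_req_duration: ["p(95)<500"], // 95% of requests must complete in under 500 ms
    checks: ["rate>0.99"],            // more than 99% of the check() assertions must pass
  },
};

If any threshold is breached, k6 exits with a non-zero status code, which makes it straightforward to use as a quality gate in a CI/CD pipeline.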


Autocannon

Autocannon is an HTTP/1.1 benchmarking tool written in Node.js that can be used both as a command-line utility and as a library. It allows you to quickly and easily create and run benchmarks against your API to determine its performance characteristics, including requests per second, latency, and other important metrics. Autocannon can be used to test the performance of a single API endpoint, or to cycle through multiple endpoints in one run to get a more comprehensive view of the overall performance of your API.

Here is a simple example of how to use Autocannon in Node.js to test the performance of an API endpoint:

  1. Install Autocannon:
npm install autocannon

2. Create a script file:

Create a file named test.js with the following content:

const autocannon = require('autocannon');

// Calling autocannon() starts the test immediately and returns an instance
// that emits a 'done' event with the aggregated results.
const instance = autocannon({
  url: 'http://localhost:3000/api/endpoint',
  connections: 10, // concurrent connections
  pipelining: 1,   // requests pipelined per connection
  duration: 10     // test length in seconds
});

instance.on('done', (results) => {
  console.log('Test completed.');
  console.log('Requests per second (avg):', results.requests.average);
  console.log('Latency average (ms):', results.latency.average);
  console.log('Latency max (ms):', results.latency.max);
});

In this example, we are using Autocannon to test the performance of an API endpoint at http://localhost:3000/api/endpoint, with 10 connections, pipelining 1 request at a time, for a duration of 10 seconds. Calling autocannon() starts the test immediately; once it finishes, the aggregated results are passed to the 'done' event handler, where we log the requests per second, average latency, and maximum latency.
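
Autocannon also ships a CLI, so the same kind of test can be run without writing any code, for example: npx autocannon -c 10 -d 10 http://localhost:3000/api/endpoint. To exercise several endpoints in a single run, the programmatic API accepts a requests array; the paths and payload below are illustrative:

const autocannon = require('autocannon');

// Each connection cycles through these requests in order.
const instance = autocannon({
  url: 'http://localhost:3000',
  connections: 10,
  duration: 10,
  requests: [
    { method: 'GET', path: '/api/endpoint' },
    {
      method: 'POST',
      path: '/api/login',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ username: 'testuser', password: 'testpassword' })
    }
  ]
});

// Render a live progress bar and the standard results table when the run finishes.
autocannon.track(instance);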


Apache Bench

Apache Bench (ab) is a simple command-line tool used to test the performance of HTTP servers. It allows you to send a specified number of concurrent requests to a target URL and measure the response time and other performance metrics, such as the number of requests per second and the total time taken to complete all requests. Apache Bench is commonly used to benchmark web servers, test the performance of APIs, and evaluate the overall capacity and scalability of a server. It is part of the Apache HTTP Server project, and is available as a standard tool on most Unix-like systems.

Here is an example of how you can use Apache Bench to test a simple Node.js server:

  1. First, make sure that Apache Bench is installed on your system. It ships with the Apache HTTP Server utilities, so if it is not already present you can install it with your package manager, for example apt-get on Ubuntu:
sudo apt-get install apache2-utils

2. Start a simple Node.js server on localhost, port 3000 (a minimal example index.js is sketched at the end of this section):

node index.js

3. In a separate terminal window, run Apache Bench against the Node.js server:

ab -n 1000 -c 100 http://localhost:3000/

In this example, ab is sending 1000 requests to http://localhost:3000/, using 100 concurrent connections. The output of the command will show various performance metrics, such as the number of requests per second, the time taken to complete all requests, the average and median latency, and the number of failed requests.
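
The index.js file is not shown in the original example; a minimal sketch using Node's built-in http module is enough to give ab something to hit:

// Minimal index.js for the ab example above (hypothetical; any server on port 3000 works).
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from the load-tested server\n');
});

server.listen(3000, () => {
  console.log('Server listening on http://localhost:3000');
});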