Benchmarks & Performance Optimisation

Sebastiaan Deckers requested to merge benchmark into master

Scenario

To measure the overhead of various HTTP/2 libraries and the benefits of clustering, I ran a super simple benchmark. It requests a tiny Hello, World! text file over HTTP/2 (or HTTPS in the case of http-server).

  • This does not reflect Server Push performance; right now I am only interested in the server/protocol implementation overhead.
  • Compression was not enabled for any server, except http2server, where it does not affect performance.

Results

Requests per Second

Single Process
  node-http2           1209.91
  node-spdy            1326.47
  native HTTP/2        2431.56

Cluster: 2 workers
  node-http2           1704.26
  node-spdy            2149.96
  native HTTP/2        4303.82  💪 3.6x faster with this patch!

Reference
  http-server 0.9.0    1822.10
  h2o 2.1.0-beta3     15489.69
  nghttpd 1.16.0      15944.44
  nginx 1.10.2        12310.82
  apache 2.4.23        8408.89

Note: Apache 2.4 mod_http2, nghttpd, and Node.js 8+ all use the same nghttp2 library. 🤔 Room for additional 3x improvement?
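For what it's worth, the headline ratios fall out of the table above (assuming the 3.6x compares clustered native HTTP/2 against single-process node-http2):

// Quick arithmetic check on the numbers quoted above.
const nodeHttp2Single = 1209.91   // node-http2, single process
const nativeClustered = 4303.82   // native HTTP/2, cluster of 2 workers
const nghttpdRps = 15944.44       // nghttpd reference

console.log((nativeClustered / nodeHttp2Single).toFixed(1) + 'x')  // 3.6x with this patch
console.log((nghttpdRps / nativeClustered).toFixed(1) + 'x')       // ~3.7x further headroom vs nghttpd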

Setup

Native HTTP/2 is Node.js v8.0.0-pre from a week ago (Round 5). It does not include the performance optimisations that have been implemented in recent days; these benchmarks should be revisited once that work settles.
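For reference, a minimal stand-in for the native HTTP/2 cases (not the actual http2server code) would look roughly like the sketch below, assuming the experimental core http2 module from that build and the key/cert paths used elsewhere in this MR; the single-process row simply omits the cluster wrapper.

// hello-cluster.js — hypothetical sketch of the "native HTTP/2, Cluster: 2 workers" case
'use strict'
const cluster = require('cluster')
const fs = require('fs')
const http2 = require('http2')
const path = require('path')

if (cluster.isMaster) {
  // Two workers, matching the clustered benchmark run; all share port 8443.
  for (let i = 0; i < 2; i++) cluster.fork()
} else {
  const home = process.env.HOME
  const server = http2.createSecureServer({
    key: fs.readFileSync(path.join(home, '.http2server/key.pem')),
    cert: fs.readFileSync(path.join(home, '.http2server/cert.pem'))
  })
  // Respond to every stream with the same tiny payload as public/index.html.
  server.on('stream', (stream) => {
    stream.respond({ ':status': 200, 'content-type': 'text/html' })
    stream.end('Hello, World!\n')
  })
  server.listen(8443)
}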

Regular Node.js is 7.2.0
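The node-spdy and node-http2 rows use those libraries' https-style APIs on 7.2.0. A hedged sketch of the single-process node-spdy variant (node-http2 exposes an equivalent createServer(options, handler)):

// hello-spdy.js — hypothetical sketch of the single-process node-spdy case
'use strict'
const fs = require('fs')
const path = require('path')
const spdy = require('spdy')

const home = process.env.HOME
const options = {
  key: fs.readFileSync(path.join(home, '.http2server/key.pem')),
  cert: fs.readFileSync(path.join(home, '.http2server/cert.pem'))
}

// node-spdy mirrors https.createServer(options, handler).
spdy.createServer(options, (req, res) => {
  res.writeHead(200, { 'content-type': 'text/html' })
  res.end('Hello, World!\n')
}).listen(8443)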

CLI

nghttpd

nghttpd -d ~/Code/sebdeckers/http2server/public --workers=2 8443 ~/.http2server/key.pem ~/.http2server/cert.pem

http-server

npm install --global http-server
http-server --silent --ssl -p 8443 --key ~/.http2server/key.pem --cert ~/.http2server/cert.pem ~/Code/sebdeckers/http2server/public

Nginx

brew install nginx-full --with-http2
nginx -c /Users/seb/Code/sebdeckers/nginx/benchmark.conf

Apache

brew tap homebrew/apache
brew install httpd24 --with-http2 --with-mpm-worker

Configurations

H2O benchmark.yml

user: seb
send-server-name: OFF
file.etag: OFF
access-log: logs/access.log
error-log: logs/error.log
pid-file: logs/pid.file
http2-max-concurrent-requests-per-connection: 10000

hosts:
  "localhost:8443":
    listen:
      port: 8443
      ssl:
        certificate-file: crypto/cert.pem
        key-file: crypto/key.pem
    paths:
      "/":
        file.dir: dummy

Nginx benchmark.conf

worker_processes auto;

events {
  worker_connections 1024;
}

http {
  server {
    listen 8443 ssl http2;

    ssl_certificate /Users/seb/.http2server/cert.pem;
    ssl_certificate_key /Users/seb/.http2server/key.pem;

    location / {
      root /Users/seb/Code/sebdeckers/http2server/public;
    }
  }
}

Apache

ErrorLog /Users/seb/Code/sebdeckers/apache/error.log
DocumentRoot /Users/seb/Code/sebdeckers/http2server/public

LoadModule authz_core_module libexec/mod_authz_core.so
<Directory "/Users/seb/Code/sebdeckers/http2server/public">
  Require all granted
</Directory>

LoadModule unixd_module libexec/mod_unixd.so

LoadModule ssl_module libexec/mod_ssl.so
SSLEngine on
SSLCertificateFile /Users/seb/.http2server/cert.pem
SSLCertificateKeyFile /Users/seb/.http2server/key.pem

LoadModule http2_module libexec/mod_http2.so
Protocols h2 http/1.1

ServerName localhost
Listen 0.0.0.0:8443 https

HTTP/1

wrk -t4 -c100 -d10s https://192.168.1.21:8443/index.html

HTTP/2

h2load -v -n 30000 -c 100 -t 8 https://192.168.1.21:8443/index.html
  • The -n value was adjusted depending on how fast or slow each server is; no test should take more than 30s.
  • node-http2 cannot handle more than ~4000 requests before the server slows down exponentially.

index.html

Hello, World!

Server: 2015 MacBook

Wifi <1m from router @ 867 Mbps (2x2)

  Model Name:	MacBook
  Model Identifier:	MacBook8,1
  Processor Name:	Intel Core M
  Processor Speed:	1.1 GHz
  Number of Processors:	1
  Total Number of Cores:	2
  L2 Cache (per Core):	256 KB
  L3 Cache:	4 MB
  Memory:	8 GB

Client: 2011 iMac

Wired LAN 1m CAT.5E @ 1000 Mbps

  Model Name:	iMac
  Model Identifier:	iMac12,1
  Processor Name:	Intel Core i5
  Processor Speed:	2.5 GHz
  Number of Processors:	1
  Total Number of Cores:	4
  L2 Cache (per Core):	256 KB
  L3 Cache:	6 MB
  Memory:	8 GB
