API response time

For benchmarking and comparing frameworks, I think it’s much better to use a dedicated tool like wrk:

It will send several requests, so you’ll be able to see the actual average response time under load. Otherwise, if the server isn’t receiving requests frequently, the response time you see will depend mostly on your DoEvents value, and it will vary a lot between runs.
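
The wrk runs below hit an /Api/ws_test route. The actual handler isn’t posted in this thread, but for anyone who wants to reproduce a similar test with a plain Xojo Web project, a minimal session-less endpoint (the path check and JSON payload are just placeholders I picked, not the original project’s code) might look like this in the App.HandleURL event:

' App.HandleURL — sketch of a minimal session-less API endpoint.
' The path check and JSON body are assumptions; the original ws_test code isn't shown here.
Function HandleURL(Request As WebRequest, Response As WebResponse) As Boolean
  If Request.Path = "Api/ws_test" Then ' Path usually comes in without a leading slash
    ' Return a small static JSON body so the benchmark measures framework overhead, not app logic.
    Response.Status = 200
    Response.Write("{""status"":""ok""}")
    Return True ' Handled here, no session is created
  End If
  Return False ' Anything else falls through to the framework
End Function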

Here are the results of running wrk with 10 threads and 10 concurrent connections for 10 seconds.

2024r1.1 with Express (the example from @Jürg_Otter):

./wrk -t 10 -c 10 -d 10 --latency http://127.0.0.1:8080/Api/ws_test
Running 10s test @ http://127.0.0.1:8080/Api/ws_test
  10 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    27.12ms    2.91ms  46.97ms   86.38%
    Req/Sec    36.38      5.05    40.00     67.97%
  Latency Distribution
     50%   26.23ms
     75%   27.40ms
     90%   30.58ms
     99%   38.97ms
  3665 requests in 10.08s, 1.28MB read
  Socket errors: connect 0, read 1, write 0, timeout 0
Requests/sec:    363.70
Transfer/sec:    130.25KB

2024r1.1 with a Xojo Web project:

./wrk -t 10 -c 10 -d 10 --latency http://127.0.0.1:8080/Api/ws_test
Running 10s test @ http://127.0.0.1:8080/Api/ws_test
  10 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    10.31ms    8.88ms 150.80ms   98.77%
    Req/Sec   105.43     12.43   150.00     70.43%
  Latency Distribution
     50%   10.02ms
     75%   10.96ms
     90%   11.05ms
     99%   37.99ms
  10479 requests in 10.05s, 2.09MB read
Requests/sec:   1042.57
Transfer/sec:    212.79KB

2024r2 beta with the same Xojo Web project:

./wrk -t 10 -c 10 -d 10 --latency http://127.0.0.1:8080/Api/ws_test
Running 10s test @ http://127.0.0.1:8080/Api/ws_test
  10 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.95ms    6.17ms 111.46ms   99.03%
    Req/Sec   711.91     49.23   770.00     97.68%
  Latency Distribution
     50%    1.39ms
     75%    1.42ms
     90%    1.49ms
     99%    4.94ms
  70542 requests in 10.10s, 14.06MB read
Requests/sec:   6981.38
Transfer/sec:      1.39MB

Mileage will vary depending on the server specs, whether you’re running the server locally or over the network, and other factors, such as running the benchmark on battery (as I’m doing now) vs. plugged in.
