Sunday, January 30, 2011

Checking out Node.js 


I have been playing with Node.js for some time, and over the weekend I thought of doing a simple benchmark between Node.js and Jetty. The objective was to quickly check out the two implementations, in other words, beat the bushes and see what falls out.

I half expected Node.js to perform worse than Jetty using an NIO connector; however, I was pleasantly surprised to see that the TPS was about the same while the CPU utilization of Node.js was far lower.


Here is what I ended up doing:

  • Wrote a small hello-world HTTP server in Node.js, about 4 lines of code
  • Wrote a custom handler for Jetty that returns hello world
  • Used Apache Bench to test each with 10000 requests and 20 concurrent clients

Node.js results
=========

Server Software:      
Server Hostname:        uno
Server Port:            8124

Document Path:          /
Document Length:        14 bytes

Concurrency Level:      20
Time taken for tests:   12.896 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      780000 bytes
HTML transferred:       140000 bytes
Requests per second:    775.41 [#/sec] (mean)
Time per request:       25.793 [ms] (mean)
Time per request:       1.290 [ms] (mean, across all concurrent requests)
Transfer rate:          59.06 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        2   13  29.2     10     994
Processing:     3   13  10.4     10     255
Waiting:        2   12   8.4     10     234
Total:         11   26  32.2     22    1009

Percentage of the requests served within a certain time (ms)
  50%     22
  66%     24
  75%     26
  80%     27
  90%     33
  95%     40
  98%     69
  99%    105
 100%   1009 (longest request)




Jetty results
========


Server Software:        Jetty(7.2.2.v20101205)
Server Hostname:        uno
Server Port:            8081

Document Path:          /
Document Length:        21 bytes

Concurrency Level:      20
Time taken for tests:   12.779 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      1680000 bytes
HTML transferred:       210000 bytes
Requests per second:    782.54 [#/sec] (mean)
Time per request:       25.558 [ms] (mean)
Time per request:       1.278 [ms] (mean, across all concurrent requests)
Transfer rate:          128.39 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        2   13  32.9     10    1011
Processing:     2   12  12.0     10     357
Waiting:        2   12  10.5      9     288
Total:          9   25  36.6     21    1062

Percentage of the requests served within a certain time (ms)
  50%     21
  66%     24
  75%     25
  80%     26
  90%     32
  95%     43
  98%     98
  99%    121
 100%   1062 (longest request)



Observations:
   
1) The latency and TPS of the two are very much the same, which truly surprised me, so I decided to look at CPU usage and context switches, and those numbers were very interesting.


Jetty CPU usage:
======
CPU usage: 30%
Context switches: max ~7200, avg ~5000

Node.js CPU usage:
=======
CPU usage: 4%
Context switches: max ~1100, avg ~750



Jetty has a very high number of context switches (note that this is an untuned server; I had expected the default config to do a better job). This also points to how often engineers overlook context switches: there is no way a system can perform well with such a high number of them.

Figuring out why Jetty is doing so many context switches and using so much CPU is another weekend project.

2) Because of modern CPUs and the lean design of Node.js (which means that we are executing only a small amount of script code), the numbers are very close.

Things that I did not try or could not find out how to do:
*) The current release does not give the ability to tune the number of worker threads; I am not sure how it can be adapted to different workloads.

I came away impressed with Node.js. Now I need to look into the actual code and figure out more about its tuning parameters, as well as benchmark it against Netty.





