Sunday, January 30, 2011

Checking out Node.js 


I have been playing with node.js for some time, and over the weekend I thought of doing a simple benchmark between node.js and jetty. The objective was to quickly check out the two implementations, in other words beat the bush and see what falls out.

I half expected node.js performance to be lower than Jetty using an NIO connector; however, I was pleasantly surprised to see that the TPS is about the same while the CPU utilization of node.js was far lower.


Here is what I ended up doing:

  • wrote a small hello-world HTTP server in node.js, about 4 lines of code (a sketch of it follows this list)
  • wrote a custom handler for Jetty which says helloWorld
  • used Apache Bench to test each server with 10000 requests and 20 concurrent clients
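The node.js server was essentially the stock hello-world example. The exact response string I used is not reproduced here, so treat the following as a minimal sketch; the port 8124 and hostname uno come from the ab output below:

var http = require('http');

// reply to every request with a small plain-text body
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(8124);

console.log('Server running on port 8124');

// benchmarked with something like: ab -n 10000 -c 20 http://uno:8124/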

node.js results
=========

Server Software:      
Server Hostname:        uno
Server Port:            8124

Document Path:          /
Document Length:        14 bytes

Concurrency Level:      20
Time taken for tests:   12.896 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      780000 bytes
HTML transferred:       140000 bytes
Requests per second:    775.41 [#/sec] (mean)
Time per request:       25.793 [ms] (mean)
Time per request:       1.290 [ms] (mean, across all concurrent requests)
Transfer rate:          59.06 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        2   13  29.2     10     994
Processing:     3   13  10.4     10     255
Waiting:        2   12   8.4     10     234
Total:         11   26  32.2     22    1009

Percentage of the requests served within a certain time (ms)
  50%     22
  66%     24
  75%     26
  80%     27
  90%     33
  95%     40
  98%     69
  99%    105
 100%   1009 (longest request)




Jetty results
========


Server Software:        Jetty(7.2.2.v20101205)
Server Hostname:        uno
Server Port:            8081

Document Path:          /
Document Length:        21 bytes

Concurrency Level:      20
Time taken for tests:   12.779 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      1680000 bytes
HTML transferred:       210000 bytes
Requests per second:    782.54 [#/sec] (mean)
Time per request:       25.558 [ms] (mean)
Time per request:       1.278 [ms] (mean, across all concurrent requests)
Transfer rate:          128.39 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        2   13  32.9     10    1011
Processing:     2   12  12.0     10     357
Waiting:        2   12  10.5      9     288
Total:          9   25  36.6     21    1062

Percentage of the requests served within a certain time (ms)
  50%     21
  66%     24
  75%     25
  80%     26
  90%     32
  95%     43
  98%     98
  99%    121
 100%   1062 (longest request)



Observations:
   
1) The latency and TPS of the two are very much the same, which truly surprised me, so I decided to look at CPU usage and context switches, and those numbers were very interesting.


jetty cpu usage:
======
cpu usage: 30%
context switches: max ~7200, avg ~5000

node.js cpu usage:
=======
cpu usage: 4%
context switches: max ~1100, avg ~750



Jetty has a very high number of context switches (note that this is an untuned server; I had expected the default config to do a better job). This also points to how easily engineers overlook context switches: there is no way the system can perform well with such a high number of context switches.

Trying to figure out why Jetty is doing so many context switches and using so much CPU is another weekend project.
2) Because of modern CPUs and the lean design of node.js (which means we are executing only a small amount of script code), the numbers are very close.

Things that I did not try or could not find out how to do:
*) The current release does not give the ability to tune the number of worker threads; I am not sure how it can be tuned for different workloads.

I came away impressed with node.js. I now need to look into the actual code and figure out more about its tuning parameters, as well as benchmark it against Netty.







Saturday, January 29, 2011

NFC: why I am excited about it


During my interactions with friends and colleagues I came to realize that a lot of people do not know about this technology and how it would make their daily experiences better. So here I am putting down my thoughts on why I am excited about NFC and looking forward to it.


Consider the following use case:
Whenever I visit Fry's I see a lot of people scanning barcodes and checking out the product information. As anyone knows, this is not a very fool-proof system, but it works as of today. With NFC we would be able to tap our phones on the product tag on the shelf below and get all the product information.

The scope for retail is huuuge (there are a lot of other very interesting use cases, but those will take me some days to talk about). I expect Google to enable the whole merchant ecosystem, integrate it with Google Checkout, and hook it up with the ad network and analytics; I fully expect Apple to try the same too.

If we consider our daily interactions to consist of the following:
1) me interacting with society (let us assume for this discussion that it is a web interaction)
2) me interacting with objects: buying goods (clothes/electronics/phones)
3) me interacting with other machines at my workplace, my home, my car, my music system, using my phone

1 and 3 are possible as of today; I can control my TV/printer through my phone. What is missing is my interaction with objects. Take buying a gadget: the gadget has a barcode, but I am not a barcode reader, so there is no way I can get information, look up what other people are saying about it, and, more importantly, find out what people in my social circle have to say about it. NFC is going to enable this and make my interaction cycle complete.

I am looking forward to the day when the technology becomes mainstream.