A Node.js speed dilemma: AJAX or Socket.IO?
Written by CUBRID Community on 07/14/2017

One of the first things I stumbled upon when I started my first Node.js project was how to handle the communication between the browser (the client) and my middleware (a Node.js application using the CUBRID Node.js driver, node-cubrid, to exchange information with a CUBRID 8.4.1 database).

I was already familiar with AJAX (btw, thank God for jQuery!), but while studying Node.js I found out about the Socket.IO module, and even found some pretty nice code examples on the internet... examples which were very easy to (re)use.

So this quickly became a dilemma: what to choose, AJAX or Socket.IO?

Obviously, as my experience was quite limited, I first needed more information... In other words, it was time to do some quality Google searching. :)

There’s a lot of information available and, obviously, one needs to filter out all the “noise” and keep what is really useful. Let me share some of the good links I found on the topic:

To summarize, here’s what I quickly found:

  1. Socket.IO (usually) uses a persistent connection between the client and the server (the middleware), so you can hit a maximum limit of concurrent connections depending on the resources available on the server side, while more asynchronous AJAX requests can be served with the same resources.
  2. With AJAX you can do RESTful requests. This means that you can take advantage of existing HTTP-infrastructure like e.g. proxies to cache requests and use conditional get requests.
  3. There is more (communication) data overhead in AJAX when compared to Socket.IO (HTTP headers, cookies etc.)
  4. AJAX is usually faster than Socket.IO to “code”...
  5. When using Socket.IO, it is possible to have two-way communication, where each side (client or server) can initiate a request. With AJAX, only the client can initiate a request!
  6. Socket.IO has more transport options, including Adobe Flash.

Now, for my own application, what I was most interested in was the speed of making requests and getting data from the (Node.js) server!

Regarding the middleware data communication with the CUBRID database, as ~90% of my data access was read-only, a good data caching mechanism is obviously a great way to go! But about this, I’ll talk next time.

So I decided to put their (AJAX and Socket.IO) speed to the test, to see which one is faster (at least in my hardware and software environment)! My middleware was set up on a machine with an i5 processor, 8 GB of RAM and an Intel X25 SSD drive.

But seriously: every speed test, and generally speaking any performance test, depends so much(!) on your hardware and software configuration that it is always a great idea to try things in your own environment, relying less on the various information you find on the internet and more on your own findings!

The tests I decided to run had to meet the following requirements:

  • Test:
    • AJAX
    • Socket.IO persistent connection
    • Socket.IO non-persistent connections
  • Test 10, 100, 250 and 500 data exchanges between the client and the server
  • Each data exchange between the middleware server (a Node.js web server) and the client (a browser) is a 4 KB random data string
  • Run the server in release (not debug) mode
  • Use Firefox as the client
  • Minimize the console messages output, for both server and client
  • Do each test after a client full page reload
  • Repeat each test at least 3 times, to make sure the results are consistent
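The client snippets in this post use a logMsg() helper that is not shown; here is a minimal sketch of what it might look like (an assumption on my part, not code from the original tests):

```javascript
// Hypothetical logMsg() helper (the original post does not show it).
// In the browser it might append to a <div>; for simplicity we log to console.
function logMsg(msg) {
    var line = '[test] ' + msg;
    console.log(line);
    return line;
}
```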

Testing Socket.IO, using a persistent connection

I created a small Node.js server to handle the client requests:

io.sockets.on('connection', function (client) {
    // On each 'send_me_data' request, echo back the index plus 4 KB of random data.
    client.on('send_me_data', function (idx) {
        client.emit('you_have_data', idx, random_string(4096));
    });
});
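The server code relies on a random_string() helper that the post does not include; a plausible sketch (again, an assumption rather than the original implementation):

```javascript
// Hypothetical random_string() helper (not shown in the original post):
// returns a string of `len` random alphanumeric characters.
function random_string(len) {
    var chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
    var result = '';
    for (var i = 0; i < len; i++) {
        result += chars.charAt(Math.floor(Math.random() * chars.length));
    }
    return result;
}
```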

And this is the JS client script I used for test:

var socket = io.connect(document.location.href);

socket.on('you_have_data', function (idx, data) {
    // Measure the round-trip time for this exchange and add it to the total.
    var end_time = new Date();
    total_time += end_time - start_time;
    logMsg(total_time + '(ms.) [' + idx + '] - Received ' + data.length + ' bytes.');
    if (idx++ < countMax) {
        // Wait 500 ms, then request the next 4 KB block over the same connection.
        setTimeout(function () {
            start_time = new Date();
            socket.emit('send_me_data', idx);
        }, 500);
    }
});

Testing Socket.IO, using NON-persistent connections

This time, for each data exchange, I opened a new socket-io connection.

The Node.js server code was similar to the previous one, but this time I sent the data back to the client immediately after connect, since a new connection was initiated for each data exchange:

io.sockets.on('connection', function (client) {
    // A fresh connection is opened for every exchange, so send the data right away.
    client.emit('you_have_data', random_string(4096));
});

The client test code was:

function exchange(idx) {
    var start_time = new Date();
    // 'force new connection' tells Socket.IO to open a brand new connection
    // instead of reusing the existing one.
    var socket = io.connect(document.location.href, {'force new connection' : true});

    socket.on('you_have_data', function (data) {
        var end_time = new Date();
        total_time += end_time - start_time;
        // Tear this connection down before scheduling the next exchange.
        socket.removeAllListeners();
        socket.disconnect();
        logMsg(total_time + '(ms.) [' + idx + '] - Received ' + data.length + ' bytes.');

        if (idx++ < countMax) {
            setTimeout(function () {
                exchange(idx);
            }, 500);
        }
    });
}

Testing AJAX

Finally, I put AJAX to the test...

The Node.js server code was, again, not that different from the previous ones:

res.writeHead(200, {'Content-Type' : 'text/plain'});
res.end('_testcb(\'{"message": "' + random_string(4096) + '"}\')');

As for the client code, this is what I used to test:

function exchange(idx) {
    var start_time = new Date();
 
    $.ajax({
        url : 'http://localhost:8080/',
        dataType : "jsonp",
        jsonpCallback : "_testcb",
        timeout : 300,
        success : function (data) {
            var end_time = new Date();
            total_time += end_time - start_time;
            logMsg(total_time + '(ms.) [' + idx + '] - Received ' + data.length + ' bytes.');
             
            if (idx++ < countMax) {
                setTimeout(function () {
                    exchange(idx);
                }, 500);
            }
        },
        error : function (jqXHR, textStatus, errorThrown) {
            alert('Error: ' + textStatus + " " + errorThrown);
        }
    });
}

Remember: when coding AJAX together with Node.js, you need to take into account that you might be doing cross-domain requests and violating the same-origin policy, therefore you should use the JSONP-based format!

Btw, as you can see, I quoted only the most significant parts of the test code, to save space. If anyone needs the full code, server and client, please let me know; I'll be happy to share it.

OK – it’s time now to see what we got after all this work!

I have run each test for 10, 100, 250 and 500 data exchanges and this is what I got in the end:

Data exchanges | Socket.IO NON-persistent (ms.) | AJAX (ms.) | Socket.IO persistent (ms.)
            10 |                             90 |         40 |                         32
           100 |                            900 |        320 |                        340
           250 |                          2,400 |        800 |                        830
           500 |                          4,900 |      1,500 |                      1,600
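Dividing the totals by the number of exchanges makes the gap easier to read; a quick sketch using the 500-exchange row from the table above:

```javascript
// Per-exchange round-trip cost (ms), computed from the 500-exchange row.
var totalsMs = {
    'Socket.IO non-persistent' : 4900,
    'AJAX'                     : 1500,
    'Socket.IO persistent'     : 1600
};
var exchanges = 500;

function perExchange(totalMs) {
    return totalMs / exchanges;
}

Object.keys(totalsMs).forEach(function (name) {
    console.log(name + ': ~' + perExchange(totalsMs[name]) + ' ms per exchange');
});
```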

Looking into the results, we can notice a few things right away:

  1. For each type of test, the results scale quite linearly; this is good, as it shows that the results are consistent.
  2. The results clearly show that Socket.IO non-persistent connections perform significantly worse than the other two options.
  3. There doesn’t seem to be a big difference between AJAX and persistent Socket.IO connections; we are talking about only a few milliseconds of difference. This means that if you can live with fewer than 10,000 data exchanges per day, for example, chances are high that the user won’t notice any speed difference...

The graph below illustrates the numbers I obtained in test:

[Graph: nodejs_socketio_ajax_performance.png]

...So what’s next...?

...Well, I have to figure out what kind of traffic I need to support, and then I will re-run the tests for those numbers, this time excluding Socket.IO non-persistent connections; it is obvious that the real choice is between AJAX and persistent Socket.IO connections.

And I also learned that, most probably, the difference in speed will not be as big as one might expect, at least not for a “small-traffic” web site, so I need to start looking into the other advantages and disadvantages of each approach/technology when choosing my solution!

P.S. Here are a few more nice resources to find interesting stuff about Node.js, Socket.IO and AJAX:

