
Performance Test #15

Open
mohit-rathee opened this issue Mar 30, 2024 · 17 comments

@mohit-rathee
Owner

Test web-chat under various stressful conditions.
Give an analysis of the experience under each of these conditions.

@mohit-rathee
Owner Author

Messages typed: manually (by hand)
Total messages sent: 400
Total time taken: 60 sec
Number of users present: 1

RESULT:
150 ms per message
no lag at all

@mohit-rathee
Owner Author

Messages typed: automatically (Selenium)
Total messages sent: 1000
Total time taken: 159 sec
Number of users present: 2

RESULT:
159 ms per message
no lag at all

@mohit-rathee
Owner Author

Messages typed: automatically (Selenium)
Total messages sent: 1000 (100 messages by 10 different users)
Total time taken: 211 sec
Number of users present: 10

RESULT:
211 ms per message
no lag at all

@mohit-rathee
Owner Author

All tests so far were performed on a single PC with multiple Chrome instances, so due to hardware limitations the stress test is still incomplete.
I have not experienced any lag up to this point.

@mohit-rathee
Owner Author

Shocking test results:

We tried the same stress test against WhatsApp under the same network conditions, using the Selenium library, and found shocking results.
It took more than 600 sec to send 1000 messages (about 100 per minute),
i.e. 600 ms per message,
which is roughly 3x slower than web-chat.

My instincts about this result:

Maybe it is the WhatsApp API that delivers messages within a few milliseconds, but in the browser it is not that fast.
Or maybe WhatsApp's speed claims hold on its internal network of servers but are not guaranteed over the public internet.

@ofmukesh

great 🔥
So how did you test the APIs? How did you call multiple APIs at the same time (by different users)? And is the reported time the total time taken by the request, or the time the server takes to process it...?

@mohit-rathee
Owner Author

mohit-rathee commented Mar 31, 2024

With some bash scripting and Selenium, I fired up multiple Chrome instances that create random users in the application, and they all send random messages as programmed. code is here
The total time taken is actually (time of last msg - time of first msg), which estimates the time taken to process all requests.
I need to consider using other utilities like wscat, which can push many more messages into web-chat without needing to open a Chrome instance.
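For reference, a minimal sketch of that kind of Selenium driver (the actual script is the one linked above; the base URL, CSS selector, and message counts here are assumptions):

```python
# Minimal sketch of a Selenium stress driver for web-chat.
# The base URL and the "#message-input" selector are assumptions,
# not the actual web-chat markup; the real script is linked above.
import random
import string
import time

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

BASE_URL = "http://localhost:8000"  # assumed dev-server address


def random_text(length=20):
    return "".join(random.choices(string.ascii_letters + " ", k=length))


def run_user(messages=100):
    driver = webdriver.Chrome()
    driver.get(BASE_URL)
    box = driver.find_element(By.CSS_SELECTOR, "#message-input")
    start = time.time()
    for _ in range(messages):
        box.send_keys(random_text())
        box.send_keys(Keys.ENTER)
    elapsed = time.time() - start
    print(f"{messages} messages in {elapsed:.1f}s "
          f"({elapsed / messages * 1000:.0f} ms per message)")
    driver.quit()


if __name__ == "__main__":
    run_user()
```

The bash part just launches several copies of a script like this in parallel, one per simulated user. For the wscat idea, something like `wscat -c ws://localhost:8000/ws` (endpoint path assumed) would push frames over the socket without a browser at all.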

@mohit-rathee
Owner Author

mohit-rathee commented Mar 31, 2024

Here are the response times of the various APIs used by the web-chat backend server.
(Measurements are taken from the browser's network tab.)
To send a message in a channel: 118 ms
To create a channel: 121 ms
To react to a message: 113 ms
To reply to a message: 115 ms
To DM a person: 90 ms
To load the previous messages of a channel (30 messages): 100 ms

The term "Load" means the browser sends the server a summary of the data it has cached, and if newer data is present the server responds with only the new data.
If no cache is found, the server sends all the messages, media, and users present on the server.

To Load (without cache): 230 ms
To Load (with cache): 100 ms
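A rough sketch of that delta idea (the handler, table, and column names here are assumptions, not the actual web-chat code): the client sends the newest message id it has cached, and the server returns only the rows beyond it.

```python
# Sketch of a cache-aware Load handler as described above.
# Table/column names and the per-channel-table layout are assumptions.
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///webchat.db")  # assumed database file


def load_channel(channel, last_cached_id=None):
    # NOTE: `channel` is interpolated into SQL because each channel has its
    # own table; a real handler must validate it against the channels table.
    with engine.connect() as conn:
        if last_cached_id is None:
            # No client cache: send the full history.
            rows = conn.execute(
                text(f'SELECT id, author, body FROM "{channel}" ORDER BY id')
            )
        else:
            # Client already has messages up to last_cached_id: send only the delta.
            rows = conn.execute(
                text(f'SELECT id, author, body FROM "{channel}" '
                     'WHERE id > :last ORDER BY id'),
                {"last": last_cached_id},
            )
        return [dict(row._mapping) for row in rows]
```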

At the time of testing, an sqlite3 database was used, which lives alongside the backend.
If we opt for a database that runs outside our backend, some additional latency can be expected.

@mohit-rathee
Owner Author

mohit-rathee commented Mar 31, 2024

Here are the response times of these APIs, but from the backend's perspective.
(The time is measured by the backend server while processing each request.)
To send a message in a channel: 12 ms
To create a channel: 31 ms
To react to a message: 9 ms
To DM a person: 1 ms
To load the previous messages of a channel (30 messages): 12 ms

These depend entirely on the amount of data present on the server.
To Load (without cache): 40 ms
To Load (with cache): 11 ms
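For context, server-side numbers like these can be collected with something as simple as a timing wrapper around each handler (a generic sketch, not the actual web-chat instrumentation):

```python
# Generic per-handler timing wrapper; a sketch of how the server-side
# numbers above could be gathered, not the actual web-chat code.
import functools
import time


def timed(handler):
    @functools.wraps(handler)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = handler(*args, **kwargs)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{handler.__name__}: {elapsed_ms:.1f} ms")
        return result
    return wrapper


@timed
def send_message(channel, author, body):
    ...  # database insert and broadcast would go here
```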

@ofmukesh

ofmukesh commented Apr 1, 2024

Why does the channel-creation API take longer? Do you know the reason?

@mohit-rathee
Owner Author

mohit-rathee commented Apr 1, 2024

Generally, one database operation takes 10-15 ms.
And to create a new channel it needs to do two of them:

  1. Add the channel name to the channels table.
  2. Create an actual new table inside the database.

So overall it takes 25 to 30 ms.
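As a sketch, those two operations look roughly like this in SQLAlchemy (table and column names are assumptions, not the exact web-chat schema):

```python
# Sketch of the two database operations behind channel creation.
# Table/column names are assumptions, not the exact web-chat schema.
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///webchat.db")  # assumed database file


def create_channel(name):
    with engine.begin() as conn:
        # 1. Register the channel in the channels table (~10-15 ms on its own).
        conn.execute(
            text("INSERT INTO channels (name) VALUES (:name)"),
            {"name": name},
        )
        # 2. Create the per-channel message table (another ~10-15 ms).
        conn.execute(
            text(f'CREATE TABLE "{name}" '
                 '(id INTEGER PRIMARY KEY, author TEXT, body TEXT)')
        )
```

Each call is a separate round trip through the ORM and driver, which is where the two 10-15 ms costs come from.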

@ofmukesh

ofmukesh commented Apr 1, 2024

So, how can we reduce the time? Think about it...

When are you free tomorrow for the meeting?

@mohit-rathee
Owner Author

Approach to reduce response time:

The problem we are facing comes from making multiple database requests.

The Load API alone makes (n + 3) requests,
where n is the number of tables present and the remaining 3 fetch other necessary information.

And the create-channel API fires 2 requests.

So if we can reduce these requests, we can reduce the response time too.

By using native SQL commands:

ORMs also provide a way to run raw SQL commands. If we take advantage of
compound SQL commands, we can see blazingly fast response times.
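One way to sketch that for SQLite is to hand both channel-creation statements to the driver in a single call (a sketch under the assumptions above, not the final web-chat implementation):

```python
# Sketch of the compound-SQL idea for channel creation on SQLite:
# both statements go to the driver in one call instead of two ORM requests.
# Table/column names are assumptions; `name` must be validated in real code.
import sqlite3


def create_channel_compound(name, db_path="webchat.db"):
    con = sqlite3.connect(db_path)
    try:
        con.executescript(f"""
            INSERT INTO channels (name) VALUES ('{name}');
            CREATE TABLE "{name}" (id INTEGER PRIMARY KEY, author TEXT, body TEXT);
        """)
        con.commit()
    finally:
        con.close()
```

With SQLAlchemy, the same script can be run through `engine.raw_connection()` and the underlying sqlite3 cursor's `executescript`.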

@ofmukesh

ofmukesh commented Apr 2, 2024

Great 🥳, we can try that.

And for the Create Channel process, we will use UUIDs and ACID operations & (sync to async functions...) to reduce the processing time. We will discuss these things again, so schedule a meeting tomorrow when you are free.

Also, create another branch in this repo so that we can easily manage the main branch.

@mohit-rathee
Owner Author

We successfully executed the compound SQL (creating a new table in the database and updating its details in another table), and it generally took 8 to 9 ms.
The other task was to reflect those changes into our ORM:

  1. First approach: when we used the general (reflect/automap) functions to map all changes from the database into our ORM, the cost became linear. As the number of tables increases, the time taken by these functions also increases.
    RESULT: even with just 50 tables, the autoload function took 1 sec.

  2. Second approach: by learning how the sqlalchemy library works (it creates classes from metadata requested from the database), we optimised our application to manually create the classes and metadata in the ORM and then just map each class to the metadata of its table. This way, mapping a new table takes constant time.
    RESULT: around 4 to 5 ms.

According to this, creating a new channel will take approximately 15 to 17 ms. 🥳
Note that this experiment was done independently of web-chat, so its results apply to any project. experiment code
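A sketch of that second, constant-time approach in SQLAlchemy (column names and the registry setup are assumptions; the actual experiment code is linked above):

```python
# Sketch of the constant-time mapping approach: build the Table metadata in
# Python and map a class to it imperatively, instead of reflecting the whole
# database. Column names are assumptions; the table itself is assumed to have
# been created already by the compound SQL above.
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine
from sqlalchemy.orm import registry

engine = create_engine("sqlite:///webchat.db")  # assumed database file
metadata = MetaData()
mapper_registry = registry()


def map_new_channel(name):
    table = Table(
        name,
        metadata,
        Column("id", Integer, primary_key=True),
        Column("author", String),
        Column("body", String),
    )
    channel_cls = type(f"Channel_{name}", (), {})
    mapper_registry.map_imperatively(channel_cls, table)
    return channel_cls
```

The reflect/automap route would instead call something like `metadata.reflect(bind=engine)`, which re-reads every table and is what makes the first approach scale linearly with the table count.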

@ofmukesh

ofmukesh commented Apr 9, 2024

cool 🔥

@ofmukesh

ofmukesh commented Apr 9, 2024

To send a message in a channel: 12 ms
Now work on this API.
