Tuesday, February 28, 2017

What are best practices for a high-traffic web application, such as tracking page loads across multiple sites?

We have the following server resources:

16 GB RAM
Intel Xeon E5-2650 v2 @ 2.60GHz (8 Core)
240GB SSD (RAID 10)
1 IP Address (2 extra)
Unmetered Traffic / 1Gbit Port / 100Mbps guaranteed

We use a piece of JavaScript on around 30 websites so far; it sends HTTP requests to a PHP script on our server, which tracks the page loads and saves the website information in our MySQL DB (roughly what the sketch below does).
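
To make the setup concrete, the PHP endpoint currently does roughly this on every hit (a simplified sketch; the table and column names here are placeholders, not our real schema):

<?php
// track.php - one MySQL connection and INSERT per incoming request
$pdo = new PDO('mysql:host=localhost;dbname=tracking;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO page_loads (site_id, page_url, referrer, user_agent, created_at)
     VALUES (?, ?, ?, ?, NOW())'
);
$stmt->execute([
    $_GET['site_id'] ?? null,
    $_GET['url'] ?? '',
    $_SERVER['HTTP_REFERER'] ?? '',
    $_SERVER['HTTP_USER_AGENT'] ?? '',
]);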

The requests sometimes reach millions per day and the server becomes very slow; once I disable sending the requests to our server, everything works fine again.

I thought it might be an issue with too many concurrent transactions hitting the DB at the same time, so I disabled the DB-storage part of the script, but the problem remained.

Any suggestions regarding the above issue?

Assuming we move to cloud hosting with more resources, is it still fine to save such requests to the DB immediately? I was thinking of saving them to a file and dumping it into the DB every 4 hours, for example (rough sketch below).
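
The buffering idea I have in mind would look roughly like this (a rough sketch only; the file paths, hourly rotation, and table layout are assumptions for illustration):

<?php
// track.php - append one JSON line per hit instead of touching MySQL
$line = json_encode([
    'site_id' => $_GET['site_id'] ?? null,
    'url'     => $_GET['url'] ?? '',
    'ts'      => time(),
]) . PHP_EOL;
// LOCK_EX keeps concurrent requests from interleaving their writes
file_put_contents('/var/log/tracker/hits-' . date('Y-m-d-H') . '.log', $line, FILE_APPEND | LOCK_EX);

<?php
// import.php - run from cron every few hours, batch-inserts the buffered lines
$pdo = new PDO('mysql:host=localhost;dbname=tracking;charset=utf8', 'user', 'pass');
$current = '/var/log/tracker/hits-' . date('Y-m-d-H') . '.log';
foreach (glob('/var/log/tracker/hits-*.log') as $file) {
    if ($file === $current) {
        continue; // skip the file that is still being written to
    }
    $rows = array_map('json_decode', file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
    $stmt = $pdo->prepare('INSERT INTO page_loads (site_id, page_url, created_at) VALUES (?, ?, FROM_UNIXTIME(?))');
    $pdo->beginTransaction(); // one transaction per batch instead of one per hit
    foreach ($rows as $r) {
        $stmt->execute([$r->site_id, $r->url, $r->ts]);
    }
    $pdo->commit();
    unlink($file); // or move it to an archive directory
}

The point of that sketch is to replace one connection plus INSERT per hit with one transaction per batch. Is that kind of batching the better practice here, or is there a more standard approach for this level of traffic?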



