Wednesday, June 29, 2016

How can my web site make many concurrent HTTP requests?

I have a site that loads many individual pieces of data from a remote server - that is, a public API server separate from my webserver. The number of concurrent requests the site makes goes far beyond the browser's per-host connection limit (typically around six concurrent connections), so firing direct Ajax calls causes severe performance problems.
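To make the limit concrete, this is roughly what a client-side workaround looks like: a queue that never runs more than a fixed number of requests at once. The function and parameter names here are illustrative, not from my actual code.

```javascript
// Minimal sketch of a concurrency-limited task queue, mirroring the
// per-host connection cap that browsers enforce. Each task is a function
// returning a promise (e.g. () => fetch(url)).
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  // Start `limit` workers; each pulls the next pending task as soon as
  // it finishes its current one.
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    async () => {
      while (next < tasks.length) {
        const i = next++;
        results[i] = await tasks[i]();
      }
    }
  );
  await Promise.all(workers);
  return results;
}
```

This keeps the page responsive, but with hundreds of requests the total wall-clock time is still dominated by the cap, which is why I moved the fan-out to the server.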

The site currently uses a PHP script to act as an Ajax middleman: the client sends the server a list of URLs, the webserver (which has no such concurrent HTTP request limit) makes the requests on the client's behalf, and the webserver forwards the API server's responses to the client.
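The middleman's core logic is simple; sketched here in JavaScript for brevity (the real implementation is a PHP script), with `fetchOne` standing in for whatever HTTP client the server uses:

```javascript
// Server-side fan-out, sketched in JS. `fetchOne(url)` is a hypothetical
// stand-in for the server's HTTP client and returns a promise.
async function proxyBatch(urls, fetchOne) {
  // Fire all upstream requests concurrently; unlike the browser, the
  // webserver has no per-host connection cap to worry about.
  const responses = await Promise.all(urls.map(u => fetchOne(u)));
  // The whole batch is returned only once every request has completed.
  return responses;
}
```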

One big issue with this workaround is that the client has to wait for the webserver to receive every HTTP response before it can see any of the data: one slow API call delays the entire batch. I know I could fix this with something like Socket.io, but that seems like overkill.
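One lighter-weight direction I've considered (an assumption on my part, not something I've built): have the proxy flush each upstream response as soon as it arrives, one JSON object per line, and parse the stream incrementally on the client. The line-delimited framing here is my own choice for the sketch.

```javascript
// Incremental parser for a line-delimited JSON stream. `onItem` is called
// once per complete JSON object, so the page can render results as they
// arrive instead of waiting for the whole batch.
function makeNdjsonParser(onItem) {
  let buffer = "";
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial trailing line for the next chunk
    for (const line of lines) {
      if (line.trim()) onItem(JSON.parse(line));
    }
  };
}
```

Chunks arriving from the network can split a JSON object anywhere, which is why the parser buffers the trailing partial line between calls.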

If this is a common issue, what solutions exist to deal with it?

If it's not, how do sites that rely heavily on data from APIs avoid or mitigate this issue?



