Thursday 15 July 2010

asynchronous - Use PHP to sync large amounts of text


There are a number of laptops in the field that need to pull daily information from our servers. Each laptop runs Server2Go (essentially Apache, PHP, and MySQL running as a single executable), which launches a local webpage. That webpage calls a URL on our server using the following code:

  $handle = fopen($downloadURL, "rb");
  $contents = stream_get_contents($handle);
  fclose($handle);

$downloadURL grabs a ton of information from a MySQL database on our server and outputs the results to the laptop. I currently have the results coming across as SQL statements I build myself (i.e., if I query the database for names, I return to the laptop text such as "INSERT INTO names SET name = 'John Smith'"), so the page takes the information from the online database and loads it into the laptop's local database.

The problem I'm running into is that the amount of data is too large: the laptop's webpage times out while retrieving the information from the server. I have set the PHP timeout limits very high, but I still run into problems. Can anyone think of a better way to do this? Will stream_get_contents stay connected to the server if I flush the data to it in smaller sections?
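One way to avoid buffering the whole response with stream_get_contents() is to read the stream in fixed-size chunks and write each chunk to a local file as it arrives, resetting the script timeout as the transfer makes progress. A minimal sketch, where the chunk size and function name are my own choices:

```php
<?php
// Download a URL to a local file in small chunks instead of
// holding the entire response in memory at once.
function download_in_chunks(string $url, string $destFile, int $chunkSize = 8192): int
{
    $in  = fopen($url, 'rb');
    $out = fopen($destFile, 'wb');
    if ($in === false || $out === false) {
        throw new RuntimeException("Could not open $url or $destFile");
    }

    $total = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $total += fwrite($out, $chunk);
        // Reset the execution timer after each chunk so a slow but
        // steady transfer is not killed by max_execution_time.
        set_time_limit(30);
    }

    fclose($in);
    fclose($out);
    return $total; // bytes written
}
```

Since fopen() accepts http:// URLs when allow_url_fopen is enabled, the same function works against the remote $downloadURL as well as a local path.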

Thanks for any input.

What if you send just the raw data and generate the SQL on the receiving side? That would save you a lot of bytes in transmission.
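Building on that suggestion, the server could emit one JSON-encoded row per line, and the laptop could rebuild the INSERT statements locally. A sketch of the receiving side; the table name is an example, and addslashes() here is only a rough escape (prepared statements would be the safer choice on the local database):

```php
<?php
// Turn one decoded row (an associative array) into an INSERT
// statement on the receiving side, so the wire format stays small.
function row_to_insert(string $table, array $row): string
{
    $cols = array_map(
        function ($c) { return '`' . str_replace('`', '', $c) . '`'; },
        array_keys($row)
    );
    $vals = array_map(
        // Rough escaping for illustration only.
        function ($v) { return "'" . addslashes((string)$v) . "'"; },
        array_values($row)
    );
    return sprintf(
        'INSERT INTO `%s` (%s) VALUES (%s);',
        $table,
        implode(', ', $cols),
        implode(', ', $vals)
    );
}

// Each line of the download is one JSON object, e.g. {"name":"John Smith"}.
function line_to_insert(string $table, string $jsonLine): string
{
    return row_to_insert($table, json_decode($jsonLine, true));
}
```

For example, `line_to_insert('names', '{"name":"John Smith"}')` produces ``INSERT INTO `names` (`name`) VALUES ('John Smith');``.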

Is the data updated incrementally? That is, can you send only the changes since the last update?
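If the rows carry a last-modified timestamp, the server-side script can answer with only the rows changed since the laptop's last successful sync. A sketch of building that query; the `updated_at` column, the `last_sync` parameter, and the function name are all assumptions:

```php
<?php
// Build the incremental-sync query: when the client supplies its
// last sync timestamp, select only rows modified after it.
function build_sync_query(string $table, ?string $lastSync): array
{
    if ($lastSync === null) {
        // First sync: send everything.
        return ["SELECT * FROM `$table`", []];
    }
    return [
        "SELECT * FROM `$table` WHERE updated_at > ?",
        [$lastSync],
    ];
}

// Server side, with $pdo an open PDO connection (sketch):
// list($sql, $params) = build_sync_query('names', $_GET['last_sync'] ?? null);
// $stmt = $pdo->prepare($sql);
// $stmt->execute($params);
```

The laptop would store the timestamp of its last successful run and send it back as a query parameter on the next request.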

If you do need to send a large portion of the data, you may want to look at gzip or zip compression, and unzip on the other side. (I haven't checked how to do this, but I believe it's available in PHP.)
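PHP's zlib extension does cover this: the dump can be compressed before it leaves the server and expanded on the laptop after download. A round-trip sketch (the function names are my own wrappers; the web server could also apply gzip transparently via Content-Encoding):

```php
<?php
// Server side: compress the payload before sending it.
function compress_payload(string $data): string
{
    return gzencode($data, 9); // gzip format, maximum compression
}

// Laptop side: expand the payload again after download.
function decompress_payload(string $gz): string
{
    return gzdecode($gz);
}
```

Repetitive text such as thousands of similar INSERT statements compresses very well, so this alone can cut the transfer size dramatically.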

