I am running a query that takes 0.0053 seconds to run, but about 12 seconds to fetch, by which time the script is about to time out, having pulled about 100,000 rows into PHP.
I am thinking that a cursor or something similar might be a way to speed up the fetching, like in Python where you can step through the rows one at a time instead of loading everything into PHP at once.
Plan A: SELECT COUNT(*) first, to see whether it is 100K or 300K or whatever, plug that into an empirical formula (like $sec = 10 + $count / 2000), and feed the result to set_time_limit($sec). (This assumes that the COUNT(*) itself is not time-consuming!)
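A minimal sketch of Plan A, assuming a mysqli connection in $mysqli and a hypothetical table my_table (adjust the queries to your own schema):

```php
<?php
// 1. Cheap COUNT(*) to find out how many rows are coming.
$result = $mysqli->query("SELECT COUNT(*) FROM my_table WHERE some_condition = 1");
$count  = (int) $result->fetch_row()[0];

// 2. Empirical formula: a 10-second floor plus one second per 2000 rows.
$sec = 10 + (int) ($count / 2000);

// 3. Raise the script's time limit before the big fetch.
set_time_limit($sec);

// 4. Now run the real query and fetch as usual.
$rows = $mysqli->query("SELECT * FROM my_table WHERE some_condition = 1");
while ($row = $rows->fetch_assoc()) {
    // ... process $row ...
}
```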
Plan B: "chunk" the data so that you don't try to fetch it all at once. (Keep in mind that PHP has a default memory limit that you may be threatening.) Chunking should not use OFFSET and LIMIT; instead, "remember where you left off". Otherwise each chunk gets slower and slower and the script will eventually time out anyway. More details on chunking are in [link].
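A sketch of Plan B, again with the hypothetical my_table and assuming it has an AUTO_INCREMENT primary key id. The WHERE id > $lastId ... LIMIT pattern is the "remember where you left off" approach, as opposed to OFFSET:

```php
<?php
$lastId    = 0;
$chunkSize = 1000;

do {
    set_time_limit(30);   // reset the timer for each chunk
    $sql = sprintf(
        "SELECT id, payload FROM my_table WHERE id > %d ORDER BY id LIMIT %d",
        $lastId,
        $chunkSize
    );
    $result  = $mysqli->query($sql);
    $fetched = 0;
    while ($row = $result->fetch_assoc()) {
        // ... process $row ...
        $lastId = (int) $row['id'];   // remember where we left off
        $fetched++;
    }
    $result->free();                  // release this chunk's memory
} while ($fetched === $chunkSize);    // a short chunk means we reached the end
```

Because each chunk starts from the last id seen, every query reads only the rows it returns; an OFFSET-based version would have to skip over all the previously fetched rows each time, which is why it gets slower and slower.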