Fetching a MySQL table in PHP is slow

I have a rather large table that takes about 10-20 seconds to fetch, depending on the query. That makes the website seem slow.

I was thinking of using LIMIT to limit the number of results and using ajax to display the whole table bit by bit.

Right now I am using ajax to display the whole table at once.

I am thinking of doing something like this...

Backend:

    $offset = 0;
    $rowsToFetch = 100;
    do {
        // Fetch the next chunk of rows
        $query = "SELECT * FROM TABLE_NAME LIMIT $offset, $rowsToFetch";
        // ... run $query and send the resulting rows to the client ...
        $offset += 100;          // advance past the chunk just sent; $rowsToFetch stays 100
    } while ($rowsReturned > 0); // $rowsReturned comes from running the query; stop on an empty chunk
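
Something like this, maybe, as a one-chunk-per-request endpoint (the file name chunk.php, the offset parameter, and the mysqli credentials are just placeholders):

    <?php
    // chunk.php (hypothetical) - returns one chunk of rows per request.
    $offset      = max(0, (int) ($_GET['offset'] ?? 0));
    $rowsToFetch = 100;

    $db   = new mysqli('localhost', 'user', 'pass', 'dbname');
    $stmt = $db->prepare("SELECT * FROM TABLE_NAME LIMIT ?, ?");
    $stmt->bind_param('ii', $offset, $rowsToFetch);
    $stmt->execute();
    $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

    // The frontend keeps requesting offset + 100 until it gets an empty array.
    header('Content-Type: application/json');
    echo json_encode($rows);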

Frontend:

Call an ajax function that will call the backend recursively and display the table part by part until finished.

I am sure it's a bit more complicated than that.

Is there a better way to do this? Any ideas how to start? Pitfalls to look out for?

P.S. I don't want to use pagination. I want to display everything on one page (benefit: Ctrl+F searches the whole table).

Edit: Fetching in the database (command line) takes about 5-7 seconds, but displaying the records on the website takes about 10-20 seconds; the query itself takes about 4-5 seconds. I timed individual sections of the page to identify that.
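
For reference, a simple way to time sections like that in PHP (illustrative only; $db is assumed to be an open mysqli connection):

    $t0 = microtime(true);
    $result = $db->query($query);              // the query itself
    $tQuery = microtime(true) - $t0;

    $t0 = microtime(true);
    // ... build and echo the HTML for the rows ...
    $tRender = microtime(true) - $t0;

    error_log("query: {$tQuery}s, render: {$tRender}s");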


That won't speed up the query itself, because the first thing MySQL does is execute the whole query; only after that does it take the needed number of results. Consider using indexes instead, or reduce the number of joins and try several simple queries.
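
For example (illustrative only; the column name is hypothetical), an index on the column used in the WHERE or ORDER BY clause can be added and then verified with EXPLAIN:

    // Hypothetical: index the column your WHERE / ORDER BY uses.
    $db->query("CREATE INDEX idx_created ON TABLE_NAME (created_at)");

    // EXPLAIN shows whether MySQL actually uses the index.
    $plan = $db->query("EXPLAIN SELECT * FROM TABLE_NAME ORDER BY created_at LIMIT 100");
    print_r($plan->fetch_assoc());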


Use pagination if it's not necessary to show such a big table on one page.


How many records does it contain? 20 seconds is very long for such a query. But depending on how you measured it, you may have measured not only the query itself but also the traffic of pushing the results to the client.

You could try to flush the output while you are still processing the results of the query, but I doubt that will help when you're using Ajax: the response will typically only be processed once the request is complete.
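
A sketch of that flushing approach on a plain (non-Ajax) page load, assuming a mysqli result and a hypothetical name column:

    $result = $db->query("SELECT * FROM TABLE_NAME");
    $i = 0;
    while ($row = $result->fetch_assoc()) {
        echo '<tr><td>' . htmlspecialchars($row['name']) . '</td></tr>';
        if (++$i % 500 === 0) {        // push output every 500 rows
            if (ob_get_level() > 0) {
                ob_flush();            // flush PHP's output buffer if one is active
            }
            flush();                   // flush the SAPI / web server buffer
        }
    }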


Fetching piece by piece would be even slower. A LIMIT query with an ORDER BY clause still has to process all the rows in the table(s) so the results can be ordered properly; only then is the LIMIT applied. In other words, if you have 10,000 rows and fetch 100 at a time, you'd be making the database run 100 full queries - processing 1,000,000 rows' worth of data just to end up displaying 10,000, a 1% efficiency in terms of rows served vs. rows processed.

What you should do is use a multipart/x-mixed-replace response, and send out chunks of the response as your script fetches/processes them. This lets you run a single query, and still serve up the whole thing in easily digestible chunks.
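
A rough sketch of what that could look like (hedged: browser support for multipart/x-mixed-replace is limited, and the name column is a hypothetical example):

    // One query; the response is emitted as multipart parts as rows are processed.
    header('Content-Type: multipart/x-mixed-replace; boundary=rowchunk');

    $result = $db->query("SELECT * FROM TABLE_NAME");
    $buffer = '';
    $i = 0;
    while ($row = $result->fetch_assoc()) {
        $buffer .= '<tr><td>' . htmlspecialchars($row['name']) . "</td></tr>\n";
        if (++$i % 100 === 0) {                 // emit a part every 100 rows
            echo "--rowchunk\r\nContent-Type: text/html\r\n\r\n" . $buffer;
            flush();
            $buffer = '';
        }
    }
    echo "--rowchunk\r\nContent-Type: text/html\r\n\r\n" . $buffer;  // remainder
    echo "--rowchunk--\r\n";                    // closing boundary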


"P.S. I don't want to use pagination. I want to display everything in one page (Benefits: CTRL+F)"

If I were you, I'd ask myself whether the users of the application actually want pagination or not.

I believe that you do not want pagination, but I find it hard to believe that the average user wants your app to display everything on one page just to be able to use Ctrl+F.

I also believe your average user not only wants paginated pages but also some sort of search functionality.


You certainly won't improve response times, but you can display a message or a progress bar before each request, so the user has an idea of how long the process will take. If your table is large and you want to display all the information on one page, there's no way to make it faster unless you improve the hardware.


Are you compressing the HTTP response? Can the data be cached on the client?
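
For instance, gzip-compressing the response in PHP (one common option; assumes the zlib extension is available):

    // Compress the whole response; large HTML tables are highly repetitive,
    // so gzip can cut the transfer size dramatically.
    if (!ini_get('zlib.output_compression')) {
        ob_start('ob_gzhandler');
    }
    // ... echo the table as usual ...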
