How to run bulk MySQL queries successfully?

I have created a PHP script that reads an Excel sheet of thousands of rows and adds all the data into a MySQL database through INSERT queries. The issue now is that the data inside the Excel sheet has grown from thousands to millions of rows, so a timeout occurs while running the script. I want to know the best way to run all the queries without hitting a timeout.

One more thing: I first ran this script on a web hosting account, where it timed out after very few records. Later I executed it on my local WAMP server, which stored more records than the hosting did, but it still timed out. A solution that works on both the online and local servers would be ideal.


In my script I use this:

// lift PHP's memory limit for this script
ini_set("memory_limit", "-1");
// remove the maximum execution time so the script cannot time out
set_time_limit(0);


You can always try setting:

ini_set('memory_limit', '-1');

[docs]

Or set it to a more appropriate value.

Alternatively, move more of the work into SQL itself, or set up a cron job that launches the script repeatedly so it processes the data in smaller batches until the import is complete (see the sketch below).
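
For the batched approach, a minimal sketch could look like the following. The table name, column names, the read_excel_rows() helper and the progress file are all assumptions for illustration; the point is that each cron run imports one fixed-size chunk and records where it stopped.

<?php
// Hypothetical batch runner: each invocation (e.g. from cron) imports one chunk.
$batchSize  = 5000;                        // rows per run
$offsetFile = __DIR__ . '/import.offset';  // remembers where the previous run stopped
$offset     = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$pdo  = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$rows = read_excel_rows('data.xlsx', $offset, $batchSize); // your existing Excel reader

$stmt = $pdo->prepare('INSERT INTO my_table (col_a, col_b) VALUES (?, ?)');
$pdo->beginTransaction();                  // one transaction per chunk is far faster than autocommit
foreach ($rows as $row) {
    $stmt->execute([$row[0], $row[1]]);
}
$pdo->commit();

file_put_contents($offsetFile, $offset + count($rows)); // save progress for the next run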

Also see the MySQL manual page below to make your INSERTs more efficient, and take a look at LOAD DATA INFILE:

http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
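
One of the tips on that page is to insert many rows with a single multi-row INSERT instead of one statement per row. A rough sketch with PDO (table and column names are made up) could be:

<?php
// Hypothetical example: group rows into one multi-row INSERT to cut per-statement overhead.
$rows = [['a1', 'b1'], ['a2', 'b2'], ['a3', 'b3']]; // one chunk of parsed Excel rows

$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
$sql = "INSERT INTO my_table (col_a, col_b) VALUES $placeholders";

$params = [];
foreach ($rows as $row) {
    $params[] = $row[0];
    $params[] = $row[1];
}

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->prepare($sql)->execute($params); // one round trip for the whole chunk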


If you have millions of rows, you might not be using the right tool.

If possible, I'd suggest using the LOAD DATA INFILE statement (docs), which can accept a CSV file directly. That way you would skip the PHP layer for the insertion entirely. If you still need to process the file before adding it to the database, you could convert the data into the needed format, save it to CSV, and then use the above-mentioned statement.
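
As a rough illustration (the file path, table and column names are assumptions), the prepared CSV could be loaded from PHP with something like:

<?php
// Hypothetical example: bulk-load a prepared CSV file, bypassing per-row INSERTs entirely.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // allow LOAD DATA LOCAL INFILE from the client side
);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/data.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (col_a, col_b)
");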
