Background process for importing data in PHP


Details

When a user first logs into my app I need to import all of their store's products from an API; this can be anywhere from 10 products to 11,000. So I'm thinking I need to inform the user that we'll import their products and email them when we're finished.

Questions

What would be the best way to go about importing this data without requiring the user to stay on the page?

Should I go down the pcntl_fork route?

Would system-style background tasks be better?


AFAIK there is no way to call pcntl_fork() from a web server process; it only works from the command line. You can, however, start a child process using exec() (or similar) that will keep running after the request that spawned it has finished.

I don't know how "correct" this is, but I would do something like this:

upload.php - Get the user to upload their product list in whatever format you want. I shall assume you already know how to do this - a minimal sketch of the form is shown below.
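
For reference, a minimal upload form might look like this. The field names 'file' and 'userId' just need to match what store.php reads; pulling the user ID from $_SESSION is an assumption - use however your app tracks the logged-in user:

<!-- upload.php - minimal sketch; the enctype attribute is required for file uploads -->
<form action="store.php" method="post" enctype="multipart/form-data">
  <input type="hidden" name="userId"
         value="<?php echo htmlspecialchars($_SESSION['userId']); ?>">
  <input type="file" name="file">
  <input type="submit" value="Upload product list">
</form>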

store.php - the upload form submits to this file:

// Make sure a file was uploaded and you have the user's ID
if (!isset($_FILES['file'],$_POST['userId']))
  exit('No file uploaded or bad user ID');

// Make sure the upload was successful
if ($_FILES['file']['error'])
  exit('File uploaded with error code '.$_FILES['file']['error']);

// Generate a temp name and store the file for processing
$tmpname = microtime(TRUE).'.tmp';
$tmppath = '/tmp/'; // ...or wherever you want to temporarily store the file
if (!move_uploaded_file($_FILES['file']['tmp_name'],$tmppath.$tmpname))
  exit('Could not store file for processing');

// Start an import process, then display a message to the user
// The ' > /dev/null &' is required here - it redirects output and lets the process
// run in the background, so this script doesn't wait for it to finish
// escapeshellarg() guards against shell injection via the user-supplied values
exec('php import.php '.escapeshellarg($_POST['userId']).' '.escapeshellarg($tmppath.$tmpname).' > /dev/null &');

// On Windows you can do this to start an asynchronous process instead:
//$WshShell = new COM("WScript.Shell");
//$oExec = $WshShell->Run("php import.php \"{$_POST['userId']}\" \"$tmppath$tmpname\"", 0, false);

exit("I'm importing your data - I'll email you when I've done it");

import.php - handles the import and sends an email

// Make sure the required command line arguments were passed and make sense
if (!isset($argv[1],$argv[2]) || !file_exists($argv[2])) {
  // handle improper calls here
}

// Connect to the DB here and look up the user's details from the user ID (passed in $argv[1])

// Do the import (pseudocode-ish)
$wasSuccessful = parse_import_data($argv[2]);

if ($wasSuccessful) {
  // send the user an email
} else {
  // handle import errors here
}

// Delete the file
unlink($argv[2]);
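
parse_import_data() above is pseudocode. As a rough sketch - assuming the uploaded file is a CSV with one product per row (SKU, name, price), a PDO connection in $db, and a products table, all of which are assumptions about your setup - it could look something like this:

// Rough sketch of parse_import_data() under the assumptions above
function parse_import_data($filename) {
  global $db; // assumed PDO instance created during the DB connection step

  $handle = fopen($filename, 'r');
  if (!$handle)
    return false;

  $stmt = $db->prepare('INSERT INTO products (sku, name, price) VALUES (?, ?, ?)');

  while (($row = fgetcsv($handle)) !== FALSE) {
    if (count($row) < 3)
      continue; // skip malformed rows
    $stmt->execute(array($row[0], $row[1], (float)$row[2]));
  }

  fclose($handle);
  return true;
}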

The main issue with this approach is that if lots of people upload lists to be imported at the same time, you risk exhausting your system resources with many instances of import.php running simultaneously.

For this reason, it is possibly better to schedule a cron job to import the lists one at a time as suggested by Aaron Bruce - but which approach is best for you will depend on your precise requirements.


I think the "standard" way to do this in PHP would be to run a cron job every five minutes or so that checks a queue of pending imports.

So your user logs in, and part of the login process is to add them to your "pending_import" table (or however you choose to store the import queue). Then the next time the cron job fires, it will take care of the current contents of your queue.
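
For example, a crontab entry like "*/5 * * * * php /path/to/process_queue.php" could drive a worker along these lines. The pending_import table layout (id, user_id, filename, status), the DSN, and the reuse of parse_import_data() from the other answer are all assumptions:

// process_queue.php - rough sketch of a cron-driven import worker
$db = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass'); // assumed credentials

// Grab the oldest pending import so only one job runs per cron invocation
$job = $db->query("SELECT * FROM pending_import WHERE status = 'pending' ORDER BY id LIMIT 1")
          ->fetch(PDO::FETCH_ASSOC);

if ($job) {
  // Mark it as running so an overlapping cron run doesn't pick it up again
  $db->prepare("UPDATE pending_import SET status = 'running' WHERE id = ?")
     ->execute(array($job['id']));

  $wasSuccessful = parse_import_data($job['filename']); // see the other answer

  $db->prepare("UPDATE pending_import SET status = ? WHERE id = ?")
     ->execute(array($wasSuccessful ? 'done' : 'failed', $job['id']));

  // email the user the result here, as in import.php
}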
