High number of concurrent read of a php file

Source: https://www.devze.com, 2023-04-06 06:50
I have a PHP file that dynamically reads information from various sources, like

search.php?q=keyword&category=cat-number

Imagine that we have 10,000 concurrent visitors. Is it better to split the traffic across different files (whenever possible), e.g.

1,000 on cat1.php?q=keyword
1,000 on cat2.php?q=keyword
1,000 on cat3.php?q=keyword
....

I mean creating a separate search PHP file for each category, so that searches are split over several PHP files.

Is there, theoretically or practically, a performance drawback to increasing the number of concurrent accesses to a single PHP file?


No. In fact, it should be easier for your server to use the same file over and over again. When using APC, only that one file needs to be cached, and the OS file-system cache will hold it as well. Even the disk can reach the file more easily, because its head stays in the same position. All theoretical, of course; in practice I don't think you'll notice a difference.
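For reference, the APC opcode cache mentioned above is configured in php.ini; a minimal fragment might look like this (the values are illustrative, not recommendations):

```ini
; enable the APC opcode cache so compiled scripts are reused across requests
apc.enabled=1
; shared memory reserved for cached opcodes
apc.shm_size=64M
; check file modification times on each request; set to 0 to skip the
; stat() call for a small extra speedup once files no longer change
apc.stat=1
```

With a single search.php, one cached entry serves every request, which is the point the answer is making.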


No, there is no use in splitting traffic like that.


The thing you're trying to do is load balancing, but a pointless form of it. Having different files won't do anything except add overhead. Also, PHP itself won't be the bottleneck when performing searches if your data source is a remote location (another website) or a database.
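If you want to check this empirically rather than theoretically, a load-testing tool such as ApacheBench can compare the two layouts under identical concurrency. The hostname and paths below are placeholders for your own setup, and the commands assume a live server to test against:

```shell
# Single-file layout: 10,000 requests, 100 at a time, against one endpoint
ab -n 10000 -c 100 "http://example.com/search.php?q=keyword&category=1"

# Split layout: same load against one of the per-category files, for comparison
ab -n 10000 -c 100 "http://example.com/cat1.php?q=keyword"
```

Compare the "Requests per second" and latency percentiles in the two reports; if the answers above are right, the numbers should be essentially identical, with any real difference coming from the data source, not the file layout.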
