
I have been doing some tests with PHP, and I have noticed that including many small files is much slower than including a single file that contains all of the same functions.

My test involved creating 1025 files: 1024 of them contained the text <?php class cls$i {} ?> (where $i was the file number), and 1 file was a concatenation of the text of all the others. I then had two functions, one for testing each case. The test that loaded the single file took about 6ms to compile to bytecode and make the contents available to the system; however, including the 1024 separate files took almost 600ms.
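The test above can be sketched roughly as follows. This is a hypothetical reconstruction, not the original harness: the file names, the temp directory, and the reduced file count are all my own choices, and the combined file defines a disjoint set of class names (combined$i) so both variants can be included in one process without "class already declared" errors.

```php
<?php
// Hedged sketch of the many-files vs. one-file include benchmark.
$n   = 64; // the original test used 1024 files
$dir = sys_get_temp_dir() . '/inc_bench_' . getmypid();
mkdir($dir);

$combined = "<?php\n";
for ($i = 0; $i < $n; $i++) {
    // One tiny class per file, as in the test described above.
    file_put_contents("$dir/cls$i.php", "<?php class cls$i {}\n");
    // Disjoint names in the combined file to avoid redeclaration.
    $combined .= "class combined$i {}\n";
}
file_put_contents("$dir/all.php", $combined);

// Case 1: a single include of the concatenated file.
$t0 = microtime(true);
include "$dir/all.php";
$oneFile = microtime(true) - $t0;

// Case 2: n separate includes (one open/stat/compile per file).
$t0 = microtime(true);
for ($i = 0; $i < $n; $i++) {
    include "$dir/cls$i.php";
}
$manyFiles = microtime(true) - $t0;

printf("1 file: %.3f ms, %d files: %.3f ms\n",
       $oneFile * 1000, $n, $manyFiles * 1000);
```

A fairer comparison would run each case in a fresh PHP process, since opcode caching and the realpath cache warm up within a single run.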

In terms of size, the 1024 individual files together are exactly the same size as the single file. I am also running APC to cache the bytecode, but in practice it only shaves off a few milliseconds.

I also created a ramdisk which held all the files, but that was only marginally (10ms on average) faster.

So, having said that, why are individual files SOOOOO much slower than a single file? Is it down to significant inefficiencies in the loading engine within PHP, or have I made a considerable cockup in the configuration (on my local system, a standard AMPPS installation)?


1 Answer


My first guess is that it's all the stat system calls being made.

What happens if you turn that off with the apc.stat setting?

APC caches the bytecode, but with apc.stat=1 it still stats each file on every request to check whether it has changed.
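As an illustration (assuming a standard APC setup, where apc.stat defaults to 1), disabling the stat check would be a one-line php.ini change; note that with this set, changed files are not picked up until the cache is cleared or PHP is restarted:

```ini
; php.ini (or the APC ini fragment)
; Trust the cache and skip the per-include stat() check.
apc.stat = 0
```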

Edit: in response to your comment, digging a bit deeper. How are you referencing the included files? If you are using relative paths, then include_path comes into play.

In other words:

include "somefile.php";

is likely to be slower than

include __DIR__ . '/otherfile.php';
Answered 2013-07-18T00:24:26.847