I have been doing some tests with PHP, and I have noticed that including many small files is much slower than including a single file that contains all the same code.
My test involved creating 1025 files: 1024 of them contained the text <?php class cls$i {} ?> (where $i was the file number), and 1 was a concatenation of all the others. I then had two functions, one for testing each case. The test that loaded the single file took about 6 ms to compile to bytecode and make the contents available to the system, whereas including the 1024 individual files took almost 600 ms.
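For reference, the test was structured roughly like this (a simplified sketch, not my exact script; the paths and layout are illustrative, and each case is timed in a separate CLI run so the same classes are never declared twice in one process):

    <?php
    // Rough sketch of the benchmark: generate the files, then run
    // `php bench.php many` or `php bench.php one` as separate invocations.

    $dir = __DIR__ . '/include_test';   // illustrative location for the test files
    @mkdir($dir);

    // Generate 1024 tiny class files plus one file that concatenates them all.
    $combined = '';
    for ($i = 1; $i <= 1024; $i++) {
        file_put_contents("$dir/cls$i.php", "<?php class cls$i {} ?>");
        $combined .= "<?php class cls$i {} ?>";
    }
    file_put_contents("$dir/all.php", $combined);

    $mode  = isset($argv[1]) ? $argv[1] : 'many';
    $start = microtime(true);

    if ($mode === 'many') {
        // Case 1: include the 1024 individual files.
        for ($i = 1; $i <= 1024; $i++) {
            include "$dir/cls$i.php";
        }
    } else {
        // Case 2: include the single concatenated file.
        include "$dir/all.php";
    }

    printf("%s: %.1f ms\n", $mode, (microtime(true) - $start) * 1000);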
The 1024 individual files add up to exactly the same size as the single file. I am also running APC to cache the bytecode, but in practice it only shaves off a few milliseconds.
I also created a ramdisk to hold all the files, but that was only marginally faster (about 10 ms on average).
So, having said that, why are individual files so much slower than a single file? Is it down to significant inefficiencies in PHP's loading engine, or have I made a considerable cockup in my configuration (a standard AMPPS installation on my local system)?