Honestly, I can't quite imagine the task you need this for. )) But usually, when you need to ensure uniqueness of some sort of sets, you use hashes.
Let's say, for example, we have 900 arrays of the following structure:
$arr1 = [n1, n2, n3, n4, n5, n6];
$arr2 = [m1, m2, m3, m4, m5, m6];
...
And we need another array, which shouldn't be the same as any of these.
The solution, I think, is to create a metastructure that stores these arrays alongside their hashes. Something like...
$arrayCollection = array(
    hash($arr1) => $arr1,
    hash($arr2) => $arr2,
    ...
);
Then, when a new set is generated, I'd compute its hash too and just check whether an element with the same hash already exists in my collection. Like this:
do {
    $newArr = generateArray();
    $newArrHash = hash($newArr);
} while (isset($arrayCollection[$newArrHash]));
It would be quite fast, much faster than comparing these sets element by element again and again. )
As for the hash function, it can be as simple as...
function hash(array $arr) {
    return implode('|', $arr);
}
... or you may need to wrap it in some digest function (md5($x, true) will do just fine, I suppose) if $arr may contain some very large strings/numbers. Two caveats, though: in real code you'd have to pick another name, since hash() is already a PHP built-in, and the delimiter must not be able to appear inside the values themselves, or two different arrays could produce the same string.
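Putting it all together, here's a minimal runnable sketch. The names arrayHash() and generateArray() are just illustrative (arrayHash instead of hash, to avoid clashing with the PHP built-in), and the generator simply fills the arrays with random digits, matching the 6-element example above:

```php
<?php

// Hash an array of scalars into a fixed-size string key.
function arrayHash(array $arr): string
{
    return md5(implode('|', $arr));
}

// Hypothetical generator: 6 random digits, as in the example above.
function generateArray(): array
{
    $arr = [];
    for ($i = 0; $i < 6; $i++) {
        $arr[] = rand(0, 9);
    }
    return $arr;
}

// Build the collection of ~900 arrays, keyed by hash.
$arrayCollection = [];
for ($i = 0; $i < 900; $i++) {
    $arr = generateArray();
    $arrayCollection[arrayHash($arr)] = $arr;
}

// Keep generating until we get an array that isn't in the collection yet,
// then store it under its hash. isset() on a key is an O(1) lookup.
do {
    $newArr = generateArray();
    $newArrHash = arrayHash($newArr);
} while (isset($arrayCollection[$newArrHash]));

$arrayCollection[$newArrHash] = $newArr;
```

The md5() wrapping also sidesteps the delimiter problem for plain digits, since every key ends up the same length regardless of the values.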