
I've got a server hosting a live traffic-log database that holds a big stats table. Now I need to create a smaller table from it holding, say, the last 30 days.

This server also has a slave that replicates the data and runs about 5 seconds behind the master. I created the slave to offload SELECT queries, so the master only handles INSERT/UPDATE for the traffic log.

Now I need to copy the last day into the smaller table, still without touching the "real" DB, so I want to SELECT from the slave and INSERT into the real server's smaller table. (The slave only allows read operations.)

I am working with PHP, and I can't see how to solve this with a single query that spans two different database servers. If it's possible, please let me know how.
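To show what I mean by the two-query route, here is a rough sketch of the two-connection approach I'm considering (the host names, credentials, and the table/column names `stats`, `stats_30d`, `ts`, `page`, `hits` are all made up):

```php
<?php
// The copy cannot be one SQL statement across two servers, so it takes
// two connections: a SELECT on the slave and an INSERT on the master.
$selectSql = "SELECT ts, page, hits FROM stats WHERE ts >= NOW() - INTERVAL 1 DAY";
$insertSql = "INSERT INTO stats_30d (ts, page, hits) VALUES (?, ?, ?)";

// The execution part would look roughly like this (commented out,
// since the connection details are placeholders):
// $slave  = new PDO('mysql:host=slave-host;dbname=traffic',  'reader', 'pw');
// $master = new PDO('mysql:host=master-host;dbname=traffic', 'writer', 'pw');
// $ins = $master->prepare($insertSql);
// foreach ($slave->query($selectSql) as $row) {
//     $ins->execute([$row['ts'], $row['page'], $row['hits']]);
// }
```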

When using two queries I need to hold the last day's rows in a PHP MySQL result object. At 300K-650K rows, that starts to become a memory problem. I would select in chunks by ID (setting the ids in the WHERE clause), but I don't have an auto-increment id field and the rows have no id at all (for traffic data, an id column would take a lot of space).

So I am trying this idea and I would like to get a second opinion.

If I take the whole last day at once (300K rows) it will exhaust PHP's memory. I can use LIMIT chunks, or a new idea: selecting one column at a time and copying each column into the new real table. But I don't know if the second method is possible. Does INSERT look for the first open space at the column level or the row level? The main idea is reducing the size of each SELECT, so is it possible to SELECT column by column and then INSERT the columns one at a time in MySQL?
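For the LIMIT-chunk route, the windowing itself needs no id column; a minimal sketch (chunk size and row count are just examples):

```php
<?php
// Compute LIMIT/OFFSET windows for copying rows in fixed-size chunks.
function chunkWindows(int $totalRows, int $chunkSize): array {
    $windows = [];
    for ($offset = 0; $offset < $totalRows; $offset += $chunkSize) {
        // Each window is [offset, count] for "LIMIT $offset, $count".
        $windows[] = [$offset, min($chunkSize, $totalRows - $offset)];
    }
    return $windows;
}

// e.g. 300000 rows in 2000-row chunks -> 150 windows
print count(chunkWindows(300000, 2000)) . "\n";
```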


2 Answers


If this is simply a PHP memory problem, try using PDO to fetch one result row at a time instead of fetching everything at once.

From the PDO documentation on PHP.net:

<?php
function getFruit($conn) {
    $sql = 'SELECT name, color, calories FROM fruit ORDER BY name';
    // Iterating the PDOStatement fetches one row per loop pass.
    foreach ($conn->query($sql) as $row) {
        print $row['name'] . "\t";
        print $row['color'] . "\t";
        print $row['calories'] . "\n";
    }
}
?>
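One caveat, as a configuration sketch: with the MySQL driver, PDO buffers the whole result set into PHP memory by default, so if memory is the concern you may also want to turn buffering off so rows stream from the server one at a time (connection details below are placeholders):

```php
<?php
// Placeholder DSN/credentials; adjust for your slave server.
$conn = new PDO('mysql:host=localhost;dbname=traffic', 'user', 'pass');
// Disable result buffering so large SELECTs are streamed, not held in RAM.
$conn->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
```

Note that with an unbuffered query you must finish (or close) the result set before issuing another query on the same connection.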
Answered 2012-12-26T21:58:19.557

Well, here's where PHP starts to get weird. I took your advice and started using chunks for the data: I used a loop that advances the LIMIT in jumps of 2,000 rows. The interesting part is that once I started using PHP's memory-usage and memory-peak functions, I found out why the chunk method fails in a large loop: assigning a new value to a variable did not free the memory the variable held before the new assignment. So to keep memory down in PHP you have to use unset() or set the variable to null.
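A minimal way to check this observation (the array stands in for one 2,000-row chunk; sizes and the threshold are arbitrary):

```php
<?php
// Release each chunk explicitly inside the loop with unset() and
// compare memory usage before and after.
$before = memory_get_usage();
for ($i = 0; $i < 5; $i++) {
    $chunk = range(1, 100000);   // stand-in for one chunk of rows
    // ... process $chunk here ...
    unset($chunk);               // free the chunk before the next iteration
}
$after = memory_get_usage();
// With unset() inside the loop, usage returns close to the baseline.
print(($after - $before) < 1000000 ? "memory released\n" : "memory retained\n");
```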

Answered 2013-01-05T17:24:36.820