
I am running an R script over multiple cores that inserts data into a table which already holds over 700m rows. Because the table has become so large, deadlocks are a common occurrence, all the more so because I'm running the same script on several cores at once. I've been trying to work around these deadlocks with tryCatch, but to no avail: the script still crashes and I have to rerun it. Is there any function in RMySQL itself to counter deadlocks, or does anyone have advice on another way to work around them?

This is the code I was using to try and avoid it, but it is in no way pretty (and doesn't even work). The idea is to keep retrying the insert until it succeeds, but a deadlock still crashes the script altogether.

    done <- FALSE
    while (!done) {
      tryCatch({
        dbSendQuery(con, SQLrs)   # attempt the insert
        done <- TRUE              # only reached if the query succeeded
      }, error = function(e) {    # the handler has to be named "error"
        print("failed, try again")
      })
    }
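For reference, a version that only retries when the failure actually looks like a deadlock might run along these lines. This is only a sketch: it reuses `con` and `SQLrs` from above, assumes the MySQL deadlock message ("Deadlock found when trying to get lock", error 1213) shows up in the condition message, and the retry limit and pause are arbitrary.

    # Sketch: retry only on apparent deadlocks, re-raise anything else.
    attempt <- 1
    repeat {
      ok <- tryCatch({
        dbSendQuery(con, SQLrs)
        TRUE
      }, error = function(e) {
        # MySQL reports deadlocks as error 1213 ("Deadlock found ...");
        # matching on the message text is an assumption here.
        if (grepl("Deadlock found", conditionMessage(e))) FALSE else stop(e)
      })
      if (ok || attempt >= 10) break   # give up after 10 attempts
      attempt <- attempt + 1
      Sys.sleep(2)                     # let competing transactions finish
    }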

1 Answer


Sorry to be a bit late to your problem, but I had a similar issue and was able to solve it by adjusting the MySQL timeout in SQL. I don't think there is anything you can do through RMySQL itself. See this Stack Overflow post on how to adjust the MySQL connect_timeout.
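As a rough illustration (not taken from the answer itself), inspecting and raising a timeout over the same connection could look like the sketch below. Which variable is the right knob depends on your setup; naming `innodb_lock_wait_timeout` here is an assumption on my part, since it is the setting tied to lock waits, whereas the answer refers to `connect_timeout`.

    # Sketch only: inspect the current timeout settings ...
    dbGetQuery(con, "SHOW VARIABLES LIKE 'connect_timeout'")
    dbGetQuery(con, "SHOW VARIABLES LIKE 'innodb_lock_wait_timeout'")
    # ... and raise the lock-wait timeout for this session (assumption:
    # this is the timeout that matters when inserts block each other).
    dbSendQuery(con, "SET SESSION innodb_lock_wait_timeout = 120")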

answered 2015-04-24T17:51:21.813