
Business:
I ran into a problem: when working with large datasets through the Django ORM, the canonical way is to load and save every single object individually. Of course, this is very inefficient, so I decided to use raw SQL.
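
The per-object pattern I mean is roughly this (a minimal sketch only, to show the inefficiency; the loop below is illustrative, not my real code):

# Naive per-row update through the ORM: one save() and therefore one UPDATE per object.
for obj in Model.objects.all():
    obj.field_to_modify = produce_some_differentiated_data()  # placeholder for the real computation
    obj.save()  # issues a separate UPDATE query for every row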

Substance:
I have basic code that builds an SQL query to update the rows of a table and then commits it:

from myapp import Model
from django.db import connection, transaction

COUNT = Model.objects.count()
MYDATA = produce_some_differentiated_data()  # produces the value to write into each row

cursor = connection.cursor()
statements = []                      # don't shadow the built-in str
for i in xrange(1, COUNT + 1):       # primary keys run from 1 to COUNT inclusive
    statements.append("UPDATE database.table\n"
                      "SET field_to_modify={}\n"
                      "WHERE primary_key_field={};\n".format(MYDATA, i))

sql = ''.join(statements)            # one big multi-statement string
cursor.execute(sql)
transaction.commit_unless_managed()  # this raises the exception

And on the last statement I get the error below, even when COUNT is small:

_mysql_exceptions.ProgrammingError: (2014, "Commands out of sync; you can't run this command now")

Maybe Django does not allow executing multiple SQL statements in a single query?

PS: Closing the cursor before committing avoids the exception, but is that correct?
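
For comparison, the per-statement variant would look like this (just a sketch reusing the statements list from my code above); it avoids packing multiple statements into one execute() call:

# Sketch: same statements as above, but executed one at a time
# instead of being concatenated into a single multi-statement string.
for stmt in statements:
    cursor.execute(stmt)
transaction.commit_unless_managed()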

My expectations:
I'm looking for any solid solution for bulk operations (preferably within Django). I don't care whether it is ORM or raw SQL; I would stick with the code I pasted above if I could avoid the error. If there is no solution, it would at least be good, just out of curiosity, to know the reason for this exception.

What I have learned besides answers:
Django 1.4 introduced bulk_create for efficient multiple INSERT operations.
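
For reference, a minimal bulk_create sketch (the values are made up for illustration):

# Sketch: insert many rows with a single query instead of one INSERT per object.
Model.objects.bulk_create([
    Model(field_to_modify=value)
    for value in ("a", "b", "c")   # illustrative values only
])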


3 Answers


Django 1.4+ has pretty decent support for bulk operations in its ORM, and you should see whether you can use it - it's the most portable way and pretty nice to work with, too.

It allows not only setting the same value for a field on all objects (that's trivial), but also updating field values based on other fields, as well as performing some limited calculations. I am not sure whether it fits your need (it depends on how "produce_some_differentiated_data" works): some calculations can be expressed this way, some probably cannot. An example:

from django.db.models import F

image_id_list = [1, 5, 6]
Image.objects.filter(image_id__in=image_id_list) \
     .update(views_number=F('views_number') + 1)

The above example will convert into SQL similar to:

UPDATE image SET views_number = views_number + 1 WHERE image_id IN (1,5,6);

This is the fastest way of doing a bulk update - much faster than running many separate queries. Packing multiple queries into one SQL string does not really improve the speed of the operation; what does improve it is issuing a single query, like the one above, that operates on many rows at once. You can build fairly complex formulas in the update statement, so it is best if your "produce_some_differentiated_data" method can be expressed this way. Even if it cannot be expressed directly, you can probably modify the model and add some extra fields to make that possible, which may pay off if such bulk operations are executed often.
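
As a hedged sketch of that last idea (the per_row_factor field and the formula are invented for illustration; they stand in for whatever "produce_some_differentiated_data" computes):

from django.db.models import F

# Hypothetical: a precomputed per-row input stored on the model as 'per_row_factor',
# so the per-row result can be expressed as one F() formula in a single UPDATE.
Model.objects.update(field_to_modify=F('per_row_factor') * 2 + 1)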

From Django's documentation:

Django supports the use of addition, subtraction, multiplication, division and modulo arithmetic with F() objects, both with constants and with other F() objects.

More about it here: https://docs.djangoproject.com/en/dev/topics/db/queries/#updating-multiple-objects-at-once

answered 2012-12-29T12:50:07

Use cursor.executemany(query, param_list) if you need raw SQL.

param_list = [("something_1", 1), ("something_2", 2), ...]
# or numeric values like [(some_number_1, 1), ...]. Quotes around each substituted
# "%s" and all the necessary escaping are added automatically for string parameters.

cursor.executemany("""UPDATE database.table
                      SET field_to_modify=%s
                      WHERE primary_key_field=%s""",
                   param_list)

It has many advantages:

  • The query string is much shorter than one giant query and can be parsed/optimized quickly, without the database planner spending resources on it unnecessarily. (If you interpolate the parameters into the SQL yourself, you get many different SQL commands that must each be analyzed individually.)
  • Interpolating values into SQL strings by hand is bad practice, because it opens the door to SQL injection if you don't correctly escape unexpected apostrophes and backslashes in user input. (See the sketch after this list.)
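
To connect this to the code in the question, the parameter list could be built roughly like this (a sketch; it assumes "produce_some_differentiated_data" can yield one value per call, which the question does not show):

# Sketch: build one (value, primary_key) tuple per row for the query above;
# the database driver handles all quoting and escaping of the values.
param_list = [(produce_some_differentiated_data(), i)   # assumed to yield a per-row value
              for i in xrange(1, COUNT + 1)]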

It is an undocumented method, although both execute(self, sql, params=()) and executemany(self, sql, param_list) have been supported on cursor objects by all the native db backends (mysql, postgresql_psycopg2, sqlite3, oracle) for a long time, from Django 0.96 up to the current 1.5 beta. A useful related answer is https://stackoverflow.com/a/6101536/448474 .

The executemany method has had two issues related to exception handling fixed in past years, so verify for your Django version that you get helpful error messages if you intentionally cause a database exception (too many %s placeholders, too few, etc.). Still, a few minutes of initial tinkering and testing is faster than many hours of waiting on slow methods.

answered 2012-12-29T16:47:56

Have you tried transactions?

https://docs.djangoproject.com/en/dev/topics/db/transactions/

You need something like this:

from django.db import transaction

@transaction.commit_manually
def viewfunc(request):
    for row in rows:
        row.modify()          # change each row however you need
    transaction.commit()      # one commit for the whole batch

The problem is not the ORM overhead but the database round trips: avoid so many separate calls and instead execute a few commits of several rows each.

In the example above you could split the rows in half, into thirds, tenths, etc.
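
A hedged sketch of that chunking idea (the function name, the rows iterable and the chunk size of 500 are only illustrative):

from django.db import transaction

@transaction.commit_manually
def update_in_chunks(rows, chunk_size=500):     # illustrative name and chunk size
    for n, row in enumerate(rows, start=1):
        row.modify()                            # change each row however you need
        if n % chunk_size == 0:
            transaction.commit()                # commit every chunk_size rows
    transaction.commit()                        # commit the final partial chunk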

answered 2012-12-28T16:51:51