
I have Celery worker processes that are restarted every day or so. They run Python/Django code.

I have set up certain quasi-global values that should persist in memory for the lifetime of the process. Specifically, I have MySQL querysets that change rarely, so they are evaluated once and stored as a CONSTANT as soon as the process starts (a contrived example being PROFILE = Profile.objects.get(user_id=5)).
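Roughly, the setup looks like this (the module layout and model are just illustrative):

# constants.py -- imported once, when the worker process starts
from myapp.models import Profile

# rarely-changing data, evaluated a single time at import
PROFILE = Profile.objects.get(user_id=5)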

Let's say I want to reset this value inside the Celery process without exec-ing a whole new program.

This value is imported (and used) in a number of different modules. I'm assuming I'd have to go through every module in sys.modules that imports the CONSTANT and delete/reset the name in each one? Is that right?

This seems very hacky. I usually use an external service like Memcached to coordinate shared state among multiple processes, but every once in a while local memory is preferable to over-the-network calls to a NoSQL store.


1 Answer


It's a bit hard to say without seeing some code, but an import just binds a name to an object, exactly like a variable assignment: as long as everything looks the value up through the same place, a change is visible everywhere. Naturally, this only works if what you've imported is the parent context (the module itself); otherwise assignment rebinds the module's attribute to a new object while your imported name still refers to the old value.

In other words, if you do this:

from mypackage import mymodule

# the attribute is looked up through the module on every access
do_something_with(mymodule.MY_CONSTANT)

# elsewhere
mymodule.MY_CONSTANT = 'new_value'

then all references to mymodule.MY_CONSTANT will see the new value. But if you did this:

# this copies the current binding into the importing module's namespace
from mypackage.mymodule import MY_CONSTANT

# elsewhere
mymodule.MY_CONSTANT = 'new_value'

the original reference won't see the new value: mymodule.MY_CONSTANT has been rebound to a different object, but the MY_CONSTANT name in the importing module is still pointing at the old value.
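So, rather than walking sys.modules, the usual approach is to always access the value through its module and rebind the module attribute when you want to refresh it. A rough sketch, with hypothetical module and helper names:

# constants.py
from myapp.models import Profile

PROFILE = Profile.objects.get(user_id=5)

def refresh():
    # rebind the module attribute; anything reading constants.PROFILE
    # afterwards gets the new object
    global PROFILE
    PROFILE = Profile.objects.get(user_id=5)

# consumer.py
from myapp import constants  # import the module, not the name

def do_work():
    return constants.PROFILE  # looked up through the module at call time

# in a celery task, when you want to reset the cached value:
constants.refresh()

The only requirement is that every consumer does "from myapp import constants" rather than "from myapp.constants import PROFILE".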

answered Jul 13, 2012 at 16:18