Thoughts and projects from an infrastructure engineer
Painless Instrumentation of Celery Tasks Using Statsd and Graphite
For one of my clients and side projects, we’ve been working hard to build application-level metrics into our wide portfolio of services. Among these services is one built on top of the Celery distributed task queue. We wanted a system that required as little configuration as possible to publish new metrics, so we decided on statsd and graphite. Getting statsd and graphite running was the easy part, but we needed a quick, painless way to add instrumentation code for the most basic metrics to our Celery-backed service.
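For context, the decorator shown later reads the statsd location out of the Celery app configuration. A minimal sketch of those settings might look like this; the host and port values here are placeholders for your own statsd instance, not a prescribed setup:

```python
# Hypothetical Celery settings module entries; the decorator reads these
# via current_app.conf['STATSD_HOST'] and current_app.conf['STATSD_PORT'].
STATSD_HOST = "localhost"
STATSD_PORT = 8125  # statsd's conventional UDP port
```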
For us, those basic metrics consisted of:
Number of times a worker starts on a specific task
Number of times a task raises an exception
Number of times a task completes successfully (no exceptions)
How long each task takes to complete
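Stripped of statsd, those four metrics reduce to a simple wrapper pattern. Here is a hypothetical, dependency-free sketch using an in-memory dict and `time.perf_counter` just to show the shape; it is not the production code, which follows below:

```python
import time
from functools import wraps

# Hypothetical in-memory stand-in for a statsd backend.
metrics = {"started": 0, "success": 0, "exceptions": 0, "durations": []}

def count_task(func):
    """Record starts, successes, exceptions, and duration for func."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        metrics["started"] += 1
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
        except Exception:
            metrics["exceptions"] += 1
            raise
        else:
            metrics["success"] += 1
            metrics["durations"].append(time.perf_counter() - start)
            return result
    return wrapper

@count_task
def add(a, b):
    return a + b

add(1, 2)
```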
Since the code to enable these metrics just wraps the code being instrumented, it seemed only natural to use a decorator. Below is the code I wrote to do just that.
```python
"""Decorator to quickly add statsd (graphite) instrumentation to Celery
task functions.

With some slight modification, this could be used to instrument just
about any (non-celery) function and be made abstract enough to customize
metric names, etc.

Stats reported include the number of times the task was accepted by a
worker (`started`), the number of successes, and the number of times the
task raised an exception. In addition, it also reports how long the task
took to complete.

Usage:

>>> @task
>>> @instrument_task
>>> def mytask():
>>>     # do stuff
>>>     pass

Please note that the order of decorators is important to Celery. See
http://ask.github.com/celery/userguide/tasks.html#decorating-tasks
for more information.

Uses `simple_decorator` from
http://wiki.python.org/moin/PythonDecoratorLibrary#Property_Definition

Limitation: does not readily work on subclasses of celery.tasks.Task
because it always reports `task_name` as 'run'.
"""
# statsd instrumentation
from celery import current_app

import statsd


def simple_decorator(decorator):
    """Borrowed from:
    http://wiki.python.org/moin/PythonDecoratorLibrary#Property_Definition

    Original docstring: This decorator can be used to turn simple
    functions into well-behaved decorators, so long as the decorators are
    fairly simple. If a decorator expects a function and returns a
    function (no descriptors), and if it doesn't modify function
    attributes or docstring, then it is eligible to use this. Simply apply
    @simple_decorator to your decorator and it will automatically preserve
    the docstring and function attributes of functions to which it is
    applied.
    """
    def new_decorator(f):
        g = decorator(f)
        g.__name__ = f.__name__
        g.__module__ = f.__module__  # or celery throws a fit
        g.__doc__ = f.__doc__
        g.__dict__.update(f.__dict__)
        return g

    # Now a few lines needed to make simple_decorator itself
    # be a well-behaved decorator.
    new_decorator.__name__ = decorator.__name__
    new_decorator.__doc__ = decorator.__doc__
    new_decorator.__dict__.update(decorator.__dict__)
    return new_decorator


@simple_decorator
def instrument_task(func):
    """Wraps a celery task with statsd instrumentation code."""
    def instrument_wrapper(*args, **kwargs):
        stats_conn = statsd.connection.Connection(
            host=current_app.conf['STATSD_HOST'],
            port=current_app.conf['STATSD_PORT'],
            sample_rate=1)
        task_name = func.__name__
        counter = statsd.counter.Counter('celery.tasks.status', stats_conn)
        counter.increment('{task_name}.started'.format(**locals()))
        timer = statsd.timer.Timer('celery.tasks.duration', stats_conn)
        timer.start()
        try:
            ret = func(*args, **kwargs)
        except:
            counter.increment('{task_name}.exceptions'.format(**locals()))
            raise
        else:
            counter.increment('{task_name}.success'.format(**locals()))
            timer.stop('{task_name}.success'.format(**locals()))
            return ret
        finally:
            try:
                del timer
                del counter
                del stats_conn
            except:
                pass
    return instrument_wrapper
```
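The metadata-preserving behavior of `simple_decorator` is the part Celery actually depends on, since it registers tasks by function name. A quick self-contained check, using a toy decorator (`noisy`, a hypothetical stand-in for `instrument_task` so it runs without statsd or a Celery app):

```python
def simple_decorator(decorator):
    """Make a simple decorator preserve name, docstring, and attributes."""
    def new_decorator(f):
        g = decorator(f)
        g.__name__ = f.__name__
        g.__module__ = f.__module__
        g.__doc__ = f.__doc__
        g.__dict__.update(f.__dict__)
        return g
    new_decorator.__name__ = decorator.__name__
    new_decorator.__doc__ = decorator.__doc__
    new_decorator.__dict__.update(decorator.__dict__)
    return new_decorator

@simple_decorator
def noisy(func):
    """Toy decorator standing in for instrument_task."""
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@noisy
def mytask():
    """Original docstring."""
    return 42
```

With plain `noisy` (no `@simple_decorator`), `mytask.__name__` would be `'wrapper'` and Celery would register every instrumented task under the same name; with it, the original name and docstring survive.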