A logger for multi-threaded Python that does not make Python segfault

published Aug 26, 2015 11:19 by admin (last modified Aug 26, 2015 11:19)

I introduced logging to a multi-threaded Python application, and it started to segfault. I switched to ConcurrentLogHandler, but it still segfaulted. I then switched to logging over HTTP with restapi-logging-handler, which worked but introduced a bit of complexity. Eventually I decided to take another sweep at PyPI to see if there wasn't a simpler solution, and found this:

multiprocessing-logging 0.2.1 : Python Package Index

It seems to work like a charm in initial tests. I use it like this (abbreviated code):

import logging.handlers
import multiprocessing_logging
import os

# Rotate the log file at ~20 MB; the 'logs' directory must already exist
log_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'logs', 'app.log')
mhandler = multiprocessing_logging.MultiProcessingHandler(
    'worker-logger',
    sub_handler=logging.handlers.RotatingFileHandler(
        log_file, maxBytes=20000000, backupCount=0))
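
Continuing from the snippet above, here is a minimal sketch of how the handler might be attached and exercised from several threads. The root-logger wiring and the worker function are my own illustration, not part of the original code:

import threading

# Attach the multiprocessing-safe handler to the root logger
root = logging.getLogger()
root.addHandler(mhandler)
root.setLevel(logging.INFO)

def worker(n):
    # All threads log through the same handler; multiprocessing-logging
    # funnels records through an internal queue, so only a single
    # consumer actually writes to the file.
    logging.info('hello from worker %d', n)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

With this in place, log calls from any thread end up in the single rotating file without the handlers stepping on each other.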