Logging module not writing to file


Solution 1

You can try running this snippet in your main file.

import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s [%(levelname)s] - %(message)s',
    filename='filename.txt')  # pass an explicit filename here
logger = logging.getLogger()  # get the root logger
logger.warning('This should go in the file.')
print(logger.handlers)  # you should see one FileHandler object

Solution 2

I added the following lines before logging.basicConfig() and it worked for me.

for handler in logging.root.handlers[:]:
    logging.root.removeHandler(handler)
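A complete sketch combining this loop with basicConfig(); the filename here is hypothetical:

```python
import logging

# Remove any handlers already attached to the root logger;
# otherwise basicConfig() silently does nothing.
for handler in logging.root.handlers[:]:
    logging.root.removeHandler(handler)

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s [%(levelname)s] - %(message)s',
    filename='app.log')  # hypothetical filename

logging.warning('This now goes to app.log.')
```

Iterating over a copy of the list (`[:]`) matters: removing items from a list while iterating over it directly would skip handlers.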

Solution 3

If you are using the root logger, which by default has the name "", you can use this trick:

logging.getLogger().setLevel(logging.INFO)  # set the level on the root logger
logger = logging.getLogger('')  # '' also refers to the root logger
logger.handlers = []  # drop any handlers that were already attached

In addition, you may want to set the logging level as in the code above; this level will be inherited by all descendant loggers.
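To illustrate how the root level propagates, a small sketch (the logger name is hypothetical):

```python
import logging

logging.getLogger().setLevel(logging.INFO)  # set level on the root logger

child = logging.getLogger('my_app.db')  # hypothetical descendant logger
# The child has no level of its own (NOTSET), so its effective level
# is looked up the hierarchy and inherited from the root logger.
print(child.getEffectiveLevel() == logging.INFO)  # True
```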

If instead you are using a particular named logger, do:

logger = logging.getLogger('my_service')
logger.handlers = []  # clear any existing handlers
# create file handler
fh = logging.FileHandler(log_path)
fh.setLevel(logging.INFO)
# create console handler
ch = logging.StreamHandler()
ch.setLevel(logging.INFO)
logger.addHandler(fh)
logger.addHandler(ch)
logger.info('service started')

The above code creates a new logger named 'my_service'. If that logger already exists, it first clears all its handlers, then adds handlers for writing to the specified file and to the console. See the official documentation as well.

You can also use hierarchical loggers; this works directly:

logger = logging.getLogger('my_service.update')
logger.info('updated successfully')

Solution 4

In addition to Forge's answer on using logging.basicConfig(), Python 3.8 added a force parameter to basicConfig(). To quote the docs:

"""
This function does nothing if the root logger already has handlers 
configured, unless the keyword argument *force* is set to ``True``.
...
force     If this keyword  is specified as true, any existing handlers
          attached to the root logger are removed and closed, before
          carrying out the configuration as specified by the other
          arguments.
"""

This is why yue dong's answer (removing all handlers) has worked for some, as has Alexander's answer of resetting logger.handlers to [].

Debugging logger.handlers (as Forge's answer suggests) led me to find a single StreamHandler there, so basicConfig() did nothing for me until I passed force=True.

Hope that helps in addition to all the other answers here too!

Author by flybonzai

Updated on August 14, 2021

Comments

  • flybonzai
    flybonzai over 2 years

    I'm using logging module, and I've passed in the same parameters that I have on other jobs that are currently working:

    import logging
    from inst_config import config3
    
    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s [%(levelname)s] - %(message)s',
        filename=config3.GET_LOGFILE(config3.REQUESTS_USAGE_LOGFILE))
    logging.warning('This should go in the file.')
    
    if __name__ == '__main__':
        logging.info('Starting unload.')
    

    Using this method to create the filename:

    REQUESTS_USAGE_LOGFILE = r'C:\RunLogs\Requests_Usage\requests_usage_runlog_{}.txt'.format(
            CUR_MONTH)
    def GET_LOGFILE(logfile):
        """Truncates file and returns."""
        with open(logfile, 'w'):
            pass
        return logfile
    

    When I run it, however, it is creating the file, and then still outputting the logging info to the console. I'm running in Powershell.

    Just tried putting it inside the main statement like this:

    if __name__ == '__main__':
        logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s [%(levelname)s] - %(message)s',
        filename=config3.GET_LOGFILE(config3.REQUESTS_USAGE_LOGFILE))
    
        logging.warning('This should go in the file.')
    

    Still no luck.

  • flybonzai
    flybonzai about 8 years
    I didn't include my imports, but it is indeed loaded at the top of the file. I've added my imports to the OP.
  • flybonzai
    flybonzai about 8 years
    Still didn't work. It's not creating the file since it's never writing to it. Could the problem be powershell? The weird thing is I have other jobs with the exact same configuration that run without a hitch.
  • John Gordon
    John Gordon about 8 years
    You say here that it's not creating the file, but in the main post you say When I run it, however, it is creating the file. Which is it?
  • flybonzai
    flybonzai about 8 years
    That led me to the solution @Forge, for some reason it defaulted to a streaming handler this time around, so I added a fileHandler.
  • Guy Avraham
    Guy Avraham over 4 years
    @yue dong - Can you explain why this solved the problem (p.s. - it solved for me as well) ?
  • wl2776
    wl2776 almost 4 years
    @yue dong, I'm second. Could you explain, why removing handlers from root logger helps?
  • Joel Carneiro
    Joel Carneiro over 3 years
    It works, but what kind of dark magic is in there?
  • Sriram
    Sriram almost 3 years
    Thanks, but this behavior is more like a logging module bug than anything?
  • Mr Pablo
    Mr Pablo almost 3 years
    This didn't work for me and instead broke my code :/
  • Clumsy cat
    Clumsy cat almost 2 years
    So I have a program that often crashed with a segfault (due to c++ externals), and I need to do this. I suspect it's something that's needed when the code doesn't exit cleanly or with a regular python error.