Java: How to log raw JSON as JSON and avoid escaping during logging with logback / slf4j


Solution 1

Logback doesn't do anything unusual with JSON; a message is just a string that gets logged as-is. Any escaping is most likely happening on your end, unless you're using some kind of JSON appender that writes output in that format. Logback itself doesn't ship such an appender, so if that's the problem, look at wherever the appender came from. An SSCCE would help with further troubleshooting.

Solution 2

If you have JSON-formatted messages, the solutions above work, but they are not very clean, since you don't want to call Logstash-specific code every time you use your logger.

Just adding a

net.logstash.logback.encoder.LogstashEncoder

is not enough, since the message itself stays escaped. To solve this, try the following in your logback.xml:

<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
        <timestamp/>
        <version/>
        <loggerName/>
        <pattern>
            <pattern>
                {
                "jsonMessage": "#asJson{%message}"
                }
            </pattern>
        </pattern>
    </providers>
</encoder>

The #asJson pattern will unescape your message.
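To see concretely what is being undone here, below is a minimal, Logback-free sketch of the difference between embedding a JSON message as an escaped string value and splicing it in raw. The class and method names are made up for illustration; the escaping logic mimics (in simplified form) what a JSON encoder does when it treats the message as an opaque string.

```java
public class EscapeDemo {

    // Simplified version of what happens when a JSON string is embedded
    // as a plain string value: backslashes and quotes get escaped.
    public static String asEscapedValue(String message) {
        return "\"" + message.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    public static void main(String[] args) {
        String jsonMessage = "{\"text\":\"hello\"}";

        // Escaped: the message becomes one big string value.
        System.out.println("{\"@message\":" + asEscapedValue(jsonMessage) + "}");
        // prints {"@message":"{\"text\":\"hello\"}"}

        // Raw (the effect of #asJson or appendRaw): spliced in as JSON.
        System.out.println("{\"jsonMessage\":" + jsonMessage + "}");
        // prints {"jsonMessage":{"text":"hello"}}
    }
}
```

The first output line is the escaped form the question complains about; the second is the nested JSON object you get once the encoder stops treating the message as a plain string.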

Solution 3

Use the RawJsonAppendingMarker:

log.trace(net.logstash.logback.marker.Markers.appendRaw("jsonMessage", jsonString), null);

Solution 4

I ran into the same problem. I solved it with

<encoder class="net.logstash.logback.encoder.LogstashEncoder">
</encoder>

instead of

<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
</encoder>

In my java code I used:

SRV_PERF_LOGGER.info(net.logstash.logback.marker.Markers.appendRaw("message", jackson.writeValueAsString(dto)), null);

Solution 5

Here is an updated (2016) Groovy Logback config that writes your logs as JSON to a file, and debug lines to the console. It took me all day to figure out, so I thought I'd update the thread.

    import ch.qos.logback.classic.encoder.PatternLayoutEncoder
    import ch.qos.logback.core.ConsoleAppender
    import ch.qos.logback.core.rolling.FixedWindowRollingPolicy
    import ch.qos.logback.core.rolling.RollingFileAppender
    import ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy
    import net.logstash.logback.encoder.LogstashEncoder

    import static ch.qos.logback.classic.Level.INFO
    import static ch.qos.logback.classic.Level.WARN

    def PROJECT_ID = "com.foo"

    appender("file", RollingFileAppender) {
        file = "/tmp/logs/${PROJECT_ID}.json"
        encoder(LogstashEncoder)
        rollingPolicy(FixedWindowRollingPolicy) {
            maxIndex = 1
            fileNamePattern = "logs/${PROJECT_ID}.json.%i"
        }
        triggeringPolicy(SizeBasedTriggeringPolicy) {
            maxFileSize = "1MB"
        }
    }


    appender("STDOUT", ConsoleAppender) {
        encoder(PatternLayoutEncoder) {
            pattern = "%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n"
        }
    }

    logger("com.foo", INFO, ["STDOUT", "file"], false)

    root(WARN, ["STDOUT", "file"])
Author: kei1aeh5quahQu4U (updated on June 05, 2022)

Comments

  • kei1aeh5quahQu4U (almost 2 years ago)

    I'm using SLF4J with Logback in a JAX-RS application... I want to log to JSON in such a way that my message is not encoded again but printed raw into the logfile:

    At the moment it looks like this:

    {"@timestamp":1363834123012,"@message":"{\"text\":\"From MLK to Barack Ob...\n\"}"}


    But I want to have this:

    {"@timestamp":1363834123012,"@message":{"text":"From MLK to Barack Ob...\n"}}
    

    The reason is I want to parse the JSON again and want to avoid the unescaping of the data.
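To illustrate that extra work: a consumer of the escaped form has to strip one layer of escaping before it can parse the payload again. A minimal sketch (pure Java, hypothetical names; a simplified unescape that only handles quotes and backslashes):

```java
public class UnescapeDemo {

    // Simplified: reverses only the quote/backslash escaping that a
    // string-typed "@message" field adds around an embedded JSON payload.
    public static String unescape(String escaped) {
        return escaped.replace("\\\"", "\"").replace("\\\\", "\\");
    }

    public static void main(String[] args) {
        // The payload as it appears inside "@message" in the escaped log line
        String escapedPayload = "{\\\"text\\\":\\\"hi\\\"}";

        // One extra pass is needed before the payload is parseable JSON again
        System.out.println(unescape(escapedPayload)); // prints {"text":"hi"}
    }
}
```

Logging the message raw in the first place makes this extra unescaping pass unnecessary.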

    I've written a custom logback encoder but I found no way to avoid the escaping. Can I pass a object to logback and change the settings based on the type of the object?

    Edit: I've found a way (not exactly elegant). As requested, an SSCCE:

    In my Application

    // SLF4J Logger
    private static Logger logger = LoggerFactory.getLogger(MyClass.class);
    // An SLF4J Marker
    private Marker foo = MarkerFactory.getMarker("foo");
    // Jackson ObjectMapper()
    ObjectMapper mapper = new ObjectMapper();
    
    // Log something... 
    logger.info(foo, mapper.writeValueAsString(json));
    

    I've used a variation of the Logstash-Encoder found here: https://github.com/logstash/logstash-logback-encoder

    package my.package;
    
    import static org.apache.commons.io.IOUtils.*;
    
    import java.io.IOException;
    import java.util.Map;
    import java.util.Map.Entry;
    
    import org.codehaus.jackson.JsonGenerator.Feature;
    import org.codehaus.jackson.JsonNode;
    import org.codehaus.jackson.map.ObjectMapper;
    import org.codehaus.jackson.node.ObjectNode;
    import org.slf4j.Marker;
    
    import ch.qos.logback.classic.spi.ILoggingEvent;
    import ch.qos.logback.classic.spi.IThrowableProxy;
    import ch.qos.logback.classic.spi.ThrowableProxyUtil;
    import ch.qos.logback.core.CoreConstants;
    import ch.qos.logback.core.encoder.EncoderBase;
    
    public class JsonEncoder extends EncoderBase<ILoggingEvent> {

        private static final ObjectMapper MAPPER = new ObjectMapper().configure(
                Feature.ESCAPE_NON_ASCII, true);

        private boolean immediateFlush = true;

        @Override
        public void doEncode(ILoggingEvent event) throws IOException {
            // Use a local variable; a shared static field would not be thread-safe
            Marker marker = event.getMarker();

            ObjectNode eventNode = MAPPER.createObjectNode();
            eventNode.put("@timestamp", event.getTimeStamp());

            if (marker != null) {
                if (marker.getName().equals("foo")) {
                    // Re-parse the formatted message so it is embedded as a raw JSON node
                    JsonNode j = MAPPER.readTree(event.getFormattedMessage());
                    eventNode.put("@foo", j);
                }
            } else {
                eventNode.put("@message", event.getFormattedMessage());
            }
            eventNode.put("@fields", createFields(event));

            write(MAPPER.writeValueAsBytes(eventNode), outputStream);
            write(CoreConstants.LINE_SEPARATOR, outputStream);

            if (immediateFlush) {
                outputStream.flush();
            }
        }

        private ObjectNode createFields(ILoggingEvent event) {
            // not important here
            return fieldsNode;
        }

        @Override
        public void close() throws IOException {
            write(LINE_SEPARATOR, outputStream);
        }

        public boolean isImmediateFlush() {
            return immediateFlush;
        }

        public void setImmediateFlush(boolean immediateFlush) {
            this.immediateFlush = immediateFlush;
        }
    }
    

    It works now! Yeah! But I guess it's not the best way to do it (serializing, then deserializing the JSON...).

  • kei1aeh5quahQu4U
    kei1aeh5quahQu4U about 11 years
    Hi, thanks for taking the time to look at the problem. I found a somewhat hacky solution and added something that should resemble an SSCCE!