UnsatisfiedLinkError: /tmp/snappy-1.1.4-libsnappyjava.so Error loading shared library ld-linux-x86-64.so.2: No such file or directory


Solution 1

The error message states that *libsnappyjava.so cannot find ld-linux-x86-64.so.2. That file is the glibc dynamic loader, but Alpine images ship musl libc instead of glibc. You may try to get it running by installing the libc6-compat package in your Dockerfile, e.g.:

RUN apk update && apk add --no-cache libc6-compat
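
Applied to the asker's Dockerfile from the question, the change would look like this (a sketch; the base image and jar paths are taken from the question itself):

```dockerfile
FROM openjdk:8u151-jdk-alpine3.7

# libc6-compat provides a glibc compatibility shim on musl-based Alpine
RUN apk update && apk add --no-cache libc6-compat

COPY /target/streams-examples-0.1.jar /streamsApp/
COPY /target/libs /streamsApp/libs

CMD ["java", "-jar", "/streamsApp/streams-examples-0.1.jar"]
```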

Solution 2

In my case, installing the missing libc6-compat didn't work; the application still threw java.lang.UnsatisfiedLinkError.

Then I found that inside the container /lib64/ld-linux-x86-64.so.2 exists and is a link to /lib/libc.musl-x86_64.so.1, but /lib contains only ld-musl-x86_64.so.1, not ld-linux-x86-64.so.2.

So I added a symlink named ld-linux-x86-64.so.2 in /lib, pointing to libc.musl-x86_64.so.1, and that solved the problem.

The Dockerfile I use:

FROM openjdk:8-jre-alpine
COPY entrypoint.sh /entrypoint.sh
RUN apk update && \
  apk add --no-cache libc6-compat && \
  ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2 && \
  mkdir /app && \
  chmod a+x /entrypoint.sh
COPY build/libs/*.jar /app
ENTRYPOINT ["/entrypoint.sh"]

In conclusion:

RUN apk update && \
  apk add --no-cache libc6-compat && \
  ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2
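
To confirm which dynamic loader a glibc-built binary is actually asking for (the root cause here), you can read its ELF headers directly. Below is a minimal stdlib-only sketch for ELF64 files; note it reads the PT_INTERP program header, which is how executables record the loader (shared libraries like libsnappyjava.so typically reference it via DT_NEEDED instead, as the error message above shows):

```python
import struct

PT_INTERP = 3  # program header type holding the requested dynamic loader path


def requested_interpreter(path):
    """Return the dynamic loader an ELF64 little-endian binary requests, or None."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # ELF64 header layout: e_phoff at 0x20, e_phentsize at 0x36, e_phnum at 0x38
    e_phoff = struct.unpack_from("<Q", data, 0x20)[0]
    e_phentsize = struct.unpack_from("<H", data, 0x36)[0]
    e_phnum = struct.unpack_from("<H", data, 0x38)[0]
    for i in range(e_phnum):
        off = e_phoff + i * e_phentsize
        p_type = struct.unpack_from("<I", data, off)[0]
        if p_type == PT_INTERP:
            # p_offset at +0x08 and p_filesz at +0x20 locate the loader string
            p_offset = struct.unpack_from("<Q", data, off + 0x08)[0]
            p_filesz = struct.unpack_from("<Q", data, off + 0x20)[0]
            return data[p_offset:p_offset + p_filesz].rstrip(b"\x00").decode()
    return None
```

On a glibc-built x86-64 executable this returns "/lib64/ld-linux-x86-64.so.2", which is exactly the path the symlink above provides on Alpine.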

Solution 3

There are two ways to solve this problem:

  1. Use some other base image where the snappy-java native lib works out of the box. For example, openjdk:8-jre-slim works fine for me.

  2. Keep the openjdk:8-jdk-alpine image as the base, but install the glibc compatibility layer (gcompat) manually so the snappy-java native lib can load:

FROM openjdk:8-jdk-alpine
RUN apk update && apk add --no-cache gcompat
...
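
For option 1, the only change is the base image; a sketch using the jar paths from the asker's Dockerfile:

```dockerfile
# Debian-based slim image ships glibc, so snappy-java's native lib loads as-is
FROM openjdk:8-jre-slim

COPY /target/streams-examples-0.1.jar /streamsApp/
COPY /target/libs /streamsApp/libs

CMD ["java", "-jar", "/streamsApp/streams-examples-0.1.jar"]
```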

Solution 4

If you are building the Docker image through build.sbt (e.g. with the sbt-docker plugin), the correct way to do it is:

dockerfile in docker := {
  val artifact: File = assembly.value
  val artifactTargetPath = s"/app/${artifact.name}"

  new Dockerfile {
    from("openjdk:8-jre-alpine")
    copy(artifact, artifactTargetPath)
    run("apk", "add", "--no-cache", "gcompat")
    entryPoint("java", "-jar", artifactTargetPath)
  }
}

Installing gcompat will serve your purpose.

Updated on June 07, 2022

Comments

  • el323
    el323 almost 2 years

    I am trying to run a Kafka Streams application in kubernetes. When I launch the pod I get the following exception:

    Exception in thread "streams-pipe-e19c2d9a-d403-4944-8d26-0ef27ed5c057-StreamThread-1"
    java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so: 
    Error loading shared library ld-linux-x86-64.so.2: No such file or directory 
    (needed by /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so)
            at java.lang.ClassLoader$NativeLibrary.load(Native Method)
            at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
            at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
            at java.lang.Runtime.load0(Runtime.java:809)
            at java.lang.System.load(System.java:1086)
            at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:179)
            at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
            at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
            at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:435)
            at org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:466)
            at java.io.DataInputStream.readByte(DataInputStream.java:265)
            at org.apache.kafka.common.utils.ByteUtils.readVarint(ByteUtils.java:168)
            at org.apache.kafka.common.record.DefaultRecord.readFrom(DefaultRecord.java:292)
            at org.apache.kafka.common.record.DefaultRecordBatch$1.readNext(DefaultRecordBatch.java:264)
            at org.apache.kafka.common.record.DefaultRecordBatch$RecordIterator.next(DefaultRecordBatch.java:563)
            at org.apache.kafka.common.record.DefaultRecordBatch$RecordIterator.next(DefaultRecordBatch.java:532)
            at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.nextFetchedRecord(Fetcher.java:1060)
            at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1095)
            at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:949)
            at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:570)
            at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:531)
            at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1146)
            at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1103)
            at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:851)
            at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:808)
            at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:774)
            at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:744)
    

    Previously I tried launching Kafka and the Kafka Streams app in Docker containers and they worked perfectly fine. This is the first time I am trying with Kubernetes.

    This is my Dockerfile for the streams app:

    FROM openjdk:8u151-jdk-alpine3.7
    
    COPY /target/streams-examples-0.1.jar /streamsApp/
    
    COPY /target/libs /streamsApp/libs
    
    CMD ["java", "-jar", "/streamsApp/streams-examples-0.1.jar"]
    

    What can I do to get past this issue? Kindly help me out.

    EDIT:

    / # ldd /usr/bin/java 
        /lib/ld-musl-x86_64.so.1 (0x7f03f279a000)
    Error loading shared library libjli.so: No such file or directory (needed by /usr/bin/java)
        libc.musl-x86_64.so.1 => /lib/ld-musl-x86_64.so.1 (0x7f03f279a000)
    Error relocating /usr/bin/java: JLI_Launch: symbol not found