Bash/csh: test for end of file (EOF) of stdin

Solution 1

This seems to work in both csh and bash, and it deals nicely with binary input (even with \0 as the first character):

# Set $_FIRST_CHAR_FILE to the name of a temp file, using setenv under
# (t)csh and export under bash, depending on $SHELL.
eval `echo $SHELL | grep -E "/(t)?csh" > /dev/null && echo setenv _FIRST_CHAR_FILE /tmp/$$.first_char_file || echo export _FIRST_CHAR_FILE=/tmp/$$.first_char_file`

# Read a single byte of stdin into the temp file.
dd bs=1 count=1 of="$_FIRST_CHAR_FILE" >&/dev/null
# If the temp file is non-empty, stdin had data: replay that byte, then the rest of stdin.
test -s "$_FIRST_CHAR_FILE" && ( cat "$_FIRST_CHAR_FILE"; rm "$_FIRST_CHAR_FILE"; cat - ) | program

Thanks to @glenn-jackman for the idea of reading a small amount of stdin first and then passing it, together with the rest of stdin, through cat.
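
As a quick sanity check (just a sketch: eof_guard.sh is a hypothetical name for the snippet above saved as a script, wc -c stands in for program, and it assumes a bash environment where $SHELL points at bash):

# Binary input with a leading \0: the byte lands in the temp file, so program sees all 8 bytes
printf '\0abc\ndef' | bash eof_guard.sh     # prints 8 when program is "wc -c"

# Empty stdin: test -s fails, so program is never started
printf '' | bash eof_guard.sh               # prints nothing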

Solution 2

Try reading a line from stdin first:

IFS= read -r line
if [[ -n "$line" ]]; then
    # the line is non-empty.
    # add the line back into the stream and pipe it into your program
    { echo "$line"; cat -; } | your_program
fi
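
For example (again just a sketch: line_guard.sh is a hypothetical name for the snippet above saved as a bash script, with plain cat standing in for your_program):

# Non-empty input: the first line is consumed by read and re-emitted by echo ahead of the rest
printf 'first\nsecond\n' | bash line_guard.sh    # prints both lines

# Empty input: read finds nothing and $line stays empty, so your_program is never started
printf '' | bash line_guard.sh                   # prints nothing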

Author: Ole Tange

I am a strong believer in free software. I do not believe in Santa, ghosts, fairies, leprechauns, unicorns, goblins, and gods. Author of GNU Parallel.

Updated on September 18, 2022

Comments

  • Ole Tange (almost 2 years ago)

    I would like to do:

    if not eof(stdin):
       pass stdin to program
    else:
       do nothing
    

    I have a feeling that it can be written fairly close to:

    if test ! --is-eof - ; then
      exec program
    fi

    The problem I am trying to solve is that program reads from stdin but crashes if it gets no input. I do not have access to the source for program, so program cannot be changed. The binary input is bigger than memory, so spooling stdin to a file first is unacceptably slow. Processing all the input line by line in bash is also unacceptably slow.

    The solution should ideally work under both csh and bash.

    • David H (over 11 years ago)
      You have to read a stream before you can test for EOF. Sounds like you have a poorly designed script.
    • slhck (over 11 years ago)
      Well, what is that program which crashes? Wouldn't it be better to try and get it not to crash?
  • glenn jackman (over 11 years ago)
    This should be a comment for the question, not an answer
  • Scott - Слава Україні (over 11 years ago)
    @glenn: I disagree. @M is asking a question about the question only to illuminate the situations in which each of his two answers applies. … // … I agree that it isn’t a good answer, but it is an answer.
  • Scott - Слава Україні (over 11 years ago)
    Your answer fails if the first line of the input is a blank line. I suggest if IFS= read -r line; then …. Or, if the input might be an incomplete line (e.g., echo "The quick brown fox jumps …\c" | OleTange.sh), do if IFS= read -r line  ||  [[ -n "$line" ]]; then ….
  • Ole Tange (over 11 years ago)
    I like the idea. It does impose the limit that a \n must appear before a memory-sized amount of data has been read, which is suboptimal but might be acceptable. It also seems not to work if the first \0 comes before the first \n, which is not acceptable since the input is binary data. E.g.: printf "abc\0def\nghi\njkl" | if ... . Can we ask read to read a number of bytes (incl. \0) instead of a full line?
  • Ole Tange (about 10 years ago)
    It would be wonderful if it worked in general, but alas it does not: /dev/fd/* does not exist on hurd, aix, hpux, qnx, tru64, ultrix. /dev/stdin does not exist on irix, aix, hpux, tru64, ultrix.
  • dddJewelsbbb (about 3 years ago)
    Although you (sort of) mention it, this answer introduces a race condition of its own. If a program is not ready to send input (even though it "should be" available), the timeout can expire before a program like cat can produce its output. Thus, this answer should be avoided because of its nondeterministic, race-condition-like timing.
  • Hachi (about 3 years ago)
    You're right. In my use cases this race condition is not an issue.