How can I delete a newline if it is the last character in a file?

Solution 1

perl -pe 'chomp if eof' filename >filename2

or, to edit the file in place:

perl -pi -e 'chomp if eof' filename

[Editor's note: -pi -e was originally -pie, but, as noted by several commenters and explained by @hvd, the latter doesn't work.]
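
If you want a backup of the original while editing in place, -i also accepts a backup suffix (the .bak suffix below is only an example):

perl -pi.bak -e 'chomp if eof' filename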

This was described as a 'perl blasphemy' on the awk website I saw.

But, in a test, it worked.
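
For example, a quick way to reproduce such a test with od -c (the file names are placeholders):

printf 'line1\nline2\n' > filename       # sample input that ends with a newline
perl -pe 'chomp if eof' filename > filename2
od -c filename2                          # the character dump ends with 2, not \n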

Solution 2

You can take advantage of the fact that shell command substitutions remove trailing newline characters:

Simple form that works in bash, ksh, zsh:

printf %s "$(< in.txt)" > out.txt

Portable (POSIX-compliant) alternative (slightly less efficient):

printf %s "$(cat in.txt)" > out.txt

Note: Command substitution removes all trailing newlines, not just the last one, so any blank lines at the very end of the file are lost as well.
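
A quick way to see that behaviour (the file names are placeholders):

printf 'a\nb\n\n\n' > in.txt             # sample input ending in three newlines
printf %s "$(< in.txt)" > out.txt
od -c out.txt                            # shows  a  \n  b  -- every trailing newline is gone
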
A guide to the other answers:

  • If Perl is available, go for the accepted answer - it is simple and memory-efficient (doesn't read the whole input file at once).

  • Otherwise, consider ghostdog74's Awk answer - it's obscure, but also memory-efficient; a more readable equivalent (POSIX-compliant) is:

    awk 'NR > 1 { print prev } { prev=$0 } END { ORS=""; print }' in.txt

    Printing is delayed by one line so that the final line can be handled in the END block, where it is printed without a trailing \n by setting the output-record separator (ORS) to an empty string.

  • If you want a verbose, but fast and robust solution that truly edits in-place (as opposed to creating a temp. file that then replaces the original), consider jrockway's Perl script.

Solution 3

You can do this with head from GNU coreutils; it supports arguments that are relative to the end of the file. So to leave off the last byte use:

head -c -1

To test for an ending newline you can use tail and wc. The following example saves the result to a temporary file and subsequently overwrites the original:

if [[ $(tail -c1 file | wc -l) == 1 ]]; then
  head -c -1 file > file.tmp
  mv file.tmp file
fi
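
The check works because tail -c1 outputs only the file's last byte, and wc -l reports 1 exactly when that byte is a newline. For example (the file name is a placeholder):

printf 'a\nb\n' > file; tail -c1 file | wc -l    # prints 1: trailing newline present
printf 'a\nb'  > file; tail -c1 file | wc -l     # prints 0: no trailing newline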

You could also use sponge from moreutils to do "in-place" editing:

[[ $(tail -c1 file | wc -l) == 1 ]] && head -c -1 file | sponge file

You can also make a general reusable function by stuffing this in your .bashrc file:

# Example:  remove-last-newline < multiline.txt
function remove-last-newline(){
    local file=$(mktemp)
    cat > "$file"
    if [[ $(tail -c1 "$file" | wc -l) == 1 ]]; then
        head -c -1 "$file" > "$file.tmp"
        mv "$file.tmp" "$file"
    fi
    cat "$file"
    rm "$file"    # remove the temporary file
}
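
A quick sanity check of the function (the file name is just an example):

printf 'a\nb\n' > multiline.txt
remove-last-newline < multiline.txt | od -c      # output ends with  b, not \n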

Update

As noted by KarlWilbur in the comments and used in Sorentar's answer, truncate --size=-1 can replace head -c-1 and supports in-place editing.
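
A minimal sketch of that variant, guarded by the same newline check as above (GNU coreutils; the file name is a placeholder):

[[ $(tail -c1 file | wc -l) == 1 ]] && truncate --size=-1 file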

Solution 4

head -n -1 abc > newfile
tail -n 1 abc | tr -d '\n' >> newfile
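
The same idea can be written without a temporary output file by using process substitution, as michael suggests in the comments (bash-style; abc and newfile are the example names used above):

cat <(head -n -1 abc) <(tail -n 1 abc | tr -d '\n') > newfile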

Edit 2:

Here is an awk version (corrected) that doesn't accumulate a potentially huge array:

awk 'NR > 1 {print line} {line=$0} END {printf "%s", line}' abc

Solution 5

gawk

   awk '{q=p;p=$0}NR>1{print q}END{ORS = ""; print p}' file
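
As noted in the comments, printf can be used instead of setting ORS; an equivalent form (file is again a placeholder) is:

   awk '{q=p;p=$0}NR>1{print q}END{printf "%s", p}' file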

Comments

  • Todd Partridge 'Gen2ly'
    Todd Partridge 'Gen2ly' almost 2 years

    I have some files that I'd like to delete the last newline if it is the last character in a file. od -c shows me that the command I run does write the file with a trailing new line:

    0013600   n   t  >  \n
    

    I've tried a few tricks with sed but the best I could think of isn't doing the trick:

    sed -e '$s/\(.*\)\n$/\1/' abc
    

    Any ideas how to do this?

    • SourceSeeker
      SourceSeeker over 14 years
      newline is only one character for unix newlines. DOS newlines are two characters. Of course, literal "\n" is two characters. Which are you actually looking for?
    • pavium
      pavium over 14 years
      Although the representation might be \n, in Linux it is one character
    • KriptSkitty
      KriptSkitty over 14 years
      Can you elaborate on why you want to do this? Text files are supposed to end with an end-of-line, unless they are entirely empty. It seems strange to me that you'd want to have such a truncated file?
    • pavium
      pavium over 14 years
      The usual reason for doing something like this is to delete a trailing comma from the last line of a CSV file. Sed works well, but newlines have to be treated differently.
    • Todd Partridge 'Gen2ly'
      Todd Partridge 'Gen2ly' over 14 years
      Yeah this is for Linux so thanks for correcting that newline is just one character. Fixed in post.
    • tchrist
      tchrist almost 11 years
      Please never delete the final newline in a file of newline-terminated lines. It screws up all kinds of things.
    • Cory Mawhorter
      Cory Mawhorter about 9 years
      @ThomasPadron-McCarthy "In computing, for every good reason there is to do something there exists a good reason not to do it and vice versa." -Jesus -- "you shouldn't do that" is a horrible answer no matter the question. The correct format is: [how to do it] but [why it may be a bad idea]. #sacrilege
    • wisbucky
      wisbucky over 5 years
      One reason to remove the trailing newline is if you're piping the string to somewhere else, and you can't have a trailing newline.
  • SourceSeeker
    SourceSeeker over 14 years
    That takes out all the newlines. Equivalent to tr -d '\n'
  • silbana
    silbana over 14 years
    You can make it safer by using chomp. And it beats slurping the file.
  • Todd Partridge 'Gen2ly'
    Todd Partridge 'Gen2ly' over 14 years
    This works well too, probably less blasphemous than pavium's.
  • Todd Partridge 'Gen2ly'
    Todd Partridge 'Gen2ly' over 14 years
    Good original way to think about it. Thanks Dennis.
  • Todd Partridge 'Gen2ly'
    Todd Partridge 'Gen2ly' over 14 years
    Blasphemy though it is, it works very well. perl -i -pe 'chomp if eof' filename. Thank you.
  • Todd Partridge 'Gen2ly'
    Todd Partridge 'Gen2ly' over 14 years
    Still looks like a lot of characters to me... learning it slowly :). Does the job though. Thanks ghostdog.
  • Ether
    Ether over 14 years
    The funny thing about blasphemy and heresy is it's usually hated because it's correct. :)
  • SourceSeeker
    SourceSeeker over 14 years
    You are correct. I defer to your awk version. It takes two offsets (and a different test) and I only used one. However, you could use printf instead of ORS.
  • Rob Kennedy
    Rob Kennedy over 14 years
    Sinan, although Linux and Unix might define text files to end with a newline, Windows poses no such requirement. Notepad, for example, will write only the characters you type without adding anything extra at the end. C compilers might require a source file to end with a line break, but C source files aren't "just" text files, so they can have extra requirements.
  • ysth
    ysth over 14 years
    but that has the disadvantage of not resetting ownership/permissions for the file...err, wait...
  • ysth
    ysth over 14 years
    in that vein, most javascript/css minifiers will remove trailing newlines, and yet produce text files.
  • silbana
    silbana over 14 years
    @Rob Kennedy and @ysth: There is an interesting argument there as to why such files are not actually text files and such.
  • BCoates
    BCoates over 12 years
    you can make the output a pipe with process substitution: head -n -1 abc | cat <(tail -n 1 abc | tr -d '\n') | ...
  • SourceSeeker
    SourceSeeker over 12 years
    @BCoates: That doesn't do the same thing. Yours only gives the last line (without a newline). The OP wants the whole file with only the last newline removed. Your pipeline would work like this: head -n -1 ifscomma && cat <(tail -n 1 ifscomma | tr -d '\n') or head -n -1 ifscomma | cat - <(tail -n 1 ifscomma | tr -d '\n'). In the latter one, the hyphen causes cat to concatenate what comes across the pipe with the output of the process substitution. Otherwise, the output of head would be ignored.
  • SourceSeeker
    SourceSeeker over 12 years
    I forgot to edit the name of the file in my previous comment to change it from the test file I was using to the sample name "abc": s/ifscomma/abc/g
  • hese
    hese about 12 years
    This worked faster than the perl command on a 1MB file for me. Great thanks!
  • Olumide
    Olumide almost 12 years
    It's not pretty, but it works. Give it up for the swiss army chainsaw.
  • Romuald Brunet
    Romuald Brunet over 11 years
    Small correction: you can use perl -pi -e 'chomp if eof' filename, to edit a file in-place instead of creating a temporary file
  • mklement0
    mklement0 over 11 years
    Halfway there; complete approach here.
  • Denis Barmenkov
    Denis Barmenkov about 11 years
    os.path.isfile() will tell you about file presence. Using try/except might catch a lot of different errors :)
  • rudimeier
    rudimeier about 11 years
    Using -c instead of -n for head and tail should be even faster.
  • aditsu quit because SE is EVIL
    aditsu quit because SE is EVIL about 11 years
    perl -pie 'chomp if eof' filename -> Can't open perl script "chomp if eof": No such file or directory; perl -pi -e 'chomp if eof' filename -> works
  • Yevhen Pavliuk
    Yevhen Pavliuk almost 11 years
    awk '{ prev_line = line; line = $0; } NR > 1 { print prev_line; } END { ORS = ""; print line; }' file this should be easier to read.
  • technosaurus
    technosaurus over 10 years
    I found the cat-printf combo out by accident (was trying to get the opposite behavior). Note that this will remove ALL trailing newlines, not just the last.
  • ChrisV
    ChrisV about 10 years
    For me, head -n -1 abc removed the last actual line of the file, leaving a trailing newline; head -c -1 abc seemed to work better
  • Kyle Strand
    Kyle Strand about 9 years
    Why is -pie "blasphemy," and why doesn't it behave the same as -pi -e? Anyone know?
  • Admin
    Admin about 9 years
    @KyleStrand I don't know about the "blasphemy" part other than perhaps the mere fact of recommending perl could be considered blasphemy on an awk website, but the reason -pie and -pi -e don't work the same way is that the -i option takes an optional argument. -pie uses e as the argument to -i, specifying the backup suffix, and then interprets 'chomp if eof' as a filename, since it isn't preceded by an -e option. -pi -e omits the argument for -i, and allows -e to be treated as an option.
  • mklement0
    mklement0 about 9 years
    Works, but removes all trailing newlines.
  • mklement0
    mklement0 about 9 years
    Effectively the same as the accepted answer, but arguably clearer in concept to non-Perl users. Note that there's no need for the g or the parentheses around eof: perl -pi -e 's/\n$// if eof' your_file.
  • mklement0
    mklement0 about 9 years
    Verbose, but both fast and robust - seems to be the only true in-place file-editing answer here (and since it may not be obvious to everyone: this is a Perl script).
  • Dakkaron
    Dakkaron over 8 years
    Best solution of all so far. Uses a standard tool that really every Linux distribution has, and is concise and clear, without any sed or perl wizardry.
  • Admin
    Admin over 7 years
    This is a decent way if it's not too expensive (repetitive).
  • done
    done over 7 years
    How about: awk 'NR>1 {print p} {p=$0} END {printf $0}' file.
  • m13r
    m13r about 7 years
    This also turns \r\n to \n
  • Chris Stryczynski
    Chris Stryczynski almost 7 years
    This has issues when a literal \n is present, as it gets converted to an actual newline.
  • Thor
    Thor over 6 years
    Also seems to work for multi-line files if the $(...) is quoted
  • Karl Wilbur
    Karl Wilbur over 6 years
    Nice solution. One change is that I think I'd use truncate --size=-1 instead of head -c -1 since it just resizes the input file rather than reading in the input file, writing it out to another file, then replacing the original with the output file.
  • liran
    liran about 6 years
    isn't all of Perl blasphemy? ;)
  • Brian Hannay
    Brian Hannay almost 6 years
    truncate: missing file operand
  • wisbucky
    wisbucky over 5 years
    Note that head -c -1 will remove the last character regardless if it is a newline or not, that's why you have to check whether the last character is a newline before you remove it.
  • wisbucky
    wisbucky over 5 years
    I think this will only remove it if the last line is blank. It will not remove the trailing newline if the last line is not blank. For example, echo -en 'a\nb\n' | sed '${/^$/d}' will not remove anything. echo -en 'a\nb\n\n' | sed '${/^$/d}' will remove since the entire last line is blank.
  • wisbucky
    wisbucky over 5 years
    Here's a sed solution that works even for a non-blank last line: stackoverflow.com/a/52047796
  • michael
    michael over 5 years
    just another variant on the "one liner" with cat examples show in the comments (no pipe necessary, using only process substitution, feel free to change the head or tail options as necessary): cat <(head -n -1 abc) <(tail -n 1 abc | tr -d '\n')
  • michael
    michael over 5 years
    it's just missing the trailing filename in the example, i.e., [ -z $(tail -c1 filename) ] && truncate -s -1 filename (also, in reply to the other comment, the truncate command does not work with stdin, a filename is required)
  • michael
    michael over 5 years
    definitely need to quote that... /bin/echo -n "$(cat infile)" Also, I'm not sure what the max len of echo or the shell would be across os/shell versions/distros (I was just googling this & it was a rabbit hole), so I'm not sure how portable (or performant) it actually would be for anything other than small files -- but for small files, great.
  • Robin A. Meade
    Robin A. Meade over 4 years
    @sorontar The first argument to printf is the format argument. Thus if the input file had something that could be interpreted as a format specifier like %d, you'd get an error. A fix would be to change it to printf "%s", $0
  • user963601
    user963601 over 4 years
    cat foo | perl -pe 'chomp if eof' removes the newline from foo, but git status still reports a diff. Maybe the file was \r\n and perl just removes the \n?
  • Edward Falk
    Edward Falk about 4 years
    Unfortunately does not work on Mac. I suspect it doesn't work on any BSD variant.
  • Nathan
    Nathan almost 4 years
    @Ether A fun read about that