This looks like a bug, but perhaps I'm just missing something obvious. I'm trying to combine multiple raw (non-JSON) files and embed them in a JSON object. These files are ultimately found by a glob, so I don't think I can use `--rawfile` without extensive bash machinery on top; instead I need to reduce them with `-R` to read each line, and recombine them based on `input_filename`.
```
$ jq -Rn 'reduce inputs as $line ({}; .[input_filename] += [$line])' foo.txt bar.txt baz.txt
```
This almost works, but the last line of each file is inserted into the wrong list:
```json
{
  "foo.txt": [
    "First line of foo.",
    "I'm a text file with multiple lines!"
  ],
  "bar.txt": [
    "Last line of foo.First line of bar.",
    "I'm also a text file!"
  ],
  "baz.txt": [
    "Last line of bar.First line of baz.",
    "I'm a third text file.",
    "Last line of baz."
  ]
}
```
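For a self-contained reproduction, the test files can be recreated with `printf` (their contents are assumed from the output above); the key detail is that none of them ends in a newline:

```shell
# Recreate the test files (contents assumed from the buggy output above);
# printf, unlike echo, adds no trailing newline, which is what triggers the merge.
printf "First line of foo.\nI'm a text file with multiple lines!\nLast line of foo." > foo.txt
printf "First line of bar.\nI'm also a text file!\nLast line of bar." > bar.txt
printf "First line of baz.\nI'm a third text file.\nLast line of baz." > baz.txt

jq -Rn 'reduce inputs as $line ({}; .[input_filename] += [$line])' foo.txt bar.txt baz.txt
```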
It took me a while to work out, but this is because there's no bare newline at the end of my files. If I add one to the test files, it works correctly. I.e., if I rewrite `foo.txt` from:

```
First line of foo.\nI'm a text file with multiple lines!\nLast line of foo.
```

to:

```
First line of foo.\nI'm a text file with multiple lines!\nLast line of foo.\n
```
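As an aside, checking whether a file is newline-terminated is easy (a sketch using POSIX `tail`; the filename is just an example): `tail -c 1` prints the last byte, and command substitution strips a trailing newline, so the result is empty exactly when the newline is present.

```shell
# Sample file without a trailing newline (hypothetical contents):
printf 'Last line of foo.' > foo.txt

# $(tail -c 1 ...) is empty iff the file's last byte is a newline.
if [ -n "$(tail -c 1 foo.txt)" ]; then
  echo "foo.txt is missing its trailing newline"
fi
```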
But I can't guarantee that the real files will be newline-terminated. Surely `inputs` should 'split' at EOF as well as at each newline? It clearly does for the final file, since we get a final entry for the last line there, so why does it combine the entries from earlier files across EOF marks?
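The single-file behaviour is easy to confirm in isolation: `-R` does emit the final, unterminated line as its own string, so the boundary is only lost *between* files.

```shell
# With a single input, the unterminated last line still comes through
# as its own string, so -R clearly can split at EOF.
printf 'one\ntwo' | jq -R .    # prints "one" then "two"
```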
Just to confirm, this also happens with direct invocations of `input`:
```
$ jq -Rn '[input] + [input] + [input] + [input] + [input] + [input] + [input]' foo.txt bar.txt baz.txt
[
  "Last line of baz.",
  "I'm a third text file.",
  "Last line of bar.First line of baz.",
  "I'm also a text file!",
  "Last line of foo.First line of bar.",
  "I'm a text file with multiple lines!",
  "First line of foo."
]
```
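In the meantime, one workaround (a sketch, assuming `awk` is available; the file contents below are hypothetical): `awk 1` reprints every record and always newline-terminates the last one, so the files can be normalized before jq sees them, and the `.fixed` suffix stripped back off inside the filter.

```shell
# Sample files without trailing newlines (hypothetical contents):
printf 'First line of foo.\nLast line of foo.' > foo.txt
printf 'First line of bar.\nLast line of bar.' > bar.txt

# awk 1 copies each file, guaranteeing a trailing newline on the last line.
for f in foo.txt bar.txt; do
  awk 1 "$f" > "$f.fixed"
done

# Original reduce over the normalized copies; sub() strips the ".fixed"
# suffix so the keys still match the original filenames.
jq -Rn 'reduce inputs as $line ({}; .[input_filename | sub("\\.fixed$"; "")] += [$line])' \
  foo.txt.fixed bar.txt.fixed
```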