The clojure.java.shell/sh function has this construct at the end:
    (with-open [stdout (.getInputStream proc)
                stderr (.getErrorStream proc)]
      (let [out (future (stream-to-enc stdout out-enc))
            err (future (stream-to-string stderr))
            exit-code (.waitFor proc)]
        {:exit exit-code :out @out :err @err}))
So the outputs of the process are read in two futures while the current thread waits for the process to end (and produce the exit code). If an exception is thrown during the execution of a future, that exception is stored and rethrown (wrapped in an ExecutionException) when the future is dereferenced, which in this case happens only after the process ends.
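The key property here is that a failure inside a future is completely silent until someone dereferences it. A minimal sketch in plain Java (using Future directly rather than Clojure's future, with a simulated OutOfMemoryError) shows the stored-and-rethrown behavior:

```java
import java.util.concurrent.*;

public class SilentFailure {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // The task fails immediately, but nothing is reported anywhere:
        Future<String> out = pool.submit((Callable<String>) () -> {
            throw new OutOfMemoryError("simulated");
        });
        Thread.sleep(100); // the task has already died by now
        System.out.println("no error reported so far");
        try {
            out.get(); // only here is the stored failure rethrown
        } catch (ExecutionException e) {
            System.out.println("rethrown on get(): " + e.getCause().getMessage());
        }
        pool.shutdown();
    }
}
```

If get() (deref, in the Clojure version) is never reached, the error vanishes without a trace.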
I had a case where I was calling git with sh and the generated output was large enough to produce an OutOfMemoryError. And here is where you get a "deadlock" (technically it is not one):
- Git was shoveling data into the pipe, the future in the Java process was reading it, and the main thread was waiting for the process to end
- The future reading stdout encountered the OutOfMemoryError; the exception was stored and the reader loop terminated
- The pipe buffer between the processes filled up, since git was still writing but nothing was reading anymore
- Git stalled indefinitely trying to write to the pipe
- The Java main thread waited indefinitely for the git process to end
- The future was never dereferenced, so no exception stack trace or message was ever produced
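The pipe-buffer half of this can be demonstrated without triggering an actual OOM: spawn a child that writes endlessly, read nothing, and watch waitFor stall. A minimal sketch (assumes a Unix-like system with the yes utility on the PATH):

```java
import java.util.concurrent.TimeUnit;

public class PipeStall {
    public static void main(String[] args) throws Exception {
        // `yes` writes "y\n" to stdout forever.
        Process proc = new ProcessBuilder("yes").start();
        // Nobody reads proc.getInputStream(), so the OS pipe buffer
        // (typically 64 KiB on Linux) fills up and `yes` blocks on write().
        boolean exited = proc.waitFor(2, TimeUnit.SECONDS);
        System.out.println("exited: " + exited); // never exits on its own
        proc.destroyForcibly();
    }
}
```

Here the timeout variant of waitFor exposes the stall; the plain waitFor() used by sh would simply block forever, exactly as in the list above.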
This can happen with any type of exception thrown during either of those two futures' reads. Another realistic scenario: specifying an encoding while the underlying process returns bytes that cannot be decoded in that encoding.
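Whether a bad byte sequence actually throws depends on how the decoder is configured, but the failure mode is easy to reproduce in isolation. A sketch in plain Java (not the sh implementation) using a CharsetDecoder set to REPORT malformed input:

```java
import java.nio.ByteBuffer;
import java.nio.charset.*;

public class StrictDecode {
    public static void main(String[] args) {
        // 0xFF is never valid in UTF-8. A strict decoder throws on it;
        // inside a future, that exception would be stored just as silently
        // as the OutOfMemoryError in the git case.
        CharsetDecoder dec = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT);
        try {
            dec.decode(ByteBuffer.wrap(new byte[]{(byte) 0xFF}));
            System.out.println("decoded");
        } catch (CharacterCodingException e) {
            System.out.println("decode failed: " + e.getClass().getSimpleName());
        }
    }
}
```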
This "bug" was quite frustrating because there is no error message and very little indication of what is wrong. In my case it happened sporadically in production and was quite hard to track down.
I don't know what the right remedy is, but it might be smart to add some sort of mitigation for this.
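One possible shape for such a mitigation: if a reader task fails, destroy the child process so its writes fail and waitFor can return. This is a hypothetical sketch in plain Java (runAndDrain and drain are names I made up, not anything from clojure.java.shell), assuming a POSIX echo is available for the usage example:

```java
import java.io.*;
import java.util.concurrent.*;

public class SafeDrain {
    // Run a process, drain both output streams on a pool, and kill the
    // child if either reader dies, so waitFor() cannot block forever.
    static int runAndDrain(ProcessBuilder pb) throws Exception {
        Process proc = pb.start();
        ExecutorService pool = Executors.newFixedThreadPool(2);
        CompletableFuture<Void> out =
            CompletableFuture.runAsync(() -> drain(proc.getInputStream()), pool);
        CompletableFuture<Void> err =
            CompletableFuture.runAsync(() -> drain(proc.getErrorStream()), pool);
        // On any reader failure, forcibly terminate the child: its next
        // write to the pipe fails, and the stall never develops.
        out.exceptionally(t -> { proc.destroyForcibly(); return null; });
        err.exceptionally(t -> { proc.destroyForcibly(); return null; });
        int exit = proc.waitFor();
        pool.shutdown();
        return exit;
    }

    static void drain(InputStream in) {
        try (InputStream is = in) {
            while (is.read() != -1) { /* discard; a real version would buffer */ }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit: " + runAndDrain(new ProcessBuilder("echo", "hello")));
    }
}
```

The same idea would translate to the Clojure version: attach a failure handler to each future that destroys the process before the main thread's .waitFor call.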