
I am trying to read a JSON HTTP stream and consume its elements indefinitely.

I came up with the code below, but it closes the stream and only returns 30 results. How do I keep reading from the HTTP stream indefinitely?

Thanks for your help!

(ns core
  (:require [clj-http.client :as http]
            [cheshire.core :as json]
            [clojure.java.io :as io]
            [clojure.core.async :as async]))

(def gh-url "https://api.github.com/events")
(def chan (async/chan 100))

;; Consume events from the channel; <! must run inside a go block,
;; so wrap the loop in go-loop.
(async/go-loop [r (async/<! chan)]
  (when (not-empty r) (println (:type r)))
  (recur (async/<! chan)))

(defn read-gh-stream [url]
  (with-open [stream (-> url (http/get {:as :stream}) :body)]
    (let [lines (-> stream io/reader (json/parse-stream true))]
      (doseq [l lines]
        ;; >!! is the blocking put; >! only works inside a go block
        (async/>!! chan l)))))

1 Answer


The GitHub API returns only 30 events per HTTP call; it will not keep streaming events to you as they happen. If you want the next 30 events, you have to make another request to the GitHub API. See the documentation here: https://developer.github.com/v3/activity/events/
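Since each call yields at most one page, polling is the usual workaround. Here is a minimal polling sketch reusing the `chan` var and requires from the question; the fixed sleep interval is an assumption (GitHub's `X-Poll-Interval` response header tells you the real minimum):

```clojure
;; Repeatedly fetch one page of events and push each event onto the channel.
;; A production version would also deduplicate events across pages and honor
;; the X-Poll-Interval / ETag headers.
(defn poll-gh-events [url]
  (loop []
    (let [resp   (http/get url {:as :stream})
          events (-> resp :body io/reader (json/parse-stream true))]
      (doseq [e events]
        (async/>!! chan e)))   ; blocking put; >! would need a go block
    (Thread/sleep 60000)       ; assumed interval; check X-Poll-Interval
    (recur)))
```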

Thanks for replying! I will try with Twitter's persistent HTTP streams.

Can I expect the code I wrote to be correct then? Or will `doseq` close the stream?

PS: I am also trying to consume a log file by streaming it to `core.async`, but the connection to the file always closes at some point.
`with-open` will close the stream after its body (the `doseq`) finishes, so you will have to re-run `read-gh-stream` each time.
If `lines` is lazy, I expect `doseq` to keep processing the stream, though the `parse-stream` docs say:

> If the top-level object is an array, it will be parsed lazily
Thanks, I understand.

I'm still struggling at reading a (constantly updated) log file while sending each new line to a channel. I can't seem to find an example showing how to do this.
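One common approach is to keep the reader open and poll for new lines: `BufferedReader.readLine` returns `nil` at end-of-file without closing anything, so you can sleep and retry. A rough sketch, assuming `io` and `async` are required as in the question; the file path and poll interval are placeholders:

```clojure
;; Tail a growing log file, sending each new line to a channel.
;; Runs forever; call it on its own thread, e.g. (async/thread (tail-file ...)).
(defn tail-file [path ch]
  (with-open [rdr (io/reader path)]
    (loop []
      (if-let [line (.readLine rdr)]
        (async/>!! ch line)   ; blocking put onto the channel
        (Thread/sleep 250))   ; at EOF: wait for the file to grow
      (recur))))
```

Note this only handles appends; if the log file is rotated you would need to detect that and reopen the reader.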