Sunday 7 June 2009

Learning some lessons from Programming Clojure

My copy of Programming Clojure by Stuart Halloway arrived over the weekend. It's a thoroughly enjoyable read so far and I've definitely learned a few things (for example, I hadn't realized that keywords can act as functions for maps, nor that a set is a function for determining membership). D'oh!


;; A keyword is a function
user> (:houses {:biscuits 4 :houses 7})
7

;; A map is a function
user> ({:biscuits 4 :houses 7} :houses)
7

;; A set is a function testing membership
user> (#{4 5 6} 7)
nil

;; On success returns the element
user> (#{4 5 6} 4)
4
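
A nice consequence is that a set drops straight into the sequence library as a membership predicate. A quick sketch of my own (not from the book):

;; A set works as a predicate, e.g. for filter
user> (filter #{4 5 6} [1 4 7 6 9])
(4 6)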


The section on Functional Programming was very interesting, especially with respect to laziness. Stuart presents "Six Rules of Clojure FP", which place a heavy emphasis on laziness. Item 5 is:

Know the sequence library. You can often write code without using recur or the lazy APIs at all.


Looking back at my neural network code made me realize I'd reinvented several wheels and that the code could be significantly tidied by just using the standard libraries. Revisiting the sequence documentation reminded me of the various tools in the armoury, particularly iterate and reduce.
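
As a quick refresher, here are a couple of throwaway REPL examples of my own:

;; reduce collapses a sequence into a single value
user> (reduce + 0 [1 2 3 4])
10

;; iterate builds a lazy, infinite sequence of repeated applications
user> (take 5 (iterate inc 0))
(0 1 2 3 4)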

As a reminder, the two lumps of code in question are run-patterns and train-network.



(defn run-patterns
  [network samples expecteds]
  (if (empty? samples)
    network
    (let [expected (first expecteds)
          sample (first samples)
          [ah ao] (run-network sample network)
          updated-network (back-propagate expected sample [ah ao] network)]
      (recur updated-network (rest samples) (rest expecteds)))))

(defn train-network
  ([samples expected iterations]
   (train-network (create-network (count (first samples))
                                  num-hidden (count (first expected)))
                  samples expected iterations))
  ([network samples expected iterations]
   (if (zero? iterations)
     network
     (recur (run-patterns network samples expected)
            samples expected (dec iterations)))))


Starting with run-patterns: this takes a sequence of values (the samples and expected values) and reduces them to a single output (the network) using the supplied function. That's exactly the shape of a reduce operation (sequence in, value out), and refactoring to use it simplifies the code significantly.


(defn run-patterns
  [network samples expecteds]
  (reduce
   (fn [n expectations]
     (let [[sample expected] expectations
           [ah ao] (run-network sample n)]
       (back-propagate expected sample [ah ao] n)))
   network ;; initial value
   (map list samples expecteds)))
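
The (map list samples expecteds) call just pairs each sample with its expected output, along the lines of:

user> (map list [:s1 :s2] [:e1 :e2])
((:s1 :e1) (:s2 :e2))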


Next is train-network. At the moment the implementation is burdened by a hand-rolled looping construct that runs for a specified number of iterations. What if, instead of computing a fixed number, we just computed an infinite sequence of trained neural networks (lazily, obviously!) and let the caller decide how many they'd like?

The new version of train-network drops the number of iterations and returns an infinite sequence of neural networks, each trained one more time than the last. The Clojure function iterate does the job here:

Returns a lazy sequence of x, (f x), (f (f x)) etc. f must be free of side-effects
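
Since the sequence is lazy, taking the nth element forces exactly n applications of the function and no more, for example:

;; nth realizes just as many iterations as requested
user> (nth (iterate #(* 2 %) 1) 10)
1024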


Once we refactor to use iterate we get the following. The definition of train-network is considerably simpler, and the only change to the example function is specifying the number of iterations to use.



(defn train-network
  ([samples expected]
   (train-network (create-network (count (first samples))
                                  num-hidden (count (first expected)))
                  samples expected))
  ([network samples expected]
   (iterate (fn [n] (run-patterns n samples expected)) network)))

(defn example []
  (let [x (nth (apply train-network xor-test-data) 100)]
    (println (first (run-network [0 0] x)) "-->" 0)
    (println (first (run-network [0 1] x)) "-->" 1)
    (println (first (run-network [1 0] x)) "-->" 1)
    (println (first (run-network [1 1] x)) "-->" 0)))
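
A nice side effect of returning a lazy sequence is that the caller isn't tied to a fixed iteration count. Given some hypothetical network-error function (not defined in my code), you could train until convergence instead:

;; network-error is hypothetical: some measure of how badly a
;; network fits the training data
(first (drop-while #(> (network-error % xor-test-data) 0.01)
                   (apply train-network xor-test-data)))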



I've still got to read the remaining chapters of the book, but I'm looking forward to learning more. Definitely recommended.
