If you have an understanding of functions like map, filter, and reduce, transducers are actually pretty easy.
Say you have `(map inc [1 2])`. You can run that, and get `'(2 3)`.
A transducer is the `(map inc)` part of that call (slightly confusingly, this isn't partial application or currying). You can apply it to something like `[1 2]`, but you can also compose with it, by combining it with say, `(filter even?)` to get something that represents the process of incrementing everything, then removing odd numbers. Or you can put in things that aren't collections, like asynchronous channels, and get back a new channel with the values modified accordingly.
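A quick sketch of what that composition looks like (illustrative; `xf` is just a name I picked):

```clojure
;; compose two transducers; with comp, the element-wise
;; transformation reads left to right: inc first, then filter
(def xf (comp (map inc) (filter even?)))

(into [] xf [1 2 3 4]) ;; => [2 4]
```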
That's pretty much it.
What I think I love most about Clojure is that there are fantastic, esoteric, academic ideas that, when I read about them in a context like this for the first time, I a) do not understand them, and b) have no idea how they would be useful. Then I read an example or two, and suddenly it's apparent that the tool is really as simple as it can be--there's very little accidental complexity--and is extremely useful.
The way you explain it, it's no different from functions and function composition; in which case, why invent new vocabulary?
I do remember looking into them before and translating them into Haskell and they ended up not being identical to functions in the trivial sense that you suggest, but I forget how.
Transducers are functions. The thing is that they are functions that are designed to serve as the functional argument to reduce. And they pair with ordinary functions which are not transducers.
For instance if we have (map inc [1 2]), there exists a transducer function T such that:
(reduce T [1 2]) == (map inc [1 2])
I.e. we can somehow do "map inc" using reduce.
Okay?
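To make that concrete, here's "map inc" hand-written as a reduce, with the reducing function inlined (a sketch; the output is a vector rather than a lazy seq, but the values match):

```clojure
;; incrementing during the fold reproduces (map inc ...)
(reduce (fn [acc x] (conj acc (inc x))) [] [1 2])
;; => [2 3]
```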
Now, the clever thing is this: why don't we allow map to be called without the list argument? Just let it be called like this:
(map inc)
This looks like partial application, right? Now what would partial application do? It would curry the "inc", returning a function of one argument that takes a list, i.e.:
;; if it were partial application, then:
((map inc) [1 2]) ;; same as (map inc [1 2])
But Hickey did something clever; he overloaded functions like map so that (map inc) returns T!
(reduce (map inc) [1 2]) ;; same as (map inc [1 2])
The cool thing is that this (map inc) composes with other functions of its kind. So you can compose together the transducers of list processing operations, which are then put into effect inside a single reduce, and the behavior is like the composition of the original list processors.
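A small illustration of that "single reduce" idea, using `transduce` (which is reduce with a transducer applied to the reducing function):

```clojure
;; one pass over the data: inc each element, keep evens, sum
(transduce (comp (map inc) (filter even?)) + 0 [1 2 3 4])
;; => 6  (i.e. 2 + 4)
```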
It's like a linear operator; like the Laplace transform. Composition of entire list operations in the regular domain corresponds to composition of operations on individual elements in the "t-domain".
> (reduce (map inc) [1 2]) ;; same as (map inc [1 2])
This is wrong. You've missed the point.
(map inc [1 2])
is actually roughly equivalent to
(reduce ((map inc) conj) [] [1 2])
which, due to the use of `reduce`, is eager. To get laziness back:
(sequence (map inc) [1 2])
Transducers are not reducing functions, they return reducing functions when applied to reducing functions. `((map inc) conj)` is a version of `conj` that calls `inc` on all the rhs args before `conj`ing them into the lhs arg.
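Concretely (a sketch; `rf` is just a name for the derived reducing function):

```clojure
;; applying the transducer to conj yields a new reducing fn
(def rf ((map inc) conj))

(rf [] 1)            ;; => [2]  -- inc runs before the conj
(reduce rf [] [1 2]) ;; => [2 3]
```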
I suspected I wasn't quite understanding something, because why would we want, say, a map based on reduce that chokes on lazy lists, in a language where laziness is important?
Transducers are mostly function and function composition, but with a specific signature. There are a handful of contracts on a transducer, so it is useful to have a name, so you can say "this function takes a transducer" and "this function returns a transducer".
Older collection functions were semi-lazy by default: they auto-realized in chunks of 32. Transducers wrap the reducing function and then pass the data through, allowing full laziness (or eagerness, if you want) by default.
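One way to see this (assuming Clojure 1.7+): `sequence` threads input through the transducer incrementally, so even an infinite source works, and only what's consumed is computed:

```clojure
;; (range) is infinite; sequence pulls through the
;; transducer only as elements are demanded
(take 3 (sequence (map inc) (range)))
;; => (1 2 3)
```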
They also work well in cases where you don't know if/when a new value is coming, like in channels or observables. This is because they aren't necessarily wed to the seq abstraction from the get-go.
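A sketch with core.async (assumes org.clojure/core.async is on the classpath; the transducer is attached to the channel and transforms values as they pass through):

```clojure
(require '[clojure.core.async :as a])

;; a buffered channel whose values go through (map inc)
(def c (a/chan 1 (map inc)))

(a/>!! c 1)
(a/<!! c) ;; => 2
```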
I think people are seriously underestimating the protocol involved; understanding that protocol is required if you want to build your own transducer-compatible streaming, which is always useful. There's also the issue that the protocol in question may not be generic enough. The presentations I've seen claim that it might work with Rx streams as well; however, I don't think it can deal with back-pressure. This is just a personal opinion and I'd like to be proven wrong.
That said the concept is pretty cool and we shouldn't fear learning new things. After all, the whole point of learning a new programming language is to be exposed to new ways of solving problems, otherwise why bother?
Adding to weavejester's comment: back-pressure is meant to be handled with core.async channels [1]. I guess in Rich's terms, Rx/FRP "complects" the communication of messages with flow of control [2]. Although this last statement may not be true anymore, given that Rx now has many functions to control the scheduler (e.g. observeOn/buffer/delay, etc.).
Because Clojure isn't a pure functional language, transducers may be stateful. `take` uses a volatile (i.e. a fast, mutable variable) to retain state. I don't believe a `flatMap` transducer exists in Clojure yet.
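A simplified sketch of such a stateful transducer, modeled loosely on how `clojure.core/take` works (`my-take` is an illustrative name; the real implementation also takes care to wrap the result in `reduced` exactly once at the boundary):

```clojure
(defn my-take [n]
  (fn [rf]
    ;; per-use mutable state: a countdown in a volatile
    (let [remaining (volatile! n)]
      (fn
        ([] (rf))
        ([result] (rf result))
        ([result input]
         (if (pos? @remaining)
           (do (vswap! remaining dec)
               (rf result input))
           ;; signal early termination to the reduce
           (reduced result)))))))

(into [] (my-take 2) [1 2 3 4]) ;; => [1 2]
```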