### Introduction
Fusion and transduction may be the most practical tools I’ve picked up in my time studying Functional Programming. They’re not tools I use every day, nor are they strictly necessary. But they completely changed the way I think about programming, modularity, and abstraction in software engineering, permanently, and for the better.
And to be clear, that’s really the point of this article: Not to evangelize FP, offer a silver bullet, or illuminate some magic secret sauce that’s “better” than what you do now. Rather, the point is to shed light on different ways of thinking about programming, and broaden your sense of possible solutions to everyday problems.
These aren’t easy techniques to use fluently, and it’ll probably take some time, tinkering, and deliberate practice to fully comprehend what’s going on here. For most, it’s an entirely new level of abstraction.
But, if you put in the time, you might just come out with the sharpest sense of abstractions around functions that you’ve ever had.
Recall the definition of pure functions: functions which have no side effects, and which always return the same value for any given input.
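For instance, here’s a minimal contrast, with made-up functions for illustration:

// Pure: no side effects, and the same input always yields the same output
const double = n => n * 2;

// Impure: the result depends on external, mutable state
let factor = 2;
const scale = n => n * factor;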
Since pure functions always return the same value for a given input, we can safely pass their return values directly to other functions.
This enables such niceties as:
// niceties
colorBackground(wrapWith(makeHeading(createTitle(movie)), 'div'), 'papayawhip')
Here, we use createTitle to build a string title out of movie; pass this string to makeHeading to create a new heading element (makeHeading delegates to document.createElement); wrap this heading in a div; and finally, call colorBackground, which updates the element’s styling to set a background of papayawhip…which is my favorite flavor of CSS.
Let’s be explicit about the composition at work in this snippet. At each step of the pipeline, a function accepts an input and returns an output, which the input determines completely. More formally: at each step, we add another referentially transparent function to the pipeline. Yet more formally: papayaWhipHeading is a composition of referentially transparent functions.
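To make that concrete, we can give the pipeline a name. A minimal sketch, reusing the hypothetical helpers from the snippet above:

// A named composition of the helpers above
const papayaWhipHeading = movie =>
  colorBackground(wrapWith(makeHeading(createTitle(movie)), 'div'), 'papayawhip');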
It’s worth pointing out that a functional eye might spot further refactoring possibilities here, as well. But you’re not here for illustrative-yet-contrived examples. You’re here to learn about fusion.
Let’s dash through the rest of those prerequisites, and look at chaining Array methods.
One of the nicer features of map is that it automatically returns an array with its results.
const capitalized = ["where's", 'waldo'].map(function(word) {
return word.toUpperCase();
});
console.log(capitalized); // [ "WHERE'S", "WALDO" ]
Of course, there’s nothing particularly special about capitalized. It has all the same methods any other array does. And since map and filter return arrays, we can chain calls to either method directly on their return values.
const screwVowels = function(word) {
return word.replace(/[aeiuo]/gi, '');
};
// Calling map on the result of calling map
const capitalizedTermsWithoutVowels = ["where's", 'waldo']
.map(word => word.toUpperCase())
.map(screwVowels);
This isn’t a particularly dramatic result: chained array methods like this are common in JS-land. But it merits attention, because it leads to code like the following:
// Retrieve a series of 'posts' from JSON Placeholder (for fake demonstration data)
// GET data
fetch('https://jsonplaceholder.typicode.com/posts')
// Parse the response body as JSON
.then(response => response.json())
// This callback contains the code you should focus on--the above is boilerplate
.then(data => {
// filter for posts by user with userId == 1
const sluglines = data
.filter(post => post.userId == 1)
// Extract only the title and body properties
.map(post => {
const extracted = {
body: post.body,
title: post.title
};
return extracted;
})
// Truncate "body" to first 17 characters, and add 3-character ellipsis
.map(extracted => {
extracted.body = extracted.body.substring(0, 17) + '...';
return extracted;
})
// Capitalize title
.map(extracted => {
extracted.title = extracted.title.toUpperCase();
return extracted;
})
// Create sluglines
.map(extracted => {
return `${extracted.title}\n${extracted.body}`;
});
});
This is maybe a few more map calls than is common, sure…but consider map alongside filter, and this style becomes a lot more believable.
Using “single-purpose” callbacks in sequential calls to map and filter lets us write simpler code, at the cost of extra function invocations and the need to define those “single-purpose” callbacks.

We also enjoy the benefits of immutability: map and filter don’t modify the array you call them on. Rather, they create new arrays each time.

This lets us avoid confusion due to subtle side effects, and preserves the integrity of our initial data source, allowing us to pass it to multiple processing pipelines without issue.
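A quick demonstration of that immutability, using throwaway data:

const words = ['map', 'filter'];

const upper = words.map(word => word.toUpperCase());
const short = words.filter(word => word.length < 4);

console.log(words); // ['map', 'filter'] <- the source array is untouched
console.log(upper); // ['MAP', 'FILTER']
console.log(short); // ['map']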
On the other hand, allocating a whole new array on every invocation of map or filter seems a little heavy-handed, because we only care about the array we get after we’ve made all of our calls. The intermediate arrays we generate along the way are throwaways: we create them for the sole purpose of providing the next function in the chain with data in the format it expects. We only hang on to the last array we generate, and the JavaScript engine eventually garbage collects the intermediates we built up but didn’t need.

If you’re using this style of programming to process large lists, this can lead to considerable memory overhead. In other words: we’re trading memory and some incidental code complexity for testability and readability.
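To make those allocations concrete: a chain like this creates a new array at every step, though we only ever keep the last one.

const labels = [1, 2, 3]
  .map(n => n + 1)  // intermediate array #1: [2, 3, 4]
  .map(n => n * 10) // intermediate array #2: [20, 30, 40]
  .map(String);     // final array: ['20', '30', '40']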
For simplicity, let’s consider a sequence of calls to map.
// See bottom of snippet for `users` list
users
// Extract important information...
.map(function (user) {
// Destructuring: https://jsonplaceholder.typicode.com/users
const { name, username, email, website } = user
return { name, username, email, website }
})
// Build string...
.map(function (reducedUserData) {
// New object only has user's name, username, email, and website
// Let's reformat this data for our component
const { name, username, email, website } = reducedUserData
const displayName = `${username} (${name})`
const contact = `${website} (${email})`
// Build the string we want to drop into our UserCard component
return `${displayName}\n${contact}`
})
// Build components...
.map(function (displayString) {
return UserCardComponent(displayString)
})
// `users` is defined down here only to keep the important part of the snippet at the top.
// Note: `var` hoisting lifts the declaration, not the value, so in runnable code define `users` before the chain above.
var users = [
{
"id": 1,
"name": "Leanne Graham",
"username": "Bret",
"email": "Sincere@april.biz",
"address": {
"street": "Kulas Light",
"suite": "Apt. 556",
"city": "Gwenborough",
"zipcode": "92998-3874",
"geo": {
"lat": "-37.3159",
"lng": "81.1496"
}
},
"phone": "1-770-736-8031 x56442",
"website": "hildegard.org",
"company": {
"name": "Romaguera-Crona",
"catchPhrase": "Multi-layered client-server neural-net",
"bs": "harness real-time e-markets"
}
},
{
"id": 2,
"name": "Ervin Howell",
"username": "Antonette",
"email": "Shanna@melissa.tv",
"address": {
"street": "Victor Plains",
"suite": "Suite 879",
"city": "Wisokyburgh",
"zipcode": "90566-7771",
"geo": {
"lat": "-43.9509",
"lng": "-34.4618"
}
},
"website": "anastasia.net"
}
]
To restate the problem: this produces an intermediate, “throwaway” array with every invocation of map. It follows that we can avoid allocating intermediate arrays if we find a way to execute all of our processing logic within a single invocation of map.
One way to get away with a single call to map is to do all of our work inside a single callback.
const userCards = users.map(function (user) {
// Destructure user we're interested in...
const { name, username, email, website } = user
const displayName = `${username} (${name})`
const contact = `${website} (${email})`
// Create display string for our component...
const displayString = `${displayName}\n${contact}`
// Build/return UserCard
return UserCardComponent(displayString)
})
This eliminates intermediate arrays, but it’s a step backwards: throwing everything into a single callback loses the readability and testability benefits that motivated our sequenced calls to map in the first place.

One way to restore readability is to extract the callbacks into their own functions, and use those within the call to map, rather than literal function declarations.
const extractUserData = function (user) {
const { name, username, email, website } = user
return { name, username, email, website }
}
const buildDisplayString = function (userData) {
const { name, username, email, website } = userData
const displayName = `${username} (${name})`
const contact = `${website} (${email})`
return `${displayName}\n${contact}`
}
const userCards = users.map(function (user) {
const adjustedUserData = extractUserData(user)
const displayString = buildDisplayString(adjustedUserData)
const userCard = UserCardComponent(displayString)
return userCard
})
This is logically equivalent to what we started with, due to referential transparency. But it’s definitely easier to read, and arguably easier to test.
The real victory is that this version makes the structure of our processing logic much clearer: it sounds like function composition, doesn’t it?
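For the record, “composition” here is nothing more exotic than the following two-function sketch:

// compose2(f, g) returns a function that applies g first, then f
const compose2 = (f, g) => x => f(g(x));

// e.g., a display string straight from a user object:
const userToDisplayString = compose2(buildDisplayString, extractUserData);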
We can go one step further. Instead of saving the result of each function invocation to a variable, we can simply pass the result of each call directly to the next function in the sequence.
const userCards = users.map(function (user) {
const userCard = UserCardComponent(buildDisplayString(extractUserData(user)))
return userCard
})
Or, if you prefer your code terser:
const userCards =
users.map(user => UserCardComponent(buildDisplayString(extractUserData(user))))
This restores all of the testability, and some of the readability, of our original chain of map calls. And since we’ve managed to express the transformation with only a single call to map, we’ve eliminated the memory overhead imposed by intermediate arrays.

We did this by converting our sequence of calls to map, each of which received a “single-purpose” callback, into a single call to map that uses a composition of those callbacks.

This process is called fusion, and it allows us to avoid the overhead of intermediate arrays while enjoying the testability and readability benefits of sequenced calls to map.
One last improvement. Let’s take a cue from Python, and be explicit about what we’re doing.
const R = require('ramda');
// Use composition to use "single-purpose" callbacks to define a single transformation function
const buildUserCard = R.compose(UserCardComponent, buildDisplayString, extractUserData)
// Generate our list of user components
const userCards = users.map(buildUserCard)
We can write a helper to make this even cleaner.
const R = require('ramda')
const fuse = (list, functions) => list.map(R.compose(...functions))
// Then...
const userCards = fuse(
// list to transform
users,
// functions to apply
[UserCardComponent, buildDisplayString, extractUserData]
)
If you’re like me, this is the part where you start using map and filter just about everywhere, even for things you probably shouldn’t use them for.
But the high doesn’t last long with this one. Check this:
users
// Today, I've decided I hate the letter a
.filter(function (user) {
return user.name[0].toLowerCase() != 'a'
})
.map(function (user) {
const { name, email } = user
return `${name}'s email address is: ${email}.`
})
Fusion works fine with a sequence of map calls. It works just as well with a sequence of calls to filter. Unfortunately, it breaks with sequential calls involving both methods.

That’s because the two methods interpret the return values of their callbacks differently. map takes the callback’s return value and pushes it into an array, regardless of what it is. filter, on the other hand, interprets the truthiness of the callback’s return value: if the callback returns true for an element, it keeps that element; otherwise, it throws it out.

Fusion breaks here because there’s no way to tell the fused function which callbacks should act as filters, and which as simple transformations. In other words: this approach to fusion only works in the special case of a homogeneous sequence of calls, all map or all filter.
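To see the breakage concretely, here’s a sketch of what happens if we naively compose a predicate into a mapping pipeline (illustrative functions):

const startsWithA = name => name[0].toLowerCase() === 'a';
const shout = name => name.toUpperCase();

// The predicate's boolean replaces the element, and the next callback chokes on it
['ann', 'bob'].map(name => shout(startsWithA(name)));
// TypeError: booleans don't have a toUpperCase method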
As we’ve seen, fusion only works for a series of calls involving only map, or only filter. This isn’t very helpful in practice, where we’ll typically invoke both. Recall, though, that we were able to express map and filter in terms of reduce.
// Expressing `map` in terms of `reduce`
const map = (list, mapFunction) => {
const output = list.reduce((transformedList, nextElement) => {
// use the mapFunction to transform the nextElement in the list
const transformedElement = mapFunction(nextElement);
// add transformedElement to our list of transformed elements
transformedList.push(transformedElement);
// return list of transformed elements
return transformedList;
}, [])
// ^ start with an empty list
return output;
}
// Expressing `filter` in terms of `reduce`
const filter = (list, predicate) => {
const output = list.reduce(function (filteredElements, nextElement) {
// only add `nextElement` if it passes our test
if (predicate(nextElement)) {
filteredElements.push(nextElement);
}
// return the list of filtered elements on each iteration
return filteredElements;
}, [])
// ^ start with an empty list
return output;
}
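A quick sanity check that these behave like the built-ins:

map([1, 2, 3], n => n * 2);             // [2, 4, 6]
filter([1, 2, 3, 4], n => n % 2 === 0); // [2, 4]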
In theory, this means we can replace our calls to map and filter with calls to reduce. Then we’d have a chain of calls involving only reduce, but which implements the same mapping/filtering logic we’re already using. From there, we can apply a technique very similar to fusion to express our series of reductions as a single function composition.

The first step is to re-express our calls to map and filter in terms of reduce.
Previously, we wrote our own versions of map and filter, which looked like this:
const mapReducer = (list, mapFunction) => {
const output = list.reduce((transformedList, nextElement) => {
// use the mapFunction to transform the nextElement in the list
const transformedElement = mapFunction(nextElement);
// add transformedElement to our list of transformed elements
transformedList.push(transformedElement);
// return list of transformed elements
return transformedList;
}, [])
// ^ start with an empty list
return output;
}
const filterReducer = (list, predicate) => {
const output = list.reduce(function (filteredElements, nextElement) {
// only add `nextElement` if it passes our test
if (predicate(nextElement)) {
filteredElements.push(nextElement);
}
// return the list of filtered elements on each iteration
return filteredElements;
}, [])
return output;
}
We used these to demonstrate the relationship between reduce and map/filter, but we need to make some changes if we want to use them in reduce chains.

Let’s start by removing those calls to reduce:
const mapReducer = mapFunction => (transformedList, nextElement) => {
const transformedElement = mapFunction(nextElement);
transformedList.push(transformedElement);
return transformedList;
}
const filterReducer = predicate => (filteredElements, nextElement) => {
if (predicate(nextElement)) {
filteredElements.push(nextElement);
}
return filteredElements;
}
Earlier, we filtered and mapped an array of users. Let’s start rewriting that logic with these new functions, to make all this a little less abstract.
// filter's predicate function
function removeNamesStartingWithA (user) {
return user.name[0].toLowerCase() != 'a'
}
// map's transformation function
function createUserInfoString (user) {
const { name, email } = user
return `${name}'s email address is: ${email}.`
}
users
.reduce(filterReducer(removeNamesStartingWithA), [])
.reduce(mapReducer(createUserInfoString), [])
This produces the same result as our previous filter/map chain. There are quite a few layers of indirection involved here, though; take some time to step through the above snippet before moving on.
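One way to untangle the indirection is to expand the first reduce by hand:

// filterReducer(removeNamesStartingWithA) evaluates to a plain reducer...
const keepUser = filterReducer(removeNamesStartingWithA);

// ...which reduce calls once per user, threading the accumulator through and
// keeping only the users whose names don't start with 'a'
const filtered = users.reduce(keepUser, []);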
Take another look at mapReducer and filterReducer.
const mapReducer = mapFunction => (transformedList, nextElement) => {
const transformedElement = mapFunction(nextElement);
transformedList.push(transformedElement);
return transformedList;
}
const filterReducer = predicate => (filteredElements, nextElement) => {
if (predicate(nextElement)) {
filteredElements.push(nextElement);
}
return filteredElements;
}
Rather than hard-code transformation or predicate logic, we allow the user to pass in mapping and predicate functions as arguments, which the partial applications of mapReducer and filterReducer remember via closure.

This way, we can use mapReducer and filterReducer as “backbones” for building arbitrary reduction chains, passing in whichever predicate or mapFunction suits our use case.
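For instance, with a hypothetical double function:

// The partial application remembers `double` via closure
const double = n => n * 2;
const doublingReducer = mapReducer(double);

[1, 2, 3].reduce(doublingReducer, []); // [2, 4, 6]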
If you look closely, you’ll notice that we still make explicit calls to push in both of these reducers. This is important, because push is the function that allows us to combine, or reduce, two objects into one:
// Object 1...
const accumulator = ["an old element"];
// Object 2...
const next_element = "a new element";
// A single object that combines both! Eureka!
accumulator.push(next_element);
// ["an old element", "a new element"]
console.log(accumulator)
Recall that combining elements like this is the whole point of using reduce in the first place. And if you think about it, push isn’t the only function we can use to do this. We could use unshift, instead:
// Object 1...
const accumulator = ["an old element"];
// Object 2...
const next_element = "a new element";
// A single object that combines both! Eureka!
accumulator.unshift(next_element);
// ["a new element", "an old element"]
console.log(accumulator);
As written, our reducers lock us into using push. If we wanted to use unshift instead, we’d have to re-implement mapReducer and filterReducer.

The solution is abstraction. Rather than hard-code push, we’ll let the user pass the function they want to use to combine elements as an argument.
const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
const transformedElement = mapFunction(nextElement);
transformedList = combiner(transformedList, transformedElement);
return transformedList;
}
const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
if (predicate(nextElement)) {
filteredElements = combiner(filteredElements, nextElement);
}
return filteredElements;
}
We use it like this:
// push element to list, and return updated list
const pushCombiner = (list, element) => {
list.push(element);
return list;
}
const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
const transformedElement = mapFunction(nextElement);
transformedList = combiner(transformedList, transformedElement);
return transformedList;
}
const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
if (predicate(nextElement)) {
filteredElements = combiner(filteredElements, nextElement);
}
return filteredElements;
}
users
.reduce(
filterReducer(removeNamesStartingWithA)(pushCombiner), [])
.reduce(
mapReducer(createUserInfoString)(pushCombiner), [])
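The payoff: swapping combiners no longer requires re-implementing our reducers. A sketch with a hypothetical unshiftCombiner:

// unshift element onto the front of the list, and return updated list
const unshiftCombiner = (list, element) => {
  list.unshift(element);
  return list;
};

// Same reducer, same callback; only the combiner changes, so the output order reverses
users.reduce(mapReducer(createUserInfoString)(unshiftCombiner), []);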
At this point, everything’s in place for our final trick: composing these transformations to fuse our chained calls to reduce. Let’s see it in action first, and then review.
const R = require('ramda');
// final mapReducer/filterReducer functions
const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
const transformedElement = mapFunction(nextElement);
transformedList = combiner(transformedList, transformedElement);
return transformedList;
}
const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
if (predicate(nextElement)) {
filteredElements = combiner(filteredElements, nextElement);
}
return filteredElements;
}
// push element to list, and return updated list
const pushCombiner = (list, element) => {
list.push(element);
return list;
}
// filter's predicate function
const removeNamesStartingWithA = user => {
return user.name[0].toLowerCase() != 'a'
}
// map's transformation function
const createUserInfoString = user => {
const { name, email } = user
return `${name}'s email address is: ${email}.`
}
// use composition to create a chain of functions for fusion (!)
const reductionChain = R.compose(
filterReducer(removeNamesStartingWithA),
mapReducer(createUserInfoString)
)
users
.reduce(reductionChain(pushCombiner), [])
We can go one step further by implementing a helper function.
const transduce = (input, initialAccumulator, combiner, reducers) => {
const reductionChain = R.compose(...reducers);
return input.reduce(reductionChain(combiner), initialAccumulator)
}
const result = transduce(users, [], pushCombiner, [
filterReducer(removeNamesStartingWithA),
mapReducer(createUserInfoString)
]);
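With the sample users list from earlier (neither name starts with ‘a’, so both pass the filter), result comes out as:

console.log(result);
// [ "Leanne Graham's email address is: Sincere@april.biz.",
//   "Ervin Howell's email address is: Shanna@melissa.tv." ]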
There are more solutions to almost any problem than anyone could ever enumerate; the more of them you meet, the clearer you’ll think about your own, and the more fun you’ll have doing so.
I hope meeting fusion and transduction has piqued your interest, helped you think more clearly, and, ambitious as it is, been at least a little bit fun.