Video #46: The Many Faces of Flat‑Map: Part 5
Episode: Video #46 Date: Feb 4, 2019 Access: Members Only 🔒 URL: https://www.pointfree.co/episodes/ep46-the-many-faces-of-flat-map-part-5

Description
Finishing our 3-part answer to the all-important question “what’s the point?”, we finally show that, standing on the foundation of our understanding of map, zip and flatMap, we can now ask and concisely answer very complex questions about the nature of these operations.
Video
Cloudflare Stream video ID: 8e11560a31588ee55dc2d7852a63ec1d Local file: video_46_the-many-faces-of-flat-map-part-5.mp4 *(download with --video 46)*
References
- Railway Oriented Programming — error handling in functional languages
- Monad (functional programming)
- Sample code: 0046-the-many-faces-of-flatmap-pt5
Transcript
— 0:05
So let’s talk about the third and final part of “what’s the point?”. We’ve now spent a bunch of time getting comfortable with the idea of flatMap , justifying why we should use it, and why we should build an intuition for it. Once we did that, we convinced ourselves that the signature of flatMap and its friends is so important that we’re going to defend it from anyone who may disparage it: you shouldn’t change its signature, it’s there for a reason.
— 0:36
The reason we’ve done all this work is that now we can build off that foundation and ask very complex questions: questions that may have been seemingly intractable had we not taken this deep journey of discovery.
— 1:07
We’re going to look at composition of functions when it comes to flatMap . We saw that map had a wonderful property: the map of the compositions is the same as the composition of the map s. What that meant was that if you have a big chain of map s, you can collapse all that into a single map and call it once with the composition of all the units of work. Is there a version of this for flatMap ? There is!
— 1:33
Next, we know that flatMap can flatten nested containers, like optionals of optionals and results of results, but what about nested containers of different types, like an array of results, or an array of parallels, etc.? Is there anything we can discover with those kinds of nested containers?
— 1:58
Finally, what is the precise relationship between map , zip , and flatMap ? Can some operations be derived from others, what does it say about types that can do so, and is there some kind of hierarchy between these things?
— 2:16
These are some pretty complicated questions that we want to ask and we can finally answer them!

Function composition and flatMap
— 2:23
Let’s start with a simpler one: composition. We’ve gotten very familiar with function composition in this series, so what does function composition look like when it comes to flatMap ?
— 2:37
We know that function composition is a great way to reuse code, for if you have a function from A to B and a function from B to C then you can just compose them to get a whole new function from A to C .
— 2:49
Here’s a higher-order function that captures that idea:

    func pipe<A, B, C>(
      _ lhs: @escaping (A) -> B,
      _ rhs: @escaping (B) -> C
    ) -> (A) -> C {
      return { a in rhs(lhs(a)) }
    }
— 3:00
And we can take it for a spin:

    pipe({ $0 + 1 }, { $0 * $0 }) // (Int) -> Int

Here we’ve created a brand new function that first increments and then squares an integer.

    pipe({ $0 + 1 }, String.init) // (Int) -> String

And here we’ve created a brand new function that first increments an integer and then returns its string representation.
— 3:31
We even have an infix operator version of this so that we can compose any number of functions together:

    let f = { $0 + 1 } >>> { $0 * $0 } >>> { $0 + 1 } // (Int) -> Int
— 4:08
Infix operators with associativity are what allow us to do this. We can’t recover this with plain functions until Swift gets variadic generics, which would allow you to write functions with any number of generic parameters.
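To see that associativity in action, here is a minimal, self-contained sketch. The `ForwardComposition` precedence group and the local `>>>` declaration are assumptions for illustration; the episode’s actual operator lives in the Point-Free code base.

```swift
// Hypothetical local declaration of a forward-compose operator.
precedencegroup ForwardComposition {
  associativity: left
  higherThan: AssignmentPrecedence
}
infix operator >>>: ForwardComposition

func >>> <A, B, C>(
  _ f: @escaping (A) -> B,
  _ g: @escaping (B) -> C
) -> (A) -> C {
  return { a in g(f(a)) }
}

let incr = { $0 + 1 }
let square = { $0 * $0 }

// Because >>> is associative, the chain needs no parentheses.
let f = incr >>> square >>> incr
// f(2) == 10: (2 + 1) squared, plus 1
```

Left associativity means the chain is parsed as `(incr >>> square) >>> incr`, but grouping it the other way would produce the very same function.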
— 4:20
It turns out that the functions that we use when flat-mapping do not get to participate in this nice composition and that’s a bummer.
— 4:28
For example, what about this function?

    _ = { try? Data.init(contentsOf: $0) } // (URL) -> Data?

And this function?

    _ = { try? JSONDecoder().decode(User.self, from: $0) } // (Data) -> User?
— 4:47
They can’t be composed together with pipe or >>> because their types do not match up. They are very close, but one returns an optional Data and the other wants an honest Data .
— 5:03
However, we can slightly tweak pipe and make it play nicely with these functions.

    func chain<A, B, C>(
      _ lhs: @escaping (A) -> B?,
      _ rhs: @escaping (B) -> C?
    ) -> (A) -> C? {
      return { a in lhs(a).flatMap(rhs) }
    }

By using flatMap under the hood, we’re able to compose these two functions together. Now, map and flatMap have different names, so we don’t want to reuse pipe here. Let’s call it chain instead.
— 6:16
Now we can chain together those two operations that can fail into nil :

    chain(
      { try? Data.init(contentsOf: $0) },
      { try? JSONDecoder().decode(User.self, from: $0) }
    ) // (URL) -> User?
— 6:39
We could even take it a step further by pipe -ing the URL initializer onto the front.
— 7:02
Just like we saw that pipe is limited by the number of arguments it takes, chain is similarly limited. However, we can also fix it with an associative infix operator to allow us to chain any number of things together. We even introduced this operator in our second episode on side effects, because it was the way we could restore composition in functions that described side effects. We called it the fish operator:

    func >=> <A, B, C>(
      _ f: @escaping (A) -> B?,
      _ g: @escaping (B) -> C?
    ) -> ((A) -> C?) {
      return { a in f(a).flatMap(g) }
    }
— 7:30
And now we can completely flatten our earlier code.

    URL.init(fileURLWithPath:)
      >>> { try? Data.init(contentsOf: $0) }
      >=> { try? JSONDecoder().decode(User.self, from: $0) }
— 7:46
This is reading pretty nicely now, and just as we saw in the past how map and flatMap can at-a-glance annotate a line as having a pure transformation or as having to do some work, these operators at-a-glance annotate a line as taking the result of a pure function or taking the result of a function that did some work.
— 8:03
In the past we’ve talked about how function composition distributes over map ; well, it turns out that “fishy” composition distributes over flatMap . For example, earlier we wrote some code that had two flatMap s in a row:

    Bundle.main.path(forResource: "user", ofType: "json")
      .map(URL.init(fileURLWithPath:))
      .flatMap { try? Data.init(contentsOf: $0) }
      .flatMap { try? JSONDecoder().decode(User.self, from: $0) }

Instead, we could have used >=> to compose each of those flatMap s together.

    Bundle.main.path(forResource: "user", ofType: "json")
      .map(URL.init(fileURLWithPath:))
      .flatMap(
        { try? Data.init(contentsOf: $0) }
          >=> { try? JSONDecoder().decode(User.self, from: $0) }
      )
— 8:29
What we’re seeing is that the flatMap of the “fishy” compositions is the composition of the flatMap s.
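Here is a small, self-contained check of that law for optionals, with the fish operator re-declared locally. The `parseInt` and `nonZero` helpers are our own illustrative units of work, not from the episode.

```swift
// Local declaration of the fish operator for optional-returning functions.
infix operator >=>: MultiplicationPrecedence

func >=> <A, B, C>(
  _ f: @escaping (A) -> B?,
  _ g: @escaping (B) -> C?
) -> (A) -> C? {
  return { a in f(a).flatMap(g) }
}

// Two small failable units of work.
let parseInt: (String) -> Int? = { Int($0) }
let nonZero: (Int) -> Int? = { $0 == 0 ? nil : $0 }

let input: String? = "42"

// flatMap of the "fishy" composition…
let lhs = input.flatMap(parseInt >=> nonZero)
// …is the composition of the flatMaps.
let rhs = input.flatMap(parseInt).flatMap(nonZero)
// Both are .some(42), and both short-circuit to nil the same way.
```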
— 8:35
All of these operations are so lightweight and easy to move about. We could even describe the unit of work as a single transformation through function composition by precomposing our pure unit of work at the beginning.

    Bundle.main.path(forResource: "user", ofType: "json")
      .flatMap(
        URL.init(fileURLWithPath:)
          >>> { try? Data.init(contentsOf: $0) }
          >=> { try? JSONDecoder().decode(User.self, from: $0) }
      )
— 8:51
This is the exact same unit of work we composed earlier, which means we could have given it a name and passed it to flatMap directly.

    let loadUser =
      URL.init(fileURLWithPath:)
        >>> { try? Data.init(contentsOf: $0) }
        >=> { try? JSONDecoder().decode(User.self, from: $0) }

    Bundle.main.path(forResource: "user", ofType: "json")
      .flatMap(loadUser)
— 9:04
It’s nice that we can mix and match these units of work in such a lightweight manner, and it’s just a matter of using a different operator to thread operations that may fail.
— 9:27
Way back when we did our side effects episode we wanted to let everyone know that what we were describing is simply flatMap , but there was a lot of work to be done before we could make it seem natural.
— 9:41
And all types that support flatMap support this form of composition. We can define chain and >=> for Result , Func , Parallel , etc. In our episode on side effects we saw that >=> restored function composition for functions that describe side effects, and here we’re seeing the same thing for functions that go into optionals, but it works with everything.

Nested containers
— 10:03
Understanding that flatMap helps us solve a nested container problem, we might ask what other types of nested containers could we consider and get insight into. Turns out this is a pretty big can of worms. There are lots of different types of nested containers that we haven’t even thought of yet.
— 10:32
For example, consider nested containers where each container is of a different type:

    Parallel<Result<A, E>>
— 10:37
This is a very important type because it is often the case that async values can return failures, like a network request.
— 10:51
Can we define map , zip and flatMap on this nested structure? Let’s start with the signature of map .

    func map<A, B, E>(
      _ f: @escaping (A) -> B
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
    }

This function asks: given a function from A to B can we lift it up to functions from Parallel<Result<A, E>> to Parallel<Result<B, E>> ?
— 11:09
So let’s try:

    func map<A, B, E>(
      _ f: @escaping (A) -> B
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
      return { parallelResultA in
        parallelResultA.map { resultA in
          resultA.map { a in f(a) }
        }
      }
    }
— 11:46
Weirdly, that satisfied the compiler! To define map on this nested type, we just had to nest the map s. At the beginning of this series on flatMap , this kind of nesting was exactly how we approached nested containers: if we had an array of arrays or an optional of an optional, we tried to use map , which led to more nesting. But in our case here, it’s exactly what we need, because we’re not trying to flatten anything, we’re just trying to dive through each container to transform a value.
— 12:26
What about zip ?

    func zip<A, B, E>(
      _ lhs: Parallel<Result<A, E>>,
      _ rhs: Parallel<Result<B, E>>
    ) -> Parallel<Result<(A, B), E>> {
    }

Given a parallel result of A and a parallel result of B , can we construct a parallel result of (A, B) ?
— 12:43
Let’s see what it takes.

    func zip<A, B, E>(
      _ lhs: Parallel<Result<A, E>>,
      _ rhs: Parallel<Result<B, E>>
    ) -> Parallel<Result<(A, B), E>> {
      return zip(lhs, rhs).map { resultA, resultB in
        zip(resultA, resultB)
      }
    }
— 13:18
Again, this was pretty straightforward to implement. In order to zip a nested container, we needed to nest our zip s. We first zip the outside, then map into it to zip the inside.
— 13:32
We can even simplify this by using zip(with:) .

    func zip<A, B, E>(
      _ lhs: Parallel<Result<A, E>>,
      _ rhs: Parallel<Result<B, E>>
    ) -> Parallel<Result<(A, B), E>> {
      return zip(with: zip)(lhs, rhs)
    }
— 13:50
What about flatMap ? Let’s start with the signature:

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Parallel<Result<B, E>>
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
    }

Can we lift functions from A s into parallel results of B into functions from parallel results of A into parallel results of B ?
— 14:12
We can start by calling flatMap on our parallel result.

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Parallel<Result<B, E>>
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
      return { parallelResultA in
        parallelResultA.flatMap { resultA in
        }
      }
    }

And then maybe, as we did with map and zip , we should further flatMap on our inner result.

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Parallel<Result<B, E>>
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
      return { parallelResultA in
        parallelResultA.flatMap { resultA in
          resultA.flatMap { a in
            f(a) // Parallel<Result<B, E>>
          }
        }
      }
    }

We were able to get what we want to return from f , but not where we want to return it. In order to flatMap a result, we need to be returning a Result from that inner flatMap , not a Parallel<Result> .
— 15:28
Instead, we know we need to return a new parallel inside the outer flatMap , so let’s do so manually.

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Parallel<Result<B, E>>
    ) -> (Parallel<Result<A, E>>) -> Parallel<Result<B, E>> {
      return { parallelResultA in
        parallelResultA.flatMap { resultA in
          Parallel<Result<B, E>> { callback in
            switch resultA {
            case let .success(a):
              f(a).run { resultB in callback(resultB) }
            case let .failure(error):
              callback(.failure(error))
            }
          }
        }
      }
    }
— 16:34
It’s compiling, but it’s remarkably different from the way we tackled nested map and nested zip . We’ve needed to drop down to very domain-specific knowledge of both Parallel and Result : we needed to know how to instantiate a Parallel with a callback, and we needed to know how to switch on a result to consider each case separately.
— 17:03
Meanwhile, map and zip didn’t need to have any knowledge of Parallel or Result other than that they have a map operation and a zip operation. We could even replace Parallel and Result with any other types that have map and zip and the bodies of the functions wouldn’t have to change.
— 17:21
This is a universal truth: if you have two generic containers and both support a map and a zip operation, when you nest them, you can always define a map and a zip on the nested containers: you map or zip on the outer container, and then you map or zip on the inner container. You don’t need to know anything about the containers themselves except for their ability to call map and zip .
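A concrete sketch of that universal truth, using two standard-library containers — Array on the outside and Optional on the inside. The `zipOptional`, `mapNested`, and `zipNested` names are our own for this sketch; the optional zip is re-declared because the standard library doesn’t ship one.

```swift
// zip on Optional, as defined earlier in the series (named zipOptional
// here to avoid shadowing the standard library's zip on sequences).
func zipOptional<A, B>(_ a: A?, _ b: B?) -> (A, B)? {
  guard let a = a, let b = b else { return nil }
  return (a, b)
}

// map on the nested container [A?] is just the nesting of the maps:
// map over the array, then map over each optional.
func mapNested<A, B>(_ f: @escaping (A) -> B) -> ([A?]) -> [B?] {
  return { xs in xs.map { x in x.map(f) } }
}

// zip on the nested container is the nesting of the zips:
// zip the outer arrays pairwise, then zip the inner optionals.
func zipNested<A, B>(_ lhs: [A?], _ rhs: [B?]) -> [(A, B)?] {
  return zip(lhs, rhs).map { a, b in zipOptional(a, b) }
}

let double: (Int) -> Int = { $0 * 2 }
let mapped = mapNested(double)([1, nil, 3])   // [2, nil, 6]
let zipped = zipNested([1, nil], ["a", "b"])  // [(1, "a"), nil]
```

Note that neither definition needed to know anything about Array or Optional beyond their map and zip.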
— 17:49
But this is just not true for flatMap . In order to define flatMap on a nested container you need to have intimate knowledge of what those containers are. In our case of Parallel and Result we needed information about them in order to define flatMap .
— 18:07
So what we’re seeing is that generic containers compose very nicely when it comes to map and zip , but they do not compose nicely when it comes to flatMap . And this is a question we may not have even considered if we hadn’t spent so much time with flatMap to understand these signatures and shapes.

Map from flatMap
— 18:29
And finally, now that we are all so comfortable with map , zip and flatMap , let’s determine the relationship between these operations. We’ve said that zip generalizes map in that it allows us to map with functions that take multiple arguments, and we’ve said that flatMap generalizes both map and zip in that it can do things neither one of those operations can do. If we only had flatMap at our disposal, could we implement functions with the map and zip signatures?
— 19:07
Forget for a moment that we have map or zip . All we have is flatMap . Let’s try to redefine those operations in terms of flatMap .
— 19:17
Let’s start with optionals. Let’s define a newMap operation with the same signature as map .

    extension Optional {
      func newMap<NewWrapped>(
        _ f: (Wrapped) -> NewWrapped
      ) -> NewWrapped? {
      }
    }
— 19:42
Can we define this in terms of flatMap ?

    extension Optional {
      func newMap<NewWrapped>(
        _ f: (Wrapped) -> NewWrapped
      ) -> NewWrapped? {
        return self.flatMap { Optional<NewWrapped>.some(f($0)) }
      }
    }
— 20:15
We can! We merely re-wrap the result of the transformation.
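A quick, self-contained check (re-declaring newMap as above) that this definition agrees with the standard library’s Optional.map:

```swift
extension Optional {
  // map defined only in terms of flatMap, as in the episode.
  func newMap<NewWrapped>(
    _ f: (Wrapped) -> NewWrapped
  ) -> NewWrapped? {
    return self.flatMap { Optional<NewWrapped>.some(f($0)) }
  }
}

let x: Int? = 3
let y: Int? = nil

// newMap agrees with map on both the some and none cases.
x.newMap { $0 + 1 } // 4
x.map { $0 + 1 }    // 4
y.newMap { $0 + 1 } // nil
```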
— 20:19
Let’s do the same with arrays.

    extension Array {
      func newMap<NewElement>(
        _ f: (Element) -> NewElement
      ) -> [NewElement] {
        return self.flatMap { [f($0)] }
      }
    }
— 20:52
Sure enough, we can define map on arrays in terms of flatMap .
— 20:57
We’re starting to see a pattern, so let’s see how Result fares.

    extension Result {
      func newMap<B>(
        _ f: (A) -> B
      ) -> Result<B, E> {
        return self.flatMap { .success(f($0)) }
      }
    }
— 21:26
Alright. How about Validated ?

    extension Validated {
      func newMap<B>(
        _ f: (A) -> B
      ) -> Validated<B, E> {
        return self.flatMap { .valid(f($0)) }
      }
    }

We just needed to change the type name and the case name.
— 21:36
What about Func ?

    extension Func {
      func newMap<C>(
        _ f: @escaping (B) -> C
      ) -> Func<A, C> {
        return self.flatMap { b in
          Func { _ in f(b) }
        }
      }
    }
— 22:04
One more to go! Let’s do Parallel .

    extension Parallel {
      func newMap<B>(
        _ f: (A) -> B
      ) -> Parallel<B> {
        return self.flatMap { a in
          Parallel<B> { callback in
            callback(f(a))
          }
        }
      }
    }
— 22:21
So it seems like it’s very easy to define map as long as we have flatMap and a way to lift a value into the appropriate context: a value lifts into an optional, or lifts into an array of one element, or lifts into a successful result, or lifts into a constant function, or lifts into a parallel that executes immediately.
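That “way to lift a value” is often called pure (or unit). Here is a minimal sketch of the uniform recipe for two of these containers; the `pureOptional`, `pureArray`, and free-function `newMap` names are our own, not from the episode.

```swift
// pure for Optional: lift a value into .some.
func pureOptional<A>(_ a: A) -> A? { .some(a) }

// pure for Array: lift a value into a one-element array.
func pureArray<A>(_ a: A) -> [A] { [a] }

// map recovered from flatMap plus pure, the same shape as the
// episode's newMap: transform, lift, then flatten.
func newMap<A, B>(_ xs: [A], _ f: (A) -> B) -> [B] {
  return xs.flatMap { pureArray(f($0)) }
}

newMap([1, 2, 3]) { $0 * 10 } // [10, 20, 30]
```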
— 22:47
And remember that map on a generic type has at most one implementation satisfying a simple property, which means this newMap in terms of flatMap is exactly the map that has been defined on these types before.

Zip from flatMap: Optional and Array
— 22:58
Now what about zip ? We can recover something with the same signature using just flatMap :

    func newZip<A, B>(_ a: A?, _ b: B?) -> (A, B)? {
      return a.flatMap { a in
        b.flatMap { b in
          Optional.some((a, b))
        }
      }
    }
— 23:23
You can check that this behaves exactly the same as the zip we have previously defined on optionals. And that kind of makes sense, because zip on optionals was just unwrapping each optional in order until we encountered a nil , in which case we short-circuit the whole operation and just return nil . That sounds a lot like the sequencing operation that flatMap is so famous for.
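A quick, self-contained check of that claim, with the optional zip re-declared from earlier in the series:

```swift
// zip on optionals, as defined earlier in the series.
func zip<A, B>(_ a: A?, _ b: B?) -> (A, B)? {
  guard let a = a, let b = b else { return nil }
  return (a, b)
}

// zip recovered from flatMap alone.
func newZip<A, B>(_ a: A?, _ b: B?) -> (A, B)? {
  return a.flatMap { a in
    b.flatMap { b in
      Optional.some((a, b))
    }
  }
}

// The two agree, including the nil short-circuit.
newZip(Optional(1), Optional("a"))      // (1, "a")
newZip(Optional(1), Optional<String>.none) // nil
```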
— 23:44
So, so far it seems like map and zip didn’t give us anything new! Seems like we could have just defined flatMap and used it instead of those other two, right?
— 23:52
Well, let’s confirm this by doing the same for arrays. All we need to do is swap out the containers:

    func newZip<A, B>(_ a: [A], _ b: [B]) -> [(A, B)] {
      return a.flatMap { a in
        b.flatMap { b in
          [(a, b)]
        }
      }
    }
— 24:02
Let’s take it for a spin.

    newZip(["a", "b"], [1, 2])
    // [("a", 1), ("a", 2), ("b", 1), ("b", 2)]
— 24:07
Looks like we got something quite different from the normal zip on arrays. In fact, we seem to have recovered the combos function we started with in our flatMap series when we were trying to justify the need for flatMap. And in fact it is impossible to recover the zip we know and love on arrays using just flatMap , so it is not true that flatMap gives us everything we need. The reason it is impossible is straightforward: the zip on arrays that we are used to takes two independent arrays at the same time and makes a pairwise correspondence between their elements. However, flatMap has no such ability: it can only spawn new arrays from values in the first array and then flatten them into a single array. It is absolutely impossible to recover the zip we know and love from flatMap ; it’s just a structure that exists outside the reach of flatMap .

Zip from flatMap: Result and Validated
— 25:06
Let’s move on to Result :

    func newZip<A, B, E>(
      _ a: Result<A, E>,
      _ b: Result<B, E>
    ) -> Result<(A, B), E> {
      return a.flatMap { a in
        b.flatMap { b in
          Result.success((a, b))
        }
      }
    }
— 25:21
You can check that this function behaves the same as the other zip on Result , and so zip on results doesn’t give anything interesting beyond what flatMap gives us.
— 25:31
What about Validated ?

    func newZip<A, B, E>(
      _ a: Validated<A, E>,
      _ b: Validated<B, E>
    ) -> Validated<(A, B), E> {
      return a.flatMap { a in
        b.flatMap { b in
          Validated.valid((a, b))
        }
      }
    }
— 25:36
The implementation is almost identical to Result , but unlike Result , this newZip is not like the zip we defined before. It’ll behave the same for two valid values:

    newZip(Validated<Int, String>.valid(1), .valid("Two"))
    // .valid((1, "Two"))
— 25:48
It’ll even behave the same for a single invalid value:

    newZip(
      Validated<Int, String>.invalid(NonEmptyArray("Something went wrong.")),
      .valid(2)
    )
    // .invalid(NonEmpty(["Something went wrong."]))
— 25:55
The moment it breaks down is when we try to zip up two invalid values:

    newZip(
      Validated<Int, String>.invalid(
        NonEmptyArray("Something went wrong.")
      ),
      Validated<Int, String>.invalid(
        NonEmptyArray("Something else went wrong.")
      )
    )
    // .invalid(NonEmpty(["Something went wrong."]))
— 25:59
This newZip does not work like the zip we defined earlier. It’s discarding the second error.
— 26:07
And just like arrays, we seem to have gotten a new zip on Validated that isn’t as powerful as the old one. The regular zip on Validated would accumulate the errors so that we got all of the errors, not just the first error like we do for Result . And it is actually impossible to define the zip we know and love on Validated using only flatMap . Again, the reason is straightforward: zip on Validated had the power of taking two independent validated values and combining them, which allowed us to combine their errors if both were invalid. However, when we sequence validation computations, we only have access to the valid data in order to produce a new Validated value, which means we lose the error information.
— 26:48
It’s pretty fascinating to see that the zip that is induced by flatMap is sometimes weaker than the zip we define from first principles. In the case of Result , the zip from first principles is the same one we get from flatMap , but with Validated it’s much different: it no longer accumulates multiple errors, and so we’re losing information over time.

Zip from flatMap: Func and Parallel
— 27:13
Let’s see what happens with Func and Parallel .
— 27:20
We can again take an existing flatMap -induced zip and make a few small changes:

    func newZip<A, B, C>(
      _ a: Func<A, B>,
      _ b: Func<A, C>
    ) -> Func<A, (B, C)> {
      return a.flatMap { a in
        b.flatMap { b in
          Func { _ in (a, b) }
        }
      }
    }
— 27:56
One can show that this newZip is exactly the same as the zip we’ve previously defined on Func . So in this case zip on Func isn’t buying us anything new that flatMap couldn’t already accomplish.
— 28:11
And finally, what about Parallel ?

    func newZip<A, B>(
      _ lhs: Parallel<A>,
      _ rhs: Parallel<B>
    ) -> Parallel<(A, B)> {
      return lhs.flatMap { a in
        rhs.flatMap { b in
          Parallel { callback in callback((a, b)) }
        }
      }
    }
— 28:40
Let’s take it for a spin to see what it’s given us.

    newZip(delay(by: 2).map { 2 }, delay(by: 3).map { 3 }).run { value in
      print(value)
    }
    // Delaying line 675 by 2.0
    // Finished line 675
    // Delaying line 675 by 3.0
    // Finished line 675
    // (2, 3)
— 29:10
What we just witnessed is that zipping two delayed values caused us to wait 2 seconds before finishing, and then wait 3 seconds before finishing, and then ultimately producing two integers.
— 29:27
That is not how zip works on Parallel as we previously defined it. zip on Parallel had extra powers: it allowed us to run all of these Parallel s at the same time (in parallel), which was very powerful and could have a big impact on the performance of our code, in that we can run a lot of asynchronous tasks in parallel.
— 29:53
And just like Array and Validated before it, we have come up with a new zip on Parallel that is not as powerful as the zip we defined previously. The other zip would run all the values in parallel and then collect the results, whereas this newZip runs the values in sequential order, and so it takes longer to run them even though there is no dependency between the values. And again, it is impossible to define the powerful zip that runs values in parallel using only flatMap . Due to the nature of flatMap being a sequencing operation, we cannot possibly run tasks in parallel; it’s just not made for that kind of behavior.
— 30:19
So we’ve come across yet another “exotic” zip . The zip s on Array , Validated , and Parallel are not induced by flatMap on those structures: they provide something new that cannot be recovered by flatMap alone.

The point
— 30:42
OK! That was a very, very long “what’s the point?”: we needed three whole episodes to get through it, but I think we came out on the other side with a lot of interesting things.
— 30:56
Being comfortable with the definitions and semantics of these functional operators, map , zip and flatMap , gives us a kind of “functional domain-specific language” for writing transformations on data types. We can take code that works on one type, like optionals, and pretty mechanically convert it to work on many types, like results and parallels.
— 31:10
Once we see this power we begin to understand why we shouldn’t be smudging the signatures of these operators, both in terms of making a signature a little different just to suit our needs at the time, like the flatMap overload did in Swift prior to 4.1, and in terms of renaming the operators carelessly, for we run the risk of destroying the intuitions we can share between types.
— 31:22
And finally, by being comfortable with what these operators represent, we could begin to ask really complicated questions. For example, it turns out that for any two containers that each have map and zip , we can naturally construct map and zip on the nesting of the containers. That process always works. However, the same isn’t true of flatMap : there is no general way to do that, which shows that even though flatMap has some powers that map and zip don’t have, the converse is also true.
— 31:35
We were also able to see that flatMap is powerful enough that it can induce map , and it can induce something zip -like. So once you have flatMap you get operations with those signatures automatically. However, the zip you get from flatMap is likely not the one you expect: that zip loses the ability to combine values in an independent manner, since it has the sequencing mechanism baked into it. And that is actually a good thing! Knowing that there are some zip s out there that cannot be induced by any flatMap shows how special zip is.
— 31:52
These are things that we could not have reasonably discussed before becoming comfortable with the idea of flatMap .
— 32:02
So, that’s the end of our series on flatMap , and I hope it helps you feel comfortable with the concept, the signature, and the intuition. There have actually been a number of episodes we’ve done where we wanted to define flatMap but we hadn’t yet properly sat down and analyzed its structure. So, with our series on monads complete, we are free to use this tool when necessary. Until next time!

References

Railway Oriented Programming — error handling in functional languages
Scott Wlaschin • Jun 4, 2014
This talk explains a nice metaphor to understand how flatMap unlocks stateless error handling.
> When you build real world applications, you are not always on the “happy path”. You must deal with validation, logging, network and service errors, and other annoyances. How do you manage all this within a functional paradigm, when you can’t use exceptions, or do early returns, and when you have no stateful data? This talk will demonstrate a common approach to this challenge, using a fun and easy-to-understand “railway oriented programming” analogy. You’ll come away with insight into a powerful technique that handles errors in an elegant way using a simple, self-documenting design.
https://vimeo.com/97344498

A Tale of Two Flat‑Maps
Brandon Williams & Stephen Celis • Mar 27, 2018
Up until Swift 4.1 there was an additional flatMap on sequences that we did not consider in this episode, but that’s because it doesn’t act quite like the normal flatMap . Swift ended up deprecating the overload, and we discuss why this happened in a previous episode:
> Swift 4.1 deprecated and renamed a particular overload of flatMap . What made this flatMap different from the others? We’ll explore this and how understanding that difference helps us explore generalizations of the operation to other structures and derive new, useful code!
https://www.pointfree.co/episodes/ep10-a-tale-of-two-flat-maps

Monad (functional programming)
Well, the cat’s out of the bag. For the past 5 episodes, while we’ve been talking about flatMap , we were really talking about something called “monads.” Swift cannot (yet) fully express the idea of monads, but we can still leverage the intuition of how they operate. This reference is to the Wikipedia page for monads, which is terse but concise.
https://en.wikipedia.org/wiki/Monad_(functional_programming)

Downloads

Sample code: 0046-the-many-faces-of-flatmap-pt5