Using Monads for Control: Maybe it's worth a look?

So, after our last installment, describing the theory of monads, and the previous posts, which focused on representing things like state and I/O, I thought it was worth taking a moment to look at a different kind of thing that can be done with monads. We so often think of them as being state wrappers; and yet, that's only really a part of what we can get from them. Monads are ways of tying together almost anything that involves sequences.

In previous parts of this tutorial, we've seen the Maybe type. It's a useful type for all sorts of situations where there might or might not be a value. For example, a dictionary lookup function would often be written something like:

lookup :: (Ord a) => Dictionary a b -> a -> Maybe b

You might expect (Ord a) => Dictionary a b -> a -> b, meaning a function from a dictionary and a key of type a to a value of type b. But that's not a great type for a lookup function, because what would it return when the key isn't in the dictionary? Maybe gives us a way of handling that: if the key is present, we return Just the associated value; otherwise, we return Nothing.
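
The standard Prelude's lookup on association lists (with the arguments in the other order) behaves exactly this way, which may help make the idea concrete:

Prelude> lookup "b" [("a",1),("b",2)]
Just 2
Prelude> lookup "c" [("a",1),("b",2)]
Nothing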

The problem with Maybe is that any time you use it, you need to wrap things up in pattern matches: check whether a Nothing value was returned, and extract the value if a Just was returned. That gets particularly messy if you have a sequence of things to be looked up. For example, imagine that we're writing a program for the police. They've got a license plate number, and they'd like to get the phone number of the person who owns the car with that plate. But the data is in different places: there's a list of license plate numbers associated with owners, and then there's the telephone book, which lists people's names and phone numbers. So here are some trivial implementations of the basic types and lookup functions:


>data LicensePlate = Plate String deriving (Eq,Show)
>
>getPlateOwner :: [(LicensePlate,String)] -> LicensePlate -> Maybe String
>getPlateOwner ((p, n):plates) plate 
>                  | p == plate = Just n
>                  | otherwise = getPlateOwner plates plate
>getPlateOwner [] _ = Nothing
>
>data PhoneNumber = Phone String String String deriving (Eq,Show)
>         -- area code, prefix, number
>getPhoneNumberForName :: [(PhoneNumber,String)] -> String -> Maybe PhoneNumber
>getPhoneNumberForName ((num,name):phonebook) person 
>                  | name==person = Just num
>                  | otherwise = getPhoneNumberForName phonebook person
>getPhoneNumberForName [] _ = Nothing

Suppose we had some databases for that:


>plateDB = [(Plate "abc", "Mark"),(Plate "h1a j2b", "Joe Random"), (Plate "u2k 5u1", "Jane Random")]
>
>phoneDB = [(Phone "321" "123" "3456", "Joe Random"), (Phone "345" "657" "3485", "Mark"), (Phone "588" "745" "1902", "Foo Bar")]
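
Before chaining anything, here's what the individual lookups give us on those databases (an illustrative interpreter session; the prompt will vary):

*Main> getPlateOwner plateDB (Plate "abc")
Just "Mark"
*Main> getPhoneNumberForName phoneDB "Mark"
Just (Phone "345" "657" "3485")
*Main> getPlateOwner plateDB (Plate "xyz")
Nothing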

Now, suppose we wanted to write a lookup function from plate to phone number. The naive way would be:


>getPhoneForPlate :: [(LicensePlate,String)] ->  [(PhoneNumber,String)] -> LicensePlate -> Maybe PhoneNumber
>getPhoneForPlate ldb pdb lic =
>   let person = getPlateOwner ldb lic
>   in case person of
>         Just name -> getPhoneNumberForName pdb name
>         Nothing -> Nothing
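
Just to see how this scales, here's a rough sketch of a third layer written in the same style. (getAddressForPhone and its address-book argument are made up purely for illustration; they aren't defined anywhere above.)

getAddressForPlate ldb pdb adb lic =
   let person = getPlateOwner ldb lic
   in case person of
        Nothing   -> Nothing
        Just name ->
           case getPhoneNumberForName pdb name of
              Nothing    -> Nothing
              Just phone -> getAddressForPhone adb phone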

As that sketch shows, every additional lookup means another let/case nested inside the previous "Just" branch. The more we use Maybe, the messier it gets. But Maybe is a monad! So we could use:


>mGetPhoneForPlate ::  [(LicensePlate,String)] -> [(PhoneNumber,String)] -> LicensePlate -> Maybe PhoneNumber
>mGetPhoneForPlate ldb pdb lic =
>    do
>       name <- getPlateOwner ldb lic
>       phone <- getPhoneNumberForName pdb name
>       return phone

What happened here is that Maybe as a monad gives us a great way of chaining. If any step in the chain returns "Nothing", the entire sequence stops right there and evaluates to "Nothing". If a step returns "Just x", the x is unwrapped and bound to the name on the left of the "<-", just as if there were no "Just" at all. Suddenly chains of Maybe lookups aren't a problem anymore.
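
Here's how that plays out on the sample databases, as an illustrative interpreter session:

*Main> mGetPhoneForPlate plateDB phoneDB (Plate "abc")
Just (Phone "345" "657" "3485")
*Main> mGetPhoneForPlate plateDB phoneDB (Plate "u2k 5u1")
Nothing
*Main> mGetPhoneForPlate plateDB phoneDB (Plate "xyz")
Nothing

The second call fails at the phone-book step (Jane Random owns that plate, but has no listed number), the third fails at the plate lookup, and in both cases the whole chain collapses to Nothing.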

An important thing to notice here is that the monad is being used as a control structure. We've got a sequence of operations chained together in a monad, and when we evaluate that sequence, the implementation of the bind operator that chains the steps together is what actually does the control-flow management. Let's take a quick look at the internals of Maybe to see how that's done:

instance Monad Maybe where
    return         = Just
    fail _         = Nothing
    Nothing  >>= f = Nothing
    (Just x) >>= f = f x

Pretty simple, right? return really just wraps the value in a "Just". If there's an error along the way, "fail" ignores its error message and makes the whole thing evaluate to "Nothing". And for chaining, "Nothing" chained with anything else results in "Nothing"; "(Just x)" chained with something unwraps the x and passes it to the next step in the chain. The control flow is captured in the pattern match between "Nothing" and "Just x" in the instance declaration of the ">>=" chaining operator.
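
If you want to see the plumbing without the syntactic sugar, here's roughly what the do-block in mGetPhoneForPlate expands to; this version compiles against the definitions above and behaves identically:

>mGetPhoneForPlate' :: [(LicensePlate,String)] -> [(PhoneNumber,String)] -> LicensePlate -> Maybe PhoneNumber
>mGetPhoneForPlate' ldb pdb lic =
>    getPlateOwner ldb lic >>= \name ->
>    getPhoneNumberForName pdb name >>= \phone ->
>    return phone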

This basic trick, of using the chaining operator to insert control flow into monadic sequences, is an incredibly useful technique, and it's used by a variety of really cool monads. Some examples that we'll look at in later installments are monads for backtracking computation (like Prolog); for non-deterministic computation; and for a remarkably elegant way of building parsers.

(In fact, if you're not very lucky, and I have time over the next couple of days to finish it, on Friday I'll post one of my own pathological programming languages, which has a parser implemented using the parsing monad. It's a language based on Post's tag systems, which form the most frustratingly simple Turing-equivalent computing system that I know of.)


Along these lines, a nice thing to look at is the Monad instance declaration for Either String. (That behaves very much like Maybe does as a Monad, except that you can get the failure string out later)
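
Roughly, its behaviour mirrors the Maybe instance above, with Left playing the role of Nothing and carrying an error string along; a sketch (exactly where and how this instance is defined varies between library versions):

instance Monad (Either String) where
    return          = Right
    Left err  >>= _ = Left err
    Right x   >>= f = f x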

There are also other uses, such as [] as a Monad, which can be used for things similar to list comprehensions:

Hugs> do a <- [2,3,5,7,11,13,17]; b <- [55,63,42,21]; if b `mod` a == 0 then return (a,b) else fail ""
[(2,42),(3,63),(3,42),(3,21),(5,55),(7,63),(7,42),(7,21),(11,55)]
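
That's essentially what a list comprehension expresses; the same query written as a comprehension gives the same result:

Hugs> [(a,b) | a <- [2,3,5,7,11,13,17], b <- [55,63,42,21], b `mod` a == 0]
[(2,42),(3,63),(3,42),(3,21),(5,55),(7,63),(7,42),(7,21),(11,55)]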

Then there's the wacky ((->) a) Monad, which I think is not there for the do syntax at all, but rather for its effect on higher-level Monad functions like ap:

Hugs> putStrLn$ap(++)show"putStrLn$ap(++)show"
putStrLn$ap(++)show"putStrLn$ap(++)show"
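
To unpack that a little: in the ((->) a) monad, ap f g is the function \x -> f x (g x) (assuming the usual definition of ap), so

ap (++) show s  ==  s ++ show s

and with s being the quoted string, the result is the unquoted prefix followed by its show-quoted copy, which reconstructs the whole expression.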

What a strange coincidence! I wrote about a related monad this morning.

Generally speaking, anything which is a structured set will yield a monad. Sets with more than one element will form a Cartesian product when joined (which can also look like "backtracking", depending on how you squint at it).
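
For example, joining two small lists in the list monad gives every pairing, i.e. the Cartesian product:

Hugs> do x <- [1,2]; y <- "ab"; return (x,y)
[(1,'a'),(1,'b'),(2,'a'),(2,'b')]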

Of course, for a monad which is strictly sequential, you also have the option of using the corresponding comonad. But that's another topic...