The Snake Lemma in unflattering detail

“Proving the snake lemma is something that should not be done in public…” ~ Paolo Aluffi.

The Snake Lemma is a theorem about exact sequences of modules, and is an important tool in homological algebra. Almost every textbook that includes the Snake Lemma leaves it as an “easy” exercise. Atiyah–MacDonald is kind enough to construct the “snake” itself (Prop 2.10), but leaves all the other details to the reader. Mac Lane in his Categories does prove it, but for a general abelian category and not in the more pedestrian context of modules over a ring. It seems an actual written-down proof of this fact is quite hard to come by because, well, Aluffi is right: it ain’t pretty.

I get the feeling that the proof is straightforward to those who already have a lot of experience with diagram chasing; it took me a long time to solve this problem, and I struggled to think clearly about cokernels in particular. Recently, however, I watched part of a YouTube video on the lemma and saw how the host demonstrated exactness at one of the “easy” nodes. After this, I found the zen of the diagram chase and was able to complete the rest of the proof quite rapidly.

Here I will briefly recall the relevant definitions, and then give a complete proof in full gory detail, against the better judgment of Aluffi: even if it is not pretty, the information should be available somewhere. If you’ve not proved the snake lemma before, you should give it a good go yourself as an instructive introduction to diagram chasing, and then read my solution if you get stuck. You can always have a go at the five-lemma yourself afterwards. Because page space is not as big an issue on a website as in a print book, I have tried to include every sub-diagram I refer to, even if it seems a little excessive.

So, let R be a commutative ring. Recall that a sequence of homomorphisms of R-modules \dots \to M' \stackrel{f}{\to} M \stackrel{g}{\to} M'' \to \dots is called exact at M if \ker{g} = \im{f}. In particular, exactness of 0 \to M' \stackrel{f}{\to} M says that f is injective, and exactness of M \stackrel{g}{\to} M'' \to 0 says that g is surjective; we will use both facts freely. Recall also that the cokernel of a homomorphism f\colon M \to N is N / \im{f}.
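As a quick worked example (mine, not from any of the books mentioned), here is the standard short exact sequence of \mathbb{Z}-modules, with the exactness condition checked at each node:

```latex
% A short exact sequence of Z-modules (abelian groups):
\[
  0 \to \mathbb{Z} \stackrel{\cdot 2}{\to} \mathbb{Z} \to \mathbb{Z}/2\mathbb{Z} \to 0
\]
% Exact at the first Z:  ker(.2) = 0 = im(0 -> Z), i.e. multiplication by 2 is injective.
% Exact at the middle Z: ker(mod 2) = 2Z = im(.2).
% Exact at Z/2Z:         ker(Z/2Z -> 0) = Z/2Z = im(mod 2), i.e. reduction mod 2 is surjective.
```

Note how exactness at the two ends encodes injectivity and surjectivity of the outer maps.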

Theorem: The Snake Lemma.

Let

    \[\begin{array}{ccccccccc} 0 & \to & A & \stackrel{u}{\to} & B & \stackrel{v}{\to} & C & \to & 0 \\ & & \downarrow{\scriptstyle\alpha} & & \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} & & \\ 0 & \to & A' & \stackrel{u'}{\to} & B' & \stackrel{v'}{\to} & C' & \to & 0 \end{array}\]

be a commutative diagram of R-modules and homomorphisms with exact rows. Then this induces a long exact sequence

    \[0 \to \ker{\alpha} \stackrel{\wt{u}}{\to} \ker{\beta} \stackrel{\wt{v}}{\to} \ker{\gamma} \stackrel{\delta}{\to} \coker{\alpha} \stackrel{\wt{u'}}{\to} \coker{\beta} \stackrel{\wt{v'}}{\to} \coker{\gamma} \to 0.\]

This obtains for us a rather pretty diagram

[The snake diagram: the kernel row sits above the original diagram and the cokernel row below it, with \delta “snaking” from \ker{\gamma} at the end of the top row down to \coker{\alpha} at the start of the bottom row.]

The proof has several steps. We deal first with the “easy” steps. At all times, one should refer to the following enormous diagram, which is exact in every row and every column.

    \[\begin{array}{ccccccccc}
     & & 0 & & 0 & & 0 & & \\
     & & \downarrow & & \downarrow & & \downarrow & & \\
     & & \ker{\alpha} & \stackrel{\wt{u}}{\to} & \ker{\beta} & \stackrel{\wt{v}}{\to} & \ker{\gamma} & & \\
     & & \downarrow & & \downarrow & & \downarrow & & \\
    0 & \to & A & \stackrel{u}{\to} & B & \stackrel{v}{\to} & C & \to & 0 \\
     & & \downarrow{\scriptstyle\alpha} & & \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} & & \\
    0 & \to & A' & \stackrel{u'}{\to} & B' & \stackrel{v'}{\to} & C' & \to & 0 \\
     & & \downarrow & & \downarrow & & \downarrow & & \\
     & & \coker{\alpha} & \stackrel{\wt{u'}}{\to} & \coker{\beta} & \stackrel{\wt{v'}}{\to} & \coker{\gamma} & & \\
     & & \downarrow & & \downarrow & & \downarrow & & \\
     & & 0 & & 0 & & 0 & &
    \end{array}\]

The existence of the tilded arrows

The first step is to show that the “easy” arrows exist at all. Firstly, we deal with \wt{u} and \wt{v}. These arrows are simply the restrictions of u and v to the kernels of \alpha and \beta. The question is whether they are well-defined. In other words, given a \in \ker{\alpha}, is u(a) \in \ker{\beta}? (And similarly, given b \in \ker{\beta}, is v(b) \in \ker{\gamma}?) This is not too difficult to see. Take, for example, \wt{u}. Choose a \in \ker{\alpha}. Since the square

    \[\begin{array}{ccc} A & \stackrel{u}{\to} & B \\ \downarrow{\scriptstyle\alpha} & & \downarrow{\scriptstyle\beta} \\ A' & \stackrel{u'}{\to} & B' \end{array}\]

commutes, and \alpha(a) = 0 because a was chosen to be in the kernel, we have

    \[u'(\alpha(a)) = 0 = \beta(u(a)),\]

so u(a) is in the kernel of \beta. Therefore this restriction of the domain and codomain of u makes sense. The argument holds identically for \wt{v} as a restriction of v.

Next, we’ll work on \wt{v'}. This arrow is defined by sending

    \[b' + \im{\beta} \mapsto v'(b') + \im{\gamma}.\]

To show that this is well defined, we need to show that v' sends elements of \im{\beta} to \im{\gamma} (it should become clear why this is sufficient shortly). Choose an element in \im{\beta}, say \beta(b) for some b \in B. Then \gamma(v(b)) \in \im{\gamma}, which by commutativity of the square

    \[\begin{array}{ccc} B & \stackrel{v}{\to} & C \\ \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} \\ B' & \stackrel{v'}{\to} & C' \end{array}\]

means that v'(\beta(b)) \in \im{\gamma}. In other words, v'(\im{\beta}) \subset \im{\gamma}. This means that given any element of \coker{\beta}, say b' + \im{\beta}, any other representative has the form b' + \beta(b), and v'(b' + \beta(b)) = v'(b') + v'(\beta(b)) still lands in the coset v'(b') + \im{\gamma} in \coker{\gamma}, because, as we have just checked, v'(\beta(b)) \in \im{\gamma}. Hence \wt{v'} is well defined. The case for \wt{u'} is again exactly analogous.
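Since the \wt{u'} case is dismissed as analogous, here it is spelled out for the record (same notation as above):

```latex
% The induced map on cokernels:
\[
  \wt{u'} \colon \coker{\alpha} \to \coker{\beta}, \qquad
  a' + \im{\alpha} \mapsto u'(a') + \im{\beta}.
\]
% Well defined because u' sends im(alpha) into im(beta):
% commutativity of the left-hand square gives, for every a in A,
\[
  u'(\alpha(a)) = \beta(u(a)) \in \im{\beta}.
\]
```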

Exactness at the tilded arrows

We start with the easier cases. Firstly, we wish to show that the sequence is exact at \ker{\alpha}. But this is just to say that \wt{u} is injective, and since it is just the restriction of an injective map, this is immediate. Now for exactness at \coker{\gamma}, which is just to say that \wt{v'} is surjective: let c' + \im{\gamma} be in \coker{\gamma}. Since v' is surjective, we can choose b' \in v'^{-1}(c'). Then \wt{v'}(b' + \im{\beta}) = v'(b') + \im{\gamma} = c' + \im{\gamma}.

Now we wish to show exactness at \ker{\beta}. It is easy to show that \im{\wt{u}} \subset \ker{\wt{v}}. Indeed, consider \wt{u}(a). Since the tilde maps are just restrictions, we know that v(u(a)) = 0, and so \wt{v}(\wt{u}(a)) = 0. Now suppose that b \in \ker{\wt{v}}. We need to show that it arises as the image of some a \in \ker{\alpha} under \wt{u}. So embed b into B. Since b \in \ker{\wt{v}} it must be the case that b \in \ker{v}. We hence know that there exists a \in A such that u(a) = b, because the sequence is exact at B. We just need to show that a \in \ker{\alpha}. But again we just appeal to the fact that the diagram

    \[\begin{array}{ccc} A & \stackrel{u}{\to} & B \\ \downarrow{\scriptstyle\alpha} & & \downarrow{\scriptstyle\beta} \\ A' & \stackrel{u'}{\to} & B' \end{array}\]

commutes. We have

    \begin{align*} \beta(b) = 0 &\implies \beta(u(a)) = 0 \\ &\implies u'(\alpha(a)) = 0 \\ &\implies \alpha(a) = 0 \quad (\text{since } u' \text{ is injective}) \\ &\implies a \in \ker{\alpha}. \end{align*}

Thus the sequence is exact at \ker{\beta}.

Now we want exactness at \coker{\beta}. Let a' + \im{\alpha} \in \coker{\alpha}. Then

    \[\wt{v'}\wt{u'}(a' + \im{\alpha}) = v'u'(a') + \im{\gamma}.\]

But v'u'(a') = 0, so \im{\wt{u'}} \subset \ker{\wt{v'}}. Finally, let b' + \im{\beta} \in \ker{\wt{v'}}, and lift to a representative b' \in B'. By definition of \ker{\wt{v'}}, we have v'(b') \in \im{\gamma}, so we can lift to some c \in C such that \gamma(c) = v'(b'). This lifts again to some b \in B such that v(b) = c, since v is surjective. Now b' - \beta(b) is a representative of b' + \im{\beta}, and we calculate

    \[v'(b' - \beta(b)) = v'(b') - v'(\beta(b)).\]

Since the diagram

    \[\begin{array}{ccc} B & \stackrel{v}{\to} & C \\ \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} \\ B' & \stackrel{v'}{\to} & C' \end{array}\]

commutes, we have v'(\beta(b)) = \gamma(v(b)). But due to our choice of b, we have \gamma(v(b)) = \gamma(c) = v'(b'), so b' - \beta(b) \in \ker{v'}. Thus by exactness at B' it follows that b' - \beta(b) lifts uniquely back to a' \in A' such that u'(a') = b' - \beta(b). Therefore \wt{u'}(a' + \im{\alpha}) = b' - \beta(b) + \im{\beta} = b' + \im{\beta}, so this coset is in the image of \wt{u'}. After all this, we have shown exactness at all the easy arrows.

Construction of \delta

Let c \in \ker{\gamma}, and lift to b \in B such that v(b) = c (possible since v is surjective). By the commutativity of the diagram

    \[\begin{array}{ccc} B & \stackrel{v}{\to} & C \\ \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} \\ B' & \stackrel{v'}{\to} & C' \end{array}\]

we have v'(\beta(b)) = \gamma(v(b)) = \gamma(c) = 0, so \beta(b) \in \ker{v'}. Thus there is a unique a' \in A' such that u'(a') = \beta(b), and we define

    \[\delta(c) = a' + \im{\alpha}.\]

We need to check a couple of things. We had to make a choice, namely b \in v^{-1}\{c\}. We must show the image of c under \delta is independent of this choice. Let b and \widehat{b} be such that v(b) = v(\widehat{b}) = c. Then by commutativity, we have v'(\beta(b)) = v'(\beta(\widehat{b})) = 0. We now use exactness at B' to lift back to a' and \widehat{a'} such that u'(a') = \beta(b) and u'(\widehat{a'}) = \beta(\widehat{b}). Since these are homomorphisms, we have u'(a'-\widehat{a'}) = \beta(b - \widehat{b}). Now v(b - \widehat{b}) = c - c = 0, so b - \widehat{b} \in \ker{v}. That means there is a unique a \in A such that u(a) = b - \widehat{b}. But by commutativity of

    \[\begin{array}{ccc} A & \stackrel{u}{\to} & B \\ \downarrow{\scriptstyle\alpha} & & \downarrow{\scriptstyle\beta} \\ A' & \stackrel{u'}{\to} & B' \end{array}\]

we have

    \[ \beta(b - \widehat{b}) = \beta(u(a)) = u'(a' - \widehat{a'}) = u'(\alpha(a)).\]

But u' is injective, so it follows that a' - \widehat{a'} = \alpha(a). Hence a' + \im{\alpha} = \widehat{a'} + \im{\alpha}, and \delta is well defined as a set-theoretic function.
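Schematically, then, the connecting map is a zig-zag (my summary; the inverse images denote choices of lift, which we have just shown do not matter modulo \im{\alpha}):

```latex
% delta: ker(gamma) --> coker(alpha), built by lift-push-lift:
%   c  |-->  b in v^{-1}(c)  |-->  beta(b)  |-->  a' = u'^{-1}(beta(b))  |-->  a' + im(alpha)
\[
  \delta(c) = u'^{-1}\big( \beta( v^{-1}(c) ) \big) + \im{\alpha}.
\]
```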

The next question is whether \delta is a homomorphism. This is the only bit I won’t explain in detail because it’s too ugly even for this post, but we essentially check this by recognizing that we constructed \delta by pushing things through and lifting things back up through existing homomorphisms.
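For completeness, here is a sketch of that omitted check (my reconstruction, following the construction above). Take c_1, c_2 \in \ker{\gamma}, choose lifts b_i \in B with v(b_i) = c_i, and let a_i' \in A' satisfy u'(a_i') = \beta(b_i). Then b_1 + b_2 is a lift of c_1 + c_2, and:

```latex
% beta applied to the lift b_1 + b_2 of c_1 + c_2:
\[
  u'(a_1' + a_2') = \beta(b_1) + \beta(b_2) = \beta(b_1 + b_2),
\]
% so, u' being injective, a_1' + a_2' is the unique element attached to this lift, whence
\[
  \delta(c_1 + c_2) = (a_1' + a_2') + \im{\alpha} = \delta(c_1) + \delta(c_2).
\]
```

The same computation with r b_1 as a lift of r c_1 (for r \in R) gives \delta(r c_1) = r\,\delta(c_1), so \delta is R-linear.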

Exactness at \ker{\gamma} and \coker{\alpha}

We’re almost there! Let’s do exactness at \ker{\gamma}. Firstly, let c = \wt{v}(b) where b \in \ker{\beta}. Following our prescription for \delta, we choose the unique a' \in A' such that u'(a') = \beta(b). But \beta(b) = 0, therefore a' = 0, so \im{\wt{v}} \subset \ker{\delta}. Now suppose that \delta(c) = 0. This means that we lift c to some b \in B, obtain \beta(b), and remark by the usual commutativity of

    \[\begin{array}{ccc} B & \stackrel{v}{\to} & C \\ \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} \\ B' & \stackrel{v'}{\to} & C' \end{array}\]

argument that \beta(b) pulls back to a' \in A'. The statement \delta(c) = 0 means that a' = \alpha(a) for some a \in A. Consider then b - u(a). Clearly

    \[v(b - u(a)) = v(b) - v(u(a)) = v(b) = c.\]

If we can show it is also in the kernel of \beta, we’re done. And indeed:

    \[\beta(b - u(a)) = \beta(b) - \beta(u(a)) = \beta(b) - u'(\alpha(a)) = \beta(b) - \beta(b) = 0.\]

So b-u(a) \in \ker{\beta} and \wt{v}(b- u(a)) = c. So the sequence is exact at \ker{\gamma}.

Now for exactness at \coker{\alpha}. Let a' + \im{\alpha} = \delta(c). We can apply \wt{u'} using any representative, say the a' \in A' used to construct the image of c in the first place. This a' was chosen so that u'(a') = \beta(b) for some b, and so

    \[\wt{u'}(a' + \im{\alpha}) = u'(a') + \im{\beta} = \beta(b) + \im{\beta} = 0.\]

Thus \im{\delta} \subset \ker{\wt{u'}}. Finally, let a' + \im{\alpha} \in \ker{\wt{u'}}. This means that u'(a') \in \im{\beta}, say u'(a') = \beta(b) for some b \in B. Now

    \[v'(u'(a')) = v'(\beta(b)) = 0 = \gamma(v(b)),\]

so by commutativity of

    \[\begin{array}{ccc} B & \stackrel{v}{\to} & C \\ \downarrow{\scriptstyle\beta} & & \downarrow{\scriptstyle\gamma} \\ B' & \stackrel{v'}{\to} & C' \end{array}\]

we have v(b) \in \ker{\gamma}, and by simply following through the construction of \delta on v(b), we have \delta(v(b)) = a' + \im{\alpha}. The proof is now complete.

Links: — this is the video that helped me get started with the proof.