We left off our discussion of regular languages with dashed hopes when we learned that $WP(G)$ is a regular language if and only if $G$ is finite. This means that dealing with $WP(G)$, the set of words on the generators of $G$ which are trivial in $G$, will never help us resolve the word problem, because if $G$ is finite, we already know we can solve the word problem with the Cayley graph!

Fortunately, all is not lost. The new plan of attack is to identify the situations in which a normal form for some group $G$ is also a regular language. In these cases we may be able to use some set of FSAs to relate a word on the set of group generators to the normal form for the group element that the word reduces to (although ultimately, as we will see, we won't even need an FSA that accepts a normal form for $G$).

First we should see an example of a group with a normal form which is also a regular language.

**Definition:** A *regular normal form* of a group $G$ is a normal form which is also a regular language.

**Example:**

Let $G = \mathbb{Z} = \langle a \rangle$, writing the inverse $a^{-1}$ as $A$. We can set a normal form for this group, namely $\{a^n : n \geq 0\} \cup \{A^n : n \geq 1\}$, the set of words that are pure powers of $a$ or of $A$.

It is possible to build an FSA that accepts exactly this set as its language, illustrated below.
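The FSA can also be sketched in a few lines of code. Everything here is an illustrative assumption rather than a transcription of the figure: the group is taken to be $\mathbb{Z} = \langle a \rangle$ with the inverse letter written `A`, and the state names are invented.

```python
# A sketch of an FSA accepting a regular normal form. The group is an
# assumed example: Z = <a>, with the inverse a^{-1} written "A", and normal
# form {"", "a", "aa", ..., "A", "AA", ...} (powers of a or of A, never mixed).
ACCEPT = {"start", "pos", "neg"}
DELTA = {
    ("start", "a"): "pos", ("start", "A"): "neg",
    ("pos", "a"): "pos",   ("neg", "A"): "neg",
}  # missing entries lead to an implicit dead (fail) state

def accepts(word):
    state = "start"
    for letter in word:
        state = DELTA.get((state, letter))  # None plays the dead state
        if state is None:
            return False
    return state in ACCEPT

assert accepts("aaa") and accepts("AA") and accepts("")
assert not accepts("aA")  # mixing a and A leaves the normal form
```

Note that the machine never needs to count: it only remembers which "direction" the word has committed to, which is exactly the kind of finite memory an FSA has.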

With this concept in hand, we can formulate a few interesting results.

**Theorem:** If $G$ and $H$ both have a regular normal form, then so do $G \times H$ and $G * H$.

The proof will be assigned, so suffice it to say that it involves some ideas about combining regular languages that we’ve seen before. Before moving on to the word problem, we can squeeze a little more juice out of the concept of a regular normal form. A regular language is, for lack of a more appropriate phrase, a set of words which has a high degree of structural regularity. For this reason we suspect that a group which permits a normal form with this structure is “nice” in some basic way. For one thing, the set of nodes and edges in an FSA is finite, so every regular language will be finite or will have to somehow make use of cycles in the graph of its FSA. To put this into rigorous terms, we recall the pumping lemma.

**Theorem (The Pumping Lemma):** Let $L$ be regular. Then there exists $p \in \mathbb{N}$ such that for all words $w \in L$ with $|w| \geq p$, we can find $x, y, z$ with $w = xyz$ and $|y| \geq 1$ such that $xy^nz \in L$ for all $n \geq 0$.

The proof worked via pigeonhole by setting $p$ to be the number of states of $M$, where $M$ is the FSA that accepts $L$.
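The pigeonhole step can be made concrete: run a DFA on a word, record the state reached after each prefix, and the first repeated state marks off the loop $y$ in the decomposition $w = xyz$. The toy DFA below (accepting words of even length over a one-letter alphabet) is an assumption for illustration.

```python
# The pigeonhole argument behind the pumping lemma: after each prefix of a
# long word we record the DFA state; a state must repeat once the word is
# longer than the number of states, and the repeat marks off the loop y.
DELTA = {("even", "a"): "odd", ("odd", "a"): "even"}
ACCEPT = {"even"}

def run(word):
    state = "even"
    for letter in word:
        state = DELTA[(state, letter)]
    return state

def pump_split(word):
    seen = {}  # state -> length of the prefix that first reached it
    state = "even"
    for i in range(len(word) + 1):
        if state in seen:                         # pigeonhole: revisited
            j = seen[state]
            return word[:j], word[j:i], word[i:]  # x, y (the loop), z
        seen[state] = i
        if i < len(word):
            state = DELTA[(state, word[i])]
    return None  # no repeat: word is shorter than the number of states

x, y, z = pump_split("aaaa")
assert len(y) >= 1
assert all(run(x + y * n + z) in ACCEPT for n in range(5))  # pumping stays in L
```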

With this in mind, we note that if $G$ is an infinite group with a regular normal form, then we can use the cycles in the corresponding FSA to generate an element of $G$ with infinite order.

**Theorem:** If $G$ has a regular normal form and is infinite, then $G$ contains an element of infinite order.

Let $M$ be the FSA representing the regular normal form of $G$. Let $g \in G$ satisfy $|g| > p$, where the absolute value sign denotes the word length of the representative of $g$ in the regular normal form and $p$ is the number of states of $M$. We may take a word whose word length is greater than any given integer, specifically $p$, because $G$ is infinite and therefore its normal form is an infinite language. Let $\pi$ be the natural map that sends words to group elements. By an argument much like the proof of the pumping lemma, we note that the path which yields the word $w$ for $g$ must contain a loop, therefore we have

$$w = xyz,$$

where $x$ is given by a path from the start state to some vertex $v$, $y$ is a loop that starts and ends at $v$, and $z$ is a path that begins at $v$ and ends at some accept state. We can see that $xy^nz$ must also be accepted for all $n \geq 0$, so because this language is a normal form, we have that for $n \neq m$,

$$\pi(xy^nz) \neq \pi(xy^mz).$$

Since $\pi$ is a homomorphism (from the free monoid on the generators to $G$), we then have

$$\pi(x)\pi(y)^n\pi(z) \neq \pi(x)\pi(y)^m\pi(z).$$

This means that the element $\pi(y)$ must have infinite order, which is all to say that if $G$ has a regular normal form, then $G$ is not infinite torsion.
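For the assumed example $G = \mathbb{Z}$ (with inverse letter `A`), the conclusion can be checked directly: pumping the loop $y$ of a long accepted word lands on distinct group elements, so $\pi(y)$ must represent an element of infinite order.

```python
# For the assumed example G = Z, words map to integers under the natural
# map pi, and distinct normal-form words give distinct elements. Pumping
# the loop y of a long accepted word therefore hits infinitely many
# elements, so pi(y) has infinite order.
def pi(word):
    # the natural map from words to group elements (Z as integers)
    return word.count("a") - word.count("A")

x, y, z = "", "aa", "a"  # a pumping decomposition of the accepted word "aaa"
elements = {pi(x + y * n + z) for n in range(10)}
assert len(elements) == 10  # ten pumped words, ten distinct group elements
assert pi(y) != 0           # so pi(y) generates an infinite cyclic subgroup
```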

Permitting a regular normal form is a great property for a group to have, but we already get the feeling that it is quite specific, and it is possibly difficult to prove whether a group even has it. We will not use these normal forms explicitly in our solution of the word problem, but we will retain the general idea of making an FSA work for us to produce words with useful properties. This is part of what motivates the following definition.

**Definition:** Let $G$ be a group. We say $G$ has an *automatic structure* if there exists the following set of FSAs:

- $M_G$, whose language $L$ surjects onto $G$ under $\pi$; namely, every element of $G$ is represented by some word in $L$. This FSA is called the word acceptor.
- $M_=$, where $L(M_=) = \{(w_1, w_2) : w_1, w_2 \in L,\ \pi(w_1) = \pi(w_2)\}$. This FSA is called the equality checker.
- $M_a$ for each generator $a$, where $L(M_a) = \{(w_1, w_2) : w_1, w_2 \in L,\ \pi(w_1)a = \pi(w_2)\}$. This is the word comparator.
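The three languages can be sketched concretely for an assumed example, $G = \mathbb{Z}$ with generator `a` and inverse letter `A`. A real pair automaton reads $(w_1, w_2)$ synchronously, one letter from each word per step, padding the shorter word; for brevity the sketch below checks the defining conditions directly rather than building the padded-pair machines.

```python
# The languages of the three automata for the assumed example G = Z with
# generator "a" and inverse letter "A", checked by their defining
# conditions rather than by explicit pair automata.

def pi(w):
    """The natural map from words to group elements (Z as integers)."""
    return w.count("a") - w.count("A")

def word_acceptor(w):
    """M_G: accepts exactly the regular normal form."""
    return w == "" or set(w) == {"a"} or set(w) == {"A"}

def equality_checker(w1, w2):
    """M_=: pairs of accepted words representing the same element."""
    return word_acceptor(w1) and word_acceptor(w2) and pi(w1) == pi(w2)

def word_comparator_a(w1, w2):
    """M_a: pairs of accepted words with pi(w1) * a = pi(w2)."""
    return word_acceptor(w1) and word_acceptor(w2) and pi(w1) + 1 == pi(w2)

assert word_comparator_a("aa", "aaa") and word_comparator_a("A", "")
assert equality_checker("aa", "aa") and not equality_checker("a", "A")
```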

A few examples are in order to unpack this definition. First we draw out all the FSAs for the example $G = \mathbb{Z}$.

We have only drawn $M_a$, since $M_{a^{-1}}$ would look very similar. In general, the language given by $M_G$ is not necessarily a normal form for $G$, but in this case it is, which means that drawing $M_=$ is easy: we can replace every edge labeled $a$ with the edge $(a, a)$, and so on, since every word uniquely represents a group element, so $\pi(w_1) = \pi(w_2)$ implies that $w_1$ and $w_2$ are the same word. We will see this at play in the next example, where $G = \mathbb{Z}^2$. Here we will omit $M_=$, since we understand its structure, and again only draw one of the word comparators for reasons of symmetry.

Drawing $M_a$ once we have $M_G$ is not too large a task. In the above example, the orange part of the graph is exactly what $M_G$ would be, and the purple arrows turn this graph into the graph of $M_a$. Further examples of groups with automatic structure include all free groups and all free abelian groups, all finite groups, many Coxeter groups, all braid groups, hyperbolic groups, and the special linear group $SL_2(\mathbb{Z})$. A few groups which do not have automatic structure are infinite torsion groups, the special linear groups $SL_n(\mathbb{Z})$ with $n \geq 3$, and most Baumslag–Solitar groups.

We are now in a position to exhibit the crown jewel of our discussion.

**Theorem:** If $G$ is automatic, then $G$ has solvable word problem.

Let $g \in G$. Pick some accepted word $w$ with $\pi(w) = g$. For each letter $a$ in the generating set, we will use the automata that make $G$ automatic to find an accepted word for $ga$.

We take the word comparator automaton $M_a$ and observe that we can take it to have no two edges leaving the same vertex with the same label, by arguments we've made before. By definition, $M_a$ accepts pairs $(w_1, w_2)$ such that $\pi(w_1)a = \pi(w_2)$. Sweeping a few things under the rug, we argue that we can follow the path that gives $w$ in the first coordinate and read off the word $w'$ in the second coordinate, where we are guaranteed $\pi(w') = \pi(w)a$. This is an algorithm which gives us an accepted word for $ga$.

The key idea is that we can use this algorithm to generate an accepted word for every element of $G$. There is some word in the language which represents the identity in $G$, so just take $w_e$ to be this word. Now we can express every element of $G$ as $g = a_1 a_2 \cdots a_n$, where the $a_i$ are letters in our generating set. Note that $g = e a_1 a_2 \cdots a_n$. We can apply the above algorithm iteratively, first to get an accepted word for the element $e a_1$, and after $n$ repetitions to get the element $e a_1 \cdots a_n = g$. Call this accepted word $w_g$. Since we know $\pi(w_g) = g$, to solve the word problem we need to know whether $g = e$. Fortunately we have a finite state automaton that tells us exactly this! We simply follow the path for $(w_g, w_e)$ in $M_=$ and see if the pair is accepted.
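The whole algorithm can be sketched for the assumed example $G = \mathbb{Z}$ (generator `a`, inverse letter `A`). Here `multiply` stands in for the step "follow $w$ through the comparator $M_a$ (or $M_{a^{-1}}$) and read off the second coordinate"; since the normal form is unique in this example, the final $M_=$ check reduces to comparing with the accepted word for the identity.

```python
# The word-problem algorithm for the assumed example G = Z. The function
# multiply stands in for reading the comparator M_letter along w and
# reading off the second coordinate: it returns an accepted word for
# pi(w) * letter.
def pi(w):
    return w.count("a") - w.count("A")

def normal_form(n):
    return "a" * n if n >= 0 else "A" * (-n)

def multiply(w, letter):
    return normal_form(pi(w) + (1 if letter == "a" else -1))

def solve_word_problem(word):
    w = ""                       # accepted word w_e for the identity
    for letter in word:
        w = multiply(w, letter)  # accepted word for one more letter of g
    return w == ""               # the M_= check against w_e (normal form is unique)

assert solve_word_problem("aAaA")      # aAaA reduces to the identity in Z
assert not solve_word_problem("aaA")   # aaA reduces to a, not the identity
```

Each step only consults a finite automaton, which is why the whole procedure terminates and is effective.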

I like your post! It helped me understand using FSAs to solve the word problem. I have a heuristic understanding of the automatic structure. Indeed, we recall that to solve a word problem, we need a Cayley graph to tell us how to relate different combinations of generators, and we use paths to test things out. Thus, the FSA m_G plays the role of the vertices of the Cayley graph but allows extra elements. Then, the FSA m_a’s tell us how to travel from one vertex to another. I always imagine them playing the role of gluing things together into a “graph,” like Cayley graphs. Finally, the FSA m_= tells us which paths are “equivalent.”

Horace, this is a wonderful post, and it helped me a ton when I was trying to understand automatic structure for the recent problem set! I wonder whether we ever actually need to represent the word comparator and equality checker FSAs for every element of the language, or whether that is simply the formal definition of the automatic structure and it is always OK to just show everything for a single element.

Great post! Your pictures are large enough to be very clear, which I know is not always easy to do. I’m still not quite clear on how we generate the different FSAs to show that a group is automatic. How do we get from the first one to the other ones?