Recursion

In mathematics, recursion describes a situation where a function's output is plugged back into the function itself. Given a function \(f(x)\), we can iterate it to form the functions \(f(f(x))\), \(f(f(f(x)))\), etc. \(f(f(\ldots f(f(x))\ldots ))\) with \(t\) copies of \(f\) is often abbreviated \(f^t(x)\). Recursion is very important in googology, since it easily produces quickly-growing functions: if \(f(x)\) displays rapid growth, \(f^t(x)\) grows much faster for large \(t\).
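The iteration \(f^t(x)\) can be sketched in a few lines of Python; the helper name `iterate` is our own choice, not standard terminology:

```python
def iterate(f, t, x):
    """Apply f to x, t times: returns f^t(x)."""
    for _ in range(t):
        x = f(x)
    return x

# Even a modestly fast function grows enormously under iteration.
f = lambda x: 2 ** x
print(iterate(f, 1, 2))  # f(2) = 4
print(iterate(f, 3, 2))  # f(f(f(2))) = 2**16 = 65536
```

Already at \(t = 4\) the result, \(2^{65536}\), has almost 20,000 decimal digits, illustrating why iteration is such a cheap source of rapid growth.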

Every recursive definition breaks into two types of rules: a base case, where the value for some argument is given immediately, and a recursive rule, where the function applies itself.
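The factorial function is a classic illustration of this two-rule structure (chosen here as a generic example, not one taken from the text above):

```python
def factorial(n):
    if n == 0:
        return 1                      # base case: the value is given immediately
    return n * factorial(n - 1)       # recursive rule: the function applies itself

print(factorial(5))  # 120
```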

A recursive hierarchy of functions must have three rules: a "brake" rule (the starting function), a "plugging" rule (which tells how many times the previous function must be plugged into itself), and a "decomposing" rule (for diagonalizing purposes, it assigns a fundamental sequence to each so-called limit ordinal).
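A minimal sketch of such a hierarchy, modeled loosely on the fast-growing hierarchy: the brake rule is \(f_0(n) = n + 1\), the plugging rule is \(f_{\alpha+1}(n) = f_\alpha^n(n)\), and the decomposing rule uses the standard fundamental sequence \(\omega[n] = n\) for the single limit ordinal \(\omega\). The function name `fgh` and the representation of \(\omega\) as the string `"w"` are our own conventions:

```python
def fgh(alpha, n):
    """f_alpha(n) for alpha a finite ordinal (int) or "w" for omega."""
    if alpha == "w":
        return fgh(n, n)       # decomposing rule: diagonalize, since w[n] = n
    if alpha == 0:
        return n + 1           # brake rule: the starting function
    x = n
    for _ in range(n):         # plugging rule: f_{a+1}(n) = f_a iterated n times
        x = fgh(alpha - 1, x)
    return x

print(fgh(0, 3))  # 4
print(fgh(1, 3))  # 6
print(fgh(2, 3))  # 24
```

Note how each level plugs the previous one into itself \(n\) times: \(f_1\) is roughly doubling, \(f_2\) is roughly exponential, and \(f_\omega(n) = f_n(n)\) already outgrows every finite level.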

Examples of recursive hierarchies include FGH (the fast-growing hierarchy), BEAF, Hyper-E Notation, and Bird's array notation.