Recurrence relation

what is a recurrence relation

In mathematics, a recurrence relation is an equation that expresses the $n$-th term of a sequence as a function of the $k$ preceding terms.

examples

merge sort

sort(array A, int begin, int end):
	if (begin >= end)
		return
	mid = (begin + end) / 2
	sort(A, begin, mid)
	sort(A, mid + 1, end)
	Merge(A, begin, mid, end)
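the pseudocode can be sketched in runnable form; a minimal Python version (the in-place `merge` helper is written out here for illustration, it is not part of the original pseudocode):

```python
def merge(A, begin, mid, end):
    # merge the sorted halves A[begin..mid] and A[mid+1..end] in place
    left, right = A[begin:mid + 1], A[mid + 1:end + 1]
    i = j = 0
    for k in range(begin, end + 1):
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def sort(A, begin, end):
    # mirrors the pseudocode: split at the midpoint, recurse on both halves, merge
    if begin >= end:
        return
    mid = (begin + end) // 2
    sort(A, begin, mid)
    sort(A, mid + 1, end)
    merge(A, begin, mid, end)
```

usage: `data = [5, 2, 9, 1, 7]; sort(data, 0, len(data) - 1)` leaves `data` sorted in place.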

we can represent this algorithm as a recurrence relation in the following way:

$$T(n)=\begin{cases}\Theta(1) & \text{if } n=1\\ T\!\left(\lceil n/2\rceil\right)+T\!\left(\lfloor n/2\rfloor\right)+\Theta(n) & \text{if } n>1\end{cases}$$

this formula is the representation of #merge sort. however, it should be mentioned that, as learned in Asymptotic notations, we can ignore constants and rounding (the floor and ceiling), so let's rewrite the formula as follows:

$$T(n)=2T\!\left(\frac{n}{2}\right)+\Theta(n),\qquad T(1)=1$$

How can we calculate a Recurrence relation

Recursion tree

let's look at the following relation:

$$T(n)=2T\!\left(\frac{n}{2}\right)+\Theta(n)$$

we can calculate T(n) by creating this Recursion tree:

Pasted image 20220425015536.png

the splitting will continue until the subproblems reach size 1, i.e. until $\frac{n}{2^i}=1$, and we are looking for the height of the tree in order to solve $T(n)$.

so, let's explain what's going on: every level of the tree does $\Theta(n)$ work in total, and $\frac{n}{2^i}=1$ gives a height of $i=\log_2 n$ levels, so summing over all the levels yields $T(n)=\Theta(n\log n)$.
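the level-by-level count for $T(n)=2T(n/2)+n$ can be checked numerically; a minimal sketch, taking the $\Theta(n)$ term as exactly $n$ and using powers of two so the halves are exact:

```python
def T(n):
    # T(n) = 2T(n/2) + n with T(1) = 1, taking the Theta(n) term as exactly n
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# log2(n) levels each doing n work, plus n leaves of cost 1:
for k in [3, 6, 10]:
    n = 2 ** k
    assert T(n) == n * k + n
```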

unequal Recursion tree

let's look at the next relation:

$$T(n)=T\!\left(\frac{n}{3}\right)+T\!\left(\frac{2n}{3}\right)+Cn$$

where $Cn$ is the non-recursive part of the function.

this time the tree is not as even as the one we saw before, and it will look something like this:
recurrsion_tree_ex2.jpeg

each level is performing $n$ work, as before. however, the right branch is clearly performing more work. the deeper the tree branches expand, the more of the work is done on the rightmost side, and the tree becomes more and more unbalanced.

as before, each path of the tree ends when its subproblem reaches size 1; the left branch shrinks by a factor of $3$ each level while the right shrinks only by $\frac{3}{2}$, which means the left branch will finish much more quickly.

let's ask ourselves: so what if the tree is not balanced? what will happen? the answer is the following: in the #Recursion tree from before, each level performed $n$ work, but that only worked because the branches were equal. in this case, at some point the left branch will finish its work and the right one will continue on its own.
that means that from some level $i$ onward, the nodes $b$ still present at that level do less than $Cn$ work in total:

$$\sum_{b\in\text{level } i}\operatorname{work}(b)<Cn$$

so how to calculate $T(n)$? every level does at most $Cn$ work and the longest (rightmost) path has $\log_{3/2} n$ levels, so $T(n)\le Cn\log_{3/2}n=O(n\log n)$; the first $\log_3 n$ levels are full and each does $Cn$ work, giving $T(n)=\Omega(n\log n)$, so overall $T(n)=\Theta(n\log n)$.
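the two-sided bound can be seen numerically; a small sketch that evaluates the relation with $C=1$ (small cases costing 1 is my assumption) and checks it stays within constant factors of $n\log n$:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + C*n with C = 1, small cases cost 1
    if n < 2:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

# total work stays within constant factors of n*log(n) on both sides
for n in [10**3, 10**4, 10**5]:
    assert 0.5 < T(n) / (n * math.log2(n)) < 3
```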

Recursion tree with more than 2 branches

let's check the following example:

$$T(n)=3T\!\left(\frac{n}{4}\right)+Cn^2$$

where $Cn^2$ is the non-recursive work done inside the function.

the tree will look something like this:
Pasted image 20220425084636.png
let's see what's going on:

level $i$ of the tree has $3^i$ subproblems, each of size $\frac{n}{4^i}$ and cost $C\left(\frac{n}{4^i}\right)^2$, so the total is:

$$Cn^2\sum_{i=0}^{\log_4 n}\left(\frac{3}{16}\right)^i=O(n^2)$$

since the geometric series converges to a constant, the root's $Cn^2$ term dominates.
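the convergence of this geometric sum can be checked numerically; a small sketch with $C=1$ (the helper name `level_costs` is mine):

```python
def level_costs(n, levels):
    # level i has 3^i subproblems of size n/4^i, each costing (n/4^i)^2  (C = 1)
    return [(3 ** i) * (n / 4 ** i) ** 2 for i in range(levels)]

n = 4 ** 8
total = sum(level_costs(n, 9))  # levels 0 .. log4(n)
# the series sum_{i>=0} (3/16)^i equals 16/13, so the total is < (16/13) * n^2
assert total < (16 / 13) * n ** 2
```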

Iterative method

in this method we will "open" (repeatedly expand) the formula to get an idea of what will happen at the end. (we need to know the value of $T(1)$ for that, but most of the time it will be 1 or 0.)

example 1

$$T(n)=2T(n-3)+2$$

let's expand the formula:

$$T(n-3)=2T(n-6)+2$$
$$T(n-6)=2T(n-9)+2$$

let's see how it sums up in $T(n)$:

$$T(n)=2(2T(n-6)+2)+2=4T(n-6)+4+2=4(2T(n-9)+2)+4+2=8T(n-9)+8+4+2$$

if we keep expanding it until the end we will get:

$$T(n)=2^iT(n-3i)+2^{i+1}-2$$

notice that $2^{i+1}-2=2+4+\dots+2^i$ is the sum of a Geometric progression

however, this is not a valid proof; we need to prove it with # Inductive reasoning:

$$T(n)=2^{k-1}\,T\bigl(n-3(k-1)\bigr)+2^k-2$$

expanding the expression once more gives the same form with $k$ in place of $k-1$, which proves the pattern by induction.
because we know that $T(0)=T(1)=T(2)=T(3)=1$,
now take $i=\frac{n-1}{3}$ (so that $n-3i=1$) and we get the following: $$T(n)=2^{\frac{n-1}{3}}\,T(1)+2^{\frac{n-1}{3}+1}-2=2^{\frac{n-1}{3}}+2^{\frac{n-1}{3}+1}-2\in\Theta\!\left(2^{n/3}\right)$$
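a quick sketch that checks the closed form against the recurrence (reading the relation as $T(n)=2T(n-3)+2$, for $n\equiv 1\pmod 3$):

```python
def T(n):
    # T(n) = 2T(n - 3) + 2, with T(0) = T(1) = T(2) = T(3) = 1
    if n <= 3:
        return 1
    return 2 * T(n - 3) + 2

# closed form: 2^i * T(1) + 2^(i+1) - 2, where i = (n - 1) / 3
for n in [1, 4, 7, 10, 13, 16]:
    i = (n - 1) // 3
    assert T(n) == 2 ** i + 2 ** (i + 1) - 2
```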

positioning method

we will guess the big O notation and the Ω notation; by guessing, we mean we assume there is a constant $C$ according to the Asymptotic Analysis rules, and then verify the guess by induction.
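as a sketch of one such guess-and-verify step (using the familiar relation $T(n)=2T(n/2)+n$): assume $T(k)\le ck\log_2 k$ for all $k<n$ and substitute into the recurrence:

```latex
T(n) = 2T\!\left(\tfrac{n}{2}\right) + n
     \le 2\,c\,\tfrac{n}{2}\log_2\tfrac{n}{2} + n
     = cn\log_2 n - cn + n
     \le cn\log_2 n \quad \text{for } c \ge 1,
```

so the guess $T(n)=O(n\log n)$ holds; the $\Omega$ direction is verified the same way with a lower-bound guess.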

the Master theorem

the master theorem only helps us solve a specific type of Recurrence relation, but it is the fastest of all the methods above.

for formulas that look like the following:

$$T(n)=aT\!\left(\frac{n}{b}\right)+f(n)\qquad\text{where } a\ge 1,\ b>1$$

where $f(n)$ is the work at the root of the recursion (the non-recursive part).
the idea behind the master theorem is simple: find out which does more work, the root or the recursive part.
we can determine this with the following rules:

case 1:

$$f(n)\in O\!\left(n^{\log_b a-\epsilon}\right)\;\Rightarrow\; T(n)=\Theta\!\left(n^{\log_b a}\right)$$

this is the case where the recursive part works harder than the root

case 2:

$$f(n)\in\Theta\!\left(n^{\log_b a}\right)\;\Rightarrow\; T(n)=\Theta\!\left(n^{\log_b a}\log n\right)$$

this is the case when the recursive part and the root do the same amount of work

there is an extended case for that one :
Pasted image 20221130214437.png

case 3:

$$f(n)\in\Omega\!\left(n^{\log_b a+\epsilon}\right)\;\Rightarrow\; T(n)=\Theta(f(n))$$

this case is when the root does more work than the recursive part. (this case also requires the regularity condition $a\,f(n/b)\le c\,f(n)$ for some constant $c<1$.)

examples:

$$T(n)=8T\!\left(\frac{n}{4}\right)+n^2$$
$f(n)=n^2,\quad a=8,\quad b=4$
$\log_b a=\log_4 8=1.5$

$n^{1.5}<n^2$, and for $0<\epsilon<0.5$: $f(n)=n^2\in\Omega\!\left(n^{1.5+\epsilon}\right)$

therefore (case 3):
$$T(n)\in\Theta(f(n))=\Theta(n^2)$$

$$T(n)=9T\!\left(\frac{n}{3}\right)+n^2$$
$\log_3 9=2,\qquad f(n)=n^2=\Theta\!\left(n^{\log_3 9}\right)$

therefore (case 2): $T(n)\in\Theta(n^2\log n)$
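for the common case where the root work is a polynomial $f(n)=n^k$, the three cases reduce to comparing $k$ with $\log_b a$; a minimal sketch (the helper name `master` is mine):

```python
import math

def master(a, b, k):
    # classify T(n) = a*T(n/b) + n^k by comparing k with log_b(a).
    # (for polynomial f(n) = n^k with k > log_b(a), the case-3 regularity
    #  condition a*f(n/b) <= c*f(n) always holds, since a / b^k < 1.)
    crit = math.log(a, b)
    if math.isclose(k, crit):
        return f"Theta(n^{k:g} log n)"   # case 2: root and recursion tie
    if k < crit:
        return f"Theta(n^{crit:g})"      # case 1: the recursion dominates
    return f"Theta(n^{k:g})"             # case 3: the root dominates

print(master(8, 4, 2))  # → Theta(n^2)
print(master(9, 3, 2))  # → Theta(n^2 log n)
```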

switching variables method

let's look at the following relation:

$$T(n)=2T\!\left(\sqrt{n}\right)+\log n$$

this can be a hard expression to develop using the methods above, so we should try to make it easier by switching variables:

mark $m=\log_2 n$ and notice that $2^m=n$. now:

$$T(2^m)=2T\!\left(2^{m/2}\right)+m$$
mark $S(m)=T(2^m)$, and we get:
$$S(m)=2S\!\left(\frac{m}{2}\right)+m$$

this is a recurrence relation we already know: $S(m)\in\Theta(m\log m)$
therefore:

$$T(n)=T(2^m)=S(m)=\Theta(m\log m)=\Theta(\log n\cdot\log\log n)$$
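a small sketch checking the result numerically on exact powers of two (taking $T(n)=1$ for $n\le 2$ as the base case is my assumption; `math.isqrt` keeps perfect squares exact):

```python
import math

def T(n):
    # T(n) = 2T(sqrt(n)) + log2(n), with T(n) = 1 for n <= 2
    if n <= 2:
        return 1
    return 2 * T(math.isqrt(n)) + int(math.log2(n))

# for n = 2^16: T(4) = 4, T(16) = 12, T(256) = 32, T(65536) = 80,
# consistent with S(m) = Theta(m log m) growth for m = log2(n)
assert T(2 ** 16) == 80
```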