Math 1720
Outline of the course
(a.k.a. study sheet for final exam)
Chapter 6: Transcendental functions
§ 1: Inverse functions and their derivatives
Basic idea: run a function backwards
y = f(x) ; `assign' the value x to the input y ; x = g(y)
need g to be a function; so need f to be one-to-one
f one-to-one: if f(x) = f(y) then x = y ; if x ≠ y then f(x) ≠ f(y)
g = f^{-1}, then g(f(x)) = x and f(g(x)) = x (i.e., g∘f = Id and f∘g = Id)
finding inverses: rewrite y = f(x) as x = some expression in y
graphs: if (a,b) is on the graph of f, then (b,a) is on the graph of f^{-1}
graph of f^{-1} is the graph of f, reflected across the line y = x
horizontal lines go to vertical lines; horizontal line test for invertibility
derivative of the inverse:
f′(f^{-1}(x)) · (f^{-1})′(x) = 1
if f(a) = b, then (f^{-1})′(b) = 1/f′(a)
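The derivative-of-the-inverse formula can be checked numerically; a minimal sketch, where f(x) = x³ + x is a made-up example (not from the notes) chosen because f′(x) = 3x² + 1 > 0 makes it one-to-one:

```python
# Numerical check of (f^{-1})'(b) = 1/f'(a) when f(a) = b.
# f(x) = x^3 + x is a made-up example; it is one-to-one since f'(x) > 0.
def f(x):
    return x**3 + x

def f_prime(x):
    return 3 * x**2 + 1

def f_inverse(y, lo=-10.0, hi=10.0):
    """Invert f by bisection (valid since f is increasing)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

a = 1.5
b = f(a)                    # f(a) = b, so f^{-1}(b) = a
h = 1e-6
inv_deriv = (f_inverse(b + h) - f_inverse(b - h)) / (2 * h)
print(inv_deriv, 1 / f_prime(a))    # the two values agree
```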
§ 2: Natural logarithms
logs turn products into sums: log(ab) = log(a) + log(b)
Define ln x = ∫_1^x dt/t = area under 1/t from 1 to x
ln x is a log: ln(ab) = ln(a) + ln(b)
ln(a^b) = b ln(a) ; ln(a/b) = ln(a) − ln(b)
[d/dx](ln x) = 1/x ; [d/dx](ln(f(x))) = f′(x)/f(x)
∫ f′(x)/f(x) dx = ln|f(x)| + c
Logarithmic differentiation: f′(x) = f(x) · [d/dx](ln(f(x)))
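A sketch of logarithmic differentiation in action; f(x) = x^x is a made-up example (not from the notes), where ln(f(x)) = x ln x gives d/dx[ln(f(x))] = ln x + 1:

```python
import math

# Logarithmic differentiation: f'(x) = f(x) * d/dx[ln(f(x))].
# Example: f(x) = x^x, so ln(f(x)) = x ln x and d/dx[ln(f(x))] = ln x + 1.
def f(x):
    return x**x

def f_prime(x):
    return f(x) * (math.log(x) + 1)    # via logarithmic differentiation

x0 = 2.0
h = 1e-6
numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)   # central-difference check
print(f_prime(x0), numeric)                   # both about 6.7726
```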
§ 3: The exponential function
e^x = inverse of ln x ; e^{ln x} = x (x > 0), ln(e^x) = x (all x)
e^{a+b} = e^a e^b , e^{ab} = (e^a)^b ; e^1 = e = 2.718281828459045…
[d/dx](e^x) = e^x ; ∫ e^x dx = e^x + c
§ 4: a^x and log_a x
ln(a^b) should be b ln a, so define a^b = e^{b ln a}
a^{b+c} = a^b a^c ; a^{bc} = (a^b)^c
a^x = e^{x ln a} ; [d/dx](a^x) = a^x ln a ;
∫ a^x dx = a^x/ln a + c
x^r = e^{r ln x} ; [d/dx](x^r) = r x^{r−1}
f(x) = a^x is either always increasing (a > 1) or always decreasing (0 < a < 1)
inverse is g(x) = log_a x = ln x/ln a
§ 5: Exponential growth and decay
differential equation f′(x) = k f(x) has solutions f(x) = A e^{kx}
A = f(0) = initial condition
Many common phenomena are modelled by exponential functions
population growth
given the population at t=0 and t=a, what will the population be at t=b ?
radioactive decay
half-life = time for half of the original substance to decay
given the amounts at t=0 and t=a, what is the half-life?
given the amount and the half-life, how much at time t=a ?
(continuous) compound interest A(t) = P e^{rt}
given initial amount P and interest rate r, how much at time a ?
given the initial amount and the amount after t=a, what is the interest rate?
Newton's Law of Cooling
T(t) = temperature at time t, S = surrounding (constant) air temperature,
then T′(t) = k(S − T(t))
Q = S − T(t) is exponential ; T(t) = S − A e^{−kt} (A = constant)
given temperatures at t=0 and t=a, what will the temperature be at t=b ?
given temperatures at t=0 and t=a, when will the temperature be X ?
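The decay questions above all come down to solving A(t) = A₀e^{kt} for one unknown; a minimal sketch with made-up numbers:

```python
import math

# Radioactive decay: amounts at t = 0 and t = a determine k, and hence
# the half-life.  The numbers are made up for illustration.
A0, Aa, a = 100.0, 80.0, 5.0    # amount at t=0, amount at t=a, a in years

# A(t) = A0 e^{kt}  =>  k = ln(Aa/A0)/a  (k < 0 for decay)
k = math.log(Aa / A0) / a
half_life = math.log(0.5) / k   # solve A0 e^{kt} = A0/2 for t
print(half_life)                # about 15.5 years

# sanity check: after one half-life, half the substance remains
print(A0 * math.exp(k * half_life))   # 50, up to rounding
```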
§ 6: L'Hôpital's Rule
indeterminate forms: limits which `evaluate' to 0/0 ; e.g.
lim_{x→0} sin x/x
LR #1: if f(a) = g(a) = 0, f and g both differentiable at a, and
g′(a) ≠ 0, then
lim_{x→a} f(x)/g(x) = f′(a)/g′(a)
what if f′(a)/g′(a) `=' 0/0 ? LR #2:
if f(a) = g(a) = 0, f and g both differentiable near a, then
lim_{x→a} f(x)/g(x) = lim_{x→a} f′(x)/g′(x)
other indeterminate forms: ∞/∞ , 0·∞ , ∞ − ∞ , 0^0 , 1^∞ , ∞^0
LR #3: if f,g → ∞ as x → a, then
lim_{x→a} f(x)/g(x) = lim_{x→a} f′(x)/g′(x)
other cases: try to turn them into 0/0 or ∞/∞
in the last three cases, do this by taking logs first
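A numerical illustration of the take-logs-first idea on a 1^∞ form (the specific limit below is a standard example, not one worked in the notes):

```python
import math

# The 1^∞ form lim_{x→∞} (1 + 1/x)^x: taking logs gives
# x*ln(1 + 1/x) = ln(1 + 1/x)/(1/x), a 0/0 form; L'Hopital's Rule
# gives limit 1, so the original limit is e^1 = e.
for x in [10.0, 1000.0, 100000.0]:
    print(x, (1 + 1 / x)**x)        # approaches e = 2.71828...
```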
§ 8: Inverse trigonometric functions
Trig functions (sin x, cos x, tan x, etc.)
aren't one-to-one; make them!
sin x, −π/2 ≤ x ≤ π/2, is one-to-one; inverse is arcsin x
sin(arcsin x) = x, all x in [−1,1];
arcsin(sin x) = x IF x is in the range above
tan x, −π/2 < x < π/2, is one-to-one; inverse is arctan x
tan(arctan x) = x, all x;
arctan(tan x) = x IF x is in the range above
sec x, 0 ≤ x < π/2 and π/2 < x ≤ π, is one-to-one; inverse is arcsec x
sec(arcsec x) = x, all |x| ≥ 1;
arcsec(sec x) = x IF x is in the range above
cos(arcsin x), tan(arcsec x), etc.: use right triangles
other inverse trig functions aren't very useful
§ 9: Derivatives and integrals of inverse trig functions
Derivatives of inverse functions!
Use right triangles to simplify.
[d/dx](arcsin x) = 1/√(1−x²)
[d/dx](arctan x) = 1/(x²+1)
[d/dx](arcsec x) = 1/(|x|√(x²−1))
Integrals: reverse the formulas.
∫ dx/√(a²−x²) = arcsin(x/a) + c
∫ dx/(x²+a²) = (1/a) arctan(x/a) + c
∫ dx/(x√(x²−a²)) = (1/a) arcsec|x/a| + c
Chapter 7: Techniques of integration
§ 1: Basic integration formulas (AKA dirty tricks)
u-substitution
∫ f(g(x)) g′(x) dx = ∫ f(u) du , with u = g(x)
complete the square
ax²+bx+c = a(x²+rx)+c = a(x+r/2)² + (c − a(r/2)²) , where r = b/a
Ex: ∫ 1/(x²+2x+2) dx = ∫ 1/((x+1)²+1) dx
use trig identities
sin²x + cos²x = 1 , tan²x + 1 = sec²x ,
sin(2x) = 2 sin x cos x , tan x/sec x = sin x , etc.
Ex: ∫ sin²x/cos x dx = ∫ (1−cos²x)/cos x dx = …
pull fractions apart; put fractions together!
Ex: ∫ (x+1)/x³ dx = ∫ x^{−2} + x^{−3} dx = …
do polynomial long division
Ex: ∫ x³/(x²−1) dx = ∫ x + x/(x²−1) dx = …
add zero, multiply by one
Ex: ∫ sec x dx = ∫ sec x (sec x + tan x)/(sec x + tan x) dx = …
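Carrying that last trick through (u = sec x + tan x) gives the standard antiderivative ln|sec x + tan x|; the notes leave the example unfinished, so this is a sanity check of that result:

```python
import math

# Check that F(x) = ln|sec x + tan x| is an antiderivative of sec x
# by comparing a numerical derivative of F with sec x.
def F(x):
    return math.log(abs(1 / math.cos(x) + math.tan(x)))

x0 = 0.7
h = 1e-6
numeric_deriv = (F(x0 + h) - F(x0 - h)) / (2 * h)
print(numeric_deriv, 1 / math.cos(x0))    # both equal sec(0.7)
```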
§ 2: Integration by parts
Product rule: d(uv) = (du)v + u(dv)
reverse: ∫ u dv = uv − ∫ v du
Ex: ∫ x cos x dx : set u = x, dv = cos x dx
du = dx, v = sin x (or any other antiderivative)
So: ∫ x cos x dx = x sin x − ∫ sin x dx = …
special case: ∫ f(x) dx ; u = f(x), dv = dx
∫ f(x) dx = x f(x) − ∫ x f′(x) dx
Ex: ∫ arcsin x dx = x arcsin x − ∫ x/√(1−x²) dx = …
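The x cos x example, finished and checked numerically (the midpoint Riemann sum here is just for verification, not part of the method):

```python
import math

# By parts, an antiderivative of x cos x is x sin x + cos x.
# Check it on [0, 1] against a midpoint Riemann sum.
exact = (1 * math.sin(1) + math.cos(1)) - (0 * math.sin(0) + math.cos(0))

n = 100000
riemann = sum((i + 0.5) / n * math.cos((i + 0.5) / n) for i in range(n)) / n
print(exact, riemann)    # both about 0.3818
```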
§ 3: Partial fractions
rational function = quotient of polynomials
Idea: integrate by writing the function as a sum of simpler functions
Procedure: f(x) = p(x)/q(x)
(0): make sure degree(p) < degree(q); long division if not
(1): factor q(x) into linear and irreducible quadratic factors
(2): group common factors together as powers
(3a): for each group (x−a)^n ADD:
a_1/(x−a) + … + a_n/(x−a)^n
(3b): for each group (ax²+bx+c)^n ADD:
(a_1 x + b_1)/(ax²+bx+c) + … + (a_n x + b_n)/(ax²+bx+c)^n
(4): set f(x) = sum; solve for the `undetermined' coefficients
put the sum over a common denominator (= q(x)); set the numerators equal.
always works: multiply out, group common powers, set the coefficients of the two polynomials equal
Ex: x+3 = a(x−1) + b(x−2) = (a+b)x + (−a−2b) ; 1 = a+b , 3 = −a−2b
linear term (x−a)^n: set x = a, will give a coefficient
if n ≥ 2, take derivatives of both sides! set x = a, gives another coefficient.
Ex: x²/((x−1)²(x²+1)) = A/(x−1) + B/(x−1)² + (Cx+D)/(x²+1)
= (A(x−1)(x²+1) + B(x²+1) + (Cx+D)(x−1)²)/((x−1)²(x²+1)) = …
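Solving that last example for the undetermined coefficients (worked out by hand here; the values are not stated in the notes) gives A = 1/2, B = 1/2, C = −1/2, D = 0. A spot check at a few points:

```python
# Verify the decomposition of x^2/((x-1)^2 (x^2+1)) with the
# hand-computed coefficients A = 1/2, B = 1/2, C = -1/2, D = 0.
A, B, C, D = 0.5, 0.5, -0.5, 0.0

def original(x):
    return x**2 / ((x - 1)**2 * (x**2 + 1))

def decomposed(x):
    return A / (x - 1) + B / (x - 1)**2 + (C * x + D) / (x**2 + 1)

for x in [0.0, 2.0, 3.5, -1.25]:
    print(x, original(x), decomposed(x))    # last two columns match
```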
§ 4: Trig substitution
Idea: get rid of square roots by introducing perfect squares
√(a²−x²) : set x = a sin u . dx = a cos u du , √(a²−x²) = a cos u
Ex: ∫ 1/(x²√(1−x²)) dx = ∫ cos u/(sin²u cos u) du (x = sin u) = …
√(a²+x²) : set x = a tan u . dx = a sec²u du , √(a²+x²) = a sec u
Ex: ∫ 1/(x²+4)^{3/2} dx = ∫ 2 sec²u/(2 sec u)³ du (x = 2 tan u) = …
√(x²−a²) : set x = a sec u . dx = a sec u tan u du , √(x²−a²) = a tan u
Ex: ∫ 1/(x²√(x²−1)) dx = ∫ sec u tan u/(sec²u tan u) du (x = sec u) = …
Undoing the ``u-substitution'': use right triangles
§ 6: Improper integrals
usual idea: ∫_a^b f(x) dx = F(b) − F(a), where F′(x) = f(x)
Problems: a = −∞, b = ∞ ; f blows up at a or b or somewhere in between
integral is ``improper''; the usual technique doesn't work
Solutions:
∫_a^∞ f(x) dx = lim_{b→∞} ∫_a^b f(x) dx
(similarly for a = −∞)
(blows up at a) ∫_a^b f(x) dx = lim_{r→a+} ∫_r^b f(x) dx
(similarly for a blowup at b (or both!))
(blows up at c (between a and b))
∫_a^b f(x) dx = lim_{r→c−} ∫_a^r f(x) dx + lim_{s→c+} ∫_s^b f(x) dx
integral converges if (all of the) limit(s) are finite
Comparison: 0 ≤ f(x) ≤ g(x) for all x;
if ∫_a^∞ g(x) dx converges, so does ∫_a^∞ f(x) dx
Limit comparison: f,g ≥ 0 , lim_{x→∞} f(x)/g(x) = L , L ≠ 0,∞ , then
∫_a^∞ f(x) dx and ∫_a^∞ g(x) dx either both converge or both diverge
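The limit definition in action, on the standard pair ∫_1^∞ dx/x² (converges to 1) versus ∫_1^∞ dx/x (diverges):

```python
import math

# ∫_1^b dx/x^2 = 1 - 1/b (from F(x) = -1/x), which tends to 1 as b→∞,
# while ∫_1^b dx/x = ln(b), which grows without bound.
for b in [10.0, 1000.0, 100000.0]:
    print(b, 1 - 1 / b, math.log(b))   # middle column settles at 1; last grows
```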
Chapter 8: Infinite series
§ 1: Limits of sequences of numbers
A sequence is:
a string of numbers
a function f: N → R ; write f(n) = a_n
Recursive formula: formula for a_n based on a_{n−1}, a_{n−2}, … e.g., a_n = 1 + (1/a_{n−1})
Graph: on a number line; in the Cartesian plane
Basic question: convergence/divergence
lim_{n→∞} a_n = L (or a_n → L) if
eventually all of the a_n are as close to L as we like, i.e.
for any ε > 0, there is an N so that if n ≥ N then |a_n − L| < ε
Ex.: a_n = 1/n converges to 0 ; can always choose N = 1/ε
a_n = (−1)^n diverges; the terms of the sequence never settle down to a single number
Subsequence: {a_{n_k}}_{k=1}^∞ ; choose only occasional terms out of {a_n}_{n=1}^∞
If a_n → L, then a_{n_k} → L for every subsequence
If a_n is increasing (a_{n+1} ≥ a_n for every n) and bounded from above
(a_n ≤ M for every n, for some M), THEN a_n converges (but not necessarily to M !)
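The recursive example a_n = 1 + 1/a_{n−1} converges to the golden ratio (1+√5)/2, the positive solution of L = 1 + 1/L (the limit's value is standard, though not stated in the notes); a quick sketch:

```python
# Iterate a_n = 1 + 1/a_{n-1} from a_1 = 1; the terms settle down to
# the golden ratio (1 + sqrt(5))/2, the positive solution of L = 1 + 1/L.
a = 1.0
for _ in range(60):
    a = 1 + 1 / a
golden = (1 + 5**0.5) / 2
print(a, golden)    # both 1.6180339887...
```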
§ 2: Limit theorems for sequences
Idea: limits of sequences are a lot like limits of functions
If a_n → L and b_n → M, then
a_n + b_n → L + M , a_n − b_n → L − M , a_n b_n → L M , and
a_n/b_n → L/M (provided M and all the b_n are ≠ 0)
Squeeze play: if a_n ≤ b_n ≤ c_n (for all n large enough) and
a_n → L and c_n → L , then b_n → L
If a_n → L and f: R → R is continuous at L, then f(a_n) → f(L)
if a_n = f(n) for some function f: R → R and lim_{x→∞} f(x) = L , then a_n → L
(allows us to use L'Hôpital's Rule!)
Another basic list: (x = fixed number, k = constant)
1/n → 0 ; k → k
x^{1/n} → 1 (x > 0) ; n^{1/n} → 1
(1 + x/n)^n → e^x ; x^n/n! → 0
x^n → 0 if |x| < 1 ; → 1 if x = 1 ; diverges otherwise
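One entry from the basic list, x^n/n! → 0, checked numerically for x = 10 (the point being that n! eventually swamps x^n even when x is large):

```python
# Compute x^n/n! incrementally for x = 10: multiply by x/n at each step.
x = 10.0
term = 1.0                  # x^0/0!
for n in range(1, 101):
    term *= x / n           # now term = x^n/n!
print(term)                 # essentially 0 (about 1e-58)
```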
§ 3: Infinite series
An infinite series is an infinite sum of numbers
a_1 + a_2 + a_3 + … = ∑_{n=1}^∞ a_n (summation notation)
nth term of the series = a_n ; Nth partial sum of the series = s_N = ∑_{n=1}^N a_n
The infinite series converges if the sequence of partial sums {s_N}_{N=1}^∞ converges
We may start the series anywhere: ∑_{n=0}^∞ a_n , ∑_{n=1}^∞ a_n , ∑_{n=3437}^∞ a_n , etc. ;
convergence is unaffected (but the number it adds up to is!)
Ex. geometric series: a_n = a r^n ; ∑_{n=0}^∞ a_n = a/(1−r)
if |r| < 1; otherwise, the series diverges.
Ex. telescoping series: partial sums s_N `collapse' to a simple expression
E.g. ∑_{n=1}^∞ 1/(n(n+2)) ; since 1/(n(n+2)) = (1/2)(1/n − 1/(n+2)),
s_N = (1/2)((1 + 1/2) − (1/(N+1) + 1/(N+2)))
nth term test: if ∑_{n=1}^∞ a_n converges, then a_n → 0
So if the nth terms don't go to 0, then ∑_{n=1}^∞ a_n diverges
Basic limit theorems: if ∑_{n=1}^∞ a_n and ∑_{n=1}^∞ b_n converge, then
∑_{n=1}^∞ (a_n + b_n) = ∑_{n=1}^∞ a_n + ∑_{n=1}^∞ b_n
∑_{n=1}^∞ (a_n − b_n) = ∑_{n=1}^∞ a_n − ∑_{n=1}^∞ b_n
∑_{n=1}^∞ (k a_n) = k ∑_{n=1}^∞ a_n
(Note: there are no no NO results corresponding to products or quotients.)
Truncating a series: ∑_{n=1}^∞ a_n = ∑_{n=N}^∞ a_n + ∑_{n=1}^{N−1} a_n
§ 4: The integral test
Idea: for ∑_{n=1}^∞ a_n with a_n ≥ 0 for all n, the partial sums
{s_N}_{N=1}^∞ form an increasing sequence,
so they converge exactly when they are bounded from above
If (eventually) a_n = f(n) for a decreasing function f: [a,∞) → R, then
∫_a^{N+1} f(x) dx ≤ s_N = ∑_{n=a}^N a_n ≤ a_a + ∫_a^N f(x) dx
so ∑_{n=a}^∞ a_n converges exactly when ∫_a^∞ f(x) dx converges
Ex: ∑_{n=1}^∞ 1/n^p converges exactly when p > 1 (p-series)
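The p-series dichotomy, numerically: for p = 2 the partial sums settle down (to π²/6, a standard fact not needed for the test), while for p = 1 they keep growing like ln N:

```python
# Partial sums of the p-series for p = 2 (converges) and p = 1 (diverges).
def partial_sum(p, N):
    return sum(1 / n**p for n in range(1, N + 1))

print(partial_sum(2, 1000), partial_sum(2, 100000))   # both near 1.64
print(partial_sum(1, 1000), partial_sum(1, 100000))   # still growing
```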
§ 5: Comparison tests
Again, think ∑_{n=1}^∞ a_n , with a_n ≥ 0 for all n
Convergence depends only on the partial sums s_N being bounded
One easy way to determine this: compare the series with one we know converges or diverges
Comparison test: If b_n ≥ a_n ≥ 0 for all n (past a certain point), then
if ∑_{n=1}^∞ b_n converges, so does ∑_{n=1}^∞ a_n
if ∑_{n=1}^∞ a_n diverges, so does ∑_{n=1}^∞ b_n
(i.e., smaller than a convergent series converges;
bigger than a divergent series diverges)
More refined: Limit comparison test: a_n and b_n ≥ 0 for all n, a_n/b_n → L
If L ≠ 0 and L ≠ ∞, then ∑ a_n and ∑ b_n either both converge or both diverge
If L = 0 and ∑ b_n converges, then so does ∑ a_n
If L = ∞ and ∑ b_n diverges, then so does ∑ a_n
(Why? eventually (L/2) b_n ≤ a_n ≤ (3L/2) b_n ; so we can use the comparison test.)
Ex: ∑ 1/(n³−1) converges; limit comparison with ∑ 1/n³
∑ n/3^n converges; limit comparison with ∑ 1/2^n
∑ 1/(n ln(n²+1)) diverges; limit comparison with ∑ 1/(n ln n)
§ 6: The ratio and root tests
Previous tests have you compare your series with something else
(another series, an improper integral)
Ratio test: ∑ a_n , a_n > 0 for all n; lim_{n→∞} a_{n+1}/a_n = L
If L < 1, then ∑ a_n converges
If L > 1, then ∑ a_n diverges
If L = 1, then try something else!
Root test: ∑ a_n , a_n > 0 for all n; lim_{n→∞} (a_n)^{1/n} = L
If L < 1, then ∑ a_n converges
If L > 1, then ∑ a_n diverges
If L = 1, then try something else!
Ex: ∑ 4^n/n! converges by the ratio test
∑ n^5/n^n converges by the root test
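The ratio test on the first example: for a_n = 4^n/n! the ratios a_{n+1}/a_n = 4/(n+1) → 0 < 1, so the series converges (in fact to e^4, though the test alone doesn't tell you that):

```python
import math

# Sum 4^n/n! by repeatedly multiplying by the ratio a_{n+1}/a_n = 4/(n+1).
total, term = 0.0, 1.0          # term = 4^n/n!, starting at n = 0
for n in range(60):
    total += term
    term *= 4 / (n + 1)         # multiply by the ratio a_{n+1}/a_n
print(total, math.exp(4))       # both about 54.598
```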
§ 8: Power series
Idea: turn a series into a function, by making the terms a_n depend on x
replace a_n with a_n x^n ; a series of powers
∑_{n=0}^∞ a_n x^n = power series centered at 0
∑_{n=0}^∞ a_n (x−a)^n = power series centered at a
Big question: where does it converge? Solution from the ratio test:
lim |a_{n+1}/a_n| = L , set R = 1/L ;
then ∑_{n=0}^∞ a_n (x−a)^n converges for |x−a| < R,
diverges for |x−a| > R ; R = radius of convergence
Ex.: ∑_{n=0}^∞ x^n = 1/(1−x) ; converges for |x| < 1
Idea: the partial sums ∑_{k=0}^n a_k x^k are polynomials;
if f(x) = ∑_{n=0}^∞ a_n x^n , then the polynomials make good approximations for f
Differentiation and integration of power series
Idea: if you differentiate or integrate each term of a power series, you get a power series
which is the derivative or integral of the original one.
If f(x) = ∑_{n=0}^∞ a_n (x−a)^n has radius of convergence R,
then so does g(x) = ∑_{n=1}^∞ n a_n (x−a)^{n−1}, and g(x) = f′(x)
AND so does g(x) = ∑_{n=0}^∞ a_n/(n+1) (x−a)^{n+1}, and g′(x) = f(x)
Ex: f(x) = ∑_{n=0}^∞ x^n/n! , then f′(x) = f(x) , so (since f(0) = 1) f(x) = e^x = ∑_{n=0}^∞ x^n/n!
Ex.: 1/(1−x) = ∑_{n=0}^∞ x^n , so −ln(1−x) = ∑_{n=0}^∞ x^{n+1}/(n+1) (for |x| < 1)
Ex: arctan x = ∫ 1/(1−(−x²)) dx = ∫ ∑_{n=0}^∞ (−x²)^n dx = ∑_{n=0}^∞ (−1)^n x^{2n+1}/(2n+1) (for |x| < 1)
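The arctan series summed at x = 1/2 (inside the interval of convergence), as a quick numerical check:

```python
import math

# Sum the series (-1)^n x^{2n+1}/(2n+1) at x = 1/2 and compare
# with arctan(1/2).
x = 0.5
total = 0.0
for n in range(50):
    total += (-1)**n * x**(2 * n + 1) / (2 * n + 1)
print(total, math.atan(0.5))    # the series matches arctan(1/2)
```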
§ 9: Taylor and Maclaurin series
Idea: start with a function f(x), find a power series FOR it.
IF f(x) = ∑_{n=0}^∞ a_n (x−a)^n , then (term-by-term differentiation)
f^{(n)}(a) = n! a_n ; SO a_n = f^{(n)}(a)/n!
Starting with f, define P(x) = ∑_{n=0}^∞ f^{(n)}(a)/n! (x−a)^n ,
the Taylor series for f, centered at a.
P_n(x) = ∑_{k=0}^n f^{(k)}(a)/k! (x−a)^k , the nth Taylor polynomial for f.
Ex.: f(x) = sin x, then (centered at 0) P(x) = ∑_{n=0}^∞ (−1)^n/(2n+1)! x^{2n+1}
Big questions: Is f(x) = P(x) ? (I.e., does f(x) − P_n(x) tend to 0 ?)
If so, how well do the P_n's approximate f ? (I.e., how small IS f(x) − P_n(x) ?)
§ 10: Error estimates
f(x) = ∑_{n=0}^∞ f^{(n)}(a)/n! (x−a)^n means that the
value of f at a point x (far from a) can be determined just from the behavior of f
near a (i.e., from the derivatives of f at a). This is a VERY powerful property, one that
we wouldn't ordinarily expect to be true. The amazing thing is that it often IS:
P(x,a) = ∑_{n=0}^∞ f^{(n)}(a)/n! (x−a)^n ;
P_n(x,a) = ∑_{k=0}^n f^{(k)}(a)/k! (x−a)^k ;
R_n(x,a) = f(x) − P_n(x,a) = nth remainder term = error in using P_n to approximate f
Taylor's remainder theorem: estimates the size of R_n(x,a)
If f(x) and all of its derivatives (up to the (n+1)st) are continuous on [a,b], then
f(b) = P_n(b,a) + f^{(n+1)}(c)/(n+1)! (b−a)^{n+1} , for some c in [a,b]
i.e., for each x, R_n(x,a) = f^{(n+1)}(c)/(n+1)! (x−a)^{n+1} , for some c between a and x
so if |f^{(n+1)}(x)| ≤ M for every x in [a,b], then
|R_n(x,a)| ≤ M/(n+1)! |x−a|^{n+1} for every x in [a,b]
Ex.: f(x) = sin x, then |f^{(n+1)}(x)| ≤ 1 for all x, so
|R_n(x,0)| ≤ |x|^{n+1}/(n+1)! → 0 as n → ∞
so sin x = ∑_{n=0}^∞ (−1)^n/(2n+1)! x^{2n+1}
Similarly, cos x = ∑_{n=0}^∞ (−1)^n/(2n)! x^{2n}
Use Taylor's remainder to estimate values of functions:
e^x = ∑_{n=0}^∞ x^n/n! , so e = e^1 = ∑_{n=0}^∞ 1/n!
R_n(1,0) = f^{(n+1)}(c)/(n+1)! = e^c/(n+1)! ≤ e^1/(n+1)! ≤ 4/(n+1)!
since e < 4 (since ln(4) > (1/2)(1) + (1/4)(2) = 1)
(a Riemann sum for the integral of 1/x)
so since 4/(13+1)! ≈ 4.58×10^{−11},
e = 1 + 1 + 1/2 + 1/6 + 1/24 + 1/120 + … + 1/13! , to 10 decimal places.
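The estimate above, carried out as a sketch (summing 1/n! through n = 13 and comparing with the remainder bound 4/14!):

```python
import math

# e = sum 1/n!; through n = 13 the remainder is at most 4/14! < 5e-11,
# so the partial sum is correct to 10 decimal places.
total, term = 0.0, 1.0          # term = 1/n!, starting at n = 0
for n in range(14):
    total += term
    term /= n + 1               # now term = 1/(n+1)!
bound = 4 * term                # 4/14!, the remainder bound
print(total, math.e)            # agree to 10 decimal places
print(bound)                    # about 4.6e-11
```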
Other uses: if you know the Taylor series, it tells you the values of the derivatives at the center.
Ex.: e^x = ∑_{n=0}^∞ x^n/n! , so x e^x = ∑_{n=0}^∞ x^{n+1}/n! , so the
15th derivative of x e^x , at 0, is 15!·(coeff of x^{15}) = 15!/14! = 15
Substitutions: new Taylor series out of old ones
Ex. sin²x = (1 − cos(2x))/2 = (1/2)(1 − ∑_{n=0}^∞ (−1)^n (2x)^{2n}/(2n)!)
= (1/2)(1 − (1 − (2x)²/2! + (2x)⁴/4! − (2x)⁶/6! + …))
= 2x²/2! − 2³x⁴/4! + 2⁵x⁶/6! − 2⁷x⁸/8! + …
Integrate functions we can't handle any other way:
Ex.: e^{−x²} = ∑_{n=0}^∞ (−x²)^n/n! , so
∫ e^{−x²} dx = ∑_{n=0}^∞ (−1)^n x^{2n+1}/(n!(2n+1)) + c