Location (Chap. 1)  Correction 
Location (Chap. 2)  Correction 
page 6  The text above Remark 1 should read
"Replacing C by f(a) ..." 
page 6  some of the decimal expansions in the box at
the top are approximate (not exactly equal); same with 1/3 further
down 
page 12  line 5 formula has extra f(x_k) and delta
x; on line 7, extra delta x 
page 15  first integral has value 1000/3 (only
approximately 333.33) 
Location (Chap. 3)  Correction 
page 8  in solution of 3., cos(0) = 1 (not -1) 
Location (Chap. 4)  Correction 
page 4  in (4.3), y(0) = y_0 (not 0) 
Location (Chap. 5)  Correction 
Section 5.1  Most references to "surface" in this section should actually refer to "volume". 
Location (Chap. 6)  Correction 
page 3  line 9: in defn. of f'(x), limit as Delta x → 0
(not x → 0) 
page 23  About 1/3 of the way down the page:
"Then du = sec(x) tan(x) dx while dv = ...".
The "dv" should be a "v".

page 25  #5 should be -cos(u) + C (not cos(u) + C) 
Location (Chap. 8)  Correction 
page 4  The formulas for (and explanation of) mean and
variance are a little confusing. First of all, the x_{i} here are
not the same as in Fig. 8.1. The x is a random variable, that is,
some quantity we are interested in (such as the number of "heads")
which depends on the outcome of the random events (such as tossing a coin),
and which can take on the possible values x_{0}, x_{1},
x_{2}, etc.
For the example where x = # of "heads" in n coin tosses, the
possible values are x_{0}=0, x_{1}=1, ..., x_{n}=n;
that is, x_{i} = i for i=0,1,...,n
(the index set doesn't always have to be i=0,1,...,n, but it is in
this example). Then the expected value (or mean) of the random
variable x is given by
sum_{i=0}^{n} x_{i} p(x_{i})
= x_{0} p(x_{0}) + x_{1} p(x_{1}) + ... +
x_{n} p(x_{n}),
where p(x_{i}) is the probability that x takes the value
x_{i} (you could also write this as p(x = x_{i})).
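As a quick check of the mean formula above, here is a minimal sketch (not from the book) for the coin-toss example, with n = 4 chosen purely for illustration; p(x_i) = C(n, i)/2^n is the usual binomial probability of getting i heads:

```python
from math import comb

n = 4  # number of coin tosses (illustrative choice, not from the text)
# possible values x_i = i with probabilities p(x_i) = C(n, i) / 2^n
p = {i: comb(n, i) / 2**n for i in range(n + 1)}
# mean = sum_{i=0}^{n} x_i p(x_i)
mean = sum(x_i * p[x_i] for x_i in p)
print(mean)  # 2.0, i.e. n/2, as expected for a fair coin
```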

page 8  The "multiplication principle" can be stated more
generally for any two events e_{1} and e_{2}:
P(e_{1} and e_{2}) = P(e_{1}) P(e_{2} assuming that e_{1}
happened).
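A small worked example of this general multiplication principle (the card numbers are illustrative, not from the errata): drawing two cards without replacement, with e_1 = "first card is an ace" and e_2 = "second card is an ace":

```python
from fractions import Fraction

p_e1 = Fraction(4, 52)            # P(e1): 4 aces among 52 cards
p_e2_given_e1 = Fraction(3, 51)   # P(e2 assuming e1 happened): 3 aces left among 51 cards
p_both = p_e1 * p_e2_given_e1     # P(e1 and e2)
print(p_both)                     # 1/221
```

Note that e_1 and e_2 here are not independent, which is exactly why the general form is needed.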

page 9  The "addition principle" as stated is for
mutually exclusive events e_{1} and e_{2}
(not independent events). More generally, for any two
events (mutually exclusive or otherwise):
P(e_{1} or e_{2}) = P(e_{1}) + P(e_{2}) -
P(e_{1} and e_{2})
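The general addition principle can be verified by direct enumeration on a small sample space; this sketch (my own example, not from the book) uses one roll of a fair die, with e_1 = "even number" and e_2 = "greater than 3":

```python
from fractions import Fraction

outcomes = range(1, 7)                     # one roll of a fair die
e1 = {x for x in outcomes if x % 2 == 0}   # {2, 4, 6}: even
e2 = {x for x in outcomes if x > 3}        # {4, 5, 6}: greater than 3

def P(e):
    return Fraction(len(e), 6)

# P(e1 or e2) = P(e1) + P(e2) - P(e1 and e2)
assert P(e1 | e2) == P(e1) + P(e2) - P(e1 & e2)
print(P(e1 | e2))  # 2/3
```

Here the events are not mutually exclusive (both contain 4 and 6), so the correction term P(e_1 and e_2) is essential.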

page 16  in Fig. 8.3 (b) and (c) it should be
P(n,k) = n(n-1)(n-2)...(n-k+1) (the product should not
go to (n-k) as shown) 
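The corrected product for P(n,k) can be sketched directly (the function name mirrors the text's notation; the check against the standard library is my own addition):

```python
from math import perm

def P(n, k):
    """Ordered arrangements of k items chosen from n:
    n(n-1)(n-2)...(n-k+1) -- the last factor is n-k+1, not n-k."""
    result = 1
    for factor in range(n, n - k, -1):  # n, n-1, ..., n-k+1
        result *= factor
    return result

print(P(5, 2))             # 20, i.e. 5*4
assert P(5, 2) == perm(5, 2)  # agrees with math.perm
```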
Location (Chap. 9)  Correction 
page 8  Section 9.2.1 should be titled Random
Nonassortative Mating. Assortative mating refers to the case in which
individuals with the same alleles are more likely to mate with each other than
with those having the other alleles (e.g. AA mates with AA preferentially over Aa).
