Commit 6d278ada authored by Jim Hefferon

corrections from David Guichard

parent bb892c5e
......@@ -2446,7 +2446,7 @@ we have the matrix whose rows are the corresponding $\iota$'s.
\end{equation*}
\begin{example} \label{ex:TwoThreePermsAndMatrices}
-These are the permutation matrices associated with the $2$-permutations
+These are the permutation matrices for the $2$-permutations
listed in \nearbyexample{ex:AllTwoThreePerms}.
\begin{equation*}
P_{\phi_1}
......
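Only the start of the display survives the cut, but the two $2$-permutations are just the identity and the swap of $1$ and~$2$, so a sketch of the associated matrices is easy to give. Standard amsmath notation is used here instead of the book's mat environment, and the labeling of $\phi_1$ as the identity is an assumption.

% Sketch (not the book's source): row i of each matrix is \iota_{\phi(i)},
% per the convention quoted in the hunk header above.
\begin{equation*}
  P_{\phi_1}=
  \begin{pmatrix}
    1 &0 \\
    0 &1
  \end{pmatrix}
  \qquad
  P_{\phi_2}=
  \begin{pmatrix}
    0 &1 \\
    1 &0
  \end{pmatrix}
\end{equation*}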
......@@ -294,7 +294,7 @@ Every space is isomorphic to itself under the identity map.
\begin{definition} \label{df:Automorphism}
%<*df:Automorphism>
-An \definend{automorphism}\index{automorphism} %
+An \definend{automorphism}\index{automorphism}
is an isomorphism of a space with
itself\index{isomorphism!of a space with itself}.
%</df:Automorphism>
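For a quick check of the definition, here is one standard automorphism; the dilation map is chosen for illustration and is not taken from the book's surrounding text.

% Sketch: for a fixed scalar s != 0 the dilation d_s is linear, one-to-one,
% and onto, so it is an isomorphism of R^2 with itself, that is, an automorphism.
\begin{equation*}
  d_s\colon\Re^2\to\Re^2
  \qquad
  \begin{pmatrix} x \\ y \end{pmatrix}
  \mapsto
  \begin{pmatrix} sx \\ sy \end{pmatrix}
\end{equation*}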
......@@ -373,7 +373,7 @@ captures our intuition of vector spaces being the same.
Of course, the definition itself is persuasive:~a vector space
consists of a set and some structure
and the definition simply requires
-that the sets correspond and that the structures corresponds also.
+that the sets correspond and that the structures correspond also.
Also persuasive are the examples above, such as
\nearbyexample{exam:TwoWideIsoTwoTall}
giving the isomorphism between the space of two-wide row vectors and
......
......@@ -1349,7 +1349,7 @@ The difference is that homomorphisms are subject to fewer restrictions
because they needn't be onto and
needn't be one-to-one.
We will examine what can happen with homomorphisms
-that cannot happen to isomorphisms.
+that cannot happen with isomorphisms.
We first consider the effect of
not requiring that a homomorphism be
......@@ -1673,12 +1673,12 @@ is a subspace of the domain.
%</lm:NullspIsSubSp>
\end{lemma}
-\begin{remark}
-This result is about inverse images of sets
-$h^{-1}(S)=\set{\vec{v}\in V\suchthat h(\vec{v})\in S}$
-whereas the examples above consider inverse images of single vectors.
+\begin{remark}\
+The examples above consider inverse images of single vectors
+but this result is about inverse images of sets
+$h^{-1}(S)=\set{\vec{v}\in V\suchthat h(\vec{v})\in S}$.
We use the same term in both cases by defining the inverse
-image of the element $h^{-1}(\vec{w})$ as
+image of a single element $h^{-1}(\vec{w})$ as
the inverse image of the one-element set $h^{-1}(\set{\vec{w}})$.
\end{remark}
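A concrete instance of the distinction drawn in the remark, using a projection map chosen here for illustration (the elided examples above may use something else): the inverse image of a single vector, read as the inverse image of a one-element set, versus the inverse image of a subspace.

% Sketch with the projection pi: R^3 -> R^2, pi(x,y,z) = (x,y).
% The inverse image of the single vector (1,2) is a line in R^3;
% the inverse image of the x-axis subspace is the xz-plane.
\begin{equation*}
  \pi^{-1}\begin{pmatrix} 1\\2 \end{pmatrix}
  =\bigl\{\begin{pmatrix} 1\\2\\z \end{pmatrix}\bigm| z\in\Re\bigr\}
  \qquad
  \pi^{-1}(x\text{-axis})
  =\bigl\{\begin{pmatrix} x\\0\\z \end{pmatrix}\bigm| x,z\in\Re\bigr\}
\end{equation*}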
......@@ -1688,7 +1688,7 @@ Let $\map{h}{V}{W}$ be a homomorphism
and let $S$ be a subspace of the range space of $h$.
Consider the inverse image of $S$.
It is nonempty because it contains $\zero_V$, since
-\( h(\zero_V)=\zero_W \) and \( \zero_W \) is an element $S$,
+\( h(\zero_V)=\zero_W \) and \( \zero_W \) is an element of $S$,
as $S$ is a subspace.
To finish we show that it is closed under linear combinations.
Let \( \vec{v}_1 \) and \( \vec{v}_2 \) be two elements of $h^{-1}(S)$.
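The hunk cuts off just before the closure computation; the standard argument, given here as a sketch rather than the book's wording, is one line of linearity.

% Sketch: for v_1, v_2 in h^{-1}(S) and any scalars c_1, c_2, linearity of h
% plus closure of the subspace S puts the combination back in h^{-1}(S).
\begin{equation*}
  h(c_1\vec{v}_1+c_2\vec{v}_2)
  =c_1 h(\vec{v}_1)+c_2 h(\vec{v}_2)\in S
  \quad\text{and so}\quad
  c_1\vec{v}_1+c_2\vec{v}_2\in h^{-1}(S)
\end{equation*}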
......@@ -1708,7 +1708,7 @@ is a member of $S$.
%<*df:NullSpace>
The \definend{null space}\index{homomorphism!null space}\index{null space}
or \definend{kernel}\index{kernel} of a linear map
-\( \map{h}{V}{W} \) is the inverse image of $0_W$.
+\( \map{h}{V}{W} \) is the inverse image of $\zero_W$.
\begin{equation*}
\nullspace{h}=h^{-1}(\zero_W)=\set{\vec{v}\in V\suchthat h(\vec{v})=\zero_W}
\end{equation*}
......
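One worked instance of the definition may help; the projection map is assumed for illustration and is not an example drawn from the book.

% Sketch: for the projection pi: R^3 -> R^2 with pi(x,y,z) = (x,y), the null
% space (written \nullspace{pi} in the book's macros) is the z-axis, the
% inverse image of the zero vector of R^2.
\begin{equation*}
  \mathcal{N}(\pi)=\pi^{-1}(\vec{0})
  =\bigl\{\begin{pmatrix} 0\\0\\z \end{pmatrix}\bigm| z\in\Re\bigr\}
\end{equation*}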
......@@ -98,9 +98,9 @@ We can also consider how to compute the representation of the sum of two maps
from the representation of those maps.
\begin{example}
-Let two linear maps with the same domain and codomain
-\( \map{f,g}{\Re^2}{\Re^3} \) be represented
-with respect to some bases $B$ and $D$ by these matrices.
+Suppose that two linear maps with the same domain and codomain
+\( \map{f,g}{\Re^2}{\Re^3} \) are represented
+with respect to some bases $B$ and~$D$ by these matrices.
\begin{equation*}
\rep{f}{B,D}=
\begin{mat}
......@@ -933,8 +933,8 @@ function composition is possible.
\;\stackrel{g}{\longrightarrow}\;
\text{dimension \( m \) space}
\end{equation*}
-So matrix product has a $\nbym{m}{r}$ matrix~$G$
-times a $\nbym{r}{n}$ matrix~$F$ to get a
+So the matrix product has an $\nbym{m}{r}$ matrix~$G$
+times an $\nbym{r}{n}$ matrix~$F$ to get an
$\nbym{m}{n}$ result~$GF$.
Briefly,
`$\nbym{m}{r}\text{\ times\ }\nbym{r}{n}\text{\ equals\ }\nbym{m}{n}$'.
......
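A tiny numerical check of the dimension bookkeeping, with made-up entries: here $m=2$, $r=3$, $n=2$, so a $2\times 3$ matrix $G$ times a $3\times 2$ matrix $F$ gives a $2\times 2$ product $GF$.

% Sketch: entry (i,j) of GF is the dot product of row i of G with column j of F.
\begin{equation*}
  \underbrace{\begin{pmatrix} 1 &0 &2 \\ 0 &1 &1 \end{pmatrix}}_{G\colon 2\times 3}
  \underbrace{\begin{pmatrix} 1 &1 \\ 0 &2 \\ 3 &0 \end{pmatrix}}_{F\colon 3\times 2}
  =\underbrace{\begin{pmatrix} 7 &1 \\ 3 &2 \end{pmatrix}}_{GF\colon 2\times 2}
\end{equation*}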
File mode changed from 100755 to 100644
......@@ -1291,8 +1291,8 @@ tells us that a linearly independent set is maximal when it spans the space.
\end{answer}
\recommended \item
\begin{exparts}
-\partsitem Show that if the\( \set{\vec{u},\vec{v},\vec{w}} \)
-is linearly independent then so is
+\partsitem Show that if the set \( \set{\vec{u},\vec{v},\vec{w}} \)
+is linearly independent then so is the set
\( \set{\vec{u},\vec{u}+\vec{v},\vec{u}+\vec{v}+\vec{w}} \).
\partsitem What is the relationship between the linear independence
or dependence of \( \set{\vec{u},\vec{v},\vec{w}} \) and the
......
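For the first part of the exercise shown in this hunk, the usual computation (a sketch, not the book's answer text) is to regroup a relation among the new vectors as a relation among $\vec{u}$, $\vec{v}$, $\vec{w}$.

% Sketch: independence of {u, v, w} forces c_3 = 0 from the w term, then
% c_2 = 0 from the v term, then c_1 = 0, so the new set is independent too.
\begin{equation*}
  c_1\vec{u}+c_2(\vec{u}+\vec{v})+c_3(\vec{u}+\vec{v}+\vec{w})=\vec{0}
  \;\Longrightarrow\;
  (c_1+c_2+c_3)\vec{u}+(c_2+c_3)\vec{v}+c_3\vec{w}=\vec{0}
\end{equation*}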
......@@ -362,7 +362,7 @@ Then we have this.
Here, although we've omitted the subscript \( B \) from the column,
the fact that the right side is a representation is clear from the context.
-The advantage the notation and the term `coordinates' is that they
+The advantage of the notation and the term `coordinates' is that they
generalize the familiar case:~in \( \Re^n \)
and with respect to the standard
basis \( \stdbasis_n \), the vector starting at the origin and ending at
......@@ -1428,7 +1428,7 @@ so \( \vec{\delta}_1 \) is a nontrivial linear combination of elements of
\( B \).
By the Exchange Lemma, we can swap \( \vec{\delta}_1 \) for a
vector from \( B \), resulting in a basis \( B_1 \), where one element is
-\( \vec{\delta} \) and all of the \( n-1 \) other elements
+\( \vec{\delta}_1 \) and all of the \( n-1 \) other elements
are \( \vec{\beta} \)'s.
%</pf:AllBasesSameSize1>
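A toy illustration of the swap described in this proof fragment, with an assumed two-dimensional case that is not part of the book's argument:

% Sketch: delta_1 = e_1 + e_2 is a nontrivial combination of the elements of
% B = <e_1, e_2>; its e_1 coefficient is nonzero, so the Exchange Lemma lets
% us trade delta_1 for e_1, giving the basis B_1 = <delta_1, e_2>.
\begin{equation*}
  \vec{\delta}_1=1\cdot\vec{e}_1+1\cdot\vec{e}_2
  \qquad
  B=\langle\vec{e}_1,\vec{e}_2\rangle
  \;\longrightarrow\;
  B_1=\langle\vec{\delta}_1,\vec{e}_2\rangle
\end{equation*}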
......@@ -1536,7 +1536,7 @@ Any linearly independent set can be expanded to make a basis.
If a linearly independent set
is not already a basis then it must not span the space.
Adding to the set a vector that is not in the span
-will preserves linear independence.
+will preserve linear independence.
Keep adding until the resulting set does span the space,
which the prior corollary
shows will happen after only a finite number of steps.
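A small instance of the procedure described in this hunk (a sketch with an assumed starting set): the two vectors below are independent but span only the $xy$-plane of $\Re^3$, and adding a vector outside that span preserves independence and produces a basis.

% Sketch: (0,0,1) is not in the span of the first two vectors, so adding it
% keeps the set independent; the enlarged set spans R^3, hence is a basis.
\begin{equation*}
  \bigl\{\begin{pmatrix} 1\\0\\0 \end{pmatrix},
         \begin{pmatrix} 1\\1\\0 \end{pmatrix}\bigr\}
  \;\longrightarrow\;
  \bigl\{\begin{pmatrix} 1\\0\\0 \end{pmatrix},
         \begin{pmatrix} 1\\1\\0 \end{pmatrix},
         \begin{pmatrix} 0\\0\\1 \end{pmatrix}\bigr\}
\end{equation*}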
......@@ -2360,7 +2360,7 @@ set.
Lemma~One.III.\ref{le:EchFormNoLinCombo}
says that no nonzero row of an echelon form matrix is
a linear combination of the other rows.
-This is restates that result in this chapter's terminology.
+This restates that result in this chapter's terminology.
%</pf:RowsEchMatLI>
\end{proof}
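A quick instance of the restated result (a sketch; the matrix is made up): in the echelon form matrix below each nonzero row has its leading entry in a column where the later rows are zero, so no nonzero row is a combination of the others.

% Sketch: in c_1*rho_1 + c_2*rho_2 = 0 the first column forces c_1 = 0 and
% then the third column forces c_2 = 0, so the nonzero rows are independent.
\begin{equation*}
  \begin{pmatrix}
    1 &2 &0 &4 \\
    0 &0 &3 &1 \\
    0 &0 &0 &0
  \end{pmatrix}
\end{equation*}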
......@@ -4268,7 +4268,7 @@ To show that $V$ is a direct sum of the two, we need only show that
the spaces are independent\Dash no nonzero member of the first is expressible
as a linear combination of members of the second, and vice versa.
This is true because any relationship
-$\vec{w}_1=c_1\vec{w}_{2,1}+\dots+d_k\vec{w}_{2,k}$ (with $\vec{w}_1\in W_1$
+$\vec{w}_1=c_1\vec{w}_{2,1}+\dots+c_k\vec{w}_{2,k}$ (with $\vec{w}_1\in W_1$
and $\vec{w}_{2,j}\in W_2$ for all $j$) shows that the vector on the left is
also in $W_2$, since the right side is a combination of members of $W_2$.
The intersection of these two spaces is trivial, so $\vec{w}_1=\zero$.
......
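A concrete case of the decomposition argued in this hunk (a sketch with assumed subspaces, not drawn from the elided text): the $xy$-plane and the $z$-axis of $\Re^3$ sum to the whole space and intersect only in $\vec{0}$, so any member of one that is a combination of members of the other must be the zero vector.

% Sketch: W_1 + W_2 = R^3 and the intersection of W_1 and W_2 is trivial,
% so R^3 is the direct sum of W_1 and W_2.
\begin{equation*}
  W_1=\bigl\{\begin{pmatrix} x\\y\\0 \end{pmatrix}\bigm| x,y\in\Re\bigr\}
  \qquad
  W_2=\bigl\{\begin{pmatrix} 0\\0\\z \end{pmatrix}\bigm| z\in\Re\bigr\}
  \qquad
  \Re^3=W_1\oplus W_2
\end{equation*}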
File mode changed from 100755 to 100644