Jim Hefferon / linearalgebra

Commit dcc4189f, authored Jan 09, 2012 by Jim Hefferon
parent 5526a5ff

    spell checks for the third chapter

Showing 11 changed files with 12389 additions and 12389 deletions
book.pdf       +5354  −5381
bookans.tex    +16    −16
homogeom.tex   +4     −4
jhanswer.pdf   +6980  −6953
lstsqs.tex     +1     −1
map1.tex       +1     −1
map2.tex       +4     −4
map3.tex       +8     −8
map4.tex       +6     −6
map5.tex       +8     −8
map6.tex       +7     −7
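The commit message says this was a spell-check pass, and every hunk below fixes a single misspelled word (e.g. `familar`, `compoistion`, `systemmatic`). As a hypothetical illustration only, not the author's actual workflow, this is the kind of flagging a dictionary check performs; a real pass over TeX sources would use a full dictionary tool such as aspell in TeX mode, and the tiny word list here is purely illustrative.

```python
# Minimal sketch of dictionary-based typo flagging on a TeX fragment.
# GOOD is an illustrative word list; a real spell check uses a full dictionary.
import re

GOOD = {"property", "of", "differentiation", "that", "is", "familiar",
        "from", "calculus"}

def suspect_words(tex_line):
    # Drop TeX control sequences, then flag words absent from the word list.
    text = re.sub(r"\\[A-Za-z]+", " ", tex_line)
    words = re.findall(r"[A-Za-z]+", text)
    return [w for w in words if w.lower() not in GOOD]

# The pre-fix line from the first bookans.tex hunk is flagged:
print(suspect_words("property of differentiation that is familar from calculus.)"))
# The post-fix line is clean:
print(suspect_words("property of differentiation that is familiar from calculus.)"))
```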
book.pdf (diff collapsed)
bookans.tex

@@ -11167,7 +11167,7 @@
 \end{aligned}
 \end{multline*}
 (An alternate proof is to simply note that this is a
-property of differentiation that is familar from calculus.)
+property of differentiation that is familiar from calculus.)
 These two maps are not inverses as this composition
 does not act as the identity map on

@@ -11687,7 +11687,7 @@
 \colvec{f_1(\vec{v}) \\ f_2(\vec{v})}
 \end{equation*}
 They are linear because they are the composition of linear functions,
-and the fact that the compoistion of linear functions is linear
+and the fact that the composition of linear functions is linear
 was part of the proof that isomorphism is an equivalence
 relation (alternatively, the check that they are linear is
 straightforward).

@@ -12757,7 +12757,7 @@
 \colvec[r]{2 \\ 0}-\colvec[r]{-1 \\ 0}=\colvec[r]{3 \\ 0}
 \end{equation*}
-A more systemmatic way to find the image of $\vec{e}_2$ is to
+A more systematic way to find the image of $\vec{e}_2$ is to
 use the given information to represent the transformation, and then
 use that representation to determine the image.
 Taking this for a basis,

@@ -13344,7 +13344,7 @@
 \end{equation*}
 gives the additional information (beyond that there is at least one
 solution) that there are infinitely many solutions.
-Parametizing gives $c_2=1+c_3$ and $c_1=1$, and so taking $c_3$ to
+Parametrizing gives $c_2=1+c_3$ and $c_1=1$, and so taking $c_3$ to
 be zero gives a particular solution of $c_1=1$, $c_2=1$, and
 $c_3=0$ (which is, of course, the observation made at the start).
 \end{exparts}

@@ -13368,7 +13368,7 @@
 \colvec[r]{0 \\ 0 \\ 1}
 \!\mapsto\colvec[r]{3 \\ 4}
 \end{equation*}
-So, for this first one, we are asking whether thare are scalars such that
+So, for this first one, we are asking whether there are scalars such that
 \begin{equation*}
 c_1\colvec[r]{1 \\ 0}+c_2\colvec[r]{1 \\ 1}
 +c_3\colvec[r]{3 \\ 4}=\colvec[r]{1 \\ 3}

@@ -13529,7 +13529,7 @@
 \end{ans}
 \begin{ans}{Three.III.2.16}
-Let the matrix be $G$, and suppose that it rperesents $\map{g}{V}{W}$
+Let the matrix be $G$, and suppose that it represents $\map{g}{V}{W}$
 with respect to bases $B$ and $D$.
 Because $G$ has two columns, $V$ is two-dimensional.
 Because $G$ has two rows, $W$ is two-dimensional.

@@ -13579,7 +13579,7 @@
 \end{ans}
 \begin{ans}{Three.III.2.18}
-Recall that the represention map
+Recall that the representation map
 \begin{equation*}
 V\mapsunder{\text{Rep}_{B}}\Re^n
 \end{equation*}

@@ -13710,7 +13710,7 @@
 to its dot product with $\vec{x}$ is linear (this is a matrix-vector
 product and so \nearbytheorem{th:MatIsLinMap} applies).
 Thus the map under consideration $h_{\vec{x}}$ is linear because
-it is the composistion of two linear maps.
+it is the composition of two linear maps.
 \begin{equation*}
 \vec{v}\mapsto \rep{\vec{v}}{B}
 \mapsto \vec{x}\cdot\rep{\vec{v}}{B}

@@ -13898,7 +13898,7 @@
 h_{1,j}\vec{\delta}_1+\dots+h_{i,j}\vec{\delta}_i
 +\dots+h_{m,j}\vec{\delta}_m
 \end{equation*}
-and with respcet to $B,2\cdot D$ it also represents
+and with respect to $B,2\cdot D$ it also represents
 \( \map{h_2}{V}{W} \) sending
 \begin{equation*}
 \vec{\beta}_j\mapsto

@@ -16421,7 +16421,7 @@
 The proof tells us what how the bases change.
 We start by swapping the first and second rows
 of the representation with respect to $B$ to get a representation
-with resepect to a new basis $B_1$.
+with respect to a new basis $B_1$.
 \begin{equation*}
 \rep{1-x+3x^2-x^3}{B_1}=
 \colvec[r]{1 \\ 0 \\ 1 \\ 2}_{B_1}

@@ -16631,7 +16631,7 @@
 \colvec[r]{1 \\ 1}=1\cdot\colvec[r]{1 \\ 0}
 +1\cdot\colvec[r]{0 \\ 1}
 \end{equation*}
-give the other nonsinguar matrix.
+give the other nonsingular matrix.
 \begin{equation*}
 \rep{\identity}{\hat{B},B}=\begin{mat}[r]
 0 &1 \\

@@ -17301,7 +17301,7 @@
 Suppose that \( \vec{v}\in\Re^n \) with \( n>1 \).
 If \( \vec{v}\neq\zero \) then we consider the line
 \( \ell=\set{c\vec{v}\suchthat c\in\Re} \) and if \( \vec{v}=\zero \)
-we take \( \ell \) to be any (nondegenerate) line at all
+we take \( \ell \) to be any (non-degenerate) line at all
 (actually, we needn't distinguish between these two cases\Dash see
 the prior exercise).
 Let \( v_1,\dots,v_n \) be the components of \( \vec{v} \);

@@ -17329,7 +17329,7 @@
 The dimension \( n=0 \) case is the trivial vector space, here
 there is only one vector and so it cannot be expressed as the projection
 of a different vector.
-In the dimension $n=1$ case there is only one (nondegenerate) line,
+In the dimension $n=1$ case there is only one (non-degenerate) line,
 and every vector is in it, hence every vector is the projection only
 of itself.

@@ -17670,7 +17670,7 @@
 \end{ans}
 \begin{ans}{Three.VI.2.12}
-We can paramatrize the given space can in this way.
+We can parametrize the given space can in this way.
 \begin{equation*}
 \set{\colvec{x \\ y \\ z} \suchthat x=y-z}
 =\set{\colvec[r]{1 \\ 1 \\ 0}\cdot y+\colvec[r]{-1 \\ 0 \\ 1}\cdot z

@@ -17865,7 +17865,7 @@
 meets the vertical dashed line
 $\vec{v}-(1\cdot\vec{e}_1+2\cdot\vec{e}_2)$; this is what
 first item of this question proved.
-The Pythagorean theorem then gives that the hypoteneuse\Dash the
+The Pythagorean theorem then gives that the hypotenuse\Dash the
 segment from $\vec{v}$ to any other vector\Dash is longer than
 the vertical dashed line.

@@ -19644,7 +19644,7 @@
 perform the row operations and, if needed, column operations
 to reduce it to a partial-identity matrix.
 We will then translate that into a factorization $H=PBQ$.
-Subsitituting into the general matrix
+Substituting into the general matrix
 \begin{equation*}
 \rep{r_\theta}{\stdbasis_2,\stdbasis_2}
 \begin{mat}
homogeom.tex

@@ -158,7 +158,7 @@ For contrast the next picture shows the effect of the map represented by
 $C_{2,1}(1)$.
 Here vectors are affected according to their
 second component:
-$\binom{x}{y}$ slides horozontally by twice $y$.
+$\binom{x}{y}$ slides horizontally by twice $y$.
 \begin{center}
 \includegraphics{ch3.57}
 \end{center}

@@ -174,7 +174,7 @@ $H=T_nT_{n-1}\cdots T_jBT_{j-1}\cdots T_1$,
 and so, in some sense, we have an understanding
 of the action of any matrix $H$.
-We will illustrate the usefullness of our understanding in two ways.
+We will illustrate the usefulness of our understanding in two ways.
 The first is that we will use it to prove something about linear maps.
 Recall that under a linear map, the image of a subspace is a subspace
 and thus the linear transformation $h$ represented by $H$ maps lines

@@ -192,7 +192,7 @@ Therefore their composition also preserves lines.
 % Thus, by understanding its components we can understand arbitrary square
 % matrices $H$, in the sense that we can prove things about them.
-The second way that we will illustrate the usefullness of
+The second way that we will illustrate the usefulness of
 our understanding is to apply it to Calculus.
 Below is a picture
 of the action of the one-variable real function \( y(x)=x^2+x \).

@@ -430,7 +430,7 @@ is appealing both for its simplicity and for its usefulness.
 perform the row operations and, if needed, column operations
 to reduce it to a partial-identity matrix.
 We will then translate that into a factorization $H=PBQ$.
-Subsitituting into the general matrix
+Substituting into the general matrix
 \begin{equation*}
 \rep{r_\theta}{\stdbasis_2,\stdbasis_2}
 \begin{mat}
jhanswer.pdf (diff collapsed)
lstsqs.tex

@@ -616,7 +616,7 @@ record was 1954-May-06.
 \textit{(This illustrates that there are data sets for which a
 linear model is not right, and that the line of best fit doesn't
 in that case have any predictive value.)}
-In a highway resturant a trucker told me that his boss often sends
+In a highway restaurant a trucker told me that his boss often sends
 him by a roundabout route, using more gas
 but paying lower bridge tolls.
 He said that New York state sets the bridge
map1.tex

@@ -2178,7 +2178,7 @@ classes,
 the reduced echelon form matrices.
 In this section we have followed that outline,
-except that the appropriate notion of same-ness
+except that the appropriate notion of sameness
 here is vector space isomorphism.
 First we defined isomorphism, saw some examples,
 and established some properties.
map2.tex

@@ -600,7 +600,7 @@ is more fruitful and more central to further progress.
 \end{aligned}
 \end{multline*}
 (An alternate proof is to simply note that this is a
-property of differentiation that is familar from calculus.)
+property of differentiation that is familiar from calculus.)
 These two maps are not inverses as this composition
 does not act as the identity map on

@@ -1295,7 +1295,7 @@ is more fruitful and more central to further progress.
 \colvec{f_1(\vec{v}) \\ f_2(\vec{v})}
 \end{equation*}
 They are linear because they are the composition of linear functions,
-and the fact that the compoistion of linear functions is linear
+and the fact that the composition of linear functions is linear
 was part of the proof that isomorphism is an equivalence
 relation (alternatively, the check that they are linear is
 straightforward).

@@ -1486,7 +1486,7 @@ We lose that the domain
 corresponds perfectly to the range.
 What we retain, as the examples below illustrate,
 is that a homomorphism describes how
-the domain is ``like'' or ``analgous to'' the range.
+the domain is ``like'' or ``analogous to'' the range.
 \begin{example}
 \label{ex:RThreeHomoRTwo}
 %\label{exPicProj}
 We think of $\Re^3$ as like $\Re^2$
 except that vectors have an extra

@@ -1802,7 +1802,7 @@ Equality holds if and only if the nullity of the map is $0$.
 We know
 that an isomorphism exists between two spaces
 if and only if the dimension of the range equals the dimension of the domain.
-We have now seen that for a homomorphism to exist a nexessary condition is that
+We have now seen that for a homomorphism to exist a necessary condition is that
 the dimension of the range must be less than or equal to the
 dimension of the domain.
 For instance, there is no homomorphism
map3.tex

@@ -1214,7 +1214,7 @@ for any matrix there is an associated linear map.
 \colvec[r]{2 \\ 0}-\colvec[r]{-1 \\ 0}=\colvec[r]{3 \\ 0}
 \end{equation*}
-A more systemmatic way to find the image of $\vec{e}_2$ is to
+A more systematic way to find the image of $\vec{e}_2$ is to
 use the given information to represent the transformation, and then
 use that representation to determine the image.
 Taking this for a basis,

@@ -1988,7 +1988,7 @@ but we do not have particular spaces or bases in mind then
 we often take the
 domain and codomain to be $\Re^n$ and $\Re^m$ and use the standard
 bases.
-This is convienent because with the standard bases
+This is convenient because with the standard bases
 vector representation is transparent\Dash the representation of
 $\vec{v}$ is $\vec{v}$.
 (In this case the

@@ -2081,7 +2081,7 @@ that superspace
 (because any basis for the rangespace is a linearly independent subset
 of the codomain
 whose size is equal to the dimension of the codomain, and thus so this
-basis for the reagespace must also be
+basis for the rangespace must also be
 a basis for the codomain).
 For the other half,

@@ -2257,7 +2257,7 @@ And, we shall see how to find the matrix that represents a map's inverse.
 \end{equation*}
 gives the additional information (beyond that there is at least one
 solution) that there are infinitely many solutions.
-Parametizing gives $c_2=1+c_3$ and $c_1=1$, and so taking $c_3$ to
+Parametrizing gives $c_2=1+c_3$ and $c_1=1$, and so taking $c_3$ to
 be zero gives a particular solution of $c_1=1$, $c_2=1$, and
 $c_3=0$ (which is, of course, the observation made at the start).
 \end{exparts}

@@ -2293,7 +2293,7 @@ And, we shall see how to find the matrix that represents a map's inverse.
 \colvec[r]{0 \\ 0 \\ 1}
 \!\mapsto\colvec[r]{3 \\ 4}
 \end{equation*}
-So, for this first one, we are asking whether thare are scalars such that
+So, for this first one, we are asking whether there are scalars such that
 \begin{equation*}
 c_1\colvec[r]{1 \\ 0}+c_2\colvec[r]{1 \\ 1}
 +c_3\colvec[r]{3 \\ 4}=\colvec[r]{1 \\ 3}

@@ -2509,7 +2509,7 @@ And, we shall see how to find the matrix that represents a map's inverse.
 domain.
 \end{exparts}
 \begin{answer}
-Let the matrix be $G$, and suppose that it rperesents $\map{g}{V}{W}$
+Let the matrix be $G$, and suppose that it represents $\map{g}{V}{W}$
 with respect to bases $B$ and $D$.
 Because $G$ has two columns, $V$ is two-dimensional.
 Because $G$ has two rows, $W$ is two-dimensional.

@@ -2574,7 +2574,7 @@ And, we shall see how to find the matrix that represents a map's inverse.
 respect to \( D \).
 Show that map is a linear transformation of \( \Re^n \).
 \begin{answer}
-Recall that the represention map
+Recall that the representation map
 \begin{equation*}
 V\mapsunder{\text{Rep}_{B}}\Re^n
 \end{equation*}

@@ -2787,7 +2787,7 @@ And, we shall see how to find the matrix that represents a map's inverse.
 to its dot product with $\vec{x}$ is linear (this is a matrix-vector
 product and so \nearbytheorem{th:MatIsLinMap} applies).
 Thus the map under consideration $h_{\vec{x}}$ is linear because
-it is the composistion of two linear maps.
+it is the composition of two linear maps.
 \begin{equation*}
 \vec{v}\mapsto \rep{\vec{v}}{B}
 \mapsto \vec{x}\cdot\rep{\vec{v}}{B}
map4.tex

@@ -400,7 +400,7 @@ no matter what domain and codomain bases we use.
 h_{1,j}\vec{\delta}_1+\dots+h_{i,j}\vec{\delta}_i
 +\dots+h_{m,j}\vec{\delta}_m
 \end{equation*}
-and with respcet to $B,2\cdot D$ it also represents
+and with respect to $B,2\cdot D$ it also represents
 \( \map{h_2}{V}{W} \) sending
 \begin{equation*}
 \vec{\beta}_j\mapsto

@@ -466,7 +466,7 @@ no matter what domain and codomain bases we use.
 \index{transpose!interaction with sum and scalar multiplication}
 of a matrix $M$ is another matrix, whose $i,j$ entry is the
 $j,i$ entry of $M$.
-Verifiy these identities.
+Verify these identities.
 \begin{exparts}
 \partsitem \( \trans{(G+H)}=\trans{G}+\trans{H} \)
 \partsitem \( \trans{(r\cdot H)}=r\cdot\trans{H} \)

@@ -2466,7 +2466,7 @@ is square and has with all entries zero except for ones in the main diagonal.
 \end{definition}
 \begin{example}
-Here is the \( \nbyn{2} \) identity matrix leaving its multiplicand unchaged
+Here is the \( \nbyn{2} \) identity matrix leaving its multiplicand unchanged
 when it acts from the right.
 \begin{equation*}
 \begin{mat}[r]

@@ -2916,7 +2916,7 @@ Until now we have taken the point of view that our primary objects of study
 are vector spaces and the maps between them, and
 have adopted matrices only for computational convenience.
 This subsection show that this isn't the whole story.
-Understanding matrices operations vy how the entries combine can
+Understanding matrices operations by how the entries combine can
 be useful also.
 In the rest of this book we shall continue to focus on maps as the primary
 objects but we will be pragmatic\Dash if the matrix point of view gives some

@@ -3500,7 +3500,7 @@ clearer idea then we will go with it.
 \end{answer}
 \item
 Combine the two generalizations of the identity matrix,
-the one allowing entires to be other than ones, and the one allowing the
+the one allowing entries to be other than ones, and the one allowing the
 single one in each row and column to be off the diagonal.
 What is the action of this type of matrix?
 \begin{answer}

@@ -5120,7 +5120,7 @@ elementary real number system can be interesting and useful.
 items.
 \end{exparts}
 When two things multiply to give zero despite that neither is zero, each is
-said to be a \definend{zero divisor}.\index{zero divison}
+said to be a \definend{zero divisor}.\index{zero division}
 Prove that no zero divisor is invertible.
 \begin{answer}
 For the answer to the items making up the first half, see
map5.tex

@@ -90,7 +90,7 @@ map \( \map{\identity}{V}{V} \) with respect to those bases.
 Left-multiplication by the change of basis matrix for \( B,D \)
 converts a representation with respect to \( B \)
 to one with respect to \( D \).
-Conversly, if left-multiplication by a matrix changes bases
+Conversely, if left-multiplication by a matrix changes bases
 $M\cdot\rep{\vec{v}}{B}=\rep{\vec{v}}{D}$
 then $M$ is a change of basis matrix.
 \end{lemma}

@@ -178,7 +178,7 @@ to some ending basis.
 Because the matrix is nonsingular it will Gauss-Jordan reduce to the
 identity.
 If the matrix is the identity~$I$ then the statement is obvious.
-Otherwise there are elementatry reduction matrices such that
+Otherwise there are elementary reduction matrices such that
 $R_r\cdots R_1\cdot M=I$ with $r\geq 1$.
 Elementary matrices are invertible and their inverses are also elementary
 so multiplying both sides of that equation from the left

@@ -608,7 +608,7 @@ the same space, and where the map is the identity map.
 \end{equation*}
 \end{answer}
 \item
-Conside the vector space of real-valued functions with basis
+Consider the vector space of real-valued functions with basis
 \( \sequence{\sin(x),\cos(x)} \).
 Show that \( \sequence{2\sin(x)+\cos(x),3\cos(x)} \)
 is also a basis for this space.

@@ -789,7 +789,7 @@ the same space, and where the map is the identity map.
 \begin{exparts}
 \partsitem In \( \polyspace_3 \) with basis
 \( B=\sequence{1+x,1-x,x^2+x^3,x^2-x^3} \) we have this
-represenatation.
+representation.
 \begin{equation*}
 \rep{1-x+3x^2-x^3}{B}=
 \colvec[r]{0 \\ 1 \\ 1 \\ 2}_B

@@ -815,7 +815,7 @@ the same space, and where the map is the identity map.
 The proof tells us what how the bases change.
 We start by swapping the first and second rows
 of the representation with respect to $B$ to get a representation
-with resepect to a new basis $B_1$.
+with respect to a new basis $B_1$.
 \begin{equation*}
 \rep{1-x+3x^2-x^3}{B_1}=
 \colvec[r]{1 \\ 0 \\ 1 \\ 2}_{B_1}

@@ -1184,7 +1184,7 @@ has been \definend{diagonalized}\index{matrix!diagonalized}
 when its representation is diagonal with respect to $B,B$, that is,
 with respect to equal starting
 and ending bases.
-In Chaper Five we shall see which maps and matrices are diagonalizable.
+In Chapter Five we shall see which maps and matrices are diagonalizable.
 In the rest of this subsection we consider the easier case
 where representations are with respect to $B,D$, which are
 possibly different starting and ending bases.

@@ -1223,7 +1223,7 @@ the set of matrices into matrix equivalence classes.
 \end{center}
 We can get some insight into the classes by comparing matrix equivalence
 with row equivalence
-(rememeber that matrices are row equivalent when they can be reduced to each
+(remember that matrices are row equivalent when they can be reduced to each
 other by row operations).
 In $\hat{H}=PHQ$, the matrices $P$ and $Q$ are nonsingular and
 thus we can write each as a product of elementary reduction matrices

@@ -1609,7 +1609,7 @@ this is a good classification of linear maps.
 \colvec[r]{1 \\ 1}=1\cdot\colvec[r]{1 \\ 0}
 +1\cdot\colvec[r]{0 \\ 1}
 \end{equation*}
-give the other nonsinguar matrix.
+give the other nonsingular matrix.
 \begin{equation*}
 \rep{\identity}{\hat{B},B}=\begin{mat}[r]
 0 &1 \\
map6.tex

@@ -22,7 +22,7 @@ is the $\vec{p}$ in the plane with the property that
 someone standing on $\vec{p}$
 and looking directly up or down sees $\vec{v}$.
 In this section we will generalize this to other projections,
-both orthogonal and nonorthogonal.
+both orthogonal and non-orthogonal.

@@ -202,7 +202,7 @@ This subsection has developed a natural projection map, orthogonal projection
 into a line.
 As suggested by the examples, we use it often in applications.
 The next subsection shows how the definition of orthogonal
-projection into a line gives us a way to calculate especially convienent bases
+projection into a line gives us a way to calculate especially convenient bases
 for vector spaces, again something that we often see in applications.
 The final subsection completely generalizes projection, orthogonal or not,
 into any subspace at all.

@@ -317,7 +317,7 @@ into any subspace at all.
 \partsitem $\colvec[r]{1 \\ 2}$
 \partsitem $\colvec[r]{0 \\ 4}$
 \end{exparts*}
-Show that in general the projection tranformation is this.
+Show that in general the projection transformation is this.
 \begin{equation*}
 \colvec{x_1 \\ x_2}\mapsto

@@ -467,7 +467,7 @@ into any subspace at all.
 Suppose that \( \vec{v}\in\Re^n \) with \( n>1 \).
 If \( \vec{v}\neq\zero \) then we consider the line
 \( \ell=\set{c\vec{v}\suchthat c\in\Re} \) and if \( \vec{v}=\zero \)
-we take \( \ell \) to be any (nondegenerate) line at all
+we take \( \ell \) to be any (non-degenerate) line at all
 (actually, we needn't distinguish between these two cases\Dash see
 the prior exercise).
 Let \( v_1,\dots,v_n \) be the components of \( \vec{v} \);

@@ -495,7 +495,7 @@ into any subspace at all.
 The dimension \( n=0 \) case is the trivial vector space, here
 there is only one vector and so it cannot be expressed as the projection
 of a different vector.
-In the dimension $n=1$ case there is only one (nondegenerate) line,
+In the dimension $n=1$ case there is only one (non-degenerate) line,
 and every vector is in it, hence every vector is the projection only
 of itself.
 \end{answer}

@@ -1244,7 +1244,7 @@ An example is in \nearbyexercise{exer:OrthoRepEasy}.
 Find an orthonormal basis for this subspace of $\Re^3$:~the
 plane $x-y+z=0$.
 \begin{answer}
-We can paramatrize the given space can in this way.
+We can parametrize the given space can in this way.
 \begin{equation*}
 \set{\colvec{x \\ y \\ z} \suchthat x=y-z}
 =\set{\colvec[r]{1 \\ 1 \\ 0}\cdot y+\colvec[r]{-1 \\ 0 \\ 1}\cdot z

@@ -1465,7 +1465,7 @@ An example is in \nearbyexercise{exer:OrthoRepEasy}.
 meets the vertical dashed line
 $\vec{v}-(1\cdot\vec{e}_1+2\cdot\vec{e}_2)$; this is what
 first item of this question proved.
-The Pythagorean theorem then gives that the hypoteneuse\Dash the
+The Pythagorean theorem then gives that the hypotenuse\Dash the
 segment from $\vec{v}$ to any other vector\Dash is longer than
 the vertical dashed line.