
V. V. Golovizin. Lectures on Algebra and Geometry. Semester 2.

Lecture 22. Vector Spaces.

Abstract: definition of a vector space, its simplest properties, systems of vectors, linear combination of a system of vectors, trivial and nontrivial linear combination, linearly dependent and independent systems of vectors, conditions for linear dependence or independence of a system of vectors, subsystems of a system of vectors, systems of columns of an arithmetic vector space.

§ 1. Definition of a vector space and its simplest properties.

Here, for the convenience of the reader, we repeat the content of paragraph 13 of Lecture 1.

Definition. Let V be an arbitrary non-empty set, whose elements we will call vectors, and let K be a field, whose elements we will call scalars. Suppose that an internal binary algebraic operation, denoted by the sign + and called addition of vectors, is defined on the set V, and that an external binary algebraic operation, called multiplication of a vector by a scalar and denoted by the multiplication sign, is also defined. In other words, two mappings are given:

+ : V × V → V,   · : K × V → V.

The set V together with these two algebraic operations is called a vector space over the field K if the following axioms hold:

1. Addition is associative, i.e. (x + y) + z = x + (y + z) for all x, y, z ∈ V.

2. There is a zero vector, i.e. there exists 0 ∈ V such that x + 0 = x for all x ∈ V.

3. For any vector there is an opposite to it: for every x ∈ V there exists y ∈ V such that x + y = 0.

The vector y opposite to the vector x is usually denoted −x, so that x + (−x) = 0.

4. Addition is commutative: x + y = y + x for all x, y ∈ V.

5. Multiplication of a vector by a scalar obeys the law of associativity, i.e. α(βx) = (αβ)x, where αβ is the product of scalars defined in the field K.

6. 1 · x = x for all x ∈ V, where 1 is the unit of the field K.

7. Multiplication of a vector by a scalar is distributive with respect to the addition of vectors: α(x + y) = αx + αy.

8. Multiplication of a vector by a scalar is distributive with respect to the addition of scalars: (α + β)x = αx + βx.
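For readers who like to experiment, here is a minimal sketch (an illustration, not part of the lecture's formal development) that spot-checks these eight axioms numerically for columns of height 3 over the real numbers; the sample vectors and the scalars a, b are arbitrary assumptions.

    # Numeric spot-check of the vector space axioms in R^3 (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    x, y, z = rng.standard_normal(3), rng.standard_normal(3), rng.standard_normal(3)
    a, b = 2.0, -3.0  # arbitrary scalars

    assert np.allclose((x + y) + z, x + (y + z))    # axiom 1: associativity
    assert np.allclose(x + np.zeros(3), x)          # axiom 2: zero vector
    assert np.allclose(x + (-x), np.zeros(3))       # axiom 3: opposite vector
    assert np.allclose(x + y, y + x)                # axiom 4: commutativity
    assert np.allclose(a * (b * x), (a * b) * x)    # axiom 5: associativity of scaling
    assert np.allclose(1.0 * x, x)                  # axiom 6: unit of the field
    assert np.allclose(a * (x + y), a * x + a * y)  # axiom 7: distributivity over vectors
    assert np.allclose((a + b) * x, a * x + b * x)  # axiom 8: distributivity over scalars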

Definition. A vector space over the field of real numbers is called a real vector space.

Theorem. (The simplest properties of vector spaces.)

1. There is only one zero vector in a vector space.

2. In a vector space, any vector has a unique opposite.

3. For any α ∈ K and x ∈ V, the equality αx = 0 holds if and only if α = 0 or x = 0.

4. For any x ∈ V, (−1) · x = −x.

Proof. 1) The uniqueness of the zero vector is proved in the same way as the uniqueness of the identity matrix and, in general, as the uniqueness of the neutral element of any internal binary algebraic operation.

Let 0 be a zero vector of the vector space V. Then x + 0 = x for any x ∈ V. Let 0′ be another zero vector, so that x + 0′ = x for any x ∈ V. Take x = 0′ in the first equality and x = 0 in the second. Then 0′ + 0 = 0′ and 0 + 0′ = 0, whence it follows that 0 = 0′, q.e.d.

2a) First, we prove that the product of the zero scalar with any vector is equal to the zero vector.

Let x ∈ V. Then, applying the axioms of the vector space, we obtain:

0 · x + 0 · x = (0 + 0) · x = 0 · x = 0 · x + 0.

With respect to addition, a vector space is an abelian group, and in any group the cancellation law is valid. Applying the cancellation law to the last equality, we get

0 · x = 0.

2b) Now we prove part 4). Let x ∈ V be an arbitrary vector. Then

x + (−1) · x = 1 · x + (−1) · x = (1 + (−1)) · x = 0 · x = 0.

This immediately implies that the vector (−1) · x is the opposite of the vector x, i.e. (−1) · x = −x.

2c) Now we prove that the product of any scalar with the zero vector is the zero vector. Let α ∈ K. Then, applying the axioms of the vector space and the cancellation law, we get:

α · 0 = α · (0 + 0) = α · 0 + α · 0, whence α · 0 = 0.

2d) Let αx = 0 and suppose that α ≠ 0. Since α ∈ K, where K is a field, there exists the inverse scalar α⁻¹. Multiply the equality αx = 0 on the left by α⁻¹:

α⁻¹(αx) = α⁻¹ · 0, whence (α⁻¹α) · x = 0, or 1 · x = 0, or x = 0.

The theorem is proved.

§ 2. Examples of vector spaces.

1) The set of real-valued functions of one variable that are continuous on the interval (0, 1), with the usual operations of addition of functions and multiplication of a function by a number.

2) The set of polynomials in one variable with coefficients from the field K, with respect to addition of polynomials and multiplication of a polynomial by a scalar.

3) The set of complex numbers with respect to addition of complex numbers and multiplication by a real number.

4) The set of matrices of the same size with elements from the field K with respect to matrix addition and matrix multiplication by a scalar.

The following example is an important special case of Example 4.

5) Let n be an arbitrary natural number. Let us denote by Kⁿ the set of all columns of height n, i.e. the set of matrices over the field K of size n × 1.

The set Kⁿ is a vector space over the field K and is called the arithmetic vector space of columns of height n over the field K.

In particular, if instead of an arbitrary field K we take the field of real numbers ℝ, then the vector space ℝⁿ is called the real arithmetic vector space of columns of height n.

Similarly, the set of matrices over the field K of size 1 × n, i.e. rows of length n, is also a vector space. It too is denoted by Kⁿ and is called the arithmetic vector space of rows of length n over the field K.
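As a small illustration (the entries below are assumed, not taken from the lecture), columns of height n over ℝ can be modeled in Python as n × 1 numpy arrays, with addition and multiplication by a scalar acting entrywise:

    # Columns of height 3 over R, represented as 3x1 matrices.
    import numpy as np

    u = np.array([[1.0], [2.0], [3.0]])   # a column of height 3
    v = np.array([[0.0], [1.0], [5.0]])   # another column of height 3

    print((u + v).ravel())    # [1. 3. 8.]   : addition of columns
    print((4.0 * u).ravel())  # [ 4.  8. 12.] : multiplication by the scalar 4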

§ 3. Systems of vectors of a vector space.

Definition. Any finite non-empty set of vectors of a vector space is called a system of vectors of this space.

Notation: a₁, a₂, …, aₙ.

Definition. An expression of the form

α₁a₁ + α₂a₂ + … + αₙaₙ,   (1)

where α₁, α₂, …, αₙ are scalars of the field K and a₁, a₂, …, aₙ are vectors of the vector space V, is called a linear combination of the system of vectors a₁, a₂, …, aₙ. The scalars α₁, α₂, …, αₙ are called the coefficients of this linear combination.

Definition. If all the coefficients of the linear combination (1) are equal to zero, then such a linear combination is called trivial, otherwise it is nontrivial.

Example. Let a₁, a₂, a₃ be a system of three vectors of a vector space V. Then

0 · a₁ + 0 · a₂ + 0 · a₃

is a trivial linear combination of this system of vectors, while any combination

α₁a₁ + 0 · a₂ + 0 · a₃ with α₁ ≠ 0

is a nontrivial linear combination of this system of vectors, since its first coefficient α₁ is not zero.

Definition. If a vector x of the vector space V can be represented in the form

x = α₁a₁ + α₂a₂ + … + αₙaₙ,   (2)

then the vector x is said to be linearly expressed in terms of the vectors of the system a₁, a₂, …, aₙ. In this case, it is also said that the system a₁, a₂, …, aₙ linearly represents the vector x.

Comment. In this and the previous definition, the word "linearly" is often omitted, and one simply says that the system represents a vector, or that a vector is expressed in terms of the vectors of the system, and the like.

Example. Let a₁, a₂ be a system of two columns of the real arithmetic vector space ℝ² of columns of height 2, and let a column x admit a representation x = α₁a₁ + α₂a₂. Then x is linearly expressed in terms of the columns of the system, or, in other words, the given system of columns linearly represents the column x.
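For a concrete illustration, assume the columns a₁ = (1, 0)ᵀ, a₂ = (1, 1)ᵀ and the target column x = (3, 2)ᵀ; the following Python sketch shows how the coefficients α₁, α₂ of the representation x = α₁a₁ + α₂a₂ can be found.

    # Solve A @ alpha = x, where the columns of A are the system a1, a2.
    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])       # a1 = (1, 0)^T, a2 = (1, 1)^T as columns
    x = np.array([3.0, 2.0])         # the column to be represented
    alpha = np.linalg.solve(A, x)    # coefficients of the linear combination
    print(alpha)                     # [1. 2.], i.e. x = 1*a1 + 2*a2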

§ 4. Linearly dependent and linearly independent systems of vectors of a vector space.

Since the product of the zero scalar with any vector is the zero vector, and the sum of zero vectors equals the zero vector, for any system of vectors a₁, a₂, …, aₙ the equality

0 · a₁ + 0 · a₂ + … + 0 · aₙ = 0

holds. It follows that the zero vector is linearly expressed in terms of the vectors of any system of vectors, or, in other words, any system of vectors linearly represents the zero vector.

Example. For some systems of columns, the null column can be linearly expressed through the columns of the system in more than one way: besides the trivial representation with all coefficients equal to zero, there may also be a representation with coefficients that are not all zero.

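For instance (the columns here are chosen purely for illustration), take a₁ = (1, 1)ᵀ and a₂ = (2, 2)ᵀ in ℝ². Besides the trivial representation 0 · a₁ + 0 · a₂ = 0, there is also the nontrivial one 2 · a₁ + (−1) · a₂ = (2, 2)ᵀ − (2, 2)ᵀ = (0, 0)ᵀ.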
To distinguish between these ways of linear representation of the zero vector, we introduce the following definition.

Definition. If the equality

α₁a₁ + α₂a₂ + … + αₙaₙ = 0   (3)

holds and all of its coefficients are equal to zero, then we say that the system a₁, a₂, …, aₙ represents the zero vector trivially. If, however, in equality (3) at least one of the coefficients is not zero, then we say that the system of vectors a₁, a₂, …, aₙ represents the zero vector nontrivially.

From the last example, we see that there are systems of vectors that can represent the zero vector nontrivially. From the next example, we will see that there are systems of vectors that cannot represent the zero vector nontrivially.

Example. Let e₁ = (1, 0)ᵀ, e₂ = (0, 1)ᵀ be a system of two columns of the vector space ℝ². Consider the equality

α₁e₁ + α₂e₂ = 0,

where α₁, α₂ are as yet unknown coefficients. Using the rules for multiplying a column by a scalar (number) and adding columns, we obtain the equality

(α₁, α₂)ᵀ = (0, 0)ᵀ.

It follows from the definition of equality of matrices that α₁ = 0 and α₂ = 0.

Thus, this system cannot represent the zero column nontrivially.

From the given examples it follows that there are two types of vector systems: some can represent the zero vector nontrivially, while others cannot. Note again that any system of vectors represents the zero vector trivially.

Definition. A system of vectors of a vector space that represents the zero vector only trivially is called linearly independent.

Definition. A system of vectors of a vector space that can represent a zero vector nontrivially is called linearly dependent.

The last definition can be given in a more detailed form.

Definition. A system of vectors a₁, a₂, …, aₙ of a vector space V is called linearly dependent if there is a set of scalars α₁, α₂, …, αₙ of the field K, not all equal to zero, such that

α₁a₁ + α₂a₂ + … + αₙaₙ = 0.

Comment. Any system of vectors a₁, a₂, …, aₙ can represent the zero vector trivially:

0 · a₁ + 0 · a₂ + … + 0 · aₙ = 0.

But this alone is not enough to find out whether a given system of vectors is linearly dependent or linearly independent. It follows from the definition that a linearly independent system of vectors cannot represent the zero vector nontrivially, but only trivially. Therefore, in order to verify the linear independence of a given system of vectors, one must consider the representation of zero by an arbitrary linear combination of this system of vectors:

α₁a₁ + α₂a₂ + … + αₙaₙ = 0.

If this equality is impossible when at least one coefficient of the linear combination is nonzero, then the system is by definition linearly independent.

So, in the examples of the previous paragraph, the column system e₁, e₂ is linearly independent, while a column system admitting a nontrivial representation of the zero column is linearly dependent.

In the same way one can prove the linear independence of the system of unit columns

e₁ = (1, 0, …, 0)ᵀ, e₂ = (0, 1, …, 0)ᵀ, …, eₙ = (0, 0, …, 1)ᵀ

in the space Kⁿ, where K is an arbitrary field and n is an arbitrary natural number: the combination α₁e₁ + α₂e₂ + … + αₙeₙ equals the column (α₁, α₂, …, αₙ)ᵀ, which is the zero column only when all the coefficients are zero.
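A hedged computational counterpart: over ℝ, linear independence of columns can be confirmed by checking that the matrix assembled from them has full column rank. A minimal sketch for the unit columns of ℝ³:

    # The unit columns e1, e2, e3 of R^3 are linearly independent.
    import numpy as np

    E = np.eye(3)                          # columns e1, e2, e3
    print(np.linalg.matrix_rank(E) == 3)   # True: full column rank -> independent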

The following theorems give several criteria for linear dependence and, accordingly, linear independence of vector systems.

Theorem. (A necessary and sufficient condition for the linear dependence of a system of vectors.)

A system of vectors of a vector space is linearly dependent if and only if one of the vectors of the system is linearly expressed in terms of other vectors of this system.

Proof. Necessity. Let the system a₁, a₂, …, aₙ be linearly dependent. Then, by definition, it represents the zero vector nontrivially, i.e. there is a nontrivial linear combination of this system of vectors equal to the zero vector:

α₁a₁ + α₂a₂ + … + αₙaₙ = 0,

where at least one of the coefficients of this linear combination is not zero. Let, say, αₖ ≠ 0, 1 ≤ k ≤ n.

We divide both sides of the previous equality by this nonzero coefficient (i.e., multiply both sides by αₖ⁻¹) and solve for aₖ:

aₖ = −(αₖ⁻¹α₁)a₁ − … − (αₖ⁻¹αₖ₋₁)aₖ₋₁ − (αₖ⁻¹αₖ₊₁)aₖ₊₁ − … − (αₖ⁻¹αₙ)aₙ.

We denote βᵢ = −αₖ⁻¹αᵢ for i ≠ k. Then

aₖ = β₁a₁ + … + βₖ₋₁aₖ₋₁ + βₖ₊₁aₖ₊₁ + … + βₙaₙ,

i.e. one of the vectors of the system is linearly expressed in terms of the other vectors of this system, q.e.d.

Sufficiency. Let one of the vectors of the system, say aₖ, be linearly expressed in terms of the other vectors of this system:

aₖ = β₁a₁ + … + βₖ₋₁aₖ₋₁ + βₖ₊₁aₖ₊₁ + … + βₙaₙ.

Move the vector aₖ to the right-hand side of this equality:

0 = β₁a₁ + … + βₖ₋₁aₖ₋₁ + (−1)aₖ + βₖ₊₁aₖ₊₁ + … + βₙaₙ.

Since the coefficient of the vector aₖ equals −1 ≠ 0, we have a nontrivial representation of zero by the system of vectors a₁, a₂, …, aₙ, which means that this system of vectors is linearly dependent, q.e.d.

The theorem is proved.
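As an illustration of the theorem (with vectors assumed for the example): in ℝ², the system a₁ = (1, 0)ᵀ, a₂ = (0, 1)ᵀ, a₃ = (1, 1)ᵀ admits the nontrivial representation of zero 1 · a₁ + 1 · a₂ + (−1) · a₃ = 0, and accordingly one of its vectors is linearly expressed in terms of the others: a₃ = a₁ + a₂.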

Corollary.

1. A system of vectors of a vector space is linearly independent if and only if none of the vectors of the system is linearly expressed in terms of other vectors of this system.

2. A system of vectors containing a zero vector or two equal vectors is linearly dependent.

Proof.

1) Necessity. Let the system be linearly independent. Suppose the contrary: there is a vector of the system that is linearly expressed in terms of the other vectors of this system. Then, by the theorem, the system is linearly dependent, and we arrive at a contradiction.

Sufficiency. Let none of the vectors of the system be expressed in terms of the others. Suppose the contrary: let the system be linearly dependent. Then it follows from the theorem that there is a vector of the system that is linearly expressed in terms of the other vectors of this system, and we again arrive at a contradiction.

2a) Let the system contain a zero vector. For definiteness, assume that a₁ = 0. Then the equality

a₁ = 0 · a₂ + 0 · a₃ + … + 0 · aₙ

holds, i.e. one of the vectors of the system is linearly expressed through the other vectors of this system. It follows from the theorem that such a system of vectors is linearly dependent, q.e.d.

Note that this fact can also be proved directly from the definition of a linearly dependent system of vectors.

Since a₁ = 0, the following equality is obvious:

1 · a₁ + 0 · a₂ + … + 0 · aₙ = 0.

This is a nontrivial representation of the zero vector (the first coefficient equals 1 ≠ 0), which means that the system a₁, a₂, …, aₙ is linearly dependent.

2b) Let the system contain two equal vectors. For definiteness, let a₁ = a₂. Then the equality

a₁ = 1 · a₂ + 0 · a₃ + … + 0 · aₙ

holds, i.e. the first vector is linearly expressed through the remaining vectors of the same system. It follows from the theorem that this system is linearly dependent, q.e.d.

As before, this statement can also be proved directly from the definition of a linearly dependent system.

Indeed, since a₁ = a₂, the equality

1 · a₁ + (−1) · a₂ + 0 · a₃ + … + 0 · aₙ = 0

holds, i.e. we have a nontrivial representation of the zero vector.

The corollary is proved.

Theorem. (On the linear dependence of a system consisting of one vector.)

A system consisting of one vector is linearly dependent if and only if this vector is zero.

Proof.

Necessity. Let the system consisting of one vector a be linearly dependent, i.e. there is a nontrivial representation of the zero vector

α · a = 0,

where α ≠ 0. From the simplest properties of a vector space (property 3) it then follows that a = 0.

Sufficiency. Let the system consist of one zero vector, a = 0. Then this system represents the zero vector nontrivially:

1 · a = 1 · 0 = 0,

where the coefficient 1 ≠ 0, whence follows the linear dependence of the system consisting of the vector a = 0.

The theorem is proved.

Corollary. A system consisting of one vector is linearly independent if and only if this vector is nonzero.

The proof is left to the reader as an exercise.

VECTOR SPACE (linear space), one of the fundamental concepts of algebra, generalizing the concept of a collection of (free) vectors. In a vector space, instead of vectors, one considers any objects that can be added and multiplied by numbers; it is required that the basic algebraic properties of these operations be the same as for vectors in elementary geometry. In the exact definition, numbers are replaced by elements of an arbitrary field K. A vector space over a field K is a set V with an operation of addition of elements of V and an operation of multiplication of elements of V by elements of the field K, which have the following properties:

x + y = y + x for any x, y from V, that is, with respect to addition V is an abelian group;

λ(x + y) = λx + λy for any λ from K and x, y from V;

(λ + μ)x = λx + μx for any λ, μ from K and x from V;

(λμ)x = λ(μx) for any λ, μ from K and x from V;

1x = x for any x from V, where 1 denotes the unit of the field K.

Examples of vector spaces are: the sets L₁, L₂ and L₃ of all vectors of elementary geometry on a straight line, in the plane and in space, respectively, with the usual operations of vector addition and multiplication by a number; the coordinate vector space Kⁿ, whose elements are all possible rows (vectors) of length n with elements from the field K, with the operations given by the formulas

(a₁, …, aₙ) + (b₁, …, bₙ) = (a₁ + b₁, …, aₙ + bₙ),  λ(a₁, …, aₙ) = (λa₁, …, λaₙ);

the set F(M, K) of all functions defined on a fixed set M and taking values in the field K, with the usual operations on functions:

(f + g)(m) = f(m) + g(m),  (λf)(m) = λ · f(m),  m ∈ M.

Elements e₁, …, eₙ of a vector space are called linearly independent if it follows from the equality λ₁e₁ + … + λₙeₙ = 0 ∈ V that λ₁ = λ₂ = … = λₙ = 0 ∈ K. Otherwise, the elements e₁, e₂, …, eₙ are called linearly dependent. If in the vector space V any n + 1 elements e₁, …, eₙ₊₁ are linearly dependent and there exist n linearly independent elements, then V is called an n-dimensional vector space, and n is called the dimension of the vector space V. If in a vector space V for any natural number n there exist n linearly independent vectors, then V is called an infinite-dimensional vector space. For example, the vector spaces L₁, L₂, L₃ and Kⁿ are respectively 1-, 2-, 3- and n-dimensional; if M is an infinite set, then the vector space F(M, K) is infinite-dimensional.

Vector spaces V and U over a field K are called isomorphic if there exists a one-to-one mapping φ : V → U such that φ(x + y) = φ(x) + φ(y) for any x, y from V and φ(λx) = λφ(x) for any λ from K and x from V. Isomorphic vector spaces are algebraically indistinguishable. The classification of finite-dimensional vector spaces up to isomorphism is given by their dimension: any n-dimensional vector space over a field K is isomorphic to the coordinate vector space Kⁿ. See also Hilbert Space, Linear Algebra.
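As a hedged illustration of this classification, the sketch below (all names and sample values are assumptions) identifies the space of polynomials of degree less than n with the coordinate space ℝⁿ via the coefficient mapping φ and checks on sample data that φ respects both operations:

    # phi sends c0 + c1*t + ... + c_{n-1}*t^(n-1) to (c0, c1, ..., c_{n-1}) in R^n.
    import numpy as np

    def phi(coeffs):
        return np.asarray(coeffs, dtype=float)  # the coordinate (coefficient) vector

    p = np.array([1.0, 0.0, 2.0])   # the polynomial 1 + 2t^2
    q = np.array([0.0, 3.0, -1.0])  # the polynomial 3t - t^2
    lam = 5.0

    assert np.allclose(phi(p + q), phi(p) + phi(q))  # phi preserves addition
    assert np.allclose(phi(lam * p), lam * phi(p))   # phi preserves scaling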

A vector (or linear) space is a mathematical structure: a set of elements, called vectors, for which operations of addition with each other and of multiplication by a number (a scalar) are defined. These operations are subject to the following axioms:

1) x + y = y + x (commutativity of addition)

2) x + (y + z) = (x + y) + z (associativity of addition)

3) there exists an element 0 ∈ V such that x + 0 = x

4) for any x ∈ V there is an element −x ∈ V such that x + (−x) = 0, called the vector opposite to the vector x

5) α(βx) = (αβ)x (associativity of multiplication by a scalar)

6) 1 · x = x, where 1 is the unit scalar

7) (α + β)x = αx + βx

8) α(x + y) = αx + αy

Examples of linear spaces:

1) free vectors in the space R³;

2) matrices of size n × m;

3) the set of all polynomials of degree at most n;

4) the space of real numbers;

5) the set of geometric vectors on the plane;

6) the space of matrices of fixed size;

7) the space of solutions of homogeneous linear systems, etc.

Basic definitions

An n-dimensional vector is a sequence of n numbers. These numbers are called the coordinates of the vector. The number n of coordinates of the vector is called the dimension of the vector.

Only vectors of the same dimension can be added.

Vectors are equal if they have the same dimension and their corresponding coordinates are equal.

Any n-dimensional vector A = (a₁, a₂, …, aₙ) can be multiplied by any number λ; each of its coordinates is multiplied by this number:

λA = (λa₁, λa₂, …, λaₙ).

Two vectors of the same dimension can be added; their corresponding coordinates are added:

A + B = (a₁ + b₁, a₂ + b₂, …, aₙ + bₙ).
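A short Python sketch of these coordinatewise rules (the sample values are assumed):

    # Coordinatewise operations on n-dimensional vectors.
    import numpy as np

    A = np.array([1.0, 2.0, 3.0])
    B = np.array([4.0, 5.0, 6.0])
    lam = 2.0

    print(lam * A)  # [2. 4. 6.] : every coordinate is multiplied by lambda
    print(A + B)    # [5. 7. 9.] : corresponding coordinates are added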

What is called a linear combination of vectors?

A linear combination of the vectors a₁, a₂, …, aₙ is an expression of the form

λ₁a₁ + λ₂a₂ + … + λₙaₙ,

where λ₁, λ₂, …, λₙ are arbitrary numbers.

What vectors are called linearly dependent (independent)?

Vectors a₁, a₂, …, aₙ are called linearly dependent if some nontrivial linear combination of these vectors equals the zero vector:

λ₁a₁ + λ₂a₂ + … + λₙaₙ = 0.

Vectors a₁, a₂, …, aₙ are called linearly independent if only the trivial linear combination of these vectors equals the zero vector.

Examples of linearly independent vectors: the unit vectors e₁ = (1, 0, …, 0), e₂ = (0, 1, …, 0), …, eₙ = (0, 0, …, 1) of n-dimensional space.

How is the question of the linear dependence of vectors solved?

Theorem 1. For a system of vectors to be linearly dependent, it is necessary and sufficient that at least one of them be represented as a linear combination of the others.

Theorem 2. In n-dimensional space, any system containing more than n vectors is linearly dependent.

Theorem 3. If the determinant composed of the coordinates of the vectors is nonzero, then the system of vectors is linearly independent. If these theorems do not settle the question of the linear dependence or independence of the vectors, then one has to solve the system of equations λ₁a₁ + λ₂a₂ + … + λₙaₙ = 0 with respect to λ₁, λ₂, …, λₙ, or determine the rank of the system of vectors.
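A sketch of both computational routes mentioned in Theorem 3, with sample vectors assumed for the illustration; the determinant test applies when the number of vectors equals their dimension, while the rank test works in general:

    # Determinant and rank tests for linear (in)dependence.
    import numpy as np

    M = np.array([[1.0, 2.0],
                  [3.0, 4.0]])          # rows are the vectors (1, 2) and (3, 4)
    print(np.linalg.det(M))             # -2.0 != 0 -> linearly independent

    N = np.array([[1.0, 2.0],
                  [2.0, 4.0]])          # (2, 4) = 2 * (1, 2)
    print(np.linalg.matrix_rank(N))     # 1 < 2 -> linearly dependent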

What is the relationship between the coordinates of two linearly dependent vectors?

Give an example of two linearly dependent vectors.

Vectors a and b are collinear (linearly dependent) when there exists a number λ such that the equality b = λa holds; their corresponding coordinates are then proportional. For example, a = (1, 2) and b = (2, 4) = 2a.

Definition of the basis of a linear space

A collection of n linearly independent elements of a space of dimension n is called a basis of this space.

Definition of the dimension of a linear space

Definition 3.1. A linear space R is called n-dimensional if it contains n linearly independent elements, and any (n + 1) elements are already linearly dependent. The number n is called the dimension of the space R.

The dimension of the space is denoted by the symbol dim.

Definition 3.2. A linear space R is called infinite-dimensional if it contains any number of linearly independent elements.

Theorem 3.4. Let the linear space R have a basis consisting of n elements. Then the dimension of R equals n (dim R = n).

The concept of n-dimensional space

A linear space V is called an n-dimensional space if there is a system of n linearly independent elements in it, and any n + 1 elements are linearly dependent.

Formulas connecting vectors of the old and new bases