Answer:
a. H + K is a subspace of V
b. H is a subspace of H + K and K is a subspace of H + K (see explanation below)
Step-by-step explanation:
Given
H + K = {w : w = u + v for some u in H and some v in K}
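To make the definition concrete, here is a small hypothetical example (the specific H and K below are illustrative assumptions, not part of the given problem). Take V = R^3, let H be the x-axis and K be the y-axis; in LaTeX notation,

\[
H = \{(a,0,0) : a \in \mathbb{R}\}, \qquad K = \{(0,b,0) : b \in \mathbb{R}\}
\]
\[
H + K = \{(a,0,0) + (0,b,0)\} = \{(a,b,0) : a, b \in \mathbb{R}\}
\]

so in this example H + K is the xy-plane, which is indeed a subspace of R^3.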
a.
From the definition above, a vector belongs to H + K exactly when it can be written as a vector from H plus a vector from K.

First, the zero vector.
Since H and K are subspaces of V, each of them contains the zero vector 0 of V.
We can therefore write
0 = 0 + 0, with the first 0 taken from H and the second 0 taken from K
So the zero vector of V is in H + K.

Next, closure under vector addition.
Take any two vectors in H + K, say
w1 = u1 + v1 and w2 = u2 + v2, where u1, u2 are in H and v1, v2 are in K
Then
w1 + w2 = u1 + v1 + u2 + v2 ---- Rearrange
= u1 + u2 + v1 + v2 ---- Let u3 = u1 + u2 and v3 = v1 + v2
= u3 + v3
Since H and K are closed under vector addition, u3 is in H and v3 is in K.
So w1 + w2 is in H + K, which means H + K is closed under vector addition.

Finally, closure under scalar multiplication.
For any scalar c,
c(u1 + v1) = c.u1 + c.v1
Since H and K are closed under scalar multiplication, c.u1 is in H and c.v1 is in K.
So c(u1 + v1) is in H + K, which means H + K is closed under scalar multiplication.

H + K contains the zero vector and is closed under vector addition and scalar multiplication.
So, H + K is a subspace of V
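The two closure checks in part a can also be written in a single line each; this is just a compact LaTeX restatement of the computation above, using the same vectors u1, u2 in H, v1, v2 in K and a scalar c:

\[
(u_1 + v_1) + (u_2 + v_2) = \underbrace{(u_1 + u_2)}_{\in H} + \underbrace{(v_1 + v_2)}_{\in K} \in H + K
\]
\[
c\,(u_1 + v_1) = \underbrace{c\,u_1}_{\in H} + \underbrace{c\,v_1}_{\in K} \in H + K
\]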
b.
Since H and K are subspaces of V, they each
1. contain the zero vector of V
2. are closed under vector addition
3. are closed under scalar multiplication
To show that H is a subspace of H + K, first note that H is contained in H + K:
any vector u in H can be written as
u = u + 0, where u is in H and 0 is the zero vector of K
so u is in H + K.
Since H sits inside H + K and, by 1-3, contains the zero vector and is closed under vector addition and scalar multiplication, H is a subspace of H + K.
The same argument, writing v = 0 + v for any v in K, shows that K is a subspace of H + K.
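In the same symbolic style, the containment argument of part b boils down to one identity per subspace (again just a restatement of the steps above):

\[
u \in H \;\Rightarrow\; u = u + 0 \in H + K, \qquad v \in K \;\Rightarrow\; v = 0 + v \in H + K
\]

so H and K both sit inside H + K, and each is a subspace of H + K because it already satisfies the subspace properties as a subspace of V.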