"การฉาย" ที่จะเรียกว่าเป็นโปรเจคเวกเตอร์ ในการคำนวณโปรเจคชันของเวกเตอร์บนเวกเตอร์bคุณใช้ผลิตภัณฑ์ภายในของเวกเตอร์สองตัว:ab
aproj=⟨a,b⟩b
In this case, $\mathbf{a}_{\text{proj}}$ is the vector component of $\mathbf{a}$ that lies in the same direction as $\mathbf{b}$. In Euclidean space, the inner product operator is defined as the dot product:

$$\langle \mathbf{a}, \mathbf{b} \rangle = \mathbf{a} \cdot \mathbf{b} = \sum_{i=1}^{n} a_i b_i$$
where $n$ is the dimension of $\mathbf{a}$ and $\mathbf{b}$, and $a_i$ and $b_i$ are the $i$-th components of $\mathbf{a}$ and $\mathbf{b}$. Note that this is a signed quantity, so a negative value would mean that the angle between the two vectors is greater than 90 degrees, as illustrated by an alternative definition for the projection operator:
$$\mathbf{a}_{\text{proj}} = |\mathbf{a}| \cos(\theta)\, \hat{\mathbf{b}}$$

where $\theta$ is the angle between the two vectors.
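As a quick numerical sanity check of the two definitions above (a minimal sketch using NumPy; the vectors here are made-up placeholders, not from your problem):

```python
import numpy as np

# Hypothetical example vectors, chosen only for illustration.
a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

b_hat = b / np.linalg.norm(b)      # unit vector in the direction of b
scalar_proj = np.dot(a, b_hat)     # signed length of a along b
a_proj = scalar_proj * b_hat       # vector component of a along b

# Equivalent form using the angle between the vectors:
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
assert np.isclose(scalar_proj, np.linalg.norm(a) * cos_theta)
```

Here the scalar projection comes out to 3, so the projected vector is $(3, 0)$: the component of $\mathbf{a}$ along the first axis.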
So, given a vector $\mathbf{a}$ and a set of basis vectors $\mathbf{b}_i$, one can find "how much of $\mathbf{a}$" goes in the direction of each basis vector. Typically, those basis vectors will all be mutually orthogonal. In your case, the SVD is an orthogonal decomposition, so this condition should be satisfied. So, to accomplish what you describe, you would take the matrix of eigenvectors $U$ and calculate the inner product of the candidate vector $\mathbf{y}$ with each of the matrix's columns:
$$p_i = \mathbf{y} \cdot \mathbf{u}_i$$
The scalar value $p_i$ that you get from each inner product represents how well the vector $\mathbf{y}$ "lined up" with the $i$-th eigenvector. Since the eigenvectors are orthonormal, you could then reconstruct the original vector $\mathbf{y}$ as follows:
$$\mathbf{y} = \sum_{i=1}^{n} p_i \mathbf{u}_i$$
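The whole decompose-and-reconstruct procedure can be sketched in a few lines of NumPy (the matrix and candidate vector here are random placeholders standing in for your data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
U, s, Vt = np.linalg.svd(A)   # columns of U are orthonormal eigenvectors

y = rng.standard_normal(4)    # candidate vector
p = U.T @ y                   # p[i] = y . u_i for each column u_i
y_reconstructed = U @ p       # y = sum_i p[i] * u_i

assert np.allclose(y, y_reconstructed)
```

Because $U$ is orthogonal ($U U^T = I$), stacking the inner products into `U.T @ y` and summing back up with `U @ p` recovers $\mathbf{y}$ exactly.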
You asked whether this representation is unique; I'm not sure exactly what you mean, but it is not unique in the sense that a given vector $\mathbf{y}$ could be decomposed by projection onto any number of orthonormal bases. The eigenvectors contained in the matrix $U$ are one such example, but you could use any number of others. For instance, calculating the discrete Fourier transform of $\mathbf{y}$ can be viewed as projecting it onto an orthonormal basis of complex exponential vectors of varying frequency.
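To make the DFT-as-projection point concrete, here is one way to build that orthonormal basis explicitly (an illustrative construction, with the $1/\sqrt{n}$ normalization chosen so the basis vectors have unit norm; the signal is a random placeholder):

```python
import numpy as np

n = 8
rng = np.random.default_rng(1)
y = rng.standard_normal(n)

# Columns of F are complex exponentials of varying frequency,
# normalized by 1/sqrt(n) so that F is unitary (orthonormal columns).
idx = np.arange(n)
F = np.exp(2j * np.pi * np.outer(idx, idx) / n) / np.sqrt(n)

# Projection coefficients p_k = <y, f_k>; for complex vectors the
# inner product conjugates the basis vector.
p = F.conj().T @ y

# These match the unitary-scaled DFT, and the basis reconstructs y exactly.
assert np.allclose(p, np.fft.fft(y) / np.sqrt(n))
assert np.allclose(F @ p, y)
```

So the DFT coefficients are just the projections $p_k$ onto this particular orthonormal basis, exactly as the eigenvector coefficients were for $U$.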