Re: Basis vectors in merit network space
by sncc on 14/05/2018, 17:07:09 UTC
⭐ Merited by 2112 (1)
They act like "eigenvectors" in the merit network space.  Of course, the definition of eigenvectors in mathematics is stricter, but here I am abusing the terminology a little to convey the concept.
You are abusing the mathematics quite a bit, not "a little bit". But you are really close to the mathematically correct approach, which is

https://en.wikipedia.org/wiki/Singular-value_decomposition

It is a much better statistical analysis tool for the https://en.wikipedia.org/wiki/Least_squares problem than the old https://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm [1].

It has been a great pleasure to read your post.

References:
[1] Completely casual reference to theymos. I don't think we want to go into non-linear statistics.
Good to hear your feedback, and that you have been enjoying my post.

Right, "orthonormal basis vectors in a general vector space" would be the more precise analogy here.  As you noted, my point is how to find such a good basis, i.e. the analogous set of independent merit senders, in the entire merit network.  In an ordinary vector space this would be a well-defined problem: given a set of linearly independent vectors, the Gram-Schmidt orthogonalization process constructs an orthonormal basis, and the criterion for a "good" basis is well defined.  The corresponding process in the merit network is not clear to me, but at least the visualization provides an intuitive approach.
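Just to make the analogy concrete, here is a minimal Gram-Schmidt sketch in Python. The merit_vectors below are made-up numbers standing in for "merits user i sent to each receiver"; they are not real network data, and the whole thing is only meant to illustrate what "constructing an orthonormal basis" would look like if the merit data really were plain vectors.

import numpy as np

def gram_schmidt(vectors):
    # orthonormalize a set of vectors, dropping directions already spanned
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, b) * b for b in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-10:
            basis.append(w / norm)
    return np.array(basis)

# hypothetical merit vectors: row i = merit counts user i has sent to each receiver
merit_vectors = np.array([
    [3.0, 0.0, 1.0],
    [1.0, 2.0, 0.0],
    [0.0, 1.0, 4.0],
])

basis = gram_schmidt(merit_vectors)
print(np.round(basis @ basis.T, 6))   # ~ identity matrix: the basis is orthonormal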

I am not sure how the least-squares method would resolve the issue, since it is not clear to me what we should minimize or fit in order to identify independent merit senders, but you seem to have some idea?
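For concreteness, here is how I currently read the SVD / least-squares connection you may be hinting at. The sender-by-receiver matrix M and the target vector y below are invented for illustration only, and np.linalg.pinv would be the more robust way to build the pseudoinverse; this is just a sketch of the mechanics, not a claim about how the merit data should actually be modeled.

import numpy as np

# hypothetical sender-by-receiver merit matrix (rows: senders, columns: receivers)
M = np.array([
    [3.0, 0.0, 1.0, 0.0],
    [1.0, 2.0, 0.0, 1.0],
    [0.0, 1.0, 4.0, 0.0],
    [2.0, 0.0, 1.0, 3.0],
])

# SVD: M = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(M, full_matrices=False)
print("singular values:", s)   # the spectrum hints at how many independent sending patterns dominate

# least squares via the SVD pseudoinverse: minimize ||M x - y|| for an
# illustrative target vector y (the choice of y here is purely made up)
y = np.array([1.0, 0.0, 0.0, 1.0])
x = Vt.T @ np.diag(1.0 / s) @ U.T @ y   # assumes all singular values are nonzero
print("residual norm:", np.linalg.norm(M @ x - y))

If that is roughly what you meant, then the open question for me is still what plays the role of y, i.e. what the fit is supposed to reproduce in the merit network.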