The usual multiplication of vectors in linear algebra treats a vector as an \(n \times 1\) or \(1 \times n\) matrix, and that matrix-style multiplication is complicated: it is not even commutative. But you can just as validly define a simpler, component-wise multiplication (as GLSL or HLSL do) and still have a vector space, which makes the multiplication of vectors commutative.
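For concreteness, a minimal NumPy sketch of that component-wise (Hadamard) product, the `vec * vec` behaviour GLSL and HLSL expose; the array values here are just made-up examples:

```python
import numpy as np

# Component-wise (Hadamard) multiplication, the kind GLSL/HLSL
# use for vec * vec. Example values are arbitrary.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Matching components are multiplied, so the order cannot matter.
print(a * b)                          # [ 4. 10. 18.]
print(np.array_equal(a * b, b * a))   # True: commutative
```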
As the last time we talked about this: you can certainly add additional structure (like operators) to a vector space. You may even be able to do more with it then (although the polite way in mathematics is to give the resulting structure its proper name, e.g. an algebra), but that doesn't suddenly give all vector spaces that property.
Even so, if I remember correctly, the last time that discussion came up you needed something like getting rid of a vector \(v\) on the right in an expression \(Av\) for a suitable matrix \(A\). Even if you have a product \(*: V \times V \rightarrow V\), that does not work: first, not every vector has an inverse element with respect to \(*\). Second, even if you had the expression \(Ae\) (where \(e\) is the identity element of \(*\)), you cannot make the \(e\) vanish. It is the multiplicative neutral element for \(*\),
not for (matrix, vector) multiplication. The result of \(Ae\) is a vector, not another matrix, and definitely not \(A\) itself (except in the degenerate case of a one-dimensional vector space).
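To make that concrete, here is a small NumPy sketch (the matrix values are arbitrary): taking \(*\) to be the component-wise product from above, its neutral element \(e\) is the all-ones vector, and \(Ae\) collapses to a plain vector, so there is nothing left to "cancel":

```python
import numpy as np

# Assuming * is the component-wise product, whose neutral element e
# is the all-ones vector (e * v == v component by component).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
e = np.ones(2)

# A @ e is an ordinary matrix-vector product: the result is a vector
# (here the row sums of A), not the matrix A itself.
print(A @ e)   # [3. 7.]
```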
You surely agree then that there are plenty of differences, also in morphisms and transformations. Yes, but then the multiplication of matrices is not associative, if multiplication of matrices is understood as \(F(L(D(v)))\), where \(F\), \(L\), \(D\) are linear functions of the vector \(v\) (linear functions can be combined into one function).
I have no clue what you are talking about.
Function composition is always associative; you don't even need linear functions for that, it's a basic property. Writing the maps as \((F \circ L \circ D)(v)\) instead of as matrices has no influence on that: you still have \((F \circ L) \circ D = F \circ (L \circ D)\).
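If a numerical sanity check helps, here is a quick sketch with random matrices standing in for the linear maps \(F\), \(L\), \(D\) from above; both groupings give the same map, which is also the same as composing the functions directly:

```python
import numpy as np

# Random matrices standing in for the linear maps F, L, D.
rng = np.random.default_rng(0)
F, L, D = (rng.standard_normal((3, 3)) for _ in range(3))
v = rng.standard_normal(3)

left  = (F @ L) @ D    # (F ∘ L) ∘ D as a matrix product
right = F @ (L @ D)    # F ∘ (L ∘ D) as a matrix product

print(np.allclose(left @ v, right @ v))           # True: associative
print(np.allclose(left @ v, F @ (L @ (D @ v))))   # True: same as F(L(D(v)))
```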