So, in an earlier post, I mentioned a metric and an inner product in the same breath, so to speak, and we were talking at the time about tensor products.
Let me offer this totally off-topic aside, which some of you might find amusing......
Recall I defined a bilinear form on $$V$$ as a map $$V \times V \to \mathbb{R}$$, written $$\langle v,w \rangle \in \mathbb{R}$$ for $$v,\,w \in V$$. I mentioned this was not my preferred notation, so let's choose another, say
$$B:V \times V \to \mathbb{R}$$ so that $$B(v,w) \in \mathbb{R}$$
Consider first the form $$B(v,\,\cdot\,)$$ where I have fixed some vector $$v \in V$$, and the "dot" stands for any vector in $$V$$ of our choice. Then I will have that $$B(v,\,\cdot\,):V \to \mathbb{R}$$.
Let me define $$B(v,\,\cdot\,) \equiv b_1$$, so that for each and any $$w \in V$$ we have $$b_1(w) \in \mathbb{R}$$, which can only mean that $$b_1 \in V^*$$, the dual space of $$V$$.
Likewise I define $$B(\,\cdot\,,w) \equiv b_2$$ which, by the same argument, is an element of $$V^*$$.
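To make this concrete, here is a small numerical sketch (the matrix $$M$$ and all vectors are my own made-up examples, not from the original argument): fixing one slot of a bilinear form on $$\mathbb{R}^3$$ really does produce a linear functional, i.e. an element of the dual space.

```python
import numpy as np

# Sketch: represent a bilinear form on R^3 by a matrix M, so B(v, w) = v @ M @ w.
# M and the vectors below are illustrative choices, not canonical.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

def B(v, w):
    return v @ M @ w

v = np.array([1.0, 2.0, -1.0])

# Fixing the first slot gives b1 = B(v, .) : R^3 -> R, representable
# as the row (co)vector v @ M -- an element of the dual space.
b1 = v @ M

w1 = np.array([0.5, -1.0, 2.0])
w2 = np.array([3.0, 0.0, 1.0])

# b1 agrees with B when the first argument is held at v:
assert np.isclose(b1 @ w1, B(v, w1))

# And b1 is linear: b1(2*w1 + w2) = 2*b1(w1) + b1(w2)
assert np.isclose(b1 @ (2 * w1 + w2), 2 * (b1 @ w1) + b1 @ w2)
```

The same slicing in the second slot gives $$b_2 = M w$$ as a column, mirroring the definition above.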
The tensor product of these two covectors is defined by $$(b_1 \otimes b_2)(w,v) = b_1(w)\,b_2(v)$$, which is nothing more (and nothing less) than a tensor, in this case of type (0,2). (Strictly speaking, a form of the shape $$B = b_1 \otimes b_2$$ is a *simple* tensor; a general bilinear form is a sum of such simple tensors, but the simple case is enough to show the machinery.) Noting that a bilinear form induces the quadratic form $$Q(v) = B(v,v)$$, which in turn gives us the notion of the length of a vector, I will call this a metric tensor.
Let us now suppose that, relative to a basis $$\{\epsilon^j\}$$ of $$V^*$$, and assuming the $$\alpha_j,\,\beta_k$$ are elements of the real field, I may write
$$b_1 = \alpha_j \epsilon^j$$ (summation implied) and
$$b_2 = \beta_k \epsilon^k$$, then I will have that
$$B = (\alpha_j \epsilon^j)\otimes (\beta_k \epsilon^k)$$ (implied summation again)
$$ = \alpha_j\beta_k(\epsilon^j \otimes \epsilon^k)$$
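The expansion above can be checked numerically. In components (an illustrative sketch, with made-up $$\alpha,\,\beta$$), the simple tensor $$b_1 \otimes b_2$$ is the outer product of the two component arrays, and it acts on a pair of vectors by multiplying the two contractions:

```python
import numpy as np

# Sketch: covectors with components alpha_j, beta_k in the dual basis.
alpha = np.array([1.0, -2.0, 0.5])
beta  = np.array([3.0, 1.0, -1.0])

# The simple tensor b1 (x) b2 has components alpha_j * beta_k -- the outer
# product -- and acts by (b1 (x) b2)(v, w) = (alpha . v)(beta . w).
g = np.outer(alpha, beta)

v = np.array([2.0, 1.0, 1.0])
w = np.array([0.0, 1.0, 4.0])

assert np.isclose(v @ g @ w, (alpha @ v) * (beta @ w))
```

`np.outer` is exactly the array of products $$\alpha_j\beta_k$$ written out in the next display.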
Note that the products of the real numbers $$\alpha_j\beta_k$$ are taken pairwise, that is

$$\begin{pmatrix} \alpha_1\beta_1 & \alpha_1\beta_2 & \cdots & \alpha_1\beta_n \\ \alpha_2\beta_1 & \alpha_2\beta_2 & \cdots & \alpha_2\beta_n \\ \vdots & \vdots & \ddots & \vdots \\ \alpha_n\beta_1 & \alpha_n\beta_2 & \cdots & \alpha_n\beta_n \end{pmatrix}$$
This is of course a real $$n \times n$$ matrix, which we will call $$g_{jk}$$, and say this is the component form of our metric tensor. When the form is symmetric, $$B(v,w) = B(w,v)$$, the matrix satisfies $$g_{jk} = g_{kj}$$, as a metric tensor must.
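As a final sketch (again with my own illustrative numbers), take the familiar Euclidean metric on $$\mathbb{R}^3$$: in an orthonormal basis its component matrix is the identity, it is symmetric, and the induced quadratic form $$Q(v) = B(v,v)$$ is the squared length of a vector.

```python
import numpy as np

# Sketch: the Euclidean metric on R^3 in an orthonormal basis has g = I.
g = np.eye(3)
assert np.allclose(g, g.T)   # symmetry: g_jk = g_kj

v = np.array([3.0, 4.0, 0.0])
Q = v @ g @ v                # quadratic form Q(v) = B(v, v)
length = np.sqrt(Q)
assert np.isclose(length, 5.0)   # the familiar 3-4-5 right triangle
```

Swapping in a different symmetric positive-definite $$g$$ changes the notion of length accordingly, which is the whole point of calling it a metric.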
Of course this is just kid's stuff - the metric tensor is much more difficult to derive properly. I can try, but it is totally off-topic.