Softmax Regression

http://d2l.ai/chapter_linear-networks/softmax-regression.html

In the topic Log-Likelihood and others, what does 'n' represent?

n represents the number of observations (training examples) in the dataset.
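
For reference, here is roughly how n enters the log-likelihood in that section (my paraphrase of the book's formula, where x^{(i)} and y^{(i)} are the i-th example's features and label, and l is the per-example cross-entropy loss):

-\log P(Y \mid X) = \sum_{i=1}^{n} -\log P(y^{(i)} \mid x^{(i)}) = \sum_{i=1}^{n} l(y^{(i)}, \hat{y}^{(i)})

so n is simply the number of training examples being summed over.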

Can somebody explain how -\log p(y \mid x) = -\sum_j y_j \log \hat{y}_j ?

Here is my humble understanding:

Note that \hat{y}_j = P(y = j \mid x), and the label y is one-hot encoded (all entries are zero except a single one at the true class). So only the term for the true class survives in the sum:

-\sum_j y_j \log \hat{y}_j = -\log \hat{y}_y = -\log p(y \mid x)

where \hat{y}_y denotes the predicted probability of the true class.

Sorry for my poor English.
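
To make the identity concrete, here is a small numerical check (just a sketch with NumPy; the 3-class probabilities and the one-hot label are made-up values, not from the book):

```python
import numpy as np

# made-up softmax output over 3 classes for a single example
y_hat = np.array([0.1, 0.7, 0.2])

# one-hot label: the true class is index 1
y = np.array([0.0, 1.0, 0.0])

# cross-entropy written as a sum over all classes
loss_sum = -np.sum(y * np.log(y_hat))

# the same quantity, picking out the true class directly
loss_pick = -np.log(y_hat[1])

print(loss_sum, loss_pick)  # both print -log(0.7), about 0.3567
```

Both expressions give the same number, because the zero entries of the one-hot vector kill every term except the one for the true class.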