Is there any communication between parameter servers?

Could you show me the source code about it?

Could you add more details to your question? The parameter server for distributed training in MXNet is implemented in the KVStore API. Here are some details about it.

Gradients are communicated between the workers and the servers: workers push their gradients to the servers, which aggregate them and hold the updated parameters for workers to pull. Here's a tutorial post that explains the process in some detail.
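To make the push/pull pattern concrete, here is a toy in-process sketch of the semantics KVStore exposes. This is not MXNet source code; the `ToyKVStore` class and its methods are hypothetical, written only to illustrate how workers push gradients and pull parameters.

```python
class ToyKVStore:
    """Toy in-process model of parameter-server push/pull semantics.

    Hypothetical sketch for illustration only; real MXNet KVStore
    distributes keys across server processes over the network.
    """

    def __init__(self):
        self.store = {}  # key -> parameter value held on the "server"

    def init(self, key, value):
        # Initialize a parameter on the server.
        self.store[key] = value

    def push(self, key, grads, lr=0.1):
        # The server aggregates gradients from all workers,
        # then applies a simple SGD update.
        total = sum(grads)
        self.store[key] -= lr * total

    def pull(self, key):
        # Workers fetch the latest parameter value.
        return self.store[key]


kv = ToyKVStore()
kv.init("weight", 1.0)
kv.push("weight", [0.5, 0.5])  # two workers each push a gradient of 0.5
print(kv.pull("weight"))
```

In the real KVStore, keys are sharded across server processes and the aggregation/update step runs on the server side, which is why gradient traffic flows worker-to-server rather than directly between workers.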