How to get deterministic results across different runs?

I want to know how to make results deterministic across different runs. I am trying to parallelize a model with Horovod, and I want the results to be deterministic no matter how many processes are used, as long as the same hyper-parameters are used.

I started with the simplest MNIST example, https://github.com/apache/incubator-mxnet/blob/master/example/distributed_training-horovod/gluon_mnist.py. Since any differences between runs should come from the random seeds, I fixed the seeds by adding the following code to the file:

import mxnet as mx
import numpy as np
import random

# Fix the seeds of MXNet, NumPy, and Python's built-in random module
mx.random.seed(1234)
np.random.seed(1234)
random.seed(1234)
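
I am not sure whether mx.random.seed(1234) already seeds the generators on every device, so I also considered being explicit about it. A minimal sketch of what I mean (I don't know if this makes any difference for the MNIST script):

import mxnet as mx

# Seed the random generators on all devices (CPU and any GPUs),
# not only the default one
mx.random.seed(1234, ctx='all')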

I ran the file with only one process, but the output accuracy still differs between runs. I also tried setting shuffle to True in both the train and val iterators, but the results were still not reproducible. So how can I make the results deterministic across different runs?
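
For reference, this is roughly where I toggled shuffle. It is only a sketch: the real gluon_mnist.py builds its iterators differently (and shards the data across Horovod workers), so the names below are mine, not the example's.

import mxnet as mx
from mxnet import gluon

batch_size = 64

def transform(data, label):
    # Scale pixels to [0, 1]; keep the label as a float
    return data.astype('float32') / 255.0, label.astype('float32')

# Sketch only: shuffle set to True in both loaders, as described above
train_data = gluon.data.DataLoader(
    gluon.data.vision.MNIST(train=True, transform=transform),
    batch_size=batch_size, shuffle=True)
val_data = gluon.data.DataLoader(
    gluon.data.vision.MNIST(train=False, transform=transform),
    batch_size=batch_size, shuffle=True)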

I found the same issue reported at https://github.com/apache/incubator-mxnet/issues/10831, but it was never resolved.
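
One more thing I am unsure about is whether cuDNN algorithm selection could introduce non-determinism here. If it is relevant at all, I assume disabling the convolution autotuner would look something like this (set before MXNet is imported); I have not verified that it matters for this small example, especially on CPU:

import os

# Disable cuDNN convolution autotuning so the same algorithm is picked every run
os.environ['MXNET_CUDNN_AUTOTUNE_DEFAULT'] = '0'

import mxnet as mx  # import after setting the environment variable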