Why does a net fed with the same data produce different outputs?

I have a Net(gluon.Block) network that has already been trained by my training loop.

I am using the following code to feed the data into the network:

net = Net()
# training steps finished here
data = [data1, data2]  # two NDArrays, both with batch size 15
output = net(data)

With the same data1 and data2, the net can give different results! What are the possible reasons? I use np.argmax to pick the class with the largest softmax value from the output, and each time, with the same input, it can give a different prediction.
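For reference, a rough sketch of how I read off the prediction (assuming output holds the per-class softmax scores along axis 1):

import numpy as np

# convert the NDArray output to numpy and take the per-row argmax
pred = np.argmax(output.asnumpy(), axis=1)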

The batch size of my data is 15, and data1 consists of 15 identical records:

[[0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]
 [0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]
 [0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]
 ...
 [0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]
 [0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]
 [0.5579492  0.23680237 0.         ... 0.20596384 1.0790483  0.11396025]]
<NDArray 15x2048 @cpu(0)>

data2 has 5 distinct pieces, each repeated 3 times (15 matrices in total):

[[[-0.1518    0.38409   0.8934   ... -0.27123   0.22157   0.92112 ]
  [-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [ 0.085703 -0.22201   0.16569  ... -0.074273  0.75808  -0.34243 ]
  ...
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]]

 [[-0.1518    0.38409   0.8934   ... -0.27123   0.22157   0.92112 ]
  [-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [ 0.085703 -0.22201   0.16569  ... -0.074273  0.75808  -0.34243 ]
  ...
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]]

 [[-0.1518    0.38409   0.8934   ... -0.27123   0.22157   0.92112 ]
  [-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [ 0.085703 -0.22201   0.16569  ... -0.074273  0.75808  -0.34243 ]
  ...
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]
  [ 0.        0.        0.       ...  0.        0.        0.      ]]

 ...

 [[-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [-0.038194 -0.24487   0.72812  ... -0.1459    0.8278    0.27062 ]
  [ 0.38709   0.32629   0.64524  ... -0.8935    0.26669  -0.61397 ]
  ...
  [ 0.18599   0.37305   0.13079  ... -0.48638   1.0193    0.13099 ]
  [ 0.31039   0.64859   0.28481  ... -0.88554   0.91767  -0.57253 ]
  [ 0.40367   0.35096  -0.18594  ... -0.44149   0.14828  -0.068031]]

 [[-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [-0.038194 -0.24487   0.72812  ... -0.1459    0.8278    0.27062 ]
  [ 0.38709   0.32629   0.64524  ... -0.8935    0.26669  -0.61397 ]
  ...
  [ 0.18599   0.37305   0.13079  ... -0.48638   1.0193    0.13099 ]
  [ 0.31039   0.64859   0.28481  ... -0.88554   0.91767  -0.57253 ]
  [ 0.40367   0.35096  -0.18594  ... -0.44149   0.14828  -0.068031]]

 [[-0.54264   0.41476   1.0322   ... -1.2969    0.76217   0.46349 ]
  [-0.038194 -0.24487   0.72812  ... -0.1459    0.8278    0.27062 ]
  [ 0.38709   0.32629   0.64524  ... -0.8935    0.26669  -0.61397 ]
  ...
  [ 0.18599   0.37305   0.13079  ... -0.48638   1.0193    0.13099 ]
  [ 0.31039   0.64859   0.28481  ... -0.88554   0.91767  -0.57253 ]
  [ 0.40367   0.35096  -0.18594  ... -0.44149   0.14828  -0.068031]]]
<NDArray 15x12x100 @cpu(0)>

Does dropout have an effect on this?

Yes, dropout has an effect on the result of your network for the same input data. That's because there is randomness in whether a particular activation is zeroed out or not on each forward pass. You can set a random number seed to effectively make your network deterministic.
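For example, here is a minimal sketch (using a toy Sequential model in place of your Net, whose internals I don't know) of dropout producing different outputs in training mode, and of how seeding MXNet's random number generator makes the output reproducible:

import mxnet as mx
from mxnet import nd
from mxnet.gluon import nn

# toy stand-in for your Net: a Dense layer followed by Dropout
net = nn.Sequential()
net.add(nn.Dense(4), nn.Dropout(0.5))
net.initialize()

x = nd.ones((1, 8))

# Gluon applies Dropout only in training mode, i.e. inside
# autograd.record(); two passes there can differ:
with mx.autograd.record(train_mode=True):
    print(net(x))  # different dropout mask on each call
    print(net(x))

# seeding the RNG before each pass fixes the dropout mask:
mx.random.seed(42)
with mx.autograd.record(train_mode=True):
    out1 = net(x)
mx.random.seed(42)
with mx.autograd.record(train_mode=True):
    out2 = net(x)
# out1 and out2 are now identical

Note that Gluon's nn.Dropout is a no-op in prediction mode (outside autograd.record()), so if you still see randomness in plain inference calls, it's worth checking how dropout is invoked inside your Net.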