Hello

How can I assign class weights in gluon? I see the “weight” parameter in the loss function SoftmaxCrossEntropy, but the documentation is very poor.

Best regards

`weight` is just a global scalar value. You need to use `sample_weight`, which allows you to weight each sample in the batch.
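Since `sample_weight` is per-sample, you can emulate class weights by mapping each label to the weight of its class and passing the result as `sample_weight` (in Gluon that would be `loss_fn(pred, label, sample_weight)`). Here is a minimal pure-Python sketch of the idea; the function and variable names are illustrative, not part of the Gluon API:

```python
import math

def softmax_cross_entropy(logits, label, sample_weight=1.0):
    # softmax over the logits (shifted by the max for numerical stability)
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    probs = [e / sum(exps) for e in exps]
    # negative log-likelihood of the true class, scaled per sample
    return -math.log(probs[label]) * sample_weight

# a batch of two samples; samples of class 1 count twice as much
batch_logits = [[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]]
batch_labels = [0, 1]
class_weights = {0: 1.0, 1: 2.0, 2: 1.0}  # hypothetical per-class weights

losses = [
    softmax_cross_entropy(lg, lb, sample_weight=class_weights[lb])
    for lg, lb in zip(batch_logits, batch_labels)
]
```

The per-class weighting is realized entirely through the per-sample weight: each sample's loss is multiplied by the weight assigned to its label.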

If by “class weights” you mean class parameters like weights and biases, then here is an example:

```
class NormalizationHybridLayer(gluon.HybridBlock):
    def __init__(self, hidden_units, scales):
        super(NormalizationHybridLayer, self).__init__()

        with self.name_scope():
            # here you are creating your weights
            self.weights = self.params.get('weights',  # you can set the name to whatever you want
                                           shape=(hidden_units, 0),  # 0 lets the input dimension be inferred
                                           allow_deferred_init=True)

            # here you are creating another parameter
            self.scales = self.params.get('scales',
                                          shape=scales.shape,
                                          init=mx.init.Constant(scales.asnumpy().tolist()),  # convert to a regular list to keep this object serializable
                                          differentiable=False)

    def hybrid_forward(self, F, x, weights, scales):
        # min-max normalize the input to [0, 1]
        normalized_data = F.broadcast_div(F.broadcast_sub(x, F.min(x)),
                                          F.broadcast_sub(F.max(x), F.min(x)))
        weighted_data = F.FullyConnected(normalized_data, weights,
                                         num_hidden=self.weights.shape[0],
                                         no_bias=True)
        scaled_data = F.broadcast_mul(scales, weighted_data)
        return scaled_data
```

More info available here

=======================================

The `weight` argument that you saw in `SoftmaxCrossEntropyLoss` is just a scalar that gets multiplied into the final computed loss: if your softmax cross-entropy loss is L, then it will return weight * L.
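To make that concrete, here is a tiny pure-Python sketch (not the Gluon implementation itself) showing that a global `weight` only rescales the computed loss value:

```python
import math

def softmax_ce(logits, label, weight=1.0):
    # standard softmax cross-entropy for a single sample
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    nll = -math.log(exps[label] / sum(exps))
    return weight * nll  # the entire loss is scaled by the scalar weight

base = softmax_ce([1.0, 2.0, 0.5], 1)
scaled = softmax_ce([1.0, 2.0, 0.5], 1, weight=3.0)
```

Because the scaling is uniform across all samples and classes, a global `weight` cannot express per-class weighting; that is why `sample_weight` is the right tool for class imbalance.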

Hope this helps.