Looking for 2 functions in gluon

EarlyStop and UpSampling.
I didn't find anything for an early stopping function.
I do find UpSampling in the symbol and ndarray APIs, but not as a Gluon layer.
Please confirm.

AFAIK, upsampling is under development for Gluon with PyTorch-compatible ops: https://github.com/apache/incubator-mxnet/issues/9970


Thanks a lot for the information. I didn't notice that there was a plan when I read those posts… but I'm glad to know it's expected. For now, I guess I'll have to create my own layer.

With respect to UpSampling (and BilinearResize2D), these are operators without any learned parameters, so you don't need a dedicated Gluon layer to be able to use them in Gluon. If you're using a Sequential or HybridSequential block, you can wrap UpSampling in a small custom block class:

from mxnet import gluon


class UpSample(gluon.HybridBlock):
    """Wraps the parameter-free UpSampling operator as a Gluon block."""

    def __init__(self, scale, sample_type):
        super(UpSample, self).__init__()
        self.scale = scale
        self.sample_type = sample_type

    def hybrid_forward(self, F, x):
        # F is mx.nd in imperative mode and mx.sym once hybridized,
        # so the same code works either way.
        return F.UpSampling(x, scale=self.scale, sample_type=self.sample_type)
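As a quick sanity check, here is a minimal sketch of the block inside a HybridSequential, using nearest-neighbour sampling (which needs no learned weights; the bilinear mode of UpSampling additionally expects a weight argument). The layer sizes here are just illustrative:

import mxnet as mx

net = gluon.nn.HybridSequential()
net.add(gluon.nn.Conv2D(channels=8, kernel_size=3, padding=1))
net.add(UpSample(scale=2, sample_type='nearest'))
net.initialize()
net.hybridize()

x = mx.nd.random.uniform(shape=(1, 3, 16, 16))
print(net(x).shape)  # (1, 8, 32, 32): spatial dims doubled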

For EarlyStop, you can implement any logic you need in plain Python, because in Gluon you write the training loop yourself and have full access to the loss at every iteration. What would an example of an EarlyStop function in other frameworks look like?
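For instance, here is a minimal sketch of patience-based early stopping on a validation loss. `train_one_epoch` and `evaluate` are placeholders for your own training and validation code, and `patience`/`max_epochs` are assumed hyperparameters:

patience = 5                  # assumed: epochs to wait without improvement
best_loss = float('inf')
epochs_since_best = 0

for epoch in range(max_epochs):
    train_one_epoch(net, train_data)      # placeholder: your usual Gluon loop
    val_loss = evaluate(net, val_data)    # placeholder: validation pass

    if val_loss < best_loss:
        best_loss = val_loss
        epochs_since_best = 0
        net.save_parameters('best.params')  # keep best weights so far
                                            # (save_params on older MXNet)
    else:
        epochs_since_best += 1
        if epochs_since_best >= patience:
            print('Stopping early at epoch %d' % epoch)
            break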


Yes, an upsampling block is exactly what I planned to do, and many thanks for your code :grinning:
For early stopping, I guess you are right in general. I was using Keras and found it convenient, but I think that's because Keras trains through a simple fit/predict API, so early stopping is just an easy callback to pass in. In MXNet there is indeed no need for a built-in one.
Many thanks for both.