It seems that the random operators are not yet implemented for Gluon hybrid blocks?
[23:48:39] /Users/travis/build/dmlc/mxnet-distro/mxnet-build/dmlc-core/include/dmlc/logging.h:308: [23:48:39] src/c_api/c_api_ndarray.cc:180: Check failed: infershape[op](attrs, &in_shapes, &out_shapes)
Stack trace returned 8 entries:
[bt] (0) 0 libmxnet.so 0x00000001049a81f8 _ZN4dmlc15LogMessageFatalD2Ev + 40
[bt] (1) 1 libmxnet.so 0x00000001049a5db9 _ZN4dmlc15LogMessageFatalD1Ev + 9
[bt] (2) 2 libmxnet.so 0x000000010576d4d7 _Z12SetShapeTypePKN4nnvm2OpERKNS_9NodeAttrsERKN5mxnet7ContextERKNSt3__16vectorINS6_7NDArrayENSA_9allocatorISC_EEEEPSF_Pi + 6391
[bt] (3) 3 libmxnet.so 0x0000000105773da9 _Z20ImperativeInvokeImplRKN5mxnet7ContextEON4nnvm9NodeAttrsEPNSt3__16vectorINS_7NDArrayENS6_9allocatorIS8_EEEESC_PNS7_IbNS9_IbEEEESF_ + 537
[bt] (4) 4 libmxnet.so 0x000000010577713e MXInvokeCachedOp + 1534
[bt] (5) 5 libmxnet.so 0x0000000105777dec MXInvokeCachedOpEx + 28
[bt] (6) 6 _ctypes.cpython-36m-darwin.so 0x00000001045a92c7 ffi_call_unix64 + 79
[bt] (7) 7 ??? 0x00007fff5d9e4430 0x0 + 140734764041264
Segmentation fault: 11
I used a SELU activation function and it crashed. Is there a list somewhere of what remains to be implemented? Here is my hybrid_forward:
def hybrid_forward(self, F, x):
    # SELU: scaled ELU (self.scale * elu(x, alpha))
    x = self.scale * F.LeakyReLU(self.fc1(x), act_type='elu', slope=self.alpha)
    # Alpha dropout: this random_uniform call with an empty shape is where it crashes
    noise = F.random_uniform(shape=())
    # a and b are the alpha-dropout affine constants (defined elsewhere in the block)
    x = a * (self.alphaprime + 0.5 * (1.0 + F.sign(noise - self.dropout)) * (x - self.alphaprime)) + b
    x = self.scale * F.LeakyReLU(self.fc2(x), act_type='elu', slope=self.alpha)
    noise = F.random_uniform(shape=())
    x = a * (self.alphaprime + 0.5 * (1.0 + F.sign(noise - self.dropout)) * (x - self.alphaprime)) + b
    x = self.output(x)
    return x
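For reference, here is what the block above is trying to compute, sketched in plain NumPy (independent of the MXNet crash). The names SCALE, ALPHA, selu, and alpha_dropout are mine; the constants and the a/b affine correction follow the SELU paper (Klambauer et al., "Self-Normalizing Neural Networks"):

```python
import numpy as np

# SELU constants from the paper
SCALE = 1.0507009873554805
ALPHA = 1.6732632423543772

def selu(x):
    # scale * elu(x, alpha): identity for x > 0, scaled exponential for x <= 0
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

def alpha_dropout(x, rate, rng):
    # Dropped units are set to alpha' = -scale * alpha (the SELU negative
    # saturation value) instead of 0, then an affine transform (a, b)
    # restores zero mean and unit variance.
    alpha_prime = -SCALE * ALPHA
    keep = 1.0 - rate
    a = (keep + alpha_prime ** 2 * keep * rate) ** -0.5
    b = -a * alpha_prime * rate
    # keep-mask; plays the role of 0.5 * (1 + sign(noise - dropout)) above
    mask = rng.uniform(size=x.shape) >= rate
    return a * np.where(mask, x, alpha_prime) + b
```

With rate=0 the affine constants reduce to a=1, b=0 and the input passes through unchanged, which is a quick sanity check for the formula.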