I get a very large CCC value instead of ~1. Here is my setup:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

x_train = np.arange(0, 100.01, 0.01)
y_train = 2*x_train + 3  #**2 + 3

def build_model():
    # create model
    model = Sequential()
    model.add(Dense(128, input_dim=1, activation='relu'))
    model.add(Dense(128, activation='relu'))
    #model.add(Dropout(0.5))
    model.add(Dense(1))  #kernel_initializer='normal'
    # compile model; ccc is my custom CCC metric (defined elsewhere, not shown)
    model.compile(loss='mse', optimizer='adam', metrics=[ccc])
    return model

model = build_model()
model.summary()

# train model, use 30% of train data for validation/development
hist = model.fit(x_train, y_train, epochs=20, batch_size=16, validation_split=0.3)
Output:
Epoch 1/20
7000/7000 [==============================] - 3s 374us/step - loss: 325.9038 - ccc: 150045115.4181 - val_loss: 2.3211 - val_ccc: 1466207.0620
Epoch 2/20
7000/7000 [==============================] - 1s 143us/step - loss: 0.5625 - ccc: 159878994.8251 - val_loss: 0.7647 - val_ccc: 4424122.4748
Epoch 3/20
7000/7000 [==============================] - 1s 137us/step - loss: 0.1576 - ccc: 159900238.2171 - val_loss: 0.0623 - val_ccc: 45918605.1997
Epoch 4/20
7000/7000 [==============================] - 1s 140us/step - loss: 0.0011 - ccc: 159908508.6720 - val_loss: 0.0237 - val_ccc: 87057141.7451
Epoch 5/20
7000/7000 [==============================] - 1s 140us/step - loss: 6.4731e-09 - ccc: 159908571.9314 - val_loss: 0.0235 - val_ccc: 87494003.2358
Any idea what's wrong? The model's predictions are almost perfect: it gives [11.00 24.99] for the input [4 11].
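(For reference, the prediction check above was done roughly like this; the exact call is my reconstruction rather than part of the code shown, and it reuses the trained model from the snippet above:)

import numpy as np
# sanity check on two unseen inputs; the model should have learned y = 2*x + 3
print(model.predict(np.array([[4.0], [11.0]])))  # roughly [[11.0], [25.0]]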
-
-
This line is likely the source of the problem:
s_xy = 1.0 / (N - 1.0 + K.epsilon()) * K.sum((y_true - K.mean(y_true)) * (y_pred - K.mean(y_pred)))
Better to use:
s_xy = K.mean((y_true - K.mean(y_true)) * (y_pred - K.mean(y_pred)))
because N is often picked up as 1 instead of the actual number of samples (e.g. when y_pred is the output of a Keras Dense layer), so the 1.0 / (N - 1.0 + K.epsilon()) factor blows up to roughly 1/K.epsilon() and the metric becomes huge, which matches the values in your log. Taking K.mean over the centered products gives the covariance directly, without ever touching N.
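For completeness, here is a minimal sketch of the whole metric written in that style, assuming the standard definition of Lin's concordance correlation coefficient; this is an illustration, not necessarily the exact implementation used in the question:

from keras import backend as K

def ccc(y_true, y_pred):
    # Lin's concordance correlation coefficient:
    #   ccc = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2)
    mean_true = K.mean(y_true)
    mean_pred = K.mean(y_pred)
    var_true = K.mean(K.square(y_true - mean_true))             # (biased) variance of the targets
    var_pred = K.mean(K.square(y_pred - mean_pred))             # (biased) variance of the predictions
    s_xy = K.mean((y_true - mean_true) * (y_pred - mean_pred))  # covariance, no explicit N anywhere
    return (2.0 * s_xy) / (var_true + var_pred + K.square(mean_true - mean_pred) + K.epsilon())

Because every statistic is computed with K.mean, the batch size never has to be read from a tensor shape, and the metric stays close to 1 for a near-perfect linear fit instead of blowing up.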