I am trying to implement a custom metric that computes specificity for my semantic segmentation problem. When I train the model with this metric I keep getting the error below. The metric works fine while the training metrics are being reported, but it fails as soon as the validation metrics are computed, as shown below.
My implementation is as follows:
class Specificity(tf.keras.metrics.Metric):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.tn = tf.keras.metrics.TrueNegatives()
        self.fp = tf.keras.metrics.FalsePositives()

    def update_state(self, y_true, y_pred, sample_weight=None):
        self.tn.update_state(y_true, y_pred)
        self.fp.update_state(y_true, y_pred)

    def result(self):
        tn = self.tn.result()
        fp = self.fp.result()
        return tf.expand_dims(tf.divide(tn, tn + fp), axis=-1)

Output:
Epoch 1/200
2022-07-26 10:16:25.238678: I tensorflow/stream_executor/cuda/cuda_dnn.cc:384] Loaded cuDNN version 8302
2022-07-26 10:16:26.083645: I tensorflow/core/platform/default/subprocess.cc:304] Start cannot spawn child process: No such file or directory
100/100 [==============================] - ETA: 0s - loss: 0.7601 - iou_score: 0.5211 - precision: 0.6069 - recall: 0.7806 - f1-score: 0.6732 - auc: 0.9403 - specificity: 0.8967
Traceback (most recent call last):
File "/home/bhattrai/corneal_neovascularization_tf2/test_statistics.py", line 892, in <module>
history = model.fit(my_generator, validation_data=validation_datagen, steps_per_epoch=100, validation_steps=100,
File "/home/bhattrai/.local/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/home/bhattrai/.local/lib/python3.9/site-packages/keras/backend.py", line 4028, in batch_set_value
x.assign(np.asarray(value, dtype=dtype_numpy(x)))
ValueError: Cannot assign value to variable ' accumulator:0': Shape mismatch.The variable shape (1,), and the assigned value shape () are incompatible.
2022-07-26 10:16:39.142482: W tensorflow/core/kernels/data/generator_dataset_op.cc:108] Error occurred when finalizing GeneratorDataset iterator: FAILED_PRECONDITION: Python interpreter state is not initialized. The process may be terminated.
[[{{node PyFunc}}]]
I tried implementing it another way, as shown below, but when I use it during model training it always reports 0.0000.
import tensorflow as tf

class Specificity(tf.keras.metrics.Metric):
    def __init__(self, name='specificity', **kwargs):
        super().__init__(name=name, **kwargs)
        self.tn = self.add_weight(name='tn', initializer='zeros')
        self.fp = self.add_weight(name='fp', initializer='zeros')
        self.tnr = self.add_weight(name='tnr', initializer='zeros')

    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true = tf.cast(y_true, tf.bool)
        y_pred = tf.cast(y_pred, tf.bool)
        # Getting the True Negatives
        values = tf.logical_and(tf.equal(y_true, False), tf.equal(y_pred, False))
        values = tf.cast(values, self.dtype)
        self.tn.assign_add(tf.reduce_sum(values))
        # Getting the False Positives
        values = tf.logical_and(tf.equal(y_true, False), tf.equal(y_pred, True))
        values = tf.cast(values, self.dtype)
        self.fp.assign_add(tf.reduce_sum(values))
        self.tnr.assign_add(tf.divide(self.tn, tf.add(self.tn, self.fp)))

    def result(self):
        return self.tnr

Output:
m = Specificity()
m.update_state([0, 1, 0, 0], [0, 1, 0, 0])
X = m.result().numpy()
1.0

But when I train a model with it:
X = tf.random.normal(shape=(100, 256, 256, 3))
Y = tf.random.uniform(minval=0, maxval=2, shape=(100, 256, 256, 1), dtype=tf.int32)

dataset = tf.data.Dataset.from_tensor_slices((X, Y))
train = dataset.take(80)
val = dataset.skip(80)
train = train.cache().shuffle(1000).batch(32).prefetch(tf.data.experimental.AUTOTUNE)
val = val.cache().shuffle(1000).batch(32).prefetch(tf.data.experimental.AUTOTUNE)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(256, 256, 3)),
    tf.keras.layers.Conv2D(16, 3, padding='same'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, padding='same'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, padding='same'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2DTranspose(64, strides=2, padding='same', kernel_size=3),
    tf.keras.layers.Conv2DTranspose(32, strides=2, padding='same', kernel_size=3),
    tf.keras.layers.Conv2DTranspose(32, strides=2, padding='same', kernel_size=3),
    tf.keras.layers.Conv2D(1, 3, activation='sigmoid', padding='same')
])

model.compile(loss=tf.keras.losses.BinaryCrossentropy(),
              optimizer='adam',
              metrics=[Specificity(), 'accuracy'])
model.fit(train, validation_data=val, epochs=2)
Epoch 1/2
3/3 [==============================] - 16s 5s/step - loss: 0.6939 - specificity: 0.0000e+00 - accuracy: 0.5003 - val_loss: 0.6934 - val_specificity: 0.0000e+00 - val_accuracy: 0.4999
Epoch 2/2
3/3 [==============================] - 15s 5s/step - loss: 0.6933 - specificity: 0.0000e+00 - accuracy: 0.5004 - val_loss: 0.6933 - val_specificity: 0.0000e+00 - val_accuracy: 0.5002
<keras.callbacks.History at 0x7f9fdb2c1250>

Answered 2022-07-27 10:01:08
I can't test it right now, but I always define custom metrics as plain functions rather than Metric subclasses; maybe you can try that. Also, in this post I found an implementation of a specificity metric that you may be able to adapt to your needs.
Here is the code:
import numpy as np
import tensorflow as tf
from keras import backend as K

def specificity(y_true, y_pred):
    tn = K.sum(K.round(K.clip((1 - y_true) * (1 - y_pred), 0, 1)))
    fp = K.sum(K.round(K.clip((1 - y_true) * y_pred, 0, 1)))
    return tn / (tn + fp + K.epsilon())

model = tf.keras.Sequential(...)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.BinaryCrossentropy(),
    metrics=[
        "accuracy",
        specificity
    ]
)
model.fit(train, validation_data=val, epochs=2)

Update:
In your subclassed version, the problem may be related to the missing reset_states method (which clears the state after each epoch). Try adding the following to your class:
def reset_states(self):
    self.tn.assign(0)
    self.fp.assign(0)
    self.tnr.assign(0)

https://stackoverflow.com/questions/73119989
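For completeness, one likely reason the second subclassed version reports 0.0000 is that tf.cast maps every nonzero float to True, so the sigmoid probabilities are almost never counted as predicted negatives and TN stays at zero; accumulating the ratio in update_state instead of computing it in result() compounds the problem. The sketch below (my own, untested against the original data pipeline; the 0.5 threshold is an assumption) thresholds predictions explicitly, keeps only the raw counts as state, and adds the reset method:

```python
import tensorflow as tf

class Specificity(tf.keras.metrics.Metric):
    """Specificity (true-negative rate) = TN / (TN + FP)."""

    def __init__(self, name="specificity", threshold=0.5, **kwargs):
        super().__init__(name=name, **kwargs)
        self.threshold = threshold  # assumed cutoff for sigmoid outputs
        self.tn = self.add_weight(name="tn", initializer="zeros")
        self.fp = self.add_weight(name="fp", initializer="zeros")

    def update_state(self, y_true, y_pred, sample_weight=None):
        # Threshold the float probabilities explicitly; casting a nonzero
        # float straight to bool always yields True, which zeroes out TN.
        y_true = tf.cast(y_true, tf.bool)
        y_pred = tf.greater_equal(tf.cast(y_pred, tf.float32), self.threshold)
        tn = tf.logical_and(tf.logical_not(y_true), tf.logical_not(y_pred))
        fp = tf.logical_and(tf.logical_not(y_true), y_pred)
        self.tn.assign_add(tf.reduce_sum(tf.cast(tn, self.dtype)))
        self.fp.assign_add(tf.reduce_sum(tf.cast(fp, self.dtype)))

    def result(self):
        # Compute the ratio once from the running counts; never accumulate it.
        return tf.math.divide_no_nan(self.tn, self.tn + self.fp)

    def reset_state(self):  # named reset_states in older Keras versions
        self.tn.assign(0.0)
        self.fp.assign(0.0)
```

A quick sanity check: for y_true = [0, 1, 0, 0] and y_pred = [0.1, 0.9, 0.2, 0.8], the thresholded predictions are [0, 1, 0, 1], giving TN = 2 and FP = 1, so the metric should return 2/3.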