Reproducible Result?
- While training a CNN model with tensorflow in Colab, I ran into a case where the loss and accuracy came out differently on every run, even with identical layers and settings.
- After some googling, the cause turned out to be the random seed and GPU operations, which make the results vary from run to run.
- I wanted to see how the loss and accuracy change depending on the layers and settings, so I wanted to configure the runs to produce reproducible results.
- After applying a few settings, I confirmed that identical layers and settings now yield identical results.

Required Settings
- Call the set_random_seed method from tensorflow's utils to set the seed value.
- Call the enable_op_determinism method from tensorflow's config.
- If you want to keep testing without restarting the notebook session, call the clear_session method from tensorflow's backend (see the sketch after this list).
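Putting the three calls together, a minimal setup sketch looks like the following (the seed value 2045 simply matches the notebook code below):

import tensorflow as tf

# Optional: reset Keras' global state when re-running cells in the same notebook session
tf.keras.backend.clear_session()

# Force TensorFlow ops to run deterministically (this can slow training down)
tf.config.experimental.enable_op_determinism()

# Fix the Python, NumPy, and TensorFlow seeds in one call
tf.keras.utils.set_random_seed(2045)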
Notebook Code
import tensorflow as tf
import numpy as np
from keras.datasets import cifar10
from keras.utils import to_categorical
from sklearn.model_selection import train_test_split
from keras import models, layers
from keras.callbacks import EarlyStopping, ModelCheckpoint
from sklearn.metrics import confusion_matrix, classification_report

(X_train, y_train), (X_test, y_test) = cifar10.load_data()
print("Train Shape", X_train.shape, y_train.shape)
print("Test Shape", X_test.shape, y_test.shape)

print("Train Data")
print(np.unique(y_train, return_counts=True))
print("Test Data")
print(np.unique(y_test, return_counts=True))

# Scale pixel values to [0, 1] and one-hot encode the labels
X_train = X_train.astype(float) / 255
X_test = X_test.astype(float) / 255
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)

X_train, X_valid, y_train, y_valid = train_test_split(X_train, y_train, test_size=0.2, random_state=2045)
print("Train Shape:", X_train.shape, y_train.shape)
print("Valid Shape:", X_valid.shape, y_valid.shape)

# Reset the session, enable deterministic ops, and fix the seed before building the model
tf.keras.backend.clear_session()
tf.config.experimental.enable_op_determinism()
tf.keras.utils.set_random_seed(2045)

CIFAR = models.Sequential()
CIFAR.add(layers.Conv2D(64, 2, input_shape=(32, 32, 3)))
CIFAR.add(layers.MaxPool2D(3))
CIFAR.add(layers.Conv2D(32, 2))
CIFAR.add(layers.MaxPool2D(3))
CIFAR.add(layers.Flatten())
CIFAR.add(layers.Dense(1024, activation='relu'))
CIFAR.add(layers.Dropout(0.6))
CIFAR.add(layers.Dense(512, activation='relu'))
CIFAR.add(layers.Dropout(0.4))
CIFAR.add(layers.Dense(128, activation='relu'))
CIFAR.add(layers.Dense(10, activation='softmax'))

CIFAR.summary()

CIFAR.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

%%time
batch_size = 128
es = EarlyStopping(monitor='accuracy', mode='max', patience=50, verbose=1)
mc = ModelCheckpoint('best_CIFAR.h5', monitor='accuracy', mode='max', save_best_only=True, verbose=1)
Hist_CIFAR = CIFAR.fit(X_train, y_train, epochs=10, batch_size=batch_size,
                       callbacks=[es, mc], validation_data=(X_valid, y_valid),
                       workers=1)

best_CIFAR = models.load_model('best_CIFAR.h5')
loss, accuracy = best_CIFAR.evaluate(X_test, y_test, verbose=0)
print('Loss = {:.5f}'.format(loss))
print('Accuracy = {:.5f}'.format(accuracy))

y_real = np.argmax(y_test, axis=1)
predictions = best_CIFAR.predict(X_test)
y_pred = np.argmax(predictions, axis=1)
print(confusion_matrix(y_real, y_pred))
print(classification_report(y_real, y_pred))
Test Results
- Repeated runs produce the same results.
- However, when the runs are made on different hardware, the results differ depending on the hardware.
- The results from a local run and a Colab run came out different (a short sketch for recording the run environment follows below).
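Because the results only match on the same setup, it helps to record the environment each run was made on; this is not from the original notebook, just a minimal sketch:

import tensorflow as tf

# Log the software and hardware a run was made on, so that differing results
# between machines (e.g. local vs. Colab) can be attributed to the environment
print("TensorFlow:", tf.__version__)
print("GPUs:", tf.config.list_physical_devices('GPU'))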