MargNet: Distinguishing Compact and Faint Galaxies from Stars and Quasars¶

  • Email: atharvabagul2000@gmail.com
  • LinkedIn: https://www.linkedin.com/in/atharva-bagul-2407/
  • Paper: MargNet Paper
  • Objective : Develop a deep learning-based classifier, MargNet, to identify stars, quasars, and compact galaxies using photometric parameters and images from the SDSS DR16 catalogue.
  • Architecture : MargNet is a multimodal model that integrates a Convolutional Neural Network (CNN) and an Artificial Neural Network (ANN) in a stacking ensemble, identifying sources efficiently with high accuracy and scalability (see the sketch after this list).
  • Dataset : Trained on a curated dataset comprising 240,000 compact objects and 150,000 faint objects from SDSS DR16 (details below).
  • Performance : MargNet excels at distinguishing compact galaxies from stars and quasars, especially at fainter magnitudes, outperforming existing methods.
  • Future Applications : This approach is well suited to upcoming astronomical surveys, such as the Dark Energy Survey (DES) and the Vera C. Rubin Observatory LSST.
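For concreteness, here is a minimal, hypothetical sketch of such a CNN + ANN fusion using the Keras functional API. The cutout size (32×32×5), layer widths, and layer count are illustrative assumptions, not the exact MargNet configuration.

import tensorflow as tf
from tensorflow.keras import layers

# CNN branch: 5-band image cutouts (32x32x5 is an assumed size).
img_in = tf.keras.Input(shape=(32, 32, 5))
x = layers.Conv2D(32, 3, activation="relu")(img_in)
x = layers.MaxPool2D()(x)
x = layers.Flatten()(x)

# ANN branch: the 24 photometric features per object.
feat_in = tf.keras.Input(shape=(24,))
p = layers.Dense(64, activation="relu")(feat_in)

# Fuse the two branches and classify into GALAXY / QSO / STAR.
merged = layers.concatenate([x, p])
out = layers.Dense(3, activation="softmax")(merged)

model = tf.keras.Model(inputs=[img_in, feat_in], outputs=out)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])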

Beautiful galaxies¶

Our Dataset¶

Bright Galaxies¶

In [4]:
from PIL import Image
im=Image.open('/Users/atharvabagul/Downloads/bright_galaxies_6x6.png')
display(im)

Compact Galaxies¶

In [3]:
from PIL import Image
im=Image.open('/Users/atharvabagul/Downloads/medium_galaxies_9x9.png')
display(im)

Faint & Compact Galaxies¶

In [5]:
from PIL import Image
im=Image.open('/Users/atharvabagul/Downloads/dim_galaxies_4x4.png')
display(im)

Faintness-Compactness¶

While some definitions of compactness are based on the galaxy’s magnitude (Fairall 1978; Hickson 1982), we do not use any magnitude criterion in our work, because low surface brightness (LSB) galaxies can have extended morphology despite being faint. Instead, we define the compactness parameter c for an object as c = ⟨deVRad / FWHM⟩

Here, deVRad is the half-light radius (also known as the de Vaucouleurs radius) of the galaxy, ⟨⟩ denotes an average over the five passbands u, g, r, i, and z, and FWHM denotes the full width at half-maximum of the PSF. To establish the limit on c below which classification becomes difficult, we trained a random forest classifier using photometric parameters on a small random data set of spectroscopically identified stars, galaxies, and quasars, and evaluated its performance via classification accuracy. By retraining the random forest on training samples in small bins of c, we can see how the value of c affects the performance.
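As an illustration, c can be computed from an SDSS photometric table along these lines. This is a minimal sketch: the column names (deVRad_u, psffwhm_u, ...) are assumed placeholders that must match the actual SDSS query output.

import pandas as pd

def compactness(df: pd.DataFrame) -> pd.Series:
    # c = <deVRad / FWHM>, averaged over the five SDSS passbands.
    bands = ["u", "g", "r", "i", "z"]
    return sum(df[f"deVRad_{b}"] / df[f"psffwhm_{b}"] for b in bands) / len(bands)

# df["c"] = compactness(df)
# compact = df[df["c"] < 0.5]   # compact-source selection used in this work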

In [2]:
from PIL import Image
im=Image.open('/Users/atharvabagul/MargNet/Model_Performance-1.png')
display(im)

When choosing objects to constitute our faint data set, we require that the average magnitude in the five passbands satisfies ⟨mag⟩ > 20.

We choose 20 as the cut-off point for two reasons. First, it is beyond this level of faintness that traditional star–galaxy classifiers start to fail, with a large drop in performance at r > 21 (Kim & Brunner 2017; Cabayol et al. 2019). Second, enough SDSS samples satisfy this cut-off.
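In code, this faintness cut is a one-liner on the same hypothetical table as the compactness sketch above (the modelMag_* column names are again assumptions):

mag_cols = [f"modelMag_{b}" for b in "ugriz"]
faint = df[df[mag_cols].mean(axis=1) > 20]    # <mag> > 20
faint_compact = faint[faint["c"] < 0.5]       # faint AND compact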

Step 0 -- Getting the prerequisites¶

In [1]:
import os
import numpy as np
np.random.seed(69)
import pandas as pd
import random
import pickle as pkl
import matplotlib.pyplot as plt
import matplotlib.image as img
import seaborn as sns
sns.set()
import tensorflow as tf
from tqdm.notebook import tqdm
from tensorflow.keras.utils import to_categorical, plot_model
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense,concatenate, Dropout, Flatten, Conv2D, MaxPool2D, BatchNormalization, ZeroPadding2D, LeakyReLU, ReLU, AveragePooling2D
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping
from tensorflow.keras.models import load_model
from sklearn import metrics
from sklearn import preprocessing
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
import time

1. Create Train/Val/Test Sets¶

  • MargNet GitHub: https://github.com/sidchaini/MargNet

The datasets are organized by experiment; here is what each file contains:

  • objlist.npy - the SDSS ObjID of each object
  • X.npy - the 5-band image for each object (used by the CNN)
  • dnnx.npy - the set of 24 photometric features for each object (used by the DNN)
  • y.npy - the classification label for each object
  • photofeatures.csv - SDSS spreadsheet containing all the features from dnnx, the labels from y, the ObjIDs from objlist, and a few more SDSS-specific parameters.

Note: objlist, X, dnnx and y are in the same order. So, objlist[0], X[0], dnnx[0] and y[0] correspond to the same object.
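A quick sanity check of this alignment, as a sketch using the Experiment 1 files loaded below:

import numpy as np

objlist = np.load("../dataset/objlist_exp1.npy")
X = np.load("../dataset/X_exp1.npy")
dnnx = np.load("../dataset/dnnx_exp1.npy")
y = np.load("../dataset/y_exp1.npy", allow_pickle=True)

# All four arrays are index-aligned, so their lengths must agree.
assert len(objlist) == len(X) == len(dnnx) == len(y)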

Experiments!¶

We divide our problem into three experiments.

  • Experiment 1 : In this experiment, all three sets – training, validation, and test – are chosen from the compact source data set (c < 0.5), which is split in the ratio 6:1:1 (i.e. 75 per cent training, 12.5 per cent validation, and 12.5 per cent test; see the quick check after this list).

  • Experiment 2 : In this experiment, all three sets – training, validation, and test – are chosen from the faint and compact source data set (c < 0.5; ⟨mag⟩ > 20) and split in the ratio 8:1:1 (i.e. 80 per cent training, 10 per cent validation, and 10 per cent test). Here, we split our data differently from Experiments 1 and 3, as there were just 50,000 objects from each class in our data set.

  • Experiment 3 : In this experiment, the training and validation sets are chosen from the compact source data set (c < 0.5), while the test set is chosen from the faint and compact source data set (c < 0.5; ⟨mag⟩ > 20), such that the ratio is 6:1:1 (i.e. 75 per cent training, 12.5 per cent validation, 12.5 per cent test).
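As a quick check of the 6:1:1 arithmetic used in Experiments 1 and 3 (matching the two-stage train_test_split in the cell below):

test_frac = 0.125                  # 1/8 of everything -> test
remaining = 1 - test_frac          # 0.875 left after the test split
val_frac = 0.1428 * remaining      # ~0.125 of the full data -> validation
train_frac = remaining - val_frac  # ~0.750 -> training
print(train_frac, val_frac, test_frac)   # ~0.750 0.125 0.125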

In [2]:
X = np.load("../dataset/X_exp1.npy")
In [3]:
dnnx = np.load("../dataset/dnnx_exp1.npy")
In [4]:
objlist = np.load("../dataset/objlist_exp1.npy")
In [5]:
y = np.load("../dataset/y_exp1.npy", allow_pickle=True)

y, label_strings = pd.factorize(y,sort=True)
y = to_categorical(y)

print(label_strings)
['GALAXY' 'QSO' 'STAR']
In [6]:
zipX = list(zip(X, dnnx))      # keep each object's image and features paired
zipy = list(zip(y, objlist))   # keep each object's label and ObjID paired

# First split off 12.5% for the test set; then take 14.28% of the remaining
# 87.5% (~12.5% of the total) for validation, leaving 75% for training.
zipX_train, zipX_test, zipy_train, zipy_test = train_test_split(zipX, zipy, test_size = 0.125,random_state=42)
zipX_train, zipX_val, zipy_train, zipy_val = train_test_split(zipX_train, zipy_train, test_size = 0.1428, random_state=42)

X_train, dnnx_train = zip(*zipX_train)
X_val, dnnx_val = zip(*zipX_val)
X_test, dnnx_test = zip(*zipX_test)

y_train, objlist_train = zip(*zipy_train)
y_val, objlist_val = zip(*zipy_val)
y_test, objlist_test = zip(*zipy_test)

X_train = np.array(X_train)
X_val = np.array(X_val)
X_test = np.array(X_test)

dnnx_train = np.array(dnnx_train)
dnnx_val = np.array(dnnx_val)
dnnx_test = np.array(dnnx_test)

y_train = np.array(y_train)
objlist_train = np.array(objlist_train)
y_val = np.array(y_val)
objlist_val = np.array(objlist_val)
y_test = np.array(y_test)
objlist_test = np.array(objlist_test)


del(zipX,zipX_test,zipX_train,zipX_val, X, zipy, zipy_test, zipy_train, zipy_val, objlist)
In [7]:
def get_metrics(y_pred, y_test, labels, to_print=True):
    correct_labels = np.where(y_pred==y_test)[0]
    accuracy = metrics.accuracy_score(y_test, y_pred)
    precision = metrics.precision_score(y_test, y_pred, average='macro')
    recall = metrics.recall_score(y_test, y_pred, average='macro')
    f1score = metrics.f1_score(y_test, y_pred, average='macro')
    # rocscore = metrics.roc_auc_score(y_test, y_pred, average='micro', multi_class="ovo")
    confusion_matrix = metrics.confusion_matrix(y_test, y_pred)
    classification_report = metrics.classification_report(y_test, y_pred)

    if to_print:
        print("Identified {} correct labels out of {} labels".format(len(correct_labels), y_test.shape[0]))
        print("Accuracy:",accuracy)
        print("Precision:",precision)
        print("Recall:",recall)
        print("F1 Score:",f1score)
        # print("ROC AUC Score:",rocscore)
        print(f"Labels are: {labels}")
        print("Confusion Matrix:\n", confusion_matrix)
        print("Classification_Report:\n", classification_report)

    return (correct_labels, accuracy, precision, recall, f1score, confusion_matrix, classification_report)
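Typical usage once a model has been trained (a sketch; get_metrics expects integer class labels, so one-hot predictions are converted with argmax first):

# y_prob = model.predict(dnnx_test)
# y_pred = np.argmax(y_prob, axis=1)
# y_true = np.argmax(y_test, axis=1)
# get_metrics(y_pred, y_true, labels=label_strings)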
In [8]:
def plot_model_change(history,fname="output/time.pdf"):
    # summarize history for accuracy
    plt.plot(history.history['accuracy'],label="Training Acc")
    plt.plot(history.history['val_accuracy'],label="Val Acc")
    plt.title('model accuracy')
    plt.ylabel('accuracy')
    plt.xlabel('epoch')
    plt.legend()
    plt.show()
    # summarize history for loss
    plt.plot(history.history['loss'],label="Training Loss")
    plt.plot(history.history['val_loss'],label="Val Loss")
    plt.title('model loss')
    plt.ylabel('loss')
    plt.xlabel('epoch')
    plt.legend()
    plt.savefig(fname)
    plt.show()

2. Train a DNN Classifier¶

Mathematically, the output y of an artificial neuron can be written as y = f(Σ w·x + b)

A practical ANN consists of many such neurons arranged in multiple layers, with interconnections between neurons in different layers. Training the neural network refers to learning the weights w and bias value(s) b in order to minimize the departure between the network output y and the expected output.
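As a worked example with arbitrary numbers, a single sigmoid neuron computes:

import numpy as np

x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.4, 0.1, -0.2])   # weights
b = 0.05                         # bias

z = np.dot(w, x) + b             # weighted sum w.x + b  (= -0.47 here)
y = 1 / (1 + np.exp(-z))         # sigmoid activation f
print(round(float(y), 3))        # 0.385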

In [7]:
from PIL import Image
im=Image.open('/Users/atharvabagul/MargNet/DNN-1.png')
im = im.resize((200, 600), Image.LANCZOS)
display(im)
In [9]:
model = Sequential()

# Fully connected stack on the 24 photometric features: sigmoid hidden
# activations with dropout after every layer.
model.add(Dense(1024, activation="sigmoid", input_dim=dnnx_train.shape[1]))
model.add(Dropout(0.25))
model.add(Dense(256, activation="sigmoid"))
model.add(Dropout(0.25))
model.add(Dense(128, activation="sigmoid"))
model.add(Dropout(0.25))
model.add(Dense(64, activation="sigmoid"))
model.add(Dropout(0.25))
model.add(Dense(32, activation="sigmoid"))
model.add(Dropout(0.25))

# Softmax over the three classes: GALAXY, QSO, STAR.
model.add(Dense(3, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer="adam",
              metrics=['accuracy'])

# Stop once val_loss has not improved for 100 epochs, restoring the best weights.
es = EarlyStopping(monitor='val_loss', verbose=0, patience=100, restore_best_weights=True)
cb = [es]
history = model.fit(dnnx_train, y_train,
                    batch_size=2048,
                    epochs = 4000,
                    validation_data = (dnnx_val,y_val),
                    callbacks = cb,
                    verbose = 2)
Epoch 1/4000
88/88 - 1s - loss: 1.1102 - accuracy: 0.3661 - val_loss: 0.9145 - val_accuracy: 0.5666
Epoch 2/4000
88/88 - 0s - loss: 0.7676 - accuracy: 0.6128 - val_loss: 0.6135 - val_accuracy: 0.7483
Epoch 3/4000
88/88 - 0s - loss: 0.6160 - accuracy: 0.7467 - val_loss: 0.5024 - val_accuracy: 0.8202
Epoch 4/4000
88/88 - 0s - loss: 0.5425 - accuracy: 0.8003 - val_loss: 0.4591 - val_accuracy: 0.8373
Epoch 5/4000
88/88 - 0s - loss: 0.5017 - accuracy: 0.8210 - val_loss: 0.4350 - val_accuracy: 0.8483
Epoch 6/4000
88/88 - 0s - loss: 0.4810 - accuracy: 0.8298 - val_loss: 0.4240 - val_accuracy: 0.8502
Epoch 7/4000
88/88 - 0s - loss: 0.4660 - accuracy: 0.8348 - val_loss: 0.4141 - val_accuracy: 0.8518
Epoch 8/4000
88/88 - 0s - loss: 0.4496 - accuracy: 0.8391 - val_loss: 0.3981 - val_accuracy: 0.8575
Epoch 9/4000
88/88 - 0s - loss: 0.4345 - accuracy: 0.8421 - val_loss: 0.3835 - val_accuracy: 0.8598
Epoch 10/4000
88/88 - 0s - loss: 0.4196 - accuracy: 0.8454 - val_loss: 0.3682 - val_accuracy: 0.8626
[... epochs 11-484 elided: training and validation loss fall steadily, with val_accuracy rising from ~0.86 to ~0.93 ...]
Epoch 485/4000
88/88 - 0s - loss: 0.1952 - accuracy: 0.9254 - val_loss: 0.1880 - val_accuracy: 0.9275
Epoch 486/4000
88/88 - 0s - loss: 0.1960 - accuracy: 0.9249 - val_loss: 0.1881 - val_accuracy: 0.9283
Epoch 487/4000
88/88 - 0s - loss: 0.1956 - accuracy: 0.9253 - val_loss: 0.1861 - val_accuracy: 0.9287
Epoch 488/4000
88/88 - 0s - loss: 0.1981 - accuracy: 0.9240 - val_loss: 0.1869 - val_accuracy: 0.9285
Epoch 489/4000
88/88 - 0s - loss: 0.1961 - accuracy: 0.9248 - val_loss: 0.1877 - val_accuracy: 0.9279
Epoch 490/4000
88/88 - 0s - loss: 0.1943 - accuracy: 0.9250 - val_loss: 0.1894 - val_accuracy: 0.9263
Epoch 491/4000
88/88 - 0s - loss: 0.1942 - accuracy: 0.9255 - val_loss: 0.1871 - val_accuracy: 0.9276
Epoch 492/4000
88/88 - 0s - loss: 0.1957 - accuracy: 0.9241 - val_loss: 0.1900 - val_accuracy: 0.9246
Epoch 493/4000
88/88 - 0s - loss: 0.1942 - accuracy: 0.9249 - val_loss: 0.1876 - val_accuracy: 0.9263
Epoch 494/4000
88/88 - 0s - loss: 0.1954 - accuracy: 0.9247 - val_loss: 0.1871 - val_accuracy: 0.9279
Epoch 495/4000
88/88 - 0s - loss: 0.1956 - accuracy: 0.9250 - val_loss: 0.1875 - val_accuracy: 0.9275
Epoch 496/4000
88/88 - 0s - loss: 0.1948 - accuracy: 0.9246 - val_loss: 0.1868 - val_accuracy: 0.9277
Epoch 497/4000
88/88 - 0s - loss: 0.1949 - accuracy: 0.9252 - val_loss: 0.1869 - val_accuracy: 0.9268
Epoch 498/4000
88/88 - 0s - loss: 0.1934 - accuracy: 0.9252 - val_loss: 0.1874 - val_accuracy: 0.9270
Epoch 499/4000
88/88 - 0s - loss: 0.1963 - accuracy: 0.9243 - val_loss: 0.1868 - val_accuracy: 0.9276
Epoch 500/4000
88/88 - 0s - loss: 0.1959 - accuracy: 0.9250 - val_loss: 0.1863 - val_accuracy: 0.9285
Epoch 501/4000
88/88 - 0s - loss: 0.1948 - accuracy: 0.9250 - val_loss: 0.1878 - val_accuracy: 0.9262
Epoch 502/4000
88/88 - 0s - loss: 0.1945 - accuracy: 0.9248 - val_loss: 0.1857 - val_accuracy: 0.9289
Epoch 503/4000
88/88 - 0s - loss: 0.1938 - accuracy: 0.9252 - val_loss: 0.1857 - val_accuracy: 0.9281
Epoch 504/4000
88/88 - 0s - loss: 0.1938 - accuracy: 0.9253 - val_loss: 0.1863 - val_accuracy: 0.9278
Epoch 505/4000
88/88 - 0s - loss: 0.1935 - accuracy: 0.9252 - val_loss: 0.1866 - val_accuracy: 0.9274
Epoch 506/4000
88/88 - 0s - loss: 0.1941 - accuracy: 0.9249 - val_loss: 0.1903 - val_accuracy: 0.9263
Epoch 507/4000
88/88 - 0s - loss: 0.1950 - accuracy: 0.9246 - val_loss: 0.1901 - val_accuracy: 0.9255
Epoch 508/4000
88/88 - 0s - loss: 0.1943 - accuracy: 0.9255 - val_loss: 0.1861 - val_accuracy: 0.9273
Epoch 509/4000
88/88 - 0s - loss: 0.1930 - accuracy: 0.9256 - val_loss: 0.1866 - val_accuracy: 0.9280
Epoch 510/4000
88/88 - 0s - loss: 0.1940 - accuracy: 0.9251 - val_loss: 0.1864 - val_accuracy: 0.9274
Epoch 511/4000
88/88 - 0s - loss: 0.1943 - accuracy: 0.9255 - val_loss: 0.1870 - val_accuracy: 0.9274
Epoch 512/4000
88/88 - 0s - loss: 0.1946 - accuracy: 0.9253 - val_loss: 0.1892 - val_accuracy: 0.9273
Epoch 513/4000
88/88 - 0s - loss: 0.1945 - accuracy: 0.9247 - val_loss: 0.1861 - val_accuracy: 0.9283
Epoch 514/4000
88/88 - 0s - loss: 0.1942 - accuracy: 0.9251 - val_loss: 0.1870 - val_accuracy: 0.9277
Epoch 515/4000
88/88 - 0s - loss: 0.1946 - accuracy: 0.9247 - val_loss: 0.1874 - val_accuracy: 0.9277
Epoch 516/4000
88/88 - 0s - loss: 0.1937 - accuracy: 0.9255 - val_loss: 0.1869 - val_accuracy: 0.9273
Epoch 517/4000
88/88 - 0s - loss: 0.1941 - accuracy: 0.9252 - val_loss: 0.1864 - val_accuracy: 0.9279
Epoch 518/4000
88/88 - 0s - loss: 0.1938 - accuracy: 0.9252 - val_loss: 0.1867 - val_accuracy: 0.9279
Epoch 519/4000
88/88 - 0s - loss: 0.1953 - accuracy: 0.9253 - val_loss: 0.1891 - val_accuracy: 0.9268
Epoch 520/4000
88/88 - 0s - loss: 0.1950 - accuracy: 0.9250 - val_loss: 0.1864 - val_accuracy: 0.9286
Epoch 521/4000
88/88 - 0s - loss: 0.1932 - accuracy: 0.9256 - val_loss: 0.1875 - val_accuracy: 0.9267
Epoch 522/4000
88/88 - 0s - loss: 0.1944 - accuracy: 0.9257 - val_loss: 0.1892 - val_accuracy: 0.9268
Epoch 523/4000
88/88 - 0s - loss: 0.1940 - accuracy: 0.9249 - val_loss: 0.1867 - val_accuracy: 0.9276
Epoch 524/4000
88/88 - 0s - loss: 0.1941 - accuracy: 0.9252 - val_loss: 0.1872 - val_accuracy: 0.9277
Epoch 525/4000
88/88 - 0s - loss: 0.1950 - accuracy: 0.9251 - val_loss: 0.1861 - val_accuracy: 0.9284
Epoch 526/4000
88/88 - 0s - loss: 0.1937 - accuracy: 0.9250 - val_loss: 0.1868 - val_accuracy: 0.9279
Epoch 527/4000
88/88 - 0s - loss: 0.1942 - accuracy: 0.9249 - val_loss: 0.1860 - val_accuracy: 0.9289
Epoch 528/4000
88/88 - 0s - loss: 0.1944 - accuracy: 0.9249 - val_loss: 0.1866 - val_accuracy: 0.9282
Epoch 529/4000
88/88 - 0s - loss: 0.1932 - accuracy: 0.9250 - val_loss: 0.1886 - val_accuracy: 0.9275
Epoch 530/4000
88/88 - 0s - loss: 0.1928 - accuracy: 0.9253 - val_loss: 0.1868 - val_accuracy: 0.9273
Epoch 531/4000
88/88 - 0s - loss: 0.1945 - accuracy: 0.9250 - val_loss: 0.1871 - val_accuracy: 0.9272
Epoch 532/4000
88/88 - 0s - loss: 0.1928 - accuracy: 0.9261 - val_loss: 0.1858 - val_accuracy: 0.9269
Epoch 533/4000
88/88 - 0s - loss: 0.1932 - accuracy: 0.9261 - val_loss: 0.1875 - val_accuracy: 0.9275
Epoch 534/4000
88/88 - 0s - loss: 0.1937 - accuracy: 0.9254 - val_loss: 0.1871 - val_accuracy: 0.9273
Epoch 535/4000
88/88 - 0s - loss: 0.1935 - accuracy: 0.9253 - val_loss: 0.1874 - val_accuracy: 0.9271
Epoch 536/4000
88/88 - 0s - loss: 0.1936 - accuracy: 0.9252 - val_loss: 0.1852 - val_accuracy: 0.9284
Epoch 537/4000
88/88 - 0s - loss: 0.1937 - accuracy: 0.9258 - val_loss: 0.1859 - val_accuracy: 0.9280
Epoch 538/4000
88/88 - 0s - loss: 0.1932 - accuracy: 0.9260 - val_loss: 0.1867 - val_accuracy: 0.9278
Epoch 539/4000
88/88 - 0s - loss: 0.1923 - accuracy: 0.9258 - val_loss: 0.1861 - val_accuracy: 0.9284
Epoch 540/4000
88/88 - 0s - loss: 0.1944 - accuracy: 0.9252 - val_loss: 0.1895 - val_accuracy: 0.9266
Epoch 541/4000
88/88 - 0s - loss: 0.1931 - accuracy: 0.9261 - val_loss: 0.1856 - val_accuracy: 0.9285
Epoch 542/4000
88/88 - 0s - loss: 0.1922 - accuracy: 0.9261 - val_loss: 0.1860 - val_accuracy: 0.9282
Epoch 543/4000
88/88 - 0s - loss: 0.1934 - accuracy: 0.9256 - val_loss: 0.1859 - val_accuracy: 0.9285
Epoch 544/4000
88/88 - 0s - loss: 0.1917 - accuracy: 0.9260 - val_loss: 0.1863 - val_accuracy: 0.9279
Epoch 545/4000
88/88 - 0s - loss: 0.1926 - accuracy: 0.9258 - val_loss: 0.1880 - val_accuracy: 0.9274
Epoch 546/4000
88/88 - 0s - loss: 0.1921 - accuracy: 0.9259 - val_loss: 0.1859 - val_accuracy: 0.9283
Epoch 547/4000
88/88 - 0s - loss: 0.1921 - accuracy: 0.9257 - val_loss: 0.1860 - val_accuracy: 0.9292
Epoch 548/4000
88/88 - 0s - loss: 0.1927 - accuracy: 0.9256 - val_loss: 0.1878 - val_accuracy: 0.9279
Epoch 549/4000
88/88 - 0s - loss: 0.1930 - accuracy: 0.9253 - val_loss: 0.1867 - val_accuracy: 0.9274
Epoch 550/4000
88/88 - 0s - loss: 0.1915 - accuracy: 0.9260 - val_loss: 0.1878 - val_accuracy: 0.9275
Epoch 551/4000
88/88 - 0s - loss: 0.1917 - accuracy: 0.9255 - val_loss: 0.1865 - val_accuracy: 0.9288
Epoch 552/4000
88/88 - 0s - loss: 0.1926 - accuracy: 0.9255 - val_loss: 0.1893 - val_accuracy: 0.9276
Epoch 553/4000
88/88 - 0s - loss: 0.1913 - accuracy: 0.9260 - val_loss: 0.1857 - val_accuracy: 0.9285
Epoch 554/4000
88/88 - 0s - loss: 0.1911 - accuracy: 0.9260 - val_loss: 0.1866 - val_accuracy: 0.9285
Epoch 555/4000
88/88 - 0s - loss: 0.1929 - accuracy: 0.9260 - val_loss: 0.1856 - val_accuracy: 0.9281
Epoch 556/4000
88/88 - 0s - loss: 0.1923 - accuracy: 0.9261 - val_loss: 0.1858 - val_accuracy: 0.9290
Epoch 557/4000
88/88 - 0s - loss: 0.1916 - accuracy: 0.9255 - val_loss: 0.1861 - val_accuracy: 0.9274
Epoch 558/4000
88/88 - 0s - loss: 0.1920 - accuracy: 0.9258 - val_loss: 0.1856 - val_accuracy: 0.9281
Epoch 559/4000
88/88 - 0s - loss: 0.1920 - accuracy: 0.9258 - val_loss: 0.1877 - val_accuracy: 0.9264
Epoch 560/4000
88/88 - 0s - loss: 0.1927 - accuracy: 0.9263 - val_loss: 0.1876 - val_accuracy: 0.9284
Epoch 561/4000
88/88 - 0s - loss: 0.1921 - accuracy: 0.9261 - val_loss: 0.1874 - val_accuracy: 0.9278
Epoch 562/4000
88/88 - 0s - loss: 0.1902 - accuracy: 0.9265 - val_loss: 0.1849 - val_accuracy: 0.9295
Epoch 563/4000
88/88 - 0s - loss: 0.1923 - accuracy: 0.9255 - val_loss: 0.1872 - val_accuracy: 0.9288
Epoch 564/4000
88/88 - 0s - loss: 0.1915 - accuracy: 0.9262 - val_loss: 0.1855 - val_accuracy: 0.9291
Epoch 565/4000
88/88 - 0s - loss: 0.1923 - accuracy: 0.9253 - val_loss: 0.1866 - val_accuracy: 0.9285
Epoch 566/4000
88/88 - 0s - loss: 0.1916 - accuracy: 0.9259 - val_loss: 0.1877 - val_accuracy: 0.9288
Epoch 567/4000
88/88 - 0s - loss: 0.1921 - accuracy: 0.9260 - val_loss: 0.1881 - val_accuracy: 0.9279
Epoch 568/4000
88/88 - 0s - loss: 0.1912 - accuracy: 0.9263 - val_loss: 0.1856 - val_accuracy: 0.9291
Epoch 569/4000
88/88 - 0s - loss: 0.1927 - accuracy: 0.9262 - val_loss: 0.1879 - val_accuracy: 0.9283
Epoch 570/4000
88/88 - 0s - loss: 0.1913 - accuracy: 0.9265 - val_loss: 0.1874 - val_accuracy: 0.9277
Epoch 571/4000
88/88 - 0s - loss: 0.1916 - accuracy: 0.9265 - val_loss: 0.1864 - val_accuracy: 0.9281
Epoch 572/4000
88/88 - 0s - loss: 0.1919 - accuracy: 0.9263 - val_loss: 0.1868 - val_accuracy: 0.9276
Epoch 573/4000
88/88 - 0s - loss: 0.1913 - accuracy: 0.9267 - val_loss: 0.1843 - val_accuracy: 0.9288
Epoch 574/4000
88/88 - 0s - loss: 0.1912 - accuracy: 0.9259 - val_loss: 0.1859 - val_accuracy: 0.9287
Epoch 575/4000
88/88 - 0s - loss: 0.1912 - accuracy: 0.9255 - val_loss: 0.1857 - val_accuracy: 0.9284
Epoch 576/4000
88/88 - 0s - loss: 0.1919 - accuracy: 0.9259 - val_loss: 0.1858 - val_accuracy: 0.9282
Epoch 577/4000
88/88 - 0s - loss: 0.1915 - accuracy: 0.9261 - val_loss: 0.1854 - val_accuracy: 0.9284
Epoch 578/4000
88/88 - 0s - loss: 0.1905 - accuracy: 0.9261 - val_loss: 0.1877 - val_accuracy: 0.9277
Epoch 579/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9263 - val_loss: 0.1866 - val_accuracy: 0.9279
Epoch 580/4000
88/88 - 0s - loss: 0.1909 - accuracy: 0.9264 - val_loss: 0.1876 - val_accuracy: 0.9280
Epoch 581/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9263 - val_loss: 0.1865 - val_accuracy: 0.9279
Epoch 582/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9267 - val_loss: 0.1875 - val_accuracy: 0.9274
Epoch 583/4000
88/88 - 0s - loss: 0.1910 - accuracy: 0.9266 - val_loss: 0.1858 - val_accuracy: 0.9280
Epoch 584/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9268 - val_loss: 0.1876 - val_accuracy: 0.9278
Epoch 585/4000
88/88 - 0s - loss: 0.1910 - accuracy: 0.9264 - val_loss: 0.1863 - val_accuracy: 0.9288
Epoch 586/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9263 - val_loss: 0.1875 - val_accuracy: 0.9273
Epoch 587/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9273 - val_loss: 0.1868 - val_accuracy: 0.9290
Epoch 588/4000
88/88 - 0s - loss: 0.1920 - accuracy: 0.9260 - val_loss: 0.1859 - val_accuracy: 0.9293
Epoch 589/4000
88/88 - 0s - loss: 0.1907 - accuracy: 0.9263 - val_loss: 0.1856 - val_accuracy: 0.9291
Epoch 590/4000
88/88 - 0s - loss: 0.1908 - accuracy: 0.9262 - val_loss: 0.1853 - val_accuracy: 0.9297
Epoch 591/4000
88/88 - 0s - loss: 0.1915 - accuracy: 0.9259 - val_loss: 0.1861 - val_accuracy: 0.9281
Epoch 592/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9269 - val_loss: 0.1857 - val_accuracy: 0.9289
Epoch 593/4000
88/88 - 0s - loss: 0.1918 - accuracy: 0.9263 - val_loss: 0.1869 - val_accuracy: 0.9275
Epoch 594/4000
88/88 - 0s - loss: 0.1907 - accuracy: 0.9254 - val_loss: 0.1859 - val_accuracy: 0.9282
Epoch 595/4000
88/88 - 0s - loss: 0.1911 - accuracy: 0.9260 - val_loss: 0.1856 - val_accuracy: 0.9288
Epoch 596/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9270 - val_loss: 0.1856 - val_accuracy: 0.9283
Epoch 597/4000
88/88 - 0s - loss: 0.1897 - accuracy: 0.9268 - val_loss: 0.1846 - val_accuracy: 0.9289
Epoch 598/4000
88/88 - 0s - loss: 0.1894 - accuracy: 0.9269 - val_loss: 0.1854 - val_accuracy: 0.9292
Epoch 599/4000
88/88 - 0s - loss: 0.1905 - accuracy: 0.9267 - val_loss: 0.1884 - val_accuracy: 0.9289
Epoch 600/4000
88/88 - 0s - loss: 0.1902 - accuracy: 0.9266 - val_loss: 0.1862 - val_accuracy: 0.9279
Epoch 601/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9264 - val_loss: 0.1856 - val_accuracy: 0.9284
Epoch 602/4000
88/88 - 0s - loss: 0.1899 - accuracy: 0.9269 - val_loss: 0.1849 - val_accuracy: 0.9279
Epoch 603/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9264 - val_loss: 0.1856 - val_accuracy: 0.9279
Epoch 604/4000
88/88 - 0s - loss: 0.1897 - accuracy: 0.9263 - val_loss: 0.1862 - val_accuracy: 0.9290
Epoch 605/4000
88/88 - 0s - loss: 0.1911 - accuracy: 0.9256 - val_loss: 0.1858 - val_accuracy: 0.9292
Epoch 606/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9270 - val_loss: 0.1862 - val_accuracy: 0.9278
Epoch 607/4000
88/88 - 0s - loss: 0.1896 - accuracy: 0.9264 - val_loss: 0.1854 - val_accuracy: 0.9290
Epoch 608/4000
88/88 - 0s - loss: 0.1903 - accuracy: 0.9267 - val_loss: 0.1856 - val_accuracy: 0.9287
Epoch 609/4000
88/88 - 0s - loss: 0.1890 - accuracy: 0.9268 - val_loss: 0.1885 - val_accuracy: 0.9264
Epoch 610/4000
88/88 - 0s - loss: 0.1907 - accuracy: 0.9267 - val_loss: 0.1849 - val_accuracy: 0.9293
Epoch 611/4000
88/88 - 0s - loss: 0.1904 - accuracy: 0.9267 - val_loss: 0.1854 - val_accuracy: 0.9291
Epoch 612/4000
88/88 - 0s - loss: 0.1897 - accuracy: 0.9271 - val_loss: 0.1881 - val_accuracy: 0.9270
Epoch 613/4000
88/88 - 0s - loss: 0.1896 - accuracy: 0.9269 - val_loss: 0.1870 - val_accuracy: 0.9284
Epoch 614/4000
88/88 - 0s - loss: 0.1895 - accuracy: 0.9269 - val_loss: 0.1856 - val_accuracy: 0.9279
Epoch 615/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9260 - val_loss: 0.1879 - val_accuracy: 0.9277
Epoch 616/4000
88/88 - 0s - loss: 0.1902 - accuracy: 0.9265 - val_loss: 0.1843 - val_accuracy: 0.9292
Epoch 617/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9264 - val_loss: 0.1847 - val_accuracy: 0.9292
Epoch 618/4000
88/88 - 0s - loss: 0.1907 - accuracy: 0.9268 - val_loss: 0.1848 - val_accuracy: 0.9283
Epoch 619/4000
88/88 - 0s - loss: 0.1901 - accuracy: 0.9269 - val_loss: 0.1877 - val_accuracy: 0.9271
Epoch 620/4000
88/88 - 0s - loss: 0.1893 - accuracy: 0.9271 - val_loss: 0.1851 - val_accuracy: 0.9284
Epoch 621/4000
88/88 - 0s - loss: 0.1894 - accuracy: 0.9272 - val_loss: 0.1859 - val_accuracy: 0.9284
Epoch 622/4000
88/88 - 0s - loss: 0.1906 - accuracy: 0.9270 - val_loss: 0.1864 - val_accuracy: 0.9281
Epoch 623/4000
88/88 - 0s - loss: 0.1894 - accuracy: 0.9268 - val_loss: 0.1851 - val_accuracy: 0.9292
Epoch 624/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9263 - val_loss: 0.1844 - val_accuracy: 0.9285
Epoch 625/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9268 - val_loss: 0.1859 - val_accuracy: 0.9283
Epoch 626/4000
88/88 - 0s - loss: 0.1890 - accuracy: 0.9268 - val_loss: 0.1856 - val_accuracy: 0.9288
Epoch 627/4000
88/88 - 0s - loss: 0.1899 - accuracy: 0.9268 - val_loss: 0.1849 - val_accuracy: 0.9288
Epoch 628/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9273 - val_loss: 0.1853 - val_accuracy: 0.9284
Epoch 629/4000
88/88 - 0s - loss: 0.1897 - accuracy: 0.9271 - val_loss: 0.1856 - val_accuracy: 0.9287
Epoch 630/4000
88/88 - 0s - loss: 0.1882 - accuracy: 0.9277 - val_loss: 0.1846 - val_accuracy: 0.9287
Epoch 631/4000
88/88 - 0s - loss: 0.1884 - accuracy: 0.9278 - val_loss: 0.1860 - val_accuracy: 0.9281
Epoch 632/4000
88/88 - 0s - loss: 0.1877 - accuracy: 0.9264 - val_loss: 0.1861 - val_accuracy: 0.9286
Epoch 633/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9271 - val_loss: 0.1851 - val_accuracy: 0.9283
Epoch 634/4000
88/88 - 0s - loss: 0.1890 - accuracy: 0.9270 - val_loss: 0.1841 - val_accuracy: 0.9296
Epoch 635/4000
88/88 - 0s - loss: 0.1887 - accuracy: 0.9267 - val_loss: 0.1857 - val_accuracy: 0.9277
Epoch 636/4000
88/88 - 0s - loss: 0.1891 - accuracy: 0.9263 - val_loss: 0.1872 - val_accuracy: 0.9278
Epoch 637/4000
88/88 - 0s - loss: 0.1887 - accuracy: 0.9270 - val_loss: 0.1852 - val_accuracy: 0.9291
Epoch 638/4000
88/88 - 0s - loss: 0.1890 - accuracy: 0.9269 - val_loss: 0.1861 - val_accuracy: 0.9297
Epoch 639/4000
88/88 - 0s - loss: 0.1885 - accuracy: 0.9267 - val_loss: 0.1853 - val_accuracy: 0.9286
Epoch 640/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9273 - val_loss: 0.1861 - val_accuracy: 0.9282
Epoch 641/4000
88/88 - 0s - loss: 0.1884 - accuracy: 0.9276 - val_loss: 0.1871 - val_accuracy: 0.9270
Epoch 642/4000
88/88 - 0s - loss: 0.1885 - accuracy: 0.9271 - val_loss: 0.1849 - val_accuracy: 0.9289
Epoch 643/4000
88/88 - 0s - loss: 0.1884 - accuracy: 0.9276 - val_loss: 0.1850 - val_accuracy: 0.9292
Epoch 644/4000
88/88 - 0s - loss: 0.1894 - accuracy: 0.9267 - val_loss: 0.1848 - val_accuracy: 0.9292
Epoch 645/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9272 - val_loss: 0.1858 - val_accuracy: 0.9286
Epoch 646/4000
88/88 - 0s - loss: 0.1886 - accuracy: 0.9271 - val_loss: 0.1864 - val_accuracy: 0.9271
Epoch 647/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9272 - val_loss: 0.1850 - val_accuracy: 0.9289
Epoch 648/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9276 - val_loss: 0.1875 - val_accuracy: 0.9277
Epoch 649/4000
88/88 - 0s - loss: 0.1900 - accuracy: 0.9267 - val_loss: 0.1854 - val_accuracy: 0.9287
Epoch 650/4000
88/88 - 0s - loss: 0.1878 - accuracy: 0.9274 - val_loss: 0.1875 - val_accuracy: 0.9281
Epoch 651/4000
88/88 - 0s - loss: 0.1895 - accuracy: 0.9266 - val_loss: 0.1850 - val_accuracy: 0.9294
Epoch 652/4000
88/88 - 0s - loss: 0.1875 - accuracy: 0.9271 - val_loss: 0.1868 - val_accuracy: 0.9284
Epoch 653/4000
88/88 - 0s - loss: 0.1874 - accuracy: 0.9270 - val_loss: 0.1849 - val_accuracy: 0.9294
Epoch 654/4000
88/88 - 0s - loss: 0.1879 - accuracy: 0.9272 - val_loss: 0.1884 - val_accuracy: 0.9275
Epoch 655/4000
88/88 - 0s - loss: 0.1877 - accuracy: 0.9273 - val_loss: 0.1857 - val_accuracy: 0.9289
Epoch 656/4000
88/88 - 0s - loss: 0.1878 - accuracy: 0.9271 - val_loss: 0.1850 - val_accuracy: 0.9293
Epoch 657/4000
88/88 - 0s - loss: 0.1882 - accuracy: 0.9272 - val_loss: 0.1874 - val_accuracy: 0.9271
Epoch 658/4000
88/88 - 0s - loss: 0.1874 - accuracy: 0.9274 - val_loss: 0.1851 - val_accuracy: 0.9287
Epoch 659/4000
88/88 - 0s - loss: 0.1892 - accuracy: 0.9277 - val_loss: 0.1855 - val_accuracy: 0.9274
Epoch 660/4000
88/88 - 0s - loss: 0.1873 - accuracy: 0.9273 - val_loss: 0.1860 - val_accuracy: 0.9288
Epoch 661/4000
88/88 - 0s - loss: 0.1885 - accuracy: 0.9269 - val_loss: 0.1854 - val_accuracy: 0.9287
Epoch 662/4000
88/88 - 0s - loss: 0.1878 - accuracy: 0.9274 - val_loss: 0.1859 - val_accuracy: 0.9288
Epoch 663/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9274 - val_loss: 0.1845 - val_accuracy: 0.9283
Epoch 664/4000
88/88 - 0s - loss: 0.1880 - accuracy: 0.9282 - val_loss: 0.1843 - val_accuracy: 0.9285
Epoch 665/4000
88/88 - 0s - loss: 0.1872 - accuracy: 0.9271 - val_loss: 0.1857 - val_accuracy: 0.9285
Epoch 666/4000
88/88 - 0s - loss: 0.1887 - accuracy: 0.9271 - val_loss: 0.1855 - val_accuracy: 0.9291
Epoch 667/4000
88/88 - 0s - loss: 0.1874 - accuracy: 0.9278 - val_loss: 0.1850 - val_accuracy: 0.9296
Epoch 668/4000
88/88 - 0s - loss: 0.1889 - accuracy: 0.9272 - val_loss: 0.1882 - val_accuracy: 0.9273
Epoch 669/4000
88/88 - 0s - loss: 0.1877 - accuracy: 0.9268 - val_loss: 0.1858 - val_accuracy: 0.9296
Epoch 670/4000
88/88 - 0s - loss: 0.1875 - accuracy: 0.9278 - val_loss: 0.1888 - val_accuracy: 0.9267
Epoch 671/4000
88/88 - 0s - loss: 0.1882 - accuracy: 0.9275 - val_loss: 0.1892 - val_accuracy: 0.9272
Epoch 672/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9273 - val_loss: 0.1843 - val_accuracy: 0.9293
Epoch 673/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9270 - val_loss: 0.1856 - val_accuracy: 0.9289
Epoch 674/4000
88/88 - 0s - loss: 0.1883 - accuracy: 0.9276 - val_loss: 0.1848 - val_accuracy: 0.9285
Epoch 675/4000
88/88 - 0s - loss: 0.1879 - accuracy: 0.9275 - val_loss: 0.1843 - val_accuracy: 0.9295
Epoch 676/4000
88/88 - 0s - loss: 0.1871 - accuracy: 0.9277 - val_loss: 0.1862 - val_accuracy: 0.9285
Epoch 677/4000
88/88 - 0s - loss: 0.1884 - accuracy: 0.9273 - val_loss: 0.1871 - val_accuracy: 0.9285
Epoch 678/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9274 - val_loss: 0.1847 - val_accuracy: 0.9294
Epoch 679/4000
88/88 - 0s - loss: 0.1882 - accuracy: 0.9273 - val_loss: 0.1911 - val_accuracy: 0.9271
Epoch 680/4000
88/88 - 0s - loss: 0.1881 - accuracy: 0.9278 - val_loss: 0.1853 - val_accuracy: 0.9290
Epoch 681/4000
88/88 - 0s - loss: 0.1873 - accuracy: 0.9275 - val_loss: 0.1859 - val_accuracy: 0.9279
Epoch 682/4000
88/88 - 0s - loss: 0.1872 - accuracy: 0.9274 - val_loss: 0.1844 - val_accuracy: 0.9299
Epoch 683/4000
88/88 - 0s - loss: 0.1877 - accuracy: 0.9277 - val_loss: 0.1845 - val_accuracy: 0.9290
Epoch 684/4000
88/88 - 0s - loss: 0.1873 - accuracy: 0.9276 - val_loss: 0.1880 - val_accuracy: 0.9273
Epoch 685/4000
88/88 - 0s - loss: 0.1880 - accuracy: 0.9272 - val_loss: 0.1850 - val_accuracy: 0.9288
Epoch 686/4000
88/88 - 0s - loss: 0.1868 - accuracy: 0.9280 - val_loss: 0.1933 - val_accuracy: 0.9263
Epoch 687/4000
88/88 - 0s - loss: 0.1880 - accuracy: 0.9273 - val_loss: 0.1840 - val_accuracy: 0.9290
Epoch 688/4000
88/88 - 0s - loss: 0.1872 - accuracy: 0.9274 - val_loss: 0.1841 - val_accuracy: 0.9291
Epoch 689/4000
88/88 - 0s - loss: 0.1874 - accuracy: 0.9279 - val_loss: 0.1888 - val_accuracy: 0.9275
Epoch 690/4000
88/88 - 0s - loss: 0.1857 - accuracy: 0.9285 - val_loss: 0.1863 - val_accuracy: 0.9286
Epoch 691/4000
88/88 - 0s - loss: 0.1875 - accuracy: 0.9275 - val_loss: 0.1859 - val_accuracy: 0.9284
Epoch 692/4000
88/88 - 0s - loss: 0.1870 - accuracy: 0.9284 - val_loss: 0.1852 - val_accuracy: 0.9288
Epoch 693/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9281 - val_loss: 0.1850 - val_accuracy: 0.9291
Epoch 694/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9282 - val_loss: 0.1874 - val_accuracy: 0.9277
Epoch 695/4000
88/88 - 0s - loss: 0.1863 - accuracy: 0.9279 - val_loss: 0.1853 - val_accuracy: 0.9280
Epoch 696/4000
88/88 - 0s - loss: 0.1869 - accuracy: 0.9280 - val_loss: 0.1849 - val_accuracy: 0.9291
Epoch 697/4000
88/88 - 0s - loss: 0.1877 - accuracy: 0.9271 - val_loss: 0.1858 - val_accuracy: 0.9279
Epoch 698/4000
88/88 - 0s - loss: 0.1863 - accuracy: 0.9283 - val_loss: 0.1858 - val_accuracy: 0.9285
Epoch 699/4000
88/88 - 0s - loss: 0.1871 - accuracy: 0.9273 - val_loss: 0.1846 - val_accuracy: 0.9291
Epoch 700/4000
88/88 - 0s - loss: 0.1869 - accuracy: 0.9277 - val_loss: 0.1848 - val_accuracy: 0.9289
Epoch 701/4000
88/88 - 0s - loss: 0.1864 - accuracy: 0.9280 - val_loss: 0.1854 - val_accuracy: 0.9289
Epoch 702/4000
88/88 - 0s - loss: 0.1870 - accuracy: 0.9278 - val_loss: 0.1888 - val_accuracy: 0.9273
Epoch 703/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9277 - val_loss: 0.1854 - val_accuracy: 0.9287
Epoch 704/4000
88/88 - 0s - loss: 0.1872 - accuracy: 0.9274 - val_loss: 0.1862 - val_accuracy: 0.9282
Epoch 705/4000
88/88 - 0s - loss: 0.1868 - accuracy: 0.9277 - val_loss: 0.1887 - val_accuracy: 0.9280
Epoch 706/4000
88/88 - 0s - loss: 0.1860 - accuracy: 0.9279 - val_loss: 0.1855 - val_accuracy: 0.9285
Epoch 707/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9281 - val_loss: 0.1844 - val_accuracy: 0.9299
Epoch 708/4000
88/88 - 0s - loss: 0.1859 - accuracy: 0.9283 - val_loss: 0.1837 - val_accuracy: 0.9294
Epoch 709/4000
88/88 - 0s - loss: 0.1867 - accuracy: 0.9282 - val_loss: 0.1847 - val_accuracy: 0.9284
Epoch 710/4000
88/88 - 0s - loss: 0.1855 - accuracy: 0.9282 - val_loss: 0.1844 - val_accuracy: 0.9284
Epoch 711/4000
88/88 - 0s - loss: 0.1854 - accuracy: 0.9283 - val_loss: 0.1845 - val_accuracy: 0.9289
Epoch 712/4000
88/88 - 0s - loss: 0.1857 - accuracy: 0.9282 - val_loss: 0.1847 - val_accuracy: 0.9296
Epoch 713/4000
88/88 - 0s - loss: 0.1862 - accuracy: 0.9280 - val_loss: 0.1877 - val_accuracy: 0.9276
Epoch 714/4000
88/88 - 0s - loss: 0.1859 - accuracy: 0.9281 - val_loss: 0.1862 - val_accuracy: 0.9286
Epoch 715/4000
88/88 - 0s - loss: 0.1864 - accuracy: 0.9276 - val_loss: 0.1849 - val_accuracy: 0.9299
Epoch 716/4000
88/88 - 0s - loss: 0.1867 - accuracy: 0.9283 - val_loss: 0.1844 - val_accuracy: 0.9288
Epoch 717/4000
88/88 - 0s - loss: 0.1859 - accuracy: 0.9284 - val_loss: 0.1847 - val_accuracy: 0.9295
Epoch 718/4000
88/88 - 0s - loss: 0.1856 - accuracy: 0.9276 - val_loss: 0.1849 - val_accuracy: 0.9297
Epoch 719/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9283 - val_loss: 0.1850 - val_accuracy: 0.9289
Epoch 720/4000
88/88 - 0s - loss: 0.1861 - accuracy: 0.9279 - val_loss: 0.1860 - val_accuracy: 0.9282
Epoch 721/4000
88/88 - 0s - loss: 0.1858 - accuracy: 0.9286 - val_loss: 0.1854 - val_accuracy: 0.9295
Epoch 722/4000
88/88 - 0s - loss: 0.1860 - accuracy: 0.9283 - val_loss: 0.1847 - val_accuracy: 0.9292
Epoch 723/4000
88/88 - 0s - loss: 0.1862 - accuracy: 0.9281 - val_loss: 0.1887 - val_accuracy: 0.9273
Epoch 724/4000
88/88 - 0s - loss: 0.1846 - accuracy: 0.9284 - val_loss: 0.1855 - val_accuracy: 0.9285
Epoch 725/4000
88/88 - 0s - loss: 0.1851 - accuracy: 0.9283 - val_loss: 0.1851 - val_accuracy: 0.9291
Epoch 726/4000
88/88 - 0s - loss: 0.1857 - accuracy: 0.9283 - val_loss: 0.1860 - val_accuracy: 0.9292
Epoch 727/4000
88/88 - 0s - loss: 0.1856 - accuracy: 0.9280 - val_loss: 0.1858 - val_accuracy: 0.9290
Epoch 728/4000
88/88 - 0s - loss: 0.1855 - accuracy: 0.9283 - val_loss: 0.1844 - val_accuracy: 0.9285
Epoch 729/4000
88/88 - 0s - loss: 0.1867 - accuracy: 0.9282 - val_loss: 0.1858 - val_accuracy: 0.9294
Epoch 730/4000
88/88 - 0s - loss: 0.1858 - accuracy: 0.9279 - val_loss: 0.1846 - val_accuracy: 0.9294
Epoch 731/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9291 - val_loss: 0.1847 - val_accuracy: 0.9294
Epoch 732/4000
88/88 - 0s - loss: 0.1856 - accuracy: 0.9285 - val_loss: 0.1849 - val_accuracy: 0.9294
Epoch 733/4000
88/88 - 0s - loss: 0.1866 - accuracy: 0.9278 - val_loss: 0.1851 - val_accuracy: 0.9297
Epoch 734/4000
88/88 - 0s - loss: 0.1860 - accuracy: 0.9282 - val_loss: 0.1864 - val_accuracy: 0.9297
Epoch 735/4000
88/88 - 0s - loss: 0.1855 - accuracy: 0.9289 - val_loss: 0.1835 - val_accuracy: 0.9295
Epoch 736/4000
88/88 - 0s - loss: 0.1854 - accuracy: 0.9283 - val_loss: 0.1847 - val_accuracy: 0.9288
Epoch 737/4000
88/88 - 0s - loss: 0.1848 - accuracy: 0.9283 - val_loss: 0.1859 - val_accuracy: 0.9283
Epoch 738/4000
88/88 - 0s - loss: 0.1862 - accuracy: 0.9274 - val_loss: 0.1851 - val_accuracy: 0.9294
Epoch 739/4000
88/88 - 0s - loss: 0.1843 - accuracy: 0.9294 - val_loss: 0.1860 - val_accuracy: 0.9291
Epoch 740/4000
88/88 - 0s - loss: 0.1846 - accuracy: 0.9283 - val_loss: 0.1841 - val_accuracy: 0.9291
Epoch 741/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9288 - val_loss: 0.1849 - val_accuracy: 0.9288
Epoch 742/4000
88/88 - 0s - loss: 0.1854 - accuracy: 0.9286 - val_loss: 0.1850 - val_accuracy: 0.9288
Epoch 743/4000
88/88 - 0s - loss: 0.1851 - accuracy: 0.9284 - val_loss: 0.1857 - val_accuracy: 0.9288
Epoch 744/4000
88/88 - 0s - loss: 0.1839 - accuracy: 0.9286 - val_loss: 0.1898 - val_accuracy: 0.9262
Epoch 745/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9282 - val_loss: 0.1846 - val_accuracy: 0.9289
Epoch 746/4000
88/88 - 0s - loss: 0.1844 - accuracy: 0.9285 - val_loss: 0.1832 - val_accuracy: 0.9289
Epoch 747/4000
88/88 - 0s - loss: 0.1852 - accuracy: 0.9282 - val_loss: 0.1876 - val_accuracy: 0.9281
Epoch 748/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9283 - val_loss: 0.1840 - val_accuracy: 0.9298
Epoch 749/4000
88/88 - 0s - loss: 0.1852 - accuracy: 0.9283 - val_loss: 0.1853 - val_accuracy: 0.9298
Epoch 750/4000
88/88 - 0s - loss: 0.1841 - accuracy: 0.9288 - val_loss: 0.1871 - val_accuracy: 0.9277
Epoch 751/4000
88/88 - 0s - loss: 0.1841 - accuracy: 0.9288 - val_loss: 0.1859 - val_accuracy: 0.9285
Epoch 752/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9287 - val_loss: 0.1858 - val_accuracy: 0.9286
Epoch 753/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9285 - val_loss: 0.1865 - val_accuracy: 0.9278
Epoch 754/4000
88/88 - 0s - loss: 0.1848 - accuracy: 0.9287 - val_loss: 0.1838 - val_accuracy: 0.9301
Epoch 755/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9288 - val_loss: 0.1843 - val_accuracy: 0.9297
Epoch 756/4000
88/88 - 0s - loss: 0.1846 - accuracy: 0.9284 - val_loss: 0.1851 - val_accuracy: 0.9291
Epoch 757/4000
88/88 - 0s - loss: 0.1851 - accuracy: 0.9285 - val_loss: 0.1848 - val_accuracy: 0.9301
Epoch 758/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9284 - val_loss: 0.1873 - val_accuracy: 0.9281
Epoch 759/4000
88/88 - 0s - loss: 0.1845 - accuracy: 0.9288 - val_loss: 0.1850 - val_accuracy: 0.9291
Epoch 760/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9277 - val_loss: 0.1856 - val_accuracy: 0.9290
Epoch 761/4000
88/88 - 0s - loss: 0.1850 - accuracy: 0.9288 - val_loss: 0.1854 - val_accuracy: 0.9283
Epoch 762/4000
88/88 - 0s - loss: 0.1835 - accuracy: 0.9292 - val_loss: 0.1867 - val_accuracy: 0.9290
Epoch 763/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9288 - val_loss: 0.1850 - val_accuracy: 0.9287
Epoch 764/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9292 - val_loss: 0.1857 - val_accuracy: 0.9282
Epoch 765/4000
88/88 - 0s - loss: 0.1841 - accuracy: 0.9288 - val_loss: 0.1842 - val_accuracy: 0.9289
Epoch 766/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9288 - val_loss: 0.1860 - val_accuracy: 0.9287
Epoch 767/4000
88/88 - 0s - loss: 0.1854 - accuracy: 0.9284 - val_loss: 0.1873 - val_accuracy: 0.9280
Epoch 768/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9288 - val_loss: 0.1886 - val_accuracy: 0.9280
Epoch 769/4000
88/88 - 0s - loss: 0.1840 - accuracy: 0.9290 - val_loss: 0.1854 - val_accuracy: 0.9302
Epoch 770/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9290 - val_loss: 0.1847 - val_accuracy: 0.9290
Epoch 771/4000
88/88 - 0s - loss: 0.1840 - accuracy: 0.9293 - val_loss: 0.1837 - val_accuracy: 0.9296
Epoch 772/4000
88/88 - 0s - loss: 0.1841 - accuracy: 0.9290 - val_loss: 0.1862 - val_accuracy: 0.9283
Epoch 773/4000
88/88 - 0s - loss: 0.1844 - accuracy: 0.9282 - val_loss: 0.1850 - val_accuracy: 0.9299
Epoch 774/4000
88/88 - 0s - loss: 0.1844 - accuracy: 0.9288 - val_loss: 0.1853 - val_accuracy: 0.9280
Epoch 775/4000
88/88 - 0s - loss: 0.1845 - accuracy: 0.9287 - val_loss: 0.1850 - val_accuracy: 0.9283
Epoch 776/4000
88/88 - 0s - loss: 0.1836 - accuracy: 0.9287 - val_loss: 0.1855 - val_accuracy: 0.9288
Epoch 777/4000
88/88 - 0s - loss: 0.1838 - accuracy: 0.9292 - val_loss: 0.1861 - val_accuracy: 0.9290
Epoch 778/4000
88/88 - 0s - loss: 0.1844 - accuracy: 0.9283 - val_loss: 0.1847 - val_accuracy: 0.9294
Epoch 779/4000
88/88 - 0s - loss: 0.1823 - accuracy: 0.9291 - val_loss: 0.1851 - val_accuracy: 0.9282
Epoch 780/4000
88/88 - 0s - loss: 0.1832 - accuracy: 0.9292 - val_loss: 0.1851 - val_accuracy: 0.9288
Epoch 781/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9292 - val_loss: 0.1842 - val_accuracy: 0.9304
Epoch 782/4000
88/88 - 0s - loss: 0.1828 - accuracy: 0.9295 - val_loss: 0.1867 - val_accuracy: 0.9282
Epoch 783/4000
88/88 - 0s - loss: 0.1844 - accuracy: 0.9289 - val_loss: 0.1846 - val_accuracy: 0.9295
Epoch 784/4000
88/88 - 0s - loss: 0.1842 - accuracy: 0.9289 - val_loss: 0.1869 - val_accuracy: 0.9279
Epoch 785/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9288 - val_loss: 0.1853 - val_accuracy: 0.9288
Epoch 786/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9289 - val_loss: 0.1862 - val_accuracy: 0.9287
Epoch 787/4000
88/88 - 0s - loss: 0.1842 - accuracy: 0.9289 - val_loss: 0.1842 - val_accuracy: 0.9297
Epoch 788/4000
88/88 - 0s - loss: 0.1848 - accuracy: 0.9292 - val_loss: 0.1876 - val_accuracy: 0.9290
Epoch 789/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9291 - val_loss: 0.1850 - val_accuracy: 0.9283
Epoch 790/4000
88/88 - 0s - loss: 0.1826 - accuracy: 0.9299 - val_loss: 0.1842 - val_accuracy: 0.9303
Epoch 791/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9294 - val_loss: 0.1854 - val_accuracy: 0.9283
Epoch 792/4000
88/88 - 0s - loss: 0.1843 - accuracy: 0.9290 - val_loss: 0.1842 - val_accuracy: 0.9288
Epoch 793/4000
88/88 - 0s - loss: 0.1827 - accuracy: 0.9291 - val_loss: 0.1840 - val_accuracy: 0.9289
Epoch 794/4000
88/88 - 0s - loss: 0.1834 - accuracy: 0.9290 - val_loss: 0.1881 - val_accuracy: 0.9274
Epoch 795/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9295 - val_loss: 0.1847 - val_accuracy: 0.9293
Epoch 796/4000
88/88 - 0s - loss: 0.1840 - accuracy: 0.9287 - val_loss: 0.1850 - val_accuracy: 0.9289
Epoch 797/4000
88/88 - 0s - loss: 0.1837 - accuracy: 0.9284 - val_loss: 0.1852 - val_accuracy: 0.9288
Epoch 798/4000
88/88 - 0s - loss: 0.1826 - accuracy: 0.9295 - val_loss: 0.1870 - val_accuracy: 0.9289
Epoch 799/4000
88/88 - 0s - loss: 0.1825 - accuracy: 0.9289 - val_loss: 0.1847 - val_accuracy: 0.9290
Epoch 800/4000
88/88 - 0s - loss: 0.1832 - accuracy: 0.9292 - val_loss: 0.1880 - val_accuracy: 0.9278
Epoch 801/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9290 - val_loss: 0.1842 - val_accuracy: 0.9296
Epoch 802/4000
88/88 - 0s - loss: 0.1828 - accuracy: 0.9292 - val_loss: 0.1854 - val_accuracy: 0.9296
Epoch 803/4000
88/88 - 0s - loss: 0.1830 - accuracy: 0.9294 - val_loss: 0.1835 - val_accuracy: 0.9294
Epoch 804/4000
88/88 - 0s - loss: 0.1818 - accuracy: 0.9298 - val_loss: 0.1858 - val_accuracy: 0.9296
Epoch 805/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9285 - val_loss: 0.1858 - val_accuracy: 0.9285
Epoch 806/4000
88/88 - 0s - loss: 0.1838 - accuracy: 0.9286 - val_loss: 0.1852 - val_accuracy: 0.9289
Epoch 807/4000
88/88 - 0s - loss: 0.1847 - accuracy: 0.9284 - val_loss: 0.1839 - val_accuracy: 0.9288
Epoch 808/4000
88/88 - 1s - loss: 0.1826 - accuracy: 0.9293 - val_loss: 0.1856 - val_accuracy: 0.9286
Epoch 809/4000
88/88 - 0s - loss: 0.1822 - accuracy: 0.9295 - val_loss: 0.1849 - val_accuracy: 0.9286
Epoch 810/4000
88/88 - 0s - loss: 0.1826 - accuracy: 0.9292 - val_loss: 0.1847 - val_accuracy: 0.9285
Epoch 811/4000
88/88 - 0s - loss: 0.1823 - accuracy: 0.9296 - val_loss: 0.1838 - val_accuracy: 0.9296
Epoch 812/4000
88/88 - 0s - loss: 0.1822 - accuracy: 0.9293 - val_loss: 0.1850 - val_accuracy: 0.9287
Epoch 813/4000
88/88 - 0s - loss: 0.1839 - accuracy: 0.9287 - val_loss: 0.1848 - val_accuracy: 0.9293
Epoch 814/4000
88/88 - 0s - loss: 0.1828 - accuracy: 0.9292 - val_loss: 0.1861 - val_accuracy: 0.9293
Epoch 815/4000
88/88 - 0s - loss: 0.1829 - accuracy: 0.9295 - val_loss: 0.1836 - val_accuracy: 0.9295
Epoch 816/4000
88/88 - 0s - loss: 0.1820 - accuracy: 0.9292 - val_loss: 0.1868 - val_accuracy: 0.9290
Epoch 817/4000
88/88 - 0s - loss: 0.1837 - accuracy: 0.9287 - val_loss: 0.1892 - val_accuracy: 0.9268
Epoch 818/4000
88/88 - 0s - loss: 0.1827 - accuracy: 0.9293 - val_loss: 0.1865 - val_accuracy: 0.9282
Epoch 819/4000
88/88 - 0s - loss: 0.1826 - accuracy: 0.9289 - val_loss: 0.1850 - val_accuracy: 0.9286
Epoch 820/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9292 - val_loss: 0.1871 - val_accuracy: 0.9277
Epoch 821/4000
88/88 - 0s - loss: 0.1823 - accuracy: 0.9294 - val_loss: 0.1850 - val_accuracy: 0.9292
Epoch 822/4000
88/88 - 0s - loss: 0.1821 - accuracy: 0.9294 - val_loss: 0.1855 - val_accuracy: 0.9287
Epoch 823/4000
88/88 - 0s - loss: 0.1818 - accuracy: 0.9301 - val_loss: 0.1858 - val_accuracy: 0.9275
Epoch 824/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9290 - val_loss: 0.1848 - val_accuracy: 0.9295
Epoch 825/4000
88/88 - 0s - loss: 0.1825 - accuracy: 0.9291 - val_loss: 0.1864 - val_accuracy: 0.9290
Epoch 826/4000
88/88 - 0s - loss: 0.1822 - accuracy: 0.9297 - val_loss: 0.1839 - val_accuracy: 0.9306
Epoch 827/4000
88/88 - 0s - loss: 0.1814 - accuracy: 0.9299 - val_loss: 0.1848 - val_accuracy: 0.9297
Epoch 828/4000
88/88 - 0s - loss: 0.1826 - accuracy: 0.9301 - val_loss: 0.1836 - val_accuracy: 0.9289
Epoch 829/4000
88/88 - 0s - loss: 0.1816 - accuracy: 0.9293 - val_loss: 0.1843 - val_accuracy: 0.9290
Epoch 830/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9295 - val_loss: 0.1855 - val_accuracy: 0.9291
Epoch 831/4000
88/88 - 0s - loss: 0.1819 - accuracy: 0.9295 - val_loss: 0.1843 - val_accuracy: 0.9290
Epoch 832/4000
88/88 - 0s - loss: 0.1818 - accuracy: 0.9299 - val_loss: 0.1850 - val_accuracy: 0.9281
Epoch 833/4000
88/88 - 0s - loss: 0.1820 - accuracy: 0.9295 - val_loss: 0.1838 - val_accuracy: 0.9290
Epoch 834/4000
88/88 - 0s - loss: 0.1814 - accuracy: 0.9300 - val_loss: 0.1868 - val_accuracy: 0.9277
Epoch 835/4000
88/88 - 0s - loss: 0.1818 - accuracy: 0.9292 - val_loss: 0.1844 - val_accuracy: 0.9296
Epoch 836/4000
88/88 - 0s - loss: 0.1831 - accuracy: 0.9293 - val_loss: 0.1856 - val_accuracy: 0.9283
Epoch 837/4000
88/88 - 0s - loss: 0.1822 - accuracy: 0.9292 - val_loss: 0.1843 - val_accuracy: 0.9289
Epoch 838/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9296 - val_loss: 0.1858 - val_accuracy: 0.9284
Epoch 839/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9299 - val_loss: 0.1852 - val_accuracy: 0.9288
Epoch 840/4000
88/88 - 0s - loss: 0.1810 - accuracy: 0.9294 - val_loss: 0.1844 - val_accuracy: 0.9291
Epoch 841/4000
88/88 - 0s - loss: 0.1816 - accuracy: 0.9296 - val_loss: 0.1888 - val_accuracy: 0.9262
Epoch 842/4000
88/88 - 0s - loss: 0.1824 - accuracy: 0.9297 - val_loss: 0.1869 - val_accuracy: 0.9275
Epoch 843/4000
88/88 - 0s - loss: 0.1813 - accuracy: 0.9295 - val_loss: 0.1869 - val_accuracy: 0.9301
Epoch 844/4000
88/88 - 0s - loss: 0.1808 - accuracy: 0.9297 - val_loss: 0.1836 - val_accuracy: 0.9302
Epoch 845/4000
88/88 - 0s - loss: 0.1814 - accuracy: 0.9298 - val_loss: 0.1868 - val_accuracy: 0.9281
Epoch 846/4000
88/88 - 0s - loss: 0.1827 - accuracy: 0.9290 - val_loss: 0.1918 - val_accuracy: 0.9258
In [10]:
model.save("output/DNNClassifier.h5")
In [11]:
plot_model(model,"output/DNNMod.pdf",show_shapes=True)
In [12]:
plot_model_change(history,fname="output/DNNTraining.pdf")
In [13]:
preds_test = model.predict(dnnx_test,batch_size=2048, verbose = 0)
get_metrics(preds_test.argmax(axis=1), y_test.argmax(axis=1), label_strings)  # prints the metrics and returns them
Identified 27890 correct labels out of 30000 labels
Accuracy: 0.9296666666666666
Precision: 0.930187343825422
Recall: 0.9296465743591963
F1 Score: 0.9297919253175636
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[9490  382  148]
 [ 296 9246  459]
 [ 126  699 9154]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.96      0.95      0.95     10020
           1       0.90      0.92      0.91     10001
           2       0.94      0.92      0.93      9979

    accuracy                           0.93     30000
   macro avg       0.93      0.93      0.93     30000
weighted avg       0.93      0.93      0.93     30000

In [14]:
cm = metrics.confusion_matrix(preds_test.argmax(axis=1), y_test.argmax(axis=1),normalize='true')
df_cm = pd.DataFrame(cm, index = label_strings,columns = label_strings)
plt.figure(figsize = (10,7))
sns.heatmap(df_cm, annot=True,cmap="Blues",square=True,fmt='.2%')
plt.savefig("output/dnn_cm.pdf")
In [15]:
del dnnx_train  # free memory before building the CNN

3. Train a CNN Classifier¶

The main difference between a regular neural network (ANN) and a CNN is that the latter performs convolution operations between a set of filters and its inputs. A convolution layer computes $y_k = \sigma\left(\sum_m w_{km} * x_m + b_k\right)$

where the sum runs over the set of input feature maps $x_m$, '$*$' is the convolution operator, $w_{km}$ are the filter weights, and $b_k$ is the bias. Here, a feature map is the array of output activations obtained after applying the activation function $\sigma$. In a typical CNN, there are three kinds of layers: convolution layers, pooling layers, and fully connected layers. A small worked sketch of the formula follows.
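To make the formula concrete, here is a minimal NumPy sketch (illustrative only, not MargNet's implementation) that computes one output feature map $y_k$ by summing per-channel 2-D convolutions, adding a bias, and applying a ReLU for $\sigma$:

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def conv2d_valid(x, w):
    # 'Valid' 2-D cross-correlation (what deep-learning libraries call
    # convolution) of one input channel x (H, W) with one filter w (kh, kw).
    H, W = x.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def feature_map(x, w_k, b_k):
    # y_k = relu(sum_m w_km * x_m + b_k) for an input x of shape (M, H, W)
    # and one bank of filters w_k of shape (M, kh, kw).
    acc = sum(conv2d_valid(x[m], w_k[m]) for m in range(x.shape[0]))
    return relu(acc + b_k)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 32, 32))    # e.g. a 5-band (u, g, r, i, z) cutout
w_k = rng.normal(size=(5, 3, 3))    # one 3x3 filter per input channel
print(feature_map(x, w_k, b_k=0.1).shape)   # -> (30, 30)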

In [13]:
from PIL import Image
im=Image.open('/Users/atharvabagul/MargNet/CNN-1.png')
im = im.resize((1800, 600), Image.LANCZOS)
display(im)
In [16]:
inp_layer = tf.keras.Input(shape=X_train.shape[1:])

# Stem: a 5x5 convolution over the 5-band input cutouts
mod = Conv2D(filters=64, kernel_size=(5,5), padding='same')(inp_layer)
mod = ReLU()(mod)

# Inception-style block 1: parallel 1x1, 3x3 and 5x5 branches plus a pooling branch
c1 = Conv2D(filters=48, kernel_size=(1,1), padding='same')(mod)
c1 = ReLU()(c1)
c2 = Conv2D(filters=48, kernel_size=(1,1), padding='same')(mod)
c2 = ReLU()(c2)
c3 = Conv2D(filters=48, kernel_size=(1,1), padding='same')(mod)
c3 = ReLU()(c3)
c4 = Conv2D(filters=64, kernel_size=(1,1), padding='same')(c1)
c4 = ReLU()(c4)
c5 = Conv2D(filters=64, kernel_size=(3,3), padding='same')(c1)
c5 = ReLU()(c5)
c6 = Conv2D(filters=64, kernel_size=(5,5), padding='same')(c2)
c6 = ReLU()(c6)
p1 = AveragePooling2D(pool_size=(1, 1))(c3)   # pool_size=(1,1) with default strides is an identity passthrough
mod = concatenate([c4,c5,c6,p1])

c7 = Conv2D(filters=64, kernel_size=(1,1), padding='same')(mod)
c7 = ReLU()(c7)
c8 = Conv2D(filters=64, kernel_size=(1,1), padding='same')(mod)
c8 = ReLU()(c8)
c9 = Conv2D(filters=64, kernel_size=(1,1), padding='same')(mod)
c9 = ReLU()(c9)
c10 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(c7)
c10 = ReLU()(c10)
c11 = Conv2D(filters=92, kernel_size=(3,3), padding='same')(c7)
c11 = ReLU()(c11)
c12 = Conv2D(filters=92, kernel_size=(5,5), padding='same')(c8)
c12 = ReLU()(c12)
p2 = AveragePooling2D(pool_size=(1, 1))(c9)
mod = concatenate([c10,c11,c12,p2])
mod = AveragePooling2D(pool_size=(2, 2))(mod)

c13 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c13 = ReLU()(c13)
c14 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c14 = ReLU()(c14)
c15 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c15 = ReLU()(c15)
c16 = Conv2D(filters=128, kernel_size=(1,1), padding='same')(c13)
c16 = ReLU()(c16)
c17 = Conv2D(filters=128, kernel_size=(3,3), padding='same')(c13)
c17 = ReLU()(c17)
c18 = Conv2D(filters=128, kernel_size=(5,5), padding='same')(c14)
c18 = ReLU()(c18)
p3 = AveragePooling2D(pool_size=(1, 1))(c15)
mod = concatenate([c16,c17,c18,p3])

c19 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c19 = ReLU()(c19)
c20 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c20 = ReLU()(c20)
c21 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c21 = ReLU()(c21)
c22 = Conv2D(filters=128, kernel_size=(1,1), padding='same')(c19)
c22 = ReLU()(c22)
c23 = Conv2D(filters=128, kernel_size=(3,3), padding='same')(c19)
c23 = ReLU()(c23)
c24 = Conv2D(filters=128, kernel_size=(5,5), padding='same')(c20)
c24 = ReLU()(c24)
p4 = AveragePooling2D(pool_size=(1, 1))(c21)
mod = concatenate([c22,c23,c24,p4])
mod = AveragePooling2D(pool_size=(2, 2))(mod)

c25 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c25 = ReLU()(c25)
c26 = Conv2D(filters=92, kernel_size=(1,1), padding='same')(mod)
c26 = ReLU()(c26)
c27 = Conv2D(filters=128, kernel_size=(1,1), padding='same')(mod)
c27 = ReLU()(c27)
c28 = Conv2D(filters=128, kernel_size=(3,3), padding='same')(c25)
c28 = ReLU()(c28)
p5 = AveragePooling2D(pool_size=(1, 1))(c26)
mod = concatenate([c27,c28,p5])
mod = Flatten()(mod)
# Fully connected head; these Dense layers use Keras's default (linear) activation
mod = Dense(1024)(mod)
mod = Dense(1024)(mod)
out_layer = Dense(3, activation="softmax")(mod)
model = tf.keras.Model(inputs=inp_layer, outputs=out_layer)

model.compile(optimizer = 'adam' , loss = "categorical_crossentropy", metrics=["accuracy"])
In [17]:
datagen = ImageDataGenerator(
        featurewise_center=False,  # set input mean to 0 over the dataset
        samplewise_center=False,  # set each sample mean to 0
        featurewise_std_normalization=False,  # divide inputs by std of the dataset
        samplewise_std_normalization=False,  # divide each input by its std
        zca_whitening=False,  # apply ZCA whitening
        rotation_range=180,  # randomly rotate images in the range (degrees, 0 to 180)
        width_shift_range=0.1,  # randomly shift images horizontally (fraction of total width)
        height_shift_range=0.1,  # randomly shift images vertically (fraction of total height)
        horizontal_flip=True,  # randomly flip images
        vertical_flip=True)
datagen.fit(X_train)


es = EarlyStopping(monitor='val_loss', verbose=1, patience=30, restore_best_weights=True)

cb = [es]
/opt/conda/lib/python3.7/site-packages/keras_preprocessing/image/image_data_generator.py:947: UserWarning: Expected input to be images (as Numpy array) following the data format convention "channels_last" (channels on axis 3), i.e. expected either 1, 3 or 4 channels on axis 3. However, it was passed an array with shape (180011, 32, 32, 5) (5 channels).
  ' channels).')
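Note: the warning above is benign. Each SDSS cutout has five channels (one per u, g, r, i, z passband), whereas Keras's image utilities expect 1, 3, or 4; the augmentations used here (rotations, shifts, and flips) act on every channel identically, so they remain valid. Full 180° rotations and both flips are appropriate because sky images have no preferred orientation.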
In [18]:
history = model.fit(datagen.flow(X_train,y_train, batch_size=512),
                              epochs = 300, validation_data = (X_val,y_val),
                              callbacks = cb,
                              verbose = 1)
/opt/conda/lib/python3.7/site-packages/keras_preprocessing/image/numpy_array_iterator.py:136: UserWarning: NumpyArrayIterator is set to use the data format convention "channels_last" (channels on axis 3), i.e. expected either 1, 3, or 4 channels on axis 3. However, it was passed an array with shape (180011, 32, 32, 5) (5 channels).
  str(self.x.shape[channels_axis]) + ' channels).')
Epoch 1/300
352/352 [==============================] - 176s 499ms/step - loss: 0.4619 - accuracy: 0.8153 - val_loss: 0.3253 - val_accuracy: 0.8808
Epoch 2/300
352/352 [==============================] - 176s 500ms/step - loss: 0.3135 - accuracy: 0.8809 - val_loss: 0.2951 - val_accuracy: 0.8865
Epoch 3/300
352/352 [==============================] - 176s 500ms/step - loss: 0.2930 - accuracy: 0.8880 - val_loss: 0.2933 - val_accuracy: 0.8886
Epoch 4/300
352/352 [==============================] - 177s 503ms/step - loss: 0.2802 - accuracy: 0.8925 - val_loss: 0.2729 - val_accuracy: 0.8951
Epoch 5/300
352/352 [==============================] - 178s 507ms/step - loss: 0.2723 - accuracy: 0.8944 - val_loss: 0.2566 - val_accuracy: 0.9001
[... epochs 6-69 elided: training loss falls from 0.264 to 0.191; the best validation loss, 0.2214 (val_accuracy 0.9145), occurs at epoch 41 ...]
Epoch 70/300
352/352 [==============================] - 179s 510ms/step - loss: 0.1903 - accuracy: 0.9235 - val_loss: 0.2296 - val_accuracy: 0.9124
Epoch 71/300
352/352 [==============================] - ETA: 0s - loss: 0.1950 - accuracy: 0.9221Restoring model weights from the end of the best epoch.
352/352 [==============================] - 181s 515ms/step - loss: 0.1950 - accuracy: 0.9221 - val_loss: 0.2510 - val_accuracy: 0.9075
Epoch 00071: early stopping
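The "Restoring model weights from the end of the best epoch" and "early stopping" messages come from an EarlyStopping callback passed to the fit call earlier in the notebook. A minimal sketch consistent with this log (the patience value is an inference from the 30 epochs between the best validation loss at epoch 41 and the stop at epoch 71, not a confirmed setting):

# Reconstruction of the early-stopping callback implied by the log above
estop_cb = tf.keras.callbacks.EarlyStopping(monitor='val_loss',
                                            patience=30,  # inferred, not confirmed
                                            restore_best_weights=True,
                                            verbose=1)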
In [19]:
model.save("output/CNNClassifier.h5")
In [20]:
plot_model(model,"output/CNNMod.pdf",show_shapes=True)
In [21]:
plot_model_change(history,fname="output/CNNTraining.pdf")
In [22]:
preds_test = model.predict(X_test,batch_size=1024, verbose = 0)
_ = get_metrics(preds_test.argmax(axis=1), y_test.argmax(axis=1), label_strings)  # get_metrics prints its report; assigning to _ suppresses the raw tuple it also returns
Identified 27480 correct labels out of 30000 labels
Accuracy: 0.916
Precision: 0.9166998156433928
Recall: 0.915984589578169
F1 Score: 0.9161752872764594
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[9305  498  217]
 [ 353 9135  513]
 [ 184  755 9040]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.95      0.93      0.94     10020
           1       0.88      0.91      0.90     10001
           2       0.93      0.91      0.92      9979

    accuracy                           0.92     30000
   macro avg       0.92      0.92      0.92     30000
weighted avg       0.92      0.92      0.92     30000

In [23]:
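# Note: confusion_matrix is called as (pred, true), so the heatmap rows are predicted
# classes, and normalize='true' therefore normalizes over the predictions.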
cm = metrics.confusion_matrix(preds_test.argmax(axis=1), y_test.argmax(axis=1),normalize='true')
df_cm = pd.DataFrame(cm, index = label_strings,columns = label_strings)
plt.figure(figsize = (10,7))
sns.heatmap(df_cm, annot=True,cmap="Blues",square=True,fmt='.2%')
plt.savefig("cnn_cm.pdf")
In [24]:
del(X_train)

4. Train the Ensemble Classifier¶

  • The CNN classifier and the ANN classifier each end in three output neurons, representing their predicted probabilities for the three classes.
  • The two previously trained models are stacked in parallel, and their outputs are concatenated and fed forward to a fully connected dense layer of 10 neurons.
  • This layer feeds three softmax-activated neurons, which give MargNet's final probability predictions.
  • The base models are frozen, so the trainable head learns how best to combine their predictions into the final output.
  • The ensemble head adds only 103 trainable parameters of its own, out of 25,875,217 parameters in total including the frozen CNN and ANN weights (see the worked count below).
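As a quick check on that parameter count, the ensemble head sees 3 + 3 = 6 concatenated probabilities, so:

# Trainable parameters of the ensemble head (the CNN and ANN weights are frozen)
n_inputs = 3 + 3               # concatenated class probabilities from the two base models
dense_10 = n_inputs * 10 + 10  # weights + biases of the 10-neuron hidden layer = 70
softmax_3 = 10 * 3 + 3         # weights + biases of the 3-neuron output layer = 33
print(dense_10 + softmax_3)    # 103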
In [15]:
im=Image.open('/Users/atharvabagul/MargNet/MargNet_Ensemble.png')
im = im.resize((600, 200), Image.LANCZOS)
display(im)
In [25]:
cnnclassifier = load_model("output/CNNClassifier.h5")
dnnclassifier = load_model("output/DNNClassifier.h5")
In [26]:
def define_stacked_model(members):
    # freeze all layers in both base models so only the ensemble head trains
    for i, member in enumerate(members):
        for layer in member.layers:
            layer.trainable = False
            # rename to avoid a 'unique layer name' clash between the two models
            layer._name = 'ensemble_' + str(i + 1) + '_' + layer.name
    # define multi-headed input
    ensemble_visible = [model.input for model in members]
    # concatenate merge output from each model
    ensemble_outputs = [model.output for model in members]
    merge = tf.keras.layers.concatenate(ensemble_outputs)
    hidden = Dense(10, activation='relu')(merge)
    output = Dense(3, activation='softmax')(hidden)
    model = tf.keras.Model(inputs=ensemble_visible, outputs=output)
    # plot graph of ensemble
    plot_model(model, show_shapes=True, to_file='model_graph.png')
    # compile
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model
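Because every base-model layer is frozen, fitting this stacked model updates only the 103-parameter head; the CNN and ANN weights are used purely for forward passes. The renaming loop works around Keras's requirement that layer names be unique within a single model, since the two base models can share default names such as dense or dense_1.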
In [27]:
# define ensemble model
members = [cnnclassifier,dnnclassifier]
model = define_stacked_model(members)
You must install pydot (`pip install pydot`) and install graphviz (see instructions at https://graphviz.gitlab.io/download/) for plot_model to work.
In [28]:
filepath="output/EnsembleClassifier.h5"

checkpointcb = tf.keras.callbacks.ModelCheckpoint(filepath=filepath,monitor='loss',mode='min',save_best_only=True,verbose=1,save_weights_only=False)
cb = [checkpointcb]
In [29]:
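# The stacked head is fitted on the validation split, which the frozen base
# models were never trained on; since no further hold-out set is passed here,
# the checkpoint above monitors the training loss.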
history = model.fit([X_val, dnnx_val],
                            y_val, epochs=100,
                            batch_size=512,
                            callbacks=cb,
                            verbose=1)
Epoch 1/100
59/59 [==============================] - ETA: 0s - loss: 1.2855 - accuracy: 0.0360
Epoch 00001: loss improved from inf to 1.28553, saving model to EnsembleClassifier.h5
59/59 [==============================] - 7s 113ms/step - loss: 1.2855 - accuracy: 0.0360
Epoch 2/100
59/59 [==============================] - ETA: 0s - loss: 1.0646 - accuracy: 0.4235
Epoch 00002: loss improved from 1.28553 to 1.06456, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 1.0646 - accuracy: 0.4235
Epoch 3/100
59/59 [==============================] - ETA: 0s - loss: 0.8474 - accuracy: 0.9120
Epoch 00003: loss improved from 1.06456 to 0.84736, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.8474 - accuracy: 0.9120
Epoch 4/100
59/59 [==============================] - ETA: 0s - loss: 0.6441 - accuracy: 0.9256
Epoch 00004: loss improved from 0.84736 to 0.64412, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.6441 - accuracy: 0.9256
Epoch 5/100
59/59 [==============================] - ETA: 0s - loss: 0.4765 - accuracy: 0.9279
Epoch 00005: loss improved from 0.64412 to 0.47653, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.4765 - accuracy: 0.9279
Epoch 6/100
59/59 [==============================] - ETA: 0s - loss: 0.3668 - accuracy: 0.9280
Epoch 00006: loss improved from 0.47653 to 0.36678, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 106ms/step - loss: 0.3668 - accuracy: 0.9280
Epoch 7/100
59/59 [==============================] - ETA: 0s - loss: 0.2993 - accuracy: 0.9309
Epoch 00007: loss improved from 0.36678 to 0.29926, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2993 - accuracy: 0.9309
Epoch 8/100
59/59 [==============================] - ETA: 0s - loss: 0.2629 - accuracy: 0.9306
Epoch 00008: loss improved from 0.29926 to 0.26293, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 109ms/step - loss: 0.2629 - accuracy: 0.9306
Epoch 9/100
59/59 [==============================] - ETA: 0s - loss: 0.2407 - accuracy: 0.9294
Epoch 00009: loss improved from 0.26293 to 0.24074, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.2407 - accuracy: 0.9294
Epoch 10/100
59/59 [==============================] - ETA: 0s - loss: 0.2258 - accuracy: 0.9305
Epoch 00010: loss improved from 0.24074 to 0.22580, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 109ms/step - loss: 0.2258 - accuracy: 0.9305
Epoch 11/100
59/59 [==============================] - ETA: 0s - loss: 0.2171 - accuracy: 0.9307
Epoch 00011: loss improved from 0.22580 to 0.21714, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2171 - accuracy: 0.9307
Epoch 12/100
59/59 [==============================] - ETA: 0s - loss: 0.2125 - accuracy: 0.9304
Epoch 00012: loss improved from 0.21714 to 0.21250, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 106ms/step - loss: 0.2125 - accuracy: 0.9304
Epoch 13/100
59/59 [==============================] - ETA: 0s - loss: 0.2087 - accuracy: 0.9292
Epoch 00013: loss improved from 0.21250 to 0.20875, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2087 - accuracy: 0.9292
Epoch 14/100
59/59 [==============================] - ETA: 0s - loss: 0.2064 - accuracy: 0.9307
Epoch 00014: loss improved from 0.20875 to 0.20635, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2064 - accuracy: 0.9307
Epoch 15/100
59/59 [==============================] - ETA: 0s - loss: 0.2069 - accuracy: 0.9281
Epoch 00015: loss did not improve from 0.20635
59/59 [==============================] - 6s 103ms/step - loss: 0.2069 - accuracy: 0.9281
Epoch 16/100
59/59 [==============================] - ETA: 0s - loss: 0.2042 - accuracy: 0.9295
Epoch 00016: loss improved from 0.20635 to 0.20421, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2042 - accuracy: 0.9295
Epoch 17/100
59/59 [==============================] - ETA: 0s - loss: 0.2026 - accuracy: 0.9304
Epoch 00017: loss improved from 0.20421 to 0.20261, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.2026 - accuracy: 0.9304
Epoch 18/100
59/59 [==============================] - ETA: 0s - loss: 0.2015 - accuracy: 0.9294
Epoch 00018: loss improved from 0.20261 to 0.20148, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.2015 - accuracy: 0.9294
Epoch 19/100
59/59 [==============================] - ETA: 0s - loss: 0.2005 - accuracy: 0.9305
Epoch 00019: loss improved from 0.20148 to 0.20055, saving model to EnsembleClassifier.h5
59/59 [==============================] - 7s 112ms/step - loss: 0.2005 - accuracy: 0.9305
Epoch 20/100
59/59 [==============================] - ETA: 0s - loss: 0.2010 - accuracy: 0.9304
Epoch 00020: loss did not improve from 0.20055
59/59 [==============================] - 6s 105ms/step - loss: 0.2010 - accuracy: 0.9304
Epoch 21/100
59/59 [==============================] - ETA: 0s - loss: 0.2008 - accuracy: 0.9295
Epoch 00021: loss did not improve from 0.20055
59/59 [==============================] - 6s 102ms/step - loss: 0.2008 - accuracy: 0.9295
Epoch 22/100
59/59 [==============================] - ETA: 0s - loss: 0.2006 - accuracy: 0.9293
Epoch 00022: loss did not improve from 0.20055
59/59 [==============================] - 6s 102ms/step - loss: 0.2006 - accuracy: 0.9293
Epoch 23/100
59/59 [==============================] - ETA: 0s - loss: 0.2010 - accuracy: 0.9295
Epoch 00023: loss did not improve from 0.20055
59/59 [==============================] - 6s 101ms/step - loss: 0.2010 - accuracy: 0.9295
Epoch 24/100
59/59 [==============================] - ETA: 0s - loss: 0.2008 - accuracy: 0.9295
Epoch 00024: loss did not improve from 0.20055
59/59 [==============================] - 6s 101ms/step - loss: 0.2008 - accuracy: 0.9295
Epoch 25/100
59/59 [==============================] - ETA: 0s - loss: 0.1996 - accuracy: 0.9306
Epoch 00025: loss improved from 0.20055 to 0.19964, saving model to EnsembleClassifier.h5
59/59 [==============================] - 7s 110ms/step - loss: 0.1996 - accuracy: 0.9306
Epoch 26/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9301
Epoch 00026: loss improved from 0.19964 to 0.19873, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.1987 - accuracy: 0.9301
Epoch 27/100
59/59 [==============================] - ETA: 0s - loss: 0.1996 - accuracy: 0.9300
Epoch 00027: loss did not improve from 0.19873
59/59 [==============================] - 6s 102ms/step - loss: 0.1996 - accuracy: 0.9300
Epoch 28/100
59/59 [==============================] - ETA: 0s - loss: 0.1995 - accuracy: 0.9305
Epoch 00028: loss did not improve from 0.19873
59/59 [==============================] - 6s 101ms/step - loss: 0.1995 - accuracy: 0.9305
Epoch 29/100
59/59 [==============================] - ETA: 0s - loss: 0.1991 - accuracy: 0.9309
Epoch 00029: loss did not improve from 0.19873
59/59 [==============================] - 6s 102ms/step - loss: 0.1991 - accuracy: 0.9309
Epoch 30/100
59/59 [==============================] - ETA: 0s - loss: 0.1997 - accuracy: 0.9305
Epoch 00030: loss did not improve from 0.19873
59/59 [==============================] - 6s 102ms/step - loss: 0.1997 - accuracy: 0.9305
Epoch 31/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9302
Epoch 00031: loss improved from 0.19873 to 0.19869, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 108ms/step - loss: 0.1987 - accuracy: 0.9302
Epoch 32/100
59/59 [==============================] - ETA: 0s - loss: 0.1998 - accuracy: 0.9298
Epoch 00032: loss did not improve from 0.19869
59/59 [==============================] - 6s 101ms/step - loss: 0.1998 - accuracy: 0.9298
Epoch 33/100
59/59 [==============================] - ETA: 0s - loss: 0.2000 - accuracy: 0.9305
Epoch 00033: loss did not improve from 0.19869
59/59 [==============================] - 6s 102ms/step - loss: 0.2000 - accuracy: 0.9305
Epoch 34/100
59/59 [==============================] - ETA: 0s - loss: 0.1977 - accuracy: 0.9310
Epoch 00034: loss improved from 0.19869 to 0.19767, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 107ms/step - loss: 0.1977 - accuracy: 0.9310
Epoch 35/100
59/59 [==============================] - ETA: 0s - loss: 0.1985 - accuracy: 0.9303
Epoch 00035: loss did not improve from 0.19767
59/59 [==============================] - 6s 102ms/step - loss: 0.1985 - accuracy: 0.9303
Epoch 36/100
59/59 [==============================] - ETA: 0s - loss: 0.1990 - accuracy: 0.9303
Epoch 00036: loss did not improve from 0.19767
59/59 [==============================] - 6s 102ms/step - loss: 0.1990 - accuracy: 0.9303
Epoch 37/100
59/59 [==============================] - ETA: 0s - loss: 0.1998 - accuracy: 0.9299
Epoch 00037: loss did not improve from 0.19767
59/59 [==============================] - 6s 101ms/step - loss: 0.1998 - accuracy: 0.9299
Epoch 38/100
59/59 [==============================] - ETA: 0s - loss: 0.1995 - accuracy: 0.9288
Epoch 00038: loss did not improve from 0.19767
59/59 [==============================] - 6s 102ms/step - loss: 0.1995 - accuracy: 0.9288
Epoch 39/100
59/59 [==============================] - ETA: 0s - loss: 0.1985 - accuracy: 0.9302
Epoch 00039: loss did not improve from 0.19767
59/59 [==============================] - 6s 101ms/step - loss: 0.1985 - accuracy: 0.9302
Epoch 40/100
59/59 [==============================] - ETA: 0s - loss: 0.1991 - accuracy: 0.9290
Epoch 00040: loss did not improve from 0.19767
59/59 [==============================] - 6s 102ms/step - loss: 0.1991 - accuracy: 0.9290
Epoch 41/100
59/59 [==============================] - ETA: 0s - loss: 0.1988 - accuracy: 0.9307
Epoch 00041: loss did not improve from 0.19767
59/59 [==============================] - 6s 106ms/step - loss: 0.1988 - accuracy: 0.9307
Epoch 42/100
59/59 [==============================] - ETA: 0s - loss: 0.1964 - accuracy: 0.9311
Epoch 00042: loss improved from 0.19767 to 0.19640, saving model to EnsembleClassifier.h5
59/59 [==============================] - 7s 113ms/step - loss: 0.1964 - accuracy: 0.9311
Epoch 43/100
59/59 [==============================] - ETA: 0s - loss: 0.1985 - accuracy: 0.9296
Epoch 00043: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1985 - accuracy: 0.9296
Epoch 44/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9304
Epoch 00044: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1987 - accuracy: 0.9304
Epoch 45/100
59/59 [==============================] - ETA: 0s - loss: 0.1982 - accuracy: 0.9310
Epoch 00045: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1982 - accuracy: 0.9310
Epoch 46/100
59/59 [==============================] - ETA: 0s - loss: 0.1971 - accuracy: 0.9313
Epoch 00046: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1971 - accuracy: 0.9313
Epoch 47/100
59/59 [==============================] - ETA: 0s - loss: 0.1994 - accuracy: 0.9304
Epoch 00047: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1994 - accuracy: 0.9304
Epoch 48/100
59/59 [==============================] - ETA: 0s - loss: 0.1988 - accuracy: 0.9300
Epoch 00048: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1988 - accuracy: 0.9300
Epoch 49/100
59/59 [==============================] - ETA: 0s - loss: 0.1994 - accuracy: 0.9295
Epoch 00049: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1994 - accuracy: 0.9295
Epoch 50/100
59/59 [==============================] - ETA: 0s - loss: 0.1996 - accuracy: 0.9293
Epoch 00050: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1996 - accuracy: 0.9293
Epoch 51/100
59/59 [==============================] - ETA: 0s - loss: 0.1971 - accuracy: 0.9309
Epoch 00051: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1971 - accuracy: 0.9309
Epoch 52/100
59/59 [==============================] - ETA: 0s - loss: 0.1996 - accuracy: 0.9309
Epoch 00052: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1996 - accuracy: 0.9309
Epoch 53/100
59/59 [==============================] - ETA: 0s - loss: 0.1990 - accuracy: 0.9293
Epoch 00053: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1990 - accuracy: 0.9293
Epoch 54/100
59/59 [==============================] - ETA: 0s - loss: 0.1990 - accuracy: 0.9297
Epoch 00054: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1990 - accuracy: 0.9297
Epoch 55/100
59/59 [==============================] - ETA: 0s - loss: 0.1999 - accuracy: 0.9300
Epoch 00055: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1999 - accuracy: 0.9300
Epoch 56/100
59/59 [==============================] - ETA: 0s - loss: 0.1998 - accuracy: 0.9300
Epoch 00056: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1998 - accuracy: 0.9300
Epoch 57/100
59/59 [==============================] - ETA: 0s - loss: 0.1989 - accuracy: 0.9297
Epoch 00057: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1989 - accuracy: 0.9297
Epoch 58/100
59/59 [==============================] - ETA: 0s - loss: 0.1985 - accuracy: 0.9301
Epoch 00058: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1985 - accuracy: 0.9301
Epoch 59/100
59/59 [==============================] - ETA: 0s - loss: 0.1992 - accuracy: 0.9290
Epoch 00059: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1992 - accuracy: 0.9290
Epoch 60/100
59/59 [==============================] - ETA: 0s - loss: 0.1995 - accuracy: 0.9301
Epoch 00060: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1995 - accuracy: 0.9301
Epoch 61/100
59/59 [==============================] - ETA: 0s - loss: 0.1979 - accuracy: 0.9307
Epoch 00061: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1979 - accuracy: 0.9307
Epoch 62/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9293
Epoch 00062: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1987 - accuracy: 0.9293
Epoch 63/100
59/59 [==============================] - ETA: 0s - loss: 0.1984 - accuracy: 0.9304
Epoch 00063: loss did not improve from 0.19640
59/59 [==============================] - 6s 103ms/step - loss: 0.1984 - accuracy: 0.9304
Epoch 64/100
59/59 [==============================] - ETA: 0s - loss: 0.1986 - accuracy: 0.9302
Epoch 00064: loss did not improve from 0.19640
59/59 [==============================] - 6s 104ms/step - loss: 0.1986 - accuracy: 0.9302
Epoch 65/100
59/59 [==============================] - ETA: 0s - loss: 0.1980 - accuracy: 0.9299
Epoch 00065: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1980 - accuracy: 0.9299
Epoch 66/100
59/59 [==============================] - ETA: 0s - loss: 0.1979 - accuracy: 0.9309
Epoch 00066: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1979 - accuracy: 0.9309
Epoch 67/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9292
Epoch 00067: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1987 - accuracy: 0.9292
Epoch 68/100
59/59 [==============================] - ETA: 0s - loss: 0.1989 - accuracy: 0.9306
Epoch 00068: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1989 - accuracy: 0.9306
Epoch 69/100
59/59 [==============================] - ETA: 0s - loss: 0.1993 - accuracy: 0.9292
Epoch 00069: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1993 - accuracy: 0.9292
Epoch 70/100
59/59 [==============================] - ETA: 0s - loss: 0.1975 - accuracy: 0.9300
Epoch 00070: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1975 - accuracy: 0.9300
Epoch 71/100
59/59 [==============================] - ETA: 0s - loss: 0.1977 - accuracy: 0.9300
Epoch 00071: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1977 - accuracy: 0.9300
Epoch 72/100
59/59 [==============================] - ETA: 0s - loss: 0.1981 - accuracy: 0.9297
Epoch 00072: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1981 - accuracy: 0.9297
Epoch 73/100
59/59 [==============================] - ETA: 0s - loss: 0.1969 - accuracy: 0.9313
Epoch 00073: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1969 - accuracy: 0.9313
Epoch 74/100
59/59 [==============================] - ETA: 0s - loss: 0.1969 - accuracy: 0.9313
Epoch 00074: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1969 - accuracy: 0.9313
Epoch 75/100
59/59 [==============================] - ETA: 0s - loss: 0.1982 - accuracy: 0.9294
Epoch 00075: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1982 - accuracy: 0.9294
Epoch 76/100
59/59 [==============================] - ETA: 0s - loss: 0.1977 - accuracy: 0.9297
Epoch 00076: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1977 - accuracy: 0.9297
Epoch 77/100
59/59 [==============================] - ETA: 0s - loss: 0.1979 - accuracy: 0.9297
Epoch 00077: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1979 - accuracy: 0.9297
Epoch 78/100
59/59 [==============================] - ETA: 0s - loss: 0.1975 - accuracy: 0.9300
Epoch 00078: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1975 - accuracy: 0.9300
Epoch 79/100
59/59 [==============================] - ETA: 0s - loss: 0.1977 - accuracy: 0.9309
Epoch 00079: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1977 - accuracy: 0.9309
Epoch 80/100
59/59 [==============================] - ETA: 0s - loss: 0.1968 - accuracy: 0.9304
Epoch 00080: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1968 - accuracy: 0.9304
Epoch 81/100
59/59 [==============================] - ETA: 0s - loss: 0.1987 - accuracy: 0.9297
Epoch 00081: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1987 - accuracy: 0.9297
Epoch 82/100
59/59 [==============================] - ETA: 0s - loss: 0.1967 - accuracy: 0.9313
Epoch 00082: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1967 - accuracy: 0.9313
Epoch 83/100
59/59 [==============================] - ETA: 0s - loss: 0.1985 - accuracy: 0.9289
Epoch 00083: loss did not improve from 0.19640
59/59 [==============================] - 6s 101ms/step - loss: 0.1985 - accuracy: 0.9289
Epoch 84/100
59/59 [==============================] - ETA: 0s - loss: 0.1973 - accuracy: 0.9305
Epoch 00084: loss did not improve from 0.19640
59/59 [==============================] - 6s 102ms/step - loss: 0.1973 - accuracy: 0.9305
Epoch 85/100
59/59 [==============================] - ETA: 0s - loss: 0.1968 - accuracy: 0.9309
Epoch 00085: loss did not improve from 0.19640
59/59 [==============================] - 6s 104ms/step - loss: 0.1968 - accuracy: 0.9309
Epoch 86/100
59/59 [==============================] - ETA: 0s - loss: 0.1962 - accuracy: 0.9312
Epoch 00086: loss improved from 0.19640 to 0.19622, saving model to EnsembleClassifier.h5
59/59 [==============================] - 7s 112ms/step - loss: 0.1962 - accuracy: 0.9312
Epoch 87/100
59/59 [==============================] - ETA: 0s - loss: 0.1974 - accuracy: 0.9294
Epoch 00087: loss did not improve from 0.19622
59/59 [==============================] - 6s 102ms/step - loss: 0.1974 - accuracy: 0.9294
Epoch 88/100
59/59 [==============================] - ETA: 0s - loss: 0.1974 - accuracy: 0.9294
Epoch 00088: loss did not improve from 0.19622
59/59 [==============================] - 6s 102ms/step - loss: 0.1974 - accuracy: 0.9294
Epoch 89/100
59/59 [==============================] - ETA: 0s - loss: 0.1968 - accuracy: 0.9301
Epoch 00089: loss did not improve from 0.19622
59/59 [==============================] - 6s 101ms/step - loss: 0.1968 - accuracy: 0.9301
Epoch 90/100
59/59 [==============================] - ETA: 0s - loss: 0.1982 - accuracy: 0.9305
Epoch 00090: loss did not improve from 0.19622
59/59 [==============================] - 6s 102ms/step - loss: 0.1982 - accuracy: 0.9305
Epoch 91/100
59/59 [==============================] - ETA: 0s - loss: 0.1968 - accuracy: 0.9303
Epoch 00091: loss did not improve from 0.19622
59/59 [==============================] - 6s 100ms/step - loss: 0.1968 - accuracy: 0.9303
Epoch 92/100
59/59 [==============================] - ETA: 0s - loss: 0.1969 - accuracy: 0.9306
Epoch 00092: loss did not improve from 0.19622
59/59 [==============================] - 6s 101ms/step - loss: 0.1969 - accuracy: 0.9306
Epoch 93/100
59/59 [==============================] - ETA: 0s - loss: 0.1949 - accuracy: 0.9314
Epoch 00093: loss improved from 0.19622 to 0.19491, saving model to EnsembleClassifier.h5
59/59 [==============================] - 6s 109ms/step - loss: 0.1949 - accuracy: 0.9314
Epoch 94/100
59/59 [==============================] - ETA: 0s - loss: 0.1964 - accuracy: 0.9310
Epoch 00094: loss did not improve from 0.19491
59/59 [==============================] - 6s 102ms/step - loss: 0.1964 - accuracy: 0.9310
Epoch 95/100
59/59 [==============================] - ETA: 0s - loss: 0.1965 - accuracy: 0.9298
Epoch 00095: loss did not improve from 0.19491
59/59 [==============================] - 6s 101ms/step - loss: 0.1965 - accuracy: 0.9298
Epoch 96/100
59/59 [==============================] - ETA: 0s - loss: 0.1981 - accuracy: 0.9289
Epoch 00096: loss did not improve from 0.19491
59/59 [==============================] - 6s 101ms/step - loss: 0.1981 - accuracy: 0.9289
Epoch 97/100
59/59 [==============================] - ETA: 0s - loss: 0.1957 - accuracy: 0.9300
Epoch 00097: loss did not improve from 0.19491
59/59 [==============================] - 6s 102ms/step - loss: 0.1957 - accuracy: 0.9300
Epoch 98/100
59/59 [==============================] - ETA: 0s - loss: 0.1971 - accuracy: 0.9292
Epoch 00098: loss did not improve from 0.19491
59/59 [==============================] - 6s 102ms/step - loss: 0.1971 - accuracy: 0.9292
Epoch 99/100
59/59 [==============================] - ETA: 0s - loss: 0.1962 - accuracy: 0.9301
Epoch 00099: loss did not improve from 0.19491
59/59 [==============================] - 6s 101ms/step - loss: 0.1962 - accuracy: 0.9301
Epoch 100/100
59/59 [==============================] - ETA: 0s - loss: 0.1973 - accuracy: 0.9290
Epoch 00100: loss did not improve from 0.19491
59/59 [==============================] - 6s 101ms/step - loss: 0.1973 - accuracy: 0.9290
In [30]:
del(X_val, dnnx_val)
In [31]:
model = load_model("output/EnsembleClassifier.h5")
In [32]:
model.evaluate([X_test, dnnx_test],y_test)
938/938 [==============================] - 10s 11ms/step - loss: 0.1911 - accuracy: 0.9332
Out[32]:
[0.19114574790000916, 0.9332333207130432]
In [33]:
plot_model(model,"output/EnsembleMod.pdf",show_shapes=True)
In [34]:
preds_test = model.predict([X_test, dnnx_test],batch_size=512, verbose = 0)
_ = get_metrics(preds_test.argmax(axis=1), y_test.argmax(axis=1), label_strings)  # assign to _ to suppress the raw tuple echo
Identified 27997 correct labels out of 30000 labels
Accuracy: 0.9332333333333334
Precision: 0.9333156684689717
Recall: 0.933217720394221
F1 Score: 0.9332602061738186
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[9553  317  150]
 [ 291 9170  540]
 [ 120  585 9274]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.96      0.95      0.96     10020
           1       0.91      0.92      0.91     10001
           2       0.93      0.93      0.93      9979

    accuracy                           0.93     30000
   macro avg       0.93      0.93      0.93     30000
weighted avg       0.93      0.93      0.93     30000

In [35]:
cm = metrics.confusion_matrix(preds_test.argmax(axis=1), y_test.argmax(axis=1),normalize='true')
df_cm = pd.DataFrame(cm, index = label_strings,columns = label_strings)
plt.figure(figsize = (10,7))
sns.heatmap(df_cm, annot=True,cmap="Blues",square=True,fmt='.2%')
plt.savefig("output/ensemble_cm.pdf")

5. Analyse Results¶

In [36]:
model = load_model("output/EnsembleClassifier.h5")
In [37]:
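# X_train and dnnx_train were deleted earlier to free memory; they are assumed
# to have been reloaded before this cell is run.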
preds_train = model.predict([X_train, dnnx_train],batch_size=512, verbose = 0)
_ = get_metrics(preds_train.argmax(axis=1), y_train.argmax(axis=1), label_strings)
Identified 168981 correct labels out of 180011 labels
Accuracy: 0.9387259667464766
Precision: 0.9387949859392456
Recall: 0.9387145758823535
F1 Score: 0.9387464153870226
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[57705  1764   592]
 [ 1674 55272  3016]
 [  571  3413 56004]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.96      0.96      0.96     60061
           1       0.91      0.92      0.92     59962
           2       0.94      0.93      0.94     59988

    accuracy                           0.94    180011
   macro avg       0.94      0.94      0.94    180011
weighted avg       0.94      0.94      0.94    180011

In [38]:
del(dnnx_train,X_train)
In [39]:
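# Likewise, X_val and dnnx_val (deleted in In [30]) are assumed to be back in memory here.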
preds_val = model.predict([X_val, dnnx_val],batch_size=512, verbose = 0)
_ = get_metrics(preds_val.argmax(axis=1), y_val.argmax(axis=1), label_strings)
Identified 27951 correct labels out of 29988 labels
Accuracy: 0.9320728291316527
Precision: 0.9322192094283954
Recall: 0.9321634242566498
F1 Score: 0.9321845593955201
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[9473  328  117]
 [ 333 9167  537]
 [ 115  607 9311]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.95      0.96      0.95      9918
           1       0.91      0.91      0.91     10037
           2       0.93      0.93      0.93     10033

    accuracy                           0.93     29988
   macro avg       0.93      0.93      0.93     29988
weighted avg       0.93      0.93      0.93     29988

In [40]:
del(X_val, dnnx_val)
In [41]:
preds_test = model.predict([X_test, dnnx_test],batch_size=512, verbose = 0)
_ = get_metrics(preds_test.argmax(axis=1), y_test.argmax(axis=1), label_strings)
Identified 27999 correct labels out of 30000 labels
Accuracy: 0.9333
Precision: 0.9333117964761226
Recall: 0.9332824843769045
F1 Score: 0.9332960665290292
Labels are: ['GALAXY' 'QSO' 'STAR']
Confusion Matrix:
 [[9583  298  139]
 [ 313 9142  546]
 [ 114  591 9274]]
Classification_Report:
               precision    recall  f1-score   support

           0       0.96      0.96      0.96     10020
           1       0.91      0.91      0.91     10001
           2       0.93      0.93      0.93      9979

    accuracy                           0.93     30000
   macro avg       0.93      0.93      0.93     30000
weighted avg       0.93      0.93      0.93     30000

In [42]:
df = pd.read_csv("../dataset/photofeatures_exp1.csv",index_col=0)
In [43]:
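# Tag each object with its split; objlist_train/val/test hold the DataFrame index
# values for the three splits.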
df.loc[objlist_train, ["set"]] = "TRAIN"
df.loc[objlist_val, ["set"]] = "VALIDATION"
df.loc[objlist_test, ["set"]] = "TEST"
In [44]:
df.loc[objlist_train, ["pred_class"]] = label_strings[preds_train.argmax(axis=1)]
df.loc[objlist_val, ["pred_class"]] = label_strings[preds_val.argmax(axis=1)]
df.loc[objlist_test, ["pred_class"]] = label_strings[preds_test.argmax(axis=1)]
In [45]:
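# The columns of the prediction arrays follow label_strings, so look up each
# class's column index explicitly.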
pgal_train = preds_train[:,np.where(label_strings=="GALAXY")[0][0]]
pstar_train = preds_train[:,np.where(label_strings=="STAR")[0][0]]
pqso_train = preds_train[:,np.where(label_strings=="QSO")[0][0]]

pgal_val = preds_val[:,np.where(label_strings=="GALAXY")[0][0]]
pstar_val = preds_val[:,np.where(label_strings=="STAR")[0][0]]
pqso_val = preds_val[:,np.where(label_strings=="QSO")[0][0]]

pgal_test = preds_test[:,np.where(label_strings=="GALAXY")[0][0]]
pstar_test = preds_test[:,np.where(label_strings=="STAR")[0][0]]
pqso_test = preds_test[:,np.where(label_strings=="QSO")[0][0]]
In [46]:
df.loc[objlist_train, ["prob_gal"]] = pgal_train
df.loc[objlist_train, ["prob_star"]] = pstar_train
df.loc[objlist_train, ["prob_qso"]] = pqso_train

df.loc[objlist_val, ["prob_gal"]] = pgal_val
df.loc[objlist_val, ["prob_star"]] = pstar_val
df.loc[objlist_val, ["prob_qso"]] = pqso_val

df.loc[objlist_test, ["prob_gal"]] = pgal_test
df.loc[objlist_test, ["prob_star"]] = pstar_test
df.loc[objlist_test, ["prob_qso"]] = pqso_test
In [47]:
df.to_csv("output/star_galaxy_qso_results_exp1.csv")
In [48]:
subdf = df.loc[df["set"]=="TEST",["class","pred_class",'prob_gal', 'prob_star', 'prob_qso']]
In [49]:
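# Lowest P(galaxy) among test objects wrongly predicted as GALAXY, i.e. the
# least confident false galaxy.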
subdf[(subdf["pred_class"]!=subdf["class"]) & (subdf["pred_class"]=="GALAXY")]["prob_gal"].min()
Out[49]:
0.37265077233314514
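Even the least confident false "GALAXY" carries a probability of about 0.37, so a probability threshold could be used to flag unreliable predictions for follow-up. A minimal sketch, assuming the subdf defined above and an illustrative 0.5 cut-off:

# Flag test objects whose winning class probability falls below an illustrative 0.5 threshold
probs = subdf[["prob_gal", "prob_star", "prob_qso"]].to_numpy()
low_conf = subdf[probs.max(axis=1) < 0.5]
print(len(low_conf), "of", len(subdf), "test objects are low-confidence predictions")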