In this project, I will try to model the energy of charged leptons produced from neutrino interactions in water Cerenkov detectors using Gradient Boosted Regression Trees (GBT).
Neutrino interactions in the water can be described with the inverse beta-decay process, leading to the creation of charged leptons:
νe + n → p + e⁻
ν̄e + p → n + e⁺
The cross sections of these processes are typically very small. In experiments studying atmospheric neutrinos, the presence of neutrinos can be inferred indirectly via charged-current interactions with nuclei and the release of high-energy charged leptons, e.g.
νe + N → e⁻ + X
ν̄μ + N → μ⁺ + X
which are then detected with Cerenkov light detectors. Examples of neutrino detection with Cerenkov detectors in the Super-Kamiokande experiment can be seen in the image below. Light from Cerenkov radiation is collected by photomultiplier tubes (PMTs). The geometric distribution of the photon energy deposits is used to reconstruct the energy, position and direction of the charged leptons. The goal of this project is to reconstruct the lepton energy in a neutrino experiment with machine-learning-based techniques.
The “Tokai Intermediate Tank for the Unoscillated Spectrum” (TITUS) is a proposed intermediate detector for the Hyper-Kamiokande experiment. TITUS has a total mass of 2.1 kton and is placed horizontally along the neutrino beam to act as an intermediate detector. It has the same off-axis angle and target material as the far detector at Hyper-Kamiokande, with the goal of reducing systematic uncertainties. A schematic representation of the detector can be seen below. In this project, I will be using simulated interactions of muon neutrinos inside the TITUS detector. I will analyse the pattern of PMT hits and the distance of the muon track from the walls of the detector in order to reconstruct the energy of the muon created in the neutrino-water interaction.
The information for about 160k simulated muon neutrino interactions inside the TITUS detector can be found at https://cernbox.cern.ch/s/n9bYKkVksyOkcV4, tabulated in a 26 MB CSV file. Every entry in the spreadsheet corresponds to a separate neutrino interaction. The file contains a large number of variables per event. Here I will focus on a subset of them, summarised below.
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

import tensorflow
from tensorflow.keras import backend
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
from tensorflow.keras.wrappers.scikit_learn import KerasRegressor

import sklearn
from sklearn import model_selection
from sklearn.model_selection import cross_val_score, KFold, GridSearchCV, learning_curve
from sklearn.experimental import enable_halving_search_cv
from sklearn.model_selection import HalvingGridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.ensemble import GradientBoostingRegressor

import nn_utils
data = pd.read_csv('numu_energy_studies.csv')
# Drop events with missing values. After dropna() every column of the
# DataFrame has the same length, so no further row truncation is needed.
data = data.dropna()
data.head(10)
 | Unnamed: 0 | i | neutrinoE | trueKE | recoE_lookup | total_PMTs_hits2 | total_hits2 | total_ring_PEs2 | pot_length2 | hits_pot_length2 | recoDWallR2 | recoDWallZ2 | lambda_max_2 | recoDWall_2 | recoToWall_2 | vtxTrackBias_2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 0 | 0 | 664.541 | 236.327 | 22.35380 | 0.000000 | 0.023325 | 0.00000 | 0.662787 | 0.003754 | 0.003389 | 0.458415 | 0.650854 | 0.003389 | 0.220977 | -0.011905 |
1 | 1 | 3 | 400.551 | 186.765 | 2.46971 | 0.000000 | 0.010075 | 0.00000 | 0.662787 | 0.001621 | 0.003389 | 0.458415 | 0.405903 | 0.003389 | 0.220977 | -2.269040 |
2 | 2 | 4 | 633.353 | 448.481 | 409.95400 | 0.308444 | 0.080400 | 0.13070 | 0.531171 | 0.016146 | 0.035389 | 0.753969 | 0.537975 | 0.035389 | 0.494926 | 0.024453 |
3 | 3 | 7 | 845.921 | 586.647 | 37.83740 | 0.000000 | 0.011950 | 0.00000 | 0.918554 | 0.001388 | 0.042498 | 0.083664 | 0.830098 | 0.042498 | 0.805578 | -0.007114 |
4 | 4 | 9 | 1323.140 | 963.822 | 1408.07000 | 0.111333 | 0.142300 | 0.26265 | 0.093641 | 0.162093 | 0.616305 | 0.164307 | 0.092105 | 0.358487 | 0.135845 | -0.004307 |
5 | 5 | 10 | 602.977 | 435.152 | 908.66900 | 0.136222 | 0.071425 | 0.12375 | 0.009318 | 0.817594 | -0.031748 | 0.808230 | 0.584673 | -0.031748 | 0.474233 | -0.009901 |
6 | 6 | 17 | 656.550 | 264.111 | 262.79900 | 0.099556 | 0.042525 | 0.05590 | 0.294422 | 0.015406 | 0.653350 | 0.118767 | 0.281821 | 0.259129 | 0.294422 | 0.011576 |
7 | 7 | 19 | 561.754 | 328.597 | 338.98900 | 0.077111 | 0.055850 | 0.08005 | 0.229034 | 0.026011 | 0.480335 | 0.235204 | 0.220081 | 0.480335 | 0.229034 | 0.011409 |
8 | 8 | 20 | 579.378 | 163.706 | 42.23790 | 0.000000 | 0.021075 | 0.00000 | 0.839011 | 0.002679 | 0.083842 | 0.166220 | 0.836803 | 0.083842 | 0.096443 | 0.012322 |
9 | 9 | 24 | 841.103 | 455.504 | 440.51400 | 0.238444 | 0.084375 | 0.14170 | 0.358473 | 0.025107 | 0.090244 | 0.446567 | 0.344590 | 0.090244 | 0.305072 | 0.008773 |
data.shape
(163592, 16)
data.columns
Index(['Unnamed: 0', 'i', 'neutrinoE', 'trueKE', 'recoE_lookup', 'total_PMTs_hits2', 'total_hits2', 'total_ring_PEs2', 'pot_length2', 'hits_pot_length2', 'recoDWallR2', 'recoDWallZ2', 'lambda_max_2', 'recoDWall_2', 'recoToWall_2', 'vtxTrackBias_2'], dtype='object')
variables = ['total_hits2', 'total_ring_PEs2', 'recoDWallR2', 'recoDWallZ2', 'lambda_max_2', 'trueKE']
new_data = data[variables]
new_data.head(10)
 | total_hits2 | total_ring_PEs2 | recoDWallR2 | recoDWallZ2 | lambda_max_2 | trueKE |
---|---|---|---|---|---|---|
0 | 0.023325 | 0.00000 | 0.003389 | 0.458415 | 0.650854 | 236.327 |
1 | 0.010075 | 0.00000 | 0.003389 | 0.458415 | 0.405903 | 186.765 |
2 | 0.080400 | 0.13070 | 0.035389 | 0.753969 | 0.537975 | 448.481 |
3 | 0.011950 | 0.00000 | 0.042498 | 0.083664 | 0.830098 | 586.647 |
4 | 0.142300 | 0.26265 | 0.616305 | 0.164307 | 0.092105 | 963.822 |
5 | 0.071425 | 0.12375 | -0.031748 | 0.808230 | 0.584673 | 435.152 |
6 | 0.042525 | 0.05590 | 0.653350 | 0.118767 | 0.281821 | 264.111 |
7 | 0.055850 | 0.08005 | 0.480335 | 0.235204 | 0.220081 | 328.597 |
8 | 0.021075 | 0.00000 | 0.083842 | 0.166220 | 0.836803 | 163.706 |
9 | 0.084375 | 0.14170 | 0.090244 | 0.446567 | 0.344590 | 455.504 |
for var in variables:
    plt.hist(new_data[var], bins=100, log=True)
    plt.xlabel(var)
    plt.ylabel('Counts')
    plt.show()
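The loop above produces six separate figures. An equivalent, more compact layout puts all six histograms on one subplot grid; the sketch below uses synthetic stand-in distributions, since the real CSV is not bundled here:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-ins for the six plotted variables (shapes are illustrative only)
fake = {name: rng.exponential(scale=s, size=5000)
        for name, s in zip(
            ['total_hits2', 'total_ring_PEs2', 'recoDWallR2',
             'recoDWallZ2', 'lambda_max_2', 'trueKE'],
            [0.05, 0.1, 0.2, 0.3, 0.4, 400.0])}

# One figure with a 2x3 grid instead of six separate figures
fig, axes = plt.subplots(2, 3, figsize=(12, 6))
for ax, (name, values) in zip(axes.flat, fake.items()):
    ax.hist(values, bins=100, log=True)
    ax.set_xlabel(name)
    ax.set_ylabel('Counts')
fig.tight_layout()
print(len(fig.axes))  # 6
```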
Answer_to_all_questions = 42
# train-test split of dataset
input_data = new_data[variables[:-1]]
target = new_data[variables[-1]]
train_data, test_data, train_target, test_target = model_selection.train_test_split(
    input_data, target, test_size=0.3, random_state=Answer_to_all_questions)
print(train_data.shape , train_target.shape , test_data.shape , test_target.shape )
(114514, 5) (114514,) (49078, 5) (49078,)
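StandardScaler is imported above but not yet used. If scaling is added later, the leakage-free place for it is inside the pipeline, so the scaling parameters are fit on the training folds only. A minimal sketch on synthetic data (a plain LinearRegression stands in for the network; the arrays are illustrative, not from the dataset):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# Five synthetic features on very different scales, mimicking raw detector variables
X = rng.normal(size=(1000, 5)) * np.array([1.0, 10.0, 100.0, 0.01, 1000.0])
y = X @ np.array([1.0, 0.1, 0.01, 100.0, 0.001]) + rng.normal(size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# The scaler is a pipeline step: it is fit on the training split only, and the
# same transform is applied to the test split automatically at predict time.
pipe = Pipeline([('scale', StandardScaler()), ('reg', LinearRegression())])
pipe.fit(X_tr, y_tr)
print(round(pipe.score(X_te, y_te), 3))
```

Inside cross-validation the same holds per fold, which is the main reason to prefer a pipeline over scaling the whole dataset up front.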
num_nodes = 20
num_outputs = 1
num_inputs = 5

def my_model():
    # Three hidden ReLU layers with dropout; input_dim is only needed on the
    # first layer (Keras infers the input shape of later layers).
    model = Sequential()
    model.add(Dense(num_nodes, input_dim=num_inputs, kernel_initializer='normal', activation='relu'))
    model.add(Dropout(0.1))
    model.add(Dense(num_nodes, kernel_initializer='normal', activation='relu'))
    model.add(Dropout(0.1))
    model.add(Dense(num_nodes, kernel_initializer='normal', activation='relu'))
    model.add(Dense(num_outputs, kernel_initializer='normal'))
    # Compile model with mean squared error loss function
    model.compile(loss='mean_squared_error', optimizer='adam')
    return model
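As a quick sanity check on this architecture, the number of trainable parameters can be computed by hand: each Dense layer contributes fan_in × units weights plus units biases, and Dropout layers add none.

```python
# (fan_in, units) for each Dense layer in my_model(): 5->20->20->20->1
layers = [(5, 20), (20, 20), (20, 20), (20, 1)]
n_params = sum(fan_in * units + units for fan_in, units in layers)
print(n_params)  # 981, which should match model.count_params()
```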
callbacks_ = [
    # If the training loss does not improve for 10 epochs, terminate training.
    EarlyStopping(verbose=True, patience=10, monitor='loss'),
    # Save the model weights with the lowest training loss. Since we monitor
    # a loss (lower is better), the checkpoint mode must be 'min', not 'max'.
    ModelCheckpoint('model.h5', monitor='loss', verbose=0, save_best_only=True, mode='min')]
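The patience mechanism can be illustrated without Keras. The helper below is a hypothetical stand-in (not part of any library) that mimics min-mode monitoring: it stops once the monitored loss has failed to improve for `patience` consecutive epochs.

```python
def early_stop_epoch(losses, patience=10):
    """Return the epoch index at which training would stop, or None if it never triggers."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(losses):
        if loss < best:        # improvement: reset the patience counter
            best = loss
            wait = 0
        else:                  # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return None

# A loss curve that improves up to epoch 4 and then plateaus:
history = [5.0, 3.0, 2.0, 1.5, 1.4] + [1.4] * 12
print(early_stop_epoch(history, patience=10))  # 14: ten flat epochs after the best one
```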
batchSize = 500
N_epochs = 50
Answer_to_all_questions = 42
np.random.seed(Answer_to_all_questions)

estimators = []
estimators.append(('mlp', KerasRegressor(build_fn=my_model, epochs=N_epochs,
                                         batch_size=batchSize, verbose=1)))
pipeline = Pipeline(estimators)
kfold = KFold(n_splits=10, random_state=Answer_to_all_questions, shuffle=True)
results = cross_val_score(pipeline, train_data, train_target, cv=kfold,
                          fit_params={'mlp__callbacks': callbacks_}, scoring='r2')
C:\Users\boazm\AppData\Local\Temp\ipykernel_1880\521896612.py:7: DeprecationWarning: KerasRegressor is deprecated, use Sci-Keras (https://github.com/adriangb/scikeras) instead. See https://www.adriangb.com/scikeras/stable/migration.html for help migrating. estimators.append(('mlp', KerasRegressor(build_fn = my_model, epochs = N_epochs,
[Training log truncated for brevity. cross_val_score runs 10 folds; each fold trains for 50 epochs of 207 batches (batch size 500), at roughly 1 s per epoch. In every fold the training loss (mean squared error on trueKE) drops steeply from about 6×10⁵ at epoch 1 to about 1.2×10⁵ by epoch 5, then declines slowly to roughly 1.0×10⁵ by epoch 50.]
1s 3ms/step - loss: 108243.2578 Epoch 19/50 207/207 [==============================] - 1s 3ms/step - loss: 107386.5859 Epoch 20/50 207/207 [==============================] - 1s 3ms/step - loss: 106140.2734 Epoch 21/50 207/207 [==============================] - 1s 4ms/step - loss: 106102.5156 Epoch 22/50 207/207 [==============================] - 1s 4ms/step - loss: 104643.2891 Epoch 23/50 207/207 [==============================] - 1s 4ms/step - loss: 104510.6406 Epoch 24/50 207/207 [==============================] - 1s 4ms/step - loss: 104451.5156 Epoch 25/50 207/207 [==============================] - 1s 4ms/step - loss: 102851.1875 Epoch 26/50 207/207 [==============================] - 1s 4ms/step - loss: 102884.1641 Epoch 27/50 207/207 [==============================] - 1s 4ms/step - loss: 102645.3125 Epoch 28/50 207/207 [==============================] - 1s 4ms/step - loss: 102477.6094 Epoch 29/50 207/207 [==============================] - 1s 4ms/step - loss: 102131.7344 Epoch 30/50 207/207 [==============================] - 1s 4ms/step - loss: 101749.6484 Epoch 31/50 207/207 [==============================] - 1s 4ms/step - loss: 101260.5547 Epoch 32/50 207/207 [==============================] - 1s 4ms/step - loss: 101334.5391 Epoch 33/50 207/207 [==============================] - 1s 4ms/step - loss: 100636.4844 Epoch 34/50 207/207 [==============================] - 1s 4ms/step - loss: 100825.5625 Epoch 35/50 207/207 [==============================] - 1s 4ms/step - loss: 100945.1562 Epoch 36/50 207/207 [==============================] - 1s 4ms/step - loss: 100913.6719 Epoch 37/50 207/207 [==============================] - 1s 4ms/step - loss: 101478.4531 Epoch 38/50 207/207 [==============================] - 1s 4ms/step - loss: 100087.7109 Epoch 39/50 207/207 [==============================] - 1s 5ms/step - loss: 100241.0547 Epoch 40/50 207/207 [==============================] - 1s 4ms/step - loss: 99788.7578 Epoch 41/50 207/207 [==============================] - 
1s 4ms/step - loss: 100403.3672 Epoch 42/50 207/207 [==============================] - 1s 4ms/step - loss: 99723.6250 Epoch 43/50 207/207 [==============================] - 1s 4ms/step - loss: 100380.4453 Epoch 44/50 207/207 [==============================] - 1s 4ms/step - loss: 99922.6484 Epoch 45/50 207/207 [==============================] - 1s 4ms/step - loss: 99562.5938 Epoch 46/50 207/207 [==============================] - 1s 4ms/step - loss: 99764.7422 Epoch 47/50 207/207 [==============================] - 1s 4ms/step - loss: 100134.4609 Epoch 48/50 207/207 [==============================] - 1s 4ms/step - loss: 100450.1953 Epoch 49/50 207/207 [==============================] - 1s 4ms/step - loss: 99026.4609 Epoch 50/50 207/207 [==============================] - 1s 4ms/step - loss: 99218.4375 23/23 [==============================] - 0s 2ms/step Epoch 1/50 207/207 [==============================] - 2s 4ms/step - loss: 592878.4375 Epoch 2/50 207/207 [==============================] - 1s 3ms/step - loss: 314256.8438 Epoch 3/50 207/207 [==============================] - 1s 3ms/step - loss: 233500.5625 Epoch 4/50 207/207 [==============================] - 1s 3ms/step - loss: 129804.4141 Epoch 5/50 207/207 [==============================] - 1s 4ms/step - loss: 114108.8359 Epoch 6/50 207/207 [==============================] - 1s 4ms/step - loss: 111851.0078 Epoch 7/50 207/207 [==============================] - 1s 4ms/step - loss: 112156.8672 Epoch 8/50 207/207 [==============================] - 1s 4ms/step - loss: 110759.2969 Epoch 9/50 207/207 [==============================] - 1s 4ms/step - loss: 110598.9766 Epoch 10/50 207/207 [==============================] - 1s 4ms/step - loss: 110349.7500 Epoch 11/50 207/207 [==============================] - 1s 4ms/step - loss: 109580.0000 Epoch 12/50 207/207 [==============================] - 1s 4ms/step - loss: 109987.3281 Epoch 13/50 207/207 [==============================] - 1s 4ms/step - loss: 109014.8750 Epoch 14/50 
207/207 [==============================] - 1s 4ms/step - loss: 108642.4297 Epoch 15/50 207/207 [==============================] - 1s 4ms/step - loss: 108186.9531 Epoch 16/50 207/207 [==============================] - 1s 4ms/step - loss: 106862.9609 Epoch 17/50 207/207 [==============================] - 1s 4ms/step - loss: 106512.6797 Epoch 18/50 207/207 [==============================] - 1s 4ms/step - loss: 106048.1328 Epoch 19/50 207/207 [==============================] - 1s 4ms/step - loss: 104900.9766 Epoch 20/50 207/207 [==============================] - 1s 4ms/step - loss: 104595.5625 Epoch 21/50 207/207 [==============================] - 1s 4ms/step - loss: 103370.2891 Epoch 22/50 207/207 [==============================] - 1s 4ms/step - loss: 103375.1797 Epoch 23/50 207/207 [==============================] - 1s 4ms/step - loss: 102777.4688 Epoch 24/50 207/207 [==============================] - 1s 5ms/step - loss: 102153.3672 Epoch 25/50 207/207 [==============================] - 1s 4ms/step - loss: 101815.8750 Epoch 26/50 207/207 [==============================] - 1s 5ms/step - loss: 101206.5000 Epoch 27/50 207/207 [==============================] - 1s 5ms/step - loss: 101161.6562 Epoch 28/50 207/207 [==============================] - 1s 5ms/step - loss: 101186.5859 Epoch 29/50 207/207 [==============================] - 1s 4ms/step - loss: 100431.0547 Epoch 30/50 207/207 [==============================] - 1s 4ms/step - loss: 100510.1953 Epoch 31/50 207/207 [==============================] - 1s 5ms/step - loss: 100291.1250 Epoch 32/50 207/207 [==============================] - 1s 4ms/step - loss: 100184.6797 Epoch 33/50 207/207 [==============================] - 1s 5ms/step - loss: 100031.6953 Epoch 34/50 207/207 [==============================] - 1s 4ms/step - loss: 100049.6641 Epoch 35/50 207/207 [==============================] - 1s 4ms/step - loss: 99343.0234 Epoch 36/50 207/207 [==============================] - 1s 5ms/step - loss: 99438.7812 Epoch 37/50 
207/207 [==============================] - 1s 4ms/step - loss: 99976.1016 Epoch 38/50 207/207 [==============================] - 1s 3ms/step - loss: 99205.0938 Epoch 39/50 207/207 [==============================] - 1s 3ms/step - loss: 99287.4141 Epoch 40/50 207/207 [==============================] - 1s 3ms/step - loss: 99164.2578 Epoch 41/50 207/207 [==============================] - 1s 5ms/step - loss: 99150.0156 Epoch 42/50 207/207 [==============================] - 1s 5ms/step - loss: 98546.7500 Epoch 43/50 207/207 [==============================] - 1s 4ms/step - loss: 98422.2344 Epoch 44/50 207/207 [==============================] - 1s 4ms/step - loss: 98560.1250 Epoch 45/50 207/207 [==============================] - 1s 4ms/step - loss: 99007.1406 Epoch 46/50 207/207 [==============================] - 1s 5ms/step - loss: 98604.6562 Epoch 47/50 207/207 [==============================] - 1s 4ms/step - loss: 98712.2734 Epoch 48/50 207/207 [==============================] - 1s 4ms/step - loss: 98596.1094 Epoch 49/50 207/207 [==============================] - 1s 4ms/step - loss: 98438.2734 Epoch 50/50 207/207 [==============================] - 1s 4ms/step - loss: 98401.0938 23/23 [==============================] - 0s 2ms/step Epoch 1/50 207/207 [==============================] - 3s 5ms/step - loss: 597708.3125 Epoch 2/50 207/207 [==============================] - 1s 4ms/step - loss: 319157.0938 Epoch 3/50 207/207 [==============================] - 1s 4ms/step - loss: 240551.4062 Epoch 4/50 207/207 [==============================] - 1s 3ms/step - loss: 133649.5469 Epoch 5/50 207/207 [==============================] - 1s 3ms/step - loss: 114236.0156 Epoch 6/50 207/207 [==============================] - 1s 3ms/step - loss: 112530.6953 Epoch 7/50 207/207 [==============================] - 1s 5ms/step - loss: 112589.1172 Epoch 8/50 207/207 [==============================] - 1s 4ms/step - loss: 111001.2031 Epoch 9/50 207/207 [==============================] - 1s 5ms/step 
- loss: 111221.4766 Epoch 10/50 207/207 [==============================] - 1s 4ms/step - loss: 111025.7500 Epoch 11/50 207/207 [==============================] - 1s 4ms/step - loss: 110063.3203 Epoch 12/50 207/207 [==============================] - 1s 4ms/step - loss: 110265.9297 Epoch 13/50 207/207 [==============================] - 1s 4ms/step - loss: 109223.6719 Epoch 14/50 207/207 [==============================] - 1s 5ms/step - loss: 108407.5469 Epoch 15/50 207/207 [==============================] - 1s 4ms/step - loss: 107811.2734 Epoch 16/50 207/207 [==============================] - 1s 4ms/step - loss: 107633.7656 Epoch 17/50 207/207 [==============================] - 1s 4ms/step - loss: 106724.7656 Epoch 18/50 207/207 [==============================] - 1s 4ms/step - loss: 105682.7031 Epoch 19/50 207/207 [==============================] - 1s 4ms/step - loss: 105636.8828 Epoch 20/50 207/207 [==============================] - 1s 4ms/step - loss: 104780.6875 Epoch 21/50 207/207 [==============================] - 1s 4ms/step - loss: 104226.4375 Epoch 22/50 207/207 [==============================] - 1s 3ms/step - loss: 103931.0938 Epoch 23/50 207/207 [==============================] - 1s 4ms/step - loss: 103689.9062 Epoch 24/50 207/207 [==============================] - 1s 5ms/step - loss: 103091.6484 Epoch 25/50 207/207 [==============================] - 1s 5ms/step - loss: 102687.4062 Epoch 26/50 207/207 [==============================] - 1s 4ms/step - loss: 102174.0234 Epoch 27/50 207/207 [==============================] - 1s 5ms/step - loss: 101760.3828 Epoch 28/50 207/207 [==============================] - 1s 4ms/step - loss: 100985.8672 Epoch 29/50 207/207 [==============================] - 1s 5ms/step - loss: 101123.8594 Epoch 30/50 207/207 [==============================] - 1s 4ms/step - loss: 100698.1094 Epoch 31/50 207/207 [==============================] - 1s 4ms/step - loss: 100598.3359 Epoch 32/50 207/207 [==============================] - 1s 
4ms/step - loss: 100836.3672 Epoch 33/50 207/207 [==============================] - 1s 4ms/step - loss: 99826.1484 Epoch 34/50 207/207 [==============================] - 1s 4ms/step - loss: 100161.6797 Epoch 35/50 207/207 [==============================] - 1s 4ms/step - loss: 99938.7734 Epoch 36/50 207/207 [==============================] - 1s 4ms/step - loss: 99506.2031 Epoch 37/50 207/207 [==============================] - 1s 4ms/step - loss: 99326.5781 Epoch 38/50 207/207 [==============================] - 1s 4ms/step - loss: 99311.3672 Epoch 39/50 207/207 [==============================] - 1s 4ms/step - loss: 99058.2656 Epoch 40/50 207/207 [==============================] - 1s 4ms/step - loss: 99838.4922 Epoch 41/50 207/207 [==============================] - 1s 4ms/step - loss: 98696.2266 Epoch 42/50 207/207 [==============================] - 1s 5ms/step - loss: 98693.0938 Epoch 43/50 207/207 [==============================] - 1s 5ms/step - loss: 99326.0703 Epoch 44/50 207/207 [==============================] - 1s 4ms/step - loss: 98963.4766 Epoch 45/50 207/207 [==============================] - 1s 4ms/step - loss: 98998.8203 Epoch 46/50 207/207 [==============================] - 1s 4ms/step - loss: 98352.3984 Epoch 47/50 207/207 [==============================] - 1s 4ms/step - loss: 99222.0625 Epoch 48/50 207/207 [==============================] - 1s 4ms/step - loss: 99194.6328 Epoch 49/50 207/207 [==============================] - 1s 4ms/step - loss: 98308.0469 Epoch 50/50 207/207 [==============================] - 1s 4ms/step - loss: 98610.1484 23/23 [==============================] - 0s 3ms/step Epoch 1/50 207/207 [==============================] - 2s 5ms/step - loss: 621047.1875 Epoch 2/50 207/207 [==============================] - 1s 4ms/step - loss: 354841.1562 Epoch 3/50 207/207 [==============================] - 1s 4ms/step - loss: 269606.5625 Epoch 4/50 207/207 [==============================] - 1s 4ms/step - loss: 172141.5781 Epoch 5/50 207/207 
[==============================] - 1s 4ms/step - loss: 117488.5000 Epoch 6/50 207/207 [==============================] - 1s 3ms/step - loss: 113490.8203 Epoch 7/50 207/207 [==============================] - 1s 4ms/step - loss: 112068.1953 Epoch 8/50 207/207 [==============================] - 1s 5ms/step - loss: 111141.2969 Epoch 9/50 207/207 [==============================] - 2s 11ms/step - loss: 110871.3438 Epoch 10/50 207/207 [==============================] - 3s 13ms/step - loss: 111229.2734 Epoch 11/50 207/207 [==============================] - 241s 1s/step - loss: 110461.8984 Epoch 12/50 207/207 [==============================] - 1s 4ms/step - loss: 110010.1875 Epoch 13/50 207/207 [==============================] - 1s 5ms/step - loss: 109536.5156 Epoch 14/50 207/207 [==============================] - 2s 8ms/step - loss: 109253.8828 Epoch 15/50 207/207 [==============================] - 1s 6ms/step - loss: 108799.0625 Epoch 16/50 207/207 [==============================] - 1s 3ms/step - loss: 108201.6641 Epoch 17/50 207/207 [==============================] - 1s 4ms/step - loss: 108572.1562 Epoch 18/50 207/207 [==============================] - 1s 3ms/step - loss: 107907.4922 Epoch 19/50 207/207 [==============================] - 2s 9ms/step - loss: 106750.6641 Epoch 20/50 207/207 [==============================] - 1s 4ms/step - loss: 106116.5391 Epoch 21/50 207/207 [==============================] - 1s 4ms/step - loss: 105958.9844 Epoch 22/50 207/207 [==============================] - 1s 4ms/step - loss: 105009.6016 Epoch 23/50 207/207 [==============================] - 1s 4ms/step - loss: 105135.7656 Epoch 24/50 207/207 [==============================] - 1s 4ms/step - loss: 103676.3438 Epoch 25/50 207/207 [==============================] - 1s 4ms/step - loss: 104028.1250 Epoch 26/50 207/207 [==============================] - 1s 4ms/step - loss: 103061.9062 Epoch 27/50 207/207 [==============================] - 1s 4ms/step - loss: 102795.1641 Epoch 28/50 207/207 
[==============================] - 1s 5ms/step - loss: 101982.0625 Epoch 29/50 207/207 [==============================] - 1s 4ms/step - loss: 102514.7500 Epoch 30/50 207/207 [==============================] - 1s 5ms/step - loss: 101814.2891 Epoch 31/50 207/207 [==============================] - 1s 5ms/step - loss: 102641.5625 Epoch 32/50 207/207 [==============================] - 1s 5ms/step - loss: 100847.3516 Epoch 33/50 207/207 [==============================] - 1s 5ms/step - loss: 100576.4844 Epoch 34/50 207/207 [==============================] - 1s 5ms/step - loss: 101247.2500 Epoch 35/50 207/207 [==============================] - 1s 5ms/step - loss: 101133.3438 Epoch 36/50 207/207 [==============================] - 1s 5ms/step - loss: 100429.3203 Epoch 37/50 207/207 [==============================] - 1s 5ms/step - loss: 100740.5547 Epoch 38/50 207/207 [==============================] - 1s 5ms/step - loss: 100864.0156 Epoch 39/50 207/207 [==============================] - 1s 5ms/step - loss: 99760.5469 Epoch 40/50 207/207 [==============================] - 1s 5ms/step - loss: 99736.9609 Epoch 41/50 207/207 [==============================] - 1s 5ms/step - loss: 100195.0938 Epoch 42/50 207/207 [==============================] - 1s 5ms/step - loss: 99835.9375 Epoch 43/50 207/207 [==============================] - 1s 5ms/step - loss: 99846.8125 Epoch 44/50 207/207 [==============================] - 1s 5ms/step - loss: 99664.4141 Epoch 45/50 207/207 [==============================] - 1s 4ms/step - loss: 99770.9297 Epoch 46/50 207/207 [==============================] - 1s 5ms/step - loss: 99873.8750 Epoch 47/50 207/207 [==============================] - 1s 5ms/step - loss: 99386.9453 Epoch 48/50 207/207 [==============================] - 1s 4ms/step - loss: 99311.9141 Epoch 49/50 207/207 [==============================] - 1s 5ms/step - loss: 98888.2578 Epoch 50/50 207/207 [==============================] - 1s 5ms/step - loss: 99680.4219 23/23 
[==============================] - 0s 2ms/step
print("Result: %.2f %s %.2f" %(results.mean(), u"\u00B1", results.std()))
Result: 0.75 ± 0.01
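The quoted value is simply the mean and standard deviation of the per-fold cross-validation scores. A minimal sketch with hypothetical fold scores, chosen only to reproduce the same output format:

```python
import numpy as np

# Hypothetical per-fold scores from a 5-fold cross-validation
fold_scores = np.array([0.74, 0.76, 0.75, 0.74, 0.76])

# Report mean +/- standard deviation across folds
print("Result: %.2f \u00b1 %.2f" % (fold_scores.mean(), fold_scores.std()))
# → Result: 0.75 ± 0.01
```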
backend.clear_session()
model=my_model()
history=model.fit(train_data,train_target,epochs=N_epochs,batch_size=batchSize,callbacks=callbacks_,\
validation_data=(test_data, test_target))
(Verbose Keras training output truncated: 50 epochs at 1–2 s each; the training loss falls from 584,154 to 99,280 and the validation loss from 371,554 to 88,864, with both curves flattening after roughly epoch 40.)
plt.semilogy(history.history['loss'],label='Loss')
plt.semilogy(history.history['val_loss'],label='Validation Loss')
plt.title("Loss against epoch")
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.legend()
plt.show()
test_predict = model.predict(test_data)
1534/1534 [==============================] - 3s 2ms/step
plt.scatter(test_target, test_predict, s=0.5)
plt.xlabel('actual muon energy')
plt.ylabel('predicted muon energy')
plt.show()
The prediction is better at low energies than at high energies: the points follow the x=y diagonal only up to about 2000 MeV. Above that, the energies are mostly underpredicted rather than overpredicted, since the points lie mainly below the diagonal.
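The under- versus over-prediction claim can be quantified by counting points below and above the diagonal. A minimal sketch on synthetic numbers (in the notebook, `test_target` and `test_predict` would play the roles of `y` and `p`):

```python
import numpy as np

def under_over_fraction(y_true, y_pred):
    """Fraction of events predicted below / above the true energy."""
    y_true = np.asarray(y_true).ravel()
    y_pred = np.asarray(y_pred).ravel()
    return float(np.mean(y_pred < y_true)), float(np.mean(y_pred > y_true))

# Synthetic stand-in: predictions biased ~10% low, as in the scatter plot
rng = np.random.default_rng(0)
y = rng.uniform(100.0, 3000.0, size=1000)      # "true" muon energies in MeV
p = 0.9 * y + rng.normal(0.0, 50.0, size=1000)  # low-biased "predictions"
under, over = under_over_fraction(y, p)         # under >> over for a low-biased model
```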
gbr0 = GradientBoostingRegressor(n_estimators = 100)
gbr0.fit( train_data, train_target.ravel())
GradientBoostingRegressor()
r2_train = gbr0.score(train_data,train_target) #Determine the R^2 values
r2_test = gbr0.score(test_data,test_target)
print("R^2 of testing data = {}".format(r2_test))
print("R^2 of training data = {}".format(r2_train))
R^2 of testing data = 0.781797050479113 R^2 of training data = 0.7830727956686763
These scores are around 3% better than the neural network's, so the gradient boosting regressor appears to be working better than the neural network.
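For reference, the `score` method of a scikit-learn regressor returns the coefficient of determination, R² = 1 − SS_res/SS_tot. A quick check of that formula against `sklearn.metrics.r2_score` on made-up energies:

```python
import numpy as np
from sklearn.metrics import r2_score

# Made-up true vs predicted muon energies (MeV)
y_true = np.array([1200.0, 800.0, 1500.0, 400.0, 2000.0])
y_pred = np.array([1100.0, 850.0, 1400.0, 500.0, 1800.0])

ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
r2_manual = 1.0 - ss_res / ss_tot

assert np.isclose(r2_manual, r2_score(y_true, y_pred))
```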
# Look for the best parameters via a parameter search
# Define the parameter grid to be tested
param_grid_ = {
'n_estimators':[100],
'learning_rate':[0.1, 0.05],
'max_depth':[5, 10],
'min_samples_leaf':[50,100],
}
njobs_ = 8 # jobs to run in parallel
np.random.seed(Answer_to_all_questions)
gbr = GradientBoostingRegressor()
# Use HalvingGridSearch to speed up the process
classifier = HalvingGridSearchCV(estimator = gbr,cv = kfold, param_grid=param_grid_,
verbose=1,n_jobs=njobs_) # Define the grid search parameters
classifier.fit(train_data, train_target.ravel())
print("Best estimator:")
print(classifier.best_estimator_)
n_iterations: 2
n_required_iterations: 2
n_possible_iterations: 2
min_resources_: 38171
max_resources_: 114514
aggressive_elimination: False
factor: 3
----------
iter: 0
n_candidates: 8
n_resources: 38171
Fitting 10 folds for each of 8 candidates, totalling 80 fits
----------
iter: 1
n_candidates: 3
n_resources: 114513
Fitting 10 folds for each of 3 candidates, totalling 30 fits
Best estimator:
GradientBoostingRegressor(learning_rate=0.05, max_depth=10, min_samples_leaf=50)
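The schedule in this output follows the successive-halving rule: each iteration gives every surviving candidate `factor`× more training samples and keeps roughly the top 1/`factor` of the candidates. A sketch of that arithmetic (the helper is hypothetical; the numbers come from the search log above):

```python
import math

def halving_schedule(n_candidates, min_resources, max_resources, factor=3):
    """Approximate HalvingGridSearchCV's budget as (iteration, candidates, samples)."""
    n_iterations = math.floor(math.log(max_resources / min_resources, factor)) + 1
    schedule = []
    for it in range(n_iterations):
        schedule.append((it, n_candidates, min_resources * factor ** it))
        # keep roughly the best 1/factor of the candidates for the next round
        n_candidates = max(1, math.ceil(n_candidates / factor))
    return schedule

# 8 parameter combinations (the 1 x 2 x 2 x 2 grid), resources from the log above
schedule = halving_schedule(8, 38171, 114514, factor=3)
# → [(0, 8, 38171), (1, 3, 114513)], matching iter 0 and iter 1 in the output
```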
train_sizes0, train_scores0, test_scores0 = learning_curve(classifier.best_estimator_, train_data, train_target)
# learning_curve returns one score per (train size, CV fold): average over the folds
plt.plot(train_sizes0, train_scores0.mean(axis=1), label="Training score")
plt.plot(train_sizes0, test_scores0.mean(axis=1), label="Testing score")
plt.ylabel('score')
plt.xlabel('Training examples')
plt.legend()
plt.show()
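`learning_curve` returns score arrays of shape `(n_ticks, n_cv_folds)`, so each curve is usually averaged over the folds before plotting. A small self-contained check on synthetic data (`Ridge` here is just a fast stand-in estimator):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import learning_curve

# Synthetic regression problem: y is a noisy linear function of X
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, 2.0, 0.0, -1.0]) + rng.normal(0.0, 0.1, 200)

sizes, train_scores, val_scores = learning_curve(Ridge(), X, y, cv=5)
train_mean = train_scores.mean(axis=1)   # average over the 5 CV folds
val_mean = val_scores.mean(axis=1)
```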
train_sizes, train_scores, test_scores = learning_curve(classifier.best_estimator_, test_data, test_target)
# average over the CV folds, as above
plt.plot(train_sizes, train_scores.mean(axis=1), label="Training score")
plt.plot(train_sizes, test_scores.mean(axis=1), label="Testing score")
plt.ylabel('score')
plt.xlabel('Training examples')
plt.legend()
plt.show()
best_depth = classifier.best_estimator_.max_depth
best_samples = classifier.best_estimator_.min_samples_leaf
best_estimators = classifier.best_estimator_.n_estimators
best_learning_rate = classifier.best_estimator_.learning_rate
gbr1 = GradientBoostingRegressor(max_depth=best_depth,min_samples_leaf=best_samples,n_estimators=best_estimators,\
learning_rate=best_learning_rate)
best_results = cross_val_score(gbr1,train_data, train_target, cv = kfold, scoring='r2',n_jobs=8)
print("R^2 result: %.4f %s %.4f" %(best_results.mean(), u"\u00B1", best_results.std()))
R^2 result: 0.8062 ± 0.0124
backend.clear_session()
history_gbr1 = gbr1.fit(train_data,train_target)
test_predict2 = gbr1.predict(test_data)
plt.scatter(test_target, test_predict2, s=0.5)
plt.xlabel('actual muon energy')
plt.ylabel('predicted muon energy')
plt.show()
This is a better prediction than the neural network's, as the points follow the x=y diagonal up to higher energies. Like the neural network, though, it underpredicts the energies more often than it overpredicts them.
# Given plotting example for feature importance
fig, ax = plt.subplots(figsize=(10, 10))
ax.barh(variables[:-1],classifier.best_estimator_.feature_importances_)
ax.set_title("Training Feature Importance")
plt.show()
From the bar chart, the most important feature is total_ring_PEs2, while the least important features are recoDWallR2 and recoWallZ2.
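Impurity-based feature importances from boosted trees can overstate some features, so a common cross-check is permutation importance: shuffle one feature at a time and measure the drop in score. A minimal sketch on synthetic data (not the TITUS variables):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

# Synthetic stand-in: feature 0 matters a lot, feature 1 a little, feature 2 not at all
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
y = 3.0 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0.0, 0.1, 400)

gbr = GradientBoostingRegressor(n_estimators=100).fit(X, y)
perm = permutation_importance(gbr, X, y, n_repeats=5, random_state=0)
ranking = np.argsort(perm.importances_mean)[::-1]   # most important first
```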