What did the CNN learn?#

We want to obtain some insight into the internal workings of CNNs. The techniques presented below are mainly used for CNNs, but the same principles apply to all types of feedforward ANNs.

import numpy as np
import matplotlib.pyplot as plt
import tensorflow.keras as keras
import tensorflow as tf

data_path = '/home/jef19jdw/myfiles/datasets_teaching/ds2/catsdogs/data/'
# workarounds for some problems with TensorFlow (only use if necessary)

#import os

#physical_devices = tf.config.list_physical_devices('GPU')
#tf.config.experimental.set_memory_growth(physical_devices[0], True)

#os.environ['XLA_FLAGS']='--xla_gpu_cuda_data_dir=/usr/lib/cuda'
model = keras.models.load_model('cnnmodel')
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 rescaling (Rescaling)       (None, 128, 128, 3)       0         
                                                                 
 conv1 (Conv2D)              (None, 126, 126, 16)      448       
                                                                 
 conv2 (Conv2D)              (None, 124, 124, 16)      2320      
                                                                 
 pool1 (MaxPooling2D)        (None, 62, 62, 16)        0         
                                                                 
 conv3 (Conv2D)              (None, 60, 60, 32)        4640      
                                                                 
 conv4 (Conv2D)              (None, 58, 58, 32)        9248      
                                                                 
 pool2 (MaxPooling2D)        (None, 29, 29, 32)        0         
                                                                 
 flatten (Flatten)           (None, 26912)             0         
                                                                 
 dense1 (Dense)              (None, 10)                269130    
                                                                 
 dense2 (Dense)              (None, 10)                110       
                                                                 
 out (Dense)                 (None, 2)                 22        
                                                                 
=================================================================
Total params: 285,918
Trainable params: 285,918
Non-trainable params: 0
_________________________________________________________________

Visualizing Feature Maps#

Each convolutional layer outputs a stack of feature maps. In the language of CNNs, feature maps are filtered versions of the input image; in the language of ANNs, a feature map contains neuron activations. Given an input image, we may look at the feature maps to get an idea of what features the learned filters extract.
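
As a quick check of the summary above, the parameter count and the spatial output size of conv1 follow directly from the layer configuration (16 filters of size 3×3 over 3 input channels, no padding):

# each of the 16 filters of conv1 has 3 * 3 * 3 weights plus one bias weight
params_conv1 = 16 * (3 * 3 * 3 + 1)
print(params_conv1)    # 448, the value reported by model.summary()

# without padding a 3x3 filter shrinks each spatial dimension by 2
print(128 - 3 + 1)     # 126, the spatial size of the conv1 feature maps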

To get the activations of an intermediate layer for a given input image we define a new Keras model which reuses parts of the existing model. When creating a model Keras builds a TensorFlow data structure (the graph) representing the flow of data and the operations on it. This graph starts with an input node (a Tensor object) and ends with an output node (again a Tensor object). When calling Model.predict Keras hands the data over to TensorFlow, which executes the graph with the provided data and returns the output to Keras. Each layer’s output is represented by an intermediate Tensor object in the graph, too. So we may fool Keras by creating a new model from existing Tensor objects as inputs and outputs. This feature is not well documented: the documentation does not mention that the keyword arguments inputs and outputs of the Model constructor also accept TensorFlow Tensor objects instead of Keras Input and Layer objects. The Tensor objects of existing models and layers are accessible through their inputs and outputs member variables. With this knowledge we are able to create a new Model instance from an existing TensorFlow graph or parts of it.
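
The same trick also works for several layers at once. As a small sketch (reusing the layer names of the model above), one submodel can return all convolutional feature maps in a single forward pass:

conv_names = ['conv1', 'conv2', 'conv3', 'conv4']

multi_out = keras.models.Model(
    inputs=model.inputs,
    outputs=[model.get_layer(name).output for name in conv_names])

# calling multi_out.predict on a batch of images returns a list with one
# feature map stack per convolutional layer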

layer_name = 'conv1'

submodel = keras.models.Model(inputs=model.inputs, outputs=model.get_layer(layer_name).output)
submodel.summary()
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_1 (InputLayer)        [(None, 128, 128, 3)]     0         
                                                                 
 rescaling (Rescaling)       (None, 128, 128, 3)       0         
                                                                 
 conv1 (Conv2D)              (None, 126, 126, 16)      448       
                                                                 
=================================================================
Total params: 448
Trainable params: 448
Non-trainable params: 0
_________________________________________________________________

Now we load an image and get the corresponding predictions from the submodel. The predictions of the submodel are the feature maps (after applying the activation function) of the chosen layer in the original model. The image has to be resized to fit the model’s input size. We use Keras’ load_img. This function returns a PIL image object which is understood by NumPy.

img_size = 128
img = keras.preprocessing.image.load_img(data_path + 'unlabeled/4.jpg',
                                         target_size=(img_size, img_size))
img = np.asarray(img, dtype=np.float32)

fig, ax = plt.subplots()
ax.imshow(img / 255)
plt.show()

fmaps = submodel.predict(img.reshape(1, img_size, img_size, 3))
fmaps = fmaps.reshape(fmaps.shape[1:])
print(fmaps.shape)
../../../_images/cnn-learn_7_0.png
1/1 [==============================] - 0s 203ms/step
(126, 126, 16)

It remains to rescale and plot all the feature maps. We first rescale all feature maps at once to the range \([0,1]\). Then we rescale each map individually to increase the contrast of low-intensity maps. The individual scaling factor is shown in the plot titles; a high factor indicates low intensities.
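
In formulas, the global step maps a value \(x\) to \((x - x_{\min}) / (x_{\max} - x_{\min})\) with the minimum and maximum taken over all maps at once; the individual step then multiplies each map by the factor \(1 / x_{\max}\) computed from that map alone, so that its maximum becomes 1.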

cols = 4
rows = fmaps.shape[2] // cols

fmaps = 1 / (fmaps.max() - fmaps.min()) * (fmaps - fmaps.min())

fig, axs = plt.subplots(rows, cols, figsize=(15, 15))

for r in range(0, rows):
    for c in range(0, cols):
        fmap = fmaps[:, :, r * cols + c]
        if fmap.max() > 0:
            fac = 1 / fmap.max()
            fmap = fac * fmap
        else:
            fac = 1
        axs[r, c].imshow(fmap, cmap='gray')
        axs[r, c].axis('off')
        axs[r, c].set_title('x {:.0f}'.format(fac))
        
plt.show()
../../../_images/cnn-learn_9_0.png

Visualizing Filters#

Each convolutional layer is defined by a list of filters. Filters are a set of shared weights. We may obtain the weights of a layer by calling Layer.get_weights. For layers with input from a bias neuron the method returns a list with two items: the first item is a NumPy array of regular weights, the second a NumPy array of bias weights.

layer = model.get_layer('conv1')

filters, bias_weights = layer.get_weights()
print(filters.shape, bias_weights.shape)
(3, 3, 3, 16) (16,)
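
The same call works for any layer that carries weights; a short sketch for inspecting all weight shapes of the model at once:

# print name and weight shapes of every layer with trainable weights
for layer in model.layers:
    shapes = [w.shape for w in layer.get_weights()]
    if shapes:
        print(layer.name, shapes)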

In the first layer we have three input channels (red, green, blue). Thus, the filter depth is 3 and we may visualize each filter as a color image. Filter pixels may have a range different from [0, 1], so we linearly rescale all filters.

filters = 1 / (filters.max() - filters.min()) * (filters - filters.min())

fig, axs = plt.subplots(filters.shape[3], 4, figsize=(4, 12))

for row in range(0, filters.shape[3]):
    axs[row, 0].imshow(filters[:, :, :, row], vmin=0, vmax=1)
    axs[row, 0].axis('off')
    axs[row, 1].imshow(filters[:, :, 0, row], cmap='gray', vmin=0, vmax=1)
    axs[row, 1].axis('off')
    axs[row, 2].imshow(filters[:, :, 1, row], cmap='gray', vmin=0, vmax=1)
    axs[row, 2].axis('off')
    axs[row, 3].imshow(filters[:, :, 2, row], cmap='gray', vmin=0, vmax=1)
    axs[row, 3].axis('off')
    if row == 0:
        axs[row, 0].set_title('RGB')
        axs[row, 1].set_title('R')
        axs[row, 2].set_title('G')
        axs[row, 3].set_title('B')
        
plt.show()
../../../_images/cnn-learn_13_0.png

For deeper layers there is no color interpretation, because the filters have more than 3 depth levels. Instead, we may visualize a filter as a list of sections perpendicular to the depth axis. In the following plot each row contains the sections of one filter.

layer = model.get_layer('conv2')

filters, bias_weights = layer.get_weights()
filters = 1 / (filters.max() - filters.min()) * (filters - filters.min())

fig, axs = plt.subplots(filters.shape[3], filters.shape[2], figsize=(12, 12))

for row in range(0, filters.shape[3]):
    for col in range(0, filters.shape[2]):
        axs[row, col].imshow(filters[:, :, col, row], cmap='gray')
        axs[row, col].axis('off')
        
plt.show()
../../../_images/cnn-learn_15_0.png

Maximizing Neuron Activation#

To get a better idea of what causes a neuron to fire, we may search for images with high activation of a fixed neuron. This is an optimization problem: the objective is the neuron’s activation, and the search space is the set of all images fitting the model’s input size.

We apply gradient descent to the negative objective (that is, gradient ascent to the objective) and use some Keras features that simplify the implementation.
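
Writing \(f(x)\) for the activation of the chosen neuron given an input image \(x\), one gradient ascent step with step length \(\alpha\) reads \(x_{k+1} = x_k + \alpha \, \nabla f(x_k)\); this is exactly the update implemented in the gradient_ascent function below.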

The objective is a neuron’s output, and we wrap the objective in a Keras model. This allows us to use Keras to compute gradients.

layer = model.get_layer('conv3')
neuron = (5, 5, 0)
#layer = model.get_layer('dense2')
#neuron = (0, )
#layer = model.get_layer('out')
#neuron = (0, )

submodel = keras.models.Model(inputs=model.inputs, outputs=layer.output[(0, ) + neuron])
submodel.summary()
Model: "model_5"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_1 (InputLayer)        [(None, 128, 128, 3)]     0         
                                                                 
 rescaling (Rescaling)       (None, 128, 128, 3)       0         
                                                                 
 conv1 (Conv2D)              (None, 126, 126, 16)      448       
                                                                 
 conv2 (Conv2D)              (None, 124, 124, 16)      2320      
                                                                 
 pool1 (MaxPooling2D)        (None, 62, 62, 16)        0         
                                                                 
 conv3 (Conv2D)              (None, 60, 60, 32)        4640      
                                                                 
 tf.__operators__.getitem_4   ()                       0         
 (SlicingOpLambda)                                               
                                                                 
=================================================================
Total params: 7,408
Trainable params: 7,408
Non-trainable params: 0
_________________________________________________________________

Now we define a function which computes the objective value and the gradient for a given input image. First we call convert_to_tensor to convert the image into a Tensor object fitting the model’s input dimensions. Then we tell TensorFlow to watch the operations performed on the image while calculating the objective function. From the collected information TensorFlow can then calculate the gradient of the objective function. To watch the flow of the image through the TensorFlow graph we have to create a context manager of type GradientTape. The flow of all variables marked for watching with GradientTape.watch is recorded for all graph executions inside the with block. After executing the graph we get the gradient from GradientTape.gradient. Note that calling Model.predict does not support watching the flow of variables. Instead we use a different Keras API variant: Model objects are callable, that is, they can be used as a function and yield a prediction when called with some input as argument.

def get_grad(submodel, img):

    # add batch dimension and convert the image to a Tensor object
    img_tensor = tf.convert_to_tensor(img.reshape(1, img_size, img_size, 3))

    with tf.GradientTape() as tape:
        # record all operations applied to the input image
        tape.watch(img_tensor)
        # call the model directly (Model.predict would not be watched)
        objective_value = submodel(img_tensor)
        # gradient of the neuron's activation w.r.t. the input image
        grad = tape.gradient(objective_value, img_tensor)

    return objective_value.numpy(), grad.numpy().reshape(img.shape)

We are ready for gradient ascent. We are free to choose an arbitrary initial guess, but we have to keep in mind that we may end up in a local maximum and that there might be many global maxima. Thus, the initial guess influences the result. We put everything into a function so we can reuse it below.

def gradient_ascent(submodel, init_img, max_iter, step_length):

    img = init_img

    for i in range(0, max_iter):
        obj, grad = get_grad(submodel, img)

        # ascent step: move the image in the direction of steepest ascent
        img = img + step_length * grad

        # print iteration index, objective value, largest gradient component
        print(i, obj, np.max(np.abs(grad)))

    return img
# constant image
img = 128 * np.ones((img_size, img_size, 3), dtype=float)

# photo
#img = keras.preprocessing.image.load_img(data_path + 'unlabeled/365.jpg',
#                                         target_size=(img_size, img_size))
#img = np.asarray(img, dtype=np.float32)
# parameters for gradient ascent
img = gradient_ascent(submodel, img, 1000, 100) # for conv3/dense2 with constant
#img = gradient_ascent(submodel, img, 100, 100) # for output neuron with photo

# show result
img_to_show = 1 / (img.max() - img.min()) * (img - img.min())
fig, ax = plt.subplots()
ax.imshow(img_to_show)
plt.show()
0 0.07344329 0.0003321536350995302
1 0.073513456 0.00028220663079991937
2 0.07358889 0.0003213614400010556
...
997 0.78046834 0.0009067181963473558
998 0.7811791 0.0009067181963473558
999 0.7818845 0.0009067181963473558
../../../_images/cnn-learn_23_1.png

If we maximize the output of a neuron in a convolutional layer, the result differs from the initial guess only in the region the neuron is connected to (its receptive field). All other pixels have no influence on the neuron’s output, so the corresponding components of the gradient are zero in each iteration. To see the details we crop the image. For neurons in the first convolutional layer the maximizing input is the corresponding filter.
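
For the architecture above the size of this region can be worked out layer by layer; a small sketch, assuming the kernel sizes and strides listed in the model summary:

# (kernel size, stride) of all layers between the input and conv3
layers_up_to_conv3 = [(3, 1), (3, 1), (2, 2), (3, 1)]

rf, jump = 1, 1
for k, s in layers_up_to_conv3:
    rf += (k - 1) * jump   # each kernel extends the receptive field
    jump *= s              # strided pooling enlarges the step in the input

print(rf)   # 10, so a conv3 neuron sees a 10 x 10 patch of the input image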

# mask pixels to keep when cropping
mask_r = np.abs(img_to_show[:, :, 0] - img_to_show[-1, -1, 0]) > 0.09
mask_g = np.abs(img_to_show[:, :, 1] - img_to_show[-1, -1, 1]) > 0.09
mask_b = np.abs(img_to_show[:, :, 2] - img_to_show[-1, -1, 2]) > 0.09
mask = np.logical_or(mask_r, np.logical_or(mask_g, mask_b))

# get active columns    
col_mask = mask.any(0)
bb_col_start = col_mask.argmax()
bb_col_end = img_to_show.shape[1] - 1 - col_mask[::-1].argmax()

# get active rows
row_mask = mask.any(1)
bb_row_start = row_mask.argmax()
bb_row_end = img_to_show.shape[0] - 1 - row_mask[::-1].argmax()
    
# crop image to bounding box
bb_img = img_to_show[bb_row_start:(bb_row_end + 1), bb_col_start:(bb_col_end + 1)]

# show cropped image
fig, ax = plt.subplots()
ax.imshow(bb_img, cmap='gray')
plt.show()
../../../_images/cnn-learn_25_0.png

Maximizing the output of the first output neuron modifies the initial guess to yield output 1 (the maximum value of the sigmoid activation function). That is, we obtain an image the net regards as a cat. Starting with a plain image we get somewhat artistic images. Starting with a photo of a dog we get a slightly blurred dog which the net labels as a cat. By modifying images in this way CNNs can be fooled: the CNN ‘sees’ something very different from what a human sees.

pred = model.predict(img.reshape(1, *img.shape))[0]
print('cat: {:.4f}, dog: {:.4f}'.format(pred[0], pred[1]))
1/1 [==============================] - 0s 24ms/step
cat: 0.9065, dog: 0.2174

The idea of searching for output-maximizing inputs is known as dreaming. Google’s DeepDream from 2015 uses the techniques discussed above. A similar application of dreaming CNNs is neural style transfer, which also appeared in 2015.

Maximizing Feature Maps#

Instead of maximizing single neuron outputs we could look for feature maps having high values in all components, or at least a high mean (the latter is easier to differentiate). An input image that maximizes a feature map would show a pattern that is tightly connected to the corresponding filter.
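
With \(a_{ij}(x)\) denoting the entries of the chosen \(h \times w\) feature map for an input image \(x\), the objective below is \(f(x) = \frac{1}{h w} \sum_{i,j} a_{ij}(x)\), which is exactly what reduce_mean computes.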

layer = model.get_layer('conv4')
fmap_index = 2

submodel = keras.models.Model(inputs=model.inputs,
                              outputs=tf.math.reduce_mean(layer.output[0, :, :, fmap_index]))


img = 255 * np.random.default_rng(0).normal(0.5, 0.2, size=(img_size, img_size, 3))

# parameters for gradient ascent
img = gradient_ascent(submodel, img, 1000, 1000000)

# show result
img_to_show = 1 / (img.max() - img.min()) * (img - img.min())
fig, ax = plt.subplots()
ax.imshow(img_to_show)
plt.show()
0 0.010571016 2.397079242655309e-06
1 0.012987887 2.784569460345665e-06
2 0.01555704 2.6928230454359436e-06
...
997 6.2663507 1.867201490313164e-06
998 6.2727757 1.5852081105549587e-06
999 6.2792096 1.7314188198724878e-06
../../../_images/cnn-learn_29_1.png

Class Activation Maps#

We may ask what regions of an image make the CNN ‘think’ that there is a cat or a dog. A simple approach is to pass an image through the CNN and then look at the gradient of an output neuron (cat or dog) with respect to the last convolution layer’s output. Due to local connectivity, spatial regions of a feature map are strongly related to the same spatial regions of the input image. High positive components in the gradient tell us that increasing the presence of the corresponding feature in the corresponding region would increase the chosen output neuron’s output. Strongly negative components tell us that the presence of the feature in this region lowers that output.

To get the gradient of an output neuron with respect to the outputs of a hidden layer we have to remember what TensorFlow’s automatic differentiation routines can and cannot do. TensorFlow can calculate the gradient of some function with respect to a concrete tensor flowing through the graph. Derivatives with respect to some abstract tensor (a kind of placeholder) are not accessible. So, more precisely: we want the gradient of a neuron’s output with respect to the tensor flowing out of a hidden layer when some concrete tensor is pushed through the CNN. The problem is that Keras does not provide access to such interim results by default. The solution is to create a new model with two outputs. One output is the usual output layer, the other is the hidden convolution layer of interest. This does not change the CNN’s structure, but forces Keras to provide access to the concrete tensor object coming out of the hidden layer and moving on to the next layer.

layer = model.get_layer('conv4')

submodel = keras.models.Model(inputs=model.inputs,
                              outputs=[layer.output, model.output])

Now we load an image and preprocess it as usual.

img = keras.preprocessing.image.load_img(data_path + 'unlabeled/1696.jpg', # 318, 786, 907, 1696
                                         target_size=(img_size, img_size))
img = np.asarray(img, dtype=np.float32)

We want two gradients: the gradient of the cat output neuron and the gradient of the dog output neuron, each with respect to the last convolution layer’s output. Since our model has two outputs, a prediction yields a list of two tensors.

img_tensor = tf.convert_to_tensor(img.reshape(1, img_size, img_size, 3))

# pred[0] is the conv4 output, pred[1] the model output (cat, dog)
with tf.GradientTape() as tape:
    tape.watch(img_tensor)
    pred = submodel(img_tensor)
    cat_grad = tape.gradient(pred[1][0, 0], pred[0])    # gradient of cat neuron w.r.t. conv4 output

with tf.GradientTape() as tape:
    tape.watch(img_tensor)
    pred = submodel(img_tensor)
    dog_grad = tape.gradient(pred[1][0, 1], pred[0])    # gradient of dog neuron w.r.t. conv4 output

fmaps = pred[0].numpy()[0, :, :, :]
cat_grad = cat_grad.numpy()[0, :, :, :]
dog_grad = dog_grad.numpy()[0, :, :, :]

print(fmaps.shape, cat_grad.shape, dog_grad.shape)
(58, 58, 32) (58, 58, 32) (58, 58, 32)

Now we are ready to compute the class activation map (CAM). The CAM has the same shape as a feature map in the last convolutional layer (same width and height, depth 1). It is a weighted sum of all feature maps of the last convolutional layer. The weights are calculated from the gradient by spatial averaging; thus, for each feature map the weight is something like a mean partial derivative. If the weight is positive, then the feature represented by the corresponding feature map potentially increases class activation. If the weight is negative, then the more nonzero values the feature map contains, the more the class activation is decreased.
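Written as formulas (this is essentially the Grad-CAM construction): with \(A^k\) denoting the \(k\)-th feature map of the last convolutional layer, \(y\) the output of the chosen class neuron and \(H \times W\) the spatial size of the feature maps, the weights and the CAM are

\[w_k = \frac{1}{H\,W}\sum_{i=1}^{H}\sum_{j=1}^{W}\frac{\partial y}{\partial A^k_{i,j}}, \qquad \mathrm{CAM}_{i,j} = \sum_k w_k\,A^k_{i,j}.\]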

Multiplying the mean gradients by the feature map values yields large positive numbers in regions of the input image where a class-activation-increasing feature is present, and negative values in regions where features are present that potentially decrease class activation.

We scale the CAM to \([0,1]\) such that \(0.5\) corresponds to 0 in the original CAM.

cat_weights = np.mean(cat_grad, axis=(0, 1)).reshape(1, 1, -1)
cat_cam = np.sum(fmaps * cat_weights, axis=2)
dog_weights = np.mean(dog_grad, axis=(0, 1)).reshape(1, 1, -1)
dog_cam = np.sum(fmaps * dog_weights, axis=2)

fac = np.maximum(np.max(np.abs(cat_cam)), np.max(np.abs(dog_cam)))
cat_cam = 0.5 * (1 + cat_cam / fac)
dog_cam = 0.5 * (1 + dog_cam / fac)

print('cat: {:.2f}, dog: {:.2f}'.format(pred[1][0, 0], pred[1][0, 1]))

fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=(12, 6))
ax1.imshow(cat_cam, cmap='gray', vmin=0, vmax=1)
ax2.imshow(img / 255)
ax3.imshow(dog_cam, cmap='gray', vmin=0, vmax=1)
ax1.set_title('cat activation map')
ax3.set_title('dog activation map')
plt.show()
cat: 0.12, dog: 1.00
../../../_images/cnn-learn_37_1.png

For better visual interpretation we overlay the original image with the CAM. This is often done in a sloppy way by simply resizing the CAM to image size. Here we take the harder but correct route. The difficult part is to find the image region associated with each value in the CAM. Going backwards through the CNN’s layers we have to calculate size and position of the region of interest (ROI) for each component of the CAM.

A pixel in the conv4 feature map results from a convolution with a 3x3 filter, so a 3x3 region of the previous layer’s output is the preimage of this pixel. One layer up we have a 5x5 region (convolution with a 3x3 filter again). Then there is a 2x2 pooling layer, so before pooling the ROI is 10x10. Two more 3x3 convolutions finally yield a 14x14 ROI in the input image.

The CAM is 58x58, the original image is 128x128. The centers of all ROIs have to be placed equally spaced in the 128x128 image with a 7-pixel boundary; otherwise some ROIs would partially lie outside the image. The distance between ROI centers is \((128-14)/57=2\) pixels.
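The same bookkeeping can be done programmatically. The following small check (illustrative only, not part of the pipeline above) walks backwards from conv4 through the preceding layers and recovers the 14x14 ROI and the 2-pixel spacing:

# (layer, kernel/pool size, stride), listed from conv4 backwards to conv1
layers = [('conv4', 3, 1), ('conv3', 3, 1), ('pool1', 2, 2),
          ('conv2', 3, 1), ('conv1', 3, 1)]

size, stride = 1, 1
for name, k, s in layers:
    size = (size - 1) * s + k    # standard receptive field recursion
    stride = stride * s

print(size, stride)    # 14 2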

With this knowledge we create a stack of images, one image per CAM component, each containing the CAM component’s value in all pixels belonging to the component’s ROI. Then we merge the stack by taking the pixelwise mean. Here we have to take into account that pixels near the boundary belong to fewer ROIs than pixels in the image center.

To overlay CAM image and original image we use a color map with blue for negative CAM values, gray for zero and red for positive CAM values.

def cam_to_img(cam):

    cam_size = cam.shape[0]
    roi_size = 14    # side length of the ROI of one CAM component
    roi_gap = 2      # distance between neighboring ROI centers
    roi = np.zeros((img_size, img_size, cam_size * cam_size))
    mask = np.full(roi.shape, 0)    # counts how many ROIs cover each pixel
    for i in range(0, cam_size):
        for j in range(0, cam_size):
            first_i = roi_gap * i
            last_i = first_i + roi_size
            first_j = roi_gap * j
            last_j = first_j + roi_size
            # write the CAM component's value into all pixels of its ROI
            roi[first_i:last_i, first_j:last_j, i * cam_size + j] = cam[i, j]
            mask[first_i:last_i, first_j:last_j, i * cam_size + j] = 1

    # pixelwise mean over all ROIs covering a pixel
    return roi.sum(axis=2) / mask.sum(axis=2)

def mix_images(gray, color):

    # overlay: a faint grayscale version of the color image plus a
    # blue-gray-red encoding of the CAM image (blue for 0, gray for 0.5, red for 1)
    result = np.empty((img_size, img_size, 3))
    result[:, :, 0] = 0.1 * color.mean(axis=2)
    result[:, :, 1] = result[:, :, 0]
    result[:, :, 2] = result[:, :, 0]
    result[:, :, 0] = result[:, :, 0] + 0.89 * gray
    result[:, :, 1] = result[:, :, 1] + 0.89 * (0.5 - np.abs(gray - 0.5))
    result[:, :, 2] = result[:, :, 2] + 0.89 * (1 - gray)

    return result

cat_img = cam_to_img(cat_cam)
dog_img = cam_to_img(dog_cam)

cat_mix = mix_images(cat_img, img / 255)
dog_mix = mix_images(dog_img, img / 255)

fig, [[ax1, ax2], [ax3, ax4]] = plt.subplots(2, 2, figsize=(12, 12))
ax1.imshow(cat_img, cmap='gray', vmin=0, vmax=1)
ax2.imshow(dog_img, cmap='gray', vmin=0, vmax=1)
ax3.imshow(cat_mix)
ax4.imshow(dog_mix)
ax1.set_title('cat activation map')
ax2.set_title('dog activation map')
plt.show()
../../../_images/cnn-learn_40_0.png