Change Keras Regularizer During Training / Dynamic Regularization
I am looking for a proper way to implement dynamic regularization of a layer's weights during training. As an example, after 10 calls I want to replace the L2 regularization with an L1 regularization.
Solution 1:
We can frame the problem not as swapping the regularizer from L2 to L1, but as changing the parameters of a single L1L2 regularizer, so that in principle the regularization can be changed at any point during training.
The hyperparameters cannot be changed after the model is compiled (unless you write a custom training loop), because they are baked into the training function at compile time.
To modify hyperparameters during training, the trick is to store them in backend variables that live inside the training function and to update those variables as training progresses.
So we could define the following custom regularizer:
from tensorflow.keras import backend as K
from tensorflow.keras.regularizers import Regularizer

class L1L2_m(Regularizer):
    """Regularizer for L1 and L2 regularization whose factors can be changed during training.

    # Arguments
        l1: Float; L1 regularization factor.
        l2: Float; L2 regularization factor.
    """

    def __init__(self, l1=0.0, l2=0.01):
        with K.name_scope(self.__class__.__name__):
            # backend variables, so the factors can be updated after compile
            self.l1 = K.variable(l1, name='l1')
            self.l2 = K.variable(l2, name='l2')
        self.val_l1 = l1
        self.val_l2 = l2

    def set_l1_l2(self, l1, l2):
        # assign new values to the backend variables in place
        K.set_value(self.l1, l1)
        K.set_value(self.l2, l2)
        self.val_l1 = l1
        self.val_l2 = l2

    def __call__(self, x):
        regularization = 0.
        # Note: these Python-level checks are evaluated when the loss is built,
        # so a factor that starts at exactly 0 may stay excluded from a compiled
        # training function; start from a small non-zero value (or run eagerly)
        # for any term you plan to enable later.
        if self.val_l1 > 0.:
            regularization += K.sum(self.l1 * K.abs(x))
        if self.val_l2 > 0.:
            regularization += K.sum(self.l2 * K.square(x))
        return regularization

    def get_config(self):
        return {'l1': float(K.get_value(self.l1)),
                'l2': float(K.get_value(self.l2))}
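As a quick eager-mode sanity check (not part of the original answer), you can verify that set_l1_l2 changes the penalty in place without rebuilding anything:

import tensorflow as tf

reg = L1L2_m(l1=0.0, l2=0.01)
w = tf.ones((3,)) * 2.0            # toy weight tensor
print(float(reg(w)))               # 0.01 * (4 + 4 + 4) = 0.12, pure L2 penalty
reg.set_l1_l2(0.05, 0.0)           # switch to L1 on the fly
print(float(reg(w)))               # 0.05 * (2 + 2 + 2) = 0.30, pure L1 penalty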
Add the class to the custom objects so that you won't run into issues reloading the model after exporting it:
from tensorflow.keras.utils import get_custom_objects

get_custom_objects().update({L1L2_m.__name__: L1L2_m})
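A hypothetical round trip (the tiny model and file name below are placeholders, not from the original answer) to confirm that this registration lets load_model rebuild the regularizer from its get_config() without an explicit custom_objects argument:

import tensorflow as tf

# toy model with an L1L2_m-regularized layer
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(8,), kernel_regularizer=L1L2_m()),
])
model.save('model_with_l1l2m.h5')                              # hypothetical file name
reloaded = tf.keras.models.load_model('model_with_l1l2m.h5')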
Update the variables during training using the custom regularizer's set_l1_l2 method:
class MyLayer(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # ... any other setup ...

    def build(self, input_shape):
        # this regularizer's factors will be changed after some steps
        self.regularizer = L1L2_m()
        self.my_weights = self.add_weight(
            name='myweights', shape=(input_shape[-1],),
            initializer=tf.keras.initializers.Constant(1.),
            regularizer=self.regularizer, trainable=True)
        # non-trainable step counter
        self.counter = tf.Variable(0, dtype=tf.int32, trainable=False)

    def call(self, inputs):
        outputs = inputs * self.my_weights  # placeholder for the layer's real processing
        # switch from L2 to L1 after 10 calls (this works when running eagerly;
        # for a compiled training loop, see the callback sketch below)
        if self.counter == 10:
            self.regularizer.set_l1_l2(0.01, 0.)
        self.counter.assign_add(1)
        return outputs
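Alternatively, if you would rather not keep a counter inside the layer, a Keras callback can perform the same switch from outside the training step, as long as it holds a reference to the same L1L2_m instance attached to the layer. A minimal sketch; the RegularizerSwitch name and switch_batch parameter are my own, not from the original answer:

import tensorflow as tf

class RegularizerSwitch(tf.keras.callbacks.Callback):
    """Switch a shared L1L2_m instance from L2 to L1 after a given number of batches."""

    def __init__(self, regularizer, switch_batch=10):
        super().__init__()
        self.regularizer = regularizer
        self.switch_batch = switch_batch
        self.seen = 0

    def on_train_batch_end(self, batch, logs=None):
        self.seen += 1
        if self.seen == self.switch_batch:
            self.regularizer.set_l1_l2(0.01, 0.)  # L2 -> L1

Because set_l1_l2 only assigns new values to the backend variables, this works with a plain model.fit loop: create one L1L2_m instance, pass it as the layer's regularizer, and pass RegularizerSwitch with that same instance in the callbacks list.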