
Keras: Class Weights (class_weight) For One-hot Encoding

I'd like to use the class_weight argument in Keras model.fit to handle imbalanced training data. From the documentation, I understand we can pass a dictionary mapping each class index to a weight. But how does that work when y is one-hot encoded?

Solution 1:

Here's a solution that's a bit shorter and faster. If your one-hot encoded y is an np.array:

import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Convert the one-hot rows back to integer class labels
y_integers = np.argmax(y, axis=1)
# Newer scikit-learn versions require keyword arguments here
class_weights = compute_class_weight(class_weight='balanced', classes=np.unique(y_integers), y=y_integers)
d_class_weights = dict(enumerate(class_weights))

d_class_weights can then be passed to class_weight in .fit.
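For example (a hedged sketch: model, x, and the hyperparameters are placeholders for your own compiled model and data):

model.fit(x, y, epochs=10, batch_size=32, class_weight=d_class_weights)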

Solution 2:

Alternatively, you can use sample_weight instead. Internally, Keras converts class_weight into per-sample weights anyway:

sample_weight: optional array of the same length as x, containing weights to apply to the model's loss for each sample. In the case of temporal data, you can pass a 2D array with shape (samples, sequence_length), to apply a different weight to every timestep of every sample. In this case you should make sure to specify sample_weight_mode="temporal" in compile().

https://github.com/fchollet/keras/blob/d89afdfd82e6e27b850d910890f4a4059ddea331/keras/engine/training.py#L1392
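As a minimal sketch of this approach (assuming one-hot targets y as an np.array and an already-compiled model; x, model, and the epoch count are placeholders):

import numpy as np
from sklearn.utils.class_weight import compute_sample_weight

# Recover integer class labels from the one-hot rows
y_integers = np.argmax(y, axis=1)

# One 'balanced' weight per training sample
sample_weights = compute_sample_weight(class_weight='balanced', y=y_integers)

model.fit(x, y, sample_weight=sample_weights, epochs=10)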

Solution 3:

A somewhat convoluted answer, but the best I've found so far. This assumes your data is one-hot encoded and multi-class, and that you are working only on a labels DataFrame df_y:

import pandas as pd
import numpy as np

# Create a pd.Series holding the categorical class of each one-hot encoded row
y_classes = df_y.idxmax(axis=1, skipna=False)

from sklearn.preprocessing import LabelEncoder

# Instantiate the label encoder
le = LabelEncoder()

# Fit the label encoder to our label series
le.fit(list(y_classes))

# Create integer based labels Series
y_integers = le.transform(list(y_classes))

# Create dict of labels : integer representation
labels_and_integers = dict(zip(y_classes, y_integers))

from sklearn.utils.class_weight import compute_class_weight, compute_sample_weight

class_weights = compute_class_weight(class_weight='balanced', classes=np.unique(y_integers), y=y_integers)
sample_weights = compute_sample_weight('balanced', y_integers)

class_weights_dict = dict(zip(le.transform(list(le.classes_)), class_weights))

This produces a sample_weights vector that balances an imbalanced dataset and can be passed to Keras's sample_weight argument, as well as a class_weights_dict that can be fed to the class_weight argument of the .fit method. Use one or the other, not both. I'm using class_weight at the moment because getting sample_weight to work with fit_generator is complicated.
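For illustration, either result plugs into .fit as follows; model, x, and the epoch count are placeholders, and the two keyword arguments are alternatives, not companions:

# Option A: per-class weighting
model.fit(x, df_y.values, class_weight=class_weights_dict, epochs=10)

# Option B: per-sample weighting, with the same balancing effect
model.fit(x, df_y.values, sample_weight=sample_weights, epochs=10)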

Solution 4:

In _standardize_weights, Keras does the following:

if y.shape[1] > 1:
    y_classes = y.argmax(axis=1)

So if you choose to use one-hot encoding, the class labels are simply the column indices.

You may also wonder how to map the column indices back to the original classes of your data. If you use scikit-learn's LabelBinarizer to perform the one-hot encoding, the column order follows the ordered array of unique labels computed by the .fit function. The doc says

Extract an ordered array of unique labels

Example:

from sklearn.preprocessing import LabelBinarizer

y = [4, 1, 2, 8]
l = LabelBinarizer()
y_transformed = l.fit_transform(y)
y_transformed
> array([[0, 0, 1, 0],
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1]])
l.classes_
> array([1, 2, 4, 8])

In conclusion, the keys of the class_weight dictionary should reflect the order of the classes_ attribute of the encoder.
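For instance, here is one way to tie the two pieces together; this is a sketch under the assumption that the one-hot columns come from a LabelBinarizer, and the variable names are illustrative:

import numpy as np
from sklearn.preprocessing import LabelBinarizer
from sklearn.utils.class_weight import compute_class_weight

y = [4, 1, 2, 8, 4, 4]              # raw labels, deliberately imbalanced
lb = LabelBinarizer()
y_one_hot = lb.fit_transform(y)     # column i corresponds to lb.classes_[i] = [1, 2, 4, 8][i]

# np.unique(y) returns the same sorted order as lb.classes_,
# so enumerate() keys the weights by column index, as Keras expects
weights = compute_class_weight(class_weight='balanced',
                               classes=np.unique(y), y=np.asarray(y))
class_weight_dict = dict(enumerate(weights))
# {0: 1.5, 1: 1.5, 2: 0.5, 3: 1.5} for this toy example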
