Category: Tips


  • Hyperparameter Tuning with Keras Tuner


    • Use Keras Tuner to find the optimal hyperparameters.

      ```python
      from keras_tuner import HyperModel, RandomSearch  # pip install keras-tuner
      from tensorflow.keras.models import Sequential
      from tensorflow.keras.layers import Dense

      class MyHyperModel(HyperModel):
          def build(self, hp):
              model = Sequential()
              # Search over the width of the hidden layer.
              model.add(Dense(units=hp.Int('units', min_value=32, max_value=512, step=32),
                              activation='relu'))
              model.add(Dense(10, activation='softmax'))
              model.compile(optimizer='adam',
                            loss='categorical_crossentropy',
                            metrics=['accuracy'])
              return model

      tuner = RandomSearch(MyHyperModel(), objective='val_accuracy', max_trials=10)
      tuner.search(X_train, y_train, epochs=10, validation_data=(X_val, y_val))
      ```
    • Keras Tuner automates the search for optimal architecture and hyperparameters.
  • Using Mixed Precision Training

    • For large models or limited hardware, mixed precision training can improve performance by using 16-bit floats.

      ```python
      from tensorflow.keras import mixed_precision

      # Compute in float16 while keeping variables in float32.
      # (The older mixed_precision.experimental API is deprecated.)
      mixed_precision.set_global_policy('mixed_float16')
      ```
    • This helps leverage the speed and memory benefits of reduced precision without losing much accuracy.
  • Handling Class Imbalance

    • If you have imbalanced data (e.g., more samples of one class), adjust the class weights or use oversampling.

      ```python
      # Weight errors on class 1 fifty times more heavily than on class 0.
      class_weights = {0: 1., 1: 50.}
      model.fit(X_train, y_train, class_weight=class_weights)
      ```
    • Class weights make the model pay more attention to underrepresented classes.
  • Monitor Overfitting with EarlyStopping

    • Use EarlyStopping to monitor metrics and stop training once overfitting is detected.

      ```python
      from tensorflow.keras.callbacks import EarlyStopping

      early_stop = EarlyStopping(monitor='val_loss', patience=5,
                                 restore_best_weights=True)
      model.fit(X_train, y_train, validation_data=(X_val, y_val),
                epochs=50, callbacks=[early_stop])
      ```
    • This is an essential tool to stop training once the model starts overfitting, saving time and resources.
  • Custom Loss Functions

    • You can create custom loss functions to suit specific tasks.

      ```python
      import tensorflow.keras.backend as K

      def custom_loss(y_true, y_pred):
          # Squared error, down-weighted for targets with large magnitude.
          return K.mean(K.square(y_pred - y_true) * K.exp(-K.abs(y_true)))

      model.compile(optimizer='adam', loss=custom_loss)
      ```
    • Custom loss functions are useful when default loss functions don’t meet the exact requirements of the task.
  • Multi-Output Models

    • Build models that predict more than one output by attaching multiple heads to the last shared layer.

      ```python
      from tensorflow.keras.models import Model
      from tensorflow.keras.layers import Dense, Input

      input_layer = Input(shape=(20,))
      x = Dense(128, activation='relu')(input_layer)

      # Output 1: 10-class classification head
      output1 = Dense(10, activation='softmax', name='output_1')(x)
      # Output 2: scalar regression head
      output2 = Dense(1, activation='linear', name='output_2')(x)

      model = Model(inputs=input_layer, outputs=[output1, output2])
      # Map losses and metrics to outputs by name, so each head
      # gets the appropriate loss and metric.
      model.compile(optimizer='adam',
                    loss={'output_1': 'categorical_crossentropy', 'output_2': 'mse'},
                    metrics={'output_1': 'accuracy', 'output_2': 'mae'})
      ```
    • This setup is useful for multi-task learning where you have a combination of classification and regression tasks.
  • Data Augmentation for Tabular Data

    • If you’re working with tabular data, augmenting it can improve performance.
      • SMOTE (Synthetic Minority Oversampling Technique) can generate new samples for imbalanced datasets.
      • Noise injection can also create variety in your training data.
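      A minimal sketch of the noise-injection idea (the helper name and noise scale below are illustrative, not a library API; for SMOTE itself, the `SMOTE` class from the imbalanced-learn package, `imblearn.over_sampling`, is the usual tool):

      ```python
      import numpy as np

      rng = np.random.default_rng(42)

      # Toy tabular training set: 100 rows, 5 numeric features.
      X_train = rng.normal(size=(100, 5))
      y_train = rng.integers(0, 2, size=100)

      def augment_with_noise(X, y, copies=2, noise_std=0.05):
          """Create noisy copies of each row, with Gaussian noise
          scaled to each feature's standard deviation."""
          scale = X.std(axis=0) * noise_std
          X_aug, y_aug = [X], [y]
          for _ in range(copies):
              X_aug.append(X + rng.normal(scale=scale, size=X.shape))
              y_aug.append(y)
          return np.vstack(X_aug), np.concatenate(y_aug)

      X_big, y_big = augment_with_noise(X_train, y_train)
      print(X_big.shape)  # (300, 5): original rows plus two noisy copies
      ```

      Keeping the noise small relative to each feature's spread adds variety without changing the labels' meaning.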
  • Model Checkpoints with Conditions

    • Save the model based on the best metric, such as validation accuracy or validation loss, at different points in training.

      ```python
      from tensorflow.keras.callbacks import ModelCheckpoint

      checkpoint = ModelCheckpoint('best_model.h5', monitor='val_accuracy',
                                   save_best_only=True, mode='max')
      model.fit(X_train, y_train, validation_data=(X_val, y_val),
                epochs=50, callbacks=[checkpoint])
      ```
    • This keeps only the best-performing weights seen so far, so a later drop in performance never overwrites a good model.
  • Custom Callbacks

    • Create custom callbacks to execute functions during training at specific events, like the end of each epoch.

      ```python
      from tensorflow.keras.callbacks import Callback

      class CustomCallback(Callback):
          def on_epoch_end(self, epoch, logs=None):
              logs = logs or {}  # logs may be None; avoid a KeyError
              print(f"End of epoch {epoch}. Validation Loss: {logs.get('val_loss')}")

      model.fit(X_train, y_train, epochs=10,
                validation_data=(X_val, y_val), callbacks=[CustomCallback()])
      ```
    • This is useful for tracking custom metrics or making decisions during training.
  • Regularization Techniques (L1/L2/Dropout)

    • Prevent overfitting by adding L1/L2 regularization or dropout.

      ```python
      from tensorflow.keras.models import Sequential
      from tensorflow.keras.layers import Dense, Dropout
      from tensorflow.keras.regularizers import l2

      model = Sequential([
          Dense(128, activation='relu', kernel_regularizer=l2(0.001),
                input_shape=(20,)),
          Dropout(0.5),
          Dense(64, activation='relu'),
          Dropout(0.5),
          Dense(1)
      ])
      model.compile(optimizer='adam', loss='mse')
      ```
    • Dropout randomly drops a fraction of the neurons, making the model less dependent on specific paths and helping to generalize better.