Hyperparameters
Hyperparameters are configuration settings that define a machine learning model’s learning process and architecture. Unlike model parameters (such as weights and biases), which are learned during training, hyperparameters are set before training begins and remain unchanged throughout. Selecting the right hyperparameters is crucial, as they significantly impact model performance—especially in complex tasks like computer vision.
QpiAI Pro offers robust support for configuring a wide range of hyperparameters, such as the learning rate. This enables users to fine-tune model behavior with precision, optimizing performance to suit specific application needs. The platform’s flexibility empowers users to experiment, iterate, and achieve the best results for their use cases.
Object detection:
- Epochs: The default is set to 100, but you can customize this value anywhere between 1 and 100. Adjusting the number of epochs allows you to fine-tune the training duration based on your dataset’s size, complexity, and the level of model convergence you aim to achieve.
- Batch Size: Determines how many samples are processed before the model updates its parameters. You can select a value between 1 and 10 to balance training speed against memory usage.
- Learning Rate: Controls how quickly the model updates during training. A lower value ensures stable but slower learning, while a higher value speeds up training but may cause instability. Here you can set both a minimum (-) and maximum (+) learning rate to define the range within which the model adjusts dynamically for optimal convergence.
- Num Samples: The Num Samples parameter defines the number of trials (or experiments) that will be run during hyperparameter tuning. Each trial represents a distinct configuration of hyperparameters being tested. For example, if you set Num Samples = 10, the tuning process will train 10 separate models with different hyperparameter combinations sampled from your defined search space. This allows you to explore a broader range of potential configurations and increases the chances of finding optimal hyperparameter values.
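The tuning loop described above can be sketched as a simple random search. This is an illustrative sketch, not QpiAI Pro's internal implementation: the search-space ranges mirror the object-detection settings listed here, while the `objective` function and `random_search` helper are assumed names for demonstration.

```python
import math
import random

# Hypothetical search space mirroring the object-detection ranges above.
SEARCH_SPACE = {
    "epochs": (1, 100),             # integer range
    "batch_size": (1, 10),          # integer range
    "learning_rate": (1e-4, 1e-2),  # sampled log-uniformly (assumed bounds)
}

def sample_config(rng):
    """Draw one hyperparameter configuration from the search space."""
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "epochs": rng.randint(*SEARCH_SPACE["epochs"]),
        "batch_size": rng.randint(*SEARCH_SPACE["batch_size"]),
        # Log-uniform sampling spreads trials evenly across orders of magnitude.
        "learning_rate": 10 ** rng.uniform(math.log10(lo), math.log10(hi)),
    }

def random_search(objective, num_samples, seed=0):
    """Run `num_samples` independent trials; return the best (score, config)."""
    rng = random.Random(seed)
    best = None
    for _ in range(num_samples):
        config = sample_config(rng)
        score = objective(config)  # higher is better in this sketch
        if best is None or score > best[0]:
            best = (score, config)
    return best
```

In practice `objective` would train and validate a model with the given configuration; raising `num_samples` widens the search at the cost of more trials, which is exactly the trade-off the parameter controls.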
Segmentation options:
- Epochs: Defines how many times the model sees the entire training dataset. Too few may lead to underfitting, while too many can cause overfitting. The default is set to 500, with a customizable range from 300 to 600, allowing you to fine-tune training based on your dataset and model needs.
- Batch Size: Determines how many samples are processed before the model updates its parameters. Larger batch sizes speed up training but may reduce accuracy and increase resource use, while smaller sizes improve accuracy at the cost of longer training times. You can set the batch size between 2 and 10.
- Min Learning Rate: Sets the lower limit for the learning rate, enabling fine-grained updates as the model nears optimal solutions. This helps maintain stability and avoid overshooting. In QpiAI Pro, for segmentation models, users can select a value between 0.0005 and 0.0006.
- Max Learning Rate: The maximum learning rate sets the upper bound of the learning rate range for hyperparameter tuning or scheduling. It represents the largest step size for model parameter updates, enabling efficient exploration of the loss landscape early in training while avoiding training instability. In QpiAI Pro, for segmentation models, users can select a value anywhere between 0.0006 and 0.0007 for the maximum learning rate.
- Num Samples: The Num Samples parameter in QpiAI Pro, which can be set between 1 and 10, determines the number of trials (or experiments) conducted during the hyperparameter tuning process. Each trial tests a different hyperparameter configuration, allowing for a wider exploration of potential configurations and increasing the likelihood of finding optimal values. While higher values offer a more thorough search, they also demand more computational resources and time.
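One common way a min/max learning rate pair is used in training (a minimal sketch, not QpiAI Pro's documented implementation) is a triangular schedule: the rate ramps linearly from the minimum to the maximum at the midpoint of training, then decays back, so every update stays inside the configured range. The `triangular_lr` function and the choice of schedule shape are illustrative assumptions; only the 0.0005 and 0.0007 bounds come from the settings above.

```python
# Segmentation bounds from the settings above (schedule shape is an assumption).
MIN_LR = 0.0005
MAX_LR = 0.0007

def triangular_lr(step, total_steps, min_lr=MIN_LR, max_lr=MAX_LR):
    """Learning rate at `step`: linear ramp to max_lr at the midpoint,
    then a symmetric linear decay back to min_lr."""
    half = total_steps / 2
    # Fraction of the way toward the peak: 0 at the endpoints, 1 at the midpoint.
    frac = step / half if step <= half else (total_steps - step) / half
    return min_lr + (max_lr - min_lr) * frac
```

Bounding the schedule this way gives the large early steps that help explore the loss landscape while guaranteeing the fine-grained updates near the end that the Min Learning Rate setting is meant to provide.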