Cross-Entropy Loss
Cross-Entropy Loss is a loss function that measures how well a model's predicted probability distribution matches the true distribution of the labels. For a one-hot true label y and predicted class probabilities p, the loss for a single example is -Σᵢ yᵢ log(pᵢ), which reduces to the negative log-probability the model assigned to the correct class. In classification tasks, this penalizes confident predictions that disagree with the actual outcome far more heavily than uncertain ones, so minimizing it pushes the model toward well-calibrated confidence. Just as Mean Squared Error serves as the standard objective in regression, Cross-Entropy Loss is the standard objective for optimizing classification models.
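The definition above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation (frameworks typically compute the loss from raw logits for numerical stability); the function name and the small clipping epsilon are choices made here for the example.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean cross-entropy over a batch.

    y_true: one-hot labels, shape (batch, classes)
    y_pred: predicted probabilities, shape (batch, classes)
    """
    y_pred = np.clip(y_pred, eps, 1.0)          # avoid log(0)
    # Per-example loss is -sum_i y_i * log(p_i); average over the batch.
    return -np.sum(y_true * np.log(y_pred)) / y_true.shape[0]

y_true = np.array([[1.0, 0.0, 0.0]])

confident_correct = cross_entropy(y_true, np.array([[0.9, 0.05, 0.05]]))
confident_wrong = cross_entropy(y_true, np.array([[0.1, 0.8, 0.1]]))
```

A confident correct prediction yields a loss near -log(0.9) ≈ 0.105, while a confident wrong prediction yields about -log(0.1) ≈ 2.303, showing how sharply the loss punishes misplaced confidence.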