Neural activation function: Softmax. softmax(xᵢ) = exp(xᵢ) / Σⱼ exp(xⱼ). Used for multi-class classification; it outputs a probability distribution over the classes.
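A minimal sketch of this formula in Python with NumPy (the library choice and the helper name softmax are illustrative, not from the post); subtracting the max is a standard numerical-stability trick and does not change the result:

```python
import numpy as np

def softmax(x):
    # Shift by the max so exp() cannot overflow; the ratio is unchanged.
    exps = np.exp(x - np.max(x))
    return exps / exps.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))  # ~[0.090, 0.245, 0.665], sums to 1
```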
Neural activation function: ReLU. ReLU(x) = max(0, x). The most popular choice; it is computationally efficient and helps with the vanishing gradient problem.
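For concreteness, a one-line NumPy version (the helper name relu is assumed for illustration):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives become 0, positives pass through unchanged.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```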
Neural activation function: Tanh. tanh(x) = (eˣ - e⁻ˣ) / (eˣ + e⁻ˣ). Output range: (-1, 1). Zero-centered, which often makes it a better choice than sigmoid.
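A direct transcription of the formula as a sketch (NumPy also ships np.tanh, which computes the same thing more robustly):

```python
import numpy as np

def tanh(x):
    # (e^x - e^-x) / (e^x + e^-x): bounded in (-1, 1) and zero-centered.
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(np.array([-1.0, 0.0, 1.0])))  # ~[-0.762, 0.0, 0.762]
```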
Neural activation function: Sigmoid. σ(x) = 1 / (1 + e⁻ˣ). Output range: (0, 1). Used for binary classification and probability outputs.
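And a sigmoid sketch in the same style (σ written as sigmoid here; again a NumPy-based illustration rather than anything named in the post):

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^-x): squashes any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # ~[0.119, 0.5, 0.881]
```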
Roman numerals. The numeral for four, however, is not IIII; instead, four is written as IV.
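To show this subtractive rule in practice, a small hypothetical converter (the name to_roman and the greedy approach are illustrative, not from the post):

```python
def to_roman(n):
    # Value/numeral pairs include the subtractive forms (4 -> IV, 9 -> IX, 40 -> XL, ...).
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, numeral in pairs:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

print(to_roman(4))     # IV, not IIII
print(to_roman(2024))  # MMXXIV
```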