Weighted Precision and Recall Equation


The “weighted” precision or recall score in scikit-learn is defined as:

$$
\frac{1}{\sum_{l\in \color{cyan}{L}} |\color{green}{\hat{y}}_l|}
\sum_{l \in \color{cyan}{L}}
|\color{green}{\hat{y}}_l|
\phi(\color{magenta}{y}_l, \color{green}{\hat{y}}_l)
$$

  • \(\color{cyan}{L}\) is the set of labels
  • \(\color{green}{\hat{y}}\) is the set of true labels
  • \(\color{magenta}{y}\) is the set of predicted labels
  • \(\color{green}{\hat{y}}_l\) is the subset of true labels that equal \(l\)
  • \(|\color{green}{\hat{y}}_l|\) is the number of true labels that equal \(l\) (the support of \(l\))
  • \(\phi(\color{magenta}{y}_l, \color{green}{\hat{y}}_l)\) computes the precision or recall for label \(l\). To compute precision, let \(\phi(A,B) = \frac{|A \cap B|}{|A|}\); to compute recall, let \(\phi(A,B) = \frac{|A \cap B|}{|B|}\).
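
To make this concrete, here is a minimal sketch that computes the weighted scores by hand for a multiclass problem and checks them against scikit-learn’s precision_score and recall_score with average="weighted". The toy labels are made up for illustration, and the zero_division argument assumes a reasonably recent scikit-learn:

import numpy as np
from sklearn.metrics import precision_score, recall_score

# Toy multiclass labels (made up for illustration)
y_true = np.array([0, 1, 2, 0, 1, 2])   # the true labels (ŷ in the formula)
y_pred = np.array([0, 2, 1, 0, 0, 1])   # the predicted labels (y in the formula)

def weighted_score(y_true, y_pred, metric="precision"):
    labels = np.unique(np.concatenate([y_true, y_pred]))
    total_support = 0
    weighted_sum = 0.0
    for l in labels:
        support = np.sum(y_true == l)                # |ŷ_l|, true labels equal to l
        tp = np.sum((y_true == l) & (y_pred == l))   # |y_l ∩ ŷ_l|
        # phi divides by |predicted| for precision, |true| for recall
        denom = np.sum(y_pred == l) if metric == "precision" else support
        phi = tp / denom if denom > 0 else 0.0
        weighted_sum += support * phi                # weight each label's score by its support
        total_support += support
    return weighted_sum / total_support

print(weighted_score(y_true, y_pred, "precision"))                           # 0.2222...
print(precision_score(y_true, y_pred, average="weighted", zero_division=0))  # 0.2222...
print(weighted_score(y_true, y_pred, "recall"))                              # 0.3333...
print(recall_score(y_true, y_pred, average="weighted", zero_division=0))     # 0.3333...

The hand-rolled loop matches scikit-learn because "weighted" averaging weights each label’s score by its support, \(|\color{green}{\hat{y}}_l|\).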

How Are Weighted Precision and Recall Calculated?

Let’s break this apart a bit more.

What is the derivative of ReLU?
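
For reference, the short answer:

$$
\frac{d}{dx}\,\mathrm{ReLU}(x) =
\begin{cases}
0 & x < 0 \\
1 & x > 0
\end{cases}
$$

The derivative is undefined at \(x = 0\); in practice, libraries pick a subgradient value there, typically 0.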

How to Compute the Derivative of a Sigmoid Function (fully worked example)

This is a sigmoid function:

$$
s(x) = \frac{1}{1 + e^{-x}}
$$

The sigmoid function looks like this (made with a bit of MATLAB code):

x = -10:0.1:10;                        % sample points from -10 to 10 in steps of 0.1
s = 1 ./ (1 + exp(-x));                % elementwise sigmoid
figure; plot(x, s); title('sigmoid');  % plot the curve

[Figure: plot of the sigmoid curve, titled “sigmoid”]

Alright, now let’s put on our calculus hats…
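
For reference, the identity the derivation arrives at, via the chain rule applied to \(s(x) = (1 + e^{-x})^{-1}\), is:

$$
s'(x) = \frac{e^{-x}}{\left(1 + e^{-x}\right)^2} = s(x)\bigl(1 - s(x)\bigr)
$$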