Machine Learning basics (part 4)

Hang Nguyen
3 min readApr 13, 2022


In this part, we will get acquainted with these concepts: expected value, entropy, odds and the odds ratio, and the main idea of fitting a line to data.

Expected value

The expected value describes the return you can expect, on average, from some action. It is calculated by summing each possible outcome multiplied by its probability.
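The calculation above can be sketched in a few lines of Python. The die roll here is a hypothetical example, chosen only to illustrate the formula:

```python
# Expected value of a discrete random variable:
# E[X] = sum over all outcomes of (outcome * probability).
# Hypothetical example: a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6  # each face is equally likely

expected_value = sum(x * p for x, p in zip(outcomes, probabilities))
print(expected_value)  # 3.5
```

Note that 3.5 is not a value the die can actually show; the expected value is a long-run average, not a guaranteed outcome.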

Entropy

Entropy is used for many things in Data Science, such as building Classification Trees and serving as the basis of Mutual Information (which quantifies the relationship between two variables). What these applications have in common is that Entropy is used to quantify similarities and differences.

Example: Let’s assume we have a coin with two sides: Heads and Tails. The probability of getting Heads is 0.9, while the probability of getting Tails is 0.1. Let’s flip this coin 3 times and say we get Heads, Heads, Tails.

The probability of that particular sequence of 2 heads and 1 tail is: 0.9 * 0.9 * 0.1 = 0.081

[Figure: the inverse relationship between surprise and probability]
[Figure: the surprise of getting 2 heads and 1 tail]
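A short sketch can make these ideas concrete. Here surprise is computed as log2(1/p), the standard information-theoretic definition, and entropy as the expected surprise of a single flip; the probabilities are the 0.9/0.1 coin from the example above:

```python
import math

def surprise(p):
    """Surprise of an outcome with probability p: log2(1 / p).
    Rare outcomes (small p) are very surprising; certain ones are not."""
    return math.log2(1 / p)

p_heads, p_tails = 0.9, 0.1

# Probability of the specific sequence Heads, Heads, Tails:
p_sequence = p_heads * p_heads * p_tails  # 0.9 * 0.9 * 0.1 = 0.081

# The surprise of the whole sequence is the sum of per-flip surprises,
# which equals the surprise of the sequence's probability.
total_surprise = surprise(p_heads) + surprise(p_heads) + surprise(p_tails)

# Entropy = expected surprise of one flip = sum over outcomes of p * log2(1/p).
entropy = p_heads * surprise(p_heads) + p_tails * surprise(p_tails)
print(round(entropy, 3))  # ~0.469 bits
```

Because the coin is heavily biased toward Heads, its entropy (about 0.469 bits) is well below the 1 bit of a fair coin: the outcome of each flip is, on average, not very surprising.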

