Important Terminologies in ML/AI

Entropy: A measure of the disorder / uncertainty / randomness in the information being processed. In machine learning our goal is normally to reduce it. Mathematical formula of entropy (sometimes also represented as H):

\begin{equation} E(S) = -\sum_{i=1}^{c} p_i \, \log_2 p_i \end{equation}
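As a quick illustration of the formula above, here is a minimal Python sketch (the function name `entropy` and the example labels are illustrative, not from the source) that computes the entropy of a set of class labels:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """E(S) = -sum(p_i * log2(p_i)) over the class proportions in `labels`."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A perfectly mixed two-class set has maximum entropy (1 bit) ...
print(entropy(["yes", "no", "yes", "no"]))    # 1.0
# ... while a pure set has zero entropy (may print as -0.0).
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0
```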
Recommended reading: Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking

Distance Metric: A function that quantifies how far apart two data points are in feature space (e.g., Euclidean or Manhattan distance): Link1, Link2
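A small sketch of the two distance metrics named above, assuming plain tuples of coordinates (the helper names `euclidean` and `manhattan` and the sample points are ours):

```python
from math import sqrt

def euclidean(a, b):
    """Straight-line (L2) distance between two points."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    """City-block (L1) distance between two points."""
    return sum(abs(x - y) for x, y in zip(a, b))

p, q = (1, 2, 3), (4, 6, 3)
print(euclidean(p, q))  # 5.0
print(manhattan(p, q))  # 7
```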

Curse of dimensionality: This phenomenon occurs in many domains, including machine learning. The common theme of these problems is that as the dimensionality increases, the volume of the space grows so fast that the available data become sparse. This sparsity is problematic for any method that requires statistical significance: in order to obtain a statistically sound and reliable result, the amount of data needed to support the result often grows exponentially with the dimensionality. [2] The sketch below illustrates this sparsity.
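A minimal sketch of the sparsity argument, assuming points drawn uniformly from a unit hypercube (the helper name and the point counts are illustrative): with a fixed budget of points, the average distance to the nearest neighbour grows quickly as dimensions are added, so the data effectively become sparse.

```python
import random

random.seed(0)

def avg_nearest_neighbor_distance(n_points, dims):
    """Average distance from each point to its nearest neighbour,
    for n_points drawn uniformly from the unit hypercube [0, 1]^dims."""
    pts = [[random.random() for _ in range(dims)] for _ in range(n_points)]
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = [min(dist(p, q) for q in pts if q is not p) for p in pts]
    return sum(nearest) / len(nearest)

# The same 100 points become increasingly isolated as dimensionality grows.
for d in (1, 2, 10, 100):
    print(d, round(avg_nearest_neighbor_distance(100, d), 3))
```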

References
1. Entropy: How Decision Trees Make Decisions
2. Curse of dimensionality - Wiki