In mathematics, a **percentage** is a number or ratio expressed as a fraction of 100. It is often denoted using the percent sign, "%", or the abbreviations "pct.", "pct"; sometimes the abbreviation "pc" is also used. A percentage is a dimensionless number (pure number).

For example, 45% (read as "forty-five percent") is equal to 45/100, or 0.45. Percentages are often used to express a proportionate part of a total.
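The conversion above can be sketched in a few lines of Python. This is a minimal illustration, not part of the article; the function names are invented for the example, and `fractions.Fraction` automatically reduces 45/100 to lowest terms:

```python
# Converting a percentage to its decimal and fraction forms.
from fractions import Fraction

def percent_to_decimal(p):
    # 45 (percent) -> 0.45
    return p / 100

def percent_to_fraction(p):
    # 45 (percent) -> Fraction(9, 20), i.e. 45/100 reduced
    return Fraction(p, 100)

print(percent_to_decimal(45))   # 0.45
print(percent_to_fraction(45))  # 9/20
```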

(Similarly, one can express a number as a fraction of 1,000 using the term "per mille" or the symbol "‰".)

If 50% of the total number of students in the class are male, that means that 50 out of every 100 students are male. If there are 1,000 students, then 500 of them are male.
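Taking a percentage of a total, as in the student example, amounts to multiplying by the fraction. A small sketch (the function name is chosen for illustration):

```python
# "p percent of total": multiply the total by p/100.
def part_of_total(percent, total):
    return percent / 100 * total

print(part_of_total(50, 1000))  # 500.0
```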

An increase of $0.15 on a price of $2.50 is an increase by a fraction of 0.15/2.50 = 0.06. Expressed as a percentage, this is a 6% increase.
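The price-increase calculation follows the same pattern: divide the change by the original value and scale by 100. A hedged sketch (function name invented for the example; note that floating-point arithmetic makes the result only approximately 6):

```python
# Percent change from an old value to a new value:
# (new - old) / old, expressed as a percentage.
def percent_change(old, new):
    return (new - old) / old * 100

print(percent_change(2.50, 2.65))  # approximately 6.0, i.e. a 6% increase
```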

While many percentage values are between 0 and 100, there is no mathematical restriction and percentages may take on other values. For example, it is common to refer to 111% or −35%, especially for percent changes and comparisons.

In Ancient Rome, long before the existence of the decimal system, computations were often made in fractions which were multiples of 1/100. For example, Augustus levied a tax of 1/100 on goods sold at auction known as *centesima rerum venalium*. Computation with these fractions was equivalent to computing percentages. As denominations of money grew in the Middle Ages, computations with a denominator of 100 became more standard, and from the late 15th century to the early 16th century it became common for arithmetic texts to include such computations. Many of these texts applied these methods to profit and loss, interest rates, and the Rule of Three. By the 17th century it was standard to quote interest rates in hundredths.

Wikipedia