Some programming languages, notably Pascal, have a numeric type called "real".
Mathematically speaking, however, these types aren't real: to live up to the name, a type would have to be able to represent any real number, yet numbers such as 1/3 and the irrationals can't be represented exactly in floating point. So why do some programming languages call these types "real"?
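
For illustration, here is a minimal sketch in Pascal (the program name is arbitrary, and the exact digits printed depend on the compiler and on how wide its `real` type is) showing that 1/3 stored in a `real` is only an approximation:

```pascal
program RealDemo;
var
  x: real;
begin
  { 1/3 has no finite binary expansion, so only an approximation is stored }
  x := 1 / 3;
  { print with 20 decimal places; the trailing digits reveal the rounding }
  writeln(x : 1 : 20);
end.
```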