Why does the computer industry think 2^10 = 10^3?

And as a result, the usage is as mushy as this:

Apple still thinks a kilobyte = 1,024 bytes,
so then a megabyte = 1,048,576 bytes,
and a gigabyte = (2^10)^3 = 2^30 ≈ 1.0737 billion bytes.

And AT&T likewise counts 2 gigabytes of data as 2,048 MB.
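
For what it's worth, here's a quick sketch in plain Python (the names are just for illustration) showing how far the binary reading drifts from the decimal one at each prefix:

```python
# Compare the binary (2^10-based) and decimal (10^3-based) readings
# of each prefix; the drift grows with every step up.
for power, name in [(1, "kilo"), (2, "mega"), (3, "giga"), (4, "tera")]:
    binary = 1024 ** power    # what Apple/AT&T-style labels imply
    decimal = 1000 ** power   # what the SI prefix actually means
    drift = (binary / decimal - 1) * 100
    print(f"{name}byte: {binary:,} vs {decimal:,} bytes ({drift:.2f}% apart)")
```

The gap is 2.4% at the kilobyte and already 7.37% at the gigabyte, which is exactly where that 1.0737 billion figure comes from.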

According to Wikipedia:
In December 1998, the IEC addressed such multiple usages and definitions by creating prefixes such as kibi, mebi, gibi, etc., to unambiguously denote powers of 1024.[10] Thus the kibibyte, symbol KiB, represents 2^10 = 1024 bytes. These prefixes are now part of the International System of Quantities. The IEC further specified that the kilobyte should only be used to refer to 1000 bytes. However, the kilobyte is still commonly used to refer to 1024 bytes.
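
To make the quote concrete, here's a tiny sketch (the `describe` helper is made up for this example) printing the same byte count under both conventions:

```python
# Same byte count, reported both ways: SI kB (10^3) vs IEC KiB (2^10).
def describe(n_bytes: int) -> str:
    return (f"{n_bytes:,} bytes = {n_bytes / 1000:.3f} kB (SI) "
            f"or {n_bytes / 1024:.3f} KiB (IEC)")

print(describe(1024))  # 1,024 bytes = 1.024 kB (SI) or 1.000 KiB (IEC)
print(describe(1000))  # 1,000 bytes = 1.000 kB (SI) or 0.977 KiB (IEC)
```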

They should be using 2^9.965784 ≈ 1,000, since log2(1,000) ≈ 9.965784.
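
That exponent is just log2(1,000), which is easy to verify:

```python
import math
print(math.log2(1000))  # 9.965784284662087
print(2 ** 9.965784)    # 999.9998..., as close to 1,000 as the rounding allows
```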

Not that the gravity of the exponential issue exceeds 9.8 m/s^2, but still.

