
The difference between megabyte and megabit

Megabit (Mbit) and megabyte (MB) sound so similar that they are often mixed up. atechbook explains what the two terms mean and how to use them correctly.

Mbit and MB, megabit and megabyte: you come across these two terms again and again when dealing with technology. And since smartphones have become part of almost everyone's everyday life, many people probably have a rough idea of what the words mean. But although the two terms sound similar and "mega" refers to the same multiplier in both cases, they mean different things.

What is a megabyte (MB)?

Put simply, the megabyte describes the size of a digital file. The basic unit of this measurement is the byte. At the beginning of computer history, files were only a few bytes in size; as technology progressed, file sizes grew. In the common binary convention, 1024 bytes are grouped into one kilobyte (kB) and 1024 kilobytes into one megabyte. This is comparable to meters and kilometers.
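The 1024-step scaling described above can be sketched in a few lines of Python; the helper names here are made up purely for illustration:

```python
# Binary unit ladder: 1 kB = 1024 bytes, 1 MB = 1024 kB.
def bytes_to_kilobytes(n_bytes):
    return n_bytes / 1024

def bytes_to_megabytes(n_bytes):
    return n_bytes / (1024 * 1024)

print(bytes_to_kilobytes(2048))             # 2.0 (a 2048-byte file is 2 kB)
print(bytes_to_megabytes(5 * 1024 * 1024))  # 5.0 (a 5 MB smartphone photo)
```

The same pattern continues upward: dividing by another 1024 turns megabytes into gigabytes.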

Nowadays, photos taken with a smartphone are several MB in size and text files several kB, so files in the plain byte range hardly ever occur in the everyday life of a typical PC user. Megabytes and the next-larger unit, the gigabyte (1 GB = 1024 MB), are more common than ever. Whether it's the data volume of a mobile plan, the download size of a game or the storage capacity of a hard disk: in all these cases we are talking about bytes, or rather megabytes, gigabytes and even terabytes.

What is a megabit (Mbit)?

The megabit, like the megabyte, is a unit of measurement for an amount of data. The ratio is 1:8, because in today's computer systems one byte usually consists of 8 bits. In contrast to the byte, however, the bit also represents a unit of information content. "Bit" is short for "binary digit", the smallest possible distinction between two states.

Computer information is always built from two states: "on" and "off", or 1 and 0. A bit carries exactly one such piece of information. Bits are grouped into bytes to store even the smallest usable amount of data. Historically, 8 bits were needed to encode a single text character, which is why this grouping became the standard size of a byte. The bit is thus the smallest unit of information and storage, while the byte is the smallest practical unit of data and is therefore used to describe the capacity of a storage medium.
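The 8-bits-per-character idea can be demonstrated in Python; using the classic ASCII encoding here is an assumption for the example:

```python
# A byte consists of 8 bits; one ASCII character fits in exactly one byte.
text = "A"
encoded = text.encode("ascii")    # b'A' -> a single byte
print(len(encoded) * 8)           # 8 bits in total
# The same byte written out as its 8 individual binary digits (1s and 0s):
print(format(encoded[0], "08b"))  # 01000001
```

The printed string of eight 1s and 0s is exactly the "on/off" pattern the paragraph above describes.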

While memory sizes are given in bytes, as mentioned above, bits are mainly used in connection with networks to indicate data transfer speeds, and then always as bits per second, because bytes are sent through the Internet bit by bit.

In everyday life, Internet users mostly encounter bits in the advertised speed of DSL connections, given as Mbit/s or Mbps, i.e. megabits per second. This is where the greatest potential for confusing megabyte and megabit lies. After all, a "DSL 50,000" line promises 50 Mbit/s, not 50 megabytes per second, as some still assume. Mbit/s must first be converted to be truly meaningful to most people.


How do I convert Mbit/s into MB/s?

Few people can intuitively picture megabits per second, so converting to megabytes per second makes sense. Since a byte consists of eight bits, this is not difficult: simply divide the Internet speed (e.g. 100 Mbit/s) by eight, and you have the maximum amount of data in megabytes that the line can download per second under ideal conditions. In this example, that would be at most 12.5 megabytes per second.
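The divide-by-eight rule can be written as a small helper; this is a minimal sketch, and the function name is made up:

```python
def mbit_s_to_mb_s(mbit_s):
    """Convert a transfer rate in Mbit/s to MB/s (1 byte = 8 bits)."""
    return mbit_s / 8

print(mbit_s_to_mb_s(100))  # 12.5 MB/s for a 100 Mbit/s line
print(mbit_s_to_mb_s(50))   # 6.25 MB/s for a "DSL 50,000" line
```

So a 50 Mbit/s connection downloads at most 6.25 MB per second, not 50.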
