This is a very confused and confusing area, built around a series of independently developed standards created for quite different purposes and for different technologies. In this short article I propose to put the standards into an historical and functional perspective, i.e. when and for what purpose each standard was set.
This standard was developed in the early days of telephony and adopted by pro audio, when transformers were used as a matter of course for balancing signals. 0dBm is a power level of 1mW; the voltage developed by 1mW in 600 Ohms, the characteristic impedance of an open-wire telephone line, is 0.775V RMS.
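To see where the 0.775V figure comes from, here is a minimal Python sketch of the arithmetic (an illustration, not part of the standard); the only inputs are the 1mW power and 600 Ohm impedance quoted above.

```python
import math

power_w = 0.001        # 0 dBm reference power: 1 mW
impedance_ohms = 600   # characteristic impedance of the old open-wire line

# P = V^2 / R, so V = sqrt(P * R)
voltage_rms = math.sqrt(power_w * impedance_ohms)
print(f"0 dBm across 600 Ohms = {voltage_rms:.3f} V RMS")  # ~0.775 V
```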
0dBu is 0.775V RMS regardless of impedance. This reference was developed as the standard practice of terminating signal lines in 600 Ohms was used less and less, and significantly higher impedances became the norm.
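Since dBu is purely a voltage ratio against 0.775V, conversion to and from volts needs no impedance at all. A small sketch (illustrative only, not from any standards document) of the two conversions:

```python
import math

V_REF_DBU = 0.775  # 0 dBu reference voltage, independent of impedance

def dbu_to_volts(level_dbu: float) -> float:
    """Convert a level in dBu to an RMS voltage."""
    return V_REF_DBU * 10 ** (level_dbu / 20)

def volts_to_dbu(volts_rms: float) -> float:
    """Convert an RMS voltage to a level in dBu."""
    return 20 * math.log10(volts_rms / V_REF_DBU)

print(dbu_to_volts(0.0))   # 0.775 V RMS
print(volts_to_dbu(1.0))   # ~ +2.2 dBu
```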
The VU meter was developed by Bell to standardise signal levels on telephone circuits. 0VU was originally defined as 1mW into 600 Ohms with a 1000Hz sine wave.
For nearly all audio purposes, 0VU is now defined as 4dB above 0dBu, as +4dBu has gradually become the industry-standard line-up level.
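For reference, that +4dBu line-up level works out to roughly 1.23V RMS; a one-line check, assuming the 0.775V dBu reference above:

```python
lineup_volts = 0.775 * 10 ** (4 / 20)        # +4 dBu expressed as a voltage
print(f"+4 dBu = {lineup_volts:.3f} V RMS")  # ~1.228 V
```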
0dBV is 1V RMS, which is the voltage developed by passing 1mW through 1000 Ohms. -10dBV (approximately 0.316V RMS) is used as the reference level for consumer equipment.
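A short sketch (again illustrative rather than definitive) showing the dBV conversion and the familiar gap of roughly 11.8dB between the +4dBu professional and -10dBV consumer nominal levels:

```python
import math

def dbv_to_volts(level_dbv: float) -> float:
    """dBV is referenced to 1 V RMS, so no 0.775 factor is involved."""
    return 10 ** (level_dbv / 20)

consumer_volts = dbv_to_volts(-10)   # ~0.316 V RMS (consumer nominal)
pro_volts = 0.775 * 10 ** (4 / 20)   # +4 dBu, ~1.228 V RMS (pro nominal)

gap_db = 20 * math.log10(pro_volts / consumer_volts)
print(f"-10 dBV = {consumer_volts:.3f} V RMS")
print(f"Gap between +4 dBu and -10 dBV = {gap_db:.1f} dB")  # ~11.8 dB
```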
0dBFS refers to a digital audio signal at full scale, i.e. maximum amplitude. On its own it means absolutely nothing in the analogue domain. There are a number of 'standards' for the conversion between full-scale digital and the analogue signal level that comes from the digital-to-analogue converter. In addition, some broadcasters require the maximum digital level of recordings to be a certain amount below full scale.
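Because the dBFS-to-analogue relationship is only meaningful once a calibration is chosen, the sketch below hard-codes one as an assumption: it places -20dBFS at +4dBu, a calibration used in some studio practice, but other facilities and broadcasters use different figures, which is exactly the ambiguity described above.

```python
V_REF_DBU = 0.775   # 0 dBu reference voltage
ALIGN_DBFS = -20.0  # assumed digital alignment point (illustrative only)
ALIGN_DBU = 4.0     # assumed analogue level at that alignment point

def dbfs_to_dbu(level_dbfs: float) -> float:
    """Map a digital level to an analogue level under the assumed calibration."""
    return level_dbfs - ALIGN_DBFS + ALIGN_DBU

def dbfs_to_volts(level_dbfs: float) -> float:
    return V_REF_DBU * 10 ** (dbfs_to_dbu(level_dbfs) / 20)

print(dbfs_to_dbu(0.0))                  # full scale = +24 dBu under this calibration
print(f"{dbfs_to_volts(-20.0):.3f} V")   # ~1.228 V RMS at the alignment point
```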
Commonly used meters and comparison of scales