UTF stands for Unicode Transformation Format. One difference between UTF-8 and UTF-16 lies in how many bytes each needs per character: both can use up to 4 bytes (32 bits), but UTF-8 can encode a character in as little as one byte (8 bits), while UTF-16 needs at least 2 bytes (16 bits). This is why their sizes differ when you use only ASCII characters: the same file encoded in UTF-16 ends up roughly twice as large as it would in UTF-8.
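As a rough illustration, here is a minimal sketch in Python (using only the standard string encode method) that compares the encoded size of an ASCII-only string; the sample text is just an arbitrary example:

    # Compare the encoded size of an ASCII-only string in UTF-8 and UTF-16.
    text = "Hello"

    utf8_bytes = text.encode("utf-8")       # 1 byte per ASCII character
    utf16_bytes = text.encode("utf-16-le")  # 2 bytes per character (no BOM)

    print(len(utf8_bytes))   # 5
    print(len(utf16_bytes))  # 10

Note that the "utf-16-le" codec is used here to keep the byte order mark out of the count; encoding with plain "utf-16" would add 2 more bytes for the BOM.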
When you use UTF-8 to encode a file that contains only ASCII characters, the resulting file is byte-for-byte identical to an ASCII-encoded file. This is not the case with UTF-16, because every character becomes 2 bytes long. As a result, legacy software that is not Unicode-aware can still open a UTF-8 file of ASCII text, but it cannot handle the same file encoded in UTF-16.
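A quick way to see this, again sketched in Python with an arbitrary sample string, is to compare the UTF-8 bytes of ASCII-only text with its plain ASCII bytes:

    # UTF-8 bytes of ASCII-only text are identical to the ASCII bytes,
    # while UTF-16 adds a zero byte alongside each character.
    text = "plain ascii"

    print(text.encode("utf-8") == text.encode("ascii"))  # True
    print(text.encode("utf-16-le"))  # b'p\x00l\x00a\x00i\x00n\x00 \x00a\x00...'

Those interleaved zero bytes are exactly what trips up software that expects one byte per character.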