This isn't a "how-to" question, but just something that I've wondered about for a while. Say you rip a track from a CD using an MP3 or an AAC encoder at a typical bitrate of 128-192 kb/s. Then say that you convert that track in iTunes to Apple Lossless or some other lossless format. Now, I realize that you don't actually gain any quality, because data was already thrown out when you originally ripped the track to MP3 or AAC. But why, then, does the resulting lossless file have a much larger size and a bitrate in the 800 kb/s range? Where does all that seemingly extra data come from?
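
For context, here's the back-of-the-envelope arithmetic that makes me wonder (a rough sketch; the 800 kb/s figure is just what iTunes reports for my converted files):

```python
# Rough bitrate arithmetic for CD-quality audio. The PCM values are the
# Red Book standard; the MP3 and lossless figures are just the numbers
# from my own library, not anything authoritative.
sample_rate = 44_100   # samples per second
bit_depth = 16         # bits per sample
channels = 2           # stereo

raw_pcm_bps = sample_rate * bit_depth * channels
print(f"Raw CD PCM: {raw_pcm_bps / 1000:.0f} kb/s")   # ~1411 kb/s

mp3_bps = 128_000       # a typical lossy rip
lossless_bps = 800_000  # roughly what iTunes reports after converting

print(f"MP3 is {mp3_bps / raw_pcm_bps:.0%} of raw PCM")         # ~9%
print(f"Lossless is {lossless_bps / raw_pcm_bps:.0%} of raw PCM")  # ~57%
```

So the converted file balloons to more than six times the size of the MP3 it was made from, even though no new audio information could possibly have been added.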
Again, not a practical question, just idle curiosity.