FLAC vs WAV


I have observed (heard and then tested so as to confirm) the following “condition” as it relates to the widely debated issue of FLAC quality. The purpose of this topic is to gather opinions as to whether or not your observations are similar to, and therefore support, my own.

It is widely understood and accepted that a FLAC file, while “compressed”, is “lossless” as compared to its corresponding WAV file. Let’s assume (i.e. not debate) this is completely true. What I am noticing is that when the FLAC file is played via any FLAC player, it sounds different from the “same” WAV file (the decompressed FLAC equivalent) played back via the same player. This is specifically noticeable (to me) in the low frequency spectrum. The WAV has considerably more “sonic energy”, which manifests itself as appearing a bit louder, wider in frequency range, and perhaps even in dynamic range, as compared to the FLAC equivalent.

I’m curious as to your findings when you compare a FLAC file played natively against the WAV equivalent played via the same player (for example, play both the FLAC and the WAV via VLC media player), or a practical equivalent, such as comparing the FLAC played via VLC against the same FLAC burned to CD and played via a CD player.

I am further assuming that the WAV file is a more accurate representation of the audio than the FLAC. This is to say that, should you agree with the aforementioned, it would be preferable to play the WAV file, or to decompress the FLAC to WAV before using it.
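
One way to put the “is there a difference” question on an objective footing is to compare the decoded samples directly rather than by ear. Below is a minimal null-test sketch in Python, assuming the third-party soundfile library is installed (pip install soundfile numpy) and using hypothetical filenames, track.wav and a track.flac encoded from it. If FLAC is truly lossless, the decoded sample arrays should be bit-identical.

import numpy as np
import soundfile as sf

# Decode both files to raw integer samples (soundfile reads FLAC via libsndfile).
wav_data, wav_rate = sf.read("track.wav", dtype="int16")
flac_data, flac_rate = sf.read("track.flac", dtype="int16")

print("Sample rates match:", wav_rate == flac_rate)
print("Samples identical: ", np.array_equal(wav_data, flac_data))

# If the arrays differ at all, report how large the residual is.
if wav_data.shape == flac_data.shape and not np.array_equal(wav_data, flac_data):
    residual = wav_data.astype(np.int32) - flac_data.astype(np.int32)
    print("Max sample difference:", np.abs(residual).max())

If the samples are identical, any audible difference at playback would have to come from the player or the system, not from the files themselves.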

gdhal
Something interesting (to me, anyway) I have discovered. As reported by Windows Properties size (not size on disk), a WAV encoded to FLAC (compression level 6, though realistically any level) and then decoded back to WAV is not identical in size to the original. For example, if the WAV is 399,718,326 bytes, the converted FLAC is 188,443,075 bytes. If I then decode the FLAC back to WAV, the size is 399,718,324 bytes. Can that 2-byte difference be explained?

This will be debated as long as digital is in existence.  I have limitless storage capability (10TB) and at the time, all I had was time (I was between jobs).  I spent endless hours testing CD rips to see which was better before I ripped my entire library.  I ripped dozens of CDs in WAV and FLAC, and also AIFF.  I used dBpoweramp to rip.  As much as I wanted to hear it, I could hear ZERO difference between them all.  Zero, none, nada, zilch, nothing!

If someone can hear the difference, bless you.  My system is incredibly resolving so that's not it.  Just my experience.

What I did hear a huge difference in was the playback software.  JRiver was the best, MediaMonkey (?) was horrible, and iTunes and a whole bunch of others were also tested.

acurus, thank you; however, what I have discovered transcends whether or not you, I, or anyone else can "hear" a difference. The question is, "is there a difference?" Apparently there is.

By the way, I’ve read great things about JRiver from numerous audiophiles, so I do not doubt it is very good. Best is subjective. In my case I use Windows Media Player 12 (on Windows 10) and find it more than adequate.

The difference of 2 bytes is probably just a matter of how each program pads the end of a file with zeros. It depends on how the program writes the data, and has nothing to do with the audio itself; it is just a programming difference.
You can do a binary compare of your two files using, for example, the Windows fc command at the command prompt:

fc /b file1 file2

There are other compare options, but this is a place to start if you want to go into the differences in your files in more detail.
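
If you want to go one step further than a raw binary compare, the short Python sketch below (using only the standard-library wave module, with hypothetical filenames) compares just the PCM payloads of the two files, ignoring the headers. If the payloads match, the 2-byte size difference lives in the header or trailing padding, exactly as described above, not in the audio.

import wave

def pcm_frames(path):
    # Read only the audio frames (the data chunk), skipping the RIFF header.
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

original = pcm_frames("original.wav")
roundtrip = pcm_frames("from_flac.wav")

print("PCM payloads identical:", original == roundtrip)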