And it should be pointed out that the original CD was stamped all at once from a glass master, NOT spun up to speed and written by a laser. A CD-R, by contrast, has its data burned by a laser into an organic dye layer. So when a CD-R is "burned", we are talking about the source disc first being read, and THEN the data being written, both in real time on early 1x burners.

And there are a myriad of other factors: the laser passes through the polycarbonate, which is decidedly NOT "optical lens quality", twice (once to read, once to write, with the writing laser at a much higher power level than the reading laser, all at full speed); the differing reflective properties of the gold or silver reflective layer (those are the only two materials used on CD-Rs, none use aluminum like pressed discs); and the differing opacity of the various dyes used in the recording layer. All of this affects "jitter performance", which, as pointed out above, can be mechanically-interface-related and can also take place in the digital domain during the data transfer through the circuitry.

My point is, there is NO such thing as a "perfect copy" of anything ANYWHERE; simple physics (and the uncertainty principle) dictate this.

Also, my brother is both an EE and a computer programmer, and he informs me that the two formats handle errors very differently: a data CD adds an extra layer of error detection and correction (EDC/ECC) on top of the CIRC coding, because every byte has to come back exactly right (no personal computer would function at all if this weren't the case), whereas an audio CD player is permitted to conceal uncorrectable errors by interpolating between the neighboring good samples.
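To make the audio-side concealment concrete, here is a toy sketch in Python of interpolation concealment: replacing a sample flagged as uncorrectable with a value interpolated from its nearest good neighbors. This is purely illustrative and NOT the actual CIRC firmware of any player; the function name and the flagging scheme are my own assumptions.

```python
# Toy sketch of audio error concealment by linear interpolation.
# NOT real player firmware -- function name and inputs are illustrative.

def conceal(samples, bad):
    """Return a copy of `samples` with each index in `bad` replaced by
    linear interpolation between the nearest good samples."""
    out = list(samples)
    bad = set(bad)
    for i in sorted(bad):
        lo = i - 1
        while lo >= 0 and lo in bad:        # scan left for a good sample
            lo -= 1
        hi = i + 1
        while hi < len(out) and hi in bad:  # scan right for a good sample
            hi += 1
        if lo < 0 and hi >= len(out):
            out[i] = 0                      # nothing usable: mute
        elif lo < 0:
            out[i] = samples[hi]            # hold the next good sample
        elif hi >= len(out):
            out[i] = samples[lo]            # hold the previous good sample
        else:
            # weight by distance to each good neighbor
            frac = (i - lo) / (hi - lo)
            out[i] = round(samples[lo] + frac * (samples[hi] - samples[lo]))
    return out

print(conceal([100, 200, 0, 400], [2]))  # the bad sample becomes 300
```

The point of the sketch: the interpolated value is plausible but not the original data, which is exactly why audio playback can tolerate errors that would be fatal on a data disc.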