Whatever distortion is introduced in a CDR by a digital to digital reproduction, it is not jitter. By definition, jitter is the noise introduced as a result of clock timing inaccuracies in the D to A or A to D process.
I suspect there might be some debate about whether noise can be introduced in a D to D copying process where there is no format conversion. I would contend that the copying process alone introduces no noise. It is true that the "bits is bits" school has had to re-evaluate its theory with respect to the jitter introduced by a digital transport in what is ultimately a D to A process (or, more accurately, D to D to A). But that is because a clock inaccuracy in the transport translates directly into an inaccuracy in the timing of the "bits" the DAC receives - hence, more jitter.
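To make that concrete, here is a rough Python sketch - entirely my own illustration, with assumed numbers like a 2 ns RMS clock error and a 1 kHz test tone - of how a timing inaccuracy at the transport/DAC clock turns into an amplitude error, i.e. noise, in the reconstructed signal:

import numpy as np

fs = 44_100.0          # CD sample rate, Hz
f  = 1_000.0           # assumed test tone, Hz
n  = np.arange(4096)

ideal_times    = n / fs                                   # perfect clock
jitter_ns      = 2.0                                      # assumed RMS clock error, ns
jittered_times = ideal_times + np.random.normal(0.0, jitter_ns * 1e-9, n.size)

ideal_samples    = np.sin(2 * np.pi * f * ideal_times)
jittered_samples = np.sin(2 * np.pi * f * jittered_times)  # what a jittery clock delivers

error = jittered_samples - ideal_samples
print(f"RMS error from {jitter_ns} ns of jitter: {np.sqrt(np.mean(error**2)):.2e}")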
When copying from D to D, assuming the same format, "bits is bits" still holds. Two CD disks have rigidly located physical "pits" in which the data is stored. When a copy is made, the disks are essentially identical. Of course, alignment and sampling inaccuracies could result in occasional "drop-outs", but that is not jitter and does not sound like jitter. An isolated "drop-out" would probably be completely inaudible, and a profusion of drop-outs would sound like a lot of crackle.
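Put another way, nothing about timing is stored on the disk - only data - so a D to D copy is either bit-identical or it has data errors. A quick sketch (again my own illustration; the sample values and drop-out positions are made up):

import numpy as np

# One second of hypothetical 16-bit audio data.
original = np.random.randint(-32768, 32767, size=44_100, dtype=np.int16)

# A bit-perfect copy: every sample identical to the source.
copy = original.copy()
print("bit-identical copy:", np.array_equal(original, copy))   # True

# A drop-out: a few samples corrupted by a read error - a data error, not a timing error,
# which is why it sounds like a click or crackle rather than like jitter.
dropout = original.copy()
dropout[10_000:10_004] = 0
print("samples differing after drop-out:", np.count_nonzero(dropout != original))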
Having said all that, if you WERE somehow able to introduce jitter in the D to D copying process, then that which sounds like jitter on the CDR is no longer jitter - it is part of the coded sound on the CDR and won't be dealt with the way real jitter would be. In fact, a jitter reduction stage in the playback D to A process would work to preserve that sound, while only preventing new jitter from being added by the current D to A conversion.
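As a final illustration (my own sketch with assumed numbers, not anything measured): if a timing error were somehow frozen into the stored samples during the copy, a later jitter-free, reclocked D to A would reproduce that error faithfully, because to the DAC it is just data:

import numpy as np

fs, f = 44_100.0, 1_000.0
n = np.arange(4096)

# Hypothetical capture with a jittery clock: the error is baked into the sample values.
jittered_times = n / fs + np.random.normal(0.0, 5e-9, n.size)   # assumed 5 ns RMS
stored_samples = np.sin(2 * np.pi * f * jittered_times)

# Playback through an ideal, reclocked DAC: timing is now perfect,
# but the baked-in distortion is still present in the data.
played_back     = stored_samples.copy()
clean_reference = np.sin(2 * np.pi * f * n / fs)
residual = played_back - clean_reference
print(f"baked-in error survives reclocking, RMS = {np.sqrt(np.mean(residual**2)):.2e}")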