Hi - wondering if someone can help or shed some light on a query I have.

We've generated 320kbps AACs for iTunes supply, which include gapless play between some tracks. As part of the QC process, we've tried a variety of methods of playing the AACs back to check for correct joins (i.e. no gap) between the tracks.

If the AACs are played from iTunes (PC, RME sound card - so bit-accurate), gapless play works flawlessly.
If we convert the AACs to WAVs using DMC Batch Converter, so they can be played on a standard audio workstation (to see waveform profiles, look for digital black, etc.), around 0.035s (@ 44.1kHz, 16-bit) of digital black is inserted at the start and end of each WAV. Therefore, when the files are butted together (zero crossfade time, etc.), what should be gapless has around 0.07s of digital black inserted between tracks.

The same is true using Sound Forge 10 (which uses a QuickTime plugin to import the AACs and convert them to WAVs) - again it's around 0.07s between tracks, which shows up as a tiny dropout.
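In case it helps anyone reproduce the measurement, here's a rough Python sketch of the check we're doing - it just counts contiguous zero samples at each end of a decoded WAV. It assumes 16-bit PCM, and the filename is a placeholder:

```python
# Rough QC sketch: count contiguous digital-black samples at each end of a
# decoded WAV. Assumes 16-bit PCM; "track01.wav" is a placeholder name.
import struct
import wave

def black_at_ends(path, threshold=0):
    with wave.open(path, "rb") as w:
        channels = w.getnchannels()
        rate = w.getframerate()
        raw = w.readframes(w.getnframes())
    # Unpack 16-bit little-endian PCM, then take the per-frame peak
    # across channels so stereo files are handled too.
    pcm = struct.unpack("<%dh" % (len(raw) // 2), raw)
    peaks = [max(abs(s) for s in pcm[i:i + channels])
             for i in range(0, len(pcm), channels)]
    lead = 0
    while lead < len(peaks) and peaks[lead] <= threshold:
        lead += 1
    trail = 0
    while trail < len(peaks) and peaks[-1 - trail] <= threshold:
        trail += 1
    return lead, trail, rate

lead, trail, rate = black_at_ends("track01.wav")
print("leading black: %d samples (%.4fs)" % (lead, lead / rate))
print("trailing black: %d samples (%.4fs)" % (trail, trail / rate))
```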

What I've not been able to find out is:
1 - whether the 0.035s of digital black at the start and end of each file is actually encoded into the AAC at the generation stage, with gapless play within iTunes truncating it (see the sketch below for one way to check this);
2 - whether there is no digital black in the AAC file, and QuickTime and DMC add it during conversion;
3 - something else.
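On (1): if I've understood AAC priming correctly, encoders genuinely write priming samples at the start (typically 2,112) and pad out the final frame, and iTunes-encoded files carry an iTunSMPB atom recording those counts so the player knows where to trim - 0.07s at 44.1kHz is about 3,090 samples, which is in that ballpark. One way to test this would be to read the atom from one of our files; below is a rough sketch using the mutagen library (the filename is a placeholder, and the field layout is my reading of the iTunSMPB format):

```python
# Rough sketch: read the iTunSMPB gapless atom from an iTunes-style AAC.
# Requires the mutagen library; "track01.m4a" is a placeholder name.
from mutagen.mp4 import MP4

tags = MP4("track01.m4a").tags
atom = tags.get("----:com.apple.iTunes:iTunSMPB") if tags else None
if atom:
    # The value is a space-separated string of hex fields:
    # [0] reserved, [1] encoder delay (priming), [2] end padding,
    # [3] original length in samples.
    fields = atom[0].decode("ascii").split()
    priming = int(fields[1], 16)
    padding = int(fields[2], 16)
    length = int(fields[3], 16)
    print("priming: %d samples, padding: %d samples, true length: %d samples"
          % (priming, padding, length))
else:
    print("no iTunSMPB atom - nothing tells the player where to trim")
```

If the atom is present, that would suggest the black really is in the decoded stream and iTunes trims it on playback, while generic converters like QuickTime and DMC simply ignore the atom.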

FYI - the workstations we use here are pro-audio (Sound Forge was just a test), so we know they handle WAVs, headers, etc. correctly.

Any help gratefully received - we're trying to do things correctly, as QC is an integral part of the customer experience!

Thanks
Gareth