I think this is a newbie kind of question, but it has me confused.
When I rip with the default Apple Lossless output setting, add the files to my iTunes library, and then review the file information in iTunes, I see varying bit rates per song (not 1,411 kbps). The bit depth and sample rate are constant at 16-bit / 44.1 kHz.
Is this actually normal for Apple Lossless? Are the varying bit rates just a result of the file compression, even though playback is still lossless?
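For reference, here's the back-of-envelope arithmetic I'm comparing against (just my own quick Python sketch, nothing official):

# Uncompressed CD audio bitrate = sample rate * bit depth * channels
sample_rate = 44_100   # samples per second (44.1 kHz)
bit_depth = 16         # bits per sample
channels = 2           # stereo

pcm_bitrate = sample_rate * bit_depth * channels  # bits per second
print(f"Uncompressed PCM: {pcm_bitrate / 1000:.0f} kbps")  # prints 1411 kbps

# My understanding (an assumption on my part): iTunes reports a file's bitrate
# as compressed size divided by duration, so for a lossless codec it will vary
# per song depending on how compressible the music is.

That 1,411 kbps figure is the raw PCM rate, which is why I expected to see it on the ripped files.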
Any input would be greatly appreciated.
Thanks in advance