
AccurateRip - Future Direction



Spoon
02-21-2008, 01:13 PM
It has been brought to my attention that the CRC used in AccurateRip is not doing its job properly. In layman's terms, the right channel rolls out of the CRC calculation every 1.5 seconds or so: the 1st sample's right channel is used 100%, by the 65,535th sample it is not used at all, at the 65,536th sample it is used 100% again, and this repeats over and over. It is estimated that effectively 3% of the data is not getting into the CRC (even at 97% coverage, I stand behind AccurateRip as better than most, perhaps all, C2 implementations). Going back over the early AccurateRip code, the design of the CRC is fine; it is just the implementation (the left and right channels were supposed to go in separately, but were optimized to go in together without bringing down the upper 32 bits of the multiplication). A sketch of the flaw and the fix follows.
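To make this concrete, here is a minimal C sketch of the kind of accumulation described above and of the fix; the function names and the exact multiplier are assumptions for illustration, not the shipped code:

    #include <stdint.h>

    /* One 16-bit left/right pair packed into 32 bits: right in the high word. */
    static inline uint32_t pack(uint16_t left, uint16_t right)
    {
        return ((uint32_t)right << 16) | left;
    }

    /* Flawed version: (right << 16) * i overflows 32 bits, so the right
       channel's contribution cycles with a 65,536-sample period
       (~1.5 s at 44.1 kHz) instead of always contributing fully. */
    uint32_t crc_v1(const uint32_t *samples, uint32_t count)
    {
        uint32_t crc = 0;
        for (uint32_t i = 0; i < count; i++)
            crc += (i + 1) * samples[i];   /* upper 32 bits of product lost */
        return crc;
    }

    /* Fixed version: compute the product in 64 bits and fold the upper
       half back in, so both channels always contribute. */
    uint32_t crc_v2(const uint32_t *samples, uint32_t count)
    {
        uint32_t crc = 0;
        for (uint32_t i = 0; i < count; i++) {
            uint64_t p = (uint64_t)(i + 1) * samples[i];
            crc += (uint32_t)p + (uint32_t)(p >> 32);
        }
        return crc;
    }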

Steve will post his findings on this discovery in detail.

It is a relatively easy fix (detailed below); however, this presents an opportunity which was not around when AccurateRip was first implemented (at that time the understanding of different CD pressings and how they behave was almost non-existent).

----------------------------
1. Fix: Correct the algorithm so all the data is used. Both new and old CRCs are calculated; the new one is checked first, the old one second (with less accuracy). New submissions would effectively appear as different pressings in the database.
----------------------------
2. Fix: Change the CRC algorithm to something like CRC32. The reason it was not used in the first place was that tracks 2 to x-1 would match the CRC presented in EAC, but the first and last tracks never would, causing confusion; the CRC could be XOR'd with a constant to avoid this confusion (a sketch appears at the end of this post).
----------------------------
3. Fix & Additional Development: Use CRC32 alongside the old CRC (there is a lot of data in the existing database). The new CRC32 would go into a parallel 2nd database, increasing the strength of the check to almost 64 bits (not taking the flaw into account). On the back end there are few changes to make; both databases share the same design.
----------------------------
4. Fix & Additional Development: Use a different hash such as MD5 or SHA-1; these would increase the storage of the database by up to 5x (160 bits for SHA-1).
----------------------------
5. Brainstorm a method of having a hash which would be resistant to pressings, yet still feasible for a CD ripper that rips track-by-track rather than whole-CD (and without the need to read outside of the track).
----------------------------
6. ???

Bear in mind that the existing database, before any of this new construction, takes up some 14 GB.
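To illustrate option 2 above, here is a minimal C sketch using zlib's crc32(); the mask value is an arbitrary assumption, chosen only so the stored value can never be confused with the track CRC a ripper such as EAC displays:

    #include <stddef.h>
    #include <stdint.h>
    #include <zlib.h>

    /* Hypothetical mask; any fixed non-zero constant would do. */
    #define AR_CRC32_MASK 0xA5A5A5A5u

    /* Standard CRC32 over the track's PCM bytes, XOR'd so the submitted
       value never coincides with the CRC32 a ripper shows the user. */
    uint32_t ar_track_crc32(const unsigned char *pcm, size_t len)
    {
        uint32_t crc = (uint32_t)crc32(0L, Z_NULL, 0);   /* initial value */
        crc = (uint32_t)crc32(crc, pcm, (uInt)len);
        return crc ^ AR_CRC32_MASK;
    }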

pls1
02-21-2008, 01:48 PM
Excellent. With classical CD repackaging and re-pressings bordering on random, this would really help me: even when my CD is in AccurateRip, the vast majority of the time at least one track does not match yet is labeled Secure.

Since I'm paranoid, I have been testing by re-ripping on two different machines/Plextor models and comparing the CRCs in those instances.

Phil

EliC
02-21-2008, 01:52 PM
Would it be time to look at hashing chunks of the tracks, so that it can be identified which chunk has errors? I realize this would increase the db size, but it would also give secure rippers more power to know which parts of the track are accurate and which are inaccurate.

bhoar
02-21-2008, 04:29 PM
Some of the ideas sound good, others sound good but seem to carry a lot of negative ramifications. I don't envy you. :)

Some other thoughts:

1. Perhaps some of the suggested changes (or variants of them) could let you implement logic that trims or combines submission content more often: e.g., roll submissions into single entries with counters if they fit certain criteria.

2. I like Eli's recommendation of generating and storing what I'd like to call "macro C2 pointers" (heh) to help the ripper decide which part of the track the error is more likely to be in.

Of course, the question is: what problem does the solution solve? Different implementations would address different problems. Breaking a song into four "blocks" with smaller checksums would let you know which quadrant is broken (but shouldn't C2 errors, suspicious positions, etc. tell you that already?); alternatively, each block could be generated from every fourth sample or even every fourth bit, interleaved, perhaps allowing checks for unusual sample transitions, or bit-flip correction of single-bit errors, if either were ever a problem. A sketch of the contiguous-block variant appears below.
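A minimal C sketch of the contiguous four-block variant (names hypothetical; the interleaved variants would differ only in how samples are assigned to blocks), again using zlib's crc32():

    #include <stddef.h>
    #include <zlib.h>

    /* Checksum a track as four contiguous quadrants so a verifier can
       report which quarter of the track failed to match. */
    void quadrant_crcs(const unsigned char *pcm, size_t len, unsigned long out[4])
    {
        size_t quarter = len / 4;
        for (int q = 0; q < 4; q++) {
            size_t start = (size_t)q * quarter;
            size_t n = (q == 3) ? len - start : quarter;  /* last block takes remainder */
            out[q] = crc32(crc32(0L, Z_NULL, 0), pcm + start, (uInt)n);
        }
    }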

But given the current space requirements mentioned above, the projected growth of AR could limit the feasibility.

3. Re-examine the "real zero" for offsets, based on the research that showed it to be different from what was originally assumed. I suggested re-zeroing AR last year, but you gave two persuasive arguments why that shouldn't happen: basically, there is nothing to gain by moving the zero mark that number of frames...and you'd have to throw out all results generated from the current offset "zero". That last part may now happen anyway, for the reasons stated in the original post. The first part may still stand, of course, but it might be worth looking at again, just to be sure.

4. "AccurateRip II: Electric Boogaloo!"

-brendan

PS -


5. Brainstorm a method of having a hash which would be resistant to pressings, yet still feasible for a CD ripper that rips track-by-track rather than whole-CD (and without the need to read outside of the track).

I think the goal and the restriction are mutually exclusive. If you can't "look" outside a single track's boundaries, and those boundaries move together the same amount forward or backward over the same larger data set from pressing to pressing, then there's no way to generate a single hash or checksum that covers both pressings. If you relaxed that requirement by allowing the ripper to "start early" and "end late" when ripping tracks, by perhaps 2x the largest "delta" seen between pressings, then that would allow you to do this. The first pressing submitted would serve as the master; matching but TOC-delta'd discs would share the same entry with an offset delta (a sketch follows the aside below).

(ignoring problems with disc begin/end overreads, of course, as well as limitations in the MMC command set for working directly with tracks - sounds like more fun)
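A C sketch of the matching step under the relaxed requirement (all names hypothetical, with a simple placeholder checksum standing in for whatever AR2 would settle on): the ripper reads max_delta extra samples either side of the nominal track, then slides the window until it matches the master entry:

    #include <stddef.h>
    #include <stdint.h>

    /* Placeholder checksum: a position-weighted sum, for illustration only. */
    static uint32_t track_checksum(const uint32_t *samples, size_t count)
    {
        uint32_t c = 0;
        for (size_t i = 0; i < count; i++)
            c += (uint32_t)(i + 1) * samples[i];
        return c;
    }

    /* padded:     nominal track plus max_delta extra samples on each side
       track_len:  nominal track length in samples
       master_crc: checksum stored for the first-seen pressing
       Returns the signed sample delta of a matching pressing, or INT32_MIN. */
    int32_t find_pressing_delta(const uint32_t *padded, size_t track_len,
                                int32_t max_delta, uint32_t master_crc)
    {
        for (int32_t d = -max_delta; d <= max_delta; d++) {
            const uint32_t *window = padded + max_delta + d;
            if (track_checksum(window, track_len) == master_crc)
                return d;   /* same pressing family, shifted by d samples */
        }
        return INT32_MIN;   /* no pressing-shifted match */
    }

In practice one would pick a checksum that can be slid incrementally (rolling) rather than recomputed for every candidate delta.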

EliC
02-21-2008, 06:13 PM
Should we also be looking at discs as a whole? Not that I would suggest requiring people to rip all tracks, but it would be nice to know more; it might give better insight into the "different pressing" issue.

I like the idea of confirming where the REAL ZERO is, and taking the opportunity to do it right.

What about adding information from secure rippers as to whether the track was ripped securely? Secure rips could be kept in the db even if there is only one rip of that pressing (though the pressing issue may go away), until shown to be inaccurate.

I thought you had mentioned before that you would be able to compare different pressings by knowing the pressing offset?

pls1
02-21-2008, 06:41 PM
Perhaps I'm confused about this, but I'm having two infrequent problems. The first is on CDs where all but one or two tracks have a match in AccurateRip, and the one or two tracks that do NOT match are still listed as Secure. These first rips have all been on Plextor 760 or 750 drives.

Now usually, from knowing the classical music business, these seem to be mostly re-packaged CDs. But then I have had two different CDs of madrigals where two of the middle 15 tracks show as not in AccurateRip yet Secure, and I know there has been only one pressing worldwide. I've now seen about 10 of these.

More problematic is where the track shows as matching AccurateRip but the log shows errors, and a re-rip can generate a different checksum. Again, maybe about 10 tracks. I've been re-ripping on multiple different drives (now Plextor 230A drives) to get a clean, consistent rip.

While not statistically significant, given my sample size of a few thousand tracks, I estimate each of these anomalies has occurred at a rate of roughly 0.25%.

Perhaps I just don't really understand this, and don't get me wrong, dBpoweramp is a great product; 99.75% automated quality confidence is nothing to complain about. However, I'm carefully monitoring full error logs and need a personal work-flow to clear these anomalies as I rip my collection.

Phil

Spoon
02-22-2008, 04:38 AM
(have a cup of coffee before reading this...)

6. I think I have the solution! As it stands, the database holds for each track (forget pressings for the moment) a track CRC (which has the flaw) and an offset-finding CRC (which does not have the flaw).

I will be talking about 2 databases, side by side: the existing database is DB1 and the new one is DB2.

[DB1] Work should be done in EAC and dBpoweramp ASAP to correct the flaw; each program should calculate 2 CRCs, the old one and the new one. Only the new one should be submitted once the fix is implemented. The old CRCs would in time be replaced by the new CRCs in the same database.

[DB2] In addition, two CRC32s should be generated:

[CRC1][..............CRC2............][CRC1]

So CRC1 covers, say, the first 5 frames and the last 5 frames of the track, and CRC2 covers the whole track. These 2 CRCs could be submitted to a 2nd database, where CRC1 will go into the current offset-finding slot: no changes on the backend (apart from creating the 2nd database)!

Why do this? It would allow a match if your CD is a different pressing and not really in the database: no rolling CRCs are needed, because the CRC the existing database uses to find drive offsets can find the offset of the pressing, and as long as it is within +/- 5 frames, the pressing can be verified. It also has a benefit for track 1 (which currently is only calculated from 5 frames in): for any drive with a + offset it would have the correct CRC1, so all of track 1 could be verified in its entirety (not possible for the last track, as the majority of drives cannot overread). A sketch of these two CRCs follows.
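A C sketch of the two [DB2] checksums (a CD frame/sector is 588 stereo samples; the checksum primitive here is a placeholder for illustration, not the real routine, and it assumes the track is longer than 10 frames):

    #include <stddef.h>
    #include <stdint.h>

    #define SAMPLES_PER_FRAME 588   /* one CD sector = 588 stereo samples */
    #define EDGE_FRAMES         5

    /* Placeholder checksum primitive; accumulates over a sample range. */
    static uint32_t checksum(uint32_t seed, const uint32_t *samples, size_t count)
    {
        for (size_t i = 0; i < count; i++)
            seed = seed * 31 + samples[i];   /* simple mixing, not the real CRC */
        return seed;
    }

    /* CRC1 covers the first and last 5 frames (the offset-finding slot);
       CRC2 covers the whole track. */
    void db2_crcs(const uint32_t *track, size_t len,
                  uint32_t *crc1, uint32_t *crc2)
    {
        size_t edge = (size_t)EDGE_FRAMES * SAMPLES_PER_FRAME;
        uint32_t c1 = checksum(0, track, edge);        /* leading 5 frames  */
        c1 = checksum(c1, track + len - edge, edge);   /* trailing 5 frames */
        *crc1 = c1;
        *crc2 = checksum(0, track, len);               /* entire track      */
    }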

When I started AccurateRip, the idea of pressings shifting the audio data was not known (to me). If you had 40 different pressings of the same CD (possible with worldwide releases over 10 years), that lowers the 1-in-4-billion collision chance of a working 32-bit CRC routine to about 1 in 100 million; adding the 2nd CRC would boost the check to effectively 64 bits. Then AccurateRip could return:

Match using the old CRC method,
Partial pressing match (10 frames of the file missing),
Match using the fixed CRC method (32-bit), plus a CRC32 match (on CRC1 and CRC2, so the whole track).

All that would need to be done is a method of showing which of the above applies to the end user.
Changing to MD5 would mean rewriting the whole backend, and there is about 30x more code on the backend to keep the database clean of rogue data, such as submissions from drives configured with wrong offsets.

Steve_Gabriel
02-22-2008, 06:33 PM
Spoon asked me to post details of the flaw in the Accurate Rip CRC to Hydrogen Audio, so there's a parallel thread going on there about this topic.

http://www.hydrogenaudio.org/forums/index.php?showtopic=61468

I first contacted spoon privately about this bug because I thought it was quite a serious error. I want to thank him for inviting me to bring this up publicly.

AccurateRip as it now stands is blind to 3% of the possible single-bit errors in a file. Luckily, a CD read error tends to scatter bad bits all over the frame and is very, very likely to be detected even by the current faulty algorithm. However, if errors appear only in certain MSBs of the right channel, they will not be detected.

This is not nearly as bad as it sounds. If we assume the bit errors are randomly distributed (a big assumption, but not completely crazy, since the data has been through 2 layers of error correction, C1 and C2, which tends to randomize erroneous output), then each additional bit error cuts the probability of non-detection by 97%.

The crude formula for the probability that an error in your file goes undetected is

P(undetected) = 2 ^ -(5 * number_of_bits_wrong)

This formula is accurate down to about 2 ^ -32, so if there are at least 6 bits wrong in the file, you've reached the full detection power of a 32-bit checkword.
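Worked through: a 2-bit error escapes detection with probability 2 ^ -10, about 0.1%; a 4-bit error with 2 ^ -20, about 1 in a million; and a 6-bit error at 2 ^ -30 is already at the practical floor of a 32-bit checkword.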

EliC
02-27-2008, 08:48 PM
Any new system should also be built from the ground up to be able to verify lossless rips after the fact, especially as more entries are added to AR2.

funkyblue
02-28-2008, 03:03 AM
If there is going to be a new DB, what about changing the offset as well? Wasn't there some thought that the offset settings we use are off by 30?

Spoon
02-28-2008, 04:30 AM
It would mean EAC and dBpoweramp would have to change, and it would create such confusion about offsets that it is not worth it.

funkyblue
02-28-2008, 05:22 AM
I forgot about the offset confusion :) But it could still be done, since there will be a new database anyway.

Porcus
02-28-2008, 06:04 AM
Some thoughts, which may be technically nonsense (I am not sure I have correctly understood how AccurateRip works):

1) on backwards compatibility:

[DB1] Work should be done in EAC and dBpoweramp ASAP to correct the flaw; each program should calculate 2 CRCs, the old one and the new one. Only the new one should be submitted once the fix is implemented. The old CRCs would in time be replaced by the new CRCs in the same database.
Is it a good idea to submit only the new one? Wouldn't it be better to submit both, and score the new CRC up or down according to the accuracy of the old CRC for the same rip? This would let you give a better estimate of accuracy for tracks with one or few new-CRC submissions.

I think one should pay attention to how much time it has taken to populate the AR database. The physical CD format is in decline (which, on the other hand, might increase the need for secure ripping, as those of us who care about sound quality might buy second-hand collections ...). If the number of AR submissions grows exponentially at a high rate, then maybe an AR2 database will be useful in a short time. Just think of it.


2) offset issues
2a) a "check this file" feature?
I know it would be hard to prevent multiple submissions, but if you are going to consider an update of AccurateRip, it might be worth having this in mind. Ideally one should be able to check even rips made with an incorrect offset: take a folder with n WAVs, compute k CRCs corresponding to candidate offsets (takes time, but on the user's computer ...), find one which matches, and use that offset to check the other files in the folder. At least it would help to confirm a lonely AR entry, and if one has to "adjust for offset" in the file, then one knows that it is not the same rip as the one in the database.

2b) store offset used?
More generally, a "file ripped with offset x" datapoint in the AR base would certainly require some bits, but if two files ripped with different offsets match, then one is safer: they are not multiple entries of the same rip, and AFAIK not from the same drive or model.

2c) different pressings?
And then: is this a way of dealing with different pressings? Are different pressings usually bit-identical up to different offsets? (Hm, I suspect they would also differ by pressing-specific bit errors, hence a need for secure ripping?)



A suggestion: would it be an idea to store users' AR entries and lookups locally? (Voluntarily, for privacy reasons.) It could prevent multiple submissions.



Bear in mind that the existing database, before any of this new construction, takes up some 14 GB.

Is that much? Not in terms of hard disc cost ...

Fiber
02-28-2008, 06:23 AM
Is that much? Not in terms of hard disc cost ...

It's not the hard disk cost; it's the load on the server. ;)

Porcus
02-29-2008, 08:02 AM
It's not the hard disk cost; it's the load on the server. ;)
That's why I asked ;)

EliC
02-29-2008, 09:57 AM
Forgive me, but how does a larger db increase server load? This clearly is an area that I know nothing about.

pls1
02-29-2008, 12:15 PM
Pretty standard stuff for database performance tuning. See the Microsoft SQL Server page, two screens down:
http://technet.microsoft.com/en-us/library/bb508963.aspx

tourrilhes
03-08-2008, 02:07 AM
Hi,

While you are thinking of improving AccurateRip, I have a few suggestions.

I think it would be interesting to know whether an AccurateRip match is composed only of drives of the same type, or whether there is a diversity of drive types.

This would help in two ways:

1) If I re-rip the same disc on the same drive, I would know whether I match my previous result or not.

2) If you assume that drives can have repeatable firmware bugs, one would be able to gauge the confidence of a rip with more certainty, as more drive diversity is better.

For example, I can assume that Plextor is pretty popular for ripping. If I rip with a Plextor and match with confidence 4, and all 4 of those other matches were also done on Plextors, I get less confidence than if all 4 were on different drives.

One quick way to implement that: for each AR record, you associate an offset field. For the first rip, when the AR record is created, you set the record offset to the offset of the drive that did the rip. For subsequent rips, if the offset is the same, you don't change anything; if the offset of the new rip differs from the record offset, you invalidate the offset. Then, when checking the AR record, I can compare the AR record's offset with my own. A sketch follows.
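A C sketch of that record update (field names hypothetical), using a sentinel value to mean "multiple offsets seen":

    #include <stdbool.h>
    #include <stdint.h>

    #define OFFSET_MIXED INT16_MIN  /* sentinel: matches came from several offsets */

    struct ar_record {
        uint32_t crc;
        uint16_t confidence;
        int16_t  offset;            /* drive offset of the submissions, or OFFSET_MIXED */
    };

    /* First submission stores the drive offset; later submissions either
       confirm it or invalidate it, flagging drive diversity. */
    void submit(struct ar_record *rec, bool first, int16_t drive_offset)
    {
        if (first) {
            rec->offset = drive_offset;
            rec->confidence = 1;
            return;
        }
        rec->confidence++;
        if (rec->offset != OFFSET_MIXED && rec->offset != drive_offset)
            rec->offset = OFFSET_MIXED;   /* at least two different drive types */
    }

A lookup client could then treat a record whose offset equals its own drive's offset as weaker evidence than one marked as mixed.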

In a similar vein, you could keep in the record some idea of the diversity of ripping programs and ripping modes used. As we know, some programs in some modes may introduce repeatable errors. Again, greater diversity gives greater confidence.

Thanks again for AccurateRip, and good luck with the new version...

Jean

EliC
03-18-2008, 03:22 PM
What is the current status of AR2? With the revelation that AR is not as foolproof as we thought, I think AR2 needs to be a very high priority, as well as an option to force secure ripping in dBpoweramp.

Spoon
03-18-2008, 03:33 PM
I am taking my time; there is no point in rushing into a decision which you cannot change for 5 years...

funkyblue
03-18-2008, 03:39 PM
Nothing in life is 100% perfect....

pls1
03-18-2008, 03:51 PM
Hi,

If I rip with a Plextor and match with confidence 4, and all 4 of those other matches were also done on Plextors, I get less confidence than if all 4 were on different drives.

Jean

That is why I'm using a drive that was not in the AccurateRip database (a B900A) for the 30% of the discs in my collection that are actually in AccurateRip. No "false positives" yet.

For those that aren't in AccurateRip (or don't match), I'm doing my own rip compares using at least two different drives (three or four different drives if there were re-rips). I don't doubt the existence of these errors, as I've read the various threads. However, except for the last-track read-in/read-out business, I have not found a repeatable error (after 1500 discs) when dBpoweramp returns "Secure". Maybe I will in the next few thousand.

I will be curious, after AR2 gets implemented, what the incidence of "false positives" actually was in AR1. "False negatives" don't bother me, since most of my collection isn't in AccurateRip. A commercial ripping business of course has different concerns, but these may be impossible to meet from the perspective of validated statistical process control.

Phil

tourrilhes
03-18-2008, 04:31 PM
What is the current status of AR2? With the revelation that AR is not as foolproof as we thought, I think AR2 needs to be a very high priority, as well as an option to force secure ripping in dBpoweramp.

Trivial. Setting -> Secure Setting. You can disable the check for AccurateRip. In that case, dBpoweramp will do the secure ripping.

But you will have to ask yourself whether that is enough diversity. You are still using the same drive and the same reading method; to me, that's not enough diversity.

I personally agree with Spoon: he should consider all aspects of the issue and not move too fast.

Jean

EliC
03-18-2008, 04:38 PM
I don't want to disable AR, but nor do I want to rely on it as much as I currently do.

An accurate match in AR, plus a matching re-read with no C2 errors, would be good enough, I think.

tourrilhes
03-18-2008, 04:53 PM
That is why I'm using a drive that was not in the AccurateRip database (a B900A) for the 30% of the discs in my collection that are actually in AccurateRip. No "false positives" yet.


The B900A uses a Panasonic chipset, and many LG drives use a Panasonic chipset too. I think the ripping behaviour of your drive would be very close to other drives based on the Panasonic chipset, if not exactly the same; when Panasonic does a new chipset, they largely cut and paste from the old one.

I personally would not worry so much about this AccurateRip "flaw". Most errors are clustered and affect both channels, so they are very unlikely to go undetected. And I would use a cheaper drive for ripping ;-)


For those that aren't in AccurateRip (or don't match), I'm doing my own rip compares using at least two different drives (three or four different drives if there were re-rips).


Are they different chipsets? If I remember correctly, they are, so you should be good. Note that when ripping on the second drive, you could use only a single pass.


I don't doubt the existence of these errors, as I've read the various threads. However, except for the last-track read-in/read-out business, I have not found a repeatable error (after 1500 discs) when dBpoweramp returns "Secure". Maybe I will in the next few thousand.


My experience is that some of the errors did not happen the way I expected. I was expecting only random errors, and I've seen random errors on scratched discs, but I'm also seeing consistent errors on pristine CDs. Those errors are actually harder to detect, and I feel most people would overlook them.


I will be curious, after AR2 gets implemented, what the incidence of "false positives" actually was in AR1. "False negatives" don't bother me, since most of my collection isn't in AccurateRip. A commercial ripping business of course has different concerns, but these may be impossible to meet from the perspective of validated statistical process control.

Phil

Actually, I'm much more interested in how AR2 will address the new type of consistent errors that seem to be reported. This, I believe, is more crucial.

Have fun...

Jean

tourrilhes
03-18-2008, 04:59 PM
I don't want to disable AR, but nor do I want to rely on it as much as I currently do.

An accurate match in AR, plus a matching re-read with no C2 errors, would be good enough, I think.

Anyway, I believe that re-reading on the same drive using the same method is totally useless for detecting some classes of consistent errors (not enough diversity). You'd be much better off doing your second pass on another drive with a different chipset, or using EAC in secure mode for the second pass (not as good, IMHO).

So, leave AR on. If the confidence is too low, just put the CD in the other drive and try again; dBamp will show whether the CRC matches or not (or check the logs). This is effectively what I do with EAC.

Of course, now the trouble is that you have to find two good ripping drives instead of only one ;-)

Jean

pls1
03-18-2008, 05:47 PM
Are they different chipsets? If I remember correctly, they are, so you should be good. Note that when ripping on the second drive, you could use only a single pass.

Jean

I've tried to find that elusive diversity and have looked up the chipsets (as well as opinions that offsets matter as much or more). So I've got my primary two models, common Plextor-branded drives that are definitely different chipsets:
PX750/760
PX230

I only use a single pass for confirmation and have three varieties of confirming drives: the B900, various Samsungs (same chipsets, I believe), and the PX-40TS.

As far as I can tell, this gives me five drives that actually are different, all of which are at least decent. Any other specific suggestions on drives would be welcome.

Phil

EliC
03-28-2008, 12:14 PM
Will the AR disc ID and disc TOC be enough to look up current rips in AR2? Can a utility to confirm already-ripped files be a priority for AR2?

Spoon
04-04-2008, 05:23 AM
Yes

EliC
05-21-2008, 10:55 PM
Just wondering where AR2 development stands?

Spoon
05-22-2008, 04:16 AM
Once R13, CD Writer and Batch Ripper are released, work can begin.

EliC
06-16-2008, 04:44 PM
Just a follow-up: I am about to do a large rip and would like to wait for AR2, so the discs can be added to the new db. Any word?

Spoon
06-17-2008, 07:18 AM
AR2 will only be of any use once the database becomes populated with AR2 data, so even if it were released tomorrow it would take 6-12 months for AR2 data to fill the db.

EliC
06-17-2008, 09:52 AM
I understand that. However, I want to help populate it, AND I hope to be able to check my files against AR2 later, once it is populated; this may only be possible if they are ripped with AR2 enabled and an AR2 ID is assigned.

Spoon
06-17-2008, 10:52 AM
No, they would both use the same disc IDs, so future lookups would be possible from AR1 to AR2.

EliC
06-30-2008, 06:17 PM
Will the release of AR2 solve the different-pressing problem for AR1 entries, or only for new entries in AR2? There has been some hint/hope that AR1 would be able to verify across pressings, right?

Spoon
07-01-2008, 06:17 AM
AR1 will not work with different pressings.

EliC
07-01-2008, 08:03 AM
Just clarifying: I was wondering if AR2 will be able to use AR1 entries across different pressings.

Porcus
07-31-2008, 05:36 AM
Bump! Are the algorithms etc. fixed, or are we still in discussion mode? ;)

(And, in the latter case: there might be a bigger brain trust over at Hydrogenaudio?)

EliC
07-31-2008, 08:55 AM
Bump! Are the algorithms etc. fixed, or are we still in discussion mode? ;)

(And, in the latter case: there might be a bigger brain trust over at Hydrogenaudio?)

The issue has been addressed at HA

http://www.hydrogenaudio.org/forums/index.php?showtopic=61468

EliC
09-05-2008, 08:09 PM
Another month, just wondering if there is any word?

Porcus
10-17-2008, 09:02 AM
Spoon,

you are surely aware that there is now independent offset-correcting software available for retroactive AccurateRip verification. Not that I can find any good documentation on TripleFlac or fooAccRip, but at least the CueTools developer is active at Hydrogenaudio, and I pointed him to dBpoweramp's choice of tagging for AccurateRipDiscId and result; see http://www.hydrogenaudio.org/forums/index.php?showtopic=66233&st=50 and his response.


Given that these applications are starting to come around, and assuming that AccurateRip2 will be able to identify offset-only-different pressings, I think that now might be the appropriate time to discuss the design of new tags. You may or may not consider the design of AR2 closed, and likewise the design of the tag reporting; if either is still open for discussion, you may rather want to involve Hydrogenaudio, with its bigger braintrust.


But to just state an example, an AccurateRip2Result tag could look like:

AccurateRip2Result=AccurateRip2: Accurate (confidence 10) @ +99, Accurate (confidence 8) [xxxxxxxx] @ -14, ...

where the "+99" and "-14" are changes in offset compared to the drive's standard offset. Now there is an issue: what order should the accuracies be stated in? IMHO there is a difference between a "disc offset" which makes pressing1 = pressing2, and "track offsets" which makes some but not all tracks match (in which case there either is something wrong, or some pressing has extra zeroes somewhere). Maybe you even want (1) the +0 result, then (2) the best discoffset result, then (3) the best trackoffset result?

Spoon
10-17-2008, 03:32 PM
AR2 will be pressing-immune, so it does not need to show the pressing offset.

Porcus
10-20-2008, 05:50 AM
Just in case it is still open for discussion, a request: I wish AccurateRip2 to be immune to the presence of data tracks too, as some pressings include data tracks absent in others.

An idea, by the way: as dBpoweramp users have AccurateRip tags in their lossless files, a new version of dBpoweramp might include an "AccurateRip wants your old files" option: let the user point you to a music library folder, scan for lossless files, calculate checksums, and submit them to AccurateRip2. As time goes by and AR2 is populated, include an option to update the AccurateRip tags.

Edit: I guess EAC tags could be recognized too.

EliC
11-16-2008, 07:23 AM
Any updates or timelines on the AR2 release?

Spoon
11-17-2008, 05:58 AM
In the new year.