dBpoweramp Batch Ripper: Discussions


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Sounds like you're being forced into paging hell. Let's say for a 50-minute CD there are 10 tracks; each track is then ~50MB of audio data. So each conversion thread needs 50MB of RAM for input data and, say, 35MB of RAM for output data. That's 85MB per thread. You're piling up 20 threads, or 1.7GB of data, feeding into or out of the core converters, which is going to cause a lot of paging I/O.

    Perhaps there should be a memory-limited clamp on how many transcoding threads/processes can be queued at once?
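A clamp like that could be as simple as sizing a semaphore from available RAM. A minimal sketch in Python, using the 85MB-per-thread figure from the arithmetic above (the 512MB reserve and the probe of installed RAM are illustrative assumptions, not anything dBpoweramp actually does):

```python
import threading

MB = 1024 * 1024

# Rough per-thread footprint from the arithmetic above:
# ~50 MB input PCM + ~35 MB output buffer per conversion.
PER_THREAD_BYTES = 85 * MB

def max_concurrent_converters(available_ram_bytes, reserve_bytes=512 * MB):
    """Clamp the converter pool so queued jobs fit in physical RAM.

    A reserve is held back for the OS and the rippers themselves;
    at least one converter is always allowed so work can't stall."""
    usable = max(available_ram_bytes - reserve_bytes, 0)
    return max(usable // PER_THREAD_BYTES, 1)

# Example: a 2 GB machine gets (2048 - 512) // 85 = 18 converter slots.
limit = max_concurrent_converters(2 * 1024 * MB)
converter_slots = threading.BoundedSemaphore(int(limit))
```

Each CoreConverter launch would then acquire a slot before starting and release it on exit, so the backlog blocks instead of paging.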

    -brendan


  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Burst mode is the standard ripping method. Does burst mode have any error detection? My current software implements a halfway solution: it notifies of errors on tracks and does minor correction, but doesn't go down the full Secure rip route. I guess a process more similar to the CD Paranoia libraries. This seems to be the best trade-off (speed vs. accuracy) for most bulk processors like me.

OK, I have done a fairly extensive set of tests with different codecs and the conclusion is that none of the lossless codecs perform very well. They use all of the processing cores and seem to perform fairly well at the beginning, but as soon as there is a backlog of CoreConverter processes things slow down pretty quickly. They quickly start using up a lot of memory (you'll need at least 2GB for 4+ drives), and with 5 drives going I had over 20 encoding processes with WML. A similar thing happens for all lossless codecs. This is all on XP, btw.

I managed to get the program to deliver similar performance to my existing software by using the following:
MP3 224Kbps CBR - Fast Encoding

As with any software, changing this to higher settings will max out the CPU more and therefore reduce performance further, and dBpoweramp uses about 10-15% more CPU. That could be down to the LAME build, or to the way the audio is processed: ripped and encoded as 2 separate processes, as opposed to ripped and immediately encoded out to disk. I guess the first way could be better if the CPU can't cope with the data stream, as it could slow the rip process down, but the second way has much lower memory requirements and won't choke the CPU with processes. It would be great to have the option to choose.
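The second scheme (rip and immediately encode) amounts to a bounded producer/consumer handoff. A minimal sketch, assuming nothing about dBpoweramp's internals; the point is that a small `maxsize` caps memory because the ripper blocks instead of buffering whole discs:

```python
import queue
import threading

# With maxsize=2 the ripper can never get more than two tracks ahead
# of the encoder, so ripped-but-unencoded audio can't pile up in RAM.
track_queue = queue.Queue(maxsize=2)
SENTINEL = None  # signals end of disc

def ripper(tracks):
    for track in tracks:
        track_queue.put(track)  # blocks when the encoder falls behind
    track_queue.put(SENTINEL)

def encoder(results):
    while True:
        track = track_queue.get()
        if track is SENTINEL:
            break
        results.append(f"encoded:{track}")

results = []
t1 = threading.Thread(target=ripper, args=(["track01", "track02", "track03"],))
t2 = threading.Thread(target=encoder, args=(results,))
t1.start(); t2.start(); t1.join(); t2.join()
```

The trade-off described above falls out of the queue size: a huge `maxsize` behaves like the two-separate-processes scheme (fast rips, lots of memory), while a small one behaves like rip-and-immediately-encode.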

I ran the same set of discs maybe 30 times to do the comparisons, and I was often getting discs not matching a metadata entry. Even stranger, it was always the same 2 discs; if it had been different ones I could have blamed it on not hitting the database.

Let me know if there are any other tests you need run.


  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Nothing is preventing it; we simply do not have code to query the disc type in any of the communication methods.


  • EliC
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by Spoon
    >Can the ripper be set to reject audio cds with a book type for a CD-R or CD-RW?

    No
Is there anything preventing this feature from being added as an option?


  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    If coreconverter is getting stuck, post the audio format you are encoding to and the Windows version you are using.


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by RipTheWorld
    2. Can't cancel the batch job, just seems to sit there and do nothing. I have to go through and kill the processes manually.
    The End Batch button only ends the batch when the currently loaded discs finish ripping.

    To force an End Batch to happen faster, with alpha2, I:

    1. Hit the End Batch button.
2. For each drive, right click and tell it to skip/cancel the disc.
3. For each drive, right click and tell it to skip/cancel the track.

    Originally posted by RipTheWorld
3. Doesn't seem to be running at anything like full speed. I have it on a Quad Core with up to 6 drives and the highest the CPU usage goes is 15%. On my other software, for the same codec, it's around 70-80%. I am guessing this is causing the really slow rip speeds I am seeing. I have opened up several copies of the standard CD ripper (as it would appear this is what Batch Ripper does) and started several processes; it says that it is using all 4 cores, and yet Task Manager shows barely any usage on each core. This machine can normally do around 100 CDs an hour. It would seem that some of the CoreConverter.exe processes get stuck and don't finish properly. It starts off pretty quickly, seems to get bogged down after a minute or so, and then everything grinds to a halt.
    I suspect it's not the converter that is the delay but the cdgrab process that feeds it with the audio data being backlogged.

    Do you have the drives configured for secure ripping? If so, retry it using burst ripping instead. I've set my testing machines to burst ripping, since I'm looking for pure throughput (at the moment).

    If that goes quickly, then see if you can configure your drives for AccurateRip (if not done already). If you can't, anything but burst is going a lot slower.

    If you can, then enable Secure ripping and retest. Remember, secure ripping purposely rereads and will always take longer, unless the disc is both in the AccurateRip database *and* has matching results on the first burst rip. AccurateRip still has some big holes in it, so be aware that a lot of discs will be slow to rip regardless, at least the first couple of times that you rip them. If you come back months later and try again, they will rip faster if they are clean and defect free.

    Then enable ultra-secure ripping and retest.

    -brendan


  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

OK, I realise this is the beta and I will be happy to send some feedback as and when I can.

    Is there a wish list for new features or anything like that?
I have some requests which I would like to see integrated, and some additions to the program lists which I am fairly certain would be an advantage to everyone.

    Just a few things to start with:
1. Sometimes it doesn't look up a disc. Tried once and it was fine; the next time round it didn't recognise it.
    2. Can't cancel the batch job, just seems to sit there and do nothing. I have to go through and kill the processes manually.
3. Doesn't seem to be running at anything like full speed. I have it on a Quad Core with up to 6 drives and the highest the CPU usage goes is 15%. On my other software, for the same codec, it's around 70-80%. I am guessing this is causing the really slow rip speeds I am seeing. I have opened up several copies of the standard CD ripper (as it would appear this is what Batch Ripper does) and started several processes; it says that it is using all 4 cores, and yet Task Manager shows barely any usage on each core. This machine can normally do around 100 CDs an hour. It would seem that some of the CoreConverter.exe processes get stuck and don't finish properly. It starts off pretty quickly, seems to get bogged down after a minute or so, and then everything grinds to a halt.

    I'll leave it at those for now as they are the really big problems I can see.

Just so it's not all doom and gloom, here is some of the stuff I like:
    1. The beep when there is an error (seriously, this is actually pretty useful)
    2. The way you can see what has happened with the discs that have been processed previously
3. That it will do formats that a lot of the alternatives won't


  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    >Can the ripper be set to reject audio cds with a book type for a CD-R or CD-RW?

    No

    --------

    bhoar:

If a load fails (i.e. a non-audio disc) then it should always unload the disc and reject; you can try with one of those clear plastic discs which are at the top and bottom of CD-R spindles.


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by bhoar
    So...would it be possible for you to make the load.exe disc-detection routine a bit more robust?
    Correction: clearly I *meant* to say:

    "So...would it be possible for you to make the post-load.exe disc-detection routine a bit more robust?"

That is, when load.exe exits back to the batch ripper, the batch ripper should be smarter about recognizing that a disc was loaded, even if it isn't the type expected or if the drive is still trying to identify the type.

    -brendan


  • EliC
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Can the ripper be set to reject audio cds with a book type for a CD-R or CD-RW?


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Spoon -

    I'm testing the robustness of the disc recognition routines in the alpha2 batch ripper. Specifically, I've got a stack of 100 data CDs and data CD-Rs to see how the batch ripper handles a "garbage in" situation.

    When the robot loads a non-audio disc, the batch ripper usually notices that the disc is not an audio disc and rejects it, calling the reject.exe. This seems to work fine with the dvd-rw drive in my baxter. Using some older LG CD drives in a Primera Composer Pro (with my own loader routines), sometimes the non-audio disc isn't recognized and instead of rejecting, the batch ripper calls for another load. This can lead to drive and/or disc damage due to multiple discs in a drive.

    I presume this is happening because the batch ripper doesn't think there is a disc in the drive at all. This might be dependent on drive-specific spin up times and format recognition routines, which is why we didn't run into it with the baxter using a Pioneer drive, but do run into it using the LG 52x CD drives.

    In diagnosing this, I updated my substitute-for-baxter executables to add a sleep command line option, and have added a 15 second sleep at the end of the load and tray close process before exiting back to the batch ripper. This seems to have cleared up 95% of the problem. However, I still do get occasional double data disc loads. I suspect that I could reduce it even more by adding more delay, but law of diminishing returns, or something.

    So...would it be possible for you to make the load.exe disc-detection routine a bit more robust? Perhaps some additional commands to the drive to double check the drive is really empty before asking for another load instead of calling for a reject?
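One generic way to do that "double check" is to replace a fixed sleep with a bounded poll of the drive. A sketch, assuming nothing about dBpoweramp's internals; `drive_has_disc` is a hypothetical caller-supplied probe (e.g. a SCSI TEST UNIT READY query against the drive), not a real dBpoweramp call:

```python
import time

def wait_for_disc(drive_has_disc, timeout=15.0, interval=0.5):
    """Poll until the drive reports a disc, or give up after `timeout`.

    drive_has_disc: caller-supplied probe that returns True when the
    drive has recognized a disc. Returns as soon as the probe succeeds,
    so a fast-spinning drive costs ~0.5s instead of a fixed 15s sleep,
    while a slow drive still gets the full window.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if drive_has_disc():
            return True
        time.sleep(interval)
    return False  # drive genuinely empty -> safe to call for another load
```

Only a `False` result would trigger another load; anything else becomes a reject, which should avoid the double-load case without paying the worst-case delay on every disc.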

    -brendan


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    As far as I am aware, you're on the "beta list" already: the alpha is a public alpha! You can already test the alpha using non-automated drive towers or the single-drive minicubis/baxter units.

    So far, the alpha is not so good at handling running out of discs (think: Energizer Bunny). But if that's the worst problem I've encountered in my (< 20 disc) testing...

    Spoon's interface to the autoloaders doesn't yet have external support to handle the serial-controlled ones, but he did provide a simple (and robust) way of interfacing with loaders when he created the config screen and interface for the minicubis/baxter. So, if you can write some pretty simple code (e.g. VB or if you're lazy like me, AutoIT), here's how you would do it:

    1. Develop your own load.exe, unload.exe, reject.exe, pre-batch.exe and post-batch.exe executables. I use a single executable, renamed five times.

2. Duplicate one of the baxter subdirs and rename it as appropriate. Spoon's code looks for some contents here in order to allow the config app to give you access to all five command lines, so do this even if you are copying over files you think aren't necessary.

    3. Replace the five exes with your own.

    4. In the batch ripper configuration app, modify the command line invocations when configuring each drive.
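The single-executable-renamed-five-times trick from step 1 can be sketched by dispatching on the program's own filename. This is a hypothetical stub (the action names match the five exes above, but the print statements stand in for real autoloader commands, e.g. serial writes for your hardware):

```python
#!/usr/bin/env python3
"""One stub, renamed five times: load, unload, reject, pre-batch, post-batch.

Dispatches on its own filename, logs the action, and exits 0 so the
batch ripper treats the step as successful."""
import os
import sys

def main(argv):
    # "load.exe" -> "load", "pre-batch.exe" -> "pre-batch", etc.
    action = os.path.splitext(os.path.basename(argv[0]))[0].lower()
    handlers = {
        "load": lambda: print("loading next disc"),
        "unload": lambda: print("unloading disc to accept bin"),
        "reject": lambda: print("unloading disc to reject bin"),
        "pre-batch": lambda: print("homing loader before batch"),
        "post-batch": lambda: print("parking loader after batch"),
    }
    handler = handlers.get(action)
    if handler is None:
        print(f"unknown action: {action}", file=sys.stderr)
        return 1
    handler()
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv))
```

The exit code matters: a non-zero exit is how the stub would tell the batch ripper the mechanical step failed.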

    Before going into what I've done personally, I'd like to hear more from spoon on the direction of the device support.

    -brendan


  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

I also run a Bulk Ripping service and have 600 Disc Autoloaders and 300 Disc Auto Loaders from Mediatechnics and MF Digital, as well as a whole load of machines set up to do manually loaded rips.

I have a whole heap of ideas that I would like to see integrated into software such as this. It is good to see a software supplier asking the people who will actually make use of it for their ideas.

I would be more than happy to discuss some of these ideas and would obviously love to get on the Beta list.

    Feel free to PM me and ask any questions.


  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    "Combined Speed x73"

    Oh yeah, that's what I'm talking about.

    (Two 52x drives, burst mode, clean discs.)

    -brendan
    Last edited by bhoar; September 27, 2007, 08:22 PM.


  • hagak
    replied
    Re: dBpoweramp Batch Ripper: Discussions

You are correct, the Nakamichi drives show up as separate LUNs, and the software would require knowledge of this setup to work. Which is why I asked if they would work with this ripper.

The script I wrote in Perl used cdparanoia to access the drives. I basically coded the script to sequentially handle each drive, defining in code how many there were, then just ran five instances of the script in parallel. It worked OK, but I think the machine I was using (an old, slow machine) had issues doing that much work at once and would get resource conflicts. Of course, those conflicts could be a timing issue when switching CDs.
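The five-instances-in-parallel setup can be folded into one script with a worker per drive. A sketch (the `/dev/sg*` device paths are illustrative; `-d` selects the cdparanoia drive and `-B` writes one trackNN.cdda.wav file per track):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def rip_command(device):
    """cdparanoia invocation for one drive: -d picks the device,
    -B splits the disc into one file per track."""
    return ["cdparanoia", "-d", device, "-B"]

def rip_all(devices, run=subprocess.run):
    """Rip every drive in parallel, one worker per drive.

    Ripping is I/O-bound, so threads are enough; each worker handles
    its own drive sequentially, the same shape as running N copies of
    a per-drive script. `run` is injectable for testing."""
    with ThreadPoolExecutor(max_workers=len(devices)) as pool:
        return list(pool.map(lambda d: run(rip_command(d)), devices))

# Example (device paths are illustrative):
# rip_all(["/dev/sg2", "/dev/sg3", "/dev/sg4", "/dev/sg5", "/dev/sg6"])
```

Keeping the drive list in one place also makes it easier to throttle (e.g. fewer workers than drives) if an old machine hits resource conflicts.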
