dBpoweramp Batch Ripper: Discussions


  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

We need to clear this up with AMG as well; currently our contract makes no differentiation between commercial and non-commercial users of dBpoweramp Music Converter (and ultimately CD Ripper), and Batch Ripper uses CD Ripper. I will discuss it with AMG.



  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

A quick question with regard to metadata access.

The 'Professional' version includes access to AMG for 1 year. Is this AMG license a commercial license?

My current software requires that we pay for each hit of the metadata database, so it would be really good to clear this up.



  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Metadata- OK, I understand that it is only a test setup.

Ripping Speed- As I said, they rip fine using MP3 in the same program, so the drives are not the problem.

Disk I/O- If the raw audio (WAV) is being ripped to disk, then yes, this could definitely cause a problem. The software I currently use used to do this, so I loaded the program directory into a 2GB RAM disk (which is where it stored its temp files) and it ran a bit quicker, but it only works on one encoding at a time. Now it rips and encodes straight out to the disk. In Linux you can use a pipe to direct the output and achieve this, but I have no idea about Windows.
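The Linux-style piping idea can be sketched in Python with `subprocess`; the two child commands here are stand-ins (not real ripper or encoder tools), showing only how a producer's output reaches a consumer without a temp file ever touching the disk:

```python
import subprocess
import sys

# Illustrative only: these two Python one-liners stand in for a ripper
# and an encoder. The producer's stdout is wired directly into the
# consumer's stdin, so no intermediate file is written.
producer = subprocess.Popen(
    [sys.executable, "-c", "print('fake audio data')"],
    stdout=subprocess.PIPE)
consumer = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write(sys.stdin.read().upper())"],
    stdin=producer.stdout, stdout=subprocess.PIPE)
producer.stdout.close()  # let the producer see a broken pipe if the consumer dies
out, _ = consumer.communicate()
print(out.decode())  # FAKE AUDIO DATA
```

The same pattern works on Windows, which is partly why piping between the rip and encode stages is an appealing alternative to temp files.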

Encoding speed- Yes, a single encoding process might only use 10% CPU, but as I mentioned there were about 20 running. That should easily max out all the cores, but it doesn't. I also use exactly the same codecs in other programs, where they don't exhibit this problem and will max out individual cores. I would love to help sort this out, but I know jack about Windows programming.

Well, if this is the place for ideas, then please find my list below:
1. Ripping job reports. Professional reports with job references, number of rips, and options to add re-rips, rip quality, etc. I am sure there are lots more things that could be added to this list.
2. Identify when a CD is missing from a box set. Use the metadata to see if any disc is missing from a set (e.g. disc 7 from a 10 disc set).
3. When a disc is ripped without a bonus disc, have the option to remove the disc number from the album title. (I am aware that the bonus disc may be added later; it just might be a useful option.)
4. Make all disc number suffixes the same: just 'Disc 1', not 'CD01', etc.
5. Ability to group discs together in a virtual group as they get ripped. This would help sort out the nightmare of ripping large box sets containing discs that have also been released as separate albums.
6. Option to remove version tags such as [Australian Album] or [Limited Edition]. Most people don't care about these.
7. Feed corrections back to the data sources. Some way to flag data as incorrect and send back the changes that have been made, meaning less effort if doing the same disc again. (I am fully aware this is doing their work for them, but we have to do it anyway.)
8. Be able to shorten editing lists to just album artist and album names. This makes editing incorrect album names much quicker. (When I ran my own metadata database I had this feature and it was extremely useful.)
9. Active highlighting of errors in metadata. Keep a verified list of artists and highlight any that aren't exactly the same as in the list (again, I used to have this feature and it saved a load of time). Referencing against MusicBrainz would be good for this, as the way it links artists to albums etc. leaves little room for mistakes in the list. Taken to the extreme it could also include a spell checker (although that may be overkill). Corrections that have been made in the past should be re-applied automatically, so you only have to change 'Fat Boy Slim' to 'Fatboy Slim' once.
10. If we are not happy with the data, the ability to check it against the other sources from within a tag editor.
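Idea 9 could be sketched roughly as follows; the artist names, the verified list, and the `check_artist` helper are all illustrative, not anything dBpoweramp provides:

```python
# Sketch of idea 9: flag artists missing from a verified list and
# automatically re-apply corrections that have been made before.
VERIFIED = {"Fatboy Slim", "Moby"}
KNOWN_FIXES = {"Fat Boy Slim": "Fatboy Slim"}  # past manual corrections

def check_artist(name):
    if name in KNOWN_FIXES:            # corrected once before: reuse it
        return KNOWN_FIXES[name], "auto-corrected"
    if name in VERIFIED:
        return name, "ok"
    return name, "flag for review"     # highlight for the operator

print(check_artist("Fat Boy Slim"))    # ('Fatboy Slim', 'auto-corrected')
print(check_artist("Mobby"))           # ('Mobby', 'flag for review')
```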

    OK. Think that will do for now.



  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

About metadata:

We have literally spent 10 minutes plugging a makeshift metadata system in; it is only partly functional as it stands. Soon a big push will go into the metadata system, so if someone tells me that in its current state it is not very good and inconsistent, I can believe that.



  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

If using WMA Lossless: an older version of the encoder had a bug where it would eat up memory. Make sure the latest codecs are installed (dBpoweramp Configuration >> Codecs >> Update Check).



  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by Spoon
    If you ripped to the lossless on its own, the CPU usage would only peak at 10%, there is nothing wrong, encoding is being done as fast as possible (ie disc length / 10).

    If you ripped to the lossy, the CPU would be 100% and ripping would take (disc length * 2)
    Spoon -

I was responding to this post from RipTheWorld, which found lossless codecs (not lossy) bogging down:

OK, have done a fairly extensive set of tests with different codecs, and the conclusion is that none of the lossless codecs are performing very well. They appear to use all of the processing cores and seem to perform fairly well at the beginning, but as soon as there is a backlog of CoreConverter processes, things slow down pretty quickly. They quickly start using up a lot of memory (you'll need at least 2GB for 4+ drives), and with 5 drives going I had over 20 encoding processes with WMA Lossless. A similar thing happens for all lossless codecs. This is all on XP, btw.
    Hence my focus on disk IO issues vs. CPU issues.

    -brendan



  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Take this theoretical example:

A drive rips at a constant x10 (1.7MB a second).
One codec encodes at x100 speed (as a lossless codec might).
Another, slow lossy codec encodes at x0.5.

If you ripped to the lossless codec on its own, CPU usage would only peak at 10%; there is nothing wrong, encoding is being done as fast as possible (i.e. disc length / 10).

If you ripped to the lossy codec, the CPU would be at 100% and ripping would take (disc length * 2).

If CPU usage is not at 100%, it might not need to be with lossless codecs; they tend to be efficient at encoding. Check the drives' x speeds.
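The arithmetic above can be written out as a toy model (illustrative only; this is not how dBpoweramp actually schedules work):

```python
# Toy model of the rip-vs-encode example: total time is bounded by the
# slower of the two stages, and CPU load follows the encode/rip ratio.
def rip_and_encode_time(disc_len_s, rip_speed=10, encode_speed=100):
    rip_time = disc_len_s / rip_speed        # drive at constant x10
    encode_time = disc_len_s / encode_speed  # codec at its x speed
    cpu_pct = min(100, 100 * encode_time / rip_time)
    return max(rip_time, encode_time), cpu_pct

# Fast lossless codec (x100): rip-bound, disc length / 10, ~10% CPU.
print(rip_and_encode_time(3600))                    # (360.0, 10.0)
# Slow lossy codec (x0.5): encode-bound, disc length * 2, 100% CPU.
print(rip_and_encode_time(3600, encode_speed=0.5))  # (7200.0, 100)
```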
    ------

    >Can I suggest a different forum area to brainstorm on ideas that people might want included?

    This thread is fine.



  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by RipTheWorld
Nope, it wasn't paging that much, if at all. Once I had put more memory in the system (up to 2GB) there was always a minimum of 600MB free, and it should only start paging if there isn't enough RAM left. The processes seem to take 10-15MB each on average. The CDGrab processes take up a lot more, ~150MB each (5 x 150MB = 750MB!). Still, there are some real conflicts going on somewhere; do they share a lock file maybe?

I don't see why disk I/O should really be a problem either. If the tracks are ripped into RAM, then the only disk I/O should be straight to the hard drive. Bandwidth from the DVD drives is definitely not a problem; you need something like 10+ DVD drives before you start worrying about I/O bandwidth (assuming you're not using IDE, of course). That may obviously be a little different for secure rips, though. As for encoded files being written to disk: if they are being encoded out of RAM to a single SATA II hard drive, it should happily cope with at least 8-9 lossless file writes.

Like I said, this system can throughput 100 CDs/hour normally. This is reduced to about 85 CDs/hour for lossless, but that is more due to CPU constraints (even though it is a quad core overclocked to 2.75GHz).

I do know how these programs work, as when I first started nearly 4 years ago I wrote my own software. Unfortunately that was on Linux and therefore didn't have the codec support, which is why I switched to alternative software based on Windows (and also for ease of management and access to alternative metadata sources). Just trying to help, as I think this program has real potential.
    Ok, it's not swap.

However, the rip and transcode processes are actual separate processes, not threads in the same process. I've noticed that the cdgrab processes exit in burst mode while the coreconverter processes continue to work on the conversion. If the coreconverter processes themselves aren't taking a lot of memory, then it is clear they are working from files (regardless of whether the file is memory mapped).

So, if the converter processes take over from where the cdgrab processes left off, the entire WAV data of the track being converted has to be stored somewhere. If it's not being stored in memory associated with the coreconverter process, and the cdgrab process has exited, then it is being stored on disk, unless there is some Windows-specific piping/socket behavior I am not familiar with. Twenty coreconverter processes and five cdgrab processes mean there are twenty five files being written and twenty files being read, all in parallel. Right?
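The tally above works out as a back-of-envelope count, assuming (as in the post) that each ripper writes one temp WAV and each converter reads one WAV while writing one encoded file:

```python
# Count concurrent disk streams for N ripper and M converter processes,
# under the assumption that converters work from temp WAV files on disk.
def io_streams(rippers, converters):
    writes = rippers + converters  # temp WAVs being written + encoded outputs
    reads = converters             # each converter reads one temp WAV
    return writes, reads

print(io_streams(5, 20))  # (25, 20): twenty five written, twenty read
```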

That might not be easy to avoid, especially as some codecs are multi-pass and cannot simply throw away the data piped or otherwise presented to them until at least one pass through has been completed.

    Spoon?

    -brendan



  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Ran a quick test with 1 disc against a few codecs. The lossy codecs all max out (or attempt to) at least 1 core on the processor. None of the lossless ones seem to do this; can I assume they are handled in a different way?

As for the multiple sources of metadata, that doesn't really explain or solve the problem. Having used AMG, GD3, FreeDB and MusicBrainz, as well as our own in-house database, I can tell you that having a consistent main source is half the battle. This is not so much of an issue with an album by a single artist, but when you start getting into box sets and compilations it can become quite a big issue, as the different data sources have different structures for this data. Is this an issue with hitting the AMG database or with the generation of the DiscID?

Say you have a 10 CD box set and it pulls half the lookups from one database, 4 more from another, and the last one from a third; it can then be tricky putting the box set back together when all you have to rely on is the metadata (especially with classical music).
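The failure mode can be illustrated with made-up records; each source formats the album title differently, so naive grouping by title splits one box set into several "albums":

```python
# Illustrative only: three sources return discs of the same box set
# with different title conventions.
discs = [
    {"title": "Complete Works, Disc 1",  "source": "AMG"},
    {"title": "Complete Works CD02",     "source": "GD3"},
    {"title": "Complete Works [Disc 3]", "source": "FreeDB"},
]

groups = {}
for d in discs:
    groups.setdefault(d["title"], []).append(d["source"])

print(len(groups))  # 3 -- one "album" per source instead of one box set
```

Normalizing titles before grouping (or keying on a shared release identifier) is what puts the set back together.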

    Can I suggest a different forum area to brainstorm on ideas that people might want included?



  • RipTheWorld
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Nope, it wasn't paging that much, if at all. Once I had put more memory in the system (up to 2GB) there was always a minimum of 600MB free, and it should only start paging if there isn't enough RAM left. The processes seem to take 10-15MB each on average. The CDGrab processes take up a lot more, ~150MB each (5 x 150MB = 750MB!). Still, there are some real conflicts going on somewhere; do they share a lock file maybe?

I don't see why disk I/O should really be a problem either. If the tracks are ripped into RAM, then the only disk I/O should be straight to the hard drive. Bandwidth from the DVD drives is definitely not a problem; you need something like 10+ DVD drives before you start worrying about I/O bandwidth (assuming you're not using IDE, of course). That may obviously be a little different for secure rips, though. As for encoded files being written to disk: if they are being encoded out of RAM to a single SATA II hard drive, it should happily cope with at least 8-9 lossless file writes.
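The drive-bandwidth claim holds up on the back of an envelope; the figures here (raw CD audio at ~176 KB/s per 1x, drives ripping around x40) are illustrative assumptions, not measurements from the post:

```python
CD_1X_KB_S = 176.4  # raw CD-DA audio data rate at 1x, in KB/s

def aggregate_mb_s(drives, rip_speed=40):
    """Combined raw-audio throughput of several drives, in MB/s."""
    return drives * rip_speed * CD_1X_KB_S / 1024

print(round(aggregate_mb_s(10), 1))  # ~68.9 MB/s for ten drives at x40
```

Even ten drives at a generous x40 stay well under what a single modern disk or SATA link can sustain for sequential writes, which is why the CPU and process-scheduling side is the more likely bottleneck.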

Like I said, this system can throughput 100 CDs/hour normally. This is reduced to about 85 CDs/hour for lossless, but that is more due to CPU constraints (even though it is a quad core overclocked to 2.75GHz).

I do know how these programs work, as when I first started nearly 4 years ago I wrote my own software. Unfortunately that was on Linux and therefore didn't have the codec support, which is why I switched to alternative software based on Windows (and also for ease of management and access to alternative metadata sources). Just trying to help, as I think this program has real potential.



  • EliC
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Can we have the option to set a temporary encoding directory, separate from the final destination? I store everything on external drives and try to minimize the read/write cycles to those drives, so having a temp directory on the local machine doing the encoding would be ideal.



  • EliC
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Originally posted by Spoon
    Metadata access currently is very limited (just relies on AMG), the final release will smart pick from 4 or 5 providers.
Will there be any interface to manually override the "smart picks"?



  • Spoon
    replied
    Re: dBpoweramp Batch Ripper: Discussions

Lossless codecs also heavily hit the hard disks; for example, on a fast machine WavPack can encode and write as fast as writing the uncompressed WAV file, so if you have 4 drives encoding to multiple lossless formats, your drive will be the slowest point. A gigabit network with a fast NAS drive, or some kind of RAID array, would help.

By default dBpoweramp would not multi-core any lossless codec because of the above, but if you are using the multi-encoder it can bypass that.

    Metadata access currently is very limited (just relies on AMG), the final release will smart pick from 4 or 5 providers.



  • bhoar
    replied
    Re: dBpoweramp Batch Ripper: Discussions

    Spoon is working on the changer support.

    -brendan



  • peterfs
    replied
    Re: dBpoweramp Batch Ripper: Discussions

I just picked up a Sony XL1B3 changer and I'm trying to use it with Batch Ripper. The drive is recognized and it's able to rip the first CD I inserted; however, the ripper is not sending the correct commands to eject the CD and load the subsequent ones. It loops on failure because it sees each CD as a duplicate of the previous one.

I'm running Windows XP and I don't want to have to use MCE or Vista in order to get the Sony software working. Plus I think the dBpoweramp ripper is the coolest... :-) Is there anything I can do to help get this combination working?

