Thread: dBpoweramp Batch Ripper: Discussions

  1. #46
    dBpoweramp Guru
    Join Date
    May 2004
    Posts
    1,175

    Re: dBpoweramp Batch Ripper: Discussions

    Can we have the option to set a temporary encoding directory, separate from the final destination? I store everything on external drives and try to minimize the read/write cycles to those drives, so having a temp directory on the local machine doing the encoding would be ideal.

  2. #47

    Re: dBpoweramp Batch Ripper: Discussions

    Nope, it wasn't paging much, if at all. Once I had put more memory in the system (up to 2GB) there was always a minimum of 600MB free, and it should only start paging if there isn't enough RAM left. The processes seem to take 10-15MB each on average. The CDGrab processes take up a lot more, ~150MB each (5 x 150MB = 750MB!). Still, there are some real conflicts going on somewhere; do they share a lock file, maybe?

    I don't see why disk I/O should really be a problem either. If the tracks are ripped into RAM then the only disk I/O should be straight to the hard drive. Bandwidth from the DVD drives is definitely not a problem; you need something like 10+ DVD drives before you start worrying about I/O bandwidth (assuming you're not using IDE, of course). That may be a little different for secure rips, though. As for encoded files being written to disk: if they are being encoded out of RAM to a single SATA II hard drive, then it should happily cope with at least 8-9 lossless file writes.

    Like I said, this system can normally throughput 100 CDs/hour. This drops to about 85 CDs/hour for lossless, but that is more due to CPU constraints (even though it is a quad core overclocked to 2.75GHz).

    I do know how these programs work, as when I first started nearly 4 years ago I wrote my own software. Unfortunately that was on Linux and therefore didn't have the codec support, which is why I switched to alternative Windows-based software (and also for ease of management and access to alternative metadata sources). Just trying to help, as I think this program has real potential.

  3. #48

    Re: dBpoweramp Batch Ripper: Discussions

    Ran a quick test with one disc against a few codecs. The lossy codecs all max out (or attempt to) at least one core on the processor. None of the lossless ones seem to do this; can I assume they are handled in a different way?

    As for the multiple sources for metadata, that doesn't really explain or solve the problem. Having used AMG, GD3, FreeDB and MusicBrainz as well as our own in-house database, I can tell you that having a consistent main source is half the battle. This is not so much of an issue with an album by a single artist, but when you start getting into box sets and compilations it can become quite a big issue, as the different data sources have different structures for this data. Is this an issue with hitting the AMG database or with the generation of the DiscID?

    Say you have a 10-CD box set and it pulls half the lookups from one database, four more from another and the last one from a third; it can then be tricky putting the box set back together when all you have to rely on is the metadata (especially with classical music).

    Can I suggest a different forum area to brainstorm on ideas that people might want included?

  4. #49
    dBpoweramp Guru
    Join Date
    Sep 2006
    Posts
    1,173

    Re: dBpoweramp Batch Ripper: Discussions

    Quote Originally Posted by RipTheWorld
    Nope, it wasn't paging much, if at all. Once I had put more memory in the system (up to 2GB) there was always a minimum of 600MB free, and it should only start paging if there isn't enough RAM left. The processes seem to take 10-15MB each on average. The CDGrab processes take up a lot more, ~150MB each (5 x 150MB = 750MB!). Still, there are some real conflicts going on somewhere; do they share a lock file, maybe?

    I don't see why disk I/O should really be a problem either. If the tracks are ripped into RAM then the only disk I/O should be straight to the hard drive. Bandwidth from the DVD drives is definitely not a problem; you need something like 10+ DVD drives before you start worrying about I/O bandwidth (assuming you're not using IDE, of course). That may be a little different for secure rips, though. As for encoded files being written to disk: if they are being encoded out of RAM to a single SATA II hard drive, then it should happily cope with at least 8-9 lossless file writes.

    Like I said, this system can normally throughput 100 CDs/hour. This drops to about 85 CDs/hour for lossless, but that is more due to CPU constraints (even though it is a quad core overclocked to 2.75GHz).

    I do know how these programs work, as when I first started nearly 4 years ago I wrote my own software. Unfortunately that was on Linux and therefore didn't have the codec support, which is why I switched to alternative Windows-based software (and also for ease of management and access to alternative metadata sources). Just trying to help, as I think this program has real potential.
    Ok, it's not swap.

    However, the rip and transcode processes are actual separate processes, not threads in the same process. I've noticed that the cdgrab processes exit in burst mode while the coreconverter processes continue to work on the conversion. If the coreconverter processes themselves aren't taking a lot of memory, then it is clear they are working from files (regardless of whether the file is memory mapped).

    So, if the converter processes take over from where the cdgrab processes left off, the entire WAV data of the track being converted has to be stored somewhere. If it's not being stored in memory associated with the coreconverter process, and the cdgrab process has exited, then it is being stored on disk, unless there is some Windows-specific piping/socket behavior I am not familiar with. Twenty coreconverter processes and five cdgrab processes means there are twenty-five files being written and twenty files being read, all in parallel. Right?

    That might not be easy to avoid, especially as some codecs are multi-pass and cannot simply throw away the data piped or otherwise presented to them until at least one pass has been completed.

    Spoon?

    -brendan

  5. #50
    Administrator
    Join Date
    Apr 2002
    Posts
    43,859

    Re: dBpoweramp Batch Ripper: Discussions

    Take this theoretical example:

    A drive rips at a constant x10 (1.7MB a second).
    One codec encodes at x100 speed (as a lossless codec might).
    Another, slow lossy codec encodes at x0.5.

    If you ripped to the lossless codec on its own, CPU usage would only peak at 10%; there is nothing wrong, encoding is being done as fast as possible (i.e. disc length / 10).

    If you ripped to the lossy codec, the CPU would be at 100% and ripping would take (disc length * 2).

    If there is not 100% CPU usage with lossless codecs, there might not need to be; they tend to be efficient at encoding. Check the drives' x speeds.
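    The arithmetic above can be sketched quickly (this is an assumed pipeline model for illustration, not dBpoweramp's actual code): a rip-and-encode pipeline runs at the speed of its slower stage, and the codec's CPU load follows from the ratio of the two speeds.

```python
def rip_time(disc_length, drive_speed, codec_speed):
    """Wall-clock time to rip and encode, limited by the slower stage."""
    return disc_length / min(drive_speed, codec_speed)

def codec_cpu(drive_speed, codec_speed):
    """Approximate CPU load: the fraction of one core the codec keeps busy."""
    return min(1.0, drive_speed / codec_speed)

disc = 3600.0  # a 60-minute disc, in seconds of audio

# Lossless (x100): the x10 drive is the bottleneck, so CPU sits at ~10%
# and the rip takes disc length / 10.
print(rip_time(disc, 10, 100), codec_cpu(10, 100))   # 360.0 0.1

# Slow lossy (x0.5): the codec is the bottleneck, CPU is pinned at 100%
# and the rip takes disc length * 2.
print(rip_time(disc, 10, 0.5), codec_cpu(10, 0.5))   # 7200.0 1.0
```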
    ------

    >Can I suggest a different forum area to brainstorm on ideas that people might want included?

    This thread is fine.

  6. #51
    dBpoweramp Guru
    Join Date
    Sep 2006
    Posts
    1,173

    Re: dBpoweramp Batch Ripper: Discussions

    Quote Originally Posted by Spoon
    If you ripped to the lossless codec on its own, CPU usage would only peak at 10%; there is nothing wrong, encoding is being done as fast as possible (i.e. disc length / 10).

    If you ripped to the lossy codec, the CPU would be at 100% and ripping would take (disc length * 2).
    Spoon -

    I was responding to this post from RipTheWorld, which found a bogging down with lossless codecs (not lossy):

    OK, I have done a fairly extensive set of tests with different codecs, and the conclusion is that none of the lossless codecs are performing very well. They appear to use all of the processing cores and seem to perform fairly well at the beginning, but as soon as there is a backlog of CoreConverter processes, things slow down pretty quickly. They quickly start using up a lot of memory (you'll need at least 2GB for 4+ drives), and with 5 drives going I had over 20 encoding processes with WMA Lossless. A similar thing happens for all lossless codecs. This is all on XP, btw.
    Hence my focus on disk IO issues vs. CPU issues.

    -brendan

  7. #52
    Administrator
    Join Date
    Apr 2002
    Posts
    43,859

    Re: dBpoweramp Batch Ripper: Discussions

    If you are using WMA Lossless, a previous older version of the encoder had a bug where it would eat up memory; make sure the latest codecs are installed (dBpoweramp Configuration >> Codecs >> Update Check).

  8. #53
    Administrator
    Join Date
    Apr 2002
    Posts
    43,859

    Re: dBpoweramp Batch Ripper: Discussions

    About metadata:

    We have literally spent 10 minutes plugging in a makeshift metadata system, and it is only partly functional as it is. Soon a big push will go into the metadata system; if someone were to tell me that in its current state it is not too good and inconsistent, I could believe that.

  9. #54

    Re: dBpoweramp Batch Ripper: Discussions

    Metadata: OK, I understand that it is only a test setup.

    Ripping speed: like I say, they rip fine using MP3 in the same program, so that is not the problem.

    Disk I/O: if the raw audio (WAV) is being ripped to disk then yes, this could definitely cause a problem. The current software I use used to do this, so I loaded the program directory into a RAM disk (2GB), as this is where it stored the temp files, and it ran a bit quicker. But it does only work on one encoding at a time. Now it rips and encodes straight out to the disk. On Linux you can use a pipe to direct the output to achieve this, but I have no idea about Windows.
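    The Linux pipe approach described above could be sketched like this (a hypothetical helper, not dBpoweramp's tooling): wire the ripper's stdout straight into the encoder's stdin, so no temporary WAV ever touches the disk.

```python
import subprocess

def pipe_rip(ripper_cmd, encoder_cmd):
    """Run `ripper_cmd | encoder_cmd` and return the encoder's exit code."""
    ripper = subprocess.Popen(ripper_cmd, stdout=subprocess.PIPE)
    encoder = subprocess.Popen(encoder_cmd, stdin=ripper.stdout)
    ripper.stdout.close()  # so the ripper sees SIGPIPE if the encoder dies
    encoder.wait()
    ripper.wait()
    return encoder.returncode
```

    On a Linux box this might be, for example, `pipe_rip(["cdparanoia", "-q", "1", "-"], ["flac", "-s", "-o", "track01.flac", "-"])` (example commands, not what Batch Ripper actually invokes); the same Popen wiring also works on Windows, minus the SIGPIPE semantics.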

    Encoding speed: yes, a single encoding process might only use 10% CPU, but like I mentioned there were about 20 running. This should easily max out all the cores but doesn't. I also use exactly the same codecs in other programs and they don't exhibit this problem; they will max out individual cores. I would love to help further and sort this out, but I know jack about Windows programming.

    Well, if this is the place for ideas, then please find my list below:
    1. Ripping job reports. Professional reports with job references, number of rips, and options to add re-rips, rip quality, etc. I am sure there are lots more things that could be added to this list.
    2. Identify when a CD is missing from a box set. Use the metadata to see if any disc is missing from a set (e.g. disc 7 from a 10-disc set).
    3. When a disc is ripped without a bonus disc, have the option to remove the disc number from the album title. (I am aware that the bonus disc may be added later; it just might be a useful option.)
    4. Make all disc number suffixes the same: just 'Disc 1', not 'CD01', etc.
    5. Ability to group discs together in a virtual group as they get ripped. This would help sort out the nightmare of ripping large box sets with discs that have been released as separate albums.
    6. Option to remove version tags such as [Australian Album] or [Limited Edition]. Most people don't care about these.
    7. Feed back changes that have been made to the data sources. Some way to flag data as incorrect and send back the changes that have been made, as this means less effort when doing the same disc again. (I am fully aware that this is doing their work for them, but we have to do it anyway.)
    8. Be able to shorten editing lists to just album artist and album name. This makes editing incorrect album names much quicker. (When I ran my own metadata database I had this feature and it was extremely useful.)
    9. Active highlighting of errors in metadata. Have a verified list of artists and highlight any that aren't exactly the same as in the list (again, I used to have this feature and it saved a load of time). I think referencing against MusicBrainz for this would be good, as the way they link artists to albums etc. means there is little room for mistakes in the list. Taken to the extreme it could also include a spell checker (although this may be overkill). Artists that have been incorrect in the past and updated should be changed automatically, so you only have to change 'Fatboy Slim' to 'Fat Boy Slim' once.
    10. If we are not happy with the data, the ability to check it against the other sources from within a tag editor.

    OK. Think that will do for now.

  10. #55

    Re: dBpoweramp Batch Ripper: Discussions

    A quick question with regard to the metadata access.

    You have the 'Professional' version, which includes access to AMG for one year. Is this AMG license a commercial license?

    My current software requires that we pay to access the metadata for each hit of the database. It would be really good to clear this up.

  11. #56
    Administrator
    Join Date
    Apr 2002
    Posts
    43,859

    Re: dBpoweramp Batch Ripper: Discussions

    We need to clear this up with AMG as well; currently our contract has no differentiation between commercial and non-commercial users of dBpoweramp Music Converter (and ultimately CD Ripper; Batch Ripper uses CD Ripper). I will discuss it with AMG.

  12. #57
    dBpoweramp Guru
    Join Date
    Sep 2006
    Posts
    1,173

    Re: dBpoweramp Batch Ripper: Discussions

    Quote Originally Posted by RipTheWorld
    Just trying to help as I think this program has real potential.
    RTW - what hardware (robotic, changer or otherwise) are you using in your environment? Make/model and, if possible, a link to a picture if it's not clear (e.g. MediaForm/MF-Digital uses the name "Scribe" for at least two different hardware types).

    -brendan

  13. #58

    Re: dBpoweramp Batch Ripper: Discussions

    RTW - what hardware (robotic, changer or otherwise) are you using in your environment? Make/model and, if possible, a link to a picture if it's not clear (e.g. MediaForm/MF-Digital uses the name "Scribe" for at least two different hardware types).
    Well, I think the best one to start with is the MediaTechnics 4-drive robot with a 600-disc capacity. Picture here: http://www.mediatechnics.com/images/.../new4drive.gif

    It may well have been re-badged from another manufacturer, but the one I have came from MediaTechnics. I am fairly certain that if you contact them they will give you the serial command set. Of special interest is the 'Shake' command, which picks up a disc and then shakes it to try to free CDs stuck to the bottom.

    The other robot I have is not currently running Windows (and may not in the future), so it's best to stick with this one for now.

  14. #59

    Re: dBpoweramp Batch Ripper: Discussions

    I didn't check this the other day, but is there a way to set the ripping so that if the disc is found in AccurateRip it will be ripped in burst mode, and if not it will be done via a secure rip?

    Ripping profiles that could be saved in conjunction with file formats would be very useful. Some sort of basic if/and/or functionality would help:

    if [AccurateRip-available] {
        rip in burst mode
        check AccurateRip
        if [burst-mode-error] {
            secure rip
        }
    } else {
        secure rip
        if [secure-rip-correct] {
            submit to AccurateRip
        }
    }

    Automatic AccurateRip submission is a must, I think, as it would increase the database size pretty quickly.
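    The strategy above could be made concrete roughly like this (hypothetical step names; dBpoweramp exposes no such API): pick the rip actions based on whether the disc is in AccurateRip and whether the burst rip verifies.

```python
def plan_rip(in_accuraterip, burst_ok=True, secure_consistent=True):
    """Return the ordered list of actions the ripper should take."""
    if in_accuraterip:
        actions = ["burst rip", "verify against AccurateRip"]
        if not burst_ok:
            actions.append("secure rip")  # fall back only on a mismatch
    else:
        actions = ["secure rip"]
        if secure_consistent:
            actions.append("submit result to AccurateRip")
    return actions

print(plan_rip(True))                  # disc known: burst rip, then verify
print(plan_rip(True, burst_ok=False))  # mismatch: fall back to secure rip
print(plan_rip(False))                 # unknown disc: secure rip, then submit
```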

  15. #60
    dBpoweramp Guru
    Join Date
    Sep 2006
    Posts
    1,173

    Re: dBpoweramp Batch Ripper: Discussions

    Quote Originally Posted by RipTheWorld
    Well, I think the best one to start with is the MediaTechnics 4-drive robot with a 600-disc capacity. Picture here: http://www.mediatechnics.com/images/.../new4drive.gif

    It may well have been re-badged from another manufacturer, but the one I have came from MediaTechnics. I am fairly certain that if you contact them they will give you the serial command set. Of special interest is the 'Shake' command, which picks up a disc and then shakes it to try to free CDs stuck to the bottom.
    Cool. I've got the two-drive variation of that robot (Fusion PX 250-disc with the snub-length/newer picker) and do have an early (2003) revision of the documentation on the serial command set for the Fusion series (I might also have later copies). The shake mode has been very nice when dealing with sticky discs.

    I have found the manufacturers/resellers somewhat reluctant to share programming/SDK documentation unless you have a specific product/solution in mind and an existing business. However, Spoon should have no problem here.

    Quote Originally Posted by RipTheWorld
    The other robot I have is not currently running Windows (and may not in the future), so it's best to stick with this one for now.
    Still, I am curious about that one as well.

    -brendan
