Mastering Red Book-Today-Yesterday

Sparrow wrote on 2/23/2004, 6:31 AM
Many of us made hundreds of CD masters with the old 4.5 version of CD
Architect software. When it went away, we tried to keep older SCSI
Windows 98 and 2000 computers alive, because 4.5 worked so well
and we knew how to use it.

The new version 5 is cool, but it has taken some time to figure out some
of the changes. The "auto gap" problem is one, and we do set the preference to 0 rather than the automatic 2 seconds between cuts.

With Sound Forge 7.0 and the ability to zoom in on the waveform display,
it becomes hard to determine final start times that both new and older
consumer CD/DVD players can react to, so that songs do not start late or
cut off the first partial measure of the track. So should there be some kind of safe "gap" to protect the start time of the master, or do we have to do "test" burns with consumer players?
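One thing worth keeping in mind when deciding on a head pad: Red Book audio is addressed in 1/75-second frames, and at 44.1 kHz that works out to exactly 588 samples per frame, so a track start always lands on a frame boundary. Here's a minimal sketch of the arithmetic; the 200 ms pad is just an illustrative value, not a recommendation from anyone here:

```python
# Red Book CD audio is addressed in 1/75-second frames (sectors).
# At 44.1 kHz that is exactly 44100 / 75 = 588 samples per frame.
SAMPLE_RATE = 44100
SAMPLES_PER_FRAME = SAMPLE_RATE // 75  # 588

def pad_to_whole_frames(pad_ms):
    """Round a desired head pad (in milliseconds) up to whole CD frames,
    returning (frames, samples) so the track start sits on a frame boundary."""
    samples_wanted = round(pad_ms / 1000 * SAMPLE_RATE)
    frames = -(-samples_wanted // SAMPLES_PER_FRAME)  # ceiling division
    return frames, frames * SAMPLES_PER_FRAME

# Example: a 200 ms safety pad before the first transient
frames, samples = pad_to_whole_frames(200)
print(frames, samples)  # 15 frames, 8820 samples
```

Whether a pad that size is audible as a "late" start is exactly the kind of thing a test burn on a sluggish consumer player would settle.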

As far as mastering goes, it still seems better to do it in Version 7, save it, and
not count on a bunch of "fancy stuff" in Version 5. Especially if there are
several engineers using the same computer. All of these "temp" files for pan, volume, etc. can get lost on desktops with several large hard drives!

The "clip" detection is nice, but what does it actually hear? A logical description
of what the Sound Forge and CD Architect VU meters read would also help. If you study
a big hit CD, like Eminem, it sits in the red the whole time. Is that clipping, or complex, rather
nasty digital compression?

Happy Spring..

Sparrow

Comments

Geoff_Wood wrote on 2/23/2004, 7:07 PM
What's wrong with SCSI on XP? My stack of 7 drives works fine! And I know how to use CDA5.

You set the gap preference to whatever rocks your booty. 2 seconds is the default in any burning app I've ever seen.


"...it becomes hard to determine final start times". Use CDA5, watch the meters and listen. Do your tonal and dynamics mastering in SF if you must, but leave final fades and track positioning to CDA.

Mastering being better in one rather than the other depends on the depth to which you are mastering.

Clip detection tells you when you have 3(?) or more consecutive samples at FS (meaning there is a fair chance that the original signal would have gone higher). This has nothing to do with hypercompression, whether the source is digital or otherwise. It also won't tell you anything about a previously clipped and now attenuated peak. VU meters themselves will only serve to hide peaks, not show them.
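The "consecutive samples at full scale" idea sketches out simply. This is an assumption about how such detectors work in general, not Sound Forge's actual algorithm, and the run length of 3 mirrors the (questioned) figure above:

```python
def count_clips(samples, full_scale=32767, run_len=3):
    """Count flat-topped runs: run_len or more consecutive samples pinned
    at full scale suggest the original signal wanted to go higher.
    (run_len=3 is the guessed threshold from the discussion above.)"""
    clips = 0
    run = 0
    for s in samples:
        if abs(s) >= full_scale:
            run += 1
            if run == run_len:  # count each flat-topped run once
                clips += 1
        else:
            run = 0
    return clips

# A lone full-scale sample is a legitimate peak, not a clip;
# four pinned samples in a row get flagged as one clipped run.
print(count_clips([0, 32767, 0, 32767, 32767, 32767, 32767, 0]))  # 1
```

Note what this misses, per the point above: attenuate a clipped file by even 0.1 dB and the samples are no longer at FS, so the flat top survives but the detector goes silent.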

Maybe they'll get the extra bits sorted in CDA5.5 or 6. Ha ha.


geoff