Dear Steven,
Well, you asked for comments about
the procedure; and I'm afraid that, the way I was brought up, there are a number
of variables you haven't considered. For coarse-groove discs in the BBC, the
1kHz calibration disc (numbered DOM 2) had its tone recorded at an RMS stylus
velocity of 2 cm/sec. This was used for calibrating the disc-cutting machines in
the BBC, and we were mandated to cut a short track of 2 cm/sec 1kHz tone at the
start of a session (on both machines, if it was likely to continue beyond one
side). Using the BBC standard Peak Programme Meter (PPM), the tone was supposed
to read zero; and actual programme was supposed to peak 8dB above this
when replayed using the appropriate characteristic (another can of worms I
won't open just now). I will, however, mention that the standard RS/8 playback
stylus had a 2.5 thou spherical tip. On a PPM, +8dB corresponds to the peak
transmission voltage, after which alarm-bells would ring at the A.M.
transmitters. So the tone had a precise function: discs could be put straight
on-air without anyone having to take
level. (They were sometimes put on air even while the disc was being cut -
a technique that totally disappeared until the digital
age!)
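
To put that level arithmetic in modern terms, here is a minimal sketch (mine,
not anything we actually used at the BBC); the 2 cm/sec reference and the +8dB
programme peak come from the figures above, and the function names are purely
illustrative:

import math

REF_VELOCITY_CM_S = 2.0          # DOM 2 tone: 1kHz at 2 cm/sec RMS stylus velocity

def level_db_re_tone(velocity_cm_s):
    """Level in dB relative to the 2 cm/sec calibration tone (PPM zero)."""
    return 20.0 * math.log10(velocity_cm_s / REF_VELOCITY_CM_S)

def velocity_at_level(level_db):
    """RMS stylus velocity corresponding to a level in dB re the tone."""
    return REF_VELOCITY_CM_S * 10.0 ** (level_db / 20.0)

print(level_db_re_tone(2.0))     # 0.0   -> the tone reads PPM zero
print(velocity_at_level(8.0))    # ~5.02 -> programme peaks around 5 cm/sec RMS

So a +8dB programme peak corresponds to roughly 5 cm/sec RMS at 1kHz.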
I mention this because, when I was a
technical operator (before I started working in studios), we were told to
play cellulose-nitrate test-cuts to check cutting-styli, and under the aforesaid
measurement procedures the surface-noise had to be at least 40dB below zero
level unweighted. The PPM had an attack-time of 10 milliseconds (the definition
of this being that if a 10ms pulse of zero-level tone was put into the meter, it
had to reach within 1dB of zero level). I don't think there was a
tolerance figure for the "system noise", but to be meaningful it would have to
be at least 10dB better than this.
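
For anyone who wants to see what that attack-time definition implies, here is
a toy numerical model (my own illustration, not the actual PPM circuit): a
full-wave rectifier into a one-pole integrator, with a time constant of about
4.5 ms chosen as an assumption so that a 10ms burst of zero-level tone reads
about 1dB under the continuous-tone reading, as the definition requires.

import numpy as np

FS = 192_000                     # simulation sample rate
TAU = 0.0045                     # assumed attack time constant, seconds

def ppm_reading(signal):
    """Peak of the rectified signal through a one-pole attack filter."""
    alpha = 1.0 - np.exp(-1.0 / (FS * TAU))
    state, peak = 0.0, 0.0
    for x in np.abs(signal):
        state += alpha * (x - state)   # first-order integrator
        peak = max(peak, state)
    return peak

t = np.arange(int(FS * 0.1)) / FS
tone = np.sin(2 * np.pi * 1000 * t)    # continuous zero-level 1kHz tone
burst = tone * (t < 0.010)             # the same tone gated to 10ms

shortfall = 20 * np.log10(ppm_reading(burst) / ppm_reading(tone))
print(f"10ms burst reads {shortfall:.2f} dB relative to continuous tone")
# With a time constant of about 4.5 ms this comes out close to -1 dB.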
For microgroove, the reference level
was 1 cm/sec, but this had three different recording characteristics at
different dates.
Peter
-----Original Message-----
From: Steven Smolian [mailto:smolians@xxxxxxxxx]
Sent: 01 August 2001 03:01
To: "ARSCLIST@xxxxxxxxxxxxxxxxx"@galileo.cc.rochester.edu; AV Media Listserv
Subject: arsclist Analog system noise

Forgive the cross-postings. The question
concerns all of us doing sound preservation from disc sources.
Has anyone come up with a number for the acceptable
amount of noise in an analog system, stylus tip to converter, and how to measure
it?
I've been using my digital meters with a peak hold
feature. I play the old Victor (actually Western Electric) test
record with a 1 kHz continuous tone and bring all levels to zero throughout the
system to the converter, where I set it at -2 dB. (If I don't, then
random shellac noises will push me into the red and I get no useful
number.) I then remove the stylus from the groove, reset, and see what
number comes up. I then subtract 2 to account for my -2 setting, and
assume that is a reading of analog system noise.
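
In other words, a minimal sketch of that arithmetic (the names are
illustrative, not from any particular metering software): the tone is set to
-2 dBFS at the converter, the stylus is lifted, and the peak-hold reading of
the idle chain is referred back to the tone level.

TONE_SETTING_DBFS = -2.0         # 1 kHz test-record tone as set at the converter

def noise_re_tone(noise_peak_dbfs):
    """Stylus-lifted peak-hold reading, expressed in dB relative to the tone."""
    return noise_peak_dbfs - TONE_SETTING_DBFS

# Example: a stylus-lifted peak-hold reading of -52 dBFS means the system
# noise sits 50 dB below the reference tone.
print(noise_re_tone(-52.0))      # -50.0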
I realize that the numbers will differ a
bit, depending on stylus size - larger ones have greater output.
Comments on the procedure?
Any ideas on what would be an acceptable minimum
amount of system noise?
Steve Smolian