I am often asked, “How do you know that you’re making accurate mouthpieces?” This is a great question to ask any maker, since small, imperceptible deviations can have a huge impact on how a mouthpiece plays - just ask any player. Let’s talk about how we do this.
In determining how accurate a mouthpiece dimension is - a throat size, outside diameter, shank diameter, etc. - we need a measurement tool that is both ‘capable’ and ‘repeatable’ enough to perform the measurement.
We could use a tape measure, which has 1/16” (0.0625”) increments printed on it. The best resolution we could hope for is half of that: 1/32”, or 0.03125”. The difference in outside rim diameter between a 2 and a 3 size trumpet mouthpiece is about 0.005”, so a tool that can only resolve 0.031” simply cannot tell a 2 from a 3. And that’s just the outside of the rim!
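If you like to think in code, here is a minimal Python sketch of that comparison. The function name and structure are just for illustration, and the numbers are the ones quoted above, not a spec sheet:

```python
# Minimal sketch: can a tool's resolution separate two dimensions?
# The numbers are the illustrative values from the text above.

TAPE_RESOLUTION = 1 / 32  # inches: half of a 1/16" graduation

def can_distinguish(resolution: float, difference: float) -> bool:
    """A tool can only separate two sizes if their difference
    exceeds the tool's resolution."""
    return difference > resolution

# Rim outside diameters of a 2 and a 3 differ by about 0.005":
print(can_distinguish(TAPE_RESOLUTION, 0.005))  # False - the tape can't see it
```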
What about the throat of a trumpet mouthpiece? This portion of the mouthpiece is exposed to the highest sound pressure levels present in a trumpet, well in excess of 100 dB. This constriction, which every player relies on for the proper resistance and for acoustic impedance matching with the instrument and ultimately the room, is very sensitive to small changes.
A standard throat measures 0.1440” in diameter - a #27 drill size. Opening this up to a #26 throat (0.1470”) removes only 0.0015” of material from each side of the throat: a thousandth and a half of an inch. Consider that a sheet of standard paper is about 0.004” thick and a human hair is roughly 0.003”. The tape measure from our previous example has no hope of telling us whether what we’re making is really what we think we’re making.
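The arithmetic is simple, but it’s worth seeing laid out. A quick sketch, using the drill sizes above and a rough figure for a hair:

```python
# Throat sizes from the text: #27 = 0.1440", #26 = 0.1470" (number-drill sizes).
throat_27 = 0.1440
throat_26 = 0.1470

diameter_change = throat_26 - throat_27  # 0.0030" on the diameter
per_side_change = diameter_change / 2    # 0.0015" of material from each wall

hair = 0.003  # approximate human hair diameter, inches
print(f'Per side: {per_side_change:.4f}" - about {per_side_change / hair:.0%} '
      f'of a hair')  # Per side: 0.0015" - about 50% of a hair
```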
Measuring the outside diameter of a mouthpiece with a set of calipers usually gives a decent result; analog or digital calipers have a resolution of about 0.001”. Measuring the throat with calipers, though, depends heavily on the user: locating the tool in the right place, applying the right pressure, and using the right technique all determine whether you get the ‘right’ answer. Needless to say, this doesn’t result in a very precise measurement. There has to be something better. For outside diameters, a micrometer can be used, with resolution down to 0.0001”. This is a far more repeatable, accurate, and precise approach - but generally only for outside diameters.
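A common rule of thumb in metrology - often called the 10:1 rule, or the gauge-maker’s rule - says the tool should resolve to roughly one tenth of the tolerance you’re trying to hold. Here is a sketch applying it to the tools above; the resolutions are the typical figures already quoted, not calibration data for any particular instrument:

```python
# 10:1 rule of thumb: tool resolution should be <= 1/10 of the tolerance.
TOOLS = {
    "tape measure": 0.03125,  # half of a 1/16" graduation
    "calipers":     0.001,
    "micrometer":   0.0001,
}

def adequate(resolution: float, tolerance: float) -> bool:
    return resolution <= tolerance / 10

tolerance = 0.0030  # the #27 -> #26 throat difference, on diameter
for name, resolution in TOOLS.items():
    verdict = "OK" if adequate(resolution, tolerance) else "not adequate"
    print(f"{name:>12}: {verdict}")
# Only the micrometer clears the bar - and it only reaches outside diameters.
```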
A better and much simpler tool for holes is a gauge pin. But even here, not all gauge pins are the same. First, a gauge pin is a ground, cylindrical piece of material of a known size. Working through a large set of pins of varying sizes, we insert pins until we find the largest one that fits, and then we know how large the hole is to within one pin step. This requires significantly less skill and is much more accurate and repeatable.
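In code terms, sizing a hole with a pin set is just a search over a sorted list of known diameters: each pin either enters (go) or doesn’t (no-go), and the true diameter lies between the largest go and the smallest no-go. A small sketch, with a made-up hole size and a hypothetical pin set in 0.0005” steps:

```python
# Hypothetical pin set: 0.1400" to 0.1500" in 0.0005" steps.
pins = [round(0.1400 + 0.0005 * i, 4) for i in range(21)]

def size_hole(pin_fits):
    """Bracket the hole: (largest pin that enters, smallest that doesn't).
    Assumes the hole's diameter falls inside the pin set's range."""
    go = [p for p in pins if pin_fits(p)]
    no_go = [p for p in pins if not pin_fits(p)]
    return max(go), min(no_go)

# Simulate a hole whose true diameter (0.1447", made up) we don't know:
lo, hi = size_hole(lambda pin: pin < 0.1447)
print(f'Hole is between {lo:.4f}" and {hi:.4f}"')  # 0.1445" and 0.1450"
```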
This approach, though, absolutely depends on knowing the diameter of your gauge pins, right? This is where understanding tolerances makes all the difference in what claims can be made. Search eBay for “gauge pins” and you’ll find a number of different pins, of widely varying quality and tolerance.
The generally documented classes in our world, and their tolerances (the commonly published values for small pins), are the following:

Class ZZ: +/- 0.0002”
Class Z: +/- 0.0001”
Class X: +/- 0.00004”
Class XX: +/- 0.00002”
Most importantly: these tolerances only apply if you have a documented set of tools. We often find handmade tools with 0.1440” written on them, which would imply a confidence of +/- 0.0001”, or Class Z accuracy. But these are usually not certified, documented, or controlled against a known standard, so basing everyday production and measurement controls on them would be a recipe for disaster and significant batch-to-batch variability, never mind long-term drift.
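To make the classes concrete, here is a sketch that picks the loosest (and generally least expensive) class that can still back up a claimed uncertainty. The tolerances are the commonly published values listed above; the selection logic is mine, an illustration rather than any standards procedure:

```python
# Commonly published classes for small gauge pins (see the list above).
CLASSES = {"ZZ": 0.0002, "Z": 0.0001, "X": 0.00004, "XX": 0.00002}

def loosest_class(required_uncertainty: float):
    """Loosest class whose tolerance still fits the required uncertainty."""
    fits = [(tol, name) for name, tol in CLASSES.items()
            if tol <= required_uncertainty]
    return max(fits)[1] if fits else None

# To honestly claim "0.1440 +/- 0.0001", a pin must be at least Class Z:
print(loosest_class(0.0001))   # Z
# Tighter classes buy headroom for wear and temperature:
print(loosest_class(0.00005))  # X
```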
Here in our shop, we utilize Class XX, certified steel gauge pins for our trumpet mouthpieces. These well documented, controlled, traceable, and consistent measurement tools ensure we can repeatably evaluate and adjust our manufacturing processes to produce a consistent and accurate mouthpiece day in and day out - all based on a hard standard.