Accuracy vs. Precision
- Accuracy is the largest allowable error under specified operating conditions
- A voltmeter with ±2% accuracy may display anywhere between 98V and 102V when the true input is 100V
- Precision refers to an instrument’s ability to provide the same measurement repeatedly
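The accuracy spec above can be turned into a worst-case error window. A minimal sketch (the function name and percent-of-reading convention are assumptions, not from the notes):

```python
def accuracy_bounds(reading, pct_error):
    """Worst-case bounds for a reading given a percent-of-reading accuracy spec."""
    delta = reading * pct_error / 100.0
    return reading - delta, reading + delta

# A voltmeter with +/-2% accuracy: a true 100 V input may read anywhere in this window.
low, high = accuracy_bounds(100.0, 2.0)
print(low, high)  # -> 98.0 102.0
```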
Resolution vs. Range
- Resolution is the smallest increment a tool can detect and display
- E.g. a ruler with 1mm increments on the metric scale and 1/16'' increments on the imperial scale
- Range is the maximum value we can measure without overload
- Readings can reliably be estimated to half of the smallest increment
- Use the lowest range setting that still covers the signal (no overload) to obtain the most accuracy
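The range-selection rule above can be sketched as a small helper: pick the lowest range that still covers the value, so you avoid overload while keeping the best resolution. The function name and the example range list are hypothetical, not from the notes:

```python
def best_range(value, ranges):
    """Return the lowest range setting that still covers the value.

    Raises ValueError if every range would overload.
    """
    for r in sorted(ranges):
        if abs(value) <= r:
            return r
    raise ValueError("value exceeds all ranges: overload")

# Hypothetical multimeter voltage ranges (V):
ranges = [0.2, 2, 20, 200, 600]
print(best_range(1.5, ranges))   # -> 2
print(best_range(48.0, ranges))  # -> 200
```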
Significant Figures
- Report results to the nearest "half-increment" of the lowest-resolution instrument used
- Leading zeros are never significant; trailing zeros are significant only when a decimal point is present (otherwise they're just placeholders)
- Keep full precision throughout calculations; only round at the very end
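The round-at-the-end rule can be sketched with a small significant-figures helper (the function name is an assumption, not from the notes):

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    # Shift the rounding position based on the magnitude of x.
    return round(x, sig - 1 - floor(log10(abs(x))))

# Keep full precision during the calculation; round only the final result:
v = 12.34 * 0.056 / 3.1
print(round_sig(v, 2))  # -> 0.22
```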
Oscilloscope
- Back in the day: the cathode-ray oscilloscope (analog oscilloscope)
- CRT Televisions, monitors, raster scans