ANSI visual go/no-go gages
Does anyone know of a standard or best practice for tolerancing a visual gage? The gage is used in conjunction with an individual's eyesight to determine a pass or fail condition. Even assuming perfect eyesight, can you actually differentiate between 0.1 mm, 0.01 mm, 0.001 mm, and 0.0001 mm?
What can the human eye actually see accurately?
We would like to correctly tolerance these sight gages on their prints while keeping an accuracy ratio of 10:1. We don't want to over-tolerance the gage and add unnecessary cost to our process.
There has to be some type of standard or best practice for this type of measurement.
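As a rough back-of-envelope on the eye question: ~1 arcminute is the commonly quoted acuity for 20/20 vision, and converting that angle to a linear size at a viewing distance gives a ballpark for what the unaided eye can resolve. Every number in the sketch below (viewing distance, part tolerance) is an illustrative assumption, not a spec value:

```python
import math

# Back-of-envelope: ~1 arcminute is the commonly quoted acuity for 20/20 vision.
# All numbers here are illustrative assumptions, not spec values.
acuity_rad = math.radians(1.0 / 60.0)      # 1 arcminute in radians
viewing_distance_mm = 250.0                # typical near-viewing distance

# Smallest feature the unaided eye resolves at that distance
resolvable_mm = viewing_distance_mm * math.tan(acuity_rad)
print(f"unaided eye at {viewing_distance_mm:.0f} mm: ~{resolvable_mm:.3f} mm")
# -> roughly 0.07 mm, so 0.1 mm is plausible and 0.01 mm is not, unaided

# 10:1 check: a gage supporting a part tolerance T needs to resolve T / 10
part_tolerance_mm = 0.5                    # hypothetical part tolerance
needed_mm = part_tolerance_mm / 10.0
verdict = "ok unaided" if resolvable_mm <= needed_mm else "needs magnification"
print(f"10:1 on {part_tolerance_mm} mm tolerance needs {needed_mm} mm -> {verdict}")
```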
I assume you are using an optical reticle for inspection of the part. Obviously, the higher the magnification, the higher the accuracy of the measurement. You may want to invest in a calibration block to verify your measuring accuracy.
Northern Gauge, Sherwood Park, AB
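To put rough numbers on the magnification point, a first-order sketch only (real optics, lighting, and depth of field will change this) divides the eye's unaided resolvable size by the magnification:

```python
# First-order sketch: magnification divides the eye's unaided resolvable size.
# Ignores optics quality, lighting, and depth of field -- all assumptions.
unaided_mm = 0.07      # ~1 arcminute at 250 mm, from the estimate above
for mag in (1, 5, 10, 20):
    print(f"{mag:>2}x: ~{unaided_mm / mag:.4f} mm resolvable")
```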
I am using a visual gage: we hold it up to the part, and the part either passes or fails against the gage. Totally visual; no dimensions are taken.
My question is how to determine the accuracy of the eye. What error am I introducing into my measurement by using these devices? They seem adequate, but how adequate is exactly the question.
That is the job of calibration. Get some samples with known dimensions, run a multiple-trial, multiple-operator test, and see where the calls fall.
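What's being described is essentially an attribute agreement study. A minimal sketch of the bookkeeping, with invented operator names and pass/fail data:

```python
# Sketch of an attribute agreement study for a pass/fail visual gage.
# reference[i] is the true call for sample i, from a calibrated measurement.
# All data here is invented for illustration.
reference = [True, True, False, True, False, False, True, False]

# results[operator] is a list of trials; each trial is one pass over the samples
results = {
    "op_a": [[True, True, False, True, False, True,  True, False],
             [True, True, False, True, False, False, True, False]],
    "op_b": [[True, True, False, False, False, False, True, False],
             [True, True, False, True,  False, False, True, True]],
}

for op, trials in results.items():
    calls = [call for trial in trials for call in trial]
    truth = reference * len(trials)
    accuracy = sum(c == t for c, t in zip(calls, truth))
    # repeatability: samples where the operator made the same call every trial
    repeatable = sum(len(set(s)) == 1 for s in zip(*trials))
    print(f"{op}: {accuracy}/{len(calls)} calls match reference, "
          f"{repeatable}/{len(reference)} samples repeatable")
```

If most disagreements cluster on samples near the limit, the width of that band is a practical estimate of the gage-plus-eye uncertainty.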
Hypothetically, the accuracy of the gage is essentially the line width of the reticle marking, but eye relief and other factors could change that.
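One of those factors can be quantified: simple parallax geometry, assuming a gap between the reticle and the part surface. Both numbers below are illustrative assumptions:

```python
import math

# Parallax sketch: a reticle floating a small gap above the part surface
# appears shifted when the eye moves off the gage axis. Geometry only.
gap_mm = 2.0           # assumed reticle-to-part separation
off_axis_deg = 5.0     # assumed eye offset from the sight line

shift_mm = gap_mm * math.tan(math.radians(off_axis_deg))
print(f"apparent shift from parallax: ~{shift_mm:.3f} mm")
# ~0.17 mm -- easily larger than a typical reticle line width,
# so eye position can dominate the error budget
```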
ttfn