Instrument validation
Hi all,
Here's my situation: I have to measure the density of a liquid in my process. The specification is 1525 g/L +/- 2.5 g/L. The instrument I have been given reads from 1500-1600 g/L in increments of 2 g/L.
I do not feel this is the right instrument for the process, because its graduations are 40% of the total spec range. My biggest problem is that I need very solid proof of this: both the spec and the instrument were handed to me by another of our plants that "invented" the process, and they do not listen to requests for change without some type of evidence. Are there any tests I can perform to prove my claim one way or the other?
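The arithmetic behind that 40% figure, and a check against the common 10:1 metrology guideline, can be sketched like this (variable names are mine):

```python
# Compare instrument resolution against the specification tolerance band.
# Spec: 1525 g/L +/- 2.5 g/L; instrument graduations: 2 g/L (from the post above).
nominal = 1525.0      # g/L
tol = 2.5             # g/L, one-sided tolerance
resolution = 2.0      # g/L, one graduation

spec_range = 2 * tol                 # total tolerance band: 5 g/L
ratio = resolution / spec_range      # fraction of the band eaten by one graduation
print(f"resolution is {ratio:.0%} of the spec range")   # -> 40%

# Common 10:1 rule of thumb: resolution should be <= 10% of the tolerance band.
print("acceptable by 10:1 rule:", resolution <= 0.1 * spec_range)
```

By that guideline the instrument would need graduations of 0.5 g/L or finer for this spec.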
The general rule of thumb is this:
If your ruler reads down to 0.01, then you are accurate to 0.1.
Having spent years in a quality lab as a metrologist, I can say that is a pretty safe rule to live by.
You could start with ASTM E29-02, Standard Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications. It is available at:
The proper way to do this, if you wish to provide statistical proof that you have the wrong tool for the job (which you do), is called a gage repeatability and reproducibility (GR&R) study. Basically it involves having several different operators test the same samples several times. I imagine Google will find a description.
Cheers
Greg Locock
A GR&R will not tell you whether you have the right tool for the job; it will tell you whether you can repeatably get the same measurement with that tool. For example, I can measure the width of a pencil tip with a meter stick and repeat the same number every time (good GR&R), but still be using the wrong piece of equipment for the value (a caliper or micrometer would be a better choice).
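For reference, a minimal average-and-range GR&R calculation might look like the sketch below. The readings are made up (quantized to the 2 g/L graduations from this thread), and the K1/K2 constants are the commonly tabulated AIAG values for 3 trials and 3 operators:

```python
import math

# Hypothetical GR&R data: 3 operators x 3 parts x 3 trials (readings in g/L).
data = {
    "op1": [[1524, 1526, 1524], [1522, 1522, 1524], [1526, 1526, 1528]],
    "op2": [[1526, 1524, 1526], [1524, 1522, 1522], [1528, 1526, 1526]],
    "op3": [[1524, 1524, 1526], [1522, 1524, 1522], [1526, 1528, 1526]],
}
n_parts, n_trials = 3, 3

# Average within-part range of the repeated trials (repeatability raw material).
ranges = [max(t) - min(t) for trials in data.values() for t in trials]
r_bar = sum(ranges) / len(ranges)

# Spread of the operator averages (reproducibility raw material).
op_means = [sum(sum(t) for t in trials) / (n_parts * n_trials)
            for trials in data.values()]
x_diff = max(op_means) - min(op_means)

# AIAG average-and-range constants: K1 for 3 trials, K2 for 3 appraisers.
K1, K2 = 0.5908, 0.5231
EV = r_bar * K1                                     # equipment variation
AV = math.sqrt(max((x_diff * K2) ** 2 - EV ** 2 / (n_parts * n_trials), 0.0))
GRR = math.sqrt(EV ** 2 + AV ** 2)

tolerance_band = 5.0                                # g/L, from the spec above
print(f"%GRR of tolerance: {100 * 5.15 * GRR / tolerance_band:.0f}%")
```

With data like this the equipment variation (driven by the 2 g/L quantization) swamps everything, which is exactly the point above: a GR&R can flag the instrument as unusable against this tolerance even when every operator repeats the same numbers.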
Quote:
in increments of 2 g/L
That's the resolution; you still have said nothing about the uncertainty.
TTFN
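To put a number on the resolution alone: a GUM-style Type B evaluation treats a digital resolution as a rectangular distribution of half-width (resolution/2), giving a standard uncertainty of resolution/(2*sqrt(3)). A sketch with the figures from this thread:

```python
import math

resolution = 2.0                            # g/L, instrument increment
u_quant = resolution / (2 * math.sqrt(3))   # standard uncertainty from resolution alone
U = 2 * u_quant                             # expanded uncertainty, k = 2 (~95 %)

tol = 2.5                                   # g/L, one-sided spec tolerance
print(f"u from resolution alone: {u_quant:.2f} g/L")
print(f"expanded (k=2): {U:.2f} g/L vs tolerance {tol} g/L")
```

That is about 0.58 g/L standard uncertainty before drift, calibration error, or operator effects are even considered, and the expanded uncertainty already eats nearly half the one-sided tolerance.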
You will probably use the density to describe something else, for example concentration. Do an inverse calibration, insert the required confidence limits, and you will see what your instrument can really do. The precision given in the specifications is the best the instrument can do under ideal conditions; in a plant it may well be worse.
m777182
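The inverse-calibration idea above can be sketched as follows: fit instrument readings against certified reference densities, then compute the half-width of the inverse prediction interval at the spec nominal. All the data and the tabulated t-value are assumptions for illustration:

```python
import math

# Hypothetical calibration pairs: certified reference density x (g/L)
# vs. instrument reading y (g/L); readings quantized to 2 g/L graduations.
x = [1505.0, 1515.0, 1525.0, 1535.0, 1545.0, 1555.0]
y = [1504.0, 1516.0, 1524.0, 1536.0, 1544.0, 1556.0]
n = len(x)

# Ordinary least-squares fit y = a + b*x.
xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / Sxx
a = ybar - b * xbar

# Residual standard deviation about the fitted line.
s = math.sqrt(sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))

# Approximate 95 % half-width of the inverse prediction at the spec nominal.
t = 2.776                 # t-value for n-2 = 4 degrees of freedom (from tables)
x0 = 1525.0
half = (t * s / abs(b)) * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / (b**2 * Sxx))

print(f"density inferred from one reading: roughly +/- {half:.1f} g/L (95 %)")
```

With this made-up data the 95 % band on a single inferred density is wider than the +/- 2.5 g/L spec, which is the kind of concrete result you can put in front of the other plant.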
I've found the best way to prove the measurement precision needed is to show how much money is being left on the table.
For example, perform a gage R&R study with your current device plus another one that has better resolution, then compare the two. You should see cases where the current device rejected formulations that were actually good, or vice versa. You can then convert this into the dollars the business is losing, which is definitely something that will be listened to.
Good luck!
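A rough way to make that dollar argument quantitative is a misclassification simulation. Everything below (the process spread, batch counts, and cost per scrapped batch) is invented for illustration; only the spec and the 2 g/L quantization come from this thread:

```python
import random

random.seed(1)

nominal, tol = 1525.0, 2.5        # spec: 1525 +/- 2.5 g/L
resolution = 2.0                  # current instrument increment, g/L
cost_per_bad_call = 5000.0        # assumed cost of scrapping a good batch, $

def measure(true_density):
    """Round the true value to the instrument's 2 g/L graduations."""
    return resolution * round(true_density / resolution)

false_rejects = 0
n_batches = 100_000
for _ in range(n_batches):
    true = random.gauss(nominal, 1.0)            # assumed process spread, g/L
    in_spec = abs(true - nominal) <= tol
    reads_in_spec = abs(measure(true) - nominal) <= tol
    if in_spec and not reads_in_spec:
        false_rejects += 1

rate = false_rejects / n_batches
print(f"good batches failed by the gauge: {rate:.1%}")
print(f"estimated annual cost at 1000 batches/yr: ${1000 * rate * cost_per_bad_call:,.0f}")
```

The quantization pushes in-spec batches near the edge of the tolerance onto readings like 1522 or 1528, which look out of spec; multiplying that rate by batch volume and scrap cost gives the number management will actually react to.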