Attribute Agreement Analysis - Options

Stat > Quality Tools > Attribute Agreement Analysis > Options

Use to display Cohen's kappa and the disagreement table, and to specify a confidence level for the interval estimates.

Dialog box items

Calculate Cohen's kappa if appropriate: Check to calculate Cohen's kappa when it is appropriate; Minitab calculates Cohen's kappa when two appraisers rate a single trial or when each appraiser rates two trials.
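
Kappa compares the observed agreement with the agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e), where p_o is the proportion of items the two appraisers rate the same and p_e is the chance agreement implied by each appraiser's rating frequencies. The Python sketch below computes Cohen's kappa for the two-appraiser, single-trial case; the function name and the sample data are illustrative, and this is a sketch of the statistic, not Minitab's implementation.

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Chance-corrected agreement between two raters on the same items."""
        n = len(ratings_a)
        if n == 0 or n != len(ratings_b):
            raise ValueError("ratings must be two equal-length, non-empty lists")

        # Observed agreement: proportion of items both raters labeled the same.
        p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

        # Chance agreement, from each rater's marginal category frequencies.
        freq_a = Counter(ratings_a)
        freq_b = Counter(ratings_b)
        p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)

        return (p_o - p_e) / (1 - p_e)

    # Example: two appraisers rate ten parts pass/fail.
    a = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "fail", "pass", "pass"]
    b = ["pass", "fail", "fail", "pass", "fail", "pass", "fail", "pass", "pass", "pass"]
    print(round(cohens_kappa(a, b), 3))  # 0.583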

Display disagreement table: Check to display the disagreement table, which shows how often each appraiser's assessments differ from each known standard or attribute value. To enable this checkbox, you must specify a column for Known standard/attribute in the main dialog box.
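
To illustrate the tally behind this table, the Python sketch below counts, for each appraiser, how often a rating differs from the known standard, broken out by the (rated, standard) pair. The appraiser names and data are hypothetical, and the layout is simplified relative to Minitab's output.

    from collections import Counter

    def disagreement_counts(standard, ratings_by_appraiser):
        """Count (rated, standard) mismatches for each appraiser."""
        table = {}
        for appraiser, ratings in ratings_by_appraiser.items():
            table[appraiser] = Counter(
                (rated, truth)
                for rated, truth in zip(ratings, standard)
                if rated != truth
            )
        return table

    standard = ["pass", "fail", "pass", "fail", "pass"]
    ratings = {
        "Amy": ["pass", "pass", "pass", "fail", "fail"],
        "Bob": ["fail", "fail", "pass", "pass", "pass"],
    }
    for appraiser, counts in disagreement_counts(standard, ratings).items():
        for (rated, truth), n in sorted(counts.items()):
            print(f"{appraiser} rated {rated} when the standard was {truth}: {n}")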

Confidence level: Enter the confidence level for the interval estimates of the percentages of assessment agreement within appraisers and between each appraiser and the standard. The default is 95.
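
As an illustration of one such interval, the Python sketch below computes an exact (Clopper-Pearson) binomial interval for a single agreement percentage from x matched assessments out of n samples. The exact binomial method is assumed here as a common choice for these intervals, and scipy is assumed to be available; this is not a statement of Minitab's precise algorithm.

    from scipy.stats import beta

    def agreement_ci(x, n, confidence=0.95):
        """Exact (Clopper-Pearson) interval for an agreement proportion."""
        alpha = 1 - confidence
        lower = 0.0 if x == 0 else beta.ppf(alpha / 2, x, n - x + 1)
        upper = 1.0 if x == n else beta.ppf(1 - alpha / 2, x + 1, n - x)
        return lower, upper

    # Example: an appraiser matched the standard on 45 of 50 samples.
    lo, hi = agreement_ci(45, 50, confidence=0.95)
    print(f"90.0% agreement, 95% CI: ({lo:.1%}, {hi:.1%})")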
