Bivariate Analysis - Tests of Differences


Research Methods, William G. Zikmund
Health Economics Research Method 2003/2

Common Bivariate Tests

Type of Measurement   Differences between two        Differences among three or
                      independent groups             more independent groups
Interval and ratio    t-test or Z-test               One-way ANOVA
Ordinal               Mann-Whitney U-test;           Kruskal-Wallis test
                      Wilcoxon test
Nominal               Z-test (two proportions);      Chi-square test
                      Chi-square test

Differences Between Groups (nominal measurement: chi-square test)

• Contingency tables
• Cross-tabulation
• The chi-square test allows testing for significant differences between groups
• "Goodness of fit" test

Chi-Square Test

χ² = Σ (Oi − Ei)² / Ei

χ² = chi-square statistic
Oi = observed frequency in the ith cell
Ei = expected frequency in the ith cell

The expected frequency of each cell is

Eij = (Ri Cj) / n

Ri = total observed frequency in the ith row
Cj = total observed frequency in the jth column
n  = sample size

Degrees of Freedom

d.f. = (R − 1)(C − 1)

Awareness of Tire Manufacturer's Brand

          Men   Women   Total
Aware      50      10      60
Unaware    15      25      40
Total      65      35     100

Chi-Square Test: Differences Among Groups Example

Expected frequencies: E11 = (60)(65)/100 = 39, E12 = (60)(35)/100 = 21,
E21 = (40)(65)/100 = 26, E22 = (40)(35)/100 = 14. Then

χ² = (50 − 39)²/39 + (10 − 21)²/21 + (15 − 26)²/26 + (25 − 14)²/14
   = 3.102 + 5.762 + 4.654 + 8.643
   = 22.161

d.f. = (R − 1)(C − 1) = (2 − 1)(2 − 1) = 1

The critical value at the .05 level is χ² = 3.84 with 1 d.f., so the observed
difference between men and women is significant.
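As a sketch, the worked chi-square example above can be checked with SciPy (assuming SciPy is available); `chi2_contingency` reproduces the expected frequencies, the statistic, and the degrees of freedom:

```python
# Chi-square test on the tire-brand awareness cross-tabulation above.
from scipy.stats import chi2_contingency

# Rows = aware / unaware, columns = men / women
observed = [[50, 10],
            [15, 25]]

# correction=False gives the plain Pearson chi-square used on the slide
# (no Yates continuity correction).
chi2, p, dof, expected = chi2_contingency(observed, correction=False)

print(round(chi2, 3))   # about 22.161
print(dof)              # (2-1)(2-1) = 1
print(expected)         # [[39. 21.] [26. 14.]]
```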
Differences Between Groups when Comparing Means
(interval and ratio measurement: t-test or Z-test)

• Ratio- or interval-scaled dependent variables
• t-test
  – when groups are small
  – when the population standard deviation is unknown
• Z-test
  – when groups are large

Null Hypothesis About Mean Differences Between Groups

H0: µ1 = µ2, or equivalently, H0: µ1 − µ2 = 0

t-Test for Difference of Means

t = (mean 1 − mean 2) / (variability of random means)

t = (X̄1 − X̄2) / SX̄1−X̄2

X̄1 = mean for Group 1
X̄2 = mean for Group 2
SX̄1−X̄2 = the pooled, or combined, standard error of the difference between means

Pooled Estimate of the Standard Error

SX̄1−X̄2 = √{ [(n1 − 1)S1² + (n2 − 1)S2²] / (n1 + n2 − 2) × (1/n1 + 1/n2) }

S1² = the variance of Group 1
S2² = the variance of Group 2
n1 = the sample size of Group 1
n2 = the sample size of Group 2

Degrees of Freedom

d.f. = n − k

where n = n1 + n2 and k = number of groups

t-Test for Difference of Means: Example

With n1 = 21, n2 = 14, S1 = 2.1, and S2 = 2.6:

SX̄1−X̄2 = √{ [(20)(2.1)² + (13)(2.6)²] / 33 × (1/21 + 1/14) } = .797

The sample means are X̄1 = 16.5 and X̄2 = 12.2, so

t = (16.5 − 12.2) / .797 = 4.3 / .797 = 5.395
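The pooled t-test example above can be reproduced directly from its summary statistics with SciPy's `ttest_ind_from_stats` (a sketch, assuming SciPy is available):

```python
# Pooled-variance two-sample t-test from the summary statistics in the example.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=16.5, std1=2.1, nobs1=21,
                            mean2=12.2, std2=2.6, nobs2=14,
                            equal_var=True)  # pooled estimate of the variance

print(round(t, 3))   # about 5.395, on n1 + n2 - 2 = 33 d.f.
```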
Comparing Two Groups when Comparing Proportions
(nominal measurement: Z-test for two proportions)

• Percentage comparisons
• Sample proportion: p
• Population proportion: π

Differences Between Two Groups when Comparing Proportions

The hypothesis is

H0: π1 = π2

which may be restated as

H0: π1 − π2 = 0

Z-Test for Differences of Proportions

Z = [p1 − p2 − (π1 − π2)] / Sp1−p2

p1 = sample proportion of successes in Group 1
p2 = sample proportion of successes in Group 2
(π1 − π2) = hypothesized difference between population proportion 1 and
            population proportion 2 (zero under the null hypothesis)
Sp1−p2 = pooled estimate of the standard error of the difference of proportions

Sp1−p2 = √[ p̄q̄ (1/n1 + 1/n2) ]

p̄ = pooled estimate of the proportion of successes in a sample of both groups
q̄ = 1 − p̄, the pooled estimate of the proportion of failures
n1 = sample size for Group 1
n2 = sample size for Group 2

p̄ = (n1p1 + n2p2) / (n1 + n2)

Example, with p1 = .35, p2 = .40, and n1 = n2 = 100:

p̄ = [(100)(.35) + (100)(.40)] / (100 + 100) = .375

Sp1−p2 = √[ (.375)(.625)(1/100 + 1/100) ] = .068
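Rather than rely on a particular library function, a minimal sketch computes the pooled Z-test directly from the formulas above, using the slide's numbers (p1 = .35, p2 = .40, n1 = n2 = 100):

```python
# Pooled Z-test for the difference of two proportions.
from math import sqrt

def z_two_proportions(p1, n1, p2, n2):
    p_bar = (n1 * p1 + n2 * p2) / (n1 + n2)       # pooled proportion of successes
    q_bar = 1 - p_bar                             # pooled proportion of failures
    se = sqrt(p_bar * q_bar * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se                         # hypothesized difference is 0

z = z_two_proportions(.35, 100, .40, 100)
print(round(z, 3))   # about -0.730; |z| < 1.96, not significant at the .05 level
```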
Analysis of Variance
(interval or ratio measurement, three or more independent groups: one-way ANOVA)

Hypothesis when comparing three groups: µ1 = µ2 = µ3

F-Ratio

F = (variance between groups) / (variance within groups)

Sum of Squares

SStotal = SSwithin + SSbetween

SStotal = Σi Σj (Xij − X̄)²   (i = 1 … n; j = 1 … c)

Xij = individual score, i.e., the ith observation or test unit in the jth group
X̄  = grand mean
n  = number of observations or test units in a group
c  = number of groups (or columns)

SSwithin = Σi Σj (Xij − X̄j)²

X̄j = mean of the jth group; other terms as above

SSbetween = Σj nj (X̄j − X̄)²

X̄j = mean of the jth group
X̄  = grand mean
nj = number of observations or test units in the jth group

Mean Squares

MSbetween = SSbetween / (c − 1)

MSwithin = SSwithin / (cn − c)

F-Ratio

F = MSbetween / MSwithin

A Test Market Experiment on Pricing

Sales in units (thousands):

                         Regular Price   Reduced Price   Cents-Off Coupon,
                              $.99            $.89         Regular Price
Test Market A, B, or C         130             145              153
Test Market D, E, or F         118             143              129
Test Market G, H, or I          87             120               96
Test Market J, K, or L          84             131               99
Mean                    X̄1 = 104.75     X̄2 = 134.75      X̄3 = 119.25

Grand mean: X̄ = 119.58

ANOVA Summary Table

Source of variation   Sum of squares   Degrees of freedom   Mean square
Between groups        SSbetween        c − 1                SSbetween / (c − 1)
Within groups         SSwithin         cn − c               SSwithin / (cn − c)
Total                 SStotal          cn − 1

(c = number of groups, n = number of observations in a group)

F = MSbetween / MSwithin
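The test-market pricing experiment above can be checked with SciPy's `f_oneway` (a sketch, assuming SciPy is available):

```python
# One-way ANOVA on the test-market pricing data (sales in thousands of units).
from scipy.stats import f_oneway

regular = [130, 118, 87, 84]     # $.99 regular price
reduced = [145, 143, 120, 131]   # $.89 reduced price
coupon  = [153, 129, 96, 99]     # cents-off coupon, regular price

F, p = f_oneway(regular, reduced, coupon)
print(round(F, 3))   # about 1.953: MSbetween / MSwithin, with 2 and 9 d.f.
```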
Bivariate Analysis: Measures of Association

Measures of Association

• A general term that refers to a number of bivariate statistical techniques
  used to measure the strength of the relationship between two variables.

Relationships Among Variables

• Correlation analysis
• Bivariate regression analysis

Type of Measurement        Measure of Association
Interval and ratio scales  Correlation coefficient; bivariate regression
Ordinal scales             Chi-square; rank correlation
Nominal                    Chi-square; phi coefficient; contingency coefficient

Correlation Coefficient

• A statistical measure of the covariation, or association, between two variables.
• Example: Are dollar sales associated with advertising dollar expenditures?
• The correlation coefficient for two variables X and Y is rxy.
• r ranges from +1 to −1
• r = +1: a perfect positive linear relationship
• r = −1: a perfect negative linear relationship
• r = 0: no correlation

Simple Correlation Coefficient

rxy = ryx = Σ(Xi − X̄)(Yi − Ȳ) / √[ Σ(Xi − X̄)² · Σ(Yi − Ȳ)² ]

Alternative method:

rxy = ryx = σxy / (σx σy)

σx² = variance of X
σy² = variance of Y
σxy = covariance of X and Y

Correlation Patterns

[Scatterplots of Y against X:]
• No correlation: points show no pattern
• Perfect negative correlation: r = −1.0
• A high positive correlation: r = +.98

Calculation of r (pg. 629)

r = −6.3389 / √[(17.837)(5.589)] = −.635

Coefficient of Determination

r² = (explained variance) / (total variance)

Correlation Does Not Mean Causation

• High correlation does not imply causation:
  – The rooster's crow and the rising of the sun are highly correlated, but
    the rooster does not cause the sun to rise.
  – Teachers' salaries and the consumption of liquor covary because both are
    influenced by a third variable.

Correlation Matrix

• The standard form for reporting correlational results.
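A short sketch of r and r² using SciPy's `pearsonr`; the X and Y values here are made up for illustration (the pg. 629 data behind r = −.635 are not reproduced on the slide):

```python
# Simple correlation coefficient and coefficient of determination
# on a small hypothetical data set.
from scipy.stats import pearsonr

x = [1, 2, 3, 4, 5]
y = [2, 1, 4, 3, 5]

r, p = pearsonr(x, y)
print(round(r, 3))       # 0.8 for these values
print(round(r ** 2, 3))  # coefficient of determination: 0.64
```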
