Research on the Intention to Participate in Web Surveys
Abstract
By the end of 2008, the number of Internet users in China had reached 300 million, with an Internet penetration rate of 22.6%, making China's online population the largest in the world. As the Internet has developed and the number of users has soared, web surveys have come to be used more and more widely in China. Compared with traditional surveys, web surveys offer lower cost, broader reach, shorter survey cycles, and easier handling of sensitive questions.
     However, low participation rates and sample coverage bias are the two main factors undermining the validity of web survey results. Because coverage bias can be alleviated with mixed-mode designs, low participation has become the foremost problem facing web survey researchers. Existing studies at home and abroad examine the causes of low participation mainly from the perspective of the researcher's questionnaire design; studies taking the perspective of respondents' psychological factors are rare. Empirical research on web survey participation is especially scarce in China, where most work only introduces and discusses the advantages and disadvantages of web surveys or restates foreign research and arguments.
     Addressing the problem of low participation in web surveys, this thesis carries out exploratory and original research on how participation intention forms, from the perspective of external incentive factors and internal psychological factors. The main contents and conclusions are as follows:
     (1) The measurement invariance of web surveys (BBS-based and website-based) and traditional paper-and-pencil surveys was studied, and a procedure was given for testing, in AMOS, the measurement invariance of a model composed of two latent variables. The results show that, although the latent-variable measurements obtained under the three survey modes have different means and variances, the three modes exhibit strong factorial invariance. This indicates that, in empirical research involving self-assessment, a properly designed web survey is as reliable as, or more reliable than, a traditional paper-and-pencil survey, so a mixed-mode design can be used to alleviate the coverage bias of web surveys.
     (2) Meta-analysis was used to examine, more precisely than previous studies, the roles of material incentives, promised (post-completion) incentives, and contingent incentives in web survey participation. Sample-size-independent effect sizes were computed from the results of prior studies. The results show that material, promised, and contingent incentives all motivate respondents to complete a questionnaire; that their effect in web surveys is smaller than the effect incentives have on participation in traditional surveys; that contingent incentives are the incentive strategy with the most stable effect; and that psychological incentives are the least effective at raising participation.
     (3) The effects of all eight forms of incentive strategy used in web surveys on respondents' participation intention were studied. The results show that different incentive strategies raise participation intention in web surveys to different degrees. Guaranteed incentives are the most effective at raising participation intention, and there are no significant differences among the ways a guaranteed incentive is delivered (cash, physical goods, or virtual goods).
     (4) The moderating effect of respondents' psychological factors on the effectiveness of web survey incentives was studied. Since incentive-oriented research cannot explain why incentives still show different effectiveness even when incentive type and incentive value are controlled, within-subject and mixed designs were used to control for the specific incentive strategy and to study how perceived psychological factors (topic involvement and status cognition) affect participation intention. The results confirm that these psychological factors negatively moderate the effectiveness of incentives on web survey participation, and that this moderation can be strong enough to completely offset the effect of the incentive itself on participation intention.
     (5) Using three different study designs with within-subject repeated measures, the effects of survey sponsors and survey implementers with different perceived status on participation intention were studied. The results show that in web surveys the sponsor's identity matters far more than the implementer's: when the sponsor is perceived as low status, commissioning an implementer perceived as high status does not necessarily produce a significant increase in respondents' participation intention.
     (6) Based on the Theory of Reasoned Action (TRA), a psychological model of how potential respondents form web survey participation intention was constructed, and the effect of each factor on participation intention was tested empirically after controlling for demographic variables such as age, gender, education, and years of Internet use. The study further examined whether participation intention is affected by personality: by introducing personality factors, it tested whether some potential respondents are inherently averse to taking part in surveys.
With the development of the Internet and the soaring number of users in China (by the end of 2008, the number of Internet users and the Internet penetration rate had reached 300 million and 22.6%, respectively), web surveys are used more and more frequently in commercial, government, and academic research. However, theoretical research on web surveys lags far behind practice in China. The thesis focuses on the factors affecting the intention to participate in web surveys, examined from incentive and psychological perspectives. The research and its main findings are as follows:
     Firstly, the author examined measurement invariance between web surveys and a traditional survey in a Chinese (collectivistic) cultural setting. Multi-group confirmatory factor analysis was used to test the measurement invariance of personal innovativeness scales administered as a traditional paper-and-pencil survey, a BBS-based web survey, and a website-based web survey (the invariance hierarchy is sketched after the abstract). The results indicated that the three survey modes were measurement invariant in this self-evaluation domain, although, owing to situational dependence induced by the cultural surroundings, the three modes yielded different means and variances of the latent variables. The results demonstrated that a well-designed web survey has psychometric properties equal to, if not better than, a paper-and-pencil survey.
     Secondly, the thesis investigated the effectiveness of incentives in web surveys using meta-analysis (an effect-size pooling sketch follows the abstract). Three meta-analyses examined the effect of incentives on retention. The first scrutinized the impact of material incentives and revealed a significant effect, indicating that material incentives motivate people to complete a web survey. The second examined promised incentives, and the results indicated that promised incentives can urge participants to complete the survey. The third examined contingent incentives, and the results likewise demonstrated that offering contingent incentives exerts a positive impact on retention.
     Thirdly, eight types of incentives used in web surveys were examined, and the results indicated that different incentive tactics exert different effects on participation intention. Post-paid incentives, whether delivered as cash, physical goods, or virtual goods, exerted the strongest effect on participation intention, while psychological incentives exhibited the weakest effect.
     Fourthly, given that the incentive perspective cannot explain why web surveys still show distinct participation rates in practice even after the type and value of the incentive are controlled, the thesis considered whether potential respondents' psychological factors (topic involvement and status cognition) moderate the effect of incentives on participation intention (see the interaction sketch after the abstract). After controlling for the type of incentive, significant negative moderation effects appeared, and their strength could even negate the effect of the incentive itself.
     Fifthly, with three distinct experimental designs employing within-subject measurement, the thesis inspected the roles of the perceived status of the survey sponsor and of the survey investigator on participation intention. The results illustrated that the sponsor's status cognition is far more important than the investigator's: when a sponsor with low status cognition assigns the survey to an investigator with high status cognition, participation intention is not necessarily increased.
     Sixthly and last, the thesis proposed a research model from a psychological perspective based on the theory of reasoned action (the TRA core is sketched below). The roles of attitude, social norm, moral obligation, trust in the survey sponsor, topic involvement, topic sensitivity, and sponsor reputation were examined empirically. The results indicated that attitude, social norm, moral obligation, status cognition, and sponsor reputation positively influence the intention to participate in a web survey after controlling for demographic factors such as age, gender, education level, and years of Internet use. Additionally, the personality of potential respondents was investigated to test the conjecture that some individuals are naturally disinclined to participate in web surveys; the results supported the conjecture.
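
Illustrative sketch for the measurement-invariance study (point 1 / "Firstly"): multi-group confirmatory factor analysis tests a hierarchy of nested equality constraints across the three survey modes. The following is the standard generic hierarchy, not the thesis's exact AMOS specification; Λ, τ and Θ denote factor loadings, item intercepts, and residual (co)variances for mode g.

    x_{ig} = \tau_g + \Lambda_g \xi_{ig} + \varepsilon_{ig}, \quad g = 1,2,3 \ \text{(paper-and-pencil, BBS-based, website-based)}
    \text{configural: same factor pattern in every group}
    \text{metric (weak): } \Lambda_1 = \Lambda_2 = \Lambda_3
    \text{scalar (strong): metric constraints plus } \tau_1 = \tau_2 = \tau_3
    \text{strict: scalar constraints plus } \Theta_1 = \Theta_2 = \Theta_3

Each level is retained only if the more constrained model does not fit significantly worse than the previous one (e.g., by Δχ² or ΔCFI); strong factorial invariance typically corresponds to the scalar level, which still allows latent means and variances to differ across modes, consistent with the finding reported above.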
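Illustrative sketch for the incentive meta-analyses (point 2 / "Secondly"): one common sample-size-independent effect size for completion versus non-completion is the log odds ratio, pooled with inverse-variance weights. The minimal fixed-effect example below uses made-up counts and hypothetical function names; it is not the thesis's data or code.

    import math

    def log_odds_ratio(a, b, c, d):
        # a, b: completers / non-completers in the incentive condition
        # c, d: completers / non-completers in the control condition
        y = math.log((a * d) / (b * c))
        v = 1 / a + 1 / b + 1 / c + 1 / d   # large-sample variance of the log odds ratio
        return y, v

    def fixed_effect_pool(studies):
        # Inverse-variance (fixed-effect) pooling of per-study log odds ratios.
        effects = [log_odds_ratio(*s) for s in studies]
        weights = [1 / v for _, v in effects]
        pooled = sum(w * y for w, (y, _) in zip(weights, effects)) / sum(weights)
        se = math.sqrt(1 / sum(weights))
        return pooled, se

    # Hypothetical counts (a, b, c, d) from three primary studies
    print(fixed_effect_pool([(120, 80, 90, 110), (60, 40, 45, 55), (200, 300, 150, 350)]))

In practice a heterogeneity check (e.g., Cochran's Q or I²) would decide between fixed- and random-effects pooling before interpreting the combined effect.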
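Illustrative sketch for the moderation study (point 4 / "Fourthly"): negative moderation of an incentive effect by topic involvement is usually tested with an interaction term. The variable names and data below are hypothetical, and the simple between-subject regression is only a stand-in for the within-subject and mixed experimental designs actually used in the thesis.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical respondent-level data
    df = pd.DataFrame({
        "intention":   [5, 6, 3, 4, 7, 2, 6, 5, 4, 6],   # stated participation intention
        "incentive":   [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],   # incentive offered (1) or not (0)
        "involvement": [2, 5, 3, 6, 7, 1, 4, 5, 6, 2],   # topic involvement score
    })

    # "incentive * involvement" expands to both main effects plus their interaction;
    # a significant negative interaction coefficient indicates negative moderation.
    model = smf.ols("intention ~ incentive * involvement", data=df).fit()
    print(model.params)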
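Illustrative sketch for the TRA-based model (point 6 / "Sixthly"): the standard Fishbein-Ajzen core that such a model extends expresses behavioral intention as a weighted combination of attitude and subjective norm; the thesis adds further antecedents (moral obligation, trust, topic involvement, topic sensitivity, sponsor reputation) and demographic controls on top of this core.

    BI = w_1 \, A_B + w_2 \, SN
    A_B = \sum_i b_i e_i \quad \text{(behavioral beliefs } b_i \times \text{outcome evaluations } e_i)
    SN  = \sum_j n_j m_j \quad \text{(normative beliefs } n_j \times \text{motivation to comply } m_j)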
