CoI QUESTIONNAIRE: 2.0?
D. Randy Garrison
July 13, 2021

My goal in this post is to draw attention to an important study that used advanced statistical techniques to analyze the CoI questionnaire (Abbitt & Boone, 2021). While exploring statistical anomalies may not be front of mind for most practitioners, or even for those doing research with the CoI framework, this study has significant conceptual and analytical implications for the CoI questionnaire. By providing a brief overview of the study, I hope to attract interest in pursuing research into the refinement and further validation of the CoI quantitative questionnaire.

However, before addressing suggestions for improvement, it is important to highlight and emphasize the findings of this research regarding the existing strengths of the CoI questionnaire. In this regard, Abbitt and Boone (2021) state that the CoI framework has demonstrated its value in providing insight into online and blended learning environments and the CoI “instrument exhibits strong measurement properties as evaluation of item reliability and person reliability suggested strong reliability” (p. 389).

Recognizing the strengths of the CoI questionnaire does not contradict suggesting areas for possible improvement. First, Abbitt and Boone (2021) provide important insights concerning a problem with a specific cognitive presence (exploration) item: “Online discussions were valuable in helping me appreciate different perspectives.” Using Rasch measurement techniques, they found a possible misfit between this item and the cognitive presence scale. That is, the item may not be consistent with the other items on the cognitive presence subscale. This is notable, as another study found this particular cognitive presence item loaded on the social presence factor (Kovanović et al., 2018). Similarly, there was also a possible misfit for a teaching presence (direct instruction) item: “The instructor provided feedback in a timely fashion.” As a result, the authors suggested that consideration be given to revising or removing these items from the CoI questionnaire. My preference would be to test rewordings of these particular items. This would contribute to future research on the CoI framework.
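
To give a concrete sense of what "misfit" means in Rasch analysis, the following is a minimal, illustrative sketch (not the authors' analysis or data) of the infit mean-square statistic on simulated dichotomous responses. One item is deliberately replaced with noise so that its responses are unrelated to respondent ability, which is the kind of inconsistency a misfit flag points to:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical person abilities (theta) and item difficulties (b), in logits.
theta = rng.normal(0.0, 1.0, size=200)      # 200 simulated respondents
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])   # 5 simulated items

# Dichotomous Rasch model: P(agree) = exp(theta - b) / (1 + exp(theta - b)).
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
x = (rng.random(p.shape) < p).astype(float)  # simulated agree/disagree responses

# Make the last item "misfit" by replacing its responses with pure noise,
# unrelated to respondent ability.
x[:, 4] = rng.integers(0, 2, size=x.shape[0])

# Infit mean-square: information-weighted average of squared residuals per item.
# Values near 1.0 indicate good fit; values well above ~1.3 flag possible misfit.
var = p * (1.0 - p)
infit = ((x - p) ** 2).sum(axis=0) / var.sum(axis=0)
for i, mnsq in enumerate(infit):
    print(f"item {i}: infit MNSQ = {mnsq:.2f}")
```

In a run of this sketch, the four model-consistent items produce infit values near 1.0, while the noise item stands out well above that range, which is how a fit statistic can single out an individual questionnaire item for review.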

Another issue revealed by this study relates to item difficulty, that is, how easy or hard it is for respondents to agree with an item. For example, examining the ordering and spacing of the items on the teaching presence scale shows that the items easiest for respondents to agree with related to communication, while the most difficult related to instructor feedback. Similarly, the easiest items on the social presence scale related to open communication, while the most difficult related to affective expression. Finally, regarding cognitive presence, respondents found it easiest to agree with items concerning the application and relevance of course material. The implications are:

When viewed through the lens of the CoI framework, it is reasonable to examine and comment upon the item difficulty and spacing of items … for SP, we see that items relating to the dimensions (Affective Expression, Group Cohesion, and Open Communication) fall along distinct and different portions of the scale. For the CP and TP scales, however, this pattern is less distinct. These unique characteristics of item difficulty as well as spacing … provide insights into aspects of construct validity and also the online courses on which future continuous improvement efforts can focus. (Abbitt & Boone, 2021, p. 390)
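
The idea of ordering items from easiest to hardest to agree with can be sketched very simply. The item labels and agreement proportions below are illustrative placeholders, not the Abbitt and Boone (2021) data; Rasch software estimates item difficulty jointly with person ability, but the basic logit ordering is the same:

```python
import numpy as np

# Hypothetical proportions of respondents agreeing with each item.
# All names and numbers are illustrative, not data from the study.
items = {
    "TP: communication of course topics": 0.92,
    "TP: timely instructor feedback":     0.61,
    "SP: open communication":             0.88,
    "SP: affective expression":           0.58,
}

def difficulty(p_agree: float) -> float:
    """Crude difficulty index: the logit of disagreement.

    Higher values mean the item was harder to agree with.
    """
    return float(np.log((1.0 - p_agree) / p_agree))

# List items from easiest to hardest to agree with.
for name, p in sorted(items.items(), key=lambda kv: difficulty(kv[1])):
    print(f"{difficulty(p):+.2f}  {name}")
```

Plotting items along such a difficulty scale, and checking whether the items of each presence (and each dimension within a presence) occupy distinct regions, is the kind of spacing analysis the quoted passage describes.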

Without getting into the details, this study identifies areas for productive research with the potential to improve understanding of communities of inquiry and to provide opportunities to explore the development of a second generation of the CoI questionnaire. At the same time, it should be kept in mind that this research also makes clear that the CoI questionnaire has proven to be an essential tool for studying communities of inquiry and that it offers “strong measurement properties.” Further refining the conceptual integrity of the questionnaire does not discount its current strengths.

To reiterate, the purpose of this post is to draw attention to this research and to encourage work on developing a second generation of the CoI survey questionnaire. Given the complexity of this challenge, it also seems to me that it may be most effectively addressed through a collaborative approach, much like the one that led to the creation of the original CoI questionnaire (Arbaugh et al., 2008).



REFERENCES

Abbitt, J. T., & Boone, W. J. (2021). Gaining insight from survey data: An analysis of the Community of Inquiry survey using Rasch measurement techniques. Journal of Computing in Higher Education, 33, 367–397.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S., Garrison, D. R., Ice, P., Richardson, J., Shea, P., & Swan, K. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education, 11, 133–136.

Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., Čukić, I., de Vries, P., et al. (2018). Exploring communities of inquiry in massive open online courses. Computers & Education, 119, 44–58.




Anastasios Katsaris · 2 years ago
Thank you so much professor for bringing to our attention interesting new research on your inspirational work.


ABOUT THE AUTHOR

D. Randy Garrison
Professor Emeritus, University of Calgary
D. Randy Garrison is professor emeritus at the University of Calgary. Dr. Garrison has published extensively on teaching and learning in adult, higher, and distance education contexts. He has authored, co-authored, or edited fifteen books; 94 articles; 68 book chapters; 40 conference proceedings; and more than 100 academic presentations. His major books are: Garrison, D. R. (2017). E-Learning in the 21st Century: A Community of Inquiry Framework for Research and Practice (3rd Edition); Garrison, D. R. (2016). Thinking Collaboratively: Learning in a Community of Inquiry; Garrison, D. R., & Vaughan, N. (2008). Blended Learning in Higher Education: Framework, Principles and Guidelines; Garrison, D. R., & Archer, W. (2000). A Transactional Perspective on Teaching-Learning: A Framework for Adult and Higher Education.


The Community of Inquiry is a project of Athabasca University, Mount Royal University, KTH Royal Institute of Technology, and the Canadian Journal of Learning and Technology, as well as researchers and members of the CoI community.