Abstract
Technology-dependent, digitally native students are joining academia. Consequently, traditional pedagogical techniques for achieving desired learning outcomes are not universally sufficient. Digital clickers were introduced in the early 2000s for engaging students and maintaining their attention span during lectures. However, some studies are critical of their usage, as they consume valuable class time, which can result in further compromises in achieving learning outcomes. This study aimed to investigate the application of an online Audience Response System (ARS), also known as a “clicker”, in different academic settings. To achieve this, the researchers conducted an empirical study that identified the effectiveness of using an online ARS in multiple academic use cases. The use cases consisted of audiences with varying academic backgrounds and levels of academic achievement. All the presented topics were related to research in cybersecurity. The study identified that clickers can be a useful tool for audience engagement in a complex topic like cybersecurity.
1 Introduction
The digitally native, industrious, collaborative and entrepreneurial learners of Generation Z are entering academia. Institutions that fail to make the connection between student experience and digital engagement [16] risk not achieving desired learning outcomes, as traditional pedagogical techniques are insufficient for such technology-craving learners. Digital Audience Response Systems (ARS) were introduced in the early 2000s for engaging learners in class, collecting feedback from audiences during and/or after a presentation, and maintaining student attention span [15]. Some studies are critical of the usage of ARS, as they can be time consuming and often compromise the already limited time available to achieve desired learning outcomes [14]. Alongside ARS methods, there is ongoing research exploring more passive means of activity tracking [18] through the application of image data for the identification of human actions in different educational scenarios. Earlier studies [see, for example, 7, 10, 13, 20] have looked at the application of online ARS, also referred to as clickers, and their effectiveness in providing desired learning outcomes for complex academic subjects. In the presented study we aimed to investigate the application of clickers as a means to support an effective learning experience for students and individuals studying and working in the complex discipline of cybersecurity. To achieve this, we conducted an empirical study which identified the effectiveness of using an online ARS in multiple academic use cases, consisting of audiences with varying academic levels and different academic backgrounds. The analysis of the data indicated that clickers can increase audience engagement with difficult and complex cybersecurity topics.
An identified limiting factor for their usage is that clickers require some additional training and awareness for both presenter and participants if they are to be implemented successfully. The remainder of the paper is organized as follows: first we provide a brief background on ARS, followed by the research methodology. Next we highlight relevant work related to this research and present the experimental results and findings from three use cases. Lastly, we conclude the article while also addressing a number of limitations.
2 Methodology
The main goal of this work was to initiate change in the pedagogical approach taken in certain academic lectures by applying some alternative active learning techniques [2]. Such change can be evaluated through an evidence-based intervention implemented in the form of an empirical study. This study aimed to identify the effectiveness of using an ARS in the form of an online tool in multiple academic use cases, consisting of multiple audience levels (bachelor, master, and PhD students, researchers, professors, and other academic staff members) with different academic backgrounds (mainly cybersecurity in two use cases and interdisciplinary in the third). The topics presented all relate to research in cybersecurity. This empirical study can provide an evidence base for the effectiveness of using an ARS to ensure learning outcomes are achieved, motivate and facilitate discussion, scaffold critical thinking, and provide instant feedback collection.
The methodology followed in this study is depicted in Fig. 1. The study fits into the phases of an evidence-based intervention as suggested by the U.S. Department of Education [1]. For planning purposes, teaching materials were prepared before each lecture and aligned with the ARS. This meant that the ARS could provide interactive material relevant to the progress of the presentation and related to the teaching material. During and after the lecture, the usage of the ARS was communicated to the audience to stimulate their interest and increase their engagement with the content. At the appropriate time, the presenter described and explained the interactive component by clarifying to the audience the task expected of them (answer a question, rate an option, provide feedback, etc.). The audience were given sufficient time to respond. When the outcome of the ARS data was relevant and in direct support of the teaching material, the presenter would reveal the results during the lecture and discuss them with the audience. Finally, on completion of the lecture, results were collected from the ARS, analyzed, and the findings documented. A detailed description of the conducted experiments is given in Sect. 4. The data for this empirical study was collected in a fully anonymized manner with participant consent, and in accordance with GDPR obligations [22].
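As an illustration of what fully anonymized collection means in practice, the following Python sketch stores responses with no identifying fields at all. The `Response` and `SlideResults` names are our own illustration, not Mentimeter's actual data model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Response:
    """A single audience response, stored without any identifying fields."""
    slide_id: int
    interaction_type: str   # e.g. "multiple_choice", "word_cloud", "slider", "open_ended"
    value: str

@dataclass
class SlideResults:
    """Aggregated, anonymous results for one interactive slide."""
    slide_id: int
    responses: List[Response] = field(default_factory=list)

    def add(self, interaction_type: str, value: str) -> None:
        # No user ID, session token, or timestamp is retained,
        # so individual participants cannot be re-identified.
        self.responses.append(Response(self.slide_id, interaction_type, value))

    def response_count(self) -> int:
        return len(self.responses)

slide = SlideResults(slide_id=2)
slide.add("multiple_choice", "B")
slide.add("multiple_choice", "A")
print(slide.response_count())  # 2
```

Because only the slide identifier, interaction type, and answer value are retained, the stored records can be analyzed and published without GDPR re-identification concerns.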
3 Related Work
Cybersecurity education requires understanding of complex concepts ranging from technical issues to human errors and weaknesses. This requires that educational institutions provide a pedagogic platform that develops a combination of technical skills, domain-specific knowledge and social intelligence among learners [8]. Current research attempting to make cybersecurity education more effective by applying different methods and techniques at both the human [24] and machine level [23] can help meet this challenge. While new and more domain-focused arenas for cybersecurity education and training are being developed [25], there remains a requirement for developing modes of education that enable students to be actively engaged in cybersecurity educational programs. One way this can be achieved is through operationalized hands-on cybersecurity exercises [26]. Such exercises are useful for providing dynamic, practical cybersecurity skill-set training [23]. For theoretical aspects and imparting the necessary background in cybersecurity concepts, traditional instructivist classroom settings remain the norm. In other academic fields where this is also the case, clickers have been used as a method to increase student engagement and motivation.
According to earlier studies this has yielded positive results for student engagement [7, 10, 11, 13, 20]. Alternative strategies for instruction that follow less formal methods and focus more on student engagement through choice and control [27] have been successful in encouraging students to take greater responsibility for their own learning [17]. This self-regulatory outcome, referring to the degree to which students are metacognitively, motivationally and behaviorally active in their learning [27], is key to academic motivation and learning, as it leads to improved perceived self-efficacy; a learner's belief about their capabilities to learn and perform [3, 4, 5, 17]. The remainder of this section gives a brief review of five studies where clickers have been applied.
Micheletto [13] in 2011 conducted a study using an ARS to elicit feedback from students participating in a complex business ethics class. The researcher used pre- and post-exercise surveys and asked the participants to self-evaluate how ethical they considered themselves to be in the context of their business conduct. At the start of the exercise, 92% considered themselves to be ethical. During the exercise, the ARS was used to ask participants questions about their perceived ethical conduct in certain situations, which they answered anonymously. Post-exercise, only 72% of participants considered themselves ethical in their conduct. The researcher attributed this statistical shift to the ARS and its effectiveness in engaging students to reflect on the topic, and therefore in supporting the intended learning outcomes.
Kazley et al. [10] in 2012 conducted a study on clickers in a course taught at the Medical University of South Carolina. The researchers sought to identify the effectiveness of clickers in their courses by surveying a focus group that used clickers in the health administration program. The researchers stated that clickers can add cost to delivering a course, in terms of buying the digital audience response system and training instructors in how to use it. However, in terms of student engagement and participation in classroom activities, they found clickers to be a very effective pedagogic tool.
Wang et al. [20] in 2016 conducted a study in which they used surveys to identify the effectiveness of digital game-based clickers for educational purposes. The researchers focused on four research questions involving the measurement of students' motivation, enjoyment, engagement, and learning outcomes. They compared a digital game-based quiz system with a paper-based quiz system and identified statistical tendencies favoring the digital game-based system. The researchers noted that they used only one digital game-based quiz system, Kahoot, for comparison with the paper-based system, and planned to compare Kahoot with other digital clickers in future studies.
Byrne [7] in 2017 performed a study in which he used clickers to elicit feedback from students on complex topics, with the intent of identifying knowledge gaps. The problem he faced was the complexity of the topic he was teaching, as it involved concepts from physics, chemistry and biology in a single course, making it hard for students to grasp the presented knowledge and concepts. Clickers provided him the opportunity to get instant feedback from students about the topics and identify knowledge gaps. He further listed some advantages and disadvantages of electronic clickers, involving usability, cost and technical problems.
Khan et al. [11] in 2019 conducted a small case study on third year engineering students whose field of study was ‘Instrumentation and Control Engineering’ at the University of Plymouth. The study was conducted to identify the positive impact of clickers on learning outcomes for students involved in complex fields of study. The researchers modified the lecture content and embedded clickers within the lecture slides in order to test the knowledge and cognitive skills of students. The researchers stated that clickers can have a positive effect on participation, on student learning, engagement and grade attainment.
4 Experiments and Results
The main goal of this study was to improve learning outcomes in several types of academic lectures. To achieve this, an evidence-based intervention method as described in Sect. 2 was conducted across three separate but structurally similar experiments. Each experiment had a different topic; however, all were related to cybersecurity, and each experiment aimed to evaluate the effectiveness of the online ARS in a different use case. A summary of the conducted experiments is presented in Table 1. An online ARS called Mentimeter [12] was used in all three experiments. The online nature of Mentimeter enables both instructors/speakers and their students/audience to interact through useful active learning options such as voting, rating, and open-ended discussion. The instructor prepares the interactive slides in a web application, and the audience interacts with the slides using their mobile devices once they have been provided with a six-digit code to sign in anonymously [12]. The use cases and the conducted experiments, with results, are discussed below.
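The join-and-vote flow described above can be sketched in a few lines of Python. The class and method names here are hypothetical, chosen only to illustrate the anonymous code-based interaction pattern, and do not reflect Mentimeter's real API:

```python
import random
import string

class ARSSession:
    """Minimal sketch of an online ARS session: the presenter creates a
    session, the audience joins anonymously with a short numeric code."""

    def __init__(self) -> None:
        # Six-digit join code, similar to the sign-in codes used by tools
        # like Mentimeter.
        self.code = "".join(random.choices(string.digits, k=6))
        self.votes: dict = {}

    def vote(self, code: str, option: str) -> bool:
        # Joining requires only the code; no account or name is collected,
        # so all votes remain anonymous.
        if code != self.code:
            return False
        self.votes[option] = self.votes.get(option, 0) + 1
        return True

session = ARSSession()
session.vote(session.code, "Yes")
session.vote(session.code, "Yes")
session.vote(session.code, "No")
print(session.votes)  # {'Yes': 2, 'No': 1}
```

The presenter can reveal the aggregated `votes` dictionary at the appropriate point in the lecture, matching the reveal-and-discuss step described in the methodology.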
4.1 Use Case 1
The first use case was evaluated by an experiment conducted as part of a PhD-level university didactics course. The main objective was to conduct a pedagogical intervention aimed at initiating change in the way a weekly department seminar was being conducted. The intended outcome was that a more active approach might inspire new ideas and encourage students to pursue further higher education. This may go some way towards ensuring such seminars are held in higher esteem and subsequently become a more valued learning opportunity, as well as contributing to the continuous learning environment for students and researchers.
Therefore, some active learning techniques as suggested by Felder and Brent [9] were applied. The two main techniques of interest to this paper relate to actively involving the audience in engaging activities, and to achieving variety in those activities. In addition to applying active learning techniques, the experiment required a feedback method to capture the perceived change in the way these department seminars were conducted. The feedback quality, implementation and evaluation were influenced by the work of Boud [6].
Mentimeter was used to implement the aforementioned active learning techniques and feedback collection. Table 2 shows the number of answers, the type of interaction and their objectives. Each slide of the interactive component was carefully constructed and placed to measure a different aspect of the experiment. The reported results from the audience supported the conducted experiment with sufficient data. For instance, the second interactive slide was placed after the related material was presented; it aimed to measure audience comprehension of the discussed material and was constructed in a way intended to ignite further discussion.
In the last slide of the seminar, feedback was collected to measure the success of the experiment. The interactive component scored a rating of 3.6 out of 5, the lowest rating among the measured parameters, as shown in Fig. 2. This leaves room for improvement regarding the interactivity aspect, such as reducing the effort to respond by reducing the displayed text or adding visual aids. However, it is critical to note that a key tenet of improving performance is not to oversimplify and reduce. Instead the aim should be to preserve the complexity [21] and have students learn in the zone of proximal development [19]. It is possible that students need to become more attuned to this mixed mode of teaching. Although tech savvy, they are likely unfamiliar with ARS-enhanced teaching techniques. The researchers communicated this ARS approach and the findings to future seminar speakers, two of whom were interested in implementing the interactive component. Unfortunately, neither succeeded: one reported that it required more time than was affordable, while the other implemented it but faced technical issues due to an incorrect setup.
4.2 Use Case 2
The second experiment was conducted in conjunction with a presentation at a workshop that was part of a PhD-level university course in research ethics. The aim was to explore the applicability of ARS in motivating critical thinking among attendees, facilitating ethical discussion, and evaluating predefined research questions. A secondary goal was to conduct an experiment similar to the work of Micheletto [13] to measure the shift in the ethical standing of the participants. Unfortunately, due to time limits and the absence of a control group, this aspect was not sufficiently measured.
As depicted in Table 3, the participation level was the highest among the use cases, and the quality of the responses was sufficient to evaluate the targeted objectives. The ethical discussion aided by the ARS successfully reflected a level of critical thinking by the attendees. This was a positive outcome when evaluating the pre-established research questions identified as part of the PhD course. The fact that responses to the ethical questions posed in slides 1–4 were almost evenly split reveals a good outcome for questions designed to divide opinion: it shows that participants were engaged with the question, and that it encouraged their critical thinking. A disadvantage in this use case was the time-consuming interaction; waiting for all the participants to sufficiently comprehend the displayed questions and to participate digitally added delay to the presentation. This can be solved with careful planning of how the clicker application is introduced and embedded into the presentation. There are practical timekeeping issues that presenters need to account for when attempting to connect digital engagement with student participation demands.
4.3 Use Case 3
The third use case was evaluated through an experiment conducted in conjunction with a presentation at a PhD student seminar. The main objectives were to motivate discussion among PhD students and engage them in the presented research topic. Considering the audience's strong background in cybersecurity, the low number of participants and responses was compensated for by the increased value of their input (Table 4).
After analyzing the responses, clear confusion related to the discussed material became apparent. We consider the ability to capture such confusion through the interactive component beneficial for improving the presented material. Additionally, although the open-ended interactions generated the fewest responses, the responses received included very beneficial input to the presented research. Thus, the open-ended interaction was found to be useful in settings where there is a relatively high level of skill and domain knowledge among the audience. Lastly, the feedback rating for the interactive component again scored the lowest rating: 3.3 out of 5.
4.4 Interaction Statistics
A cross-result statistical analysis was conducted to extract further findings related to the effectiveness of the interactive slide types and the audience engagement level over the teaching period. Table 5 shows that Multiple Choice (M/C) is the most responsive interaction, with Word Cloud second. This is mostly because M/C is simple and does not require typing, while Word Cloud requires slightly more effort. The Slider interaction, where audiences rate their answer using a slide bar, was not as attractive as the previous two, although sliders were valuable in providing instant feedback. Lastly, as expected, Open Ended interactions attracted the lowest interest, due to the effort required to form sentences and type them. As noted previously, this approach can provide valuable input when the audience has greater levels of certain cognitive competencies, such as metacognitive skill and adaptive thinking proficiency.
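The relative attractiveness of the interaction types can be expressed as a simple response rate (responses divided by audience size). The counts below are illustrative placeholders only, not the actual figures from Table 5; the sketch merely shows the computation and the ordering the study reports:

```python
# Hypothetical counts for one lecture; the real figures are in Table 5.
audience_size = 30
responses = {
    "multiple_choice": 27,
    "word_cloud": 22,
    "slider": 17,
    "open_ended": 9,
}

# Response rate per interaction type, sorted from most to least responsive.
rates = {kind: count / audience_size for kind, count in responses.items()}
for kind, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{kind:16s} {rate:.0%}")
```

With these illustrative numbers the ordering matches the study's finding: multiple choice first, word cloud second, slider third, open ended last.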
An interesting finding related to the response rate over the learning period. The interactive slides were strategically distributed throughout the presentation to capture audience engagement, and Fig. 3 shows how the responses tend to decrease over time. This suggests that, for interactions to best support learning, the interactive slides are most beneficial when placed at the beginning of the presentation or lecture, and again immediately after a break.
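The declining trend can be quantified with a least-squares slope over the slide positions. The response counts below are hypothetical, chosen only to illustrate the computation, not data from Fig. 3:

```python
# Hypothetical response counts for interactive slides placed in order
# through a lecture (illustrative only; see Fig. 3 for the real trend).
counts = [28, 25, 21, 18, 14]

# Least-squares slope of responses vs. slide position, computed directly.
n = len(counts)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(counts) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, counts)) \
        / sum((x - x_mean) ** 2 for x in xs)
print(f"slope = {slope:.1f} responses per slide")  # slope = -3.5 responses per slide
```

A negative slope confirms the engagement decline; a presenter could run such a check after each lecture to decide whether the interactive slides need repositioning.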
5 Conclusion
The present study aimed to investigate the application of an online Audience Response System (ARS), also known as a clicker, in different academic settings. This empirical study aimed to identify the effectiveness of using an online ARS in multiple academic use cases. Each use case consisted of a range of audience academic levels with different academic backgrounds, and all topics were related to research in the field of cybersecurity. The study included an evidence-based intervention implemented over three different but structurally similar experiments: a department seminar, a course workshop, and a PhD student seminar. This study has provided evidence, supported by documented experiments, showing that many advantages can be realized by using an ARS in different academic settings. ARS give presenters alternative techniques for increasing audience engagement, motivating discussion, stimulating critical thinking, evaluating research direction, and collecting instant feedback. ARS showed great promise in reducing the challenges of delivering lectures in large rooms with many participants, and proved effective as a utility to facilitate ethical discussion with large audiences. Drawbacks of using ARS include the additional workload of preparation and of aligning specific skill-sets to learning objectives. There are also time issues to overcome relating to explanation and audience response time, but without using technology as a means to oversimplify, and thus compromise, the subject's complexity. Using ARS in small groups did not generate increased utility. Regarding the type of interactions, short interaction exercises (such as multiple choice and word suggestions) showed the highest acceptance levels among the participants, while open-ended exercises showed the lowest. Lastly, a decrease in the response rate over the course of the presentations was noticed across all three experiments.
This suggests the need for greater attention when planning where to place the ARS segments within a presentation.
References
U.S. Department of Education: Non-regulatory guidance: using evidence to strengthen education investments (September 2016). https://www2.ed.gov/policy/elsec/leg/essa/guidanceuseseinvestment.pdf
Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., Norman, M.K.: How Learning Works: Seven Research-Based Principles for Smart Teaching. John Wiley & Sons, Hoboken (2010)
Bandura, A.: Social foundations of thought and action. Englewood Cliffs, NJ 1986, 23–28 (1986)
Bandura, A.: Perceived self-efficacy in cognitive development and functioning. Educ. Psychol. 28(2), 117–148 (1993)
Bandura, A., Freeman, W., Lightsey, R.: Self-efficacy: the exercise of control (1999)
Boud, D.: Feedback: ensuring that it leads to enhanced learning. Clin. Teach. 12(1), 3–7 (2015)
Byrne, C., et al.: Clickers: a learning technology project case study. J. Acad. Dev. Educ. 7, 94–106 (2017)
Dawson, J., Thomson, R.: The future cybersecurity workforce: going beyond technical skills for successful cyber performance. Frontiers Psychol. 9, 744 (2018)
Felder, R.M., Brent, R.: Learning by doing. Chem. Eng. Educ. 37(4), 282–309 (2003)
Kazley, A.S., Annan-Coultas, D.: Use of an audience response system to teach problem-solving in health administration. J. Health Adm. Educ. 29(3), 219–227 (2012)
Khan, A., Schoenborn, P., Sharma, S.: The use of clickers in instrumentation and control engineering education: a case study. Eur. J. Eng. Educ. 44(1–2), 271–282 (2019)
Little, C.: Technological review: mentimeter smartphone student response system. Compass, J. Learn. Teach. 9(13), 64–66 (2016)
Micheletto, M.J.: Using audience response systems to encourage student engagement and reflection on ethical orientation and behavior. Contemp. Issues Educ. Res. 4(10), 9–18 (2011)
Morgan, R.K.: Exploring the pedagogical effectiveness of clickers. Insight J. Sch. Teach. 3, 31–36 (2008)
Paschal, C.B.: Formative assessment in physiology teaching using a wireless classroom communication system. Adv. Physiol. Educ. 26(4), 299–308 (2002)
Povah, C., Vaukins, S.: Generation Z is starting university – but is higher education ready? The Guardian (2017)
Schunk, D.H.: Self-regulation of self-efficacy and attributions in academic settings (1994)
Ullah, M., Ullah, H., Alseadonn, I.M.: Human action recognition in videos using stable features (2017)
Vygotsky, L.: Interaction between learning and development. Readings Dev. Child. 23(3), 34–41 (1978)
Wang, A.I., Zhu, M., Sætre, R.: The effect of digitizing and gamifying quizzing in classrooms. Academic Conferences and Publishing International (2016)
Ward, P., Gore, J., Hutton, R., Conway, G.E., Hoffman, R.R.: Adaptive skill as the conditio sine qua non of expertise. J. Appl. Res. Mem. Cogn. 7(1), 35–50 (2018)
Warström, J.: GDPR and personal data protection in Mentimeter. https://help.mentimeter.com/en/articles/1937769-gdpr-and-personal-data-protection-in-mentimeter
Yamin, M.M., Katt, B.: Inefficiencies in cyber-security exercises life-cycle: a position paper. In: AAAI Fall Symposium: ALEC, pp. 41–43 (2018)
Yamin, M.M., Katt, B.: Cyber security skill set analysis for common curricula development. In: Proceedings of the 14th International Conference on Availability, Reliability and Security, pp. 1–8 (2019)
Yamin, M.M., Katt, B., Gkioulos, V.: Cyber ranges and security testbeds: scenarios, functions, tools and architecture. Comput. Secur. 88, 101636 (2019)
Yamin, M.M., Katt, B., Torseth, E., Gkioulos, V., Kowalski, S.J.: Make it and break it: an IoT smart home testbed case study. In: Proceedings of the 2nd International Symposium on Computer Science and Intelligent Control, p. 26. ACM (2018)
Zimmerman, B.J.: Dimensions of academic self-regulation: a conceptual framework for education. Self-Regul. Learn. Perform.: Issues Educ. Appl. 1, 21–33 (1994)
© 2020 Springer Nature Switzerland AG
Amro, A., Yamin, M.M., Knox, B.J. (2020). Applications of an Online Audience Response System in Different Academic Settings: An Empirical Study. In: Stephanidis, C., et al. HCI International 2020 – Late Breaking Papers: Cognition, Learning and Games. HCII 2020. Lecture Notes in Computer Science(), vol 12425. Springer, Cham. https://doi.org/10.1007/978-3-030-60128-7_13