Introduction
Human information behavior has developed along with the digital environment. People have created new practices to easily access desired information. Human information behavior is defined as "the study of how people need, seek, give, transfer, and use information in different contexts, including the workplace and everyday living" (Furi and Balog, 2016: 62). To gain insight into the quality of the teaching-learning process, higher education institutions have been using the Student Evaluation of Teaching (SET) (Ching, 2018: 64). SET is a standard that facilitates student-centered educational design, as teachers collect data to improve the learning process and course structure (Abbas et al., 2022: 12050).
The use of computers and the Internet has facilitated the development of information search skills in current and future generations of students (Israel, 2015). Gentina (2020) noted that members of Generation Z have grown up in a world with unprecedented access to technology, which is why they are often called digital natives. This generation consists of independent learners with good multitasking skills and innovative, sustainability-oriented thinking; they prefer to work independently, actively solve problems, and spend a considerable amount of time online (Dobrowolski, Drozdowski and Panait, 2022; Gentina, 2020).
Dobrowolski, Drozdowski and Panait (2022) pointed out a notable fact about Generation Z: they can discover information, but they lack the skills to evaluate it critically. This tendency also exists among students who collect information without processing it and appropriate the work of others, ignoring or not understanding that using information without citing sources constitutes plagiarism (Erguvan, 2022). Access to information is facilitated by readily available search engines; however, the information must still be processed. To overcome these problems and process information properly, our students require training through engaging methods such as debates, discussion and reflection, and team-based learning.
The use of information is a complex activity that requires searching for information and using it for argumentation; it is essential to identify and follow certain steps to reach Information Literacy (Haider and Sundin, 2022). Some researchers regard both access to information and the aggregation of information as processes that bring a sense of well-being, since they provide students with the competencies and training necessary for the workplace (Hicks and Lloyd, 2022). Over time, numerous studies have focused on the stages of the information process. That is to say, the skills of using information are built over time, and such competencies are essential for the 21st century (Laar et al., 2017).
The education received at various educational stages (primary school, secondary school and university) provides each of us with the basis for professional qualification and specialization and, at the same time, develops the skills necessary to access information (Hicks and Lloyd, 2022). The library and its employees provide students with the appropriate environment to access information sources, and librarians should act as facilitators and educators for young people in general, not just for those attending the library.
There are some open questions in the literature regarding Information Literacy courses. The American Library Association's Information Literacy Competency Standards for Higher Education were last updated in 2000. Bruce (2003) suggests that IL in higher education has seven faces:
Informed Citizenship: The ability to participate in a democratic society by critically evaluating information and making informed decisions.
Scholarship: The skills needed for academic success, including the ability to find, evaluate, and use information effectively in an educational context.
Research: The capacity to identify an information need, locate, evaluate, and use information for various research purposes.
Learning: Information Literacy as an integral part of the learning process, emphasizing the ability to learn independently.
Workplace Readiness: The skills required to succeed in the workplace, including the ability to find and use information effectively.
Everyday Life: Information Literacy skills applied to daily life situations, such as making informed consumer decisions or maintaining personal health.
Knowledge of Knowledge: The understanding of how information is organized, produced, and disseminated across various sources.
Given these points, the novelty of our study lies in the data collected by a survey administered to Bachelor and Master students from Transilvania University of Brasov regarding access to and use of information obtained both through documentation and in an Information Literacy course. The research hypothesis was that the learning outcomes, that is, the subjects taught in the Information Culture course, would meet the expectations of Generation Z, today's students with multiple digital skills.
Generation Z is often considered to include individuals born roughly between the mid-1990s and the early 2010s. However, the exact boundaries of this generation vary slightly depending on the source and perspective; a common definition is that Generation Z encompasses individuals born around 1997 or 1998 up until around 2012 or 2015. The reasons why a new questionnaire was created to assess Information Literacy courses are outlined below:
Need for customization: An IL course may have specific learning objectives and outcomes that are not covered by existing questionnaires. In such cases, creating a new questionnaire that is customized to the course objectives and learning outcomes can be beneficial.
Updated content and teaching methods: If an IL course is updated with new content or teaching methods, it may be necessary to develop a new questionnaire that is relevant and reflects these changes.
Specific target audience: If the target audience of the IL course is different from that of the existing questionnaires, a new questionnaire may need to be generated to ensure that it is relevant and appropriate for the target audience.
Cultural and linguistic differences: If the IL course is offered in a different cultural or linguistic context, it may be necessary to formulate a new questionnaire that is sensitive to these differences and can accurately assess the learning outcomes of the course.
In summary, the creation of a new questionnaire to assess an IL course was necessary due to the need for customization, the updated content and teaching methods, the specificity of the target audience, and its cultural and linguistic differences.
Review of the literature using scientometric methods
Scientometric methods help us review the literature quickly, since they use algorithms to automatically select the most relevant articles; they allowed us to build an image of a particular field within a specific database. Using the search query 'Information Literacy' AND 'Courses', we obtained 237 results in the Web of Science database. We then downloaded these records as a tab-delimited file containing the full record and cited references, and analysed the data with VOSviewer, a software tool for scientometric analysis. In total, 327 terms used in the document descriptions occurred at least ten times; the software calculated relevance scores and 196 terms were retained. The resulting term map is shown in Figure 1, and four clusters were identified (a simplified illustration of the underlying term-frequency step is sketched after the lists below):
Cluster in blue: Information Literacy; Pedagogy; Employment Courses
Cluster in red: Digital Technology; Secondary School; Competencies
Cluster in green: Society; False News; News; Press Releases
Cluster in yellow: Literature; Review; Criteria; Definitions
As a result, four research directions were generated in the field:
Courses in Information Literacy
Information Literacy AND Digital Technology
Information Literacy AND Society
Information Literacy; Literature
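As a rough illustration of the term-frequency step underlying the map, the following Python sketch reads a Web of Science tab-delimited export and keeps the words that occur at least ten times in titles and abstracts. This is only a simplified stand-in for VOSviewer's own term extraction and relevance scoring; the file name (savedrecs.txt) and the word-level definition of a 'term' are assumptions made for the sketch.

```python
# Simplified term-frequency thresholding over a WoS tab-delimited export.
# VOSviewer performs far more elaborate term extraction and relevance scoring;
# the file name and the word-level notion of "term" are illustrative assumptions.
import csv
import re
from collections import Counter

MIN_OCCURRENCES = 10
counts = Counter()

with open("savedrecs.txt", encoding="utf-8-sig") as f:            # tab-delimited full record export
    for record in csv.DictReader(f, delimiter="\t"):
        text = f"{record.get('TI', '')} {record.get('AB', '')}"    # TI = title, AB = abstract
        counts.update(re.findall(r"[a-z]{3,}", text.lower()))

frequent_terms = {term: n for term, n in counts.items() if n >= MIN_OCCURRENCES}
print(f"{len(frequent_terms)} terms occur at least {MIN_OCCURRENCES} times")
```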
The most cited documents are highlighted on the dissemination map in Figure 2:
The journals that published these articles are presented in Figure 3:
The most compelling articles on this topic delve into digital education's role in equipping students with the skills to discern and counteract false information. Opinion polls served as the foundational data for hypothesis analysis and the interpretation of results. The pervasive issue of fake news infiltrates our information streams through both traditional mass media and social networks. Educating young people about how news is created and manipulated, together with digital education, is the way to help them learn to detect misleading news (Jones-Jang, Mortensen and Liu, 2021).
Students acquire digital competences from an early age through IT and computer classes at school; such skills are further improved by statistics and Information Literacy training. These young students accumulate information and develop cognitive skills, especially if they have computers at home; these material circumstances not only help them complete homework, but also enable them to identify information. Research monitoring computer skills among young students in Australia, the Czech Republic, Chile, Croatia, Denmark, Korea, Lithuania, Germany, Poland, the Russian Federation, and Slovakia has shown that a country's economic development, as well as a family environment that provides technology, play a large role in the development of digital skills and high levels of schooling (Hatlevik et al., 2018). However, teenagers may encounter risks in the online environment, such as conversations with strangers or phishing, which is why access to certain websites should be closely monitored by parents. This should be a joint child-parent effort, because the dangers of online media and online addiction must be prevented.
Another current line of research is the assessment of the scientific activity of university professors by field, in terms of digital competences and the analysis of sources. This includes analysing major databases such as Web of Science, Scopus, and ERIC, or data obtained from an educational centre. Subject analysis and the identification of the research methods used by university professors were part of the assessment criteria. Training teachers in digital skills is essential for teaching, research, and adaptation in order to keep pace with students, and this is already a reality today (Spante et al., 2018).
In research in the social sciences and humanities, algorithms are regarded as sets of information, basic notions specific to a field and developed over time. These algorithms can lead to the development of social theories with which researchers can analyse and describe certain phenomena in society and the changes within it. This path requires the responsibility of teachers and researchers and, above all, a critical approach to information (Hicks and Lloyd, 2022).
Cluster 1 (blue), related to Information Literacy, Pedagogy, and Employment Courses, identified in the scientometric research, indicates that many articles have been published with reference to IL courses. This division helped identify the seminal works and the evolution of ideas within the Information Literacy field and map its intellectual structure. Although a direct correlation with the immediate objectives may not be immediately apparent, the insights gained from this scientometric analysis significantly inform our choice of methodology, reinforce the theoretical foundation, and, ultimately, enhance the overall robustness of our study.
Materials and method
Approximately 10 years ago, the Information Culture course was introduced at Transilvania University, not without difficulties, since quality standards and field-specific disciplines had to be respected for the different specializations: Information-Documentation, Communication, Research Methodology, Academic Writing, and Information Culture courses. For the purposes of this research, the specializations that had an IL course in their curricula were identified and their subject files were analysed. These seminars were taught to second-semester freshmen in Mechatronics Engineering and in Optometry and Medical Engineering in the Faculty of Product Design and Environment, as well as to Communication and Public Relations and Digital Media majors in the Faculty of Sociology and Communication. For each specialization, the IL course consisted of 7 lectures and 7 seminars (laboratory sessions), each lasting 2 hours.
The qualitative research study focused on the perceptions and experiences of students regarding Information Culture courses. It was based on data collected through an online questionnaire that captured the students' perspectives on the relevance of the existing IL course. Across the specialization groups, 260 students were enrolled in the course, 65% of whom were female. The response rate was 77% (201 respondents out of 260). The inclusion criteria were attendance of the course and the student's consent to participate in the research. The hypothesis was that the learning outcomes, that is, the themes taught in the Information Literacy course, could meet the expectations of Generation Z, today's students with multiple digital skills. The questionnaire had 12 questions (Q1-Q12), which are described in detail below. The link <https://www.surveymonkey.com/r/N9JXD3C> and the QR code of the online survey, generated in SurveyMonkey, were sent through the e-learning platform Moodle.
Research Validation
We tested whether the proportion of females in the sample differed significantly from the expected proportion; a significance level of 0.05 was used and a Z-test was performed. The calculated Z of 1.141 did not exceed the critical value of 1.645 for a one-tailed test, or 1.960 for a two-tailed test; therefore, we do not reject the null hypothesis. The p-value was 0.254, which is greater than the significance level of 0.05. The results suggest that there is no significant difference between the sample proportion and the hypothesized proportion at the 0.05 significance level.
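For readers who wish to reproduce this kind of check, the following is a minimal sketch of a one-proportion Z-test in Python; the counts and the hypothesized proportion used below are illustrative placeholders, not the exact figures behind the reported Z of 1.141.

```python
# Minimal sketch of a two-tailed one-proportion Z-test.
# The numbers below are illustrative, not the study's exact input values.
from math import sqrt
from scipy.stats import norm

def one_proportion_ztest(successes: int, n: int, p0: float):
    """Test H0: population proportion == p0 (two-sided alternative)."""
    p_hat = successes / n
    se = sqrt(p0 * (1 - p0) / n)           # standard error under H0
    z = (p_hat - p0) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))   # two-tailed p-value
    return z, p_value

# Example: 149 female respondents out of 201, tested against a hypothesized 70%
z, p = one_proportion_ztest(successes=149, n=201, p0=0.70)
print(f"Z = {z:.3f}, p = {p:.3f}")         # reject H0 only if p < 0.05
```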
The results were analysed using a quantitative method, while the Emotion Analysis method was used for the open-ended questions. Emotion Analysis uses social media comments to identify people's attitudes and opinions on various aspects of life, such as health monitoring, political events, election trends, product reviews, movies, and medications, as well as the analysis of social networks and terrorist activities. Similarly, it can be useful to improve educational policies by monitoring student performance (Bibi et al., 2022).
Applying Emotion Analysis to Twitter data can be challenging, since more than 1 billion new tweets are posted every three days (Sunitha et al., 2022). In recent years, Twitter Emotion Analysis has gained much attention among researchers due to the progress recorded in machine learning and deep learning techniques (Bibi et al., 2022). Thus, it is essential to use intelligent machine learning techniques to perform Twitter Emotion Analysis.
For this article, we used the Excel add-in Azure Machine Learning. After installing the application, we selected the 'Twitter Text Analysis' option. The preliminary data processing involved the elimination of special characters, punctuation marks, numbers, repeated words, non-English characters, and unnecessary spaces (Bibi et al., 2022). The next step involved renaming the column to be analysed as 'Tweet Text', according to the analysis matrix. The data to be analysed, as well as the columns where the output would be generated, were selected beforehand. The last step was to select the 'Predict' command.
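As an illustration of the clean-up steps listed above, here is a minimal Python sketch; the study itself performed these steps within the Excel add-in, so this code only mirrors the described procedure and makes assumptions about the details (for example, it removes duplicate words globally and treats non-ASCII characters as non-English).

```python
# Illustrative text clean-up: drops special characters, punctuation, numbers,
# repeated words, non-ASCII characters, and extra whitespace.
# This mirrors the preprocessing described above, not the add-in's actual code.
import re

def preprocess(text: str) -> str:
    text = text.lower()
    text = text.encode("ascii", errors="ignore").decode()    # drop non-English characters
    text = re.sub(r"[^a-z\s]", " ", text)                     # drop punctuation, digits, symbols
    words, seen = [], set()
    for word in text.split():                                 # drop repeated words
        if word not in seen:
            seen.add(word)
            words.append(word)
    return " ".join(words)                                    # collapses unnecessary spaces

print(preprocess("The course was GREAT!!! Great content, 10/10 :)"))
# -> "the course was great content"
```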
Result analysis
Question 1: We found that a large part of the students considered that this course had a high degree of novelty (Figure 4). Thus, 43% believed that the information in the course was 'New' (Response 2) and 22.50% considered that it was 'Very new' (Response 1). 7.50% of the total respondents were more prepared in this field and considered that this course 'Did not offer any novelty' (Response 5); however, 11.50% considered that 'A low degree of novelty' was discovered in this course (Response 4). A percentage of 15.50% believed that 'This course brought neither new nor known information' (Response 3).
Question 2: The averages in Figure 5 reveal that all the topics and information addressed during the seminar received a novelty rating of at least 3, where 1 was the lowest rating and 5 the highest. Analysis of the results revealed that the highest novelty rating was recorded for Carbon Emissions, with an average of 3.91. The next topics with a high degree of novelty were Boolean Operators, with an average of 3.86, and Creative Commons Licences, with an average of 3.82.
Other topics regarded as new were Automatic Bibliography Management Software, Databases, and Search Strategies.
Question 3: As can be seen in Figure 6, the topics of highest interest to students were Search Strategies and Automatic Bibliography Management Software with an average response of 3.93. These were followed by Copyright Protection (3.91), Creative Commons Licences (3.89), and Databases (3.82). The topics of least interest were Document Types, which had an average response of 3.36, Carbon Emissions, and Boolean operators.
Question 4: Most students evaluated the teaching module as sufficiently explanatory, interactive, and based on modern teaching methods. Based on the average responses, we found that three characteristics had an average score above 4, while the characterisation of the course as boring received an average score of 2.3 (Figure 7).
Question 5: Figure 8 shows that of a total of 201 students, 43.28% considered that this course influenced their information behaviour to 'A great extent' (Response 4), while 19.90% considered this course to be 'Very useful' in the development of their information behavior (Response 5). 28.86% of respondents concluded that their behaviour remained 'Neutral' in relation to the information received during the course (Response 3). Only 6.97% of the students believed that their behaviour was influenced 'To a small extent' by their attendance to this course (Response 2). Only 1% believed that their informational behaviour was 'Not influenced at all' by their attendance to this course (Response 1).
Question 6: Using Emotion Analysis in Azure Machine Learning, we found that the general feeling of the students in relation to the course was positive (Figure 9). The Emotion Analysis method, applied through Azure Machine Learning to assess student sentiment, used the answers to the open-ended questions; a sentiment score was assigned to each text and the aggregate analysis provided an overall score. The average score of the positive responses was 0.66 and the average score of the neutral responses was 0.55. Values closer to +1 indicate a positive feeling, values at or around 0.50 a neutral feeling, and values closer to -1 a negative feeling.
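To make the interpretation of these scores concrete, here is a minimal sketch of how individual scores could be mapped to sentiment labels and averaged; the cut-off values (0.60 and 0.50) are our own illustrative assumptions, not thresholds documented for Azure Machine Learning.

```python
# Illustrative mapping of sentiment scores to labels and per-label averages.
# The 0.60 / 0.50 cut-offs are assumptions for this sketch only.
from statistics import mean

def label(score: float) -> str:
    if score >= 0.60:
        return "positive"
    if score >= 0.50:
        return "neutral"
    return "negative"

scores = [0.81, 0.66, 0.55, 0.72, 0.44]        # example scores, one per open answer
labels = [label(s) for s in scores]
positive_avg = mean(s for s, l in zip(scores, labels) if l == "positive")
print(labels)                                  # ['positive', 'positive', 'neutral', 'positive', 'negative']
print(round(positive_avg, 2))                  # 0.73
```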
We acknowledge the absence of a 'negative' option: this was not an oversight, but a result of the collected data. For transparency, we note that respondents may not have chosen negative formulations, or that the number of negative responses may have been minimal.
Analysing the students' answers, we can see that 25% of the respondents noticed that the teachers' teaching methods were well suited to this course. Furthermore, 22.09% believed that they discovered new and useful information during this course. 16.86% of the 173 students who replied to this question appreciated everything related to this course. 11.05% were pleasantly impressed by the interactive nature of the course and the teacher-student relationship, while 12.79% appreciated the explanations provided by the teacher during the seminar. Other positive aspects of the course mentioned by the students were the applicability of what was learned, the pleasant atmosphere, freedom, professionalism, and the structuring of the information (Figure 10).
Question 7: Through the Emotion Analysis method, one can easily observe that the students' general feeling toward the course was positive. Of the 150 recorded answers, 132 were positive, with an average score of 0.66. Values closer to +1 show a positive feeling, those at or around 0.50 represent a neutral feeling, and those closer to -1 reflect a negative feeling. Therefore, 15 students, with an average response of 0.55, manifested neutral feelings towards the course, whereas 3, with an average response of 0.44, were not satisfied (Figure 11).
Analysing the data provided by the respondents, we can state that most of them believed everything went well during the course. However, some believed that the seminar presented too much information, which made them lose focus or get bored. They also mentioned the inconvenient hour at which the seminar was held, as well as the poor internet connection.
Question 8: Following the question 'Which topics would you like to discuss more?', the students chose Creative Commons Licences to the greatest extent, followed by Copyright and the opinion that all topics were already sufficiently addressed. Other topics that the students would have liked to be discussed further were Databases, Internet Information Assessment, Citation Methods and Styles, Boolean Operators, Search Strategies, Plagiarism, Future Specialisation, Google Services, and Creative Writing.
Question 9: As Figure 12 shows, 20.24% of respondents believed they would use all the skills acquired during the course: 13.69% believed they would use Search Strategies and 11.90% believed that Database search skills would be helpful. 10.71% of the students participating in the study said that Citation Methods and Styles would be useful to them, while 7.14% believed that Creative Commons Licences would be of use. Other abilities acquired during the course and deemed practical for the future were Generating Bibliographies, Internet Information Assessment, Copyright, and Boolean Operators.
Question 10: 95.94% of the students believe it is necessary to possess information skills and only 4.06% believe that such skills are necessary to a smaller extent (Figure 13).
Question 11: As Figure 14 shows, 74% of 201 respondents were females and 26% were males.
Question 12: As far as the respondents' field of specialisation is concerned, 37% studied Communication and Public Relations (CRP), 26.50% studied Digital Media (MD), 16.50% studied Medical Engineering (IMED), 10.50% studied Mechatronics (MT), and 9.50% Optometry (OPTO).
Discussion
Teaching evaluation helps the educational system develop and urges teachers to be more innovative in their teaching methods and course content structuring (Abbas et al., 2022). Students' evaluations can help teachers improve their work and determine whether quality education was provided (Abbas et al., 2022). SET is a relatively new term and is used as a synonym for student evaluation of teaching performance, student course evaluation, or student course satisfaction. Gregory Ching (2018) claims that students can use SET as a tool to take revenge on teachers and that, in recent studies, effective teachers have been evaluated poorly by students.
In the conducted study, we were able to analyse students' satisfaction with the IL course using Emotion Analysis in Azure Machine Learning. We found that the general feeling of the students in relation to the course was positive. In terms of negative aspects, the students underlined that too much information was presented during the seminar, which made them lose focus or get bored. They also mentioned the inconvenient time of day at which the seminar was held and the poor internet connection. We thus confirmed the research hypothesis. Below we present some viable solutions and discussion prompts to take into account; these constitute the six frames of the ACRL Framework (2015):
Authority Is Constructed and Contextual
Information Creation as a Process
Information Has Value
Research as Inquiry
Scholarship as Conversation
Searching as Strategic Exploration
These frames provide a conceptual foundation for Information Literacy instruction and guide educators in designing learning experiences that foster the development of critical information skills.
The systematic review by Haider and Sundin (2022) highlights the importance of developing critical digital literacy in higher education. One possible strategy to incorporate this into IL courses is to focus on the critical evaluation of sources, particularly in the context of digital media. Students can be taught to look for bias, assess the reliability of sources, and identify potential threats to privacy and security.
Additionally, the study by Khanagar et al. (2021) suggests that algorithms present a challenge to students' Information Literacy skills. One solution to this challenge is to provide students with more guidance on how search engines and algorithms work. By understanding the mechanisms behind personalized search results, students can learn to be more critical of the sources they encounter and avoid being influenced by biased or misleading content.
Equally important is Bruce's (2003) framework for the seven faces of Information Literacy, since it provides a useful starting point for understanding the different dimensions of Information Literacy in higher education. However, it may be helpful to consider other dimensions as well, such as the role of technology in Information Literacy, as well as its intersections with other literacies (Media Literacy, Digital Literacy, among others).
Conclusions
The conclusions drawn from this study concerning Generation Z and the design of new IL courses underscore the importance of tailoring the educational approach to the distinctive characteristics, preferences, and information processing capabilities of this generation. The key conclusions are: diverse learning preferences, emphasis on novelty, technological competence, a desire for in-depth exploration, and discovery-oriented learning.
Generation Z exhibits diverse learning preferences, with an inclination towards interactive and modern teaching methods; therefore, IL courses should incorporate dynamic and engaging instructional strategies to capture and sustain their interest. In this light, Generation Z values novelty and relevance: novel topics, such as Creative Commons and Copyright, garnered increased interest. Course designers should prioritize content that resonates with contemporary issues and aligns with the dynamic information landscape of this generation.
Students demonstrated an elevated level of technological competence. IL courses should leverage and build upon their existing digital skills, emphasizing topics such as Automatic Bibliography Management Software and Search Strategies, which align with their technological proficiency. Generation Z students value courses that facilitate the discovery of new and useful information. Future Information Literacy courses should incorporate elements that encourage self-directed exploration, allowing students to actively uncover what aligns with their immediate needs and interests.
In conclusion, the design of IL courses for Generation Z should prioritize interactivity, relevance, and in-depth exploration of topics that resonate with their digital native attributes. The acknowledgment of their inclination towards discovery-oriented learning and leveraging of their technological competencies will contribute to the effectiveness and positive impact of these courses. Researchers may find this article valuable for its insights into adapting Information Literacy courses for Generation Z. It explores their preferences for modern teaching methods, technological skills, and the impact of IL courses on their learning processes. This critical evaluation may interest those studying trends in education and students' needs. A limitation of the study is that the results are specific to the Information Literacy course offered at the Transilvania University of Brasov. The study's outcome provides a set of questions and indicators that can be replicated or applied in similar evaluation processes.