Authors: Munna Giuseppe Davide, Cangialosi Salvatore, Belnome Christian, Provenzano Carla, Severoni Mario
1. Introduction
The Learning Unit Plan (LUP) project aims to educate students on critical topics such as Artificial Intelligence (AI), AI-generated content, and fake news. The primary goal is to develop critical awareness regarding the impact of these technologies and phenomena on contemporary society. Through interactive activities, discussions, and informational materials, the LUP addresses the following themes:
- Artificial Intelligence: Understanding the basics of AI, its applications, and ethical implications.
- AI-Generated Content: Analyzing how AI can create visual and textual content, including deepfakes, and the risks associated with such technologies.
- Fake News: Exploring the phenomenon of fake news, its societal consequences, and strategies to recognize and combat it.
Activity Objective:
The primary goal of the awareness activity carried out as part of the LUP is to equip students with the necessary tools to navigate a world increasingly influenced by digital information and advanced technologies. It is essential for students to understand how to identify fake content and critically evaluate the information they encounter daily.
This initiative is significant not only for students but also for the entire school community because it:
- Promotes Critical Awareness: Helps students develop a critical mindset towards the media and the information they receive.
- Provides Practical Tools: Equips students with practical strategies to recognize fake news and use AI responsibly.
- Fosters an Informed School Environment: Contributes to creating a school culture that values information verification and digital responsibility.
2. Project Phases
Research and Survey:
- Objective: Gather data on students’ understanding of Artificial Intelligence (AI) and fake news through a survey.
- Target Audience: The survey was distributed among a sample of students from various classes at ITET G. Caruso in Alcamo, covering ages 14 to 19.
- Main Questions:
- Understanding AI: “What do you know about Artificial Intelligence?”
- Perception of Fake News: “How dangerous do you consider fake news?”
- Recognition Skills: “Can you distinguish between real news and fake news? What tools do you use to verify news?”
Data Analysis:
The analysis of the collected data revealed significant insights (a minimal tally sketch follows the list below):
- Awareness of AI: 70% of students demonstrated a good understanding of AI, while 30% showed limited knowledge.
- Perception of Fake News: 80% of respondents considered fake news a concerning phenomenon, particularly highlighting the risks posed by deepfakes.
- Recognition Skills: Only 40% of students reported being able to distinguish between true and false news, with most relying on superficial methods for verification.
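The percentages above are simple categorical breakdowns of the survey answers. Purely as an illustration, the sketch below shows how such a breakdown could be computed from a list of exported responses; the sample answers and the `percentage_breakdown` helper are hypothetical and not part of the project’s actual analysis.

```python
# Minimal sketch (not the project's actual analysis), assuming answers were
# exported as one label per respondent, e.g. from a spreadsheet column.
from collections import Counter

def percentage_breakdown(answers):
    """Return {label: percentage} for a list of categorical answers."""
    counts = Counter(answers)
    total = len(answers)
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

# Hypothetical responses to "Can you distinguish real news from fake news?"
recognition_answers = ["yes", "no", "no", "yes", "not sure",
                       "no", "yes", "no", "no", "not sure"]
print(percentage_breakdown(recognition_answers))
# -> {'yes': 30.0, 'no': 50.0, 'not sure': 20.0}
```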
Poster Creation:
- Development Process: The process began with brainstorming to identify key messages. Clear and impactful messages were chosen, such as “Check your sources!” and “AI can deceive you!”
- Visual Elements: To grab attention, the poster used vibrant colors and engaging graphics, including visual examples of fake news and deepfakes. QR codes linking to online resources for further information were also incorporated (a short generation sketch follows this list).
- Poster Objective: The main goal was to educate and spark students’ curiosity about the importance of information verification and awareness of the risks associated with AI usage.
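The QR codes mentioned in the list above can be produced with a few lines of code. The sketch below is a hypothetical example using the open-source `qrcode` Python package (installed with Pillow support); the URL and filename are placeholders, not the links actually used on the poster.

```python
# Hypothetical sketch: generating a poster QR code with the open-source
# "qrcode" package (pip install qrcode[pil]). The URL is a placeholder,
# not one of the links actually printed on the poster.
import qrcode

resource_url = "https://example.org/media-literacy-resources"  # placeholder
img = qrcode.make(resource_url)   # returns an image object encoding the URL
img.save("poster_qr.png")         # ready to drop into the poster layout
```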
3. Poster Content and Key Messages
Essential Themes:
The poster developed by the group addresses several key topics related to fake news and Artificial Intelligence (AI):
- Dangers of Fake News:
The poster highlights how fake news can influence public opinion, manipulate information, and create confusion. It underscores the risk of misinformation, especially in political and social contexts.
- Tips for Recognizing Fake Content:
Strategies for identifying fake news include:
○ Checking the source: Verify if the news comes from a reliable organization.
○ Cross-referencing with other sources.
- Responsible Use of AI:
The poster emphasizes understanding how AI can generate content, including deepfakes and manipulated images. It encourages reflection on the impact such technologies can have on the perception of reality.
Practical Suggestions:
- Verify Sources: Always check the credibility of the source before believing or sharing a news item.
- Use Fact-Checking Tools: Leverage websites like Snopes or FactCheck.org to verify information (a small lookup sketch follows this list).
- Critically Analyze Content: Pay attention to suspicious images and videos; use reverse image search tools to verify their origin.
- Continuous Education: Attend workshops or online courses on media literacy to improve critical media skills.
- Responsible Sharing: Before sharing information on social media, ensure it’s verified and consider its potential negative impact.
- Promote Awareness: Foster a culture of information verification among peers by encouraging discussions about the risks of fake news and irresponsible AI use.
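To make the fact-checking suggestion above concrete, the sketch below shows how published fact-checks about a claim could be looked up programmatically. It assumes access to Google’s Fact Check Tools API (the `claims:search` endpoint), which aggregates reviews from outlets such as FactCheck.org; this is not a tool used in the project, and the API key, query, and exact response fields are assumptions to be checked against the official documentation.

```python
# Hedged sketch: querying Google's Fact Check Tools API (claims:search) for
# published fact-checks about a claim. Requires an API key from Google Cloud;
# the query below is a placeholder and response fields may differ slightly
# from the official documentation.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: obtain from the Google Cloud Console
ENDPOINT = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def search_fact_checks(query, language="en"):
    """Return (claim text, publisher, rating, url) tuples for a search query."""
    params = {"query": query, "languageCode": language, "key": API_KEY}
    response = requests.get(ENDPOINT, params=params, timeout=10)
    response.raise_for_status()
    results = []
    for claim in response.json().get("claims", []):
        for review in claim.get("claimReview", []):
            results.append((
                claim.get("text"),
                review.get("publisher", {}).get("name"),
                review.get("textualRating"),
                review.get("url"),
            ))
    return results

# Example with a placeholder claim
for text, publisher, rating, url in search_fact_checks("AI-generated video of a politician"):
    print(f"{publisher}: {rating} - {text}\n  {url}")
```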
4. Learning Experience
Lessons Learned:
- Importance of Critical Thinking: Students understood the crucial need to analyze and critically evaluate information before accepting it as true. They learned that critical thinking is essential in navigating a world saturated with digital content.
- Source Verification: Students realized that verifying sources is indispensable in combating misinformation. They learned to recognize reliable sources and use fact-checking tools to confirm news accuracy.
Challenges Encountered:
- Difficulty in Identifying Fake Content: Students faced challenges in recognizing fake content, particularly when it appeared plausible or was professionally presented. The complexity of deepfakes and manipulated images made this distinction especially difficult.
- Use of Digital Tools: Some group members found it challenging to use digital tools for information verification, such as reverse image search techniques. Mastering these tools required time and practice.
Collaboration and Teamwork:
- Collaboration Skills: Students learned to work together, sharing ideas and resources. This process fostered a collaborative environment where everyone contributed their skills and knowledge.
- Effective Communication: The need to discuss and compare opinions improved the group’s communication skills. Students learned to articulate their ideas clearly and respect others’ viewpoints, facilitating constructive dialogue.
5. Conclusions and Final Reflections
Project Impact:
The project had a significant impact on both the students involved and the entire school community. Students developed greater awareness of fake news and Artificial Intelligence, learning the importance of source verification and critical thinking. This experience provided them with practical tools to recognize and combat misinformation, contributing to a more informed and responsible school environment. Additionally, students are now more motivated to share this knowledge with their peers, promoting a culture of verification and digital responsibility.
Future Perspectives:
- Interactive Workshops: Organize awareness events where experts discuss emerging technologies like AI and their societal impact. These workshops could include practical activities to teach students how to recognize manipulated content.
- Teacher Training Courses: Develop training programs for teachers on integrating media literacy into their curricula, providing them with tools and resources to address misinformation.
- Partnerships with External Experts: Establish collaborations with organizations specializing in media literacy and misinformation to create educational projects involving students in practical activities and critical discussions.
- Awareness Campaigns: Launch campaigns within the school encouraging students to become truth ambassadors, using social media to promote positive messages about information verification and responsible AI use.