GRIAL resources
Permanent URI for this community: https://repositorio.grial.eu/handle/123456789/1
Search Results (3 items)
Item: Data-Driven Learning Analytics and Artificial Intelligence in Higher Education: A Systematic Review (IEEE, 2025-09-29)
González-Pérez, Laura Icela; García-Peñalvo, Francisco José; Argüelles-Cruz, Amadeo José
The responsible integration of Artificial Intelligence in Education (AIED) offers a strategic opportunity to align learning environments with the principles of Society 5.0, fostering human–technology synergy in support of quality education and social well-being. This study presents a systematic review of 36 peer-reviewed articles (2021–2025) focused on educational applications that employ learning analytics (LA) through data-driven approaches and integrate machine learning (ML) models as part of their empirical evidence. Each study was analyzed according to three key dimensions: the context of AIED application, the data-driven approach adopted, and the ML model implemented. The findings reveal a persistent disconnect between the AI models employed and the available educational data, which in many cases are limited to access logs or manually recorded grades that fail to capture deeper cognitive processes. This limitation constrains both the effective training of ML models and their pedagogical utility for delivering meaningful interventions such as personalized learning pathways, real-time feedback, early detection of learning difficulties, and monitoring and visualization tools. Another significant finding is the absence of psychopedagogical frameworks integrated with quality standards and data governance, which are essential for advancing prescriptive and ethical approaches aligned with learning goals.
It is therefore recommended that educational leaders foster AIED applications grounded in data governance and ethics frameworks, ensuring valid and reliable metrics that can drive a more equitable and inclusive education.

Item: Evaluating Learning Outcomes Through Curriculum Analytics: Actionable Insights for Curriculum Decision-making: A Design-based research approach to assess learning outcomes in higher education (ACM, 2025-03-05)
Hernández-Campos, Mónica; Hilliger, Isabel; García-Peñalvo, Francisco José
Learning analytics (LA) emerged with the promise of improving student learning outcomes (LOs); however, its effectiveness in informing actionable insights remains a challenge. Curriculum analytics (CA), a subfield of LA, seeks to address this by using data to inform curriculum development. This study explores using CA to evaluate LOs through direct standardized measures at the subject level, examining how this process informs curriculum decision-making. Conducted at an engineering-focused higher education institution, the research involved 32 administrators and 153 faculty members, serving 9,906 students across nine programs. We conducted three phases of the Integrative Learning Design Framework and present key results. Findings confirm the importance of stakeholder involvement throughout the different design phases, highlighting the need for ongoing training and support. Among the actionable insights that emerged from the LO assessments, we identified faculty reflections on the need to incorporate active learning strategies, improve course planning, and provide education-specific training for faculty development.
Although the study does not demonstrate whether these insights lead to improvements in LOs, this paper contributes to the CA field by offering a practical approach to evaluating LOs and translating these assessments into actionable improvements within a real-world educational context.

Item: Filling the gap in K-12 data literacy competence assessment: Design and initial validation of a questionnaire (Elsevier, 2025-03-01)
Donate-Beby, Belén; García-Peñalvo, Francisco José; Amo-Filva, Daniel; Aguayo-Mauri, Sofía
As the integration of AI-powered technologies in education grows, data literacy has become a key competence for educators, shaping their ability to navigate and utilize vast amounts of educational data. This study details the development of the Educators Data Literacy Self-Assessment (EDLSA), a questionnaire designed to assess perceived data literacy among K-12 teachers, focusing on its behavioural implications. The development of the EDLSA was rigorous: it involved an exhaustive qualitative review of frameworks, and a pilot test with a Spanish sample of teachers (n = 66) provided relevant insights for refining the instrument. Finally, we conducted a comprehensive statistical analysis, which confirmed the instrument's robust reliability (α = 0.976) in measuring teachers' data management competence. The results of the factor analysis of the primary and secondary education pilot samples led to the readjustment of the proposed dimensions into three categories: comprehensive educational analytics, educational problem-solving through data, and promoting students' meta-learning through data and its ethical implications. Stemming from the assessed competencies, the EDLSA instrument provides a comprehensive understanding of human-computer interaction over data in educational settings. Overall, this self-assessment tool presents robust psychometric properties and a framework definition that paves the way for further development among teachers and researchers.
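The reliability figure reported for the EDLSA (α = 0.976) is a Cronbach's alpha coefficient. As an illustration only, and not the authors' analysis code, the following minimal Python sketch shows how alpha is computed from item-level scores; the function name and sample data are hypothetical:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a set of questionnaire items.

    items: list of k lists, each holding one item's scores
    across the same n respondents.
    """
    k = len(items)
    # Sum of the per-item score variances.
    item_var_sum = sum(pvariance(col) for col in items)
    # Variance of each respondent's total score across all items.
    totals = [sum(scores) for scores in zip(*items)]
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical toy data: 3 items rated by 4 respondents.
scores = [[1, 2, 3, 4], [2, 2, 4, 4], [1, 3, 3, 5]]
print(round(cronbach_alpha(scores), 3))
```

In practice a validation study of this kind would compute alpha with a statistics package rather than by hand, but the formula is the same: k/(k-1) times one minus the ratio of summed item variances to the variance of the total score.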