Publications

Permanent URI for this collection: https://repositorio.grial.eu/handle/123456789/34

Now showing 1 - 2 of 2
  • Item
    Evaluating Learning Outcomes Through Curriculum Analytics: Actionable Insights for Curriculum Decision-making: A Design-based research approach to assess learning outcomes in higher education
    (ACM, 2025-03-05) Hernández-Campos, Mónica; Hilliger, Isabel; García-Peñalvo, Francisco José
    Learning analytics (LA) emerged with the promise of improving student learning outcomes (LOs); however, its effectiveness in informing actionable insights remains a challenge. Curriculum analytics (CA), a subfield of LA, seeks to address this by using data to inform curriculum development. This study explores using CA to evaluate LOs through direct standardized measures at the subject level, examining how this process informs curriculum decision-making. Conducted at an engineering-focused higher education institution, the research involved 32 administrators and 153 faculty members, serving 9,906 students across nine programs. We conducted three phases of the Integrative Learning Design Framework and present key results. Findings confirm the importance of stakeholder involvement throughout different design phases, highlighting the need for ongoing training and support. Among the actionable insights that emerged from LO assessments, we identified faculty reflections on the need to incorporate active learning strategies, improve course planning, and provide education-specific training for faculty development. Although the study does not demonstrate whether these insights lead to improvements in LOs, this paper contributes to the CA field by offering a practical approach to evaluating LOs and translating these assessments into actionable improvements within a real-world educational context.
  • Item
    Filling the gap in K-12 data literacy competence assessment: Design and initial validation of a questionnaire
    (Elsevier, 2025-03-01) Donate-Beby, Belén; García-Peñalvo, Francisco José; Amo-Filva, Daniel; Aguayo-Mauri, Sofía
    As the integration of AI-powered technologies in education grows, data literacy has become a key competence for educators, shaping their ability to navigate and utilize vast amounts of educational data. This study details the development of the Educators Data Literacy Self-Assessment (EDLSA), a questionnaire designed to assess perceived data literacy among K-12 teachers, focusing on its behavioural implications. The development of the EDLSA was rigorous: it involved an exhaustive qualitative review of frameworks, and a pilot test with a sample of Spanish teachers (n = 66) provided relevant insights for refining the instrument. Finally, we conducted a comprehensive statistical analysis, which confirmed the instrument's robust reliability (α = 0.976) in measuring teachers' data management competence. The results of the factorial analysis on the primary and secondary education pilot samples led to the readjustment of the proposed dimensions into three categories: comprehensive educational analytics, educational problem-solving through data, and promoting students' meta-learning through data and its ethical implications. Stemming from the assessed competencies, the EDLSA instrument provides a comprehensive understanding of human-computer interaction over data in educational settings. Overall, this self-assessment tool presents robust psychometric properties and a framework definition that paves the way for further development among teachers and researchers.
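The reliability coefficient reported above is Cronbach's α, which relates the sum of item variances to the variance of respondents' total scores. A minimal sketch of that computation, using hypothetical Likert-style responses rather than the EDLSA data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 respondents, 4 items on a 1-5 scale
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
alpha = cronbach_alpha(responses)
```

Values approaching 1, such as the 0.976 reported for the EDLSA, indicate high internal consistency across items.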