Academics should understand the distinction between Generative AI (GenAI) and Predictive AI (PredAI) because it fundamentally affects research methodology, ethical considerations, and practical applications in their fields. Generative AI raises questions about authorship, originality, and academic integrity, while Predictive AI raises concerns about bias, fairness, and decision-making transparency.

Predictive AI is especially useful in teaching because it can analyze individual students' learning patterns, performance data, and engagement metrics to identify knowledge gaps and predict which concepts each student will struggle with most. This enables academics to proactively customize lesson sequences even in a large class, adjust pacing, and provide targeted interventions before students fall behind, creating truly personalized learning pathways.

#artificialintelligence #GenerativeAI #PredictiveAI #AIfluency
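As a hedged illustration of the predictive side, here is a minimal sketch on synthetic data (the features and labels are invented for the example; a real early-warning model would need careful feature engineering, validation, and bias auditing):

```python
# Synthetic, illustrative sketch only: flag students likely to struggle
# with a concept from simple engagement features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 400
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),   # quiz accuracy so far
    rng.uniform(0.0, 10.0, n),  # weekly hours on task
    rng.integers(1, 6, n),      # average attempts per exercise
])
# Invented label: low accuracy plus low time-on-task -> "struggles",
# with 10% label noise so the task is not trivially separable.
y = ((X[:, 0] < 0.5) & (X[:, 1] < 4.0)).astype(int)
flip = rng.random(n) < 0.10
y[flip] = 1 - y[flip]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print("held-out AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```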
Educational Data Analysis
-
“HIPAA doesn’t apply to public schools.”

That line has misled EdTech teams, confused school leaders, and put student data at risk for over two decades.

It’s true that FERPA usually governs student records in K–12. But here’s what most people miss: HIPAA still matters, especially when schools use third-party platforms to handle health-related data. From school-based telehealth to IEP services to student mental health screening tools, the line between educational records and medical records is blurrier than ever.

And when the wrong line gets crossed? It’s not just a compliance issue. It’s a breach.

Here’s what I tell EdTech companies and school leaders:
- Understand which privacy law governs each data set (there are many permutations of who owns and governs the data)
- Avoid assuming FERPA coverage is a catch-all (again, think of all the permutations in this hyper-connected, third-party-platform-driven world)
- Build governance into your products and partnerships
- Train teams on real-world data-sharing risks, not just legal theory

Because when student privacy is mishandled, it’s not just trust that erodes. It’s the entire foundation of the EdTech ecosystem.

#EdTech #FERPA #HIPAA #AIGovernance #StudentPrivacy
-
One of the challenges many healthcare organizations face is how to make the huge volume of data they generate work for them. Depending on the organization, uses include research, clinical operations, clinical trials, learning health systems, business research, innovation and development, strategic planning and decision making, policy planning, and more.

Some have figured it out, but many still struggle with being ‘data rich but insights/wisdom poor’ due to a poor data strategy for aggregating data across sources, data structures and types, multiple practices or institutions, fragmented technology systems, multiple EHRs, non-clinical data, and so on.

This publication on NIH’s All of Us Data and Research Center summarizes the principles and lessons learned from creating an ecosystem for biomedical research. The guiding principles, the multilevel access for a balance of transparency and privacy, and the use of published standards, including HL7 FHIR and the OHDSI OMOP CDM for health data and the Global Alliance for Genomics and Health standards for genomic data, are part of industry best practices. https://lnkd.in/eKdEVhcq

Links to learn more about each standard are included (in addition, you may like this amazing introductory video to HL7 FHIR by Russell Leftwich MD FAMIA at this link: https://lnkd.in/epQrRYdV).

Kudos to the NIH All of Us teams, participants, and contributors for the ongoing work and for taking the time to share their experience with the community.

#datamanagement #biomedicalresearch #interoperability #healthcareinformatics #dataanalytics #realworldevidence #datascience #innovation #researchanddevelopment #learninghealthsystems #aiandml
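As a small, hedged illustration of what working with one of these standards looks like in practice, here is a sketch of a FHIR R4 search over REST, pointed at the public HAPI FHIR test server (the endpoint and search parameters are illustrative choices, not part of the All of Us publication):

```python
import requests

# Public HAPI FHIR R4 test server, used here purely for illustration;
# substitute your organization's FHIR endpoint and auth in practice.
FHIR_BASE = "https://hapi.fhir.org/baseR4"

resp = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"birthdate": "ge1990-01-01", "_count": 5},
    headers={"Accept": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()
bundle = resp.json()  # FHIR searches return a Bundle resource

for entry in bundle.get("entry", []):
    patient = entry["resource"]
    print(patient["id"], patient.get("birthDate"))
```

The point of the standard is that this same request shape works against any conformant FHIR server, which is what makes cross-institution aggregation tractable.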
-
Nonprofit friends, planning to collect data soon? Remember: your questions shape your data—but they don’t always get you what you need.

Imagine this: you are filling out a border form, and it asks: "Do you exceed duty-free allowances per person?" The only answers are Yes or No. For someone who didn't bring any goods, selecting No implies they did get something but stayed within the limit. The question doesn't account for people for whom it is irrelevant, forcing them to provide inaccurate information.

Now think about your data collection tools (say, your last survey):
● Are your questions boxing people into answers that don't reflect their reality?
● Are you assuming experiences that don't apply to everyone?
● Are you unintentionally excluding voices by limiting response options?

Poorly worded questions = bad data = flawed decisions = a loss of trust.

Here are three examples of common pitfalls:
● Assumptions baked into questions. Example: “What barriers prevent you from attending our events?” assumes the respondent knows about your events and faces barriers. A better question: “Have you heard of our events?” followed by, “What barriers, if any, prevent you from attending?”
● Excluding relevant options. Example: “Which of these programs have you used?” but leaving out “I haven’t used any.” Guess what happens? People pick a random answer or leave it blank, and now your data is a mess.
● Vague questions. Example: “On a scale of 1–5, how satisfied are you with our communication?” Without specifying—emails? Social media? In-person?—responses will be all over the place.

Your questions are your bridge to listening and understanding. Two things to remember here (and by no means is this the complete list):
● Plan your survey – the why, what, how, when, what-next – before jumping to design.
● Use inclusive language, providing options like "Does not apply" wherever relevant. Ensuring respondents can see themselves in the questions and responses is the only way to give them a true choice of what and how much they want to share with us.

Please reach out if you want to plan a Survey Kaleidoscope workshop with your team on your upcoming survey (for context, it's a workshop where we collectively plan every single element of a successful survey).

#nonprofits #nonprofitleadership #community
-
📢 Finally out in Economic Policy 🚨 “The Legacy of Covid-19 in Education” w/ Katharina Werner

Longitudinal evidence from two German school lockdowns on how the pandemic affected the education & skill development of school children.

Free link 👉 https://lnkd.in/dzAr6m57
Permanent link 👉 https://lnkd.in/dUGQVxhx
Ungated version 👉 https://lnkd.in/dZiTxemC

Will the pandemic leave a lasting legacy in human capital? Our survey of >2,000 parents during the 2nd German school lockdown provides
1⃣ new measures of socio-emotional development
2⃣ panel evidence on how students’ time use & educational inputs adapted over time

Students’ learning time was cut in half during the 1st closures (from 7.5 to 3.7 hours per day), with only a modest rebound (to 4.6h) one year later. Lost learning time was particularly substituted by gaming & online activities. Parents rate the effectiveness of an hour of learning at home lower than an hour in school.

Only 7% of schools provided daily online instruction during the 1st closures; 25% did one year later. In a value-added model, introducing daily online instruction is strongly related to increased student learning time (by 1.1h per day); learning time shows no relation with other school activities.

Parental assessments of children’s socio-emotional development are mixed. An increasing share (from 36% to 48%) saw school closures as a great psychological burden for their children, and 55% think school closures harmed social skills. But most parents do not report changes in most dimensions of their children’s socio-emotional wellbeing (SDQ) compared to pre-pandemic times. A reduction in bullying stands out as a strong positive aspect; a deterioration in the ability to concentrate as a negative one.

59% of parents think their child learned much less during school closures, though there were some positive effects on students’ digital skills & self-regulated learning.

Unless remedied, skill losses may leave a substantial lasting legacy in reduced economic outcomes:
⬇️ skill development
⬇️ lifetime income
⬇️ economic growth
⬆️ inequality
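For readers unfamiliar with the value-added approach mentioned above, a generic specification (a sketch only; the paper's exact model may differ, and the symbols here are illustrative) regresses current learning time on a daily-online-instruction indicator while conditioning on pre-period learning time:

$$\text{LearnTime}_{i,t} = \alpha + \beta\,\text{DailyOnline}_{i,t} + \gamma\,\text{LearnTime}_{i,t-1} + X_i'\delta + \varepsilon_{i,t}$$

Under this reading, the post's headline estimate corresponds to \(\beta \approx 1.1\) hours per day.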
-
Measuring Research and Innovation Outputs

Research and Innovation are key drivers of progress in academia, leading to new discoveries, technologies, and ways of thinking that can have a profound impact on the world. However, measuring the research and innovation capacity and output of a university can be a complex challenge. What metrics should be used, and how can universities effectively track and assess their research and innovation activities?

One important factor to consider is research productivity. The number and quality of publications, patents, and other intellectual property generated by a university's faculty can be a strong indicator of innovative thinking and problem-solving. Citation impact, or how frequently a university's research is referenced by others in the field, is another useful metric. Universities can also track the commercialization of their innovations, such as the number of startup companies spun out or licensing deals made.

Beyond traditional research outputs, universities should also look at more holistic measures. These could include the number of interdisciplinary collaborations; the number and quality of doctoral programs, international conferences, and international academic partnerships; joint publications; the quality of research labs; the amount of internal funding; the diversity of research topics and methodologies; the speed of knowledge transfer to real-world applications; and the university's ability to attract top talent and external funding (from industry and research funding agencies) for innovative initiatives. Student-led projects, hackathons, and entrepreneurship programs are other important indicators of a culture of innovation. In addition to academic impact through publications and citations, the social, economic, health, environmental, and quality-of-life impact should also be measured.

Qualitative assessments can supplement quantitative metrics. Interviews, case studies, and peer reviews can provide valuable insights into the quality, creativity, and impact of a university's innovations. Gathering feedback from industry partners, community stakeholders, and other external collaborators can also shed light on the university's ability to drive meaningful change.

Ultimately, a multifaceted approach is needed to accurately gauge a university's research and innovation capacity. By tracking a balanced set of quantitative and qualitative measures, institutions can identify their strengths, pinpoint areas for improvement, and ensure they are delivering on their mission to advance knowledge and positively transform society.

At ADU, Research and Innovation is led by my esteemed colleague Professor Montasir Qasymeh, and all of the above measures are taken into account when measuring our research and innovation outputs. Please share your views if I have missed any important measures.

#Research #Innovation #ADU Hamad Odhabi Khulud Abdallah Abu Dhabi University
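As a concrete illustration of one citation metric mentioned above, here is a minimal sketch (using the standard definition of the h-index; the citation counts are made up for the example) of computing it from per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example with made-up citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

In practice, universities would pull citation counts from bibliometric databases such as Scopus or Web of Science rather than hand-entering them.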
-
Unpacking the impact of digital technologies in Education

This report presents a literature review that analyses the impact of digital technologies in compulsory education. While EU policy recognizes the importance of digital technologies in enabling quality and inclusive education, robust evidence on the impact of these technologies is limited, largely because that impact depends on the context of use. To address this challenge, the literature review analyses the focus, methodologies, and results of 92 papers. The report concludes by proposing an assessment framework that emphasizes self-reflection tools, as they are essential for promoting the digital transformation of schools.

The literature review revealed several key findings:
- Digital technologies influence various aspects of education, including teaching, learning, school operations, and communication.
- Factors like digital competencies, teacher characteristics, infrastructure, and socioeconomic background influence the effectiveness of digital technologies.
- The impact of digital tools on learning outcomes is context-dependent and influenced by multiple factors.
- Existing evidence on the impact of digital tools in education lacks robustness and consistency.

The assessment framework proposed in the report offers a structured approach to evaluating the effectiveness of digital technologies in education:
1. Identify contextual factors influencing technology impact.
2. Map stakeholders and their characteristics.
3. Assess integration into learning processes and practices.
4. Utilize self-reflection tools like the Theory of Change.
5. Provide evaluation criteria aligned with the framework.
6. Adapt existing tools for technology assessment.
7. Consider digital competence frameworks for organizational maturity.

Implications and recommendations for policymakers and educators based on the report findings include:
- Recognizing the contextual nature of technology use.
- Focusing on creating rich learning environments.
- Adopting a systems approach to studying technology impact.
- Ensuring quality implementation and professional development.
- Developing policies for monitoring and evaluation.
- Encouraging further research on technology impact.

By following these recommendations, stakeholders can leverage digital technologies effectively to improve teaching and learning outcomes in educational settings.

https://lnkd.in/eBEN5XQg
-
This new Stanford study might change how we think about AI in education.

Everyone’s talking about AI that writes lessons. But what about AI that understands students?

The study, from Stanford University and Carnegie Learning, found that just 2–5 hours of student interaction with an edtech tool can predict end-of-year test performance with surprising accuracy. In some cases, it matched the predictive power of full-year data or even a formal pre-test.

AI’s real value in education might not be content generation (e.g., lesson planning or rubric generation). It might be early prediction—the ability to identify struggling students before any test is given.

That’s the bet we’re making at Flint. We’re not just helping teachers generate materials. We’re helping them understand where students are, how they’re progressing, and what to do next. All in real time via an army of AI teaching assistants.

The next generation of AI edtech tools will focus on what students need—and when.

Full study (and overview) linked in the comments 👇

#ai #edtech #aiedtech #flint
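To make the early-prediction idea concrete, here is a hedged sketch on synthetic data (not the study's actual features, model, or pipeline): fit a regularized linear model on a handful of early-interaction features and check cross-validated fit.

```python
# Illustrative only: synthetic stand-ins for features extracted from
# the first few hours of tool use (e.g., problems attempted, fraction
# correct, hints requested, active minutes), predicting a synthetic
# end-of-year score.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))             # 4 hypothetical usage features
true_w = np.array([0.5, 1.2, -0.4, 0.3])
y = X @ true_w + rng.normal(scale=0.5, size=n)  # synthetic outcome

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```

The study's reported result is the interesting part: features from only 2–5 hours of interaction carried predictive signal comparable to far richer data.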
-
📊 Only 5 percent of genAI pilots deliver fast revenue gains. The other 95 percent do not move the P&L. The question is not “does AI work?”; it is “are we setting it up to work?”

🧩 MIT’s new analysis shows heavy investment with light returns, especially when projects stay at the demo stage. The winners embed AI into real workflows, adapt systems over time, and measure business outcomes, not novelty.

🎓 For education and EdTech this matters even more. If revenue-led use cases struggle to show quick wins, learning-led use cases will need patient design, teacher training, strong data governance, and clear guardrails. Quick demos do not equal durable classroom impact.

👩🏫 As an EdTech Specialist & AI Lead, I focus on long-term value. I am building AI literacy pathways for staff and students, running practical PD tied to lessons, and aligning tools with GDPR and the EU AI Act. We track time saved, feedback quality, and student outcomes, not hype.

💡 Short-term metrics can underprice long-term transformation. The real gains show up in better feedback loops, improved planning, and consistent assessment, plus safer data practices that unlock responsible innovation. That takes strategy, not just spend.

💬 How are you balancing quick wins with long-term AI investment in your school or organisation? Which 2 or 3 metrics prove value in the first 12 months without chasing vanity numbers? Share your approach below!
-
This paper discusses the implementation of an effective data strategy at Centre Léon Bérard (CLB), a French comprehensive cancer center, to enhance cancer research and care.

1️⃣ Utilizes electronic medical records, clinical trial data, and patient-reported outcome measures for data collection, applying advanced data analysis techniques like natural language processing.
2️⃣ Emphasizes secure and compliant data sharing, crucial for collaboration across cancer centers, with adherence to regulations like GDPR.
3️⃣ Discusses the CONSORE project under the UNICANCER initiative, aiming to develop a structured and standardized repository of patient data for cancer research and improved patient outcomes.
4️⃣ Highlights the significance of Real-World Data studies in cancer research, providing insights into patient outcomes and treatment patterns.
5️⃣ Addresses the importance of data interoperability for effective data exchange and integration, mentioning standards like FHIR and OMOP.
6️⃣ Illustrates how a well-designed data strategy can contribute to improved patient care and advance research in cancer treatment and management.

This paper is valuable for its detailed examination of the practical application of data strategies in a cancer center, offering insights into the benefits and challenges of implementing such strategies. It's a must-read for healthcare professionals and researchers interested in data-driven approaches to cancer care and research.

✍🏻 Pierre Heudel, Hugo Crochet, Thierry Durand, Philippe Zrounba, Jean-Yves Blay. From data strategy to implementation to advance cancer research and cancer care: A French comprehensive cancer center experience. PLOS Digital Health. 2023.

✅ Subscribe to my newsletter and stay at the forefront of groundbreaking studies. Get started here: https://lnkd.in/eR7qichj
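As a toy illustration of point 5️⃣, here is a sketch of the kind of query an OMOP CDM repository enables, run against a hypothetical local SQLite copy of the CDM (real deployments typically use Postgres or similar; the `omop_cdm.db` file is an assumption for illustration):

```python
import sqlite3

# Assumes a local SQLite file containing OMOP CDM tables; the table
# and column names (condition_occurrence, condition_concept_id,
# person_id) are standard OMOP CDM, but the database file is invented.
conn = sqlite3.connect("omop_cdm.db")
query = """
    SELECT condition_concept_id,
           COUNT(DISTINCT person_id) AS n_patients
    FROM condition_occurrence
    GROUP BY condition_concept_id
    ORDER BY n_patients DESC
    LIMIT 10
"""
for concept_id, n_patients in conn.execute(query):
    print(concept_id, n_patients)
conn.close()
```

Because every OMOP site shares this schema, the same query can run unchanged across collaborating centers, which is the kind of interoperability benefit the paper highlights.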