When DP World Foundation, Dubai's Knowledge and Human Development Authority (KHDA), and MIT announced in 2026 that every Grade 6-8 student in the emirate's private schools would receive AI literacy education, they didn't build the curriculum from scratch. They partnered with MIT RAISE, the same initiative whose tools and research are already used by educators in over 175 countries. That decision tells you something about where the global standard for K-12 AI education is being set.
MIT RAISE (Responsible AI for Social Empowerment and Education) is an MIT-wide initiative directed by Cynthia Breazeal, founder of the MIT Media Lab's Personal Robots Group, and co-directed by Hal Abelson, Eric Klopfer, and Hae Won Park. What started as a collection of research projects in AI education has grown into the largest open-access AI literacy infrastructure in the world: free curricula spanning ages 5 to 18, a block-based app development platform with 25 million learners, teacher professional development programmes active on every continent, and a research portfolio of over 30 peer-reviewed publications studying how children actually learn AI concepts.
The numbers alone are striking. But what makes RAISE worth understanding is not its scale. It's the pedagogical philosophy that produced the scale.
The Core Idea: Students as Designers, Not Consumers
Most approaches to AI education treat students as end-users. They learn what AI is, how it works in broad terms, and perhaps discuss its ethics. RAISE operates from a fundamentally different premise, rooted in MIT's constructionist tradition, which traces back to Seymour Papert's insight that people learn most deeply when they are building something meaningful.
In RAISE's model, students don't just study AI. They design AI systems. A middle schooler in the DAILy curriculum doesn't read about neural networks. She plays a game where she physically acts as a node in a three-layer neural network, making classification decisions in real time. Nine-year-olds using PoseBlocks don't watch a video about computer vision. They program body-tracking AI that responds to their dance movements. A high school student in the FutureBuilders programme doesn't take a quiz on machine learning. She builds a working mobile app with MIT App Inventor that uses image classification to solve a problem in her community.
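The node role-play maps directly onto the arithmetic it teaches. As a minimal sketch (the weights, thresholds, and inputs here are invented for illustration, not taken from the curriculum), each "student node" sums its weighted inputs and fires or stays silent, which is exactly a forward pass through a tiny three-layer network:

```python
# Toy three-layer network: each "node" does what a student playing
# that node does in the classroom game -- take weighted inputs,
# sum them, and fire (or not) based on a threshold.

def node(inputs, weights, threshold=0.5):
    """One student-node: weighted sum, then a step activation."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def classify(point):
    """Forward pass: input layer -> hidden layer -> output node."""
    # Hidden layer: two nodes, each with its own (invented) weights.
    h1 = node(point, [1.0, 0.0])   # fires when the first feature is large
    h2 = node(point, [0.0, 1.0])   # fires when the second feature is large
    # Output node: fires only when both hidden nodes fire.
    return node([h1, h2], [0.6, 0.6], threshold=1.0)

print(classify([0.9, 0.8]))  # both features high -> 1
print(classify([0.9, 0.1]))  # only one feature high -> 0
```

The point of the classroom version is that each child only ever sees local inputs and a simple rule, yet the group as a whole classifies correctly, which is the essential intuition behind distributed computation in a network.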
This distinction between consuming AI knowledge and constructing with AI is not just philosophical. Research from the RAISE team consistently shows it produces different outcomes. In a study of 31 middle school students, 87% from groups underrepresented in STEM, the 30-hour DAILy curriculum produced measurable gains not only in technical AI understanding but in ethical reasoning and awareness of AI career pathways. In another study, 72 middle schoolers who role-played as generators and discriminators in an interactive GAN game developed intuitive understanding of adversarial networks, a concept that challenges university students. The hands-on approach works because it connects abstract AI mechanics to things students can see, touch, and manipulate.
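The generator/discriminator role-play boils down to an adversarial loop, which can be sketched in a few lines. This toy (entirely invented, not the curriculum's game) uses a single number as the "real" data, a cutoff as the discriminator, and a forgery value as the generator:

```python
# Toy version of the generator-vs-discriminator game behind GANs.
# "Real" data is the number 10 (fixed here for simplicity); the
# generator's forgery starts at 0 and learns to fool the discriminator.

REAL = 10.0
gen_value = 0.0

for _ in range(20):
    # Discriminator step: place a cutoff halfway between the
    # real sample and the current forgery.
    cutoff = (REAL + gen_value) / 2
    # Generator step: nudge the forgery toward the cutoff,
    # i.e. toward looking more like real data.
    gen_value += 0.5 * (cutoff - gen_value)

print(round(gen_value, 1))  # -> 10.0: forgeries now resemble real data
```

Each round the discriminator's boundary moves and the generator adapts to it, so the forgery converges toward the real data, the same dynamic the students act out physically.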
"AI is not only a tool to be understood, but also a force for creative action."
What the Curricula Actually Cover
RAISE operates several distinct but interconnected programmes, each designed for different age groups and contexts.
Day of AI (Ages 5-18)
The flagship curriculum, now used in all 50 US states and 175+ countries, offers 12 units organised as a spiral curriculum aligned with the UNESCO AI Competency Framework for Students. Core AI concepts are introduced at age 5 and revisited with increasing depth through to age 18.
For the youngest learners (ages 5-7), units like "How We Teach Machines" use entirely device-free activities. Children sort objects into artificial and natural categories, act out what it means for a machine to make a prediction, and physically walk through what an algorithm does. No screens required.
For ages 8-10, students begin working with tools like Google's Teachable Machine and Quick Draw, training their own image classifiers and investigating how bias enters datasets. By ages 11-13, units tackle surveillance, deepfakes, misinformation, and algorithmic bias, but always through interactive simulations rather than lectures. One unit, "From Data to Decision," places students inside an AI surveillance simulator at a fictional airport where they must evaluate facial recognition scores, distinguish prediction from decision-making, and design human safeguards.
For ages 14-18, units cover applied machine learning, generative AI in the creative arts, and AI's impact on the workforce. A separate RAISE-developed curriculum, "AI and the Creative Arts," was showcased at the 2025 Day of AI celebration at Boston's Museum of Fine Arts, where high school students presented generative self-portraits and mixed-media animations they had created.
DAILy Curriculum (Grades 5-8)
The Developing AI Literacy curriculum is a deeper dive: 22 lessons across five units totalling approximately 32 hours of instruction. Developed by MIT's STEP Lab and Media Lab researchers, it can be taught as a standalone unit or woven into existing courses.
The structure moves from foundations (What is AI? How can algorithms encode opinions? What does bias look like in a decision tree?) through technical concepts (supervised learning, neural networks, the distinction between classification and generation) into creative applications (GANs, AI-generated art and stories) and societal consequences (deepfakes, misinformation, environmental impact of AI). The final unit, still in pilot, teaches large language models through unplugged simulations of word embeddings, attention mechanisms, and reinforcement learning from human feedback, designed specifically for students who may not have access to LLM tools.
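The question "How can algorithms encode opinions?" is concrete once you write a decision tree down. The following sketch is an invented example in the spirit of that lesson, not material from DAILy itself; every threshold is a judgement call made by the tree's author:

```python
# Hand-built decision tree in the spirit of the "algorithms encode
# opinions" lesson: every threshold below is a choice someone made,
# not an objective fact. (Invented illustrative example.)

def approve_library_card(applicant):
    """Decide whether to waive a deposit for a library card."""
    if applicant["years_at_address"] >= 2:   # opinion: stability equals trust
        return "waive deposit"
    if applicant["age"] >= 65:               # opinion: seniors are low-risk
        return "waive deposit"
    return "require deposit"                 # the default penalises recent movers

print(approve_library_card({"years_at_address": 1, "age": 30}))
# -> "require deposit": the rule's authors decided movers look risky
```

Nothing here is statistical; the bias is written directly into the branches, which is precisely what makes decision trees a good first vehicle for discussing fairness.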
Each lesson runs 45-50 minutes with 15-20 minutes of teacher preparation. The modular design is deliberate: a mathematics teacher can pull in the decision tree lesson, a language arts teacher can use the AI-generated story unit, and a social studies teacher can run the misinformation and deepfakes sequence.
RAICA (Middle and High School)
Responsible AI for Computational Action is the most ambitious curriculum, currently in pilot. It covers image classification, facial recognition, natural language processing, affective computing, reinforcement learning, and data science. Where Day of AI and DAILy focus on understanding, RAICA pushes students to build AI-powered projects that address real problems in their communities.
The pilot is running in an unusual setting: the Dzaleka Refugee Camp in Malawi, through a partnership with Africa Deep AI (ADAI) Circle. MIT researchers and ADAI educators co-designed the materials through weekly meetings, classroom recordings, and student project reviews. The collaboration forced meaningful curriculum adjustments: vocabulary supports for multilingual learners, culturally resonant examples, materials that work both online and in print across different device types. Three modules are being tested with a 15-student cohort: Picture This (computer vision), Social Robots, and a capstone project.
The Platform: MIT App Inventor
Underlying much of RAISE's work is MIT App Inventor, the free block-based mobile app development platform that has become one of the most widely used programming tools in education.
The statistics are remarkable: 25 million learners, 120 million apps built, active in over 200 countries with 48% of usage coming from the developing world. The platform supports 19 languages. More than 11,000 schools use it monthly, and nearly 85% of users are students.
For AI education specifically, App Inventor provides components for image classification, text classification (using TensorFlow), pose detection for body, hand, and face tracking, conversational AI through Alexa integration, and a complete data science toolkit with IoT sensor connectivity, anomaly detection, charting, and linear regression. A newer tool, Aptly, lets students generate mobile applications from natural language descriptions.
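App Inventor itself is block-based, so its regression component isn't written as text code, but the least-squares idea it exposes is compact enough to show directly. A plain-Python sketch of the underlying arithmetic (made-up data, not an App Inventor API):

```python
# The ordinary-least-squares idea behind a charting/regression tool,
# written out in plain Python for a line y = slope * x + intercept.

def fit_line(xs, ys):
    """Fit a straight line to paired data by least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# e.g. hours elapsed vs. sensor temperature readings (made-up numbers)
slope, intercept = fit_line([0, 1, 2, 3], [10, 12, 14, 16])
print(slope, intercept)  # -> 2.0 10.0 on this perfectly linear toy data
```

A student wiring IoT sensor readings into a chart is, in effect, running exactly this computation; seeing it in five lines demystifies the "trend line" button.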
The platform's significance for AI literacy is straightforward: it lets students build working AI applications without writing traditional code. When a student trains an image classifier and deploys it in a mobile app that she can show to her family, AI stops being an abstract concept and becomes something she made.
What the Research Shows
RAISE is not just a curriculum provider. It is a research programme, and the publications in its archive provide an unusually detailed evidence base for what works in K-12 AI education.
Several findings stand out:
Embodied learning transfers to abstract understanding. In the "Contour to Classification" game, middle school students physically acted as nodes in a neural network, developing accurate mental models of forward propagation and decision boundaries. When 46 students aged 9-14 used PoseBlocks to create movement-based AI projects, they demonstrated understanding of feedback loops between human action and AI response. The consistent finding: making AI concepts physical and interactive produces deeper learning than explanation alone.
Ethics must be embedded, not bolted on. A 2022 study of three project-based AI curricula found that integrating ethical reasoning throughout the technical content, rather than adding it as a separate module, produced stronger outcomes in both domains. Students who built AI systems while simultaneously considering their social implications developed more nuanced ethical thinking than those who studied ethics in isolation.
Identity matters in computing education. The Data Activism Programme, working with African American high school and college students in Boston and Cambridge, found that integrating an "Archaeology of Self" framework with data science education helped students develop reflexive consciousness about their identity in relation to technology. Students who explored who they were before exploring what AI is showed stronger engagement and deeper critical thinking about algorithmic bias.
Teachers need to design with AI, not just learn about it. A co-design study with 15 K-12 teachers found four core priorities: practical evaluation methods, student engagement that connects to existing subjects, logistical feasibility within real school constraints, and collaborative learning structures. The critical insight was that teachers don't want standalone AI courses. They want AI literacy integrated into the subjects they already teach.
Going Global: The Dubai Model
This brings us back to Dubai. The DP World Foundation, KHDA, and MIT RAISE partnership represents perhaps the most ambitious deployment of RAISE's work to date.
The AI Literacy Programme targets all Grade 6-8 students in Dubai's private schools, with a four-year rollout from 2026 to 2030. The curriculum is designed as short-format, cross-subject lessons integrated into six existing subjects: Mathematics, Science, Computing, Art, English, and Arabic. Each subject carries approximately five AI literacy lessons, delivered through curriculum materials, assessments, and curated AI tools via a digital portal developed by MIT RAISE.
This cross-subject integration directly reflects the research finding that teachers want AI woven into their existing disciplines. A mathematics teacher delivers AI lessons through the lens of data and algorithms. An art teacher explores generative AI and creative expression. An Arabic teacher examines how language models handle different languages and scripts. AI literacy doesn't compete with existing subjects. It enriches them.
Teachers receive professional development workshops, training on contextualising AI literacy for their classrooms, access to MIT RAISE curriculum materials, ongoing support from a dedicated local team in Dubai, optional access to MIT's Generative AI for Educators online course, and certification from MIT RAISE upon completion.
No prior AI knowledge is required from teachers or students. The programme is free. And student learning is assessed through formative evaluation embedded in classroom implementation (observations, teacher feedback, and monitoring of skill development) rather than high-stakes testing.
Alongside the AI Literacy Programme, Dubai is also adopting the FutureBuilders Programme, adapted from MIT's FutureMakers model, for high-performing high school students. This is an intensive enrichment programme focused on technical AI skills, entrepreneurial thinking, and leadership, running as a hybrid format (four weeks online, one week in person) during school breaks. Students work in teams on capstone projects, building real AI applications and presenting them in entrepreneurial-style pitches. Tracks may include AI and Mobile App Development, Data in Action, Applied Deep Learning, and emerging areas like Agentic AI. Serving 40 to 100 students per session, the programme is designed to identify and develop the next generation of AI builders in the region.
Why This Matters Beyond Dubai
The Dubai programme is significant not because it is unique, but because it represents a pattern. Governments and education systems worldwide are recognising that AI literacy can no longer be treated as an elective, an enrichment activity, or a future consideration. When Sierra Leone's Minister of Communication, Technology, and Innovation Salima Bah addressed the 2025 MIT AI and Education Summit, she posed the question that policymakers everywhere are grappling with: "Who defines what responsible AI in education looks like? Whose values are embedded in the algorithms?"
RAISE's answer has been to build systems that are open, free, multilingual, and designed for co-creation with local communities rather than top-down imposition. The Malawi partnership showed that curriculum materials built at MIT need significant adaptation to work in a refugee camp. Professional development programmes run through the MIT Hong Kong Innovation Node have shown that teacher confidence grows rapidly when training is practical and hands-on. The Dubai deployment will test whether the approach works at scale within a highly diverse, multilingual school system.
At the 2025 Summit, Cynthia Breazeal offered what amounts to RAISE's thesis: "Whether you're a learner, a parent, a policymaker, AI and education now go hand in hand. To build with AI, to use it responsibly, that can happen only in learning environments that encourage both creative exploration and being mindful of the ways that AI directly impacts you, your family, your community, and beyond."
The 25 million students who have already used RAISE's tools suggest the thesis is being tested at meaningful scale. The question now is whether education systems will treat AI literacy as foundational, as reading and mathematics are foundational, or continue to treat it as optional. Dubai has made its choice. The evidence from MIT RAISE suggests more will follow.
MIT RAISE resources are freely available at raise.mit.edu. The Day of AI curriculum is at dayofai.org. The DAILy curriculum is at everyday-ai.org. MIT App Inventor is at appinventor.mit.edu.