DoodleIQ
AI-Powered STEM Learning, UDL, AR, and Teacher Analytics
Product strategy, learning design, UX design, and prototype development for a mobile-first STEM learning platform designed to support differentiated instruction through AI-assisted practice, AR-enhanced learning activities, multiple means of assessment, and teacher-facing analytics.
AI in education • learning design • STEM • UDL • accessibility • adaptive scaffolding • AR learning • teacher analytics • formative assessment • student engagement • product strategy • UX/UI design • vibe coding • Codex • capstone prototype • mobile-first design • inclusive learning
The Problem
STEM instruction often requires teachers to support a wide range of learner needs, abilities, confidence levels, and engagement styles within the same classroom.
Students may struggle when content is presented in only one way or when assessments rely too heavily on a single response format. Teachers also need clearer, faster insight into where students are struggling so they can adjust instruction before gaps widen.
For my capstone, I wanted to explore how an AI-powered learning experience could help students build STEM understanding through emerging technologies while giving teachers actionable formative data.
The challenge was to design a product that could:
Support students with different learning needs and preferences.
Use AI to provide scaffolded practice without replacing teacher judgment.
Incorporate AR and hands-on learning to make STEM concepts more concrete.
Give teachers usable analytics that support instructional decisions.
Feel playful and approachable without becoming childish or overwhelming.
Discovery and Research
To define the product vision, I started with research rather than features.
My discovery process included:
A literature review focused on Universal Design for Learning, experiential learning, situated learning, constructivism, scaffolding, formative assessment, and AI-supported instruction.
A market analysis of current STEM, AI tutor, AR lab, and classroom practice tools.
Competitive research across direct, adjacent, and aspirational edtech products.
A feature comparison matrix to identify patterns, gaps, and opportunities.
Exploration of app store, web, and AI search results using terms such as “AR lab,” “AI tutor,” “accessible STEM,” “UDL,” “teacher analytics,” and “adaptive learning.”
Review of how existing products positioned AI, accessibility, assessment, personalization, and classroom implementation.
Synthesis of academic and market findings into a product opportunity statement.
This process helped me avoid designing from personal preference alone. The product direction came from the overlap between learning theory, teacher needs, student needs, technical possibility, and market gaps.
Literature Review: Learning Theory as Product Strategy
The literature review shaped the product’s instructional model.
Universal Design for Learning became the primary foundation because the product needed to support multiple ways for students to access content, engage with practice, and demonstrate understanding.
Constructivism and situated learning influenced the hands-on and contextual nature of the activities. I wanted students to build understanding through interaction, application, and meaningful examples, rather than simply receiving explanations.
Experiential learning influenced the inclusion of AR and practice-based activities. STEM concepts often become clearer when students can manipulate, observe, test, and apply ideas in concrete ways.
Scaffolding and the Zone of Proximal Development shaped the AI support model. The AI should not simply give students answers. It should provide the right level of support at the right time, helping students move from what they can do independently toward what they can do with guidance.
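To make that support model concrete, the scaffolding logic can be pictured as a tiered "hint ladder," where each unsuccessful attempt unlocks a more supportive response rather than the answer itself. The sketch below is illustrative only; the tier names, thresholds, and `select_scaffold` function are hypothetical and not taken from the DoodleIQ prototype.

```python
# Illustrative sketch of a tiered scaffolding ("hint ladder") policy.
# Tier names and thresholds are hypothetical, not the DoodleIQ implementation.

SCAFFOLD_TIERS = [
    "encourage",        # attempt 1: prompt the student to try again
    "conceptual_hint",  # attempt 2: restate the underlying concept
    "worked_step",      # attempt 3: model the first step of a solution
    "guided_practice",  # attempt 4+: walk through a parallel example
]

def select_scaffold(failed_attempts: int) -> str:
    """Return the scaffold tier for the AI's next response.

    Support escalates with each failed attempt but caps at guided
    practice -- the AI never reveals the answer outright, keeping the
    student working inside their zone of proximal development.
    """
    index = min(failed_attempts, len(SCAFFOLD_TIERS)) - 1
    return SCAFFOLD_TIERS[max(index, 0)]
```

A policy like this also leaves room for teacher judgment, since an instructor-facing setting could cap the maximum tier the AI is allowed to reach.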
Formative assessment research influenced the teacher analytics strategy. The product needed to help teachers identify patterns of misunderstanding while there was still time to intervene.
Market Analysis and Competitive Review
The market analysis showed that many products solve part of the problem, but few combine the full set of needs I was exploring.
Some tools focus on gamified practice. Others focus on tutoring, classroom management, AR science labs, or analytics. The opportunity for DoodleIQ emerged from combining:
AI-guided practice with teacher-controlled guardrails.
UDL-aligned response flexibility.
AR-supported STEM activities.
Teacher-facing formative analytics.
A playful but credible classroom experience.
Accessibility-conscious design from the start.
The competitive review helped me identify where DoodleIQ could be meaningfully different. I was not trying to build “another AI tutor.” I was defining a learning support tool that helps students practice STEM concepts in flexible ways while helping teachers understand what is happening across the classroom.
Product Opportunity
The research and market analysis led to a clear product opportunity:
Teachers need an accessible, flexible STEM practice tool that supports different learners, provides scaffolded feedback, and turns student activity into actionable instructional insight.
This became the foundation for DoodleIQ.
The product would not replace teachers or classroom instruction. Instead, it would extend the teacher’s ability to support students during and after learning activities.
The product vision focused on three connected goals:
Help students build confidence through flexible, scaffolded STEM practice.
Help teachers identify where students are struggling through actionable formative data.
Use AI and AR only where they support learning, not as novelty features.
Product Vision
DoodleIQ is a mobile-first STEM learning platform that helps students explore, practice, and demonstrate understanding through AI-assisted scaffolding, AR-enhanced activities, and multiple ways to respond.
The teacher experience gives educators control over modules, assessment settings, response types, and classroom data. The student experience provides a supportive, playful environment where learners can practice concepts, receive feedback, and revisit missed ideas over time.
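One way to sketch how teacher control over response types and AI support might be represented is a small settings object attached to each assessment. The field names, mode labels, and defaults below are assumptions for illustration, not the actual DoodleIQ schema.

```python
from dataclasses import dataclass, field

# Hypothetical data model for teacher-configured assessment settings.
# Field names and mode labels are illustrative, not the real schema.

ALL_RESPONSE_MODES = {"multiple_choice", "drawing", "voice", "short_text"}

@dataclass
class AssessmentSettings:
    enabled_modes: set = field(default_factory=lambda: set(ALL_RESPONSE_MODES))
    allow_retries: bool = True
    max_scaffold_level: int = 3  # teacher guardrail on AI support depth

    def is_mode_allowed(self, mode: str) -> bool:
        """Students only see response formats the teacher has enabled,
        supporting UDL's multiple means of action and expression."""
        return mode in self.enabled_modes
```

Defaulting to all modes enabled keeps flexibility the norm, while still letting a teacher narrow the options for a specific check for understanding.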
The core product idea became:
Where doodles become do-ables.
That phrase captured the intent of the product: moving students from curiosity and early ideas into confident application.
Feature Strategy
Once the product opportunity was clear, I translated the research into a focused MVP feature set.
The MVP included:
Teacher onboarding and login.
Class creation.
Student roster upload, by bulk or individual entry.
Module creation.
Assessment item creation.
Teacher-configurable response modes.
Student-facing learning modules.
AI-assisted practice and scaffolded feedback.
AR-supported STEM activities.
Practice mode for reinforcement.
Class-level analytics.
Student-level analytics.
Misconception and error pattern insights.
Each feature had to connect back to the learning strategy. If a feature did not support student understanding, teacher decision-making, accessibility, or classroom feasibility, it was deprioritized.
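Of these features, the misconception and error-pattern insights are the most mechanical: class-level patterns can be derived by tagging wrong answers and counting how often each tag recurs. The sketch below is purely illustrative; the response shape, misconception tags, and threshold are hypothetical examples, not the prototype's analytics pipeline.

```python
from collections import Counter

# Illustrative aggregation of tagged wrong answers into class-level
# misconception insights. Tags and threshold are hypothetical examples.

def misconception_summary(responses, min_count=3):
    """Count how often each tagged misconception appears across wrong
    answers, surfacing only patterns frequent enough to act on.

    `responses` is an iterable of dicts like:
        {"student": "s1", "correct": False,
         "misconception": "confuses_area_perimeter"}
    """
    counts = Counter(
        r["misconception"]
        for r in responses
        if not r["correct"] and r.get("misconception")
    )
    # Most common patterns first, so teachers see the biggest
    # class-wide gaps at the top of the analytics view.
    return [(tag, n) for tag, n in counts.most_common() if n >= min_count]
```

Filtering by a minimum count keeps the teacher view focused on patterns worth an instructional response rather than one-off slips.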
Product Definition and User Flows
After defining the MVP, I mapped the experience into teacher and student workflows.
The teacher flow included:
First-time user experience.
Login and password reset.
Dashboard review.
Class creation.
Student roster upload.
Module creation.
Assessment item creation.
Analytics review.
Student-level performance exploration.
The student flow included:
Entering a module.
Engaging with instructional content.
Completing practice activities.
Responding through teacher-enabled formats.
Receiving scaffolded feedback.
Revisiting missed concepts.
Completing checks for understanding.
Continuing practice at home or in class.
This helped turn the product vision into a buildable experience.
Prototype Development
I built the working MVP using Codex as part of an AI-assisted development process.
This allowed me to move beyond static screens and create a functional prototype that demonstrated the product experience across teacher and student workflows.
The prototype process included:
Translating product requirements into user stories.
Creating feature narratives and interaction flows.
Iterating on UI structure and component behavior.
Building the teacher dashboard and classroom flows.
Building student module and practice experiences.
Creating analytics views.
Testing, debugging, and refining the prototype.
Deploying the prototype to Vercel.
Creating a product website to explain the concept and market positioning.
Using AI in the build process let me expand beyond my formal training. I was able to combine my UX, product strategy, learning design, and research background with AI-assisted implementation to create something closer to a real early-stage product.
The Final Outcome
The final capstone was not just a prototype. It was a research-backed product concept developed through a full early-stage product process.
The work included:
Literature review.
Market analysis.
Competitive research.
Product opportunity framing.
Learning theory synthesis.
Feature strategy.
MVP definition.
User flows.
UX/UI design.
Brand direction.
AI-assisted prototype development.
Product website development.
Deployment to the web.
DoodleIQ became a demonstration of how research, learning science, product thinking, and AI-assisted development can come together to create a more inclusive STEM learning experience.