# Measurable Outcomes

## Outcome Framework
The El Segundo AI Academy employs a comprehensive measurement framework aligned with Stanford CREATE AI Challenge evaluation criteria. All outcomes are measurable, time-bound, and supported by validated instruments.
## Primary Outcomes

### 1. Gender Parity in Participation
| Metric | Baseline | Target | Measurement Method |
|---|---|---|---|
| Female participation in AI programming | 22% (industry) | 50% | Enrollment records |
| Female completion of AI course sequences | 35% (national) | 50% | Completion tracking |
| Girls expressing AI career interest | 18% (survey) | 45% | Pre/post career surveys |
| Female representation in showcases | Variable | 50% | Event participation data |
**Measurement Instruments:**
- Enrollment and completion tracking in student information system
- Pre/post career interest surveys (validated STEM Career Interest Survey)
- Event participation demographics
### 2. Accessibility Compliance
| Metric | Baseline | Target | Measurement Method |
|---|---|---|---|
| WCAG 2.1 AA compliance | Audit required | 100% | Third-party accessibility audit |
| Assistive technology compatibility | Not tested | 100% | User testing with AT users |
| Student disability participation rate | District average | Equal or higher | Enrollment comparison |
| Accommodation request fulfillment | Not tracked | 100% | Request tracking system |
**Measurement Instruments:**
- Professional accessibility audit using WAVE, axe, and manual testing
- User testing sessions with students using assistive technology
- Comparison of disability participation rates to district demographics
### 3. Engagement Metrics
| Metric | Baseline | Target | Measurement Method |
|---|---|---|---|
| Overall engagement score | New program | 90%+ positive | Student engagement survey |
| Underrepresented learner engagement | New program | Equal to peers | Disaggregated survey data |
| Voluntary participation in optional programming | N/A | 40%+ of eligible | Participation tracking |
| Return rate for multi-session programs | N/A | 80%+ | Attendance records |
**Measurement Instruments:**
- Validated student engagement survey (PERTS Engagement Survey)
- Disaggregated analysis by gender, disability status, ELL status
- Attendance and participation tracking
### 4. Learning Outcomes
| Metric | Baseline | Target | Measurement Method |
|---|---|---|---|
| AI concept mastery (by grade band) | Pre-assessment | 80%+ proficiency | Standards-aligned assessments |
| Ethical reasoning development | Pre-assessment | Significant growth | Ethical reasoning rubric |
| Portfolio completion (9-12) | N/A | 90%+ | Portfolio submission tracking |
| Peer collaboration quality | N/A | 80%+ positive | Collaboration rubric |
**Measurement Instruments:**
- Standards-aligned assessments mapped to curriculum learning objectives
- Ethical reasoning rubric adapted from AAC&U VALUE rubrics
- Portfolio evaluation using curriculum-aligned rubric
- Peer collaboration rubric with student self-assessment
## Secondary Outcomes

### Teacher Capacity
| Metric | Target | Measurement Method |
|---|---|---|
| Teachers completing PD | 80%+ of AI-teaching staff | Training records |
| Teacher confidence in AI instruction | 80%+ confident | Teacher self-efficacy survey |
| Teacher use of accessibility features | 100% | Classroom observation |
### Community Impact
| Metric | Target | Measurement Method |
|---|---|---|
| Parent awareness of AI curriculum | 70%+ | Parent survey |
| Community partner engagement | 5+ active partners | Partnership tracking |
| Media/visibility for equity approach | 3+ features | Media tracking |
## Disaggregated Analysis Plan
All primary outcomes will be analyzed with disaggregation by:
- Gender: Male, female, non-binary/other
- Disability status: IEP, 504, no formal plan
- English learner status: ELL, RFEP, English-only
- Race/ethnicity: All NCES categories
- Socioeconomic status: Free/reduced lunch eligibility
This disaggregation ensures that we can identify any subgroup that is not benefiting equitably and implement targeted interventions in response.
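As an illustration, subgroup breakdowns of this kind can be computed directly from enrollment records. The sketch below is a minimal, hypothetical example using pandas; the column names (`gender`, `iep_504`, `enrolled`) and the toy data are placeholders, not the district's actual schema.

```python
# Illustrative sketch: disaggregating a participation metric by subgroup.
# All column names and values below are hypothetical placeholders.
import pandas as pd

# Toy enrollment records: one row per eligible student.
records = pd.DataFrame({
    "gender":   ["F", "M", "F", "M", "F", "NB"],
    "iep_504":  ["IEP", "none", "none", "504", "none", "none"],
    "enrolled": [1, 1, 0, 1, 1, 0],  # 1 = enrolled in AI programming
})

def participation_by(df: pd.DataFrame, column: str) -> pd.Series:
    """Participation rate (share enrolled) for each subgroup in `column`."""
    return df.groupby(column)["enrolled"].mean()

print(participation_by(records, "gender"))
print(participation_by(records, "iep_504"))
```

The same pattern extends to any of the disaggregation dimensions listed above by adding the corresponding column and calling `participation_by` on it.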
## Data Collection Timeline
| Quarter | Data Collection Activities |
|---|---|
| Q1 | Baseline surveys, pre-assessments, enrollment demographics |
| Q2 | Mid-point engagement surveys, formative assessments |
| Q3 | Participation tracking, interim outcome analysis |
| Q4 | Post-assessments, final surveys, accessibility audit, portfolio review |
## Reporting Plan

### Internal Reporting
- Monthly: Participation dashboards with demographic breakdowns
- Quarterly: Outcome progress reports to project leadership
- Ongoing: Real-time accessibility compliance monitoring
### External Reporting
- Stanford Reports: Quarterly progress reports aligned with grant requirements
- Public Transparency: Annual equity impact report published publicly
- Research Dissemination: Peer-reviewed publication of findings
## Continuous Improvement
Data collection is not merely for accountability; it drives ongoing curriculum refinement. Quarterly data reviews inform mid-course adjustments to maximize equity impact.
## Success Criteria
The project will be considered successful if:
- Gender parity achieved: 50% female participation in AI programming
- Full accessibility: 100% WCAG 2.1 AA compliance verified by external audit
- Engagement equity: No significant engagement gap between underrepresented learners and peers
- Learning demonstrated: 80%+ of students demonstrate proficiency in grade-level AI competencies
- Sustainability established: Curriculum and professional development institutionalized for continuation beyond grant period
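The "engagement equity" criterion implies a statistical comparison between groups. One simple way to operationalize it is a two-proportion z-test on the share of positive survey responses, sketched below. This is a generic illustration with made-up counts, not the project's specified analysis plan.

```python
# Illustrative two-proportion z-test for an engagement gap.
# Group labels and counts are hypothetical, not project data.
from math import sqrt, erf

def two_proportion_z(pos_a: int, n_a: int, pos_b: int, n_b: int) -> tuple[float, float]:
    """Return (z, two-sided p-value) for H0: equal positive-response rates."""
    p_a, p_b = pos_a / n_a, pos_b / n_b
    pooled = (pos_a + pos_b) / (n_a + n_b)          # pooled positive rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 84/100 positive in one group vs 90/100 in the other.
z, p = two_proportion_z(84, 100, 90, 100)
print(round(z, 2), round(p, 3))  # a p-value above 0.05 would suggest no significant gap
```

In practice the evaluation team would likely pair a test like this with the effect size itself, since with small subgroups a non-significant result alone does not establish equity.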