Measurable Outcomes

Outcome Framework

The El Segundo AI Academy employs a comprehensive measurement framework aligned with Stanford CREATE AI Challenge evaluation criteria. All outcomes are measurable, time-bound, and supported by validated instruments.

Primary Outcomes

1. Gender Parity in Participation

| Metric | Baseline | Target | Measurement Method |
| --- | --- | --- | --- |
| Female participation in AI programming | 22% (industry) | 50% | Enrollment records |
| Female completion of AI course sequences | 35% (national) | 50% | Completion tracking |
| Girls expressing AI career interest | 18% (survey) | 45% | Pre/post career surveys |
| Female representation in showcases | Variable | 50% | Event participation data |

Measurement Instruments:

  • Enrollment and completion tracking in student information system
  • Pre/post career interest surveys (validated STEM Career Interest Survey)
  • Event participation demographics
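
In practice, parity tracking from enrollment records reduces to a rate computation against the 50% target. A minimal sketch in Python, assuming a hypothetical CSV export with gender and program columns (actual field names depend on the student information system):

```python
# Sketch of a parity check against the 50% participation target.
# "ai_program_enrollments.csv" and its column names are hypothetical.
import pandas as pd

TARGET = 0.50  # gender-parity target from the outcome table above

enrollments = pd.read_csv("ai_program_enrollments.csv")

# Overall female participation rate across all AI programming.
overall = (enrollments["gender"] == "female").mean()
print(f"Female participation: {overall:.1%} (target {TARGET:.0%})")

# Per-program breakdown to catch courses that lag the overall rate.
by_program = enrollments.groupby("program")["gender"].apply(
    lambda g: (g == "female").mean()
)
print(by_program.round(3))
```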

2. Accessibility Compliance

| Metric | Baseline | Target | Measurement Method |
| --- | --- | --- | --- |
| WCAG 2.1 AA compliance | Audit required | 100% | Third-party accessibility audit |
| Assistive technology compatibility | Not tested | 100% | User testing with AT users |
| Student disability participation rate | District average | Equal or higher | Enrollment comparison |
| Accommodation request fulfillment | Not tracked | 100% | Request tracking system |

Measurement Instruments:

  • Professional accessibility audit using WAVE, axe, and manual testing
  • User testing sessions with students using assistive technology
  • Comparison of disability participation rates to district demographics
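
Automated scanners catch only a subset of WCAG issues, but they make recurring compliance checks cheap to repeat between audits. A minimal sketch, assuming the axe-core engine via the axe-selenium-python wrapper and a local Firefox WebDriver; the URL is a placeholder:

```python
# Sketch of an automated WCAG 2.1 AA spot check using axe-core.
# Supplements, never replaces, the manual and user testing above.
from selenium import webdriver
from axe_selenium_python import Axe

driver = webdriver.Firefox()
driver.get("https://example.org/ai-academy")  # placeholder URL
axe = Axe(driver)
axe.inject()          # inject the axe-core script into the page
results = axe.run()   # run the full axe rule set
driver.quit()

# Keep only violations tagged with WCAG 2.x A/AA rules.
wcag_aa = [v for v in results["violations"]
           if any(t in ("wcag2a", "wcag2aa", "wcag21aa") for t in v["tags"])]
for v in wcag_aa:
    print(f'{v["id"]}: {v["help"]} ({len(v["nodes"])} affected nodes)')
print("No automated-rule violations" if not wcag_aa
      else f"{len(wcag_aa)} rule violations to remediate")
```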

3. Engagement Metrics

| Metric | Baseline | Target | Measurement Method |
| --- | --- | --- | --- |
| Overall engagement score | New program | 90%+ positive | Student engagement survey |
| Underrepresented learner engagement | New program | Equal to peers | Disaggregated survey data |
| Voluntary participation in optional programming | N/A | 40%+ of eligible | Participation tracking |
| Return rate for multi-session programs | N/A | 80%+ | Attendance records |

Measurement Instruments:

  • Validated student engagement survey (PERTS Engagement Survey)
  • Disaggregated analysis by gender, disability status, ELL status
  • Attendance and participation tracking
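
The "equal to peers" target implies a statistical comparison, not just side-by-side percentages. A minimal sketch of one such check, assuming hypothetical survey export columns; Welch's t-test and the 0.05 threshold are illustrative choices, not prescribed by the plan:

```python
# Sketch of the "engagement equity" check: compare survey scores of one
# subgroup against peers. File and column names are hypothetical.
import pandas as pd
from scipy import stats

survey = pd.read_csv("engagement_survey.csv")

girls = survey.loc[survey["gender"] == "female", "engagement_score"]
boys = survey.loc[survey["gender"] == "male", "engagement_score"]

# Welch's t-test tolerates unequal group sizes and variances.
t, p = stats.ttest_ind(girls, boys, equal_var=False)
print(f"Mean (female) = {girls.mean():.2f}, mean (male) = {boys.mean():.2f}")
print(f"Welch t = {t:.2f}, p = {p:.3f}")
print("No significant gap" if p >= 0.05 else "Significant gap: investigate")
```

The same comparison would be repeated for disability status and ELL status, per the disaggregation plan below.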

4. Learning Outcomes

| Metric | Baseline | Target | Measurement Method |
| --- | --- | --- | --- |
| AI concept mastery (by grade band) | Pre-assessment | 80%+ proficiency | Standards-aligned assessments |
| Ethical reasoning development | Pre-assessment | Significant growth | Ethical reasoning rubric |
| Portfolio completion (9-12) | N/A | 90%+ | Portfolio submission tracking |
| Peer collaboration quality | N/A | 80%+ positive | Collaboration rubric |

Measurement Instruments:

  • Standards-aligned assessments mapped to curriculum learning objectives
  • Ethical reasoning rubric adapted from AAC&U VALUE rubrics
  • Portfolio evaluation using curriculum-aligned rubric
  • Peer collaboration rubric with student self-assessment
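
"Significant growth" on the pre/post rubric is naturally tested with a paired comparison. A minimal sketch, assuming a hypothetical file with one row per student and pre/post rubric scores:

```python
# Sketch of the "significant growth" test for ethical reasoning:
# paired t-test plus an effect size on pre/post rubric scores.
# File and column names are hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("ethical_reasoning_scores.csv")  # one row per student

diff = scores["post_score"] - scores["pre_score"]
t, p = stats.ttest_rel(scores["post_score"], scores["pre_score"])
d = diff.mean() / diff.std(ddof=1)  # Cohen's d for paired data

print(f"Mean growth = {diff.mean():.2f} rubric points")
print(f"Paired t = {t:.2f}, p = {p:.3f}, effect size d = {d:.2f}")
```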

Secondary Outcomes

Teacher Capacity

| Metric | Target | Measurement Method |
| --- | --- | --- |
| Teachers completing PD | 80%+ of AI-teaching staff | Training records |
| Teacher confidence in AI instruction | 80%+ confident | Teacher self-efficacy survey |
| Teacher use of accessibility features | 100% | Classroom observation |

Community Impact

| Metric | Target | Measurement Method |
| --- | --- | --- |
| Parent awareness of AI curriculum | 70%+ | Parent survey |
| Community partner engagement | 5+ active partners | Partnership tracking |
| Media/visibility for equity approach | 3+ features | Media tracking |

Disaggregated Analysis Plan

All primary outcomes will be analyzed with disaggregation by:

  • Gender: Male, female, non-binary/other
  • Disability status: IEP, 504, no formal plan
  • English learner status: ELL, RFEP, English-only
  • Race/ethnicity: All NCES categories
  • Socioeconomic status: Free/reduced lunch eligibility

This disaggregation ensures that any subgroup not benefiting equitably is identified early, so targeted interventions can be implemented. A sketch of the analysis appears below.
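
A minimal sketch of that disaggregation loop, assuming a hypothetical outcomes export; the small-cell suppression threshold is an illustrative privacy safeguard, not a figure from the plan:

```python
# Sketch of the disaggregated analysis: compute an outcome metric for
# every subgroup along each dimension listed above.
# File and column names are hypothetical.
import pandas as pd

MIN_CELL = 10  # illustrative: suppress subgroups too small to report safely
DIMENSIONS = ["gender", "disability_status", "el_status",
              "race_ethnicity", "frl_eligible"]

data = pd.read_csv("outcomes.csv")  # one row per student, "proficient" is 0/1

for dim in DIMENSIONS:
    grouped = data.groupby(dim)["proficient"].agg(["mean", "count"])
    # Suppress rates for cells below the reporting threshold.
    grouped.loc[grouped["count"] < MIN_CELL, "mean"] = float("nan")
    print(f"\n{dim}:")
    print(grouped.rename(columns={"mean": "proficiency_rate"}).round(3))
```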

Data Collection Timeline

| Quarter | Data Collection Activities |
| --- | --- |
| Q1 | Baseline surveys, pre-assessments, enrollment demographics |
| Q2 | Mid-point engagement surveys, formative assessments |
| Q3 | Participation tracking, interim outcome analysis |
| Q4 | Post-assessments, final surveys, accessibility audit, portfolio review |

Reporting Plan

Internal Reporting

  • Monthly: Participation dashboards with demographic breakdowns
  • Quarterly: Outcome progress reports to project leadership
  • Ongoing: Real-time accessibility compliance monitoring
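
A minimal sketch of how the monthly dashboard breakdown could be generated, assuming a hypothetical attendance log; the real dashboard would repeat this pivot for each demographic dimension:

```python
# Sketch of the monthly participation dashboard: unique students per
# month, broken down by one demographic column. Names are hypothetical.
import pandas as pd

attendance = pd.read_csv("attendance_log.csv", parse_dates=["date"])
attendance["month"] = attendance["date"].dt.to_period("M")

dashboard = attendance.pivot_table(
    index="month", columns="gender",
    values="student_id", aggfunc="nunique",  # unique students per cell
)
print(dashboard)
```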

External Reporting

  • Stanford Reports: Quarterly progress reports aligned with grant requirements
  • Public Transparency: Annual equity impact report published publicly
  • Research Dissemination: Peer-reviewed publication of findings

Continuous Improvement

Data collection is not merely for accountability; it drives ongoing curriculum refinement. Quarterly data reviews inform mid-course adjustments to maximize equity impact.

Success Criteria

The project will be considered successful if:

  1. Gender parity achieved: 50% female participation in AI programming
  2. Full accessibility: 100% WCAG 2.1 AA compliance verified by external audit
  3. Engagement equity: No significant engagement gap between underrepresented learners and peers
  4. Learning demonstrated: 80%+ of students demonstrate proficiency in grade-level AI competencies
  5. Sustainability established: Curriculum and professional development institutionalized for continuation beyond grant period