Description: An AI system developed by Infinite Campus and deployed by Nevada to identify at-risk students sharply reduced the number of students classified as needing support, from roughly 270,000 to about 65,000. The reclassification caused significant budget cuts in schools serving low-income populations, and the drastic drop in identified at-risk students reportedly left thousands of vulnerable children without resources and support.
Editor Notes: Timeline notes and clarification: Before 2023, Nevada identified at-risk students primarily by income, using free or reduced-price lunch eligibility as the key measure. In 2022, this approach classified more than 270,000 students as at-risk. Seeking to improve the process, Nevada partnered with Infinite Campus in 2023 to introduce an AI system that weighed additional factors such as GPA, attendance, household structure, and home language, with the aim of better predicting which students were likely to struggle in school. During the 2023-2024 school year, however, the AI reduced the count of at-risk students to fewer than 65,000. The reclassification led to budget cuts in schools that depended on funding tied to at-risk counts, especially those serving low-income populations. By October 2024, the problem had gained national attention.
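The cited reporting does not describe the model's internals. Purely as a hypothetical sketch, the Python below shows how a composite, threshold-based score over factors like the ones named above (GPA, attendance, household structure, home language) can flag far fewer students than a single income-based rule, since a student must now look at-risk on several dimensions at once rather than just one. The `Student` fields, weights, and threshold are illustrative assumptions, not details from Infinite Campus or the Nevada Department of Education.

```python
# Illustrative only: contrast an income-only rule with a hypothetical
# multi-factor, threshold-based risk score.
from dataclasses import dataclass


@dataclass
class Student:
    frl_eligible: bool       # free or reduced-price lunch eligibility (old criterion)
    gpa: float               # 0.0 to 4.0
    attendance_rate: float   # 0.0 to 1.0
    single_guardian: bool    # stand-in for "household structure"
    home_language_english: bool


def at_risk_old(s: Student) -> bool:
    # Pre-2023-style rule: the income proxy alone decides the classification.
    return s.frl_eligible


def at_risk_new(s: Student, threshold: float = 0.6) -> bool:
    # Hypothetical weighted score; the real model's features, weights, and
    # threshold are not public in the cited reporting.
    score = (
        0.3 * (1.0 if s.gpa < 2.0 else 0.0)
        + 0.3 * (1.0 if s.attendance_rate < 0.9 else 0.0)
        + 0.2 * (1.0 if s.single_guardian else 0.0)
        + 0.2 * (0.0 if s.home_language_english else 1.0)
    )
    return score >= threshold


if __name__ == "__main__":
    s = Student(frl_eligible=True, gpa=2.8, attendance_rate=0.95,
                single_guardian=False, home_language_english=True)
    print(at_risk_old(s))  # True: flagged under the income-only rule
    print(at_risk_new(s))  # False: no longer flagged by the composite score
```

Under a scheme like this, students who are low-income but otherwise unremarkable on the other signals fall out of the at-risk count, which is one plausible way a composite criterion could shrink the flagged population from hundreds of thousands to tens of thousands.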
Entities
Alleged: an AI system developed by Infinite Campus and deployed by Nevada Department of Education harmed Low-income students in Nevada, Nevada school districts, Mater Academy of Nevada, and Somerset Academy.
Incident Statistics
ID
808
Report Count
1
Incident Date
2024-10-11
Editors
Daniel Atherton
Incident Reports
Reports Timeline
nytimes.com · 2024
- View the original report at its source
- View the report at the Internet Archive
Nevada has long had the most lopsided school funding in the country. Low-income districts there have nearly 35 percent less money to spend per pupil than wealthier ones do, the largest gap of any state.
A year ago, Nevada s…
Variants
Una "Variante" es un incidente que comparte los mismos factores causales, produce daños similares e involucra los mismos sistemas inteligentes que un incidente de IA conocido. En lugar de indexar las variantes como incidentes completamente separados, enumeramos las variaciones de los incidentes bajo el primer incidente similar enviado a la base de datos. A diferencia de otros tipos de envío a la base de datos de incidentes, no se requiere que las variantes tengan informes como evidencia externa a la base de datos de incidentes. Obtenga más información del trabajo de investigación.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents
Machine Bias - ProPublica
· 15 informes
Analyzing Released NYC Value-Added Data Part 2
· 7 informes
Policing the Future
· 17 informes