For years, parents were told that Social and Emotional Learning (SEL) was simply about helping children manage emotions, build empathy, and succeed in relationships. More recently, they have been assured that artificial intelligence (AI) can help schools personalize learning, identify students at risk, and support student mental health. What is far less discussed is how these systems function in practice—and the kind of data infrastructure they require once they are embedded in classrooms at scale.
Across the country, schools are increasingly digitizing, analyzing, and retaining behavioral and emotional data about students. This information is often described as “soft skills assessment.” Functionally, however, it represents a form of continuous behavioral monitoring, enabled by AI and normalized through education policy. What follows is not speculation; it is an analysis of design incentives and documented frameworks that already exist.
Unlike academic subjects, “soft skills” such as emotional regulation, resilience, adaptability, and collaboration cannot be measured through traditional testing. To make these traits usable for policy decisions, institutions have sought to standardize and quantify them. A leading example is the Organisation for Economic Co-operation and Development’s Survey on Social and Emotional Skills (SSES), which measures students’ social and emotional traits through self-reports and collects contextual data from parents, teachers, and school leaders. The OECD explicitly frames these traits as measurable, comparable, and policy-relevant outcomes designed to inform education systems across countries, as described in its official overview of the SSES.
The OECD further links social and emotional skills to long-term life outcomes and labor-market success, situating them within a broader human capital framework that connects education data to workforce readiness and economic planning.
Once traits are defined this way, they become comparable across populations, trackable over time, subject to intervention targets, and useful for workforce and economic planning. At that stage, they are no longer just personal qualities; they are system-level data points. This raises an unresolved question for U.S. parents: when emotional and behavioral traits are treated as economic metrics, do they remain governed by federal student privacy laws such as the Family Educational Rights and Privacy Act (FERPA), the Protection of Pupil Rights Amendment (PPRA), and the Children's Online Privacy Protection Act (COPPA), and if so, how?
Early SEL efforts relied on surveys administered once or twice a year. Artificial intelligence fundamentally changes that model. AI-enabled systems allow schools and vendors to analyze student writing for sentiment or tone, speech for emotional indicators, online behavior for “risk signals,” and patterns that emerge across months or years rather than isolated incidents.
The global policy community has acknowledged the risks inherent in this shift. In its Guidance on Generative AI in Education and Research, the United Nations Educational, Scientific, and Cultural Organization (UNESCO) explicitly warns that education is a high-risk environment for AI deployment, given that children are a vulnerable population and educational data is particularly sensitive. UNESCO emphasizes the need for human oversight, transparency, and strict data governance when AI systems are used in learning environments, cautioning that educational AI can easily move beyond instructional support into monitoring and profiling.
At the same time, many school districts have adopted AI-assisted student monitoring tools, often justified as measures to prevent self-harm, violence, or bullying. These systems scan student activity on school-issued devices and accounts, flagging keywords or behavioral patterns.
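To make the mechanism concrete, here is a minimal sketch of the simplest form of such monitoring: literal keyword matching with cumulative risk scores. Everything in it, including the keyword list, the weights, the threshold, and the function names, is a hypothetical illustration for discussion, not the logic of any actual vendor product, and real systems layer statistical models on top of heuristics like these.

```python
# Hypothetical sketch of keyword-based flagging. All keywords, weights,
# and thresholds are illustrative, not drawn from any real product.
import re
from dataclasses import dataclass

# Hypothetical keyword-to-risk-weight table.
RISK_KEYWORDS = {
    "hurt myself": 5,
    "hate this school": 2,
    "fight": 1,
}

@dataclass
class Flag:
    keyword: str
    weight: int
    excerpt: str  # surrounding text retained for the reviewer/record


def scan_text(text: str, threshold: int = 3) -> tuple[int, list[Flag]]:
    """Return a cumulative risk score and the matches that produced it."""
    flags: list[Flag] = []
    lowered = text.lower()
    for keyword, weight in RISK_KEYWORDS.items():
        for match in re.finditer(re.escape(keyword), lowered):
            start = max(0, match.start() - 20)
            excerpt = text[start:match.end() + 20]
            flags.append(Flag(keyword, weight, excerpt))
    score = sum(f.weight for f in flags)
    return score, flags


score, flags = scan_text("I hate this school and want to hurt myself")
print(score, [f.keyword for f in flags])  # 7 ['hurt myself', 'hate this school']
```

Even this toy version makes the structural point: matching is literal, context-blind, and archival by design, since each flag stores an excerpt of the student's words. A sarcastic remark and a genuine cry for help score identically, which is why false positives, and what happens to them once logged, are central to the concerns discussed below.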
Investigative reporting by the Associated Press and The Seattle Times has shown that parents are often unaware of the scope of this monitoring, that opt-out options are limited or nonexistent, and that there is little independent evidence these systems reliably prevent harm. That reporting also documented significant privacy and data-security risks.
Civil liberties organizations such as the Electronic Frontier Foundation warn that constant monitoring can chill student speech, normalize surveillance in educational settings, and generate false positives that become part of a student’s permanent record.
The concern is not the goal of safety. The concern is infrastructure. Once systems exist to monitor language and expression, emotional tone, behavioral deviations, and inferred internal states, the line between support and surveillance becomes increasingly thin.
The most significant shift, however, is cultural. Students are being conditioned to accept that emotions are data, behavior is continuously observable, self-regulation is externally evaluated, and inner life can be logged and archived. This represents a sharp departure from traditional education, which evaluated academic work and observable conduct rather than ongoing emotional or psychological profiles. As these systems become normalized, it becomes easier to justify expanding their use, particularly when framed as personalization, efficiency, or workforce readiness.
Modern digital identity frameworks are widely defined by governments and international bodies as systems built on persistent identifiers, interoperable data systems, portable credentials, and longitudinal records. Digital identity today is no longer limited to passports or logins; it increasingly refers to a composite digital profile that aggregates credentials, attributes, and performance indicators over time. Similar data architectures already underpin predictive policing systems, behavioral health prevention models, and high-stakes policy determinations in other sectors.
A reasoned concern (opinion): From a functional standpoint, when schools digitize behavioral and emotional traits, store them longitudinally, link them to unique student identifiers, and integrate them across platforms, they are building identity-linked behavioral records regardless of the terminology used. This concern is amplified by the OECD’s explicit linkage of social and emotional skills to employability and workforce outcomes.
Portability of skills data, from school to training to employment, requires identity infrastructure by definition. From this perspective, AI-powered soft-skills assessments may not be labeled “digital ID,” but they generate the persistent, identity-linked data that such systems depend on. This is not an accusation of intent. It is an observation about design, trajectory, and momentum.
Why does this matter to families? Academic grades fade. Behavioral profiles persist. Once collected, AI-derived soft-skills data can influence disciplinary or counseling decisions, shape educational pathways, follow students across schools or programs, and feed future credentialing or workforce systems. Parents are often assured these tools exist “to help,” yet access to records, correction processes, opt-out options, and deletion policies are frequently unclear or unavailable. Children cannot meaningfully consent to lifelong data collection about their emotional and behavioral development.
Transparency, therefore, is the minimum safeguard. Families deserve clear answers about what data is collected, how it is collected, how it is analyzed, who has access, how long it is retained, how it is stored, whether it can be corrected or deleted, whether informed parental consent is obtained, and whether families can opt out without penalty. Absent these safeguards, AI-powered soft-skills assessment risks becoming behavioral governance by algorithm: quiet, normalized, and difficult to challenge.
The core problem is the transformation of a child’s inner life into permanent digital artifacts created by limited, mechanical systems the child cannot see, understand, or refuse. What is being built in schools today is not merely assessment infrastructure; it is behavioral data infrastructure, and once such systems exist, they rarely remain limited to their original purpose. Parents and policymakers should ask now, before this architecture becomes ubiquitous and intractable, whether schools remain places of education guided by families and communities, or data-collection nodes feeding lifelong identity systems. The answer will shape the future boundaries of privacy, freedom, childhood, and human dignity.