It’s common sense that we need to protect sensitive student data like social security numbers, but we must also be cautious about other personal information that is more common in K-12 schools.

School districts are much more likely to interact with IEP data, student IDs, student addresses, and health information. That brings us to a surprisingly difficult question: What exactly counts as “sensitive” data?
It sounds like a simple question, but if you ask three different people in your district, you’ll probably get three different answers.
- Is a student ID number sensitive?
- What about email addresses?
- A detention notice?
- What about a lunch balance?
- A bus route?
As IT admins, our primary responsibility is to protect sensitive student data by preventing school constituents (teachers, administrators, and students) from mishandling information within our systems, whether that is a spreadsheet of special education designations accidentally shared with the entire faculty or a disciplinary report emailed to the wrong parent.
Three-tier risk classification system
If we treat everything as sensitive, our systems become unusable. If we treat nothing as sensitive, we’re non-compliant. We need a procedure that allows us to protect sensitive student data while finding the sweet spot between security and usability.
Here are three categories to help you get started:
🔴 High Risk Data: This category includes data that, if shared, would be classified as a severe FERPA violation.
- Social Security Numbers
- Name + DOB
- Name + email address
- Disciplinary records
- Special education records
Key data points: student ID, student email address, SSN
Keywords: “Suspension,” “Expulsion,” “Detention,” “In-school suspension,” “Out-of-school suspension,” “Incident Report,” “Physical aggression,” “Referral,” “Least Restrictive Environment (LRE),” “Free Appropriate Public Education (FAPE),” “Extended School Year (ESY),” “Behavior Intervention Plan.”
🔔 DLP Action: Block external sharing completely.
🟡 Moderate Risk Data: The data in this category becomes more problematic as the number of records increases. Sharing a single record with a parent is okay; sharing a list of 100 records with an external recipient is not.
- Grades & transcript
- Student homework
- Academic records
- Directory information
🔔 DLP Action: Warning on external sharing
🟢 Low Risk Data: This information is “public” in nature and includes things that are commonly found on the school website. These records can be shared without concern.
- Bell Schedule
- Course syllabus
- Supply lists
🔔 DLP Action: No restrictions
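To make the three tiers concrete, here is a minimal sketch of the classification logic in Python. The patterns and keyword lists are illustrative samples drawn from the tiers above, not a complete policy; tune them to your district's real data before relying on the results.

```python
import re

# Illustrative patterns only -- expand these to match your district's policy.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
HIGH_RISK_KEYWORDS = {
    "suspension", "expulsion", "detention", "incident report",
    "behavior intervention plan", "least restrictive environment",
}
MODERATE_RISK_KEYWORDS = {"transcript", "grade", "homework", "directory information"}

def classify(text: str) -> str:
    """Return 'high', 'moderate', or 'low' risk for a document's text."""
    lowered = text.lower()
    if SSN_RE.search(text) or any(k in lowered for k in HIGH_RISK_KEYWORDS):
        return "high"
    if any(k in lowered for k in MODERATE_RISK_KEYWORDS):
        return "moderate"
    return "low"

print(classify("Incident report: student received a suspension"))  # high
print(classify("Q3 transcript attached"))                          # moderate
print(classify("Bell schedule for spring semester"))               # low
```

A sketch like this is also a handy way to spot-check a folder of sample files before you commit to DLP actions: if too many everyday documents classify as high risk, your keyword list is too broad.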
Create a custom data detector to protect sensitive student data
Every district has a unique data fingerprint. You will likely need to create custom detection rules for the structured data used within your district:
- Student IDs
- IEP documentation
- Email addresses
- Disciplinary records
Once you have a custom detection rule, you can enable actions that will prevent or manage access to this sensitive information. Fortunately, the “security” section of the admin console includes policies for classifying sensitive data. Let’s use this section to create a custom data detection rule.
Note: the admin console has pre-configured DLP rules for social security numbers, credit cards, etc. We’re focusing on creating custom rules that are unique to each district.
Here are the steps to generate a custom detector for your domain:
- Navigate to Security > Access & Data Control > Data protection > Detectors
- Click “add detector” > Regular expression
- Paste in your regular expression and test it for accuracy.
💡TIP: Creating an accurate regular expression takes years of experience…or you can use Google Gemini to write a custom regular expression for you:
“Please examine the 10 sample student ID numbers I have provided and write a RegEx formula that I can use to create a data protection detector in the Google Admin Console. Please do your best to write the RegEx formula in a way that will minimize false positives. Please provide an explanation of the formula in easy to understand language.”
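Whatever produces your pattern, it is worth sanity-checking it locally before pasting it into the Admin Console. As a hedged example, suppose your district's student IDs follow a hypothetical format of two uppercase letters (a campus code) followed by six digits; a reasonable regex and a quick Python check might look like this:

```python
import re

# Hypothetical ID format: two uppercase letters + six digits, e.g. "NW204857".
# The \b anchors keep plain 6-digit numbers from matching on their own.
STUDENT_ID_RE = re.compile(r"\b[A-Z]{2}\d{6}\b")

samples = ["NW204857", "SE991042", "204857", "NW20485", "nw204857"]
for s in samples:
    # fullmatch: the whole sample must be an ID, not just contain one
    print(s, bool(STUDENT_ID_RE.fullmatch(s)))
```

Running your real sample IDs (plus some near-misses like phone numbers or invoice numbers) through a test like this is the fastest way to catch false positives before the rule goes live.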

Related Post: Protecting students from phishing attacks
Enable DLP Alerts
Now that you have defined a data detector, you are ready to create a DLP rule that will trigger whenever that detector finds a match.
- Visit the “rules” portion of the admin console
- Click on Create Rule > Data protection
- Give your rule a name (e.g. Student ID detected) and description and select the OU or group that will be impacted by this rule.
- Select the services that are impacted (I recommend Gmail and Drive)
- Configure your conditions:
  - Content type to scan: All content
  - What to scan for: Matches regular expression
  - Minimum times pattern is detected: 2 (this will ignore single instances of an ID)
  - Click the regular expression name and select the data detector you created
- Select Action: this is where you can block emails from being sent, restrict sharing to internal only, etc. Choose the option that works best for your district
- Alert: this step is optional, but if you want to be notified each time a rule is triggered, you can send the alert to the alert center (recommended) or send an email to a designated contact (not recommended).
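The "minimum times pattern is detected" threshold in the conditions above is what keeps a routine one-off mention (a teacher emailing a parent about one student) from tripping the rule, while still catching bulk exposure like a roster. A minimal sketch of that counting behavior, reusing the hypothetical two-letter-plus-six-digit ID format from earlier:

```python
import re

STUDENT_ID_RE = re.compile(r"\b[A-Z]{2}\d{6}\b")  # hypothetical ID format
MIN_MATCHES = 2  # mirrors "Minimum times pattern is detected: 2"

def rule_triggers(text: str) -> bool:
    """True only when the detector matches at least MIN_MATCHES times."""
    return len(STUDENT_ID_RE.findall(text)) >= MIN_MATCHES

print(rule_triggers("Please update NW204857's schedule."))         # False
print(rule_triggers("Roster: NW204857, SE991042, and NW113355."))  # True
```

If you find legitimate single-student messages still being flagged, raising this threshold is usually a gentler fix than loosening the regex itself.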

I recommend that you monitor a rule for a period of time before you enforce an action. This will give you insight into how often a rule is triggered and an opportunity to adjust your data detection policy if the rule is being triggered too frequently.
We know we need to protect our data from external threats, but sometimes the biggest threats can come from within. Data detection rules can help protect your most sensitive data from accidental exposure and fulfill your duty to implement “reasonable methods” to protect student data.
Learn more about this topic:
- FAQ – Cloud Computing (Student Privacy Commission)
- Deep Research: School Data Privacy and Google Workspace
- Create custom content detectors (Google Workspace help)
- Understanding Data Loss Prevention (DLP)