“I’ve been using my personal account with students. That’s okay, right?”
What was not allowed before has become commonplace as GenAI sweeps into classrooms and schools. With data breaches occurring weekly in US schools, it’s enough to make your stomach drop. And, as a school administrator, this scenario has become all too familiar. What do you say to teachers embracing GenAI for student use without the traditional restraints? In today’s schools, dark shadows are creeping into hallways and learning spaces, unknowable by design and deeply concerning. Are you ready for shadow AI?
Did You Know?
Transform your teaching practice with TCEA’s AI-Amplified Educator program—a comprehensive journey that guides you through the Five Stages of AI Adoption, from initial discovery to sustainable district-wide implementation. Build essential skills like RTCF prompt engineering, SHINE framework evaluation, and custom AI assistant creation while learning to integrate AI strategically rather than adopting tools at random. Join thousands of educators who are moving beyond “shiny object syndrome” to become confident AI leaders in their schools and districts. Register now.
Why Shadow AI is Scary
You’ve probably noticed that the same scenarios pop up in horror movies. Scary stuff is scary because of:
The lack of transparency
Unknowns about new players
A lack of trust
Trust is a social lubricant, integral to teaching, learning, and leading. One definition I heard is that “Trust is the residue of promises kept.” If that’s the case, there has to be a track record of transparency, openness, and full knowledge of how these tools are used and whether they actually improve learning. New GenAI solutions being forced into classrooms lack this residue of promises kept. Trust has to be established at every level. Until that happens, shadow AI will haunt our use of GenAI in our classrooms.
The Shadow AI Phenomenon
“What GenAI tools are you using in your classroom?” I asked a group of teachers in an affluent school district. For them, a more apt question might have been, “What GenAI tools are you NOT using?” In their efforts to learn more about GenAI, they had tried every AI tool they could get their hands on, asking, “Which GenAI tool works best for me?” In that process of exploration, they threw every problem and every potential use case they could at the AI tools. They got their answers, but free GenAI tools got their data. Was it a fair trade? Potentially sensitive student, staff, and organization data for GenAI benefits?
Shadow IT has evolved into shadow AI in schools. Many educators nationwide rely on AI tools, unaware of the compliance implications. Why they remain unaware varies from place to place.
“Our non-negotiable commitment to data privacy and safety.”
Shadow AI Scenarios
Here are some examples of shadow AI scenarios:
Mrs. Johnson discovers ChatGPT provides instant essay feedback. She creates a personal account and inputs student work. The feedback works perfectly, but by relying on free ChatGPT, she has introduced student data into the pool of information ChatGPT can draw on.
Mr. Rodriguez finds Gemini Pro helps generate differentiated math problems. He uploads his class roster to create personalized worksheets, overlooking privacy considerations.
The special education team utilizes Perplexity to draft IEP goals. They input sensitive student information, focused on improving services rather than potential FERPA violations.
These educators aren’t trying to break the rules. Rather, they are working around old roadblocks created by time constraints, short deadlines, and a multitude of tasks. But those detours, however much they ease teachers’ work, could expose districts to legal and financial repercussions.
Special Promotion: October 22 – December 31, 2025. For $20 per person, get your team up to speed on AI Tools for Educators (earn 12 CPE hours, get a digital badge and certificate). Ready to get started? Apply the code ATE25COURSE at purchase. Share with your friends.
Do schools have deep pockets like Big Tech to deal with the consequences of data breaches?
“Instead of asking ‘Is it new?’, we ask ‘Is it wise?’”
The AI Risk Spectrum
Here is one way to categorize your AI uses by risk level.
Green Light: Low-Risk Applications
These AI uses typically pass compliance reviews with minimal concerns and represent a “safe zone” where innovation can flourish without significant compliance issues.
Examples:
– Developing lesson plans without student information
– Creating general educational content
– Supporting professional development
– Automating administrative tasks using anonymized data
Yellow Light: Proceed with Caution
These applications require thoughtful implementation and proper safeguards, such as appropriate data protection, anonymized queries, and secure systems.
Examples:
– Content generation tools with appropriate data protection
– Research assistants with anonymized queries
– Professional development platforms with secure login systems
– Administrative automation with proper data governance
Red Light: High-Risk Territory
These applications demand rigorous scrutiny and comprehensive agreements, as they often involve direct handling of sensitive student data or create permanent records.
Examples:
– Writing feedback systems storing student submissions
– Grade book assistants creating permanent academic records
– Assessment tools influencing student pathways
– Special education generators processing IEP/504 information
– Direct student use of consumer AI tools
– Uploading any student work for analysis
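The “anonymized queries” safeguard in the yellow-light tier can be sketched in a few lines of code. This is a minimal illustration only, assuming a hypothetical class roster; the names and the helper function are invented for the example, not part of any real tool:

```python
import re

# Hypothetical roster mapping real names to neutral placeholders.
# Every name here is invented for the example.
ROSTER = ["Jordan Smith", "Maria Lopez"]

def anonymize(text: str, roster: list[str]) -> str:
    """Replace each known student name with a stable placeholder
    before any text leaves the district network."""
    for i, name in enumerate(roster, start=1):
        text = re.sub(re.escape(name), f"Student {i}", text, flags=re.IGNORECASE)
    return text

prompt = "Give feedback on Maria Lopez's essay about the water cycle."
print(anonymize(prompt, ROSTER))
# Give feedback on Student 2's essay about the water cycle.
```

A real deployment would need to handle nicknames, partial names, and other identifiers, but even a simple scrub like this keeps roster names out of a free GenAI tool’s hands.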
What category does your use of GenAI in schools fall into? If you’re honest, you may have examples in every category. Consider putting in place a four-part framework to evaluate potential tools:
First, the Problem Test: what is this actually for?
Second, the Student Test: does this deepen learning or is it just a gimmick?
Third, the Sustainability Test: can a teacher realistically use this without burning out?
And finally, the most important one for us, the Alignment Test: does this fit our unique mission and culture? If a tool can’t pass these four tests, we simply say ‘no, not for us.’
“Good pedagogy first. Technology second.”
Your Comprehensive Compliance Framework
Before integrating any AI tool into your educational ecosystem, it must navigate several critical checkpoints. Consider this your roadmap through the compliance landscape.
There are several steps your steering committee needs to follow. This isn’t true only for GenAI adoption, but for any technology or big purchase. You have to take these first steps. Below, please find a brief outline. In future blog entries, I will explore the details of each of the following:
Build a comprehensive AI implementation strategy. Anytime something is new, you want to include stakeholders, build policies, get professional development, allow for safe experimentation, and involve the community. Sure, it’s a lot and there are more specific suggestions, but those are each a blog entry of their own. Or, reach out to TCEA for AI in Education Journey support.
Establish an implementation roadmap. This will involve establishing a baseline assessment, engaging in strategic communications, and planning a phased implementation.
Establish a new culture. This can be the hardest thing to do, and top leadership is essential. But so are veteran teachers who serve as influencers on the campus.
“Wisdom over overwhelm. Depth over breadth. Humanity over hype.”
Transforming Shadow AI into Strategic Innovation
Educators embracing GenAI are trying to find a way forward. As education leaders, our responsibility is to create structured pathways for responsible innovation. At the same time, we all play a part in safeguarding students and schools amidst an onslaught of hype and hallucinations. Take deliberate, structured steps when launching GenAI tools.
Of course, if you need something a little more comprehensive, the TCEA SHINE Framework may work better for you. The shadow AI in your schools requires thoughtful implementation and management. Take a strategic approach to enhance educational outcomes and protect those we care about.
In today’s digital-first classrooms, protecting student data has become a cornerstone of responsible education. From online learning platforms to digital gradebooks, schools are collecting more information than ever before. With this increased data collection comes a heightened responsibility to ensure that student information is handled with care, confidentiality, and compliance.
At the heart of student data privacy in the United States is the Family Educational Rights and Privacy Act, better known as FERPA. Enacted in 1974, FERPA is a federal law designed to protect the privacy of student education records. It grants parents, and students once they turn 18, the right to access their records, request corrections, and control who else can see their information. FERPA applies not only to schools but also to any third-party vendors that handle student data on a school’s behalf.
Understanding what FERPA protects is essential. The law covers a wide range of information, including personally identifiable information (PII) such as names, addresses, and student ID numbers, as well as academic records like grades, transcripts, and disciplinary reports. Even metadata collected by educational apps can fall under FERPA if it can be linked to a specific student.
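One practical way to keep PII like student ID numbers out of third-party tools, while preserving the ability to link records for analytics, is pseudonymization with a keyed hash. Here is a minimal sketch under stated assumptions: the district secret, ID format, and function are invented for illustration, not a prescribed compliance mechanism:

```python
import hashlib
import hmac

# Hypothetical district secret. In practice, store this in a secrets
# manager, never in source code or in any third-party tool.
DISTRICT_KEY = b"replace-with-a-real-secret"

def pseudonymize(student_id: str) -> str:
    """Derive a stable pseudonym from a student ID with a keyed hash.
    The same ID always yields the same token, so records still link
    up for analysis, but without the key the token cannot be traced
    back to the student."""
    digest = hmac.new(DISTRICT_KEY, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

print(pseudonymize("S-1001"))  # same 12-character token on every run
```

Note that pseudonymized data can still fall under FERPA if it can be re-identified, so the key must be guarded as carefully as the records themselves.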
Despite its clear guidelines, FERPA violations still occur – often unintentionally. A teacher might accidentally email a student’s grades to the wrong parent, or a school might use a new app without verifying its data privacy practices. Sometimes, the issue is as simple as not having proper access controls in place, allowing staff who don’t need access to sensitive data to view it anyway. In many cases, the root cause is a lack of training or awareness.
So how can schools ensure they’re not only compliant with FERPA but also fostering a culture of data privacy?
One of the most effective strategies is to implement strong access controls. This means limiting who can view or edit student data based on their role within the school. Adding layers of security, such as two-factor authentication, can further reduce the risk of unauthorized access.
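In code, the role-based approach described above boils down to an explicit mapping from roles to permissions, with everything else denied by default. A minimal sketch follows; the roles and permission names are invented for illustration, not a real student information system’s configuration:

```python
# A minimal sketch of role-based access control for student records.
# Roles and permission names are illustrative only.
PERMISSIONS = {
    "teacher":   {"view_own_class"},
    "counselor": {"view_own_class", "view_iep"},
    "admin":     {"view_own_class", "view_iep", "edit_records"},
}

def can(role: str, action: str) -> bool:
    """Grant an action only if the role explicitly includes it;
    unknown roles and unlisted actions are denied by default."""
    return action in PERMISSIONS.get(role, set())

print(can("counselor", "view_iep"))    # True
print(can("teacher", "edit_records"))  # False
```

The key design choice is deny-by-default: a new role or a typo in a role name grants nothing until someone deliberately adds it to the mapping.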
Equally important is staff training. FERPA compliance isn’t just the responsibility of the IT department; it’s a shared duty across the entire school community. Regular training sessions can help teachers, administrators, and support staff understand how to handle student data appropriately and what to do if they suspect a breach.
Technology also plays a vital role in protecting student information. Schools should use secure platforms that offer encryption for both data storage and transmission. Regular security audits and assessments can help identify vulnerabilities before they become problems. And when working with third-party vendors, schools must ensure those companies are also FERPA-compliant. This includes signing data privacy agreements and reviewing the vendor’s terms of service. In addition to data privacy agreements, our friends at Common Sense Media have created a clearinghouse of reviewed privacy policies.
Another key aspect of FERPA compliance is transparency. Parents and students should be clearly informed of their rights, including how to access their records and request corrections. Schools should have a straightforward process in place for handling these requests.
Of course, even with the best precautions, data breaches can still happen. When they do, it’s critical to act quickly. Schools should have a response plan that includes identifying the scope of the breach, notifying affected parties, and taking steps to prevent future incidents. This might involve resetting passwords, patching software, or revisiting internal policies.
Final Thoughts
Ultimately, protecting student data is about more than just checking boxes – it’s about building trust. Parents entrust schools with their children’s most sensitive information, and it’s up to educators and administrators to honor that trust through thoughtful, proactive data stewardship. By understanding FERPA, investing in secure technologies, training staff, and engaging with families, schools can create a safe digital environment where students can learn and grow without compromising their privacy.
In our society, few folks actually take the time to read the terms of service before installing apps on their devices. This is true for students and adults alike. If we find an app that looks like it will meet a need, we typically install it without considering what we might be giving up. Here are five questions to ask before installing apps, and then we’ll look at several helpful resources to expand your understanding of the critical nature of data privacy.
Q1:
Does the app developer clearly state what data they collect and how they use that data on its website?
Q2:
Does the app developer share what permissions are granted to use the app BEFORE installing the app?
Q3:
Does the app identify a relatively recent date of its last update?
Q4:
Does the app developer state on their website whether they handle data in accordance with the European General Data Protection Regulation (GDPR) or the Children’s Online Privacy Protection Act (COPPA)?
Q5:
Does the app developer allow you to request that your personal data be removed?
Asking these five questions while searching for and researching apps will help you choose ones that are in alignment with your safety concerns for your students. If you don’t find the answer to any of these five questions on their app’s website, developer’s page, or privacy policy page, take the initiative to contact them and ask.
If you realize that these five questions reveal you need to brush up on data privacy concerns, here are some resources to help bring you up to speed. These resources make the biggest impact when you provide opportunities for discussion with peers, parents, and leaders in your school/district.
Data Quality Campaign
The Data Quality Campaign (DQC) is a national nonprofit policy and advocacy organization dedicated to ensuring that education data works for individuals, families, educators, communities, and policymakers. They provide many helpful resources that you might consider sharing with your peers, parents, and leaders in your district.
Encryption
Encryption is a common method for protecting sensitive student information. Although it is a standard security practice, privacy policies rarely mention it, and when they do, it is most often in connection with billing information. Without examining a service’s policy, there is no way to tell which data it encrypts. De-identification is another important aspect of student data privacy in apps. Almost half of the privacy policies reviewed mention de-identification as a primary reason for collecting data, and de-identified data is used almost exclusively for analyzing user behavior and reporting on student performance within districts.
If you are looking for more information or need help, go to Terms of Service; Didn’t Read. They may have reviewed the app already. If not, you can submit a request to help answer your question.
Parental Responsibility in Data Privacy
Protecting student privacy is a crucial parental responsibility. In addition to protecting their children’s privacy directly, parents must educate themselves about the technology their children are using. If parents allow their children to use technology without supervision, they could be agreeing to data tracking without realizing it. Parents should also check the privacy practices of the websites and apps their children use. By verifying that these services are transparent about their data collection practices, parents can be more confident that their children’s privacy is protected.
Parents who want to learn more about taking a proactive role in protecting children’s privacy can go to one of these sites for relevant help.
Connect Safely – A nonprofit dedicated to educating users of connected technology about safety, privacy, and security. You can easily explore information sorted by topics and access quick guides, news, and podcasts on their website.
Parent Coalition for Student Privacy – This coalition focuses on being advocates with parents in helping keep their child’s data safe. Check out their website for an interactive state student privacy report card, an educator toolkit, and a parent toolkit.
The Education Cooperative – Helpful information to support parents in understanding legal issues such as COPPA, PPRA, CIPA, and FERPA and how they relate to the safety of their child.
Teacher Responsibility
Teachers, too, need to educate themselves about the privacy practices of third-party developers. Connecticut has passed a law establishing a task force to study student data privacy on apps. The task force will explore whether local boards of education should adopt data contracting policies and will train employees on best practices. It will also develop a list of websites and software approved by the state and by school districts. By providing more information about the privacy practices of third-party providers, these efforts should help parents feel more comfortable with these platforms.
Each district is responsible for managing student data generated within the school environment and on school devices. The following points can help teachers know how they play a critical role in ensuring student safety. Ignorance of your responsibilities as an educator does not protect you. Seek out the information to be an informed and proactive educator and advocate for your students.
Contact your Technology Department and ask whether they provide training for educators on digital citizenship, student online safety, and best practices for using apps and digital devices in the classroom.
Contact your Professional Development Department (or HR) to find out what resources are available to help teachers better understand their role when using technology with students.
Contact your local educational service center (or state education association) for information on student data privacy they’ve already prepared and made available. Sometimes this is posted on their respective websites; sometimes, you will need to call and track down the person with the information.
For example, while most students are comfortable sharing their social-emotional learning survey results with a teacher, they may be less comfortable sharing those results with a school vendor or other third party. Regardless of the level of concern, a student’s privacy is one of the most important things parents, teachers, schools, and students themselves can protect.
When I first started working with video, I remember the joy of the old Flip camera. It was so easy to record a video and then save it to a computer. Prior to that, video recording was so difficult as to be impractical. But that didn’t stop enterprising educators from making student-created videos available. Now, video creation tools abound, but the question has become one of, “Now that we can record videos and share them, should we?” Student safety has become a critical consideration. A new tool promises to close the gap between student-created videos and student safety.
Did You Know?
TikTok has one billion monthly active users, has been downloaded over 2.6 billion times worldwide, and enjoys 100 million active users in the United States where 32.5% of users are 10 to 19 years old. See the chart below for other age groups (source: Wallroom Media).
Introducing Zigazoo
Zigazoo, available at no cost for both Android and iOS devices, encourages “short-form video” creations from children. Billed as the “TikTok for kids,” Zigazoo offers a variety of creative options and enables children to create video responses to educational content in a safe environment.
Hailed as “a smarter screen-time activity” by TechCrunch and lauded by children’s media advocacy organizations for its safety, Zigazoo enables kids to share video responses to challenges built by leading museums, zoos, educators, and media stars. (source: Zigazoo).
According to Protect Young Eyes, the Zigazoo creation process is straightforward and includes three types of activities:
Select learning prompts and then respond with videos up to thirty seconds in length
Share videos with friends (and only with friends you have approved)
Watch videos your friends and favorite characters have created
Since child safety is a priority for Zigazoo, the video response platform includes several protocols.
Ensuring Student Safety
One of the ways that Zigazoo safeguards student video creations is through a verified and secure sign-up process. They rely on single sign-on through verified Facebook, Google, or Apple accounts.
To create an account, you must be over 18 years of age. Furthermore, Zigazoo has eliminated private messaging, commenting, and negative emojis. This means that creators and viewers only see videos that survive a “stringent human moderation process.” All videos must undergo content review and be aligned to Zigazoo’s video review policy:
On topic: users must respond to the prompt and not be off-topic
No personal information: no last names, addresses, birth dates, or personal identifiers
Positive, school-friendly behavior: no shoving, throwing, anger, yelling, sarcasm, or sulking
Classroom-safe language: no swears or “potty” talk
Classroom-safe clothing: every person in the video needs school-appropriate clothes on, including shirts
Clear sound and visuals: we want to make sure we can hear and see you clearly
That moderation process makes sure all videos are relevant to the prompt (from museums, zoos, educators, etc.) and that language and visual media are kid-friendly. Personally identifiable information (PII) is also locked out. Some of the restrictions Zigazoo has in place include:
Only your friends on Zigazoo can see your videos, and friends have to be approved prior.
You create videos in response to a question you select.
Children must make videos with an “adult presence,” so you avoid the issues with unauthorized individuals viewing child-created videos that pop up on YouTube and TikTok.
Some of the videos you will see come from Bings & Potts, which will leave you laughing and learning like this one on Gobble Like a Turkey:
There are many other content providers you can assign content from in Zigazoo, as you can see from the photo slideshow below:
Zigazoo also offers a classroom component known as Zigazoo Classrooms. Let’s take a closer look at that.
In the Classroom
With Zigazoo Classrooms, teachers can create their own private classroom communities that give them content moderation authority. You can watch a video about Zigazoo Classrooms.
Once you get the app, you can choose to create a classroom or pod, registering as a teacher. This process will give you a code you can share with students.
You’ll be prompted to create a welcome video or explore existing projects that you can assign to students. Of course, you have the option to create on your own.
Here is a seven-minute walkthrough of Zigazoo and projects available from LaShundra Wigfall:
Creating with the App
You can start your students creating with Zigazoo easily as a teacher. Simply find and assign a challenge that is relevant. The interface is straightforward and you can use the HOME or DISCOVER buttons in the app to find challenges. You can see what that looks like below:
With October designated as cybersecurity awareness month, there’s a great deal to focus on. Indeed, the argument can be made that the conditions we’re living through in 2020 have brought cybersecurity to the forefront of people’s attention more than ever before. With more people working, teaching, and learning remotely, we are simply relying more on technology — and navigating all of the risks that come with that reliance.
This set of circumstances can open up all sorts of interesting and important conversations. In this one though, we’re going to look specifically at some ways to address and protect data privacy in an online learning environment. For teachers, school administrators, parents, and students alike, it’s important to keep data privacy in mind. And the following tips and ideas can help with that effort.
Take Advantage of Encrypted Ed Tech
The term “encrypted ed tech” would have sounded obscure, to say the least, just a year ago. Now, however, it’s something more school systems are looking into as a means of making sure that the vast troves of data changing hands in remote learning environments are protected. Naturally, it’s already expected that teachers, students, and administrators will take basic security precautions when handling this data — device passwords, good judgment, and so on. But encryption can add another layer of data security.
The idea of endpoint encryption for data protection is simply that information changing hands — between students and teachers, teachers and administrators, and so on — is encrypted within ed tech programs. This all but eliminates the possibility of data being intercepted or improperly exposed in the course of its transfer and storage, thus protecting the privacy of all involved.
Use Well-Made Devices
Much of the focus in ed tech and remote learning concerns software tools, but plenty of teachers and students are also working with devices more frequently — from laptops and tablets to mobile phones, and even smart speakers and voice assistants. By and large, these devices don’t pose many privacy risks people aren’t aware of. They can be protected against through passwords, encryption, secure network usage, and standard precautions. But more dependence on devices also means that the devices themselves have to be well made and secure.
Ultimately, more goes into the security of modern devices than most of us take the time to recognize. It begins with the internal printed circuit board, which is basically the instrument through which electric signals are passed. Once upon a time, these were relatively standardized, but today, the thickness of a printed circuit board and the style of its design can help to make it more secure and durable, even if it’s facilitating complex connections. By a somewhat similar token, a robust and modern battery is also a necessity. A weak or unreliable battery can easily overheat or die out quickly where modern device usage is concerned. And in fact, the same goes for an internal processing chip.
These internal electrical and mechanical concerns have more to do with security than privacy. Inadequate parts can make it all the more likely that a device in heavy, day-to-day use will break down or malfunction in such a way as to compromise student or teacher data. For that reason, even though they can be more expensive, well-made and reputable devices are best whenever it’s possible to obtain and use them.
Adopt Cloud Storage
This is something a lot of schools and school systems have already done or looked into, and fortunately, in 2020, the idea of cloud storage is fairly ordinary. But it is still something that should be considered with specific regard to data privacy in online learning situations. Cloud networks are designed in some cases to be both more accessible and more secure than local storage, making them excellent choices for the storage of class-related data. Additionally, cloud storage can be backed up on a regular basis so that even a perfectly innocent system or device failure won’t result in the loss of saved data.
Specifics can change according to a schooling situation, but generally now would be a wise time to set up cloud environments that teachers, parents, and students can all access, so that all relevant data can be kept in a reliable digital location (ideally with tiers of access or separate folders for different purposes).
Be Transparent About New Tools and Processes
The final point is that it’s also important for all involved to be open, transparent, and communicative about new tools and processes that are being implemented to assist with remote learning.
Between communication programs, storage options, digital learning tools, and devices, there can be a lot for people to get used to as they adjust to remote learning. This is all well and good, but it also poses a challenge. Anyone using an unfamiliar system or tool without full knowledge of how to use it can make himself vulnerable to data breaches unwittingly. Thus, it’s important that when new tools and processes come up, there is open and thorough dialogue about how to use them, and how to do so in a way that protects privacy.
I’m reaching out to you with a question on Google and student data. In your opinion, how safe are we with sharing student names and assessment data on Google? We have a strict policy of not sharing student PEIMS data, but are we safe to assume we can save student names and assessment info?
-Jill
Dear Jill:
As you may know, G Suite for Education employs a higher level of security than consumer Google accounts. Google employs encryption to protect your data while it is stored on their cloud servers. On that count, it is unlikely that your data will be breached through G Suite EDU itself. However, should a user’s account be compromised (a successful phishing attack is one common approach), then any data stored in the cloud would be in danger.
Two-Step Verification
That is why ensuring you have two-factor authentication (a.k.a. two-step verification) in place is so important. Even after a successful phishing attack, an attacker without the second factor would be unable to access the shared data. Without two-factor authentication in place, it is best to encrypt files or data stored in Google Drive. This, of course, would defeat the convenience G Suite EDU provides. So, I STRONGLY recommend putting two-factor authentication in place.
Encryption
If you have Word or Excel files that need to be stored in encrypted form in Google Drive, but no need for others to work on them, then you can use a free cross-platform solution like Cryptomator. If you intend to share a Google Sheet with student names and info, then you are far better off using two-step verification to protect account logins and passwords and then using G Suite EDU. A solution like SysCloud offers on-the-fly encryption.
Shared Drive Settings
Another key point to keep in mind is that G Suite EDU makes it easy for end users to SHARE data for collaboration purposes. Improper SHARING approaches (e.g. sharing a file/folder to anyone with the link and public on the web) can put sensitive data at risk without any nefarious intent from anyone. Putting confidential data in a SHARED DRIVE (Team Drive) would make it viewable and accessible by all who have been granted rights to that shared drive.
A Response
To respond to your question, you own the data you store on Google and decide how to use it. So long as Google safeguards the data (which it does), then you own the responsibility for protecting it when the only access is via a user account. If you have taken sufficient precautions to protect user account logins/passwords and provided sufficient professional development to staff so they don’t make sharing errors, then there is no reason why you cannot store sensitive data on Google Drive’s encrypted cloud servers.
Did You Know?
TCEA is available to meet your professional development needs. We can provide friendly, education-specific professional learning on cybersecurity and cybersafety that enhances the work of faculty and staff. Reach out via phone at 800-282-8232 or via email to Lori Gracey (lgracey@tcea.org).
Google and Student Data
Protecting student data is even more important today given the many data breaches that have occurred. You must make every reasonable effort to secure student data while maintaining a balance. Fortunately, you need not work alone to accomplish this. Build capacity on your team. You may find the following resources to be of interest.
TCEA is working very hard to ensure that the Texas Legislature enacts good public policy that is necessary to implement digital learning in Texas classrooms. Below is an update on what is happening related to educational technology.
Technology Lending Grants
The Texas Senate Education Committee took testimony on SB 1483 that would provide grant funds to help give access to devices and internet for students who are unable to afford them. These grants would ensure that students have access to digital content both at school and at home. In the Project Tomorrow Speak Up survey, 81 percent of Texans who identify themselves as community members indicate it is important for students to have consistent, safe internet access outside of school time to be successful in school. The survey also revealed that 75 percent of Texas 6th through 12th graders use the internet at home for school work. Those who do not have internet access at home are at a distinct disadvantage. Not only do they have fewer options in terms of access to content; they are unable to develop the technical skills necessary in an economy driven by technology. SB 1483, authored by Chairman Larry Taylor, is designed to address this problem.
Student Data Privacy
On Tuesday, April 11, the House Public Education Committee will take testimony on HB 2087, authored by Representative Gary VanDeaver. HB 2087 gives schools the much-needed ability to use digital tools to individualize and customize learning and improve student educational outcomes, all while maintaining strong privacy protections for student data. Among other features, HB 2087 will:
Totally ban the sale or rental of student data
Ban targeted advertising to students based upon their use of educational services
Ban building a profile of students for any purpose other than for education
Sharply limit disclosures of student information obtained by educational tech providers
Require educational tech providers to maintain reasonable security practices and procedures to protect student data, and
Require deletion of student data whenever a school or school district requests that the data be deleted.
Also on Tuesday, April 11, the Senate Education Committee will take testimony on SB 1481. This bill is designed to help districts utilize the Instructional Materials Allotment as intended: a dual purpose fund for instructional materials and technology. In a survey conducted by TCEA, only 44 percent of the respondents said their district always includes technology staff when making decisions on the use of the IMA funds. This bill is designed to make it clear what the legislature’s original intent was for the IMA. SB 1481 will:
Change the name of the IMA to the Instructional Materials and Technology Fund to communicate the intent and purpose behind the creation of the IMA
Require the SBOE to update the Long-Range Plan for Technology at least once every five years
Require the SBOE to consider the technology needs of districts when planning the adoption process and proclamation schedule
Update statute language to align with industry usage by replacing “open source” with the more accurate and applicable “open education resource” to identify these educational resources
Require school districts to consider Open Education Resources when they adopt new instructional materials.
TCEA is working on behalf of Texas school districts to ensure that good policy is developed to assist in their implementation of digital learning. You can monitor these and other bills as they move through the legislative process and learn how to be more involved on the TCEA Advocacy site. Together, we can make a difference!