AI in Schools: Slowing Down to Make Ethical Decisions About EdTech

GMT20260309-200415_Recording
===

Speaker 3: [00:00:00] Welcome to the Principal's Handbook, your go-to resource for principals looking to revamp their leadership approach and prioritize self-care. I'm Barb Flowers, a certified life coach with eight years of experience as an elementary principal. Tune in each week as we delve into strategies for boosting mental resilience, managing time effectively, and nurturing overall wellness.

From tackling daily challenges to maintaining a healthy work-life balance, we'll navigate the complexities of school leadership together. Join me in fostering your sense of purpose as a principal and reigniting your passion for the job. Welcome to a podcast where your wellbeing is the top priority.


Speaker: Welcome everyone to The Principal's Handbook. I'm excited. Today we have a guest with us: Priten Soundar-Shaw. He's with us today talking about his new book, Ethical EdTech. So Priten, do you just wanna start by telling us about yourself and your experience in the ed tech industry?

Speaker 2: Yeah, [00:01:00] my experience in EdTech started back in high school.

I started my first EdTech nonprofit at that point, and I really haven't deviated from that plan since. Since then, I've gotten a bachelor's degree in philosophy and a master's in education policy. And I've spent a lot of that time developing ed tech tools for educational institutions of all sorts.

So nonprofits, universities, and startups. During the pandemic, we built a lot of tools for educational institutions to help them transition to online learning, specifically for extracurricular activities. And since 2022, 2023, we've been doing a lot of work helping educators in schools navigate what AI means for the classroom.

Speaker: That's exciting stuff, and I love the conversation about AI because I feel like people have such strong opinions about it. So talk to us a little bit about your new book, Ethical EdTech, how AI fits into that, and just the conversation of the book.

Speaker 2: Yeah, so definitely some strong opinions here.

Yeah, and that's what kind of gives rise to the book. In my first book, I laid out all of the different possibilities for what [00:02:00] AI might mean for education: how we might think about it for student usage and teacher usage, what might be on the horizon. But it didn't really tell us what we should do with the technology, more what it could do.

In the last few years, I've become increasingly concerned with how we've been approaching technology in our schools. I think largely it's driven by marketing pressures, funding pressures, political pressures. We've been moving really fast, I think because of those pressures.

And so the book is largely a call to slow down and to think about exactly why we're integrating technology when we do. A lot of the book itself is focused on AI, but of course the kinds of skills I'm hoping to build, the kind of vocabulary I'm hoping to build through the book, they're timeless.

And so it would apply to any sort of technology that we see in the future, or that we've already been seeing, whether it be social media, cell phones, devices, things like that.

Speaker: Okay. So you just said something that struck me. I'm a principal of a kindergarten, first, and second grade building, and you said we need to slow down.

And time on computers was the first thought that came to my mind, you know, being an [00:03:00] elementary principal, but I'm sure it applies even K to 12, right? Think about kids in different classes all day using their devices. What are your thoughts on that, on how much time kids are on computers and how much technology is being used in schools?

Speaker 2: Yeah. So in the book, I don't take a stance. The book is largely meant to help equip individual people to make their own decisions for their context. Holistically, though, I have concerns about that much technology in our classrooms, especially at the younger age levels. And I think there's a couple of different things.

I think there's good documented evidence about what the technology does to attention spans and social emotional development, and I think those are especially important at the K to five level, probably later on as well. And I think the second concern is that our students are using technology almost every minute they're not in school. And so one of the only places with structured time where we can control whether or not they're on a screen is within a school building, and I think we should take advantage of that. So those are my quick thoughts on it. I know there are lots of folks who wanna make sure that our students are [00:04:00] prepared to use the technology effectively.

And I think that's a great argument for later in schooling. By the end of middle school and in high school, we ought to start teaching them responsible usage of the tools. But I think most of the pedagogical goals we have for our younger students really can be achieved better without the technology, when we think about all the different aspects of learning beyond, are they getting the right number of personalized math questions, which is an important part of this, but not the only part.

Speaker: Yeah. And I think it's so important to ask, what is your take on computer usage? You know, we always make comments like, AI will eventually take over teachers, things like that.

And I don't believe that to be true, but I do believe how we're using technology is super important. If we don't want it to take over our jobs, we need to make sure that we're not just throwing a kid on the computer and letting them self-pace their learning.

Right. So what are your suggestions for, you know, principals leading teachers on this?

Speaker 2: Yeah, I think that's exactly the right framing, because one of the things [00:05:00] I've been trying to show educators is that a lot of this is intentionally giving up our agency and giving in to the dialogue of the tech industry, largely.

And so I think the way you framed it, that the more we integrate the technology into our classrooms, the more likely it is that the teacher becomes less important, is true. But that's not because the teacher is less important, right? We're downplaying the importance of the human in the classroom by doing that.

And so I think the right approach is to figure out how we can get our teachers to not feel the pressure to integrate that much technology. And the second part is helping folks see what the value of that human interaction is in the classroom. So, you know, one of the arguments for an AI tutor bot is that students won't have to wait for the teacher to come and give them an answer.

Everybody can get instant support at the same time. And that might be true for a very particular academic goal. But especially K to five, we're teaching them a lot of other things, right? You also want them to learn patience and the ability to wait. If they're used to getting instant answers in every moment, they're not learning that. Having to wait [00:06:00] two minutes for a teacher to come answer your question, that's a great opportunity to learn some patience and some emotional regulation, right? The opportunity to ask your friend for their opinion or help on how they're approaching the question is another opportunity to learn how to solve problems on your own rather than, again, be given all your answers.

And so I think the more we can show that there are other parts of learning happening when we keep our classrooms human, the more effective that would be. The other part of this, largely, is not really within the control of any individual educator, or maybe sometimes even of school buildings.

But some of the larger pushes for technology integration are driven by standardized tests, and that's the reality of the systems we work in. The more we can advocate at least for getting away from standardized testing, the fewer incentives we will have to do the kind of rote drilling that I think the technology often ends up being a good use case for.

Speaker: Yeah, I think that's a great point. Well, let's dive into the ethical piece of ed tech. What are the things that maybe we're [00:07:00] not thinking about when it comes to ethics and education? I know a big one is AI and plagiarism, things like that, but what other pieces do you discuss in the book?

Speaker 2: Yeah. So I borrow from bioethics in the approach I take on ethics. And that's because, very similar to medicine, education also makes a lot of really important decisions in the moment. And so there aren't universal answers that we can necessarily say apply to every single case of ed tech use.

Similar to how there's not one universal answer for how organ donation ought to be handled in medicine. And what medicine has done is empower individual decision makers, the doctors in the hospitals, to make those decisions based on their individual context, because so much of it depends on who your patients are, what your community is, and what the resources at your particular institution are. All of which apply to education as well.

All of which apply to, , education as well. , And what bioethics does is they provide four principles of things that we can kind of use to think about what the ethical problems with, decisions might be. And so, um, those four things are, , does it actually do something good? , And then example of [00:08:00] this is, I think oftentimes we, .

We, we hear innovation and we hear like, exciting new technology developments. , And we, we rush to integrate it because there's all this narrative around like, this is the future, this is exciting and this is innovative. , But we're not really asking if it's doing something concretely good,, or asking for evidence of that.

Good. And so that's step one is to make sure before we do anything that we know it's gonna benefit someone. . The second step is the do no harm principle, , in medicine, , where it's what, what are the risks we're exposing our students to? , And there's two types of risks. There's risks that we absolutely cannot tolerate.

Those are hard-line boundaries. So one instance of that would be if a student becomes dependent on a technology tool in the classroom; we might think that kind of emotional dependence is not something we ever want to risk having in our classroom.

The second type is where we kind of have to weigh the risks and benefits, right? So there are harms that maybe we have to accept as part of the integration. An example might be that we sacrifice some level of peer relationships because the academic needs are so great, right? So it's not all [00:09:00] or nothing, like, oh, if there's any sort of benefit, we should automatically say, yes, we'll do this.

Nor is it that if there's any sort of harm, we automatically say we won't do it. Now, that still keeps it pretty general in terms of how to make those decisions. The other two principles start to give us a little bit more clarity. The third is autonomy, and informed consent. How much are the parents involved in making these decisions and knowing what's happening in our schools?

How much of that is happening proactively rather than reactively, when they're asking questions or when they're concerned? And how much say are we giving our students, as developmentally appropriate? The amount of say that we might give our younger students might be very different from the say we give our high school students.

But are they getting the opportunity to make the call about whether or not they wanna use technology? One of the things we're seeing, especially with AI, is that older students have their own ethical concerns about the technology. They're worried about the environmental impact. They're worried about what it's funding, right?

So when they express those concerns, are we allowing them to opt out of usage, or are we requiring them to use it? And then the fourth component from bioethics is justice. And that's the question of, when we're thinking about all these harms and benefits, who is [00:10:00] being affected by them? Who's getting all the benefits, who's getting all the harms, and how are we really making sure that those are allocated fairly? And so every time you integrate technology, if it's always helping your gifted students, but your special needs students are being left out of that conversation, that might be a reason to think about whether or not the justice angle is being considered.

The final piece of this, and this is what I argue that we need in education, is the care element. In medicine, your relationship with your doctor oftentimes doesn't really matter. The example I give is, if you really hate your surgeon, they can still take out your appendix and you can walk away healthier than you were before.

But if you hate your teacher, it's a very different situation, right? It does actually impact your ability to gain the goods and the benefits that schooling ought to bring you. And so we need to make sure that the decisions we're making are good for the relationships we have with our students.

And that's everything from, do they trust us? Do they feel surveilled by us, or do they feel like they have some safety with us? Are we actually getting to know our students in that personal way? Do our students feel like we are invested in their education? All those kinds of concerns that are really relationship based.

I [00:11:00] think that's often the one that's really left out of the conversation with EdTech, and especially with AI these days.

Speaker: Yeah, I love that, and I love how you relate it to medicine, because with technology I wouldn't have thought of these different areas. Talk to me a little bit, and I know I'm behind on this, about the topics with AI and how it's become so controversial.

Speaker 2: Yeah. So there's all sorts of use cases we're seeing, and the headlines are getting scarier and scarier, which is the least hyperbolic way I can say it. And so one of the things we're seeing is that there's a lot of pressure to train students on how to use AI tools.

Mm-hmm. And so schools are integrating prompt engineering into different classes, or figuring out how to teach students about AI: to use it for writing, to use it for historical research, to use it in art classes to generate images. And there's some value in students learning the newest and latest technology.

But I think the pressure that schools are facing right now is an all-or-nothing pressure, [00:12:00] where they feel the need to reclaim the relevancy of academic material. They're feeling the need to tie everything into AI. And I think that's largely because that's the narrative from the tech industry.

They're saying, oh, it's not important to write anymore because AI can write, and so there's no economic value to you writing. But as educators, and as most researchers will tell you, there are other reasons to teach these skills beyond immediate economic output.

And of course, look, we don't live in an ideal society. Getting a job is a really important part of going to school at any level, but especially in high school, where most motivations are based in career prospects, and the same goes for college. But the reality is that teaching them prompt engineering isn't actually preparing them for their careers. A ninth grader learning what to do with an AI tool today, that's not what they're gonna be doing when they graduate and step into a workplace. And so we need to step back and think, what are the skills that are actually relevant that we want them to learn?

And not just what the headlines are telling us we need to be concerned about. And of all the things that we teach our students, the ones that have stood the [00:13:00] test of time are reading and writing, right? The humanities have lasted 3,000 years. This is not something that becomes less valuable as society progresses.

Right? Whereas with these particular tech skills, at some point typing classes were really popular, and at some point Excel classes were really popular, but as technology has evolved, we've said, oh, well, that's not really important anymore. We haven't done that with our core basic academic material.

And so that's one part that I'm really concerned about. The second part is figuring out the plagiarism and integrity angle that you mentioned earlier. Mm-hmm. I think schools are struggling because right now, outside-of-classroom assessments are basically not really reliable.

And I think that's only getting worse. In the last six months, we've seen the advent of agents, where even entire courses can be taken autonomously by these very easily available AI tools. And so all of the barriers that we've been trying to build for outside-of-classroom assessments are really falling [00:14:00] apart.

And so I think folks are trying to figure out what this means for homework, for assessments. How do we still assign a take-home essay? And I think those concerns are important. And this is where I think figuring out what our values are and why we're having these conversations is so important.

Because some folks are saying, oh, that's a sign that what you're teaching is irrelevant, right? If the student can do it with AI at home, that's probably a sign that you should be teaching something else. Like, why are you teaching something AI can do? And that's similar to the earlier conversation I was having, right? There's a mismatch between whether a technology tool can do something and whether our students need to learn it. Those are not directly correlated. That's never been the case, right? We still teach our students how to walk, and cars can travel much faster, much more efficiently, right?

Right? , That's a, you know, that's definitely a strong argument. But, , but the point being that we do still teach our, , students skills that. Technology can do and we have good reason to do that. And that's probably some of the answer here. The second is I do think we're gonna have to rethink education.

And we mentioned standardized testing a little bit earlier. But in all sorts of ways, figuring out how we can do more assessment in the [00:15:00] classroom that's effective, but also keeps students motivated and engaged and understanding the value of what they're learning, I think is really important.

And these are not things that are new to education, right? Math classrooms have done this for decades now. Calculators have existed; students have had the option to go home and do basic times tables with calculators for their homework. But math is structured such that the more you practice outside of the classroom, the more you do your homework, the better you will do when you show up in class and are assessed on those skills in class.

So homework isn't really about, oh, did you get this right or wrong, necessarily, especially at the lower levels. It's more, here's an opportunity for you to practice. But the actual assessment, the actual demonstration of that skill, is gonna happen in the classroom, and that's worked effectively. We've been able to still assess math skills despite the fact that most students have a calculator on their phone, let alone an actual physical calculator.

And we probably need to figure similar things out for other areas of education now as well.

Speaker: Yeah, you brought up a good point. It has me thinking we get into this all-or-nothing thinking, right? Like, oh, AI can do it, [00:16:00] so these jobs aren't going to exist, or we're not gonna need this anymore. I can remember, in the nineties, people saying, what's it going to be like in 2025, 2026? Like, our cars are gonna fly like the Jetsons, you know? And here we are, and none of that happened. And I think you're right. You bring up such a good point, that as educators we need to remember these foundational skills that we've always taught.

Reading, writing, math. Like you said, there have always been calculators, there have always been things that we can use, but we still have to have those skills. And it's important as educators not to go completely to one side of it and forget about that, because with new tools that come out, it's so easy to do that and think that things are irrelevant, but they're still so important for our students.

Speaker 2: Yeah. And the pace is scary too. I think that's part of it: everything's moving much faster. And so while we've faced some of these things with other technology integrations and innovations in the past, with AI you're hearing about some new thing developed every week sometimes, or [00:17:00] at least every few months you're hearing about some massive development in this space.

And so I think there's a lot of fear that drives this, a lot of concern and anxiety that drives this. Those are all valid emotions to have in response to something that's this groundbreaking and this fast. But I think the right response is to slow down, not speed up.

And it's so counterintuitive for most folks. But I think that slowing down will help us actually figure out how much we need to speed up, what exactly needs to change to deal with the external speed. If we're just trying to match that external speed, I think we're gonna get lost and let those external forces control what happens in education.

And I don't think that's gonna serve the purposes that we want it to serve.

Speaker: Yeah. No, I agree. Well, thank you so much, Priten, for being here today. Are there any final things you wanna share before we go?

Speaker 2: Just that I think folks ought to try to have these conversations more often.

These are not, you know, magic solutions to any of these problems, but the more we all talk about it, the closer we can get to coming up with the right solutions for our communities as educators. My book, which hopes to help equip folks to do that, is coming [00:18:00] out in May. That's Ethical EdTech, and it's at ethicaledtech.org.

And then if folks wanna follow along with any of my other work, where I talk about all these things, oftentimes on a soapbox, they can check out my website at preton.org.

Speaker: All right, awesome. Well, I will link all of that in the show notes as well, so you'll be able to connect with Priten there. So thank you so much.

I appreciate you being on the podcast, and I look forward to your book when it comes out.

Speaker 2: Well, thank you so much for having me.

Speaker: Of course.
