Technology in the Classroom

Should Big Tech be in classroom education?

How Did Big Tech Get Into Your Kid’s Classroom?

If your child is in school, then you’ve probably had to deal with Chromebooks, Google Classroom, and other Big Tech-based resources used by your school. But have you given any thought to the implications of allowing Big Tech – in this altruistic role – access to your kids?

Nolan Higdon joins Matt again to discuss the role of Big Tech in education. Fresh off his latest book, The Media and Me: A Guide to Critical Media Literacy for Young People, Nolan discusses how tech companies were able to break the corporate barrier to education.

We discuss the following issues:

  • How classrooms went from safe spaces to public forums
  • Does using technology actually increase learning?
  • Is it legal for tech companies to provide hardware and software to students?
  • What happens to the data that Big Tech accumulates from schools and schoolchildren?
  • What is the role of teachers’ unions in protecting children and teachers?
  • How algorithms are used in schools, while the rules behind those algorithms are rarely examined for bias

Transcript

Matt: Hello and welcome dear listener to another edition of the Endless Coffee Cup podcast. And today I’ve got a familiar voice for those of you that are still listening to the audio. And yes, I am still fighting the video side because if you use video, it becomes a show, not a podcast. If you know the voice, you’ll know Nolan Higdon, who has been with us a number of times. Nolan, how are you doing?

Nolan: I’m doing pretty well, although it has come to my attention that I’m the guest with the second-most appearances on this program. So I don’t know who number one is, but I’m coming for you.

Matt: All right. I’ll let ’em know. They got a mug, so yours will be in the mail soon. Well, for those of you who are new listeners, or maybe some who just need reminding, Nolan is a professor of media studies. We have discussed a number of books in past podcasts, really focusing on digital literacy, media literacy, and everything that goes along with that. And for those of you that have been listening, you know that is near and dear to my heart and a real passion.

It’s not just kids who need to understand media and digital literacy, but adults as well. So, Nolan, thank you for coming on the show again.

Nolan: Yeah, and I concur. I think that’s a really important point you make there at the top. There’s so much talk about the young’uns with their tools and all these things, but this is something for all age groups. More media literacy is certainly something we all need.

Matt: Absolutely. And dear listener before me is Nolan’s latest. You are one of a number of authors on the book called The Media And Me. I love the title but a guide to critical media literacy for young people. So you had a hand in that, Nolan, what was that like? What made that come together and the direction that you took with this?

Nolan: Yeah, it’s an interesting story. I actually underestimated the work of the book. You’re writing for young people, you’re writing for a general audience. I do a lot of scholarly writing, so I think there was kind of this elitist assumption inside of me that this would be easy because it’s for a less scholarly audience.

But actually it was way more difficult thinking about how to talk to young people in a respectful, responsible way. It took some inner reflection, and that’s why we have so many authors. We have a diversity of authors in every sense of the word, including age, and they were very helpful in saying what did and didn’t click.

What was still a relevant example and what wasn’t – and some of that stuff made me feel old. But it improved the project overall. So it’s a great read for all age groups due to the diversity of authors, I think.

Matt: I am looking forward to digging into it. I just got it the other day and I can’t wait to dive in.

I’m seeing a couple of familiar names there as well – people who have been on the podcast, like Mickey Huff – so I can’t wait to dive into that one, because I know Mickey doesn’t hold back. Well, you and I have been talking via email about this subject of media and digital literacy, and how schools are handling technology.

And one of the things that really bothered me when I was doing my master’s degree in education: one of the professors was giving an example of how you can engage students by having them do an assignment through social media. And immediately red flags were coming up in this lecture for me, and I’m trying not to put my hand up or whatever, but immediately I’m thinking, well, wait a minute.

You’re requiring a student to download an app to create content, and now they’ve got to register on the app, and we’ve got all kinds of big red flags coming up about using this in the classroom. And I knew, I’m like, I’ve gotta go to my buddy Nolan. We gotta talk about this.

Nolan: Yeah. And actually, I had a similar experience. Listeners will recall the revelations from the whistleblower Edward Snowden in 2013 and 2014.

We started to get a better picture of just how large this government surveillance apparatus was and how it involved the utilization of digital tools. But even if you’ve read all the work on it, it’s still something you really can’t wrap your mind around. Like, you can’t picture the surveillance state in your head.

You can talk about it, describe it – it’s so wonky and impossible to picture. But I had that kind of rolling around in my head as a fear: these digital tools are surveilling all the time. But I always thought of the classroom, as a student and as faculty, as a safe space.

No one’s listening, no one’s watching. There are always rules about whether or not management can even come in when I’m teaching, to protect academic freedom. But one day we had a student who needed an accessibility accommodation, and essentially the accommodation was that they recorded the classroom, and that recording was sent in real time to a third party who transcribed it in real time.

So this person could get equal access to learning, which is a goal we all have in education. But I recognized immediately: once this leaves the classroom, this stuff is out there forever. You’re breaking the privacy of all the students and the faculty. And it was with that concern that I started to go down this rabbit hole to understand: what is legal, what’s illegal, what’s being done, what may be done.

All those sorts of critical questions. And to your point – and I think this may have happened with the teacher in your example, you can tell me – the more I’ve spoken to faculty or managers or students and parents, the more I see just a lot of ignorance about the topic. But once folks know what’s going on, you see people change their minds and get freaked out; they want privacy. I think a lot of people have this assumption that their privacy is protected in all these spaces, and it takes some time to get through to them about that.

Matt: You’re absolutely right. The teacher is obviously looking out for the best interests of the child.

They’re trying to find creative ways to engage students on their own level. And so while that was being given out as an idea, I talked to them about it: wait, there are a couple of issues here. Some parents forbid students to use certain apps or certain social media, and what if a family doesn’t use social media at all? You’ve got to realize that there are going to be degrees to which this will be used.

The other aspect is economic: do all your students have phones? Do they all have smartphones? Do they all have phones capable of doing this? You don’t want to create something where there’s a disparity – where some are able to participate and some can’t.

And now you’ve got students that may not want to say anything. And I agree with the intentions – we want to engage them more – but there needs to be more thought behind it. And that was funny, because that was one of the things in my master’s program: we had to evaluate our curriculum to see how safe certain things were, what the risks were.

And that was one that just kind of went in one ear and out the other. No one really understood the risk, so ignorance is a big factor.

Nolan: Yeah. And I think in that space where folks are ignorant of some of these things, we’ve allowed it to become normalized.

So when you do draw it to some group’s attention – say an academic senate or a union – they have kind of the feeling of, well, we’ve already come too far. By this point everything’s already integrated into the learning environment. And that’s something I totally reject.

I think there’s a lot of work we can do in this space to minimize the threats to people’s livelihood and protect their privacy better.

Matt: Absolutely. I know there have been a couple of things in our local area where a teacher assigned something on TikTok, and immediately there were parents saying absolutely not, we’re not gonna do that, we don’t allow that.

On the other hand, from the vast majority you heard nothing. So it’s very interesting, the disparity in, I would say, the literacy among parents and among students. There seem to be those that are extremely well-informed, and those for whom, just like you said, it’s been normalized to the point that we’re not even considering it.

Nolan: Yeah, and I think to a certain degree, even for those of us who lived in a more pre-digital environment, it’s tough to imagine what it would look like today if we got rid of so many of these tools. So that’s why I’m a big advocate for examining the ways these tools and platforms are used in education.

First of all, asking a critical question, which I always find fascinating: is this actually helping learning? We don’t really have substantive studies to say that this is improving the learning process. But let’s say some of it is – well, which ones? What are suitable replacements that get equal quality of learning?

What are the costs and benefits? So in that example I gave you earlier: is the benefit to that one student worth the cost to the privacy of all the other students and faculty? Is that something we think is worth it? These are, I think, the ways we need to start having discussions about topics we rarely feel comfortable discussing.

Matt: What if we widen the lens a little bit more and get into using things like Chromebooks and Google Classroom? I mean, this is the corporatization of our education system. I know when our school started using Google Classroom, I was immediately at the door of the superintendent saying I am not comfortable with this.

And then they start sending Chromebooks home, and I’m like, absolutely not. So I’m one of those parents that, I don’t know, could be irritating to the administration. But again, it’s one of those things where everyone else is doing it, so we’re just following along as well. So what happens when we widen that scope and allow Google Classroom and Chromebooks to really be the cornerstone of what’s going on in the education system?

Nolan: Yeah. And I think it’s a great question that needs a little context as well. Allison Butler and I did a study a few years ago called “Put Your Marketing Cap On,” where we talked about how, for over a century, corporations have tried to get into the classroom.

Public education’s one of those unique spaces where a majority of the population is required to be there for a substantial amount of time. And they’re largely an audience that’s paying attention to the front of the room. So if you can get that teacher at the front of the room advocating for your product, then you have lifetime customers.

And this is something that Chevy has done, and Ford has done, even Pepsi. Elektra, the record label, used to give out posters of the band Rush to get into schools, and things like that, right? So there’s a long history of this. But big tech sold itself as kind of the future of education.

So there were a lot of schools that were a lot more sympathetic to allowing these corporations into the classroom than they might have been for, say, a Coca-Cola, right? It’s tough to find the educational value in Coca-Cola, but there’s pretty good PR in saying big tech is synonymous with the future of education.

And so these companies, under the auspices of altruism, would quote-unquote “donate” things like Chromebooks to the classroom. Especially for schools that are dealing with struggling budgets, this is a way to get computers into the hands of their students. But of course, it’s not only getting students used to using Apple products and forming a positive view of Apple products – you’re also getting the students’ data and the data of the school in the process of donating those things.

This is why I highly encourage audiences to think about how we use the word free. See, social media’s not free. Your Chromebook’s not free, your Google searches are not free. It costs you your privacy, costs you your labor, and things like that. And so when you talk about the Chromebooks – it’s essentially a tracking device. But it’s really interesting to think about, because if, in like the 1970s, I told you to carry something in your pocket because Pepsi wanted to track you, you would tell me to screw off. But call it “smart,” and all of a sudden people are excited to carry around a tracking device. It’s very good PR.

Matt: Absolutely. And one of the things as well: when you have Google Classroom on a Chromebook, all of a sudden you have kids of all ages creating Gmail accounts. And so now – I don’t care what Google says – you can’t tell me you’re not tracking minors and what they’re searching for.

All of a sudden, you’ve tied someone into your entire ecosystem, and you’ve got them from such a young age – profiling what types of information they look for, what they do, how they’re using everything. I mean, when you take that step back, it’s pretty scary.

Nolan: I agree. And I think this is another area where we need more conversation, because inevitably, when somebody hears the two of us having this portion of the conversation, they’re gonna say, well, so what?

I’ve got nothing to hide – who cares if people are looking at my content? And I’d like to encourage audiences: number one, unfortunately, a lot of the threats you should be most afraid of, you don’t know are threats until it’s too late. Laws can change. For example, we saw this with abortion access.

People who were communicating about getting abortions were all of a sudden doing something illegal under state laws. And we saw that big tech was sharing communications with law enforcement to go after those individuals. Some people are in a perpetually vulnerable state with their privacy.

If your documentation status is less than certain, you may be a person who wants privacy. If you’re the victim of stalking, you certainly want privacy. And then we also need to be cognizant of the ways in which this data is analyzed incorrectly and often acts as a form of confirmation bias for problematic thinking.

So, I mean, there have been umpteen volumes of research on how Silicon Valley is overwhelmingly white – it doesn’t have a lot of people of color – and it seems to make algorithms that reflect racist thinking. Safiya Noble has done some great work on this, as has Ruha Benjamin. So what we find is, the data shows that, due to racist policies, we’ve overpoliced communities of color for years, so they’re more likely to be arrested because they’re more likely to encounter police.

Well, from a data perspective, that means people of color look more likely to be criminals. And so we get these outputs that say to police departments, here’s where you need to police for criminality. So we get the over-policing of communities of color. Well, in schools, they’re also using this data to predict behavior.

So who’s more likely to be criminal, or commit a crime, or commit violence, or have mental health issues? And we see, again, reinforcement of these biases, which creates extra challenges for these already vulnerable communities. And that’s just looking at what we know is going on. I mean, in the historical context, if you’re someone who’s ever tried to challenge power, the last thing you want to do is compromise your privacy.

For years, you know, the CIA and FBI have tried to infiltrate activist groups and tried to undermine activists by getting dirt on them and using it against them. They did this to Martin Luther King with his extramarital affairs, for example. So you simply just don’t know. And there are centuries of research to illustrate why privacy is important.

So I don’t think we’ve achieved some new paradigm where privacy no longer matters. I’m not convinced we’re in some era where the past doesn’t affect us.

Matt: Absolutely. And I hear that all the time, even from people in my own family: I’ve got nothing to hide.

What am I worried about? And my answer is the same: you don’t have anything to hide right now. It goes back to how Google recently changed up some of their targeting. You could log into Google, see your ad settings, and see how Google has siloed you according to the content you watch, search for, and look at. And they have a privacy tab for sensitive information. What I find interesting is that we’ve allowed Google to define what is sensitive. The government hasn’t done it. I didn’t do it. They did it. And in my classes I have people look at that, to see how Google sees them.

And it’s very eye-opening when someone finally sees that, and especially clicks on that sensitive information tab to see what’s there. That changes some minds.

Nolan: And I think – you mentioned this earlier – this speaks largely to the corporatization of all the spaces of American society.

You noted we don’t define sensitive information. We also don’t define what’s social about social media. We as the community don’t set the community standards for a lot of these platforms. And it comes out of this sensibility, especially since the seventies and eighties, where we as a culture increasingly came to believe that corporations were better than the public sector or government.

And I think we’re now kind of living in the era where that’s come to fruition. We’re seeing what it’s like when governments are less powerful than corporations, and dominated by and/or working with them in a lot of cases. And this example, I think, is a prime one: in a more democratic culture, we would think electorally.

Like, how do we want to decide what is sensitive information, or what information should be out there, right? Who should have that information? But instead we’ve sort of entered a space where it’s like, well, let Google decide. It’ll tell us what’s sensitive about my information.

Matt: It’s amazing how much public policy is being generated by big tech, and the government seems to be chasing it. And I don’t know about you, but that last hearing, where they had TikTok in front of Congress – that was just embarrassing.

Nolan: Yeah. This is clearly one of the many flaws in the American style of government: you have to have people who are knowledgeable about the thing they’re regulating.

That is, if you hope to get any regulations in place that are gonna do anything. And I remember, like half a decade earlier – it was 2017 when they had Facebook up there – there was a congressperson who was like, well, tell me how you guys make money. And I was like, oh no, these are the people regulating this commerce.

But yeah, there’s that, and these companies take advantage of it. The so-called godfather of AI, who just left Google – he was discussing the threats he thinks are posed by AI, how he’s voiced these concerns, and how the companies keep moving ahead anyway.

But something else was interesting in the coverage of that, from sources the journalists found. They were saying that the industry has technology it has not released publicly yet, and it starts lobbying members of Congress to make laws that are favorable toward that tech before it gets released on the public.

So in some cases, I mean, it looks like horrible corruption, which definitely exists in the United States government, but in some spaces it’s certainly just ignorance. They listen to these slick tech execs who tell them, you know, how this is different and that is different and this has changed, and they make legislation to reflect that. And then we as the public end up with a highly dangerous, unregulated piece of technology dominating major, important parts of our society.

Matt: Absolutely. And I don’t think the average American knows that among the top five lobbying companies, I think three out of the five are tech firms. They’re big tech.

Google, I think, has for years been the number one lobbyist with a DC location, and then you’ve got Facebook and Microsoft very close behind. I mean, that speaks volumes about who is influencing regulations, who’s influencing laws, who’s influencing the views on privacy.

Nolan: The tight relationship is key.

And we saw this in the Obama years. Much of Obama’s administration left and went into big tech. Chuck Schumer’s daughter worked for Facebook – I think she’s still there. So you have these major connections. And there was an interesting way to prove how dominant big tech is.

I dunno if folks will remember this, but in the waning days of the Trump administration, Trump broke with Republican politicians and said, I wanna give everybody $2,000 checks. And Nancy Pelosi and the Democrats who ran the House were like, that’s great, we would love that too, so let’s put pressure on McConnell and the Republicans who ran the Senate. And McConnell did something interesting. McConnell said, yeah, we’ll put it on the floor, but we’re gonna tie it to the removal of Section 230 of the Communications Decency Act.

And it went down in flames, because McConnell knew nobody would ever vote against big tech. So: I’ll tie it to this, and no matter what, they’ll never vote against big tech. I think that just speaks to the lobbying power of this industry – you can kill any piece of legislation by making it perceived as damaging to big tech.

Matt: Wow. That is amazing. Well, I wanna circle back a little bit, because I was asking you about Google Classroom and Chromebooks and things like that. But I wanna bring it back to a question that you asked: does it help learning? Does having all these Chromebooks in the classroom, and Google Classroom instead of a school-designated learning management system – does that help learning?

Nolan: It’s not really clear in the data whether this is actually helping learning. Now, advocates for these devices will point to certain data about more degree attainment or more passing of classes, which on the surface looks like data that says this is improving learning. But there’s been skepticism amongst educators, because we’ve seen a dramatic rise in grade inflation over the last 20 or 30 years.

And this is because a lot of school policy since No Child Left Behind in 2001 has determined a school’s success based on things like test scores. So if your test scores were good, you passed. And if you were a good teacher, your students passed. If your students didn’t pass, you must be a bad teacher.

Well, especially in a time of economic precarity, what you’ve essentially done is put an economic incentive in front of teachers: if you wanna be perceived as doing your job well, pass the students. And we’ve seen this trend for 20 or 30 years. I think you’ll rarely get educators on the record on this.

But the data’s really clear on this grade inflation. So beyond quote-unquote “passing,” on the other metrics we have – individual studies on critical thinking and things like that – we’re not seeing an improvement in learning. We’re seeing it go in the opposite direction.

And so this is where I think the question needs to start: one, is this even helping? And it’s not really clear, at least at the macro level, that it’s helping – maybe for certain individuals in certain contexts. But interestingly related is also the legalistic question.

How is this legal? I’ve been teaching for over a decade, and one thing that’s been beaten into my head is FERPA, the Family Educational Rights and Privacy Act. These students have privacy rights, and if you break them, Nolan, this is a way to lose your job. And what I mean by that is: I teach college, these are adults, but if their parent contacts me, which happens occasionally, I can’t respond to the parent, because I can’t even admit their child is in my class.

That would break their child’s privacy rights. So I mean, this is how serious FERPA laws are in what we do. So when I was thinking about big tech surveilling the classroom, I was thinking there must be some legal mechanism in place to make sure this data collection stays within the school.

But then I found the loopholes. The loopholes are: FERPA allows schools to share student information with educational partners, and these companies are classified as educational partners. Sometimes they’re precluded from selling data, but this is also a misconception. Selling data is bad, but they can also share data, and they can share data analysis.

So they can give, like, psychographic profiles of individuals or schools and things like that. There are these major loopholes within the existing legal structure. So if you push back at your institution – whether you’re a parent, a teacher, or a student – you’ll inevitably hear, well, this device is FERPA compliant.

You need to push back and say, well, unfortunately that’s not good enough. I’m asking about my privacy. I’m asking about surveillance. Because FERPA, at least as it’s currently written for the 21st century, is not getting the job done.

Matt: Wow, that’s like a gut punch. As soon as you said what the loopholes are – I mean, that is literally like a gut punch, because we’ve reclassified a commercial entity as an educational partner.

Going back to what you said, they’ve been trying to get into the classroom for a century, and it was as simple as getting reclassified that opened the door.

Nolan: Yeah. And it’s also important to remember that these companies are not just involved in education. So, larger companies will buy a platform like Canvas, which I’m sure users are used to.

It’s a learning management system, like Blackboard, Moodle, or WebCT for folks who remember that way back in the day. But these learning management systems are bought up by companies. One of the major companies is EAB – which I think they now use as their full name; it’s no longer an abbreviation.

It just is EAB. And they own a lot of these platforms – like Starfish, for example, which tracks athletes and students to look at their academic progress. They own a ton of these, but they also work for, like, the United States government. They’ve done security work for US government data.

Their majority stake is owned by an equity firm. So this data is not just staying in the educational context. It’s not like your kindergarten trades it to your middle school, which trades it to your high school, which trades it to your college. Every industry possible wants access to this data – insurance companies, et cetera.

And they’re able to get it through these loopholes, because now the US government is part of EAB – it’s part of that educational partnership. That equity firm is part of that educational partnership. They have access to that data.

Matt: That is absolutely amazing. And that’s the thing: when you put the wider lens on this, it starts to get very uncomfortable, just how much we have allowed this open window into the schools. I’ve said for years, and I’ve told other parents, that when you have the internet coming into your home – I hate to use this analogy anymore, but it still holds – it’s like a loaded gun.

If you don’t know how to use it, it will harm you, it will hurt you, it will change your family. You’ve got to treat it that way when you’ve got this pipe coming into your home. Well, we’ve got this pipe coming into our schools, and it’s like the parents are even more clueless about that than about what’s going on in their own home.

Nolan: Yeah. And it literally goes from the cradle all the way to the grave. I’ve been looking at programs like Baby Connect, which try to monitor every aspect of your newborn: their temperature, their likes, their eating schedule – all these kinds of details, these data points, which are seemingly meaningless when you’re a few months old.

But when they’re put together with data all the way up to the age of 18, you can get a pretty substantive profile of who you are as a person. And there are a lot of actors seeking to use that information to nudge or predict user behavior.

This could result in you paying higher insurance premiums, or in you being the target of, like, a propaganda campaign around voting, or something like that. So time and time again, we have to remind people – to your point about once you let this in – there’s been a lot of fear about what’s coming into people’s houses, what their kids are gonna see online.

There should also be a lot of fear about what’s going out of your house, because all your data is going with it.

Matt: Absolutely. Oh man. I thought I knew – you know, you and I could talk about this, and I know quite a bit just from the ad side and the commercial side. But I love the research that you’ve done into the legal side of this.

It’s almost like – well, it’s not almost – there is no fear among corporations, among anyone, about just stepping in and accessing or collecting data. It seems like right now it’s just this massive collection phase of all this data, and we have yet to see what’s going to come of it. And you keep alluding to that – that in the future we could see something like what they’re doing in China with social credit. If all we’re doing is collecting right now, there’s got to be a point where something changes.

Nolan: Yeah, and that’s an interesting point, to bring up social credit, because you really have to go back and look at the credit card companies, who I think pioneered a lot of what could be done with data starting in the 1980s. In particular, they wanted to develop these giant credit bureaus because they were worried about giving credit to people who wouldn’t pay, or wouldn’t pay the interest when they fined them for not paying. These huge credit bureaus now control massive amounts of data and have gotten into other industries. So I think they documented a way to use this. And they are the ones who get to allocate credit scores.

Like, if you work for, whatever, a car lot, and someone wants to buy a car, typically that car lot goes to the credit bureau. The credit bureau gets to make some unaccountable decision about whether or not you’re worthy enough to have a car. And, you know, we all know how little we trust credit card companies – except the country did vote in Joe Biden, the senator for MasterCard.

But with these credit card companies, you could easily see how that can be replicated in other spaces moving forward, where people just rely on the bureau – they just rely on the algorithm. Virginia Eubanks has written a lot about this. She wrote a book called Automating Inequality, which I think is a great book.

And she pointed out how there are biases in the algorithms that governments use to distribute social services, and the ultimate result is that fewer of the people who need social services actually get them. And when people rightly complain – hey, I’m a poor person and under the law I deserve this, I’m getting screwed.

the office says, I’m sorry, the algorithm says you don’t qualify. And that’s how you end up with that sort of social-credit-score infrastructure.

Matt: Absolutely. Yeah. And this is the dangerous part about AI, the dangerous part about the algorithm, quote unquote: it’s a black box, and we don’t know what went into it.

We don’t know the framework, we don’t know the rules, we don’t know the instructions, we don’t know what it was trained on. And without that transparency, that’s the thing: credit card companies can say whatever they want. They can do whatever they want. We have no real ability to push back on that.

I’ve got friends that have been trying to clean up their credit score for two years, and it just seems like it’s stacked against them. It’s a huge burden, because ultimately they want to buy a house, they want to move on, they want to build the dream. But there’s this arbitrary number hanging over their entire lives right now. And we don’t know what’s in the box, we don’t know what’s driving it, yet more and more of our lives are being run by it.

Nolan : Yeah, absolutely. And it speaks to that corporatism we were mentioning before. Whether it’s the credit card companies, or Facebook, or Canvas at your kid’s school, these companies are largely able to operate in secret because they’re private industry, and that means free of government and democratic oversight.

We don’t get a vote at Canvas, we don’t get a vote at Facebook, we don’t get a vote at the credit card company. But they have huge influence over our lives and our elected officials in a way that we do not. And I think that speaks to where this kind of corporatism has ultimately led us: further away from democracy.

Matt: Amazing. What else is going on in schools with technology? As a parent, what’s something I need to be more aware of?

Nolan : At some level, all parents will get pushback from schools saying that they use tech for a multitude of reasons, maybe to turn things in digitally.

They have to track grades at some level. These are things that students are going to want to be able to transfer to their next institution, or that they may need a record of. So I think trying to eliminate any digital trail of yourself entirely is a fool’s errand.

Society wouldn’t be able to operate, and I simply think you’re going to fail in that effort as well. But there are steps I would encourage parents to think about. One of them I already mentioned: when you’re asking questions about new platforms entering your kid’s school, ask, is there a way to opt out? If not, argue that in future negotiations with these companies there should be an opt-out for students.

Ask about FERPA, and like I said, I’m sure you’ll be told it’s FERPA compliant, but say, okay, do you have guarantees that they won’t share data? I’ve also been a big advocate for this: look, if they’re going to use your data and profit from it, demand a data dividend. Make sure every student gets a paycheck every single year from the data that’s being taken.

At least get some pay for your exploitation.

Matt: I love that. Yesterday I was looking at all the school levies around the area; most failed and only a few passed. Well, there’s a way to fund the schools right there. If they’re going to pull that much data, we could charge by the terabyte. When you look at how much is being pulled out of these schools, that’s one way to fund them.

Nolan : Yeah. And some schools have developed their own proprietary software. So if we do need these tools, why don’t we develop them in-house, or have them developed so they’re only for this institution?

That way we can, well, not guarantee, but promise in a more direct way that we have more oversight over this data than outside or third parties do. So I think those are some important steps that folks can take. I’d also suggest using open source, privacy-focused browsers like Tor that don’t retain your data, and talking to kids about encryption and password protection.

All the while knowing you’ll never completely eradicate digital footprints, but trying to minimize them, or at least disperse your data across multiple platforms versus consolidating it all into the hands of a few. I think these are ways we can start to address this problem responsibly.

Matt: Absolutely. I like the idea of moving toward open source browsers. And I hope this is becoming part of the curriculum more and more. I know at our school they’re developing more literacy-based material, or incorporating it into some of the classes.

But from what I see, it’s not enough, especially if we’re raising the next generation that’s going to be in the job market and moving up. Looking ahead twenty years, even for those graduating now: do they have enough of an understanding of digital literacy to make good decisions as managers, as leaders? Do they understand enough about privacy, about what’s happening with corporatization, and how that’s going to affect public policy even more?

Nolan : Yeah. And I think that’s why parents and students need to be reminded of their power. Schools do not want to hear from angry parents or students. It’s an irritation, it gets in the way, which is exactly what activism is supposed to do. So if you’re a parent or a student, I’d highly encourage you to bring these things up.

We see this at some level in states across the country, right? When parents get irate about CRT or drag queen story hour or whatever conservatives are complaining about now, parents get angry about those things and they draw a lot of attention from lawmakers.

I think there’s a case that parents and students could do something similar, and in my estimation more constructive, in this space by appealing to school districts. And then there’s another piece I think we haven’t really discussed. It’s somewhat self-serving on my end, but I think unions need to reflect on the threat that surveillance poses to academic freedom.

Anybody who’s ever taught knows that part of teaching necessitates a space where people feel safe to work out ideas. I don’t know of a classroom I’ve ever been in where students didn’t make mistakes. They have to make mistakes in order to learn. But if they’re terrified of being surveilled, or of another student recording them on their phone, whether that paranoia is justified or not,

that undermines the learning process. And the same thing goes for a teacher. Sometimes you play devil’s advocate, you work ideas out loud, you say things you know are ridiculously false because you want to test the students’ ability to call you out on it with their evidence. If you’re concerned those things could ever be used out of context against you, that’s a tough dynamic to get rid of in the classroom, because sometimes you end up in a classroom with a homogeneity of ideas.

So you have to play the outsider to make sure they hear different views. If your only safe recourse is to preach to the choir to save your job, that’s great, you’ll get a paycheck, but there won’t be a lot of learning involved. And the inverse is true too: a lot of faculty have lost their jobs, or had their jobs threatened, because of things recorded in the classroom out of context.

And so I think we need all the unions to take this issue more seriously in their protection of academic freedom.

Matt: That is an excellent point. Especially when most students are bringing their phones into the classroom, as you said, so many things can be recorded and taken out of context.

But I love the approach you’re taking, that this is supposed to be an area where ideas are challenged, where we have to be able to explain our position. This is the beginning of understanding things, and that’s what teaching is all about. And on the concept of students having phones in the classroom, I think some districts have made changes there, just because of the things you hate to see hit online and go viral, like a fight or something. One thing you notice is everyone’s got their phones out.

So it seems like there is nothing private inside the school walls, which brings up a whole host of things. But I love that professional standpoint: if I’m supposed to do my job better, then where’s my support? Where is the union, and how will they protect me so I’m able to do that? That is such a great point.

Nolan : Yeah. And at some level, we as teachers and schools and unions have unfortunately fed into these threats to academic freedom. During Covid, there was a rush to remote education. This isn’t so much an argument against lockdowns as it is to say that we rushed to remote education without really discussing: is this the right thing to do?

For these students? This is a question I’m asking, not a point I’m making, but I wonder: would it have been better to lose school for six months rather than put students into online spaces and normalize this type of learning? Those are the kinds of questions I don’t think we wrestled with.

There was a push online, and there were some of us who voiced concerns about mandating Zoom and these tools. But for the most part, we were run over, and the message was: this has to be online. And in that way we taught students that this is normal.

We can use these devices in our home; look, the teacher is using it in their home, you’re using it in your home. And after a while, I kind of worry that we lost a lot of the foundation we would’ve had to be taken seriously when we as faculty and unions make this point.

Matt: Absolutely. That was one thing that absolutely amazed me: the immediate adoption of the technology, of cameras, of students at home, everyone able to see into everyone else’s home. And there wasn’t a debate. If anyone raised any objection, any question about whether this was the best way, it was immediately shouted down, and the change was instituted anyway.

And I’m kind of looking back on that going, where did that come from? There was just this rush, and unfortunately very few teachers were prepared for what it takes to teach online. I think too many of them thought it was just a matter of leading the class on camera, but the curriculum has to be different.

Your approach has to be different, how you interact is different, and I think the vast majority of teachers were not trained or ready for that kind of move. I like your idea: let’s put a pause on this until we know what’s going to work.

Nolan : Yeah, I’d been teaching, gosh, like eight years by the time Covid hit.

And this speaks to your point about most teachers not teaching online. When I first started teaching, I learned how to teach remotely, because those were the classes I was offered. A lot of the old-timers didn’t want to teach remote, but the campuses wanted to offer more remote classes.

That’s actually how I got into teaching, by saying I knew how to teach remote. And to your point, once we went online, there were haphazardly put-together trainings for faculty members. I’ve talked to umpteen students who tell me they had to help their instructor learn how to use these platforms.

Which is great that students did that, but I think it speaks to how we rushed to do something without the preparation for it, or without asking the necessary questions. We just said, all right, it’s Zoom, let’s go.

Matt: Exactly. And honestly, doing that defeats so much of what you’re taught in education.

I mean, when Covid hit, I was right in the middle of my master’s program in education. And what I was learning is that you set up the curriculum first: you’ve got your theory, your objectives, and how you’re going to deliver it. That jump to Zoom was completely contrary to everything that’s taught in theory and in practice. It was dizzying how quickly it happened.

Nolan : We’ll give you your methodology first so you can get your theory and goals later.

Matt: Exactly. Yeah. Do it over the weekend. Figure out how this works.

Well, Nolan, this has been a great conversation, and I look forward to connecting with you again as soon as I crunch through Media & Me. We’re going to follow up and go over some of this, because, again, I don’t think we can have enough information about media literacy, about digital literacy.

And we don’t have enough of those conversations, professionally, personally, in all areas of life. So we’ll definitely have you back to talk about that.

Nolan : Absolutely. I appreciate it and I’m working with Alison Butler on putting some of this research into a book and you can check out more of our research at nolanhigdon.substack.com.

I appreciate being here. You’re one of the few spaces, if not the only space, that holds these conversations. Thanks, Matt.

Matt: Oh, I appreciate it, and thank you so much for mentioning that; I was going to ask you. The Substack link, I will put that in the show notes so that we’ve got a direct link right over there.

And thank you for putting that research out there. That is fantastic. Maybe someday we’ll have a conversation about all these research repositories online, about getting access to academic research and what a scam it is.

Nolan : Here’s the next book, Matt. We can co-write it.

Matt: Exactly. Nolan, hey, I hope you have a great day.

Thank you so much for coming on the podcast today. All right, and thank you dear listener for tuning in to another edition of the Endless Coffee Cup podcast. And until next time and the next cup of coffee, I’ll see you later on the Endless Coffee Cup.