Hopefully the next places will be more durable. It is still SAD and damaging when vibrant communities get destroyed though. I am more lamenting that.
People haven’t adjusted yet to the reality that online social ecosystems matter; they affect so much in the real world. Decimating multiple online spaces in such a short time has consequences, and I hate that a handful of random guys with no stake in any of it except money get to make decisions like that.
You have articulated exactly how I feel whenever I see that word in a headline haha.
I feel you’re coming at this from an abstract angle more than how these things actually play out in practice. This isn’t reliable software, it isn’t proven to work, and the social and economic realities of the students and families and districts have to be taken into account. The article does a better job explaining that. There are documented harms here. You, an adult, might have a good understanding of how to use a monitored device in a way that keeps you safe from some of the potential harms, but this software is predatory and markets itself deceptively. It’s very different than what I think you are describing.
Yeah, I just fundamentally don’t think companies or workplaces or schools have the right to so much information about someone. But I can understand that we just see it differently.
An issue here for me is that the kids can’t opt out. Their guardians aren’t the ones checking up on their digital behavior; it’s an AI system owned by a company, on a device they are forced or heavily pressured to use by a school district. That’s just too much of a power imbalance for an informed decision to my mind, even if the user in question were an adult. Kids are even more vulnerable. I do not think it is a binary choice between no supervision and complete surveillance. We have to find ways to address potential issues that uphold the humanity of all the humans involved. This seems to me like a bad and also very ineffective way to meet either goal.
Kids going to school cannot reasonably be expected to have the knowledge, forethought, or ability to protect themselves from privacy violations. They lack the rights, information, and social power to meaningfully do anything about this. That’s why it’s exploitative and harmful. Edit: that’s also to say nothing of the chilling effect this is going to have on kids who DO need to talk about something but now feel they have to hide it, or feel ashamed of it. Shit is bad news all around.
This is awful. Surveillance is not a replacement for childcare. How many times must people say it. It is also not a replacement for managing employees or any other thing. I hate this timeline.
This was a great read. These dynamics are so prevalent.
Yep, I use it. I like it.
Thanks for posting this context. I’ve been wondering about this aspect of the event.
Ummmmmmmm. This seems illegal. Is this not illegal?
!!! Yay for them! This is great to hear.
Exactly this. I hope that privacy is talked about in these terms more often going forward. It’s about so much more than what’s illegal or might put a person in danger. We cannot be authentic and connect to one another if we are constantly wary of some imaginary audience!!! Surveillance and what BuzzFeed called “panopticontent” have absolutely wrecked and flattened self-expression in so many contexts where people used to be so vibrant. People who say “I have nothing to hide” miss the point entirely.
Tears of the kingdom over here. It’s bringing me joy
I think it’s also relevant that when I was growing up, people regularly switched between public and private depending on life circumstances, friend groups, etc. It was billed as a way to control whether people could see your posts or not, NOT as a way to revoke or grant Facebook or any other entity any specific permission. It served a social function, at a time when this kind of AI did not exist. They changed the meaning of that setting on us years after the fact, and I have not seen any article address that. No teenager in 2011 was thinking of the private/public setting as consent for AI use, and none of these articles talk about pictures that were set to private after being public for a while. It’s bad faith.