One day I will go on a rant about our society’s weird attitude toward neutering male dogs v spaying females but it’ll have to wait until I am no longer dealing with the blood loss.
I know a lot of people have way worse periods than I do. And usually mine aren’t too bad, a lot of cramping but not vomit-through-my-nose bad. I have to take iron supplements and usually it’s fine but right now
I keep getting dizzy. I feel limp. I took a really simple phone call and went upstairs, taking the elevator up and the stairs down and now I’m wiped out.
I need some hemoglobin. Nobody make me do anything else.
I feel like so much of trauma recovery for adults who were traumatized as adults is “you are not the only one who feels this way, this is normal.”
But for adults who were traumatized as children, there is an important experience of learning that it is, in fact, not normal. It’s good to not feel alone, to find people who get it and to not feel crazy, but the process of going through who you are and picking out the things you had accepted as normal that are definitely not is so important.
I’ll never forget the moment that I realized people have happy dreams, like frequently. Or the day I realized that most people sleep through the night like almost all of the time. Those were really sad moments, but they were really important too, because they made me understand that I am not crazy. Something bad happened, and it changed me.
It’s okay to be different, it’s okay if the trauma changed you, but if you are running a race with a broken leg and beating yourself up because you aren’t as fast as the other runners, something needs to change. Realizing you are different is the first step to figuring out how to heal.
Now, the alarming aspect of this story is that the very same technology is probably what Tumblr is using to identify porn. If it can’t tell that an empty field is not, in fact, full of sheep, what hope do we have that it can tell an empty room isn’t full of writhing human forms engaged in passionate coitus?
this really does sound like an episode of black mirror
This is gonna produce some absolutely baffling pornography.
…. oh my fucking god they actually are using open source software. They’re using a fucking one-layer unidirectional bicategory tag-trained neural network. This will never work. Literally, it will never work. There’s just not enough algorithmic complexity to do what they’re asking of it. I bet you I could prove on a mathematical level that this joke of a neural net fundamentally lacks the abstraction necessary to do its job.
This will never get better. Their algorithm will never stop fucking up, it will never actually flag porn reliably and it will always require a massive quantity of human hours to deal with the deluge of mistagged pictures. This isn’t just a case of an insufficiently trained algorithm, it’s just … this is the most basic neural network you can make. It probably has a lot of neurons and loads of training data but like … you can’t just brute force this kind of stuff. One layer of neurons is just Not Enough.
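(For anyone who’d rather see the capacity problem than take my word for it, here’s a minimal sketch in plain numpy, assuming nothing about Tumblr’s actual code: a single layer of weights going straight from input to output is just a linear classifier, and it can’t even learn XOR, the textbook non-linearly-separable problem, no matter how long you train it. Add one hidden layer and the exact same problem becomes trivial.)

```python
import numpy as np

# XOR: the textbook problem no single linear boundary can separate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_single_layer(epochs=20000, lr=0.5, seed=0):
    """One layer of weights from input straight to output: a linear model."""
    rng = np.random.default_rng(seed)
    w, b = rng.normal(size=2), 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad = p - y                       # cross-entropy gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return sigmoid(X @ w + b)

def train_two_layer(hidden=8, epochs=20000, lr=0.5, seed=0):
    """Same problem, same training loop, one hidden layer added."""
    rng = np.random.default_rng(seed)
    W1, b1 = rng.normal(size=(2, hidden)), np.zeros(hidden)
    W2, b2 = rng.normal(size=hidden), 0.0
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        d2 = p - y                          # output-layer error
        d1 = np.outer(d2, W2) * h * (1 - h) # backprop through the hidden layer
        W2 -= lr * h.T @ d2 / len(y)
        b2 -= lr * d2.mean()
        W1 -= lr * X.T @ d1 / len(y)
        b1 -= lr * d1.mean(axis=0)
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)

print("one layer: ", np.round(train_single_layer(), 2))  # stuck around 0.5 on every input
print("two layers:", np.round(train_two_layer(), 2))     # close to [0, 1, 1, 0]
```

Scale “XOR” up to “is this arrangement of pixels porn” and the gap only gets worse; throwing more neurons into the same single layer just buys you a bigger linear boundary, not a smarter one.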
Also, just to make this clear, Tumblr lied. I mean, we already know this, but I mean they liiiieeeeed. All that stuff they promised about what would or would not be censored? That cannot be delivered on with a system this simple. Nude classical sculptures, political protests, male-presenting nipples (really Tumblr?), nude art outside the context of sex, all that? You cannot train a bicategory one-layer neural network to exclude those things. It cannot be done. Tumblr never intended for those things to actually be permitted, they were just lying. Because the system they have cannot actually do what they said it would and never will be able to.
Also, this kind of system is super vulnerable to counter-neural strategies. I bet you before the end of the month someone hooks up their own open source one-layer bicategory neural network which puts an imperceptible (to humans) layer of patterned static over arbitrary images, and trains it by having it bot-post static-ed images to Tumblr and reinforcing based on whether the images are labeled nsfw or sfw. Seriously, within a month someone will have an input-output machine which can turn any image ‘sfw’ in Tumblr’s eyes.
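If you want to see the shape of that attack, here’s a toy sketch, again in plain numpy. The `is_flagged` oracle is entirely hypothetical (standing in for whatever bot-post-and-read-back-the-tag loop someone actually wires up), and a real attack would be smarter than blind random search, but structurally it’s this simple:

```python
import numpy as np

def evade(image, is_flagged, eps=4 / 255, iters=2000, seed=0):
    """Toy black-box evasion: random-walk a tiny perturbation around the image,
    using only the classifier's yes/no answer as feedback, until it says sfw.
    `is_flagged` is a hypothetical oracle (e.g. a bot that posts the image and
    reads back the tag); `image` is a float array with values in [0, 1]."""
    rng = np.random.default_rng(seed)
    delta = np.zeros_like(image)
    for _ in range(iters):
        step = rng.normal(scale=1 / 255, size=image.shape)
        delta = np.clip(delta + step, -eps, eps)   # keep the static imperceptible
        candidate = np.clip(image + delta, 0.0, 1.0)
        if not is_flagged(candidate):              # the only signal the attacker needs
            return candidate                       # looks identical, now reads as sfw
    return None                                    # gave up: more iterations or a bigger eps
```

The only thing the attacker needs here is the label the system already hands back on every post, which is exactly the feedback loop described above.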
This is genuinely pathetic. Like, I have real pity for whoever implemented this, because it’s clear Tumblr doesn’t actually have any engineers with machine-learning expertise left at all, and they foisted the job off on some poor bastard who has no idea what they’re doing and is going to get all kinds of flak from management for their (perfectly reasonable and predetermined) failure.