[I’m so ready to NEVER again write about Siri and abortion]
by Julie Rovner
Just because you’re paranoid doesn’t mean they’re out to get you.
Feminists are paranoid nitpickers, amirite? Sigh.
That could be the motto this week for abortion rights groups that immediately sprang into battle mode when it was discovered that Siri, Apple’s new artificially intelligent personal assistant, wasn’t so, well, intelligent when it came to abortion.
And feminists painted as militant. Yay! Because women can't possibly draw attention to systemic failures in a major piece of technology without "springing into battle mode." (But, of course, even my writing this post tonight is simply evidence of being in battle mode, I guess.)
It turns out, however, that it was all much ado about not so much.
Except that you are about to tell us why Siri is a problem, in ways people hadn't considered on a large scale (one that goes well beyond information about abortion), especially since Apple markets Siri as if it were backed by a powerful database, à la Google:
True, Siri does fail to find multiple abortion providers in large cities like Washington and New York City. She (the voice is female) also tends to send inquirers instead to far-flung crisis pregnancy centers, which not only don't do abortions, but also actively work to dissuade women from having the procedure.
In a letter to Apple CEO Tim Cook, NARAL Pro-Choice America President Nancy Keenan complained that Siri “is not providing your customers with accurate or complete information about women’s reproductive-health services.”
"complained"? Come on. You know you could have picked a MUCH better word than that. But then, we all know feminists are complainers.
Some advocates wondered if there was a conspiracy afoot.
“There was conjecture by many colleagues that perhaps they were afraid of the anti-choice community, or perhaps, as one put it, it was a bunch of dudes programming, who didn’t have any notion of what women might need,” said Jodi Jacobson, editor-in-chief of RH Reality Check, a leading abortion-rights website.
But it turns out not so much.
“Siri is a dumb tool,” says Damon Poeter, a reporter for PC Magazine who wrote a story about the dustup.
“Dustup.” Okay, I see that word choice there. Way to belittle people’s concerns via your diction.
By that he means not literally dumb, but perhaps not as smart as she sometimes seems. For example, Siri doesn’t do searches based on Google, which Apple sees as a rival, but on other, newer search engines.
But Google, he says, “has had a decade to refine its results and get smarter and smarter about deciding what people actually want when they do searches. Siri is still in its infancy, so we’re going to see things like this happen.”
So, then, why don't you talk about the way Apple markets itself? People were genuinely confused by Siri's lack of information because they were led to believe that Apple used something akin to Google to power it. In fact, Siri relies on Yelp, whose listings are user-generated (this linked article is WAY more informative than this NPR report, and I suggest you read it if you're interested in learning more about all this). And clearly, the users who add content to Yelp are much more interested in strip clubs and escorts than abortion clinics.
There’s another reason Siri may be favoring crisis pregnancy centers — they tend to use the word abortion a lot, while actual abortion clinics may not, if only to avoid protesters.
“So when Siri goes out into the Internet looking for what an abortion center is or what an abortion provider is, it hits on these non-abortion-providing organizations because they’re the ones who use the word to underlie their websites,” Poeter says.
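Poeter's point about keyword matching can be illustrated with a minimal sketch. This is purely hypothetical (nothing here reflects Siri's actual internals, and the page names and text are invented): a naive ranker that scores a page by how often the query term appears will rank a crisis pregnancy center that repeats the word "abortion" above a clinic that avoids it.

```python
# Hypothetical illustration only -- NOT Siri's actual code.
# A naive term-frequency ranker: score each page by the raw count
# of the query term in its text, then sort by that score.
def rank_by_term_frequency(query, pages):
    """Return page names sorted by how often the query term appears."""
    scores = {name: text.lower().count(query.lower())
              for name, text in pages.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Invented example pages: the clinic avoids the word; the center repeats it.
pages = {
    "actual_clinic": "Women's health services. Compassionate, private care.",
    "crisis_center": "Abortion? Considering abortion? Abortion facts here.",
}

print(rank_by_term_frequency("abortion", pages))
# → ['crisis_center', 'actual_clinic']
```

Which is exactly the failure mode described above: the organization that doesn't provide the service wins the keyword match.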
Jacobson agrees that abortion-rights forces need to do a better job getting information out.
Yay, pro-choice advocates painted in militaristic tones…again.
“I think this is telling about a broad set of issues about the reliability of information on the internet, and the ways in which we might be self-censoring, that then lead us to have search engines that don’t fully inform us about what we need,” she said.
And now let's try to point blame away from Apple and twist Jodi's words to mean that blame should also be foisted on the pro-choice movement. Of course, underlying what Jodi is saying is the reality that there is a large group of people that constantly threatens and harasses abortion providers, sometimes even killing them (clinics' self-censoring is not simply about "avoiding protesters"). It's not simply about "getting information out," and I don't think Jodi's quote implies that pro-choice advocates have to do a "better job." That's an unfair assessment, and one that shouldn't be made without taking the time to explain WHY Jodi mentions self-censoring.
On the other hand, many women’s groups have lamented that many tech products, Siri included, are clearly designed by men, primarily for men.
Indeed, a quick experiment from NPR headquarters in Washington using a colleague's borrowed phone found Siri unable to find a single birth control clinic ("sorry about that"), but 16 drug stores where Viagra could be purchased ("12 of them fairly close to you!").
Also, as Amadi and the Abortioneers have made VERY clear, this isn't simply about abortion. Siri fails when it comes to mammograms and rape, too. To act like this is SIMPLY about the controversial topic of abortion is to skate around the fact that Siri has LARGE database failures, especially in the face of Apple's contention that Siri will provide you with the information you ask for.
And as I've said every time I've written about this topic over on Keep Your Boehner Out of My Uterus, this isn't about punishing Apple or whining because that's what ladies do; it is, in fact, about revealing the way knowledge moves in our society, how some of it is privileged and some not, and about changing Siri for the better. So, whether pro-choice advocates were battle-fierce paranoid complainers or not, it took only four days from the Abortioneers' original post to Apple's response in the NY Times. That's a win.
This article, though, is a fail.
[NB: the radio version of this report was much better, leaving out lots of the problematic language. Hmmm.]