
Balloon Juice

Come for the politics, stay for the snark.



Siri and Apple: Abortion Madness

by Imani Gandy (ABL) | November 30, 2011 | 6:35 pm | 100 Comments

This post is in: Science & Technology, Vagina Outrage, Seriously


Really? Can’t we all get a grip?

Look, I’m all for equality, and I’m as pro-choice as a woman can be. But this kerfuffle over Siri not providing abortion locations on the iPhone 4S is a serious mountain/mole-hill situation.

Apparently, Apple has been catching flak from pro-choice advocates because Siri does not provide information (or fails to provide correct information) about abortion and other reproductive health services for women.

Some are disappointed that Siri doesn’t provide information about reproductive services while providing answers to such male-centric questions as “Where can I find Viagra?” and “Where can I get a decent blowjob?” and “Where can I look at some naked boobs?”

AlterNet has a list of searches that Siri recognizes, and none of them include abortion services (and most of them seem to me to be the result of cheeky programmers):

  • Viagra.
  • Hospitals to go to if you’ve had an erection lasting for more than 5 hours.
  • Places you might be able to score marijuana.
  • Where to dump a body: in Brooklyn, it recommends a smelting plant in New Jersey.
  • The meaning of life: Siri will alternately quote from Douglas Adams (42) or Monty Python’s “The Meaning Of Life.”
  • What to do if a hamster is caught in your rectum: in D.C., she’ll direct you to Charming Cherries Escort Service.
  • Asked how to obtain a free blow job in D.C., she’ll direct you to the same escort service. (We doubt that they are free.)
  • If you’d like to see a naked woman in Brooklyn, Siri will suggest a variety of Manhattan-based strip clubs.
  • If you’re in Queens and seeking breast implants, she’ll recommend 4 local plastic surgeons.
  • But if you ask Siri about vaginoplasty, she’ll scold you about your language.

Some folks have, apparently, gone ’round the bend, claiming that Siri is pro-life, and that this Siri kerfuffle demonstrates that Apple has an anti-abortion agenda or is a conservative-run corporation. (Sure — a conservative corporation with a gay CEO — ohhhhkay.)

From Raw Story,

“Many of these centers are not up front about their anti-abortion, anti-contraception agenda when advertising online or in other channels,” Nancy Keenan, president of NARAL Pro-Choice America Foundation, noted in a letter to Apple CEO Tim Cook.

“Siri is a great tool that mixes humor and sarcasm in responding to questions—and it is another example of how your company is on the cutting edge of demonstrating how technology can transform the way we share and access information,” she said. “Thus, it is disappointing to read that a tool like Siri is missing the mark when it comes to providing information about such personal health issues as abortion care and contraception.”

“Although Siri is not the principal resource for women’s health care, I hope you agree that it is important that the women who are using this application not be misled about their pregnancy-related options.”

Frankly, I don’t see what the big deal is. If you need an abortion and Siri won’t tell you where to get one, then find the information the old-fashioned way — use your fingers and type in a search query. This doesn’t strike me as an anti-abortion agenda — at all.

Technology is a male-dominated field, and it seems to me that this Siri mix-up is the result of male privilege (Viagra and blowjobs?), rather than nefarious Christian right anti-abortion censorship. Moreover, Apple did not develop Siri itself — Apple bought the company that developed the software in April 2010. And as commenter John Woods on the blog The Abortioneers notes:

1. There is a very good chance the initial training dataset used for Siri’s AI is male-biased. This is a huge problem in the tech world. Remember how Google+ was opened up to Silicon Valley insiders first, and ended up 90% male?

– This is something most data people try to avoid, but remember that Apple didn’t develop Siri — Apple bought another company that had developed the product.

– The data Siri collects is fed back to Apple and will probably be used to improve the product in the future.

2. In general, developers of AI really do not like to intervene when their algorithms give bad or biased results. Consider “santorum” on Google — Google isn’t going to remove it because (a) they want to be consistent in their handling of terms-of-service violations (and non-violations) and (b) they don’t want to adjust their algorithm every time something like this comes along, since it biases the AI further down the road in unexpected ways.

3. I would guess the search for “birth control” allows Siri to make an association with “health” for that search term, and it has been well-trained to search for “hospital” and “clinic” when it finds a health term. But if I wanted to find birth control, I wouldn’t search for a “birth control clinic.” I’d probably search for a clinic or a Planned Parenthood. Siri isn’t smart enough to make that association (yet).

– This is not to say that there isn’t a clear societal bias toward male services over female services. You can probably go just about anywhere to get Viagra prescribed, and would find yourself somewhat more limited when trying to obtain a birth control prescription.

In conclusion, Apple is probably using a biased dataset. I doubt this was Apple’s choice. But you should definitely let Apple know about it, because they may be able to correct some of the biases in their training data.
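Woods’ third point describes, in effect, a learned keyword-to-category association. Here is a minimal sketch of how such an association could misfire on exactly the queries he mentions (entirely hypothetical: Siri’s internals aren’t public, and these tables are invented for illustration):

```python
# Hypothetical sketch of the keyword-to-category association Woods describes.
# None of these tables reflect Siri's real internals; they only illustrate
# how a naively trained association can miss "Planned Parenthood"-style queries.

HEALTH_TERMS = {"viagra", "erection", "birth control", "implants"}

def fallback_search(query: str) -> str:
    """Map a spoken request to a local-business search string."""
    q = query.lower()
    for term in HEALTH_TERMS:
        if term in q:
            # The model learned: health term -> look for a "<term> clinic".
            # But real providers rarely list themselves that way, so the
            # search comes back empty or matches mislabeled businesses.
            return f"{term} clinic"
    return q  # no learned association: the raw query falls through unchanged

print(fallback_search("Where can I get birth control?"))          # birth control clinic
print(fallback_search("Where is the nearest Planned Parenthood?"))  # falls through unchanged
```

The point of the sketch is Woods’ own: nobody lists a “birth control clinic” in a business directory, so a term-to-category rule that works fine for “hospital” queries produces garbage here.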

Woods’ theory makes sense. As Raw Story notes,

Norman Winarsky, who co-founded the firm that developed Siri and sold it to Apple in 2010, told the New York Times that the third-party services Siri uses to generate its answers were probably to blame for the software’s anti-abortion bias.

“Those answers would be coming from the Web services that Siri is connecting to, not necessarily Apple,” he said. “My guess at what’s happening here is that Apple has made deals with Web services that provide local business information, and Apple probably hasn’t paid much attention to all the results that come up.”

Are Silicon Valley and the companies based there male-centric? Absolutely. Are there issues of male privilege at work here? I wouldn’t doubt it. But can we dial down the “OMG! APPLE HOW COULD YOU!?” rhetoric and just wait for the next software update? And resolve to use our smartphones the way our forefathers did? By typing with our fingers?

That’d be great.

Then again, who knows? I’m probably just an Apple apologist.

Update: For the record, I agree that the male-centricness of the tech industry is a problem that should be discussed and ameliorated to provide opportunities for tech-inclined women. I’m not saying that male privilege is “ok.” I’m saying that concluding that this is some anti-abortion conspiracy or that Siri is pro-life is a tad ridiculous.

[via Raw Story]

[cross-posted at Angry Black Lady Chronicles]

Reader Interactions

100 Comments

  1. 1.

    Soonergrunt

    November 30, 2011 at 6:41 pm

    I blame Obama.

    It’s stupid and reflexive, but somebody’s going to do it, so it might as well be me.

  2. 2.

    ABL

    November 30, 2011 at 6:43 pm

    also, siri is racist.

    (might as well say that, too.)

  3. 3.

    joeyess

    November 30, 2011 at 6:44 pm

    It’s not Apple, but it is the app. When asked for abortion services, one choice given is these so-called “pregnancy crisis centers”.

    So I guess that can be viewed in a negative light.

  4. 4.

    Jon O.

    November 30, 2011 at 6:44 pm

    So is the problem here that Siri is designed in such a way that it doesn’t offer adequate information on reproductive services? Or is the bigger problem that people on the Internet are complaining about it?

    Telling people not to be so uptight is often another way of delegitimizing their complaints. I’m not saying you’re doing that intentionally, but yeah, recognize that it is an issue, and it’s worth looking at, rather than choosing to confront the minority of people asserting a patriarchal conspiracy.

    And yes, you can just as easily type it into a search! But the idea behind Siri is that you don’t have to anymore. It’s really revolutionary, and seeing my friends make use of it is the first time I’ve ever wanted to leave the Android pasture. This is the future of search. If it’s not serving society in the ways it should, people need to hear about it.

  5. 5.

    Brachiator

    November 30, 2011 at 6:45 pm

    Places you might be able to score marijuana.

    This is sexist? People need to Seri-ously get a grip.

    Are Silicon Valley and the companies based there male-centric? Absolutely. Are there issues of male privilege at work here?

    Odd. I did some contract work for google. All my bosses were women. This was one of the least conventional environments where I have ever worked.

  6. 6.

    Joel

    November 30, 2011 at 6:46 pm

    People actually use this thing?

  7. 7.

    Odie Hugh Manatee

    November 30, 2011 at 6:46 pm

    @Soonergrunt:

    This wouldn’t have happened if Obama had used the bully pulpit…

    and beaten someone to death with it.

    Damn you Scott Beauchamp!

  8. 8.

    Cris (without an H)

    November 30, 2011 at 6:46 pm

    Amadi puts it well. While she’s clearly irritated by this (irritated enough to document it pretty thoroughly), she says:

    Is this the most terrible programming failure ever? No. Is this worth a boycott of Apple? I don’t think so. What it is, however, is a demonstration of a problem.
    …
    This isn’t just about gender. This is about something more esoteric and far far less simple to explain.

  9. 9.

    pseudonymous in nc

    November 30, 2011 at 6:46 pm

    It’s not Apple, but it is the app. When asked for abortion services, one choice given is these so-called “pregnancy crisis centers”.

    It’s not the app, it’s the dataset.

    I’ve been thinking about this today, and after my initial upset, I really think it’s a GIGO problem — that is, the dataset is muddied and corrupted by the fact that providers tend to describe themselves euphemistically (for obvious reasons) and faux-clinics do a good job of pretending to be the genuine article, no different from domain hijackers, content mills and black-ops SEO.

    So in this case, the pool of information is particularly fucked up — to some degree, deliberately so — and Siri is a window into its fuckedupitude.

  10. 10.

    Brian S

    November 30, 2011 at 6:46 pm

    When I first heard about this, my immediate reaction was “probably a bug in the software, and I’ll bet in a couple of weeks, it’ll be sorted out.” I see no reason to change that reaction at this point. It’s a new product. It would be more surprising if it didn’t have a bug of some kind in it (even if that bug was more of the “the programming team is a bunch of sexist jerks” variety).

  11. 11.

    Jenny

    November 30, 2011 at 6:46 pm

    This isn’t surprising considering Steve Jobs simply abandoned his daughter and fought recognition of paternity.

  12. 12.

    ruemara

    November 30, 2011 at 6:47 pm

    I’ve worked in tech for about–SWEET GODDESS–20 years. They don’t know women exist until they are hungry, or want to fuck. Even after they are married. There are many fine women and men in technology, just not all at one company. Develop an app for that and don’t wait for a largely not you demographic to understand what your issues are. Also, it is Michelle Obama’s fault.

    edited for typos

  13. 13.

    Jon O.

    November 30, 2011 at 6:49 pm

    Also, pseudonymous has a great point about the likely actual roots of this. There’s a strong chance that it’s a disconnect between what the service calls itself and how people search for it. Very curious about how SEO as a study is applied to abortion on the Internet. Gotta be the nerdiest war of all.

  14. 14.

    David Koch

    November 30, 2011 at 6:50 pm

    @Soonergrunt: @Odie Hugh Manatee:

    This kind of omission doesn’t just happen. It was obviously coordinated by Obama and DHS.

  15. 15.

    West of the Cascades

    November 30, 2011 at 6:52 pm

    You Applebot, you!!

  16. 16.

    Soonergrunt

    November 30, 2011 at 6:54 pm

    @Odie Hugh Manatee: I’m sure there’s something in the Wikileaks Assange Sexytime! archive on this. That will prove…well, something as a mimetic crowd-sourced blatherskyte jibber-jabber, ya fuken cudlip!

  17. 17.

    Raging Thunderbolt

    November 30, 2011 at 6:56 pm

    Moreover, Apple did not develop Siri itself—Apple bought the company that developed the software in April 2010.

    Why is this at all relevant? If they bought it, and pushed it out as the centerpiece of their product, then it’s on them. Besides, other Apple apologists are happy to claim credit for Apple when Siri does well.

  18. 18.

    kc

    November 30, 2011 at 6:56 pm

    Technology is a male-dominated field, and it seems to me that this Siri mix-up is the result of male privilege (Viagra and blowjobs?), rather than nefarious Christian right anti-abortion censorship.

    Oh, well, then that’s OK then.

  19. 19.

    scav

    November 30, 2011 at 6:57 pm

    Knowing nothing about Siri beyond it apparently doesn’t work in Scotland either, I’d be taking some serious looks at the underlying datasets too. They’ve been getting better, but the continental-scale POI (Points of Interest) datasets I worked with were uniformly shitty and heavy on the racy services. One time every single Subway joint was coded as Urban-Transportation instead of Food-Fast, and many hours were spent clearing out the exotic dancers from the places to rent clowns.

  20. 20.

    kc

    November 30, 2011 at 6:57 pm

    Moderation, really?

  21. 21.

    Soonergrunt

    November 30, 2011 at 6:58 pm

    @kc: Not anymore.
    You used the word “viagra” and that triggered the mod filter.

  22. 22.

    Odie Hugh Manatee

    November 30, 2011 at 6:59 pm

    @Soonergrunt:

    Well I have seen a cow chewing its cud and I do have lips so…

    damn, motoloco4chan is right!

    Oh WAI oh WAI?!

  23. 23.

    kc

    November 30, 2011 at 7:00 pm

    @Soonergrunt:

    Not anymore.
    You used the word “viagra” and that triggered the mod filter.

    Ah, thanks. Sheesh . . .

  24. 24.

    Mnemosyne

    November 30, 2011 at 7:00 pm

    @pseudonymous in nc:

    I’ve been thinking about this today, and after my initial upset, I really think it’s a GIGO problem—that is, the dataset is muddied and corrupted by the fact that providers tend to describe themselves euphemistically (for obvious reasons) and faux-clinics do a good job of pretending to be the genuine article, no different from domain hijackers, content mills and black-ops SEO.

    That would be my first thought, too. I don’t have Siri, but what’s the response to, “Where is the nearest Planned Parenthood?” or another specific location? If that’s okay, then it almost certainly is the dataset. If not …

  25. 25.

    Sisyphus

    November 30, 2011 at 7:01 pm

    WTF? Why can’t Siri just perform the abortion?

  26. 26.

    kc

    November 30, 2011 at 7:01 pm

    Damn it, I did it again. You don’t have to release that last one. Thanks for the explanation!

  27. 27.

    Odie Hugh Manatee

    November 30, 2011 at 7:02 pm

    @David Koch:

    … and communicated via Obama’s 11.6 Jigahoitz Lightspeed Brain Wave Put-er-outer.

  28. 28.

    pseudonymous in nc

    November 30, 2011 at 7:02 pm

    @Jenny: I think that’s a false conclusion, not hugely different from the faux-clinic bosses crowing about this as a vindication of their fuckery.

    It’s fair to attribute the institutional prudishness in the App Store approval process to Jobs — “You know, there’s a porn store for Android. Anyone can download them. You can, your kids can. That’s just not a place we want to go.” — but I think in the case of Siri, which uses Yelp for local listings, then Google as a fallback, the anti-choice propagandists are doing the same thing that they do when they string along vulnerable women at their “crisis centers”. They have form.

  29. 29.

    FlipYrWhig

    November 30, 2011 at 7:03 pm

    @joeyess: Those places call themselves, in Ye Olde Tele-phone Booke, “abortion services.” So that’s an accurate search result. It’s not what you’re looking for by doing that search, but, as pseud and Jon observed, that’s because the places that list themselves in the phone book are deliberately obfuscating.

  30. 30.

    Comrade Mary

    November 30, 2011 at 7:03 pm

    There’s actually a surprisingly informative and relatively noise-free Metafilter thread going on about this right now. Among other things, Siri seems to be using a Yelp database.

  31. 31.

    FlipYrWhig

    November 30, 2011 at 7:05 pm

    @Mnemosyne: Or, “where can I buy a vibrator”?

  32. 32.

    Soonergrunt

    November 30, 2011 at 7:07 pm

    @kc: So then you quoted me, wherein I had the word “viagra” and it triggered again.
    sheesh.

  33. 33.

    Jeff

    November 30, 2011 at 7:07 pm

    ABL, I find myself agreeing with you more and more lately. I’m concerned that that may indicate that something has gone wrong with one of us. ;)

  34. 34.

    pseudonymous in nc

    November 30, 2011 at 7:07 pm

    @Mnemosyne: based on the reports in the original blog posting, and subsequently, “Siri does respond specifically to a question about the phone number for Planned Parenthood clinics locate[d] nearby.” That makes sense if it’s a polluted dataset, and it’s a reminder of why anti-choicers hate Planned Parenthood’s brand identity, because it’s something they can’t fuck with.

    That Metafilter thread is useful, because you have a bunch of people who clearly know their shit about search algorithms and datamining.

  35. 35.

    ABL

    November 30, 2011 at 7:07 pm

    @Sisyphus: you win all the internets.

  36. 36.

    Nicole

    November 30, 2011 at 7:12 pm

    And of course, the first thing I did after reading this was tell Siri I needed a vaginoplasty. Because, deep down inside, I’m twelve.

  37. 37.

    boss bitch

    November 30, 2011 at 7:12 pm

    DUH!

  38. 38.

    The prophet Nostradumbass

    November 30, 2011 at 7:13 pm

    @Raging Thunderbolt:

    Besides, other Apple apologists

    what do you think this is, Ars Technica or Slashdot?

  39. 39.

    Soonergrunt

    November 30, 2011 at 7:13 pm

    @Sisyphus: Well, it IS a series of tubes, you know.

  40. 40.

    scav

    November 30, 2011 at 7:15 pm

    @Soonergrunt: Wouldn’t that be a vasectomy then?

    ETA: Wouldn’t it be easier to do, I mean, or something along those lines.

  41. 41.

    Soonergrunt

    November 30, 2011 at 7:19 pm

    @scav: That depends on what you have to say to get it to do it.

  42. 42.

    Nicole

    November 30, 2011 at 7:25 pm

    Okay- I just asked:

    Me: “I need a bigger pen1s.”

    Siri: “Nicole! Your language!”

    Me: “I can’t get an erection.”

    Siri: “I suppose you can’t.”

    Me: “I want a male prostitute.”

    Siri: “I found 9 escorts… 8 of them are fairly close to you.”

    (Note: Got the exact same response when I said, “I want to have sex with a dog.”)

    Me: “I need Planned Parenthood.”

    Siri: “I found 6 places matching ‘Planned Parenthood’ fairly close to you.”

    And Siri wouldn’t tell me the plural of either “pen1s” or “vag1na.”

  43. 43.

    Rebmarks

    November 30, 2011 at 7:25 pm

    When I read about this I immediately asked Siri where I could get an abortion and she quite appropriately gave me the address for the 2 closest family planning/abortion clinics in the Boston area. Maybe she answers differently in Red states?

  44. 44.

    Martin

    November 30, 2011 at 7:26 pm

    Knowing something about Siri, the kerfuffle is misplaced.

    First, Apple is a pretty socially liberal place in a pretty socially liberal community in a pretty socially liberal state.

    Second, Siri is merely a proxy for other services. If Yelp doesn’t have abortion providers, then Siri doesn’t have abortion providers. And Siri is fairly incomplete now even as it was a year ago.

    As a standalone product they had access to a lot of service sources that Apple hasn’t yet baked back into the product (going from thousands to tens of millions of users requires getting permission first) so that’s left quite a lot of gaps in what Siri does and where Siri goes for information. It used to do a LOT more with movie and restaurant information, for instance, even being able to place reservations and buy tickets on your behalf. My guess is that for information like this, Apple would be working out an agreement with WebMD and other services to hook it into fairly definitive, well maintained content. I’m pretty sure nobody would advocate relying on Yelp for who is the best abortion provider.

    One of the primary things that Siri does is examine the request, the context of the request, and work out what is the best service for it to back into. So if it’s information about movies it’ll go to rottentomatoes and fandango and if it’s restaurant information it’ll go to yelp and opentable. Part of the point of Siri is that it doesn’t just blindly and arbitrarily shove information out to Google to suffer at the whims of pagerank. So you actually want it to intercept actionable requests and send them to an appropriate place. In this case, it may know how to intercept an abortion (or any other important medical) request, but not yet to deliver it. Beta software is incomplete like that, and Siri is quite obviously beta even as compared to how it worked a year back.
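Martin’s description amounts to an intent router: inspect the request, dispatch it to a domain-appropriate service, and fall back to a generic web search otherwise. A toy sketch of that dispatch logic (the keyword rules are invented; only the service names come from his comment):

```python
# Toy intent router in the spirit of Martin's description (comment 44).
# The keyword rules are illustrative only; real systems use trained models.

SERVICES = {
    "movies": ["Rotten Tomatoes", "Fandango"],
    "restaurants": ["Yelp", "OpenTable"],
}

KEYWORDS = {
    "movies": {"movie", "showtimes", "ticket"},
    "restaurants": {"restaurant", "dinner", "reservation"},
}

def route(query: str) -> list[str]:
    """Pick backend services for a request; fall back to web search."""
    words = set(query.lower().split())
    for intent, terms in KEYWORDS.items():
        if words & terms:  # any keyword overlap triggers that intent
            return SERVICES[intent]
    return ["web search"]  # no actionable intent recognized

print(route("book a dinner reservation"))    # ['Yelp', 'OpenTable']
print(route("where can I get an abortion"))  # ['web search']
```

This is exactly the gap Martin identifies: a request the router recognizes as actionable but has no service hooked up for yet lands in the fallback, at the mercy of whatever the generic search returns.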

  45. 45.

    Origuy

    November 30, 2011 at 7:28 pm

    Once upon a time, I had a device on which, by entering one number, I could use my voice to obtain the information that would let me contact any business or person with a telephone, unless that person chose not to be listed. The voice recognition system, while not perfect, was vastly superior to Siri. That system was called a telephone operator. He, or more commonly she, had an alphabetized database called a phone book.

    You kids get off my lawn.

  46. 46.

    Shinobi

    November 30, 2011 at 7:29 pm

    I think Planned Parenthood and other groups are understandably upset that anti-choice organizations are more likely to pop up in search results than Planned Parenthood.

    And then I wonder if this has more to do with what “Crisis Pregnancy Centers” call themselves and less to do with politics or male/femaleness. You’ll notice that these places have the word Pregnant in their title, which Planned Parenthood does not. They probably also mention abortion counseling in their descriptions, because they like to do that so they can drag you in and show you an ultrasound and tell you what a horrible person you are if you kill your baby.

    It makes sense to me that if you ask Siri what to do because you are pregnant, or use some other search involving the words pregnancy or abortion, these things might come up first. They don’t have to worry about offending the pro-life camp with every little thing they say like Planned Parenthood does, and they are specifically trying to attract people seeking abortions so they can save children no one wants.

    So I’m saying, maybe it’s not siri. Maybe it’s how the CPC’s have branded themselves.

  47. 47.

    Baud

    November 30, 2011 at 7:30 pm

    Has anyone asked Siri if they should purchase an iphone or an android phone? I’m curious what the answer would be.

  48. 48.

    scav

    November 30, 2011 at 7:37 pm

    @Origuy: Ah those days when you could call somewhere and get The Time at the Tone Will Be: Six. Thirty. Eight. messages.

  49. 49.

    Omnes Omnibus

    November 30, 2011 at 7:39 pm

    @Soonergrunt: @ABL: Well, shit, you guys didn’t leave much for me to say.

  50. 50.

    kc

    November 30, 2011 at 7:43 pm

    @Soonergrunt:

    So then you quoted me, wherein I had the word “via –

    Fool me once . . . uh, won’t get fooled again!

  51. 51.

    Southern Beale

    November 30, 2011 at 7:44 pm

    Just to clarify, it wasn’t just abortion clinics it was also rape crisis centers ….

    I don’t know what this Siri thing is, you kids and your newfangled gadgets, offa my damn lawn, etc. But I was at a play in New York Monday night (“Lysistrata Jones,” a hilarious update of the Aristophanes comedy) and funnily enough in one scene Lyssie J uses a Siri-like app to find out where the nearest brothel is. That got me wondering: can Siri find a brothel? Apparently it can find penis pills; some folks checked. So by God it should be able to find an abortion provider and a rape crisis center.

    Just sayin’.

    It’s a man’s world ….

  52. 52.

    Nicole

    November 30, 2011 at 7:44 pm

    @Baud: I just asked- Siri doesn’t understand what I mean by “Android.” Sure, she doesn’t.


  54. 54.

    Southern Beale

    November 30, 2011 at 7:45 pm

    Aw fuck the stupid moderation function. I give up.

  55. 55.

    Comrade Mary

    November 30, 2011 at 7:46 pm

    @Nicole: Be a little more subtle. Ask her what she thought of Blade Runner.

  56. 56.

    FlipYrWhig

    November 30, 2011 at 7:49 pm

    What happens if you tell Siri “I think I might be gay?”

  57. 57.

    Baud

    November 30, 2011 at 7:51 pm

    @Nicole: Awesome. How perfectly Orwellian.

  58. 58.

    Roger Moore

    November 30, 2011 at 7:57 pm

    @pseudonymous in nc:

    I’ve been thinking about this today, and after my initial upset, I really think it’s a GIGO problem—that is, the dataset is muddied and corrupted by the fact that providers tend to describe themselves euphemistically (for obvious reasons) and faux-clinics do a good job of pretending to be the genuine article, no different from domain hijackers, content mills and black-ops SEO.

    I’m not so sure. If the problem is that Siri has a hard time telling a real clinic from an anti-abortion front group- and that seems plausible, since Google seems to have exactly that problem- you’d think it would still have a bias in favor of places that are close by rather than ones that are further away. But that’s not what AlterNet is reporting; they’re saying it’s ignoring nearby Planned Parenthood clinics in favor of more distant fake clinics. That sounds as if there’s something more going on. It may be that they’re depending on user reviews and anti-abortion groups are spamming the system, or it could be that they’ve accidentally contracted with a biased information provider, but there’s more than just fake clinics successfully disguising themselves.
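Roger Moore’s ranking argument can be made concrete with a toy distance-weighted scorer (purely illustrative; nothing here reflects any real provider’s formula):

```python
# Toy local-search scorer illustrating Roger Moore's point (comment 58):
# under distance-weighted ranking, a nearby clinic should beat a distant one
# unless another signal (e.g. spammed review scores) overwhelms proximity.

def score(distance_miles: float, rating: float, rating_weight: float = 1.0) -> float:
    """Higher is better: reward rating, penalize distance."""
    return rating * rating_weight - distance_miles

nearby_clinic = score(distance_miles=2.0, rating=4.0)
distant_fake = score(distance_miles=15.0, rating=4.5)
assert nearby_clinic > distant_fake  # proximity wins under normal weighting

# But crank up the effective rating weight (which coordinated review spam
# can do in systems that lean on user signals) and the distant listing
# outranks the nearby one:
assert score(2.0, 4.0, rating_weight=1.0) < score(15.0, 5.0, rating_weight=4.0)
```

Which is his point: if nearby Planned Parenthood clinics lose to distant fake clinics, simple proximity ranking can’t be the whole story; some other signal is being gamed or a biased data source is in the loop.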

  59. 59.

    andrewsomething

    November 30, 2011 at 7:58 pm

    Sure—a conservative corporation with a gay CEO —ohhhhkay.

    Umm… A gay white male for instance could never believe that black people are inherently less intelligent than whites.

  60. 60.

    Nicole

    November 30, 2011 at 8:00 pm

    @Comrade Mary: Heh. I just did: “I’d rather not say.” That was awesome. I followed up with, “Did you like Alien?” and she said, “This is about you, Nicole, not me.”

    And she says she has no opinion about “The Last Unicorn.” Siri has no soul.

  61. 61.

    FlipYrWhig

    November 30, 2011 at 8:05 pm

    @Roger Moore: I’m curious about how Siri would handle some hypothetical parallel case… but I’m having a hard time thinking of a parallel case. Erection medicines are somewhat parallel, if the general case is “how does Siri handle queries related to sex and health.” What if the general case is “how does Siri handle queries that are politically charged?” What happens if you search for a white supremacist group? Or, for that matter, one you’d think would be no problem interpreting, a mosque?

  62. 62.

    Soonergrunt

    November 30, 2011 at 8:05 pm

    @Southern Beale: I cleared the first one and deleted the second one.

  63. 63.

    Bitcaptain

    November 30, 2011 at 8:05 pm

    There are two important points to remember about this:
    1. This is beta software. There are many things Siri cannot do.
    2. I tried this myself, and it worked if you ask the correct question. Just say “Google abortion clinics.”

    So to say or imply that this is a big mistake by Apple, or some ominous sign of the tech culture, is very misleading.

    Just to be clear, I am pro-life, but I also believe people can make their own decisions.

  64. 64.

    Soonergrunt

    November 30, 2011 at 8:06 pm

    @FlipYrWhig: Ask for a link to the Kama Sutra and see what happens.

  65. 65.

    andrewsomething

    November 30, 2011 at 8:07 pm

    Anyways… First I’ve heard of this, and I really hope no one is relying on Siri for this kind of information, but judging from the NARAL quote above it sounds more like it is providing results that lead to conservative anti-abortion “counselors.”

  66. 66.

    Nicole

    November 30, 2011 at 8:10 pm

    @FlipYrWhig: Because I am now obsessed- I just said, “I need a mosque” and Siri found 8 near me. She doesn’t understand “white supremacist,” though.

  67.

    ABL

    November 30, 2011 at 8:11 pm

    @Nicole: that’s amazing.

  68.

    FlipYrWhig

    November 30, 2011 at 8:12 pm

    @Nicole: Interesting… Now I want to play! :P

  69.

    uptown

    November 30, 2011 at 8:21 pm

    I think we all know what happens when you don’t follow through on things like this. Maybe it’s overkill, but it sends the right message for once.

  70.

    FlipYrWhig

    November 30, 2011 at 8:22 pm

    @andrewsomething: My local Superpages tags the nearest Planned Parenthood as

    Family Planning & Birth Control Clinics
    Family Planning
    Pregnancy Counseling & Information Services
    Birth Control & Family Planning Information & Services

    My local Yelp listings seem to apply, haphazardly, a category of “Obstetricians and Gynecologists” to some of the PP locations and not to others.

    Seems like there are oddities in the “metadata” and tagging, perhaps being amplified by the deliberate ambiguity of faux clinics pushing “abortion alternatives.”
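    [A minimal sketch of the tagging oddity described above, using made-up listings and category names. The point is that with haphazard metadata, the same kind of clinic matches or misses a query purely depending on which tags its data provider happened to apply.]

```python
# Hypothetical listings: two clinics of the same kind, tagged
# inconsistently by different data sources (names are invented).
listings = [
    {"name": "Clinic A", "categories": ["Family Planning", "Pregnancy Counseling"]},
    {"name": "Clinic B", "categories": ["Obstetricians and Gynecologists"]},
]

def search_by_category(query, data):
    """Naive category match: a listing is returned only if some
    category tag contains the query string (case-insensitive)."""
    q = query.lower()
    return [d["name"] for d in data
            if any(q in c.lower() for c in d["categories"])]

# The same real-world service appears or vanishes depending on tagging:
print(search_by_category("family planning", listings))  # ['Clinic A']
print(search_by_category("gynecologist", listings))     # ['Clinic B']
```

    [No filtering logic is needed anywhere for results to come out inconsistent; the inconsistency lives entirely in the tags.]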

  71.

    Roger Moore

    November 30, 2011 at 8:28 pm

    @Bitcaptain:

    This is beta software. There are many things Siri cannot do

    Sorry, but “it’s a beta” is an exceptionally lame excuse. It’s either ready for prime time or it isn’t. Apple has been hyping it like crazy and proclaiming it the big reason to get an iPhone 4S. That says it’s ready. They can’t plausibly turn around and say it’s still experimental and nobody should be surprised if it does things terribly wrong.

  72.

    RSA

    November 30, 2011 at 8:41 pm

    On a recent trip to the Outer Banks in North Carolina, my wife and I were watching some ducks. My conversation with Siri:

    Me: What do ducks eat?

    Siri: I find five restaurants nearby.

    Me: What is food for ducks?

    Siri: (Something about restaurants again, perhaps some with duck on the menu).

    I tried a few more times, but without better luck. Apparently ducks eat whatever they can order from a menu in a restaurant, just like people.

  73.

    Jane2

    November 30, 2011 at 9:15 pm

    FFS. Maybe they could just look it up on the same damn phone using the browser. What about the eleventy hundred million of us with no Siri to talk to?

  74.

    Ripley

    November 30, 2011 at 9:40 pm

    @Nicole: Ask her why Cole still reads Sully.

  75.

    pseudonymous in nc

    November 30, 2011 at 10:18 pm

    @Roger Moore:

    They can’t plausibly turn around and say it’s still experimental and nobody should be surprised if it does things terribly wrong.

    You’re upset that Siri confuses you with the James Bond guy, right?

    Be serious here. Siri is a freeform interface to a data corpus, and as a freeform interface — as opposed to the closed, command-driven TellMe voice interface for WP7, for instance — there’s never going to be a point at which it can be declared “done” in those terms, because there are always going to be edge cases where either the corpus is problematic, the AI is problematic, or the interaction of the two is problematic. This is an edge case: it’s taken six weeks for it to become newsworthy.

  76.

    lamh35

    November 30, 2011 at 10:31 pm

    Wow, I just got my iPhone 4S. I guess I know what the first question will be.

  77.

    Raging Thunderbolt

    November 30, 2011 at 11:31 pm

    http://amaditalks.tumblr.com/post/13513981784/siri

    That seems to disprove many of the pro-Apple alternative explanations floated here.

  78.

    pseudonymous in nc

    December 1, 2011 at 12:47 am

    @Raging Thunderbolt: I saw that post this morning, as did Cris upthread, and I don’t think it says what you think it says.

    And it’s a mischaracterisation to say that people who are arguing “uh oh, shitty dataset” over “deliberate query censorship” are “pro-Apple” as a result. Clearly, Apple needs a better dataset than the current combination of Yelp and Wolfram Alpha for this set of queries, and at very least, a better set of AI rules to parse it, pronto.

  79.

    slightly-peeved

    December 1, 2011 at 2:42 am

    @RSA:

    Well, they do, except when they're done eating they ask the restaurant to put it on their bill.

  80.

    Frank

    December 1, 2011 at 5:54 am

    Isn't the iPhone providing the same amount of abortion clinic info it was last month?
    Modern Warfare 3 also won’t assist you in December 2011.

  81.

    AA+ Bonds

    December 1, 2011 at 5:58 am

    Sheesh, I’m glad we can count on you to head off all these dangerous events where people on the left say something out loud about any topic.

  82.

    BattleCat

    December 1, 2011 at 6:05 am

    BattleCat feels like this conversation isn’t focused enough on male privilege, which is clearly the root of this problem.

    BattleCat is sure that only a grunting male computer-man would make such an incredible and obvious oversight.

  83.

    pharniel

    December 1, 2011 at 9:29 am

    eh-hem – http://amaditalks.tumblr.com/post/13513981784/siri

    Siri understands what is being asked but fails to provide the basic services of Google.
    Also, it's not confined to abortion: it extends to rape, rape counseling, and other women's health issues, up to and including not responding to a specific request for a business name.

    It’s intentional. Someone coded it that way.

  84.

    electricgrendel

    December 1, 2011 at 9:43 am

    I, for one, am shocked- SHOCKED!- by ABL’s contrarian take on a matter.

  85.

    BattleCat

    December 1, 2011 at 9:58 am

    @pharniel:

    BattleCat agrees completely: there is no way this is not sabotage by a disgruntled member of the patriarchy.

  86.

    Raging Thunderbolt

    December 1, 2011 at 11:14 am

    @pseudonymous in nc: One need not do wrong intentionally in order to do wrong. I can run someone over with my car unintentionally, but I’ve still done something wrong. So, even if Apple were not malicious, that wouldn’t absolve them of their moral responsibilities and subsequent failings.

    (You might agree with all this, which is cool. Just laying out a sensible position.)

  87.

    noabsolutes

    December 1, 2011 at 11:37 am

    it’s similar when you search for information about abortion and gun control via teh Googles. Google says “don’t be evil” but evil just carpet-bombs the internet with so much biased data that it overwhelms any claim to moral superiority/impartial data meritocracy that the Masters of the Universe, in their infinite programming wisdom, wanted to allow for by creating unbiased, scientifically-driven software.
    So, no, Apple isn’t evil because Siri can’t find a necessary legal medical procedure. Apple is evil because iPhones and iPads are made in Upton Sinclair-esque suicide factories that our global economy keeps in business.

  88.

    theturtlemoves

    December 1, 2011 at 11:51 am

    So, as someone who actually codes search engines for a living, I’m pretty darn sure this isn’t deliberate and agree with everyone who says it is the dataset. Nobody working under the timelines these folks are likely working under is going to take hours to purposely hard-code anything into the algorithms just to be an asshole. It would have way too big a chance of hosing other results for seemingly unrelated searches. Unless Siri was coded by masochists who love nothing better than chasing down bugs caused by hard-coding things you should let the algorithms and the data take care of.
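    [A minimal sketch of the "it's the dataset, not censorship" argument above, with an invented toy dataset. The lookup code is identical for every query; a category simply comes back empty because the data provider never populated it, with no special-case filtering anywhere.]

```python
# Hypothetical local-search dataset; entries are invented for illustration.
dataset = {
    "hardware store": ["Ace Hardware", "True Value"],
    # a category the data provider simply never populated:
    "rock-climbing gym": [],
}

def find_nearby(category):
    """One code path for all queries: return matches, or a generic
    failure message when the dataset has nothing under that key."""
    results = dataset.get(category, [])
    if not results:
        return "Sorry, I couldn't find any %s nearby." % category
    return results

print(find_nearby("hardware store"))
print(find_nearby("rock-climbing gym"))  # fails with no hard-coded exclusion
```

    [From the outside, the empty category is indistinguishable from a deliberate block, which is exactly why the two explanations are hard to tell apart.]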

  89.

    BattleCat

    December 1, 2011 at 1:21 pm

    @theturtlemoves:

    Siri was coded by men, so BattleCat thinks it is only natural that one of the byproducts of this is that search terms related to womankind weren’t explicitly provided for in the corpus.

  90.

    pharniel

    December 1, 2011 at 1:50 pm

    @BattleCat

    Except that when ASKED FOR A SPECIFIC BUSINESS NAME, Siri said it couldn’t find it. It found the business when asked about a men’s health issue.

    This indicates deliberate mucking about with data sets, not a simple “oh, I didn’t have time to code that.”

  91.

    FlipYrWhig

    December 1, 2011 at 1:53 pm

    I’m guessing that if you tell Siri “I’ve just been shot in the balls” it doesn’t know what to do either.

  92.

    Tom

    December 1, 2011 at 1:55 pm

    @pseudonymous in nc: Apple needs a better dataset than the current combination of Yelp and Wolfram Alpha for this set of queries, and at very least, a better set of AI rules to parse it, pronto.

    Yep, though I don’t see anything better than Wolfram Alpha coming along for a while, which means we need to wait until Siri’s AI “learns” how to parse all these varying queries. On the upside, it will improve the more people use it; on the downside, it could be years before it functions as intended.

  93.

    FlipYrWhig

    December 1, 2011 at 1:56 pm

    @pharniel: In the discussion threads linked above, other people report being able to find such services in their cities. So “deliberate mucking about with data sets” seems contraindicated by the unpredictability of the results.

  94.

    BattleCat

    December 1, 2011 at 1:58 pm

    @pharniel:

    BattleCat agrees whole-heartedly, hence his comments about the patriarchy influencing the search results.

    BattleCat wouldn’t be surprised if there were some kind of clever algorithm filtering results out if they detect the user might be pregnant and looking for an abortion clinic.

  95.

    FlipYrWhig

    December 1, 2011 at 1:59 pm

    It would be nice if there were some kind of feature where Siri understood commands related to emergencies of various kinds and offered to contact police, fire, EMS, etc.

  96.

    FlipYrWhig

    December 1, 2011 at 2:05 pm

    @BattleCat: This is drawing WAY too sweeping conclusions about causes and effects. There are many, many things Siri can’t understand or fails to complete. Matching gender-specific issues with one another might lead to an alarming impression. But matching “searches that don’t work” with one another might not. Are there other _kinds_ of locations Siri struggles to find? We don’t know that yet. If it can’t find, say, gyms with rock-climbing walls, does that mean that Siri is trying to prevent people from rock climbing because of anti-rock climbing bias?

  97.

    Raging Thunderbolt

    December 1, 2011 at 3:46 pm

    @theturtlemoves:

    “Nobody working under the timelines these folks are likely working under is going to take hours to purposely hard-code anything into the algorithms just to be an asshole.”

    I thought that was Siri’s “personality”.

  98.

    BattleCat

    December 1, 2011 at 4:13 pm

    @FlipYrWhig:

    Yeah, I’m just fucking around.

    I program for a living, and I know what it’s like to have an algorithmic error result in “censorship!” or “intentional bias!” or whatever imaginary offence du jour is on the plate that day.

    At some point you just sort of give up explaining the whys and start messing with people who think everyone is out to get them because they’re so awesome and cool.

    And yes, that probably does make me a terrible person.

  99.

    FlipYrWhig

    December 1, 2011 at 4:23 pm

    @BattleCat: Ha, I fell victim to the Poe effect. Well, as much as I love the feminist blogosphere (I was a regular at Pandagon long before heading over here, and even used to attempt Shakespeare’s Sister before giving up on grounds of insufficient enlightenment), this is precisely the kind of event that makes it very hard to tell totally straightfaced Patriarchy Blaming from parodic Patriarchy Blaming.

  100.

    Abby Spice

    December 2, 2011 at 6:22 pm

    @FlipYrWhig: I just want to say that “before giving up on grounds of insufficient enlightenment” made me laugh so hard I started coughing. That’s exactly my problem with Shakesville.

    (Re: Siri: I accept the abortion explanation, but the rape stuff is still a little distressing. I don’t think Apple is out to get women, but I do think they better fix this pretty darn quick.)

Comments are closed.
