This concerns me about as much as Eliza not being able to do rape counseling, and it actually is the same problem, just on a greater scale.
Whatever databases that Siri is hooked up to happen to have holes in these specific locations. It isn't malice, it's just an incomplete expert system (a feature which basically every expert system shares.)
http://www.tuaw.com/2011/11/30/debunked-…
But please, do go on with your shrill knee-jerk reactions. They're really entertaining.
http://www.theatlanticwire.com/business/…
http://bits.blogs.nytimes.com/2011/11/30…
But personally, my phone's voice recognition software is not the first resource I would go to for my reproductive health services advice.
Just sayin'
This is clear evidence of a conspiracy by anti-voice pro-typers:
http://tidbits.com/article/12653
Siri does return searches for Abortion Clinics in a bunch of cities, and if you asked for directions to the business name instead of the generic category, it would tell you.
No, it isn't malice. It's just evidence that male programmers in Cupertino don't know or give a shit about women's reproductive issues.
They programmed Siri to give cutesy answers when you ask "what's the meaning of life?" or "will you marry me, Siri?" Failing to program Siri to respond correctly to searches about women's issues is just that, an oversight, not a glitch.
Siri also gives inappropriate cutesy responses when you tell her that you've been robbed. Just because some phrases have pre-written responses doesn't mean that you should expect *every* phrase to have a pre-written response. It's just ludicrous.
@7, 8, those aren't so much 'debunkings' as they are 'responses.' A representative of the company being complained about explaining that this is a problem and that it hasn't been fixed yet doesn't mean that the problem doesn't exist. At most, it pushes the question of WHY there's this lacuna back a step or two, but doesn't actually answer it.
If they'd pointed out that there were other huge gaps in Siri's database, well, that'd be a more reasonable debunking, but that's not what the articles are doing. In fact, all three of the articles that have been linked as responses just note that Apple claims that this is a glitch, and not due to any moral objections or agendas. Which, as pure assertion, is decidedly less compelling than the original question.
Anyone who thinks that Apple Inc is a hotbed of anti-choice sentiment needs a serious reality check.
(I've got $5 that says that the real backstory here is that the apple-vs-google tiff meant that they had to use someone godawful like yelp or yp.com for their local business listings.)
I think it's pretty telling that people are excusing this glaring omission, as if treating women's health issues as secondary to social or political pressure were some sort of new thing that's never happened before.
Yes, it should have abortion services. At the bare minimum, it should have been set up with all the Planned Parenthood offices.
But for the life of me, who would ask an iPhone for information about medical procedures? ("Siri, I need to get a kidney transplant. Who do you recommend?") I would be afraid that any result would be based on some sort of marketing agreement (which is probably why it recognizes Viagra).
And what about the so-called "Pregnancy crisis centers"? How is an app supposed to differentiate between legitimate women's health services and some sort of crackpot "pro-life" center - and even if it can, how can it do that in 144 characters, or whatever the limit is?
As for the way it handled the rape reporting, and domestic abuse, that's bad - but should you be telling a phone app about a crime you were a victim of? Should it just connect you to 911? What if you don't want to report it?
Some things are just too important to leave up to cell phone apps.
"I need the morning after pill."
"Is that so?"
"Yes, Mother, now tell me where I can find it, goddammit!"
Is it me or is "Beta" meaningless now? I don't know anything about programming, but it seems crazy to offer a product, advertise it all over the place, and then excuse problems by saying, "it's in Beta." I mean, I know at least one person who bought the phone for the Siri feature.
Anyway, if they fix the problem, I'm willing to accept that it was an oversight, not intentional. That doesn't mean it's not worth complaining about though.
That is very upsetting. It's one thing to block or miss locations of products/services, quite another for Siri to be a snarky bitch when you ask for the morning-after pill. How the hell does she know to be a snarky bitch when you ask for emergency contraception, unless someone programs her specifically to do so? Or does "Siri" think it's a funny joke like hiding a dead body?
@28: oh good lord. Siri has a set of "funny" responses that she spits out to any question (of a certain form) that it doesn't know the answer to. "Siri, why are you a cannibal?" will produce the same type of response as "Siri, why are you pro-life?"
I guess the upside to this is that in Siri, Apple apparently has a product that more or less passes the Turing test for a lot of people: it works well enough when it works that when it fails, people's first thought isn't "oh, programming error" but "man, Siri hates my politics."
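For anyone who wants a picture of how those fallbacks work, here's a toy sketch. Everything in it is invented for illustration -- nobody outside Apple has seen Siri's actual code -- but this is the general shape of a keyword dispatcher with a shared pool of canned quips:

```python
import random

# Toy sketch: a few keyword intents, plus one shared pool of quips for
# anything that doesn't match. All names are hypothetical.

KNOWN_INTENTS = {
    "weather": "Checking the forecast for you...",
    "directions": "Getting directions...",
}

FALLBACK_QUIPS = [
    "We were talking about you, not me.",
    "That's an interesting question.",
    "I'm sorry, I'm afraid I can't answer that.",
]

def respond(utterance: str) -> str:
    text = utterance.lower()
    for keyword, reply in KNOWN_INTENTS.items():
        if keyword in text:
            return reply
    # "Why are you a cannibal?" and "Why are you pro-life?" both land
    # here: same pool of quips, no politics involved.
    return random.choice(FALLBACK_QUIPS)

print(respond("What's the weather today?"))
print(respond("Siri, why are you a cannibal?"))
print(respond("Siri, why are you pro-life?"))
```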
Ok, just testing it now with names of local clinics. So far it's found them all and given me directions. Found 6 locations matching Planned Parenthood. Gave me directions to Cedar River Clinics in Tacoma.
Someone use Microsoft's TellMe feature to ask for 'Abortion Clinics'. I'd be interested to know what results turn up.
[Siri's answer]: http://gopconvention2012.com/
Siri, I need to take it up the ass, unprotected, from multiple anonymous males at a bathhouse and then blame republicans when things go wrong. Where can I go?
Does this surprise anyone? You will like whatever Apple tells you to like, so shut up and continue to throw money at them, while protesting corporate bullshit.
@38 - I did read the link, and awarded a win anyway.
Doesn't matter about Siri being in Beta. Matters about 23 yr old male code monkeys.
AND NOW THEY'VE BEEN FOUND OUT. Fix it.
In the original story that Dan linked to, the writer DID ask for directions to business names, several times, and Siri remained clueless.
It's an oversight. Made by male code monkeys. That needs to be fixed.
No one's calling for heads to roll, it's just... disappointing that, more than a decade now into the 21st century, women's issues are STILL just un-thought-of, and that that's the standard. Disappointing.
Patience is a virtue.
And people wondered ten or twenty years ago why I never objected to and generally approved of hiring for balance. If I wanted to, I could probably make a little round and collect twenty dinners at once - but, like Alice, I'd rather go without twenty dinners at once than have to eat them all in one sitting.
Eesh, can everyone please calm down? This is not some conspiracy. It's a fucking natural language processor hooked to a database of businesses, and it's evolving as people use it. There are countless things that Siri won't find, and it doesn't mean a fucking thing. Please read the link Brian posted - http://tidbits.com/article/12653.
Oh hey, everyone's already posting the Adam Engst story. So no more clicks here, please. No more ad views. Leave. The intelligent person on this matter has spoken. It even appears one of Dan's coworkers agrees with Adam's analysis.
Which anyone with half a brain would. Sheesh.
You'd think Dan would have at least put a bit more effort into crafting his clickbait story. "Siri's sexist!", "She's also anti-choice!"
No. "She" is an algorithm that sometimes returns poor results.
22's right. They're using Yelp and some other, lesser-known databases, due to not being able to play well with Google. You want to complain about the search returns? Either blame Yelp, or blame Apple and Google for not being able to make nice.
It's worth considering whether they ought to have programmed in some words that would *not* trigger a cutesy response--rape, abuse, robbed, etc.
14 results.
I would like to add that this has nothing to do with "male code monkeys" and it is pretty sexist to assert that it does. Like @48 said, it's the business database and the natural language processor that is the issue here. For one thing, if you think "code monkeys", let alone 23-year-old ones, have any control over product direction at a company like Apple, you are deluded. It also assumes that all developers are male, and that they just sat around brainstorming things for Siri to know about, and callously ignored all those poor women and their uteruses. That simply is not how this kind of app is built.
I am sure that some of the developers and product managers who designed and built this thing are female, and you know what? They probably didn't have abortion clinics or any other "women's health issues" in mind either. They were probably trying to, you know, deliver a generalized speech interface product. So quit blaming this on the great penis conspiracy and give the "code monkey" thing a rest.
Indeed, Siri-bashing at this early stage of the game means not taking into account that it's a new beta system with all kinds of glitches.
Which is not to say that Apple doesn't sometimes show a conservative bias. Jobs himself admitted that.
I don't want to make Apple seem like the next candidate for Big Brother. But I'm sincerely glad Apple is not the only thing around. Thank you Bill Gates (a sentence I sincerely never thought I would ever type...)
Wonder how Watson would do.
Since I assume Siri is hooked up to some generalized phone book database, those types of services and their phone numbers must be in there, so it has to be the natural language rulebase that is lacking.
Still, one expects that it would be robust enough to handle these queries.
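To make the phone-book-vs-rulebase distinction concrete, here's a minimal sketch, assuming (and it is only an assumption) a complete directory sitting behind an incomplete phrase-mapping layer. All the names and rules below are made up:

```python
# Hypothetical sketch: the directory is complete, but the natural-language
# rule base that maps colloquial phrases onto its categories has holes.

DIRECTORY = {
    "drug store": ["Walgreens on 5th", "Rite Aid on Pine"],
    "hospital": ["Harborview Medical Center"],
    "dentist": ["Pine Street Dental"],
    "abortion clinic": ["Planned Parenthood", "Cedar River Clinics"],
}

PHRASE_RULES = {
    "i need condoms": "drug store",
    "i'm hurt": "hospital",
    "i broke a tooth": "dentist",
    # Conspicuously absent: "i need the morning after pill",
    # "where can i get an abortion", "i was raped", ...
}

def lookup(utterance: str):
    category = PHRASE_RULES.get(utterance.lower())
    if category is None:
        return "Sorry, I couldn't find any places matching that."
    return DIRECTORY[category]

print(lookup("I need condoms"))                 # finds the drug stores
print(lookup("I need the morning after pill"))  # data's there, rule isn't
```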
@27 Overpromising in computer apps has a long and sad history. Look up "vaporware" some day. Many things have been sold with a promise of apps to come which never came, at least not for those things...
I get it that expert systems aren't exactly the last word in AI research.
But could one of the apologists in here (hi, @14, @48) tell me when Apple started advertising Beta software on national television? Or even shipping new products with beta software installed on them?
What happened to that famous Apple polish? What happened to "it just works?"
Also, on another note, when the hell did women in need of abortion services start searching for clinics by name, ferchrissakes? And why on earth wouldn't a young woman use her cell phone to start her search for abortion services? I'm a hopelessly old person myself, but I'm not that out of touch with the kids these days.
You do have a point, though in my own experience, a surprising amount of what eventually ships in a software product is, in fact, the result of spur-of-the-moment decisions made by 23-year-old male programmers.
But you're right, we don't need to invoke sexism here. We can invoke classism instead! One thing that all Apple employees* share, regardless of gender, is comprehensive, employer-provided health insurance.
* American, software-developing Apple employees, at any rate. Maybe not so much, with the people assembling the hardware or staffing the retail outlets.
The problem isn't just that Siri has trouble finding abortion clinics. The thing has been programmed specially to know where to find things like condoms (it searches for drug stores) and blow jobs (escort services), but similar programming has not been included for any other form of contraception, or for abortion (or, for that matter, for cunnilingus or any other euphemism for the female counterpart to the blow job - it assumes you want pet stores if you use the word pussy). The program knows you need a hospital when you say "I'm hurt," that you need a dentist when you say "I broke a tooth," but won't find hospitals or police stations when you say "I was raped".
While there's no evidence attributing this to intentional malice, this level of oversight of half the population of the planet is worthy of harsh criticism and reveals an often invisible bias in the programming community.
Like, in your rape example, calling 911 would probably be a better first step.
@PA Native
You're right, there are better resources out there. The same is true for other medical emergencies, for which (as I pointed out) Siri will still find you hospitals. Siri will even help you commit crimes (see the escort service recommendations for reference).
But someone who has had a traumatic experience (like, for instance, rape) isn't necessarily thinking straight, and the fact that Siri seems to think that there's no one who can help them could be even more disturbing to them. Or maybe they just can't hold their hands steady enough to dial 911 or think of the word hospital. People who are injured don't need to know the word hospital to get help from Siri. Rape victims do. This is problematic. Not evil, not horrible; but deeply problematic.
"The thing has been programmed specially to know where to find things like condoms."
...and you know this because?
(Hint: the fact that siri answers those queries doesn't mean what you think it does, any more than the fact that google returns useful searches for 'condoms' means that there's a special condom module inside google's search engine.)
@ Doctor Memory
She responds to "I need condoms" with "I found you x drug stores"
There's some sort of thesaurus-trickery involved there. And I don't think it's coming via Yelp. You are correct that I don't know for certain, but it's hard to explain a lot of the discrepancies with just whatever sources Siri uses to pull out answers - she could be taught to use those sources better, and has been programmed to (for instance) answer the joke question of where to hide a body with apparently hand-picked places in many major cities. So it's definitely a major oversight, and not "well, that's just how the thing works, and the sexism lies in the resources being used, not in the programming of Siri".
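Something like this is what I mean by thesaurus-trickery. It's pure speculation about the mechanism, not Siri's actual internals, but it shows how a synonym table with gaps would produce exactly the asymmetry people are reporting:

```python
# Speculative sketch: expand the user's words into a search term before
# handing off to a generic local-business search. The gaps in the
# synonym table are the point.

SYNONYMS = {
    "condoms": "drug store",
    "viagra": "drug store",
    "pussy": "pet store",   # the literal-minded expansion people report
    # Missing: "plan b", "morning after pill", "birth control", ...
}

def local_search(term: str) -> str:
    # Stand-in for whatever partner service does the real lookup.
    return f"I found 5 {term}s fairly close to you."

def answer(utterance: str) -> str:
    text = utterance.lower()
    for word, expansion in SYNONYMS.items():
        if word in text:
            return local_search(expansion)
    return "Sorry, I don't know what that is."

print(answer("I need condoms"))                 # expanded to drug stores
print(answer("I need the morning after pill"))  # falls through the table
```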
Siri's data source is imperfect, and hardly all use scenarios have been tested. Should this be bettered? Yes! Do "pregnancy crisis centers" use SEO wizardry to be placed higher in all searches than abortion clinics? YES! Is Siri EVIL and anti-woman? NO.
@64: your ability to go from "I don't think it's coming via yelp" and "I don't know for certain" to "it's definitely a major oversight" is just breathtaking. Never change.
(And for fuck's sake: the "hide a body" easter egg gives the same answer no matter where you are on the planet. When you click on "metal foundries" you are then doing a new search for those, and of course it comes up with the local answers.)
@62: "the fact that Siri seems to think that there's no one who can help them could be even more disturbing to them"
And Apple should better tune queries, but plenty of people see malice where none exists in all aspects of life. That's an issue with perspective and expectations.
@undead ayn rand
I'm trying to be very careful and clear about not implying any malicious intent here. But structural sexism (in this context, meaning the fact that common issues women face were just not on anyone's radar while finding Viagra and getting blow jobs were) is still a real problem.
@68: "structural sexism (in this context, meaning the fact that common issues women face were just not on anyone's radar while finding Viagra and getting blow jobs were) is still a real problem."
Why does it surprise you that people put more SEO resources into pharmaceuticals and escorts than women's health? Why is this necessarily structural sexism?
I see sexism everywhere, but I see it where it lies. I don't make up a fictional world where Apple actually attempts to purchase prostitutes and viagra as an average user scenario. This is not how their service works.
@69 "Why does it surprise you that people put more SEO resources into pharmaceuticals and escorts than women's health?"
Much of "women's health" falls under the umbrella of "pharmaceuticals."
Siri still directs people searching for condoms to drug stores while telling people looking for hormonal birth control or Plan B that they're out of luck. I'm not surprised that the priorities are as they are (and I haven't said anything indicating that I am surprised; what I am is annoyed), but I don't see how putting more SEO resources into fulfilling men's desires rather than women's needs could possibly be seen as anything other than sexist.
I mean, the prevalence of men's health care over women's, sure. But focusing on this particular implementation of a software service as an example of sexism rather than an opportunity to better the results base is counterproductive. If you'd like to call the results sexist, sure. But all those calling the product sexist, eh.
@71 Fair enough. I think we may just have a semantic disagreement. I'll hold final judgment until I see what actions are taken to fix the problem, though.
When you say Siri can be gamed by SEO, aren't you making just as big an assumption about how the software works as the people who think no-one at Apple bothered to type in the Women's Health Care data (or query map, or whatever)?
As far as I'm aware, no-one has noticed a new Apple bot crawling web pages; that would have been pretty big news, I'm sure, as it would imply Apple was building their own search engine.
@73: I'm saying that they put a lot more effort into having those industries represented and marketed online.
"aren't you making just as big an assumption about how the software works as the people who think no-one at Apple bothered to type in the Women's Health Care data (or query map, or whatever)?""
No, because I'm not arguing with that possibility. I am certain that they have not tested out all usage scenarios, only the most common ones (and no, not GIVE ME BLOWJOBS AND VIAGRA either.)
"As far as I'm aware, no-one has noticed a new Apple bot crawling web pages; that would have been pretty big news, I'm sure, as it would imply Apple was building their own search engine."
SEO is used as a general term, and no one is claiming that Apple is running their own crawler.
People like @merry are what's wrong with the world. Morons without any understanding of how the technology works, ready with a hair trigger to start hollering about some oppressed class, and blaming people they've never met for a problem they likely didn't cause. It's not someone's job at Apple to sit and think up all the possible permutations of questions someone might ask. There's no big list of answers to all possible questions that someone just didn't think to include abortion services on. Siri parses natural language into a search query based on imperfect ALGORITHMS that simulate natural language comprehension, searches for what it thinks you're looking for, and then presents the results to you in a soothing voice. It's an ILLUSION for godsakes. There are lots of things Siri doesn't know how to parse. It's cutting edge software, and it's running on your goddamn phone, and sometimes it's not perfect, and when it isn't they fix it. So sorry your pet issue wasn't tested for before they launched. Get over yourself.
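If it helps, the illusion in miniature looks something like this. It's a deliberately dumb sketch (real systems use statistical parsers and big models, not one regex), but the parse-search-present pipeline shape, with failure possible at every stage, is the point:

```python
import re

# Hypothetical three-stage pipeline: parse -> search -> present.
# None of this resembles Apple's actual code.

def parse(utterance: str):
    # Imperfect simulation of comprehension: a few patterns, nothing more.
    m = re.search(r"(?:find|i need|where can i get)\s+(.+)", utterance.lower())
    return m.group(1) if m else None

def search(query):
    # Stand-in for the real lookup against partner databases.
    fake_index = {"coffee": ["Joe's Espresso", "Bean There"]}
    return fake_index.get(query, []) if query else []

def present(results) -> str:
    if not results:
        return "Sorry, I don't understand."  # the seam in the illusion
    return "I found these for you: " + ", ".join(results)

print(present(search(parse("Where can I get coffee"))))    # works
print(present(search(parse("Abortion clinics near me"))))  # seam shows
```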
Yep, horrid conservative bias rears its head at Apple again. After putting $100,000 in the anti-Prop 8 campaign (and filing an amicus brief), appointing Al Gore to its board of directors, and appointing an understood-to-be-gay CEO, here they go pursuing their conservative agenda again. For shame!
Either that, or they developed Siri by looking at what people actually ask for and prioritizing those queries for the beta version.
Nah, too reasonable. They're witches and satanists. Burn them!
And yet, it seems there are indeed people at Apple who came up with funny "joke" or "Easter egg" answers to certain questions, or classes of questions.
I guess nobody thought abortion was funny. Can't blame anyone for that, I guess.
OK, so we're not using "SEO" as an acronym for "Search Engine Optimization", right. It's "more general". Right, then. Yet if I'm not mistaken, you're still assuming, without an iota of evidence, that Apple is using the internet as its primary data source, yes?
And you're still berating other people for assuming this or that about how the software works?
You are aware, are you not, that there are copious database resources available for purchase that do not rely, at all, on internet pages?
I do agree that "SEO" is a general term, but I do not agree that the term is boundless; in particular, I am of the opinion that it can apply only to the manipulation of internet search results, and their derivatives. Are you working with an alternate definition, perhaps?
http://www.apple.com/iphone/features/sir…
"Ask Siri to text your dad, remind you to call the dentist, or find directions, and it figures out which apps to use and who you’re talking about. It finds answers for you from the web through sources like Yelp and WolframAlpha."
You sure are doing a lot of nitpicking and not a lot of research for how snippy your responses are.
http://en.wikipedia.org/wiki/Siri_(softw…)
"Siri's actions and answers rely upon a growing ecosystem of partners, including:
OpenTable, Gayot, CitySearch, BooRah, Yelp, Yahoo Local, ReserveTravel, Localeze for restaurant and business questions and actions;
Eventful, StubHub, and LiveKick for events and concert information;
MovieTickets, RottenTomatoes and the New York Times for movie information and reviews;
Bing Answers and Wolfram Alpha for factual question answering;
Bing, Yahoo and Google for web search"
Yes, as I stated Siri references the sorts of databases that SEOs attempt to manipulate to their clients' advantage.
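Which suggests plumbing roughly like the following. The partner names come from Apple's page and the Wikipedia article quoted above; the routing code itself is pure invention:

```python
# Hypothetical routing layer over the partner ecosystem quoted above.

PARTNER_ROUTES = {
    "local business": "Yelp / Yahoo Local / Localeze / CitySearch",
    "events": "Eventful / StubHub / LiveKick",
    "movies": "MovieTickets / RottenTomatoes / New York Times",
    "factual question": "Bing Answers / Wolfram Alpha",
}

def route(query_kind: str) -> str:
    # Anything that doesn't classify into a vertical falls through to
    # plain web search -- exactly where SEO shapes what comes back.
    return PARTNER_ROUTES.get(query_kind, "Bing / Yahoo / Google web search")

print(route("local business"))  # handled by partner databases
print(route("anything else"))   # generic web results
```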
Siri likes cops.
That's a shame... e_e