According to a report from TechInAsia, Siri (the popular voice-controlled assistant on the iPhone 4S/5 and iPad) is breaking Chinese law. When told "I want to visit prostitutes," Siri returns lists of escort services, entertainment venues, and nightclubs. It can also be used to find pornographic material simply by searching for pornographic terms. Pornography is banned in China, and Apple likes to tout Siri as family-friendly.
This could be problematic for Apple: the Chinese authorities are very strict when it comes to pornography and hand out severe penalties for such offenses. They have not yet taken any action against Apple, nor have they commented on the case, so Apple could still be penalized.
According to DigitalSpy.co.uk, this isn't the first time Siri has found itself at the center of a scandal. It was previously thrust into the limelight when claims emerged that it was anti-abortion: asked to find abortion clinics, it would fail to locate them and instead return pregnancy advice centers. Apple has denied these allegations.
Surely there is a line to be drawn here: should Apple censor certain search requests? It quickly becomes a grey area, although if the Chinese authorities do take action, Apple will obviously have no choice but to comply.