Authors who publish independently on the subject of relationships and sexuality are soon acquainted with the industry’s strictures. They boil down to two: 1) If the content of your book contains graphic descriptions of sex, it will likely get involuntarily pegged as “erotica,” even if you thought you were writing something literary. 2) If the cover of your book is too sexually suggestive, it’s also likely to get pegged as erotica, or simply turned down for distribution altogether. And then there’s Facebook, whose family-friendly guidelines are not less but far more comprehensive, and require a considerably steeper learning curve. But first, a few observations on book content before moving on to our primary concern, book covers, and finally Facebook.
Book content is inherently a foggy territory about which either very much or very little can be said. Basically, it’s at the book retailer’s sole discretion whether to censor you should certain red flags get picked up by their filters. Anything involving underage sex is of course flagged and can get you into serious trouble, as I believe retailers and publishers are legally obliged to contact the FBI if they stumble upon content or images containing child porn. But censorship can be much broader than that, even with government protection of free speech. Short of speech that marks you as a public menace or threat (pedophilic or terroristic language), you can say or write pretty much anything you want without being prosecuted by the state. This protection does not extend, however, to book retailers and publishers, or the mass media generally, which are private entities. Companies and corporations are not bound by the Constitution and tend to operate more like little authoritarian regimes than democracies, with their own rules and not much accountability toward the subjects under their purview.
A case in point is one of the retailers that my ebook publisher distributes to, Kobo. I discovered Kobo had, without informing me, withdrawn my book The Exact Unknown and Other Tales of Modern China from circulation. I was quite mystified, as I had written nothing I could think of that had crossed any obvious line. My publisher was also mystified and contacted Kobo. They responded that I had in fact crossed a line, in writing about “robot rape.” That much was correct. Yes, I had written on the topic in my play “Reset,” included in the book. Not only am I not ashamed of it, I wholeheartedly welcome interested readers to take this occasion to learn more about robot sex. And why not? Is there something unacceptable or unlawful about the topic? It’s not as if I’m marketing the book to teenagers or children. The book had upon publication already been classified as “adult,” a stupid category that lumps together literary fiction with frank sexual description, erotica and trash porn. But I had no choice; it’s required of all fiction containing graphic sexual description. But were they suggesting that even that wasn’t enough, and the subject of robot rape was inappropriate for their entire adult-reading audience?
I’m sure one could compile a lengthy list of sexist language, harassment, assault and rape in fiction published over the past several centuries. Storytelling has always contained, and always will contain, plenty of unwanted, coerced sex, just as it contains every other kind of crime. To prohibit rape in fiction would be the equivalent of prohibiting murder. Fans of mystery novels wouldn’t be too happy about that, I imagine. Or perhaps it is precisely rape that is indeed the problem, while murder and every other human degradation up through genocide and ecocide can be celebrated in bestsellers and blockbusters without raising an eyebrow. Perhaps my play fell under the broad rubric of “hate speech,” specifically of the sexist variety. True, anything that could conceivably piss anyone off by virtue of the group they belong to or identify with could be construed as hate speech. But then what’s the point of being a writer unless it is to challenge this and launch a revolution against censorship?
My publisher protested. Kobo then reconsidered and decided to restore the book after all. Well, gosh, thanks guys. Looks like we’re not living in an exclusively algorithm-dominated world quite yet; there are humans making these decisions — with all the messy subjectivity that entails. You have reassured me. I still have faith in the system.
Now on to book covers. Apple’s iBooks has one of the stricter policies in the industry, but they also have clear guidelines: no breasts with nipples, no genitals, no butt cracks. So while Amazon and other retailers will allow you to sell your book with some nudity on the cover, Apple won’t. Here are the two versions of the Chinese edition’s cover of my novel Lust & Philosophy. The original cover (left) is allowed on the Amazon website, but it had to be altered for sale on Apple iBooks (right), the frightening butt crack removed (the Chinese text I added on the panties states that the cover was censored):
I feel special affection for this cover because I happen to know the woman on it personally. She too is proud of the cover; her husband was the photographer. I was not happy about the cover’s alteration, merely to appease the so-called family-friendly half of the country, not even a minuscule fraction of a fraction of which would ever lay eyes on it. But at least the guidelines were unambiguous. By the way, now that we’re on the subject of China, you may be surprised to learn that this communist country has a more relaxed attitude to book covers than the US. Pornography is banned there across the board, but all you have to do is label it “art” and it’s allowed. They have this cheesy but startlingly frank niche genre known as “nude art photography” (人体摄影), with full-frontal nudity on the cover, openly displayed not only in bookstores but at supermarket checkouts and in display windows as well. Supermarkets. Not all supermarkets, mind you, but you can still see such books haphazardly displayed in the occasional store; barring that, you’ll find them in the art section of any major bookstore or at the big online booksellers in China, which is where I pulled the following two examples from.
Now let’s turn to Facebook. Not long ago I launched a Facebook author page devoted to my books and my writing. The company kept pestering me to take advantage of their ad campaigns. Each item you post can be promoted for a fee, but you can calibrate all the details (audience, length of ad run, etc.). It’s quite fun. Until I tried promoting my five published books. All were rejected, and all for the same reason:
“Your ad wasn’t approved because the URL being used in it doesn’t comply with our Adult Products Policy. We don’t allow ads that show nude images/videos (ex: medical diagrams, memes, tattoos on someone’s breasts/bottom, breast surgeries, nude art, breastfeeding with nipple showing). Such ads lead to negative user sentiment and we have zero tolerance towards such advertisements. This policy applies even if your ad is targeted to an 18+ audience. This decision is final and we may not respond to additional inquiries about this ad.”
Their Adult Products Policy further states: “Ads must not contain adult content. This includes nudity, depictions of people in explicit or suggestive positions, or activities that are overly suggestive or sexually provocative.” “Sexually Suggestive Content” includes: “Nudity or implied nudity, even if artistic or educational in nature”; “Excessive visible skin or cleavage, even if not explicitly sexual in nature”; and “Images focused on individual body parts, such as abs, buttocks or chest, even if not explicitly sexual in nature.”
Thus two reasons were given for my rejection: the URL linked to my post and the image itself. It’s not just what the viewers of my ad would have seen at first glance (the image), but where the ad would have taken them upon clicking it. However, in all five of my ads, I used a dead URL link. The link did not go to another website or online bookstore; it didn’t go anywhere. It just linked to itself, the image. So I could not be accused of sending unsuspecting viewers to any undesirable websites. Now as for the images, we need to consider them one by one, as the reasons why they were deemed offending differed in each case. Let’s begin with the more obvious cases, starting with the cover of my novel Lust & Philosophy (please note that the link I’ve attached here doesn’t send you to a store but tells you more about the book). To Facebook, the offending nature of the book needed no explanation. Here’s the upper half of the cover, containing the offending imagery:
You do see partial nudity and female body parts, but no fully exposed breasts, nipples, genitals or butt cracks. Well, you can make out a bit of butt crack in the first letter “H” (oops, they missed that one). Still, the cover made it past Apple’s security gate. But not Facebook’s, and the reason follows from their catchall policy cited above, which applies to any images of a “sexually suggestive” nature. Fair enough.
Another example is the cover of my book Massage and the Writer. Here they were kind enough to attach an image of my cover, overlaid with a grid that highlighted the offending section by blocking out the rest, to forestall any possible confusion on my part:
I admit the image of the hand on oiled flesh is raw and intense, as the book itself is, and possibly disturbing for some, despite no sexual body parts being revealed (everything except the flesh immediately around the fingers has been artfully removed precisely to create this intense effect). In fact the body part the hand is massaging is a lady’s butt, but again this is not patently evident from the image alone. That’s why this cover satisfied Apple’s requirements. But not Facebook’s, as the image remains sexually suggestive. Again, fair enough.
Regarding two of my other book covers, however, the situation gets more complicated. Here are the offending covers of The Exact Unknown and Other Tales of Modern China (mentioned at the start) and At the Teahouse Cafe:
I dare you to find anything remotely sexually suggestive on either cover (the word “sex” does appear in the Kirkus Reviews quote on the Exact Unknown cover, but that’s text, not image). So I was at a bit of a loss. Again, thankfully, they provided me with attachments by way of explanation. In both cases, the attached image was the same:
I was shocked. Not at the sight of the naked breast, but because they had somehow associated the covers with this image, which had nothing whatsoever to do with either book. It’s a painting by a female friend, and it features at the head of an essay of mine on this website entitled “Three bodily rights.” Although my website is listed on and linked from my Facebook author page, this essay was not; it has only been published on my website (ishamcook.com). There must have been some mistake. I experimented by temporarily removing the breast painting from the post, substituting another image with nothing sexually suggestive. Then I resubmitted the two book covers for ad approval. This time Facebook simply turned them down without explanation.
Facebook is perhaps the most technologically advanced social-networking company in the world, at the cutting edge of not only internet technology but also artificial intelligence. Just last week they made a major announcement, featured prominently in the news, of their research into mind-controlled computers, which they hope to roll out in a few years. They have some pretty smart cookies working for them in the race to develop AI. With competitors like Google, Apple, Amazon, and Elon Musk, they have to — not to mention other countries investing vast sums of money in AI, China for one. So you’d think they would have mastered simpler computer skills, like attaching an image to an email, but it seems the crème de la crème of the technogeek world turn out to be technically challenged. To be frank, I have no idea whether the entity pulling my friend’s breast painting from my website seemingly at random and emailing it back to me was a human or an algorithm. If an algorithm, it has to be one of the clumsiest ever devised; if human, he or she needs to be retrained. Or was the reality more nefarious, with someone actually rummaging around my website for a graphic image to nail me with? It does raise interesting issues of unauthorized use. Did Facebook seek the permission of the copyright holder of the painting to use her image as evidence that I had violated their ad policy? No, they certainly did not.
Now for the final cover that was rejected, from my book American Rococo. This time they correctly attached the cover’s image, with the grid highlighting the offending portions along the left and upper parts:
Try as I might, I couldn’t find anything offensive in the surrounding blank space. It must have had something to do with the flag image at the top. Let’s look at it more closely:
In line with the theme of the book, my cover designer had substituted for the stars and stripes of the American flag typical rococo motifs, a design popular in France in the eighteenth century (you’ll have to read the book to see why I was drawn to the Rococo era). I had not instructed her to alter these motifs to make them sexually suggestive, and I know her well enough to know that she would never have done anything of the sort. But on closer inspection, I realized some visually sensitive people might read obscene formations into these floral images, specifically the female vulva; after all, flowers are sex organs. Take one of these rococo designs on the left of the flag, which I’ve extracted in black and white:
If you adjust your eyes a bit (as when viewing those figure-ground optical illusions) and look dead center, you’ll see three small dots. They’re actually a simplified version of the fleur-de-lis, the French heraldic symbol based on the lily. With a more elastic vision, you might see it and the leaf-like shape immediately surrounding it as forming not only the pistil of a flower but a hole, a vagina. Once you do this, the entire design blooms into an elaborate, splayed vulva, replete with labia and pubic hair, suggested by the variety of stylized leaves. There is even a clitoris (the three white dots repeated in the upper part) and an anus (the egg-like shape centered below).
Now I can no longer look at my rococo flag without seeing vulvas, and I have a pretty good idea why the cover was rejected (I won’t tell my cover designer, though, not wishing to plant the upsetting idea in her head). I doubt the person or algorithm at Facebook responsible for flagging the image was personally put off by it. It’s enough they had grounds for presuming it to be universally offensive and liable to cause “negative viewer sentiment.”
Facebook’s ingenious capacity to ferret out offensive imagery to shield its now two billion users from distress even extends to harmless Christmas cards. But let’s revisit a more notorious example that came up in the news recently, the case of the “napalm girl.” If you know anything about the Vietnam War, anything at all, Kim Phúc’s photo will probably come to mind: the naked, screaming nine-year-old girl running from a napalm attack (as photographed by Nick Ut at Trang Bang on June 8, 1972). It’s the single most iconic image of the war. Due to uncertainty about “fair use” of copyrighted material I’m not including it here, but you can call it up on Wikipedia and countless other news sources. But not Facebook. After a Norwegian news organization put the image on its Facebook page, it was removed.
The photo remains disturbing all right, above all to Americans, in that it captures in a single stark, horrifying image the face of US state violence on the receiving end, the lopsided devastation experienced by a much smaller and comparatively defenseless country. To refresh the forgetful, three to four million Vietnamese, along with Cambodians and Laotians, were killed by American forces, at least half of them civilians; another two to three million were wounded. Wrap your head around these figures for a moment. In contrast, Americans suffered a mere sixty thousand military deaths and no civilian casualties (as there were no attacks on US territory).
It’s understandable why the average American doesn’t want to see a photo of the napalm girl and be reminded of the enormous capacity for violence, verging on genocide, by the government of the greatest country on earth, as Americans like to put it. But what is noteworthy is that this was not the reason the photo was censored. Facebook censored it because they regarded it as child pornography. Now, I can tell you truthfully that I do not see that when I look at the photo. Nobody sees that. But Facebook presumes you do. They presume the background fades out and you are left only with a repugnant instance of kiddie porn, which they need to remove at once lest viewers be harmed — or worse, implicated in the possession or dissemination of child pornography.
This was a bit too much, and widespread outcry then forced them to do an about-face: “An image of a naked child would normally be presumed to violate our community standards, and in some countries might even qualify as child pornography,” Facebook said in a statement on Friday. “In this case, we recognize the history and global importance of this image in documenting a particular moment in time.” (“Facebook restores iconic Vietnam war photo it censored for nudity,” New York Times, Sept. 9, 2016).
At a time when the Vietnam War has largely receded from the American public memory, the gap is being filled by a new collective terror much closer to home. Attributing a perverted sexual intent to a war image is quite ingenious actually. What better means of wiping out the memory of the war altogether than by replacing it with a different symbolic significance, that of pornography and the violation of community standards? Perhaps the day is not far off when the only thing that causes offense to the American community is the pornographic image.
* * *