In July, Google announced that it would no longer accept advertisements on its site that promote “graphic depictions of sexual acts,” or that link to porn sites. A few months earlier, in March, Google also implemented a policy against explicit content on applications in Google Play, its app store.
Neither policy affects the ability to find pornography through Google’s search functions, but both mean that Google will no longer profit from the distribution of online pornography. For a company that makes nearly $40 billion a year in sales through AdWords, the decision not to accept advertisements that link to pornography is a big deal. Sites with explicit content make up nearly 25 percent of the Internet, and Google made money every time someone clicked through to one of those sites via one of its text-only advertisements. The search giant also made a profit when a user downloaded an app with pornographic images.
In a statement released after the policy change, Morality in Media said: “We are grateful that they are realizing that their profits from porn are not worth the devastation to children and families.”
However, the nonprofit continues to ask Google to “improve their policies and actions, especially on Google Search, Google Images, YouTube and Safe Search.”
Google could do much more, but it should also be recognized for working on software that could eliminate images of child pornography from the Internet, and for launching a Child Protection Technology Fund.