Two months ago, Habbo Hotel, the world's largest social game and online community for teenagers age 13 to 21, with 10 million monthly unique visitors, was front-page news--for all the wrong reasons.

Rachel Seifert, a reporter for Channel 4 News in London, had spent two months interacting in the game and on the site, and reported that most of her interactions were perverse, and certainly not suitable for teens.

"The chat was very sexual, perverse, violent, pornographic, overtly sexual acts, people saying they were going to do things to others, and it was very graphic," Seifert said. "Within two minutes I was being asked individually 'do you have a webcam?', 'can we chat on (instant messenger service) MSN, on Skype?' I was also, within a couple of minutes, asked to strip, fully naked, and asked what would I do on a webcam."

Immediately, Paul LaFontaine, the CEO of Habbo's parent company, Sulake, responded, telling the BBC that "Since hearing about the findings of the investigation we have increased the number of active moderators at any given time and strengthened our automated filtering technology."

The day the investigation was published, LaFontaine wrote on the company blog that Habbo employed 225 moderators tasked with monitoring 70 million lines of typed conversations--every single day.

The next morning, the company disabled Habbo's chat functionality, and scrambled to launch a "Parental Advisory Summit," a cobbled-together forum in which users and their parents could submit feedback on how to make the site safer for kids.

But it may have been too late to salvage Habbo's reputation.

By the time the fiasco was over, just two days later, two major company investors—Balderton, which owned 13.5 percent of Habbo, and 3i, which owned 16 percent—dumped their shares of the company.

Sadly, other game companies have faced even more serious safety issues.

In June, Skout, a location-based social-networking mobile app, was forced to shut down its service for minors for nearly a month after it was revealed that three children had—allegedly—been statutorily raped by adults they met on the service.

"One case is too many," Skout founder Christian Wiklund told the New York Times. "When you have three, it looks like a pattern. This is my worst fear."

As the online video game industry surges--PricewaterhouseCoopers expects the interactive gaming market to grow from $60.3 billion in 2012 to $80.3 billion in 2016--the government is increasing protection for the youngest consumers. But that also means trickier regulations and more complications for game designers and gaming-company founders.

Most entrepreneurs in the gaming community are well-acquainted with the Children's Online Privacy Protection Act, better known as COPPA, a 1998 law enforced by the Federal Trade Commission that sets the rules on how marketers, websites, and game makers build safety into products targeted at children age 12 and younger.

But as countless start-ups have flooded into the online gaming space, several have tripped up, and the FTC has pounced.

Last summer, mobile-application developer W3 Innovations was fined $50,000 for collecting information on kids. In March, the FTC slapped a $250,000 fine on RockYou, a social gaming site headquartered in Redwood City, California, for collecting about 179,000 children's email addresses without parental consent. And last May, the FTC issued its biggest COPPA fine ever--a $3 million charge against Playdom, a Silicon Valley-based operator of online social games, for using kids' information without parents' knowledge.

The venture capital community took notice, too.

"A lot of VCs won't invest unless the sites come to someone like me," says Parry Aftab, the managing director of, a compliancy and consulting firm that helps websites and game companies comply with COPPA. "They want us to look at it and advise them on it before they will fund. Because it's serious stuff. You don't want to be on the front page of the Wall Street Journal for the wrong reasons."

Jeff Clavier, the managing partner of SoftTech VC, a Silicon Valley-based venture capital firm, has made several investments in the gaming space, and says safety is an important part of his initial conversation with gaming entrepreneurs.

"When we meet founders of gaming companies, they explain the type of interaction, and if ever it's specific to a certain demographic, like seven to 13-year old-kids," he says. "We try to understand what sort of interactions the kids are going to have and is just kids to game or kids to kids. And if it's kids to kids we try to understand the safety and the environment. It's rare for us to invest in first-time game designers. The guys we've backed were very experienced game designers and this is the type of thing that they master."

The market for safety online has spawned an ecosystem of its own. Everloop, a social network for kids based in Danville, California, scored $3.1 million in funding last year. It aspires to be the Facebook for children younger than 13. (Currently, Facebook does not allow children younger than 13, but has intimated that it will in the future.)

And as games become more social, online moderators become the crucial linchpin for ensuring a safe experience for kids.

Dylan Collins, a serial game entrepreneur who is now the principal investor in Fight My Monster, a UK-based game for young boys that has more than 1.5 million downloads, says a game's community and its moderators are often the last line of defense against cybercrime, bullying, and other illegal activity.

"In any games company the community is always the critical thing," he says. "For any game, it doesn't matter what the demographic is or what the platform is, at the end of the day, it's the community that makes or breaks it. You've got to think about that first."

He adds: "Kids will be kids. The things you see in the playgrounds and in schools tend to be replicated within the games."

But reconciling the design of the game and its safety features can trip up a young company, says Aftab.

"A lot of them don't get it," she says. "They have technology programmers that are really good at design but their policy and compliance people are just not up to the task. Right now the VCs are worried because no one knows what COPPA is going to say."

Aftab is referring to new regulations expected out from the FTC this fall. The new regulations could seriously impact how sites target younger users.

"We're all guessing," she says. "We're reading tea leaves to figure out what they're going to say and if they're coming out with something new. They all want to make sure that they're cleaner than clean without hurting their business model."