Archive
Virtual Worlds and Youth: Accessing Explicit Content
FTC Report Finds Sexually and Violently Explicit Content in Online Virtual Worlds Accessed by Minors
Recommends Best Practices to Shield Children and Teens
The Federal Trade Commission today issued a report that examines the incidence of
sexually and violently explicit content in online virtual worlds. The congressionally mandated report, “Virtual Worlds and Kids: Mapping the Risks,” urges operators of virtual worlds to take a number of steps to keep explicit content away from children and teens, and recommends that parents familiarize themselves with the virtual worlds their kids visit.

The report analyzes how easily minors can access explicit content in virtual worlds, and the measures virtual world operators take to prevent minors from viewing it. According to the findings, although little explicit content appeared in child-oriented virtual worlds, a moderate to heavy amount appeared in virtual worlds that are designed for teens and adults.
Virtual worlds are popular with children and adults because they blend 3-D environments with online social networking, allowing users to interact in and shape their own online content. Through avatars – digital representations controlled by humans in real time – virtual world users socialize, network, play, or even conduct business in graphics-intensive landscapes using text or voice chat, sounds, gestures, and video. Despite the educational, social, and creative opportunities virtual worlds offer, the FTC’s report found that explicit content exists, free of charge, in online virtual worlds that minors are able to access. In fact, some virtual worlds designed for teens and adults allow – or even encourage – younger children to get around the worlds’ minimum age requirements.
“It is far too easy for children and young teens to access explicit content in some of these virtual worlds,” said FTC Chairman Jon Leibowitz. “The time is ripe for these companies to grow up and implement better practices to protect kids.”
The FTC surveyed 27 online virtual worlds – including those specifically intended for young children, worlds that appealed to teens, and worlds intended only for adults. The FTC found at least one instance of either sexually or violently explicit content in 19 of the 27 worlds. The FTC observed a heavy amount of explicit content in five of the virtual worlds studied, a moderate amount in four worlds, and only a low amount in the remaining 10 worlds in which explicit content was found.
Of the 14 virtual worlds in the FTC’s study that were, by design, open to children under age 13, seven contained no explicit content, six contained a low amount of such content, and one contained a moderate amount. Almost all of the explicit content found in the child-oriented virtual worlds appeared in the form of text posted in chat rooms, on message boards, or in discussion forums.
FTC Report Finds Sexually and Violently Explicit Content in Online Virtual Worlds Accessed by Minors
HEEEEEEEEEEEEEERRRRRRRREEEEEEEEEEEEE WE GOOOOOOOOOOOOO!
Okay, for as much as I would love (and you know I would) to ramble ramble ramble about my opinions on this piece, I am going to stay MUM.
Why, you ask? Well, because according to engageexpo.com, I am (and very happily so) speaking on this VERY topic with Phyllis Marcus, who was commissioned by the FTC to research and report on youth and virtual worlds.
Safety in Online Worlds: How the Federal Trade Commission Sees It
In March of 2009, Congress mandated that the Federal Trade Commission study the types of content available in online virtual worlds — paying close attention to explicit sexual and violent content — and the mechanisms those worlds use to manage access by minors. In this unique session, the Commission’s most senior attorney assigned to the 2009 Virtual Worlds Report to Congress will present results and discuss the agency’s recommendations for strengthening access controls to virtual worlds while allowing free expression to flourish online. This first-ever analysis of virtual worlds by the FTC will be discussed by senior attorney Phyllis H. Marcus, who heads the Commission’s children’s privacy program and is responsible for enforcing the Children’s Online Privacy Protection Act (COPPA). Marcus expects this session to be the first detailed public reveal of her division’s nine-month study of virtual world content. She will present data, offer recommendations, and participate in a lively one-on-one interview with virtual world child safety advocate and online community activist Izzy Neis.
Phyllis H. Marcus, senior attorney, Div of Advertising Practices, FTC’s COPPA lead
Izzy Neis, Senior Community Safety Lead, Gazillion Entertainment
http://www.engageexpo.com/ny2010/schedule/track2.html
Score, right? Right. Couldn’t have asked for a better opportunity 🙂
I’m looking forward to this, especially after reading the article on the FTC page, and subsequently skimming through the document while printing (it’s a relatively good sized print, fyi).
I’ve never been shy to discuss the social (and sometimes sexual) exploration of youth in free, identity-less (or identity-filled) web environments – from language play in phrases, to bumping, to sexting, to warplay. Playgrounds can be a very confusing/odd place for those who do not understand or are not a part of the intricate socialization patterns and learning curve. And even for those of us who DO understand these things, it’s still nerve-wracking and frightening to behold (don’t even get me started on my 13 year old cousin’s behavior on facebook). But we react that way because we MUST. It’s the elders’ duty to help guide and educate the young. But that’s not always enough (this doesn’t mean stop; it just means more is needed).
We cannot expect kids to just inherently know NOT to behave certain ways – especially if that behavior or action can elicit some sort of euphoria or adrenaline rush. They don’t learn “No, don’t do that” through osmosis. Fire = bright & warm & pretty & powerful, but you don’t know it hurts until you touch it… you could listen to your folks who say “don’t touch the fire, it burns,” but the curiosity will always be there because you don’t precisely understand the magnitude of “it burns.”
Naturally, someone has to say it – NO, don’t do that. And when youth refuse to listen (and when they decide to touch the fire), we have to be there to guide, educate, and then PICK THEM UP once they learn their lessons, or after they suffer the consequences… and then, encourage them to share their lessons with others – peer mentorship.
Also, as businesses we need to EMPLOY WELL EQUIPPED, HIGHLY CAPABLE, HIGHLY TRAINED MODERATORS & COMMUNITY MANAGEMENT STAFF… and give them time to DO THE JOB RIGHT.
Moderation is expensive. It just is… Before you even contemplate the idea of “moderation” and how to lower the cost for a teen-and-younger site – companies really, truly need to accept it. Say it out loud. Do a little jig. Throw a party. Make a badge and wear it everywhere: “YOUTH MODERATION AND ONLINE COMMUNITIES ARE EXPENSIVE,” and then swallow that pain. NO amount of cheating or pinching the system is going to replace the expense without putting your audience or your brand at risk, UNLESS you employ full restrictions. Full. Restrictions. As in, no UGC – this includes user-created avatar names/usernames, open or filtered or dictionary chat, pictures or uploads, fan fiction, forums, blog posts, videos, podcasts, art, nothing. Kinda takes the community out of community, doesn’t it? Yep – remember that jig you did and that badge you wore… there’s your reason.
If a user can type or upload and submit – that’s UGC, and it needs moderation before it ever appears on any live site.
User-generated content is a privilege for your audience, but it’s also a privilege for your site/brand/IP/experience as a company. And with that privilege comes responsibility – responsibility that lies with the company offering the opportunity. Think about it 😉
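And just so we’re all picturing the same thing when I say moderation “before it ever appears on any live site”: here’s a bare-bones Python sketch of a pre-moderation queue – nothing a user submits goes live until a human reviews it. The class names and workflow are entirely my own toy example, not any particular vendor’s tool.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List


class Status(Enum):
    PENDING = "pending"      # submitted, not yet reviewed
    APPROVED = "approved"    # safe to publish
    REJECTED = "rejected"    # never shown on the live site


@dataclass
class Submission:
    user_id: str
    content: str             # chat line, username, forum post, upload caption...
    status: Status = Status.PENDING


class PreModerationQueue:
    """Everything a user can type or upload goes through here first."""

    def __init__(self) -> None:
        self._items: List[Submission] = []

    def submit(self, user_id: str, content: str) -> Submission:
        item = Submission(user_id, content)
        self._items.append(item)
        return item   # the user sees "awaiting review," not live content

    def pending(self) -> List[Submission]:
        return [s for s in self._items if s.status is Status.PENDING]

    def review(self, item: Submission, approve: bool) -> None:
        # this is where your well-equipped, highly trained humans live
        item.status = Status.APPROVED if approve else Status.REJECTED

    def live_content(self) -> List[str]:
        # only approved items ever reach the public site
        return [s.content for s in self._items if s.status is Status.APPROVED]
```

The queue itself is the cheap part. The expensive part is the trained humans working that review step all day, every day – which is exactly my point.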
Anyway… take a look at the FTC article (link above) and follow the white link rabbit to the pdf itself. Happy reading!!
Annnnnnnnnd, if you’re going to be in attendance for Engage Expo on February 16th & 17th in New York City, please bring some of your lovely and oh-so-brazilliant questions to the 3:30-4:30 chat on Weds the 17th. I would love to see your smiling faces and bask in your question-filled glory. 😀
Great review of Moderation & tools
Via yesterday’s New York Times, a great article by Leslie Berlin (who is the project historian for the Silicon Valley Archives at Stanford) about the different techniques and technologies used to moderate children’s virtual worlds for inappropriate/dangerous content and risky behaviours. The article focuses on vw’s that try to monitor “intent as well as content”…rather than simply blocking keywords or limiting communication altogether. It also describes that bullying and disclosing personal info remain the most common dangers faced by young people online. According to Berlin, the biggest challenges for vw moderators are keeping up with “user innovations” aimed at bypassing moderation tools (such as “workarounds” or “secret codes”), as well as striking a balance between technological solutions and human judgement when it comes to deciding which words, workarounds and behaviours should ultimately be filtered out. However, this “balance” is becoming increasingly reliant on technology and sophisticated in-game surveillance tools.
Gamine Expedition: Virtual Playground Monitors
Rock on, Gamine Expedition! This is a GREAT READ for anyone in this biz who needs an intro to the importance of moderation & mod tools, so please click the link above and get to reading. If you want, bring back questions and we can start some discussions about this stuff.
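To make that “workarounds” and “secret codes” point concrete, here’s a rough little Python sketch (my own toy example, not how any real filter is built) of why a plain keyword blocklist loses to user innovation unless you at least normalize the text first – and why even that only buys you time.

```python
import re

# toy blocklist -- a real list is huge, curated, and constantly updated
BLOCKED = {"phone", "address"}

# common character swaps kids use to slip words past a keyword filter
SUBSTITUTIONS = str.maketrans({"@": "a", "4": "a", "3": "e", "1": "i",
                               "!": "i", "0": "o", "5": "s", "$": "s"})


def normalize(text: str) -> str:
    """Undo simple evasion tricks: leetspeak swaps and letters split by dots/dashes."""
    text = text.lower().translate(SUBSTITUTIONS)
    text = re.sub(r"(?<=\w)[.\-_*]+(?=\w)", "", text)  # "p.h.o.n.e" -> "phone"
    return text


def flag(text: str) -> bool:
    words = re.findall(r"[a-z]+", normalize(text))
    return any(w in BLOCKED for w in words)


print(flag("whats ur phone number"))   # True -- even a naive filter catches this
print(flag("whats ur ph0ne number"))   # True -- caught only after normalizing
print(flag("whats ur p.h.o.n.e"))      # True -- ditto
```

And the second kids invent the next spelling, this falls over again – which is exactly why the human-judgement side of Berlin’s “balance” never goes away.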
Cheers!
Noteworthy: FusionFall Wrap Report
The Project
FusionFall is a high-quality, browser-based MMOG that takes place in a re-imagined Cartoon Network universe. It is the ultimate crossover, with characters from classic shows like Dexter’s Laboratory and The Powerpuff Girls and more recent ones like Ben 10 Alien Force. We knew we wanted to build a game that was as universally loved as the cartoons it’s based on. To this end, we began looking for the perfect partner to help us make one that could be successful both here in the US and abroad.
From a story standpoint, you play as a boy or girl helping to fend off Planet Fusion, a giant mass of planets that is trying to absorb ours. Of course, you will have help from Ben Tennyson, the Powerpuff Girls, Samurai Jack and the Kids Next Door along the way. Because the threat is so huge, even some of the bad guys offer their assistance.
IGN: Cartoon Network Universe: FusionFall Wrap Report
Check this out!! I still say (no matter the bugs or scaling, etc.) that this is one of the most interesting MMO experiences for the tween demo to date – simply because of the scale of the project (multi-multi IPs & styles) and the complex nature of the gaming system. It’s worth exploring, understanding, and reading the insight behind it – because a lot of the backend thought processes here could redefine the way some entertainment IPs approach the VW experience. It’s worth playing to see if you build any connections between what has been said, what they could do in the future, what they should have done, and all the in-betweens of time & logic. The whole adventure – from start to finish, with all the ups and downs – could probably give someone an entire thesis on the art of youth & gaming & web.
A Parent’s-Eye View of the Ole Club Penguin
In the middle of a playdate with one of his best buddies a few months ago, my then-8-year-old came over and asked me how to spell “penguin.”
“Penguin?” I asked, puzzled. “As in Mr. Popper’s Penguins?”
“No,” Jake clarified. “As in Club Penguin. We want to play, but we can’t get to the Website.” And just like that, my third grader’s age of digital innocence ended, as both of us dove headfirst into the junior cyber-social world.
And I do mean both of us. Because after Jake went to bed that night (giddy with excitement over the creation of his penguin alter ego — or “avatar”), I decided I needed to find out just what was going on in those millions of online igloos that have kids so addicted.
Club Penguin – Kids Online Games – Goodhousekeeping.com
Here’s an IN-TER-ESTING piece on Club Penguin from a parent’s perspective from Good Housekeeping. I HIGHLY suggest you check it out.
Look – wherever you find free will, you will find a variety of troubles. Playground play is playground play, and kids are not going to stop unless you take away their free-will choice of experimenting in social situations. You’ve got to be able to find the good in the bad, and move forward from there with honest understandings and expectations.
BONKERS: Real-World Jail for Virtual Drama
TOKYO – A 43-year-old Japanese piano teacher’s sudden divorce from her online husband in a virtual game world made her so angry that she logged on and killed his digital persona, police said Thursday.
The woman, who has been jailed on suspicion of illegally accessing a computer and manipulating electronic data, used his identification and password to log onto popular interactive game “Maple Story” to carry out the virtual murder in mid-May, a police official in northern Sapporo City said on condition of anonymity, citing department policy.
“I was suddenly divorced, without a word of warning. That made me so angry,” the official quoted her as telling investigators and admitting the allegations.
The woman had not plotted any revenge in the real world, the official said.
She has not yet been formally charged, but if convicted could face a prison term of up to five years or a fine up to $5,000.
Online divorcee jailed after killing virtual hubby – Yahoo! News
I’m sorry… I had to share this. Mainly because I found it slightly insane-hilarious, very interesting for the future state of online affairs (see what I did there, you can choose which “affair” meaning you wish to use), but also because I am NOT a fan of Maple Story. Sure, cute, but I have seen MORE than my share of insane content & inappropriate interactivity for the users there.
Thoughts?
Ramble: Adults hanging with Kids in Social Networks
My headline’s referring to “slang for how students feel creeped out by school teachers and college professors who are using Facebook and MySpace to interact with their students online,” the Dallas Morning News reports, adding that “the term derives from urban legends about sexual predators luring children into treehouses.” Of course that’s not fair to a lot of teachers who are in social-network sites to understand their students’ real, outside-of-school lives. In any case, there are now student Facebook groups on both sides of the question: “Teachers … please stop going on Facebook,” “Students should get over Teachers being on Facebook,” and “No … it’s not awkward being friends with my teachers on Facebook.” Check out the article to see what some principals say, as well as some examples of “Creepy Treehouse.” See also “Online student-teacher friendships can be tricky” at CNN.
Okay… sigh. As someone who has worked with kids – camp, classroom, field, and now in the online space, I honestly cannot agree with teachers and kids friending each other in social networks… UNLESS they’re educationally focused & completely fortified with protection for both parties.
Why does a teacher need to “understand” the outside lives of their students in a personal sense?
An adult individual should be able to go into Facebook/Myspace, etc… but as a person of their own identity & right. Teachers keep their lives private for specific, safety-related reasons. Students under the age of 18 (or 22, for that matter) do not have the development and growth necessary to fully understand & judge situations like that. A 17-year-old doesn’t necessarily see “what’s wrong” with forming relationships with older people, because they believe themselves to be MATURE, on the brink of adulthood. Um. No.
Any social network that allows for private messaging/personal emailing? To me that’s a BIG red flag for misunderstandings, inappropriate liaisons, and youthful assumptions. Might as well be passing secret notes, or hanging out before & after school in each other’s cars. CREEPY. NO.
Do you see where I am going with this?
I’ve worked in various situations with kids of all ages, and yes – you do have some sort of Mr. Rogers fondness – in a completely genuine, looking-out-for-the-kid way… and even THAT connection can sometimes be mistaken or judged improper by others… and once that yellow flag has been raised regarding a person’s connection to a minor, then assumptions happen & trouble brews.
Side tangent: I get crankypants when people “jokingly” question Mr. Rogers’ relationship to his audience – and the man was practically a saint. Jokes? Sure… but jokes always stem from some sort of assumption/theory/truth, right? And that’s just Mr. “I’m on television and not seeing your kids every day both online and in the classroom” Rogers. But you know what? It keeps happening… and the best bet is to remove yourself from situations like that.
There are always Hallmark-movie-worthy teachers who jump in and save the day for their students. The working hero – that’s what teachers are in many cases. I just don’t see how jumping into the peer-to-peer sphere of social networking sites can be in ANY way safe. Sure, there’s always the case where teachers/parents/educators are trying to help kids by preventing bullying, or finding ways to empower youth – but there should be well-documented practices and procedures. And even if a teacher is trying to “save” one of their wayward students by invading the net & friending that child… being a “maverick” is dangerous. Document, document, document. Back your shiznit up, yo.
Again, there is nothing wrong with an adult going into a social network site for their own personal life – with their adult friends & family, where they have various elements of their life HOPEFULLY on lock-down (with permissions). But students & impressionable youth who spend their days looking to you as a role model/idealist educator/the law… they shouldn’t be invited to feel like they belong in your personal life as a peer & friend. At least, that’s my opinion.
I remind my community dept team nigh-on daily that they cannot form connections to regular users on the site, or on other research sites. It’s just smart. They have gone through rigorous training to be able to work on the net with adolescents. They are aware of the liabilities, issues, and problematic situations that can happen, and are taught how to avoid them. I’ve always encouraged my staff to talk, talk, talk about the stuff they see so that they don’t withhold any happenings, and so they can laugh & talk as ADULTS and third-party entities… helping each other to recognize the fact that our audience = children, not peers. To help each other remember that we cannot play favorites from one user to another, and to always be as removed & unbiased as possible.
Believe it or not, it can be slightly hard sometimes. When you have the same ROCKSTAR users coming in daily, who help reinforce community and provide awesome content… those kids, sure, they rock. But they’re also children who have bad days, or want to occasionally push the limits. Moderators have to stay as personally removed & unbiased as possible, otherwise issues start up.
That’s why it’s difficult to have known moderators in virtual worlds. Kids become attached; kids form friendship expectations. Kids push for special treatment, and they demand acknowledgment. They can identify you, see how you are different from other mods – and then build a personality around you and help stoke the fires of “drama.”
There’s a relatively new virtual world out there that I dig. They’re doing some great & innovative things. My main warning flag is their use of moderators. It’s funny how much I’ve flipped from being someone who DEMANDS VISIBLE MODS to someone who avoids that kind of liability. Mods & mod staff should always be 24/7 and well maintained – and that accountability should be available in some form of communication to parents. However, when you have live, visible mods, certain expectations come around from the users/audience.
The mods need to almost always be visible (“What? It says there isn’t a moderator on!”). Having only one or two mods available also raises questions from parents about how many mods should be visible to them at ANY time (“What? Only 1 mod to these 40 kids? That doesn’t seem like a good ratio”). There are also questions of accountability (“If this mod is playing tag with the kids, who is watching the others? Who is dealing with behavior reports?”). Granted, some of my questions seem a bit advanced for parents to grasp from a community dept stance, but nonetheless – they will get asked sooner or later. Not to mention, one of my researching mods has already formed opinions about that VW’s “game moderator.” She identified with him and was curious about him as a person, and not as a character-mod in the world. As soon as she realized this, she saw the liability of becoming a live, roaming moderator, and what caution needs to come with the job.
Back in the days of TalkCity, there were always rumors of romance between moderators & peeps in the chat room. At the time, the average age had to be around 15 (I was 19 at the time). The good moderators who stayed out of the drama were VERY distant. Kind to noobs, like a hostess/host, but then quiet and unresponsive to anything other than chat room inquiries. Then there were the ones who became too “friendly” with the users, and who the users then depended on to lead conversations & roleplay & declare friends… Users always accused them of “playing favorites” (even when they weren’t), and reported mods who may have offended the game play if they didn’t play the way that user wanted them to. These kinds of things can put your job in jeopardy, as well as raise yellow flags on your behavior as a responsible adult moderating chat rooms of minors.
Basically – when it comes to online behavior & social networks, there are just TOO MANY sketchtastic variables that can put an adult in awkward situations, or worse – in real trouble, despite good (or bad) intentions. For teachers especially – who are ALWAYS (bless them) in the cross-fire of high expectations, parents, school legislation, and the needs of our nation’s future… it’s just not a good idea. AT all.
Passing this along: Marketing & Children, some reading
On Mondays I feel like I have all the time in the world to read — not skim — anything and everything.
Even long, complicated articles.
You too? Here’s some recommended reading:
At Sea in a Marketing-Saturated World: The Eleventh Annual Report on Schoolhouse Commercialism Trends: 2007-2008. From the Commercialism in Education Research Unit at Arizona State University. Browse CERU’s other publications.
Monograph 19: The Role of the Media in Promoting and Reducing Tobacco Use. From the National Cancer Institute. Seems that tobacco marketing tactics are mimicked by the food industry.
Consumer Behavior: The Psychology of Marketing. From Dr. Lars Perner at the University of Southern California. This is how it’s done.
Corporate Babysitter » Blog Archive » Heavy Monday morning reading on marketing to children
Ooo! These are great tidbits o’ readin’. Thanks and props to the ever-vigilant Corporate Babysitter!
Community Peeps & Parents: Jargonbuster!
Sometimes it may seem as though you need a translator to understand what your teenager is talking about. With a new slang word seemingly invented every week, many parents find it difficult to keep up with the latest phrases and sayings. This is why we’ve come up with a teen-speak jargon buster. Put together by teenagers and parents, this dictionary of ‘teenglish’ (complete with definitions) should help break down the language barrier.
For those boosting up their chat/phrase filters – here’s a UK tool that will help you stay on top of the times! I digg it (see what I did there – “digg” being a social bookmarking site… and dig, as in, I like this tool… Okay, I’m a dork).
Props to ypulse for the find!
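If you do want to fold a list like that into your tooling, here’s one hedged way to think about it (a Python sketch; the two-column file format and field names are my guess, not the actual dictionary’s): treat new slang as a moderator watchlist that surfaces definitions for context, rather than an auto-block list, because most slang is perfectly harmless.

```python
import csv


def load_watchlist(path: str) -> dict:
    """Load a slang dictionary (term, definition) into a moderator watchlist.

    Assumes a simple two-column CSV with 'term' and 'definition' headers;
    the real 'teenglish' list would need converting by hand first.
    """
    watchlist = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            watchlist[row["term"].strip().lower()] = row["definition"].strip()
    return watchlist


def annotate(message: str, watchlist: dict) -> list:
    """Return (term, definition) pairs found in a chat line, as moderator context."""
    hits = []
    for word in message.lower().split():
        word = word.strip(".,!?")
        if word in watchlist:
            hits.append((word, watchlist[word]))
    return hits
```

The point isn’t to block “teenglish” – it’s to make sure the adult reading the report actually knows what was said.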
Awesome VW Community Partnerships
Crisp Thinking and the Metaverse Mod Squad announced a strategic partnership today to train the Mod Squad’s moderators on Crisp’s NetModerator automatic moderation tool, launched earlier this summer. The Metaverse Mod Squad, an avatar staffing provider for moderation and events, drew over $200,000 in angel investments this summer as well. The partnership will allow the two to mutually market their two products.
Virtual Worlds News: Crisp Thinking Partners with Metaverse Mod Squad
Sorry, sorry for not posting in NEARLY A WHOLE WEEK! Ugh. Part of the fun of going to conferences is avoiding the typical day-to-day. Unforch, for yours truly, that means playing catch-up straight through the weekend and onward.
The other part of fun in regards to conferences??? MEETING WEB-FRIENDS! Or web-acquaintances, etc. I finally got to touch base with the kind folks at Metaverse Mod Squad (props to you, Amy & Mike) & Crisp Thinking (w00t, Campbell)!! And kind they are. So to have friends to partner together? Why, it’s like a party of awesomeness. Word on the street is that Crisp Thinking has also paired up with eModeration (also, awesome people that I used to work with!):
Ground-breaking technology and online moderation techniques are being combined to make virtual communities and MMOGs (Massive Multi-player Online Games) safer for children. eModeration, the user-generated content moderation company, has partnered with online child protection technology company, Crisp Thinking, to offer ‘NetModerator’, Crisp’s new technology, which is the most comprehensive anti-grooming and anti-bullying system available.
NetModerator works by analysing online chat between users of a virtual community, or MMOG, as it happens. Its software searches for phrases, words, or patterns of behaviour that might indicate inappropriate behaviour online, assessing and ranking risk. eModeration’s team of moderators are alerted to any issues, so they can take appropriate action.
It is already possible to use technology to monitor and block obviously inappropriate behaviour – such as sexually explicit language, giving out personal details and bullying – but this is the first time that technology has been used to alert moderators to patterns of behaviour and relationships over a period of time that, taken on a case by case basis, might seem innocent.
http://www.emoderation.com/news/emoderation-and-crisp-thinking/
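I obviously have no idea what’s inside NetModerator, but the “patterns of behaviour … over a period of time” idea can be sketched in a few lines of Python: instead of judging each message on its own, keep a running risk score per pair of users and ping a human when the accumulated score crosses a threshold. All of the signal names, weights, and thresholds below are invented purely for illustration.

```python
import time
from collections import defaultdict

# invented risk signals and weights -- purely illustrative
SIGNALS = {
    "asks_age":         2.0,   # "how old are you?"
    "asks_location":    3.0,   # "what school do you go to?"
    "requests_photo":   4.0,
    "suggests_private": 4.0,   # "let's talk somewhere off-site"
}

ALERT_THRESHOLD = 8.0               # cumulative score that pages a human moderator
WINDOW_SECONDS = 14 * 24 * 3600     # look back two weeks, not one message


class RelationshipMonitor:
    """Tracks risk signals between two users over time instead of per message."""

    def __init__(self) -> None:
        # (sender, recipient) -> list of (timestamp, weight)
        self.history = defaultdict(list)

    def record(self, sender: str, recipient: str, signal: str) -> bool:
        now = time.time()
        pair = (sender, recipient)
        self.history[pair].append((now, SIGNALS[signal]))
        # forget signals older than the window
        self.history[pair] = [(t, w) for t, w in self.history[pair]
                              if now - t <= WINDOW_SECONDS]
        score = sum(w for _, w in self.history[pair])
        return score >= ALERT_THRESHOLD   # True = escalate to a human moderator


monitor = RelationshipMonitor()
# individually innocent-looking messages, spread over days, add up:
monitor.record("adult42", "kid_07", "asks_age")                   # 2.0 -> no alert
monitor.record("adult42", "kid_07", "asks_location")              # 5.0 -> no alert
print(monitor.record("adult42", "kid_07", "suggests_private"))    # 9.0 -> True
```

Note the technology only assesses, ranks, and alerts – the “take appropriate action” part still lands on a human moderation team, which is exactly how the press release describes it.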
Congrats, everyone!!!
GREAT interview from Virtual World News and Cartoon Network
VWN: Have you seen any other patterns in the way kids are using Mini Match?
MC: We’ve only been live since June, so we’re still trying to figure out the patterns. They play a lot – that is something we noticed about other virtual worlds. After you’ve got in there and are playing for a bit, the thing you really do is play games and accumulate points. That should be easier. The other thing is that the avatar should stay the same. There are some communities where the avatar changes in different environments.
And they love mysteries. They love these environmental games we’ve included where you bump into an item, and you’re turned into an alien, things like that. We’ve added mysteries and puzzles like that all over, and we’re adding more. It’s like Lost, except I’ll promise you that you won’t have to wait six years to find out the answers.
The other thing we’ve tried to introduce is a mix of modern fashion and a little bit of the fantastical. If you feel like looking like a pirate or alien or whatever or just layering your clothes, that’s there.
In one of our focus groups we looked at how boys and girls asked differently. Then the moderator asked them to enter the game. All but one girl had entered. The moderator went to make sure everything was okay. She said, “oh yeah, yeah. I just need to change clothes before I play.”
Virtual Worlds News: Q&A on Mini Match: Molly Chase, VP and Executive Producer, Cartoon Network
This is CHOCK FULL of excellent tidbits. It’s SO GREAT to hear CN talk with experience and understanding about their demographic.
I wish I could ramble, elaborate, and story tell more right now – because this is just filled to the brim with gems… but alas, time escapes me.
I HIGHLY HIGHLY suggest you head over to VWN to read a bit more about how a company (Cartoon Network) assesses their audience, provides, and plans… I’m very impressed.
There’s always a difference between what beta sites look like and what they have planned… For me – someone who follows this market so closely – it’s great to know that Mini Match, as it exists now, is just the beginning. I can’t wait to see how they use what they’ve assessed to the advantage of their audience, and what (if any) revolutionary things they bring up (which I’m sure they will, since the UI is already different from many sites’).