{"id":7090,"date":"2026-04-23T15:35:01","date_gmt":"2026-04-23T15:35:01","guid":{"rendered":"https:\/\/beteja.com\/index.php\/2026\/04\/23\/australian-government-demands-to-know-what-roblox-minecraft-fortnite-and-steam-are-doing-to-prevent-grooming-radicalisation\/"},"modified":"2026-04-23T15:35:01","modified_gmt":"2026-04-23T15:35:01","slug":"australian-government-demands-to-know-what-roblox-minecraft-fortnite-and-steam-are-doing-to-prevent-grooming-radicalisation","status":"publish","type":"post","link":"https:\/\/beteja.com\/index.php\/2026\/04\/23\/australian-government-demands-to-know-what-roblox-minecraft-fortnite-and-steam-are-doing-to-prevent-grooming-radicalisation\/","title":{"rendered":"Australian Government Demands to Know What Roblox, Minecraft, Fortnite, and Steam are Doing to Prevent Grooming, Radicalisation"},"content":{"rendered":"<p><\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">The Australian Government\u2019s eSafety office has formally asked Roblox, Microsoft, Epic, and Valve to specifically outline how their systems are preventing child grooming and the spread of extremism. 
The eSafety office is an independent agency established in 2015 to combat youth cyberbullying and the online distribution of child sexual abuse material, but its role has since expanded to cover protections for all Australians from a spectrum of online risks.<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">According to eSafety\u2019s announcement, legally enforceable transparency notices have been issued to the four companies, citing continuing concerns about platforms like Roblox, Minecraft, Fortnite, and Steam \u201cbeing used by sexual predators to groom children and by extremist groups to spread violent propaganda and radicalise young people.\u201d<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">\u201cWhat we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,\u201d said eSafety Commissioner Julie Inman Grant in a published statement. \u201cGaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate. 
Our own research into children and gaming showed around 9 in 10 children aged 8 to 17 in Australia had played online games.\u201d<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">Inman Grant went on to point out that predatory adults are well aware of this and \u201ctarget children through grooming or embedding terrorist and violent extremist narratives in gameplay.\u201d<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">Inman Grant also pointed to \u201cnumerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay.\u201d<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">Examples included \u201cIslamic State-inspired games and recreations of mass shootings on Roblox, as well as far right groups recreating fascist imagery in Minecraft,\u201d plus Fortnite games based on World War II concentration camps and events like the US Capitol Building riot of January 6, 2021. 
Inman Grant added that \u201cSteam is reportedly a hub for a number of extreme-right communities.\u201d No specific examples were noted, though Valve has previously faced scrutiny for being home to \u201ctens of thousands of groups\u201d that amplify Nazi and other hate-based content.<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">\u201cThese online game and gaming-adjacent platforms are used by millions of children and so it is imperative that they take every possible step to protect them and continue to improve safeguards,\u201d said Inman Grant.<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">The eSafety office notes that compliance with a transparency reporting notice is mandatory, and penalties of up to AUD$825,000 a day can be issued to companies for failure to respond.<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">In a response provided to IGN, Roblox outlined a selection of measures it currently employs.<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">\u201cWe welcome engagement with eSafety on this important topic,\u201d said a company spokesperson in the statement. \u201cRoblox has policies that strictly prohibit content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual, which we work tirelessly to enforce. We swiftly remove such content and take immediate account level action when we find it. We also use advanced AI technology to review all images, text, and avatar items prior to publishing, in order to prevent known extremist iconography from being published. We encourage anyone who sees anything concerning on Roblox to report it to us. Our team works regularly with law enforcement, civil society groups, and other organisations with specific subject matter expertise in countering those who would seek to promote violent extremism. 
<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">\u201cLast week, we announced that Roblox will soon introduce new age-based accounts for children under the age of 16. These accounts will more closely align content access, communication settings, and parental controls with a user\u2019s age. While no system is perfect, our commitment to safety never ends, and we will continue to collaborate closely with eSafety on our shared goal of keeping Australian children safe.\u201d <\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">Fortnite developer Epic Games also provided its own statement on the matter to IGN:<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\">&#8220;The islands mentioned in the story violated our rules and we took action against them in 2024,&#8221; said Cat McCormack, Senior Communications Manager, Epic Games. &#8220;Epic\u2019s text chat filters remove mature language including hate speech, and our systems automatically report potentially high-harm interactions in text chat with players under 18 so we can take action. Fortnite has built-in protections for younger players including high-privacy default settings for players under 18 and voice and text chat are off for players under 16 until a parent consents. Using Epic\u2019s Parental Controls, parents can customise their family\u2019s experience including choosing who their child can communicate with.&#8221;\n<\/p>\n<p data-cy=\"paragraph\" class=\"paragraph jsx-2269604527\"><em>Luke is a Senior Editor on the IGN reviews team. You can track him down on Bluesky @mrlukereilly to ask him things about stuff.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Australian Government\u2019s eSafety office has formally asked Roblox, Microsoft, Epic, and Valve to specifically outline how their systems are preventing child grooming and the spread of extremism. 
The eSafety office is an independent agency that was initially established in 2015 to combat youth cyberbullying and the online distribution of child sexual abuse material, but<\/p>\n","protected":false},"author":1,"featured_media":7091,"comment_status":"","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"fifu_image_url":"","fifu_image_alt":"","footnotes":""},"categories":[33],"tags":[6080,7002,1610,7001,7004,140,7003,7005,1658,343],"class_list":{"0":"post-7090","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-console-gaming","8":"tag-australian","9":"tag-demands","10":"tag-fortnite","11":"tag-government","12":"tag-grooming","13":"tag-minecraft","14":"tag-prevent","15":"tag-radicalisation","16":"tag-roblox","17":"tag-steam"},"_links":{"self":[{"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/posts\/7090","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/comments?post=7090"}],"version-history":[{"count":0,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/posts\/7090\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/media\/7091"}],"wp:attachment":[{"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/media?parent=7090"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/categories?post=7090"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/beteja.com\/index.php\/wp-json\/wp\/v2\/tags?post=7090"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}