Roblox, Fortnite, Minecraft and Steam have received notices from Australia's eSafety Commissioner requiring them to explain how they are identifying, preventing and responding to serious online harms.
Concerns have been raised about platforms being used as a point of first contact by sexual predators to groom children, or by extremists to spread violent propaganda and radicalise them.
The "global precedent" has been praised by Australian Catholic University professor Niusha Shafiabady as the latest step in safeguarding children.
"In the scale that a game like Roblox has, controlling everything is impossible ... but any ways to mitigate these risks that exist is better than nothing," she told AAP on Wednesday.
Roblox and Fortnite are among the most popular games for younger children, but both have been embroiled in various controversies.
Neo-Nazi, anti-Semitic and violent content has been found on Fortnite, including a map based on a concentration camp where 100,000 people were killed during World War II.
Terrorist attacks and mass shootings have reportedly been recreated on Roblox.
University of Sydney researcher Milica Stilinovic described the commissioner's attempts as "essentially playing whack-a-mole" because the internet is fluid, but said compelling platforms to be forthcoming about the user experience was needed.
"Seeking transparency from these particular platforms is crucial because they're not coming to the table in terms of how the plumbing works on the back end," Dr Stilinovic said.
The video game platforms face fines of up to $825,000 per day should they fail to comply with the commissioner's notice.
"Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate," eSafety Commissioner Julie Inman Grant said.
"Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms."
About nine in 10 Australian children between the ages of eight and 17 have played games online.
Online services are required to implement processes to protect Australians from illegal and restricted material, including measures to address risks of grooming.
Roblox has pledged to make accounts belonging to children under 16 private by default, and will introduce tools to prevent adults from contacting them without parental consent.
Fortnite developer Epic Games uses chat filters to remove hate speech and has implemented systems to automatically report potentially harmful chat interactions with those under 18.
Players under 16 are not allowed to use text or voice chat until a parent consents.