The communications watchdog will be given powers to monitor and regulate content on digital platforms under legislation introduced to federal parliament on Thursday.
But the potential laws against misinformation and disinformation would stop short of allowing the authority to issue take-down notices for online content or for individual social media accounts.
Communications Minister Michelle Rowland said misinformation on digital platforms had led to significant risks.
"The rapid spread of seriously harmful mis and disinformation poses a significant challenge to the functioning of societies around the world."
Under the changes, the Australian Communications and Media Authority would be able to monitor digital platforms and require them to keep records about misinformation and disinformation on their networks.
The watchdog would also be able to approve an enforceable industry code of conduct or introduce standards for social media companies if self-regulation was deemed to have failed.
Not all digital platforms have signed up to the existing voluntary code, with companies such as X and Snapchat not participating.
The proposed law comes one month after a national survey by three Australian universities found most Australians wanted more action to stop the spread of misinformation online.
The survey of more than 4400 people, overseen by researchers at Western Sydney University, the University of Canberra and QUT, found 80 per cent of Australians wanted a crackdown on misinformation - up six percentage points from 2021.
"It is digital platforms that remain responsible and accountable for the content they host and promote to Australian users," Ms Rowland said.
The minister said the threshold for misinformation would be high, with content needing to be shown to have far-reaching consequences.
Greens communications spokeswoman Sarah Hanson-Young said the laws needed to be examined in an inquiry to ensure the social media crackdown would work effectively.
"With both the US election and the looming federal election here in Australia, we must ensure that the community is protected from the vicious and deliberate spread of information designed to trick, manipulate and harm the community," she said.
"We want to see the tech giants regulated properly with the onus on them to make their platforms safe."
By Andrew Brown and Jennifer Dudley-Nicholson