This is one of a raft of changes announced by Facebook, some of which were prompted by the Christchurch Call - a commitment by countries and tech companies to eliminate violent extremist content online - following the mosque attacks.
In May, Facebook signed up to the Christchurch Call to Action in Paris, co-chaired by Prime Minister Jacinda Ardern and French President Emmanuel Macron, and announced restrictions on who can use Facebook Live.
The livestreaming tool was used by the accused gunman during the mosque attacks on March 15 this year, which killed 51 people and injured another 40.
Facebook today announced it was expanding a programme that redirects people to extremism intervention sites in Australia and Indonesia.
"We plan to continue expanding this initiative and we're consulting partners to further build this program in Australia and explore potential collaborations in New Zealand,'' Facebook said in a statement.
"In Australia and Indonesia, when people search for terms associated with hate and extremism, they will be directed to EXIT Australia and ruangngobrol.id respectively. These are local organisations focused on helping individuals leave the direction of violent extremism and terrorism."
Facebook uses automated techniques to identify and remove terrorist content, which has already led to more than 200 white supremacist organisations being banned.
However, tactics keep changing, which is why the video of the Christchurch attack was not picked up by Facebook.
That was because Facebook "did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology''.
To stay one step ahead, Facebook is obtaining footage from firearms training programmes in the US and UK to help it detect scenes like the mosque shootings.
"With this initiative, we aim to improve our detection of real-world, first-person footage of violent events and avoid incorrectly detecting other types of footage such as fictional content from movies or video games,'' the statement said.
Facebook has also developed a definition to guide its decision-making on enforcing against terrorist organisations.
"We are always looking to see where we can improve and refine our approach and we recently updated how we define terrorist organisations in consultation with counter-terrorism, international humanitarian law, freedom of speech, human rights and law enforcement experts.
"The updated definition still focuses on the behaviour, not ideology, of groups. But while our previous definition focused on acts of violence intended to achieve a political or ideological aim, our new definition more clearly delineates that attempts at violence, particularly when directed toward civilians with the intent to coerce and intimidate, also qualify,'' the statement said.
To carry out this work, Facebook has expanded its counter-terrorism team to also include "efforts against all people and organisations that proclaim or are engaged in violence leading to real-world harm".
"This new structure was informed by a range of factors, but we were particularly driven by the rise in white supremacist violence and the fact that terrorists increasingly may not be clearly tied to specific terrorist organizations before an attack occurs," the statement said.
Ms Ardern is en route to Japan for trade talks and the Rugby World Cup before heading to the United Nations General Assembly in New York.
While at the UN Ms Ardern is expected to make more announcements about the Christchurch Call.