Facebook is to add 3,000 people to its “community operations team”, to help stop hate speech, child abuse and self-harm being broadcast on the website.
Chief executive Mark Zuckerberg said it had been “heartbreaking” to see people “hurting themselves and others” in videos streamed live on Facebook.
He added that the company would make it easier to report problematic videos.
The move follows cases of murder and suicide being broadcast live on the social network.
Mr Zuckerberg said the additional staff, joining the 4,500 existing people on the community operations team, would help the company respond more quickly when content was reported.
In a post on his Facebook profile, he said the company would develop new tools to manage the millions of content reports it received every week.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” he said.
The post suggested Facebook’s moderators would contact law enforcement, rather than members directly, if someone was at risk of harm.
“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself”, said Mr Zuckerberg.