Facebook has agreed to investigate whether hate speech against Muslims was covered by its community standards despite announcing in June that it would not police online posts about Islam.

The social media giant said two weeks ago it had reviewed all the relevant posts and found that while some met its community standards, others “do not”.

In June, Facebook revealed it would not be removing hate speech against Muslims – including comments that “Muslims are animals” and “kill them all” – because the posts were regulated by a subsidiary organisation, Oculus.

According to Facebook, only a very small number of the posts contained specific comments about Islam, and the rest could not be removed because the company did not want to prevent reasonable and valid discussion of these topics.

Facebook’s announcement came after pressure from several organisations that wanted to know why it was still not policing comments against Muslims and other faiths.

Facebook signalled in June that it was investigating, saying it had “launched a broad internal review” to examine what community members report to it, and that it “will investigate whether these comments meet our community standards”.

A spokeswoman said the company had completed its review and that some posts did not meet its standards.

“That means we won’t show them to people, and we’re taking proactive steps to understand how we can help our community members remove these types of posts,” she said.

“We started this review in June and have completed it recently, using third-party reviews and fact checks to ensure the content doesn’t violate our policies.

“While some of these posts do meet our community standards, others do not.”

She declined to comment further, saying it was Facebook’s policy to state whether a post was covered by its community standards, but that the company would not confirm which posts were included in the third-party review.

The spokeswoman said Facebook would publish the review on its community standards webpage for anyone to see.

Four years ago, the case of the “Ching” video campaign highlighted the breadth of content policed by Facebook.

The political satire video, which parodied an Islamic jihadi named after the gunman who shot dead 14 people at Fort Hood in Texas, drew concern over its content but was cleared under the company’s community standards.