Honestly? Yes. At least that way there's impartiality, accountability, an appeals process, and enforcement.
If Facebook self-moderates you get all of the same downsides of "big government"[1] moderation and none of the benefits listed above.
[1] I assume your argument is predicated on the idea of "big government" overreach, as though an unregulated, for-profit company holding a near-monopoly on key parts of modern society is somehow superior to any kind of state involvement in reining in excesses that the free market fails to address.
In situations involving literal terrorist propaganda and active calls for violence (which were the examples given), both of which are already illegal?
Yes. The courts are the proper place for determining how literal terrorism/imminent threats of violence should be handled.
I don't think it's controversial to say that imminent calls to violence, which the courts have already defined as illegal, should be handled by the law.
Most everything else, though, should not be blocked by the platform.
But on a publishing platform where a posted article can reach millions of viewers who have no connection to the author and would miss important context... that won't end well.
There's a difference between Facebook facilitating private communication between individuals and small groups, with inherently limited information spread (e.g. phone calls, emails, IM), and Facebook operating a publishing platform that allows for mass communication. The problems we're seeing today range from that same mass-communication publishing platform being used as a state-level propaganda tool to sway public opinion (e.g. Russia discouraging Dem-leaning voters in 2016) at one end, to Facebook knowingly allowing and facilitating extremist groups that operate on its platform and coordinate real-life terrorist assaults at the other.