Usually? Nothing.
Occasionally, if it's a common piece of misinformation that's been shared widely enough to justify manual intervention, a fact-check box may appear underneath. This happens to "facts" with a conservative political slant as well as "facts" with a liberal political slant. If you run a page, your page can accumulate "strikes" for this sort of thing that cause your content to be down-ranked in people's feeds, though if anything, Facebook has been caught tipping toward the conservative side of that scale, suppressing that penalty for some popular conservative pages (https://www.engadget.com/facebook-overruled-fact-checkers-to...).
Rarely, you'll share "misinformation" that also violates community standards (I'll leave that to the reader's imagination) and get a time-out proportional to how often that happens.
To understand Facebook's behavior, it's useful to remember that their goal is growth and retention. They want everybody using the service. I suspect, based on observation of their behavior, that they've discovered for themselves that without those measures, growth and retention would be harmed more than they're harmed by time-outs and fact-checking (i.e. organized boycotts over Facebook being a place where falsehoods spread wildly, people encouraging their more vulnerable friends and relatives to stay off FB so the misinformation parade doesn't convince them to take horse dewormer, etc.).