Celery beat has the advantage of better visibility, e.g. you can configure periodic tasks in the Django admin and check when they last ran. If you're not using Celery, though, and your needs are simple, it's easier to just use plain crons.
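To make the trade-off concrete, here's the same nightly job both ways: a plain crontab line versus a Celery beat schedule entry. This is a sketch only; the paths, the task module, and the `app` Celery instance are illustrative, not from the original comment.

```python
# Plain cron equivalent (a crontab entry, no extra processes needed):
#   30 2 * * *  /srv/app/venv/bin/python /srv/app/manage.py clearsessions

# Celery beat version, in your Celery config (task path is hypothetical;
# `app` is assumed to be your Celery() instance):
from celery.schedules import crontab

app.conf.beat_schedule = {
    "nightly-cleanup": {
        "task": "myapp.tasks.nightly_cleanup",   # hypothetical task
        "schedule": crontab(hour=2, minute=30),  # 02:30 every day
    },
}
```

The cron line needs nothing but the host; the beat version needs a running beat process and a worker, but the schedule becomes visible and editable from inside the app.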
For background tasks, you can just spawn a background process and keep a simple status table in the DB so the main app can check whether it's completed (assuming you even need that).
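A minimal sketch of that pattern, using sqlite3 and subprocess so it's self-contained. The table and function names are made up for illustration, and for brevity the parent marks the row done after waiting; a real worker would update its own row on exit, and the web app would poll instead of blocking.

```python
# Sketch of the "spawn a process + status table" pattern.
# Names (task_status, spawn_task, etc.) are illustrative only.
import sqlite3
import subprocess
import sys

def init_db(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS task_status ("
        "  id INTEGER PRIMARY KEY,"
        "  name TEXT,"
        "  state TEXT"   # 'running' or 'done'
        ")"
    )
    conn.commit()

def spawn_task(conn, name, argv):
    # Record the task as running, then detach the worker process.
    cur = conn.execute(
        "INSERT INTO task_status (name, state) VALUES (?, 'running')", (name,)
    )
    conn.commit()
    proc = subprocess.Popen(argv)
    return cur.lastrowid, proc

def mark_done(conn, task_id):
    conn.execute("UPDATE task_status SET state = 'done' WHERE id = ?", (task_id,))
    conn.commit()

def is_done(conn, task_id):
    row = conn.execute(
        "SELECT state FROM task_status WHERE id = ?", (task_id,)
    ).fetchone()
    return row is not None and row[0] == "done"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    # Spawn a trivial background job (here: a no-op python process).
    task_id, proc = spawn_task(conn, "demo", [sys.executable, "-c", "pass"])
    proc.wait()              # a real app would poll is_done() instead
    mark_done(conn, task_id)
    print(is_done(conn, task_id))  # True
```

This is exactly the kind of thing that stays simple until it doesn't: note there's no handling here for a worker that crashes without updating its row, which is where the orphaned-task problems mentioned below come from.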
And for task queues that can handle the traffic most sites will need, there are options like django-huey.
I don't know if making my own bespoke queue system is a great idea. It seems simple enough, but it gets much more complicated once issues start to appear: orphaned task processes sticking around on the server forever, concurrency control, error handling, etc. I'll pretty much always just use Celery and not have to worry about it.
It's not so bad now that CI/CD, Docker, etc. have made complex deployments easier to handle. But back when I was wrestling with Django, simply deploying Celery on a new host could easily eat an afternoon, and all those dependencies made me very nervous about the overall complexity.
I still weigh carefully anything that adds another long-running process or non-Python dependency to my sites.