I recently deleted over 2 million rows from my database using the code below (I wanted to clean out the history rows for users that no longer exist). I'm using django-simple-history.
from django.db.models import Q

active_ids = User.objects.values_list("id", flat=True)
User.history.filter(~Q(id__in=active_ids)).delete()
This took about 5 minutes and deleted all the rows I wanted. However, before I ran the operation, my table size was about 900 MB. After deleting the rows, it's still 700 MB -- far more than I expected. On closer inspection, the index_length of this table is still huge, at 424 MB; I don't think it went down at all.
Does this index size ever go down? The User model has a few db_index fields, such as id, username, and email.
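For reference, I'm checking the sizes with a query along these lines (the table name myapp_historicaluser is just a placeholder; django-simple-history names the table <app_label>_historical<model> by default):

from django.db import connection

# Report data, index, and free space (in MB) for the historical table.
# "myapp_historicaluser" is a placeholder table name.
with connection.cursor() as cursor:
    cursor.execute(
        """
        SELECT data_length / 1024 / 1024,
               index_length / 1024 / 1024,
               data_free / 1024 / 1024
        FROM information_schema.TABLES
        WHERE table_schema = DATABASE()
          AND table_name = %s
        """,
        ["myapp_historicaluser"],
    )
    data_mb, index_mb, free_mb = cursor.fetchone()
    print(data_mb, index_mb, free_mb)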
These are MySQL issues rather than Django issues.
In any RDBMS, be it MySQL, PostgreSQL, or anything else, bulk inserts, deletes, and updates will always be slow if you have indexes. That's the nature of the beast.
The best way to speed things up is to examine your indexes and drop any that are not being used in queries. If you want to keep all of them, tune the key buffer size (or the InnoDB buffer pool, if your tables are InnoDB).
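As a rough sketch of how you might audit the indexes from Django (the table name is a placeholder, and the sys schema is only available on MySQL 5.7+; treat its "unused" report as a hint, since it resets when the server restarts):

from django.db import connection

# List the indexes that exist on the historical table.
with connection.cursor() as cursor:
    cursor.execute("SHOW INDEX FROM myapp_historicaluser")
    for row in cursor.fetchall():
        print(row)

# Ask the sys schema which indexes have not been used since the server
# last started (MySQL 5.7+ only).
with connection.cursor() as cursor:
    cursor.execute(
        "SELECT index_name FROM sys.schema_unused_indexes "
        "WHERE object_schema = DATABASE() "
        "AND object_name = 'myapp_historicaluser'"
    )
    for (index_name,) in cursor.fetchall():
        print(index_name)

If you decide to drop one of the Django-managed indexes, do it by removing db_index=True (or the Meta.indexes entry) from the model and generating a migration, so the schema stays in sync with the models.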
As a regular user of PostgreSQL, I assure you that this alone is not reason enough to switch from MySQL to PostgreSQL. Looking at the bigger picture and all the other factors, PostgreSQL is definitely better, but don't let this issue be the deciding factor.
Unless you are really pressed for hard disk space, just let it go. That space will be reused by the database as the table grows again. If you add two million new records, you will find that the disk usage hasn't increased at all.
If you are really pressed for space, run OPTIMIZE TABLE. This operation will also take a long time.
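If you want to trigger it from Django, something like the sketch below should work (the table name is again a placeholder). On InnoDB, OPTIMIZE TABLE is mapped to a full table rebuild, which also rebuilds the indexes; it only returns space to the operating system when innodb_file_per_table is enabled, otherwise the freed space simply stays available inside the shared tablespace.

from django.db import connection

# Rebuild the historical table and its indexes to reclaim space.
# Expect this to take a while on a table of this size.
with connection.cursor() as cursor:
    cursor.execute("OPTIMIZE TABLE myapp_historicaluser")
    for row in cursor.fetchall():
        print(row)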