I don’t think that’s right. I looked into the way Twitter counts characters when I was trying to work out the largest prime number that could be written out in full, in base ten, in a single tweet[1]; the rules are more complicated than you might expect, and have changed several times.
The current rule seems to be that every Unicode code point counts as two, except those in the ranges 0–4351, 8192–8205, 8208–8223 and 8242–8247, which count as one.
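For anyone curious, the rule is simple enough to sketch in a few lines of Python. This is a minimal illustration using only the ranges quoted above; the real twitter-text library also normalizes to NFC, treats URLs as a fixed length, and handles emoji sequences specially, all of which is ignored here.

```python
# Weighted ranges quoted above: code points in these ranges count as 1,
# everything else counts as 2.
LIGHT_RANGES = [(0, 4351), (8192, 8205), (8208, 8223), (8242, 8247)]

def weighted_length(text):
    """Approximate tweet length under the weighted-range rule."""
    total = 0
    for ch in text:
        cp = ord(ch)
        if any(lo <= cp <= hi for lo, hi in LIGHT_RANGES):
            total += 1  # ASCII, Latin, Cyrillic, etc. fall in 0-4351
        else:
            total += 2  # CJK, emoji, and most other code points
    return total
```

Under this rule `weighted_length("hello")` is 5, while ﷽ (U+FDFD, well outside the light ranges) weighs 2, which matches the "only 140 per tweet" observation below.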
Good point! Still, I could swear I saw someone (@FakeUnicode?) do exactly this once, but of course I can’t find that tweet any more, partly because it turns out that search engines don’t handle ﷽ well at all, and I don’t feel like testing it on my own followers somehow.
Edit: it looks like it might count it as two characters, so that’s only 140 per tweet.
That’s definitely possible! @FakeUnicode mentioned in the discussion that, when 280-character tweets were first introduced in September 2017, it was possible to tweet 280 single-codepoint emoji using TweetDeck.
Fun fact: it’s a single Unicode character.