Western feminism is no longer a movement that does any appreciable work towards equal rights for women. They kind of already have that in the developed world.
The fact that "western feminism" doesn't appear to believe women have achieved equal rights suggests to me that there's more discussion to be had and more changes to be made, no?
Feminists are still trying to change society and still frame their cause as being about "equal rights" as a concept. But women already have equality under the law by any reasonable definition. Reproductive health access and family leave laws are the best counterexamples I can think of, and even those are a stretch. Do you have better ones?