The problem with assistants and the like is that they hide features even more. On iOS you can browse through the apps and settings; with Siri you can't. Sure, you can try a few things, but there are a lot of "Why didn't I think of that?" options buried in there.
That's true, but I think there are a lot of less technologically proficient people for whom this actually works better.
Like, as tech nerds we have no problem navigating lists of lists of settings; we can usually guess where a setting will be if it exists. But for a lot of people, just asking "Hey Siri, can you make the letters bigger?" would be more accessible.
But that's where the behavior analysis here would be key. Someone mentioned it in a previous post, but imagine if Siri, by way of the Face ID sensors, detected that someone's face was closer to the screen or that they were squinting, and suggested increasing the size of the text on whatever they're reading. Better yet, imagine if someone repeatedly hit a point on the screen followed by the back button, and Siri suggested making the on-screen buttons larger. At that point, people would view Siri as more of the personal assistant she's meant to be and ask her if she can do things rather than what she can do.
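Just to make the idea concrete, here's a rough sketch of how an app could approximate the squinting part today with public APIs; it's a guess at one possible approach, not how Apple would actually build it. ARKit's face tracking already exposes eyeSquintLeft/eyeSquintRight blend shapes from the TrueDepth camera, so you could watch for sustained squinting and then prompt the user about larger text. The SquintDetector name, the thresholds, and the callback are all made up for illustration, and a third-party app couldn't change the system text size itself anyway, only point the user at the setting.

```swift
import ARKit
import Foundation

// Sketch only: watch the TrueDepth face-tracking blend shapes for sustained
// squinting and fire a callback that the host app could turn into a
// "Want me to make the text bigger?" prompt.
final class SquintDetector: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var squintUpdates = 0

    /// Hypothetical hook: the app decides how to actually surface the suggestion.
    var onSuggestLargerText: (() -> Void)?

    func start() {
        // Face tracking needs a TrueDepth camera (the same hardware Face ID uses).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        let left = face.blendShapes[.eyeSquintLeft]?.floatValue ?? 0
        let right = face.blendShapes[.eyeSquintRight]?.floatValue ?? 0

        // Count consecutive updates where both eyes are noticeably squinting.
        // The 0.5 coefficient threshold is a guess, not a tuned value.
        if left > 0.5 && right > 0.5 {
            squintUpdates += 1
        } else {
            squintUpdates = 0
        }

        // Roughly a couple of seconds of sustained squinting at ~60 updates/sec.
        if squintUpdates > 120 {
            squintUpdates = 0
            DispatchQueue.main.async { self.onSuggestLargerText?() }
        }
    }
}
```

The back-button / rage-tap case would be the same pattern with a different signal: count taps that land in the same small region and are quickly followed by a navigation back, then suggest larger touch targets or Display Zoom.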