I'm not sure how Python's annotation/type system could possibly do that. If numpy/pandas had different types for different cardinalities, it would work today.
You just need those libraries to embrace it, really; then you could theoretically have type constructors that provide well-typed NxM matrix types or whatever, allowing you to enforce that [[1, 2], [3, 4]] is an instance of matrix_t(2, 2).
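For illustration, here's a minimal sketch of what such a matrix_t could look like with today's typing machinery; the Matrix class and the Literal-shape convention are hypothetical, not an existing numpy/pandas API:

    from typing import Generic, Literal, TypeVar

    Rows = TypeVar("Rows")
    Cols = TypeVar("Cols")

    class Matrix(Generic[Rows, Cols]):
        """A matrix whose dimensions are carried in its type parameters."""
        def __init__(self, data: list[list[float]]) -> None:
            self.data = data

    # The annotation pins the shape for the checker; nothing validates
    # `data` against it at runtime. That's exactly the buy-in problem:
    # numpy/pandas would have to construct these types for you.
    m: Matrix[Literal[2], Literal[2]] = Matrix([[1, 2], [3, 4]])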
I don't see how Python could possibly make such inferences for arbitrary libraries.
PEP 646 (Variadic Generics, https://peps.python.org/pep-0646/) was made for this specific use case, but mypy is still working on implementing it. And even with it, several more PEPs are expected to be needed before operations on variadic types are powerful enough to handle common array operations. numpy/tensorflow/etc. do a lot of broadcasting, which would probably need a type-level Broadcast operator just to encode. I also expect numpy's type definitions will become fairly complex, similar to template-heavy C++ code, once shape types are added.
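As a rough sketch of what PEP 646 enables and where it stops, assuming a hypothetical Array class rather than numpy's actual stubs (Python 3.11+):

    from typing import Generic, Literal, TypeVarTuple, Unpack

    Shape = TypeVarTuple("Shape")

    class Array(Generic[Unpack[Shape]]):
        """An array whose shape lives in its type parameters (PEP 646)."""

    # Shape-preserving ops type cleanly: the variadic Shape passes through.
    def relu(x: Array[Unpack[Shape]]) -> Array[Unpack[Shape]]:
        return x  # identity stand-in; the signature is the point

    x: Array[Literal[2], Literal[3]] = Array()
    y = relu(x)  # inferred as Array[Literal[2], Literal[3]]

    # Broadcasting is where this runs out: adding an (8, 1) array to a
    # (1, 6) array yields shape (8, 6), and computing that at the type
    # level would need something like a Broadcast operator that no
    # accepted PEP currently provides.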