If you do care about what second it is, you can still be up to 30s off in either direction: e.g. if the "rounded" clock shows 12:30, the actual time could be anywhere from 12:29:30 all the way to 12:30:29.
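As a minimal sketch of the two schemes (assuming a simple round-up-at-30-seconds rule for the hypothetical rounded clock):

```python
from datetime import datetime, timedelta

def truncated(t: datetime) -> str:
    # Standard clock: drop the seconds (floor to the minute).
    return t.strftime("%H:%M")

def rounded(t: datetime) -> str:
    # Hypothetical "rounded" clock: round to the nearest minute,
    # rounding up once the seconds reach 30.
    if t.second >= 30:
        t += timedelta(minutes=1)
    return t.strftime("%H:%M")

# Both ends of the window during which the rounded clock shows 12:30:
print(rounded(datetime(2024, 1, 1, 12, 29, 30)))    # 12:30 (30s early)
print(rounded(datetime(2024, 1, 1, 12, 30, 29)))    # 12:30 (29s late)
# A truncating clock still shows 12:30 at 12:30:59, i.e. 59s late:
print(truncated(datetime(2024, 1, 1, 12, 30, 59)))  # 12:30
```

So truncation's error is one-sided (0 to 59s late), while rounding splits it symmetrically (at most 30s either way).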
The error is reduced on average if you only care how far from the exact minute you are, but you simultaneously never know when the exact minute occurs.
The question is really about what's more generally useful: a 30s offset usually doesn't matter, which is why most clocks only show hours and minutes. Where it does matter, higher precision is used.
Some social conventions would also be less meaningful (like the New Year countdown), as most watches and clocks would show midnight 30s early.