I'm not sure if we should blame the designers of PowerShell (Jeffrey Snover and his team). If you follow the history of the project, they had very good ideas, but it was very hard to get buy-in for anything CLI-oriented in Longhorn/Vista-era Microsoft[1]. They all had experience with Unix shells, and their ideas for object-based pipes were truly innovative as far as I know.
I can't speak for Jeffrey and his team, but I feel like a lot of their decisions came from trying to get corporate behind the shell and present it as a shell for Windows. They avoided picking political battles outside of their main goals (a modern shell to replace cmd.exe and the object-based pipeline model). What we've got are a set of decisions that aligned with Windows and Microsoft practices of that day and age:
Microsoft is focusing on .NET as their general platform? We'll implement PowerShell on .NET.
Windows's standard for Unicode text files is UTF-16 LE, or a BOM for any other Unicode transformation format? We'll write UTF-16 by default and always add a BOM if you choose UTF-8.
Windows is using CRLF? Well, we'd pipe CRLF-delimited text by default.
Visual Basic and C# programmers expect methods to have names like "GetChildItem" instead of something like "dir" or "ls"? No problem, we will make the canonical command name a long, programming-language-like Verb-Noun ("Get-ChildItem") and set up aliases that look (but don't behave) like Unix and cmd.exe commands.
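The look-but-don't-behave gap is easy to demonstrate. A quick sketch (on Windows PowerShell, where the "ls" alias exists; PowerShell 7 on Linux drops it so the native /bin/ls wins):

```powershell
Get-Alias dir, ls   # both resolve to Get-ChildItem
dir                 # works: just runs Get-ChildItem
ls -la              # error: Get-ChildItem has no -la parameter --
                    # the alias borrows the name, not the flags
```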
The result was not pretty, but I still appreciate the ideas we got from PowerShell. nushell took these ideas and implemented them in a more modern way.
I work regularly in PowerShell and this bugged tf out of me. The result of my digging was: .NET libraries emit textual output in <whatever the offending encoding was, UTF-16 LE or something> -- and since PowerShell is implemented in .NET, it necessarily inherited that default.
Things have improved now that everything is implemented on .NET (Core), so if you're working with PoSH 7.x, this is no longer a headache. (I just tested in 5.1, and it seems fixed there as well.)
Of course, by now, I've just developed muscle memory to add '-enc UTF8' to the end of everything that writes text to disk.
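For anyone hitting the same thing, the behavior is easy to see; a sketch of the version difference (the exact defaults are "Unicode" in 5.1 and "utf8NoBOM" in 7.x):

```powershell
# Windows PowerShell 5.1: Out-File defaults to UTF-16 LE ("Unicode"),
# and even -Encoding UTF8 still prepends a BOM
"hello" | Out-File out.txt                  # UTF-16 LE + BOM
"hello" | Out-File out.txt -Encoding UTF8   # UTF-8 + BOM

# PowerShell 7.x: defaults to BOM-less UTF-8 instead
"hello" | Out-File out.txt                  # UTF-8, no BOM
```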
[1] https://corecursive.com/building-powershell-with-jeffrey-sno...