This goes beyond getting some things wrong. It's a conscious effort to be biased in a very specific way. When a model struggles to put a white person in its generated images, or has trouble deciding whether Musk or Hitler was worse for humanity, you know it's been trained and guardrailed hard in the wrong direction.