You wouldn't even need the model to be trained in real time. I'd love to see OpenAI buy Wolfram Research. WolframAlpha has managed to integrate tons of external data into a natural language interface. ChatGPT already knows when to insert placeholders, such as "$XX.XX" or "[city name]", when it doesn't know a specific bit of information. Combining the two could be very powerful: you'd get data far more current than is possible by periodically retraining a large model.
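To make the idea concrete, here's a minimal sketch of what that post-processing step might look like: the model's draft answer contains bracketed placeholders for facts it doesn't know, and a live data source fills them in after generation. The `fill_placeholders` function, the `[key]` placeholder convention, and the `live_data` dict are all hypothetical illustrations, not anything OpenAI or Wolfram actually expose.

```python
import re

def fill_placeholders(draft: str, lookup: dict[str, str]) -> str:
    # Replace [key]-style placeholders with current values from the
    # live data source, leaving unknown placeholders untouched.
    return re.sub(
        r"\[([^\]]+)\]",
        lambda m: lookup.get(m.group(1), m.group(0)),
        draft,
    )

# Hypothetical model output with placeholders for unknown specifics.
draft = "A movie ticket in [city name] costs about [ticket price] today."

# Hypothetical lookup against a current external data source.
live_data = {"city name": "Chicago", "ticket price": "$14.50"}

print(fill_placeholders(draft, live_data))
# → A movie ticket in Chicago costs about $14.50 today.
```

The point is that only the lookup table needs to be fresh; the model itself stays frozen, which is exactly why you wouldn't need real-time training.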
Forget nudes in generated images. This is the real ethics issue!