Do you know roughly how much GPU time training the models that run Waymo costs compared to Gemini? I'm genuinely curious; my assumption would be that Google has devoted at least as much GPU time in its datacenters to training Waymo models as it has to Gemini models. But if Waymo is significantly more efficient to train (or at inference?), that's very interesting.