Someone mentioned Neural Radiance Caching to me recently, which Nvidia has been working on for a while. They presented on it at an event in 2023 (disclaimer: the full talk is account-gated and I haven't watched it, but a 6-minute "teaser" is available: YouTube).
Having only skimmed some material about it, I don't really understand how it works, but it sounds like it could be one of several ways to address this specific problem?