An LLM cannot be anything other than a bullshit machine. It just guesses at what the next word would likely be. And because it's trained on source data that contains truths as well as non-truths, by chance sometimes what comes out is true. But it doesn't "know" what is true and what isn't.
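To illustrate the "guessing the next word" point, here's a toy sketch, not a real model. The words and probabilities are entirely invented for the example: the model just samples from a probability distribution over candidate next tokens, with no mechanism that distinguishes true continuations from false ones.

```python
import random

# Toy sketch of next-token sampling. All tokens and probabilities below are
# invented for illustration; a real LLM computes this distribution from its
# training data, which mixes truths and non-truths.
next_token_probs = {
    "Paris": 0.60,      # statistically likely continuation
    "Lyon": 0.25,       # plausible but wrong continuation
    "Atlantis": 0.15,   # nonsense still gets nonzero probability
}

def sample_next_token(probs, rng=random):
    """Pick one token weighted by its probability, like an LLM's sampler.

    Nothing here checks whether the chosen token is factually correct;
    'truth' never enters the computation.
    """
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

token = sample_next_token(next_token_probs)
```

The point of the sketch: the sampler can only ever return whatever the training distribution made likely, which is why true and false outputs come from exactly the same process.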
No matter what they try to do, this won't change. And it's one of the main reasons the LLM path will never lead to AGI, although parts of what makes up an LLM could possibly be used inside something that gets to the AGI level.
I would think all the incidents with fracking have shown rock not to be as impermeable as one would expect or want. Doing this without causing huge issues seems very hard to me. It's also very situational, which is a big problem pumped hydro has as well.
Pumped hydro works really well and is just about as efficient as we can realistically do, but you need the right circumstances: a biggish elevation difference, a place to store enough water at both the top and the bottom for it to be worthwhile, and a connection in between to pump water up and extract the energy on the way back down. Plus it has to be close enough to a place that needs the power not to be killed by transmission losses.
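To put rough numbers on the elevation and reservoir-size requirements, here's a back-of-the-envelope sketch. All figures are assumptions picked for illustration (a 1,000,000 m³ upper reservoir, a 300 m head, ~80% round-trip efficiency is a commonly cited ballpark), not from any specific site:

```python
# Back-of-the-envelope pumped-hydro storage estimate.
# Recoverable energy: E = rho * V * g * h * round-trip efficiency.
RHO_WATER = 1000.0   # density of water, kg per cubic metre
G = 9.81             # gravitational acceleration, m/s^2

def stored_energy_mwh(volume_m3, head_m, round_trip_eff=0.80):
    """Recoverable energy in MWh for a given reservoir volume and head.

    volume_m3: usable water volume of the upper reservoir (assumed)
    head_m: elevation difference between reservoirs (assumed)
    round_trip_eff: pump + turbine losses combined (assumed ~80%)
    """
    joules = RHO_WATER * volume_m3 * G * head_m * round_trip_eff
    return joules / 3.6e9  # 1 MWh = 3.6e9 J

# Example: 1,000,000 m^3 reservoir with a 300 m head -> ~654 MWh
energy = stored_energy_mwh(1_000_000, 300)
```

This makes the siting problem concrete: halve the head and you need twice the water for the same energy, which is why only places with both a big elevation difference and room for large reservoirs are viable.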
This thing seems to require perfect conditions as well, which may prove even harder to find than suitable sites for pumped hydro.