Latent semantic indexing is a very nifty technology with obvious usefulness for search engines. But what would be really cool is latent semantic linking -- where you analyze the latent semantics in the links rather than just the latent semantics in the content. Here's why: if you google "Rails 1.1.6 1.2.2", you get pages referencing train rails of 1.2.2 gauge thickness (or something).
Latent semantic link analysis would basically apply AI math to PageRank. Instead of just saying "this page is a popular page containing these terms," LSL would say "this page is a popular page containing these terms, which is also linked to by pages known to be relevant."
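A crude sketch of that scoring idea, in Python -- all the page data, names, and the `alpha` weight here are made up for illustration. Each page gets a base score from term overlap with the query, then a boost from the term relevance of the pages linking *to* it:

```python
# Toy sketch (hypothetical data): term relevance plus a boost from
# the relevance of inbound links -- a crude stand-in for LSL.

pages = {
    "rails-blog":   {"terms": {"rails", "1.1.6", "1.2.2", "upgrade"}, "links": ["rails-news"]},
    "rails-news":   {"terms": {"rails", "1.2.2", "release"},          "links": ["rails-blog"]},
    "train-gauges": {"terms": {"rails", "gauge", "1.2.2"},            "links": []},
}

def term_score(page, query):
    # Base relevance: fraction of query terms the page contains.
    return len(pages[page]["terms"] & query) / len(query)

def lsl_score(page, query, alpha=0.5):
    # Boost the base score by the relevance of pages linking *to* this one.
    base = term_score(page, query)
    inbound = [p for p, d in pages.items() if page in d["links"]]
    return base + alpha * sum(term_score(p, query) for p in inbound)

query = {"rails", "1.1.6", "1.2.2"}
ranked = sorted(pages, key=lambda p: lsl_score(p, query), reverse=True)
```

Here the train-gauge page matches two of three query terms -- same as the Rails news page -- but it falls to the bottom because nothing relevant links to it.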
You wouldn't need it every single time, but it'd make a great filter on existing searches. A little link in the search result: "results relevant to this one."
If sites were mapped into subsets this way, it'd be easy to "search all Rails blogs" or whatever.