If you just want the k shortest paths between two vertices in a graph, and you don't care whether a path might have repeated vertices in it, you can do it in constant time per path after a near-linear amount of preprocessing — this is the result from my FOCS'94 / SICOMP'98 paper "Finding the k shortest paths", still my most heavily cited paper. This works well, for instance, when the input is a DAG, because in that case repeated vertices are impossible. But it's less interesting in road networks: you wouldn't want to ask your GPS for an alternative route and be told to follow the same route as before except for turning into a short cul-de-sac and then coming back out of it, somewhere in the middle of the route.
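The constant-time-per-path algorithm from the paper is too involved to sketch here, but the repeated-vertex behavior is easy to make concrete with a simple best-first search over walks. Everything below is illustrative only (the function name, the pruning bound, and the toy graph are mine, not from the paper); it uses the fact that every prefix of one of the k shortest walks to the target is itself one of the k shortest walks to its own endpoint, so no vertex needs to be expanded more than k times.

```python
import heapq

def k_shortest_walks(graph, source, target, k):
    """Best-first search for the k shortest source-target walks,
    where a walk may repeat vertices.  graph maps each vertex to a
    list of (neighbor, weight) pairs; weights must be nonnegative."""
    heap = [(0, [source])]      # (total weight, walk so far)
    found = []
    expansions = {}             # how many times each vertex was popped
    while heap and len(found) < k:
        cost, walk = heapq.heappop(heap)
        v = walk[-1]
        # Any prefix of one of the k shortest walks to the target is one
        # of the k shortest walks to its own endpoint, so popping each
        # vertex at most k times is safe.
        expansions[v] = expansions.get(v, 0) + 1
        if expansions[v] > k:
            continue
        if v == target:
            found.append((cost, walk))
        for w, wt in graph.get(v, []):
            heapq.heappush(heap, (cost + wt, walk + [w]))
    return found

# The GPS complaint made literal: 'b' is a short cul-de-sac off the
# route, and the second- and third-shortest walks dip into it.
graph = {'s': [('a', 1)], 'a': [('t', 1), ('b', 1)],
         'b': [('a', 1)], 't': []}
print(k_shortest_walks(graph, 's', 't', 3))
# → [(2, ['s', 'a', 't']), (4, ['s', 'a', 'b', 'a', 't']),
#    (6, ['s', 'a', 'b', 'a', 'b', 'a', 't'])]
```

This sketch takes logarithmic rather than constant time per path and stores whole walks on the heap; the point of the paper's algorithm is to avoid exactly those costs.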
What is often used instead is an algorithm for k shortest simple paths: each path has to avoid repeated vertices and edges. They can still be found, but not as efficiently. Although heuristic improvements have been developed, the best algorithm (in terms of its worst-case complexity) is still the one from a 1971 paper by Jin Y. Yen, whose time is O(mn) per path, much slower. And there's some evidence in a FOCS'10 paper of Vassilevska Williams and Williams that this time bound, which is cubic for dense graphs, is the best possible: it can't be improved to have a better exponent of n than three without making similar improvements to several other famous algorithmic problems. Their result doesn't rule out moving the cubic part of the cost from being per-path to being part of the preprocessing, but I don't know how to do that either.
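Yen's algorithm is short enough to sketch. Each newly accepted path is branched at every one of its vertices: keep the prefix up to that vertex as a fixed "root", forbid the edges by which already-accepted paths leave that root (plus the root's earlier vertices, to keep paths simple), and run an ordinary shortest-path computation for the remainder. The Python below is my own simplified rendition, with an invented example graph; a serious implementation would reuse shortest-path trees instead of rerunning Dijkstra from scratch at every spur vertex, which is where the O(mn) per-path bound comes from.

```python
import heapq

def dijkstra(graph, source, target, banned_edges, banned_nodes):
    """Shortest source-target path avoiding the given edges and
    vertices; returns (cost, path) or None.  graph maps each vertex
    to a list of (neighbor, weight) pairs, weights nonnegative."""
    heap = [(0, source, [source])]
    done = set()
    while heap:
        cost, v, path = heapq.heappop(heap)
        if v == target:
            return cost, path
        if v in done:
            continue
        done.add(v)
        for w, wt in graph.get(v, []):
            if w not in banned_nodes and (v, w) not in banned_edges:
                heapq.heappush(heap, (cost + wt, w, path + [w]))
    return None

def path_cost(graph, path):
    return sum(dict(graph[v])[w] for v, w in zip(path, path[1:]))

def yen(graph, source, target, k):
    """Yen's algorithm: the k shortest simple source-target paths."""
    first = dijkstra(graph, source, target, set(), set())
    if first is None:
        return []
    A, B = [first], []          # accepted paths, candidate heap
    while len(A) < k:
        _, prev = A[-1]
        # Branch off the most recently accepted path at every vertex.
        for i in range(len(prev) - 1):
            root = prev[:i + 1]
            # Forbid the edges by which accepted paths with this same
            # prefix continue, so each candidate is genuinely new.
            banned_edges = {(p[i], p[i + 1]) for _, p in A
                            if p[:i + 1] == root and len(p) > i + 1}
            spur = dijkstra(graph, root[-1], target,
                            banned_edges, set(root[:-1]))
            if spur is not None:
                cand = (path_cost(graph, root) + spur[0],
                        root[:-1] + spur[1])
                if cand not in A and cand not in B:
                    heapq.heappush(B, cand)
        if not B:
            break               # fewer than k simple paths exist
        A.append(heapq.heappop(B))
    return A

# A small weighted digraph; the three cheapest simple C-H paths
# have costs 5, 7, and 8.
graph = {'C': [('D', 3), ('E', 2)], 'D': [('F', 4)],
         'E': [('D', 1), ('F', 2), ('G', 3)],
         'F': [('G', 2), ('H', 1)], 'G': [('H', 2)], 'H': []}
print(yen(graph, 'C', 'H', 3))
```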
So instead, Denis and I considered whether the time for finding simple paths could be improved if we assumed that the graphs had some additional structure. The answer is yes: for graphs of bounded treewidth we can find the k shortest simple paths in only logarithmic time per path. It's not quite constant, but it is exponentially faster per path than Yen's algorithm.
It's not entirely surprising that an improvement is possible for bounded-treewidth graphs (a lot of things are easier for them) but proving it involved a novel combination of algorithmic components including shallow tree decompositions, logical formulations of graph properties, dynamic graph data structures, path-copying partial persistence, and selection in heap-ordered infinite trees.
In part because of the generality of these tools, our main result also ended up being very general: instead of just applying to simple paths, we can find the k best solutions to a broad family of combinatorial optimization problems on graphs, basically the problems that can be solved efficiently using Courcelle's theorem. For instance, we can find the k best traveling salesperson tours, no less efficiently than the k shortest simple paths.