Theory of Computing Seminar
Fast distance product of bounded difference matrices with
applications to language edit distance and RNA folding
Abstract: The distance product of two matrices A and B is the matrix
C defined as C[i,j]=min_k A[i,k]+B[k,j].
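
To make the definition concrete, here is a minimal sketch (not from the talk) of the naive cubic algorithm for the distance product; it simply evaluates the min-plus formula entry by entry.

    # Minimal illustrative sketch: naive O(n^3) distance (min-plus) product.
    def distance_product(A, B):
        """Return C with C[i][j] = min_k (A[i][k] + B[k][j])."""
        n = len(A)
        C = [[float("inf")] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    C[i][j] = min(C[i][j], A[i][k] + B[k][j])
        return C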
Computing the distance product of arbitrary n x n matrices is
equivalent (with respect to worst-case runtime) to the All Pairs
Shortest Paths problem on n-node graphs, a problem whose best known
runtime has been notoriously stuck at n^{3-o(1)} for decades.
Nevertheless, when A and B are structured, truly subcubic algorithms,
running in O(n^{3-eps}) time for some constant eps>0, are sometimes
possible.
An interesting structured case that has so far resisted improvements
(and for which no truly subcubic distance product algorithm was
known) is that of bounded difference (BDD) matrices. A matrix A is
said to be an L-bounded difference (L-BDD) matrix if for all i,j:
|A[i,j] - A[i,j+1]| <= L and |A[i,j] - A[i+1,j]| <= L.
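
The condition is easy to verify directly; the following sketch (our own naming, not from the talk) checks whether a square matrix satisfies it.

    # Illustrative sketch: check the L-BDD condition defined above, i.e.
    # horizontally and vertically adjacent entries differ by at most L.
    def is_l_bdd(A, L):
        n = len(A)
        for i in range(n):
            for j in range(n):
                if j + 1 < n and abs(A[i][j] - A[i][j + 1]) > L:
                    return False
                if i + 1 < n and abs(A[i][j] - A[i + 1][j]) > L:
                    return False
        return True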
Via an approach of Leslie Valiant, any algorithm for the distance
product of O(1)-BDD matrices can be used to solve two other, very
different problems in essentially the same time: RNA folding (a
problem formalizing how to find the secondary structure of an RNA
sequence) and Context Free Language Edit Distance (a problem in
which, given a context free grammar defining a language L and a
string x, one needs to compute the minimum, over all words y in L,
of the edit distance between x and y).
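
For orientation (this is not Valiant's reduction itself), the classical cubic dynamic program for RNA folding is the Nussinov-style recursion below, which maximizes the number of complementary base pairs; it is exactly this kind of min/max-plus recurrence whose cubic barrier the new distance product algorithm attacks.

    # Hedged illustration: standard Nussinov-style O(n^3) DP for RNA folding
    # (maximum number of complementary base pairs), shown only as the cubic
    # baseline; it is not the reduction used in the talk.
    def nussinov(seq, pairs=frozenset({("A", "U"), ("U", "A"),
                                       ("C", "G"), ("G", "C")})):
        n = len(seq)
        if n == 0:
            return 0
        dp = [[0] * n for _ in range(n)]      # dp[i][j]: best for seq[i..j]
        for length in range(2, n + 1):        # interval length
            for i in range(n - length + 1):
                j = i + length - 1
                best = dp[i + 1][j]           # position i left unpaired
                for k in range(i + 1, j + 1): # position i paired with k
                    if (seq[i], seq[k]) in pairs:
                        left = dp[i + 1][k - 1] if k - 1 >= i + 1 else 0
                        right = dp[k + 1][j] if k + 1 <= j else 0
                        best = max(best, 1 + left + right)
                dp[i][j] = best
        return dp[0][n - 1]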
In this work we design the first truly subcubic distance product
algorithm for L-BDD matrices with small L, thus obtaining the first
truly subcubic algorithms for RNA folding and Context Free Language
Edit Distance.