# Matrix Multiplication Sees Breakthrough; First In 24 Years

Question asked by smriti in #Coffee Room on Dec 11, 2011

Matrix multiplication is fundamental to both mathematics and computer science: it underlies operations such as Gaussian elimination, LUP decomposition, and computing the determinant or inverse of a matrix. Matrix multiplication algorithms have therefore been a focus of computer scientists for decades. Given its pervasive applications in computing, a more efficient multiplication algorithm directly reduces the time complexity of all these operations.
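For reference, the schoolbook method that all of these improvements compete against is the straightforward triple loop, which performs a cubic number of scalar multiplications. A minimal sketch:

```python
def matmul_naive(A, B):
    """Multiply two n x n matrices the schoolbook way: O(n^3) operations."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # Each entry C[i][j] is the dot product of row i of A
            # and column j of B: n multiplications per entry.
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```

Each of the n² output entries costs n multiply-adds, giving the n³ total that faster algorithms chip away at.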

Until the late 1960s, multiplying two n x n matrices was believed to require a cubic number of operations, i.e. O(n^3) time. In 1969, Strassen devised a faster algorithm that brought the running time down to O(n^2.808), a noteworthy advance that kicked off a long series of improvements to the matrix multiplication exponent ω (the smallest value for which multiplication can run in O(n^ω) time). The next major milestone came in 1987, when Coppersmith and Winograd combined Strassen-style techniques with an analysis built on large sets that avoid arithmetic progressions, obtaining the famous bound ω < 2.376.
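Strassen's trick is to split each matrix into four quadrants and compute the result from only seven recursive block products instead of the obvious eight, which is where the exponent log2(7) ≈ 2.808 comes from. A self-contained sketch, assuming for simplicity that n is a power of 2:

```python
def strassen(A, B):
    """Multiply two n x n matrices with Strassen's algorithm (n a power of 2)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]

    h = n // 2
    add = lambda X, Y: [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    sub = lambda X, Y: [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def quadrants(M):
        # Top-left, top-right, bottom-left, bottom-right h x h blocks.
        return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
                [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

    A11, A12, A21, A22 = quadrants(A)
    B11, B12, B21, B22 = quadrants(B)

    # Seven recursive products instead of eight.
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))

    # Reassemble the four quadrants of the result.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```

In practice one would recurse only down to some base block size and fall back to the cubic method, but the sketch above shows where the 7-products-per-halving recurrence, and hence the O(n^2.808) bound, comes from.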

Researchers have long conjectured that ω = 2, but this has never been proved. Recently, another breakthrough was achieved: ω < 2.3727, shown by the theoretical computer scientist Virginia Vassilevska Williams in her paper. Although this is an improvement of only about 0.0028 over the Coppersmith-Winograd bound, it is considered a promising discovery. The underlying idea is recursion: roughly speaking, a 4 x 4 matrix is broken down into four 2 x 2 blocks, and recursion is applied to the block products; the new analysis pushes this idea further by studying higher powers of the Coppersmith-Winograd construction.

While these algorithms are unlikely to be used in practice, a remarkable leap has been made on a key mathematical operation, one that will contribute to a richer understanding of complexity theory.

Source: Gödel's Lost Letter · Image Credit: xkcd · Posted in: #Coffee Room
