
Laying A Foundation For Distributed Computing's Next-Gen

We want to see whether recent advances in statistical learning theory will let us gain insights into systems behavior more rapidly and accurately than human operators can today.

We have a faculty member here I refer to as the Michael Jordan of statistical learning theory — his name is Michael Jordan [see www.cs.berkeley.edu/~jordan/]. In his view, statistical learning theory has made great strides in the last decade. [Statistical learning theorists] have proved theorems that allow [developers] to do things they could never do before. Thanks to these proofs, they can compute in seconds amazingly complicated things that previously would have taken years.

They have been applying these results in chemical-process control and other fields. We want to see if we can use these theorems so that computers can help run computer systems.
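To make that idea concrete, here is a minimal sketch of one thing a computer could do automatically in that role: statistically flag anomalous readings in a server's telemetry. This is an illustration, not anything from the Berkeley work; the function name, latency figures, and threshold are all invented, and a real system would use far richer learning methods than a median-based outlier test.

```python
import statistics

def find_anomalies(samples, threshold=3.5):
    """Flag points whose robust z-score (median/MAD) exceeds `threshold`.

    A deliberately simple stand-in for the statistical models described
    above: the median and median absolute deviation resist distortion by
    the very outliers we are trying to detect.
    """
    med = statistics.median(samples)
    mad = statistics.median(abs(x - med) for x in samples)
    if mad == 0:
        return []
    # 0.6745 rescales MAD so the score is comparable to a normal z-score.
    return [(i, x) for i, x in enumerate(samples)
            if 0.6745 * abs(x - med) / mad > threshold]

# Hypothetical request-latency readings in milliseconds; the 95.3 spike
# is the kind of event an operator might take minutes to notice.
latencies = [12.1, 11.8, 12.4, 12.0, 11.9, 95.3, 12.2, 12.0, 11.7, 12.3]
print(find_anomalies(latencies))  # -> [(5, 95.3)]
```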

EET: How does this relate to the growing complexity of systems? For instance, I recently heard a presentation by an Amazon.com executive who said the company analyzes a terabyte of Web transaction data every day to do what he called "computational marketing."

Patterson: Statistical learning theory is making dramatic strides that are potentially much greater than the complexity growth of systems. What's interesting about statistical learning theory is that it is at its best when you have phenomenal amounts of data; its advantage shows precisely when there is too much data for a human being to analyze.
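One reason statistical methods scale where people cannot is that many of them digest data incrementally, in a single pass with constant memory. As a hedged illustration of that point (again, not drawn from the interview), here is Welford's classic online algorithm for a running mean and variance; the transaction values are made up.

```python
class RunningStats:
    """Welford's online algorithm: mean and variance in one pass, O(1) memory.

    Each observation is folded into the summary and then discarded, which
    is why a statistical model can keep up with, say, a terabyte of
    transactions a day that no human could ever inspect.
    """
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Stream hypothetical transaction amounts without ever storing them.
stats = RunningStats()
for value in (19.99, 4.50, 120.00, 7.25, 54.10):
    stats.update(value)
print(stats.n, round(stats.mean, 2), round(stats.variance(), 2))
```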