Scalable Analytics


A Note on Scalability
Dr. Alan Wagner

Scalability, as it is used in the computer industry, is such an ill-defined term that Mark Hill, a well-known computer architect, once challenged researchers to either define it or stop using it. Parallel computing, however, is one area where the term does have a rigorous meaning: scalability refers to the ability to improve performance by using more than one computing device. Performance does not just mean solving a given problem faster; it is more general than that. It may mean solving a larger problem in the same amount of time, where "larger" can refer to the amount of data or to the amount of computation performed. For example, weather prediction does not need to run faster, but it can benefit from either shrinking the grid size or doing more accurate modeling within each grid sector, leading to better predictions.
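The two notions of performance in the note above correspond to what the parallel-computing literature calls strong scaling (same problem, less time) and weak scaling (larger problem, same time). A minimal numerical sketch of the distinction, with all parameter values purely illustrative:

```python
# Illustrative sketch: quantifying the two kinds of scalability
# described above. All numbers here are hypothetical, not measurements.

def speedup(t_serial, t_parallel):
    """Strong scaling: the same problem, solved faster on more devices."""
    return t_serial / t_parallel

def scaled_speedup(serial_fraction, n_devices):
    """Weak scaling (Gustafson's law): grow the problem with the machine
    so run time stays fixed -- e.g. a finer weather grid rather than a
    faster forecast. Returns how much more work fits in the same time."""
    return n_devices - serial_fraction * (n_devices - 1)

# Same problem on 8 devices, near-linear scaling:
print(round(speedup(100.0, 13.5), 2))      # roughly 7.41x faster

# Larger problem on 8 devices with 5% inherently serial work:
print(scaled_speedup(0.05, 8))             # 7.65x more work, same time
```

Shrinking a weather grid is the canonical weak-scaling example: doubling the resolution multiplies the computation, and more devices absorb that growth without lengthening the forecast run.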

Use Cases
Smart Grid
Smart Grid data is outpacing our ability to analyze it. Scalable Analytics was built to deliver real-time grid analytics on extreme data.

Wall Street
Scalable Analytics delivers extreme scalability to Wall Street: financial models built on real-time streaming data, with statistical correlations supporting critical decision making.

Hadoop
From unstructured text to numerical data, Scalable Analytics governs extreme data sets and provides solutions for Data Center management.

eBay
From correlation to clustering, Scalable Analytics has demonstrated real-time analysis of unstructured textual data, from clustering eBay auction data to live stock market feeds.


© 2011 - Scalable Analytics, Inc. All Rights Reserved.
Scalable Analytics and its logo are trademarks of Scalable Analytics, Inc.