Modeling Big Data Systems by Extending the Palladio Component Model

Johannes Kroß, Andreas Brunnert and Helmut Krcmar

Softwaretechnik-Trends, 35(3)

November 2015

Abstract

The growing availability of big data has induced new storage and processing techniques implemented in big data systems such as Apache Hadoop or Apache Spark. As organizations increasingly adopt these systems, the requirements regarding performance qualities such as response time, throughput, and resource utilization grow as well in order to create added value. Guaranteeing these performance requirements and efficiently planning the needed capacities in advance is an enormous challenge. Performance models such as the Palladio component model (PCM) allow such problems to be addressed. Therefore, we propose a meta-model extension for PCM to be able to model typical characteristics of big data systems. The extension consists of two parts. First, the meta-model is extended to support parallel computing by forking an operation multiple times on a computer cluster, as intended by the single instruction, multiple data (SIMD) architecture. Second, modeling of computer clusters is integrated into the meta-model so that operations can be properly scheduled on the contained computing nodes.
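To make the two extension ideas more concrete, the following is a minimal, hypothetical sketch in Java, not the authors' actual meta-model or PCM's Ecore classes: the class names `ComputeCluster`, `ComputeNode`, and `forkOnCluster` are invented for illustration. It only demonstrates the two concepts named in the abstract, namely forking one operation multiple times in SIMD fashion and scheduling the resulting replicas across the nodes of a modeled cluster (here with a simple round-robin policy, which is an assumption).

```java
import java.util.ArrayList;
import java.util.List;

public class ClusterForkSketch {

    /** A computing node contained in a modeled cluster (hypothetical element). */
    static class ComputeNode {
        final String name;
        double bookedDemandMs = 0.0; // accumulated resource demand
        ComputeNode(String name) { this.name = name; }
    }

    /** A cluster resource that schedules work onto its contained nodes. */
    static class ComputeCluster {
        final List<ComputeNode> nodes = new ArrayList<>();
        private int next = 0;
        ComputeNode nextNode() {             // simple round-robin dispatch (assumed policy)
            ComputeNode n = nodes.get(next);
            next = (next + 1) % nodes.size();
            return n;
        }
    }

    /** Fork one operation 'replicas' times and place each replica on a node. */
    static void forkOnCluster(ComputeCluster cluster, double demandMsPerReplica, int replicas) {
        for (int i = 0; i < replicas; i++) {
            cluster.nextNode().bookedDemandMs += demandMsPerReplica;
        }
    }

    public static void main(String[] args) {
        ComputeCluster cluster = new ComputeCluster();
        for (int i = 1; i <= 4; i++) {
            cluster.nodes.add(new ComputeNode("node-" + i));
        }

        // One map-like operation with 20 ms resource demand, forked 10 times (SIMD style).
        forkOnCluster(cluster, 20.0, 10);

        for (ComputeNode n : cluster.nodes) {
            System.out.printf("%s: %.1f ms booked demand%n", n.name, n.bookedDemandMs);
        }
    }
}
```

Running the sketch prints the demand booked on each of the four nodes, illustrating how a single forked operation is spread over a cluster rather than over one resource container.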

Keywords: Palladio Component Model; Performance Model; Big Data

URL: http://pi.informatik.uni-siegen.de/gi/stt/35_3/03_Technische_Beitraege/SSP_2015_paper_5.pdf