Title: A Process Pattern Model for Tackling and Improving Big Data Quality

Authors: Wahyudi, A. (TU Delft Information and Communication Technology); Kuk, George (Nottingham Trent University); Janssen, M.F.W.H.A. (TU Delft Information and Communication Technology)

Date: 2018

Abstract: Data seldom create value by themselves. They need to be linked and combined from multiple sources, which often come with variable data quality. The task of improving data quality is a recurring challenge. In this paper, we use a case study of a large telecom company to develop a generic process pattern model for improving data quality. The process pattern model is defined as a proven series of activities aimed at improving data quality given a certain context, a particular objective, and a specific set of initial conditions. Four different patterns are derived to deal with the variations in data quality of datasets. Instead of having to find a way to improve the quality of big data for each situation, the process model provides data users with generic patterns, which can be used as a reference model to improve big data quality.

Subject: Big data; Data processing; Data quality; Information quality; Process patterns; Reference model; Telecom

To reference this document use: http://resolver.tudelft.nl/uuid:a410d5ed-8bef-41e8-8140-130ded9cb8ee

DOI: https://doi.org/10.1007/s10796-017-9822-7

ISSN: 1387-3326

Source: Information Systems Frontiers: a journal of research and innovation, 1-13

Part of collection: Institutional Repository

Document type: journal article

Rights: © 2018 A. Wahyudi, George Kuk, M.F.W.H.A. Janssen

Files: 10.1007_s10796_017_9822_7.pdf (PDF, 1.65 MB)