How Big Does Big Data Need to Be?

Research output: Contribution to collected editions/anthologies › Research › peer-review

Collecting and storing as much data as possible is common practice in many companies these days. To reduce the cost of collecting and storing data that is not relevant, it is important to define which analytical questions are to be answered and how much data is needed to answer them. In this chapter, a process for determining an optimal sample size is proposed. Based on benefit/cost considerations, the authors show how to find the sample size that maximizes the utility of predictive analytics. By applying the proposed process to a case study, it is shown that only a very small fraction of the available data set is needed to make accurate predictions.
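The underlying idea can be illustrated with a minimal sketch: model the expected benefit of predictions as a function of sample size n, subtract the cost of collecting and processing n records, and choose the n that maximizes the difference. The learning-curve shape, cost rate, and parameter values below are illustrative assumptions for the sketch, not figures or formulas taken from the chapter.

```python
import numpy as np

def expected_benefit(n, b_max=100_000.0, k=0.002):
    """Assumed saturating learning curve: the value of the predictions
    grows with sample size n but levels off (diminishing returns)."""
    return b_max * (1.0 - np.exp(-k * n))

def cost(n, cost_per_record=0.05, fixed_cost=500.0):
    """Assumed linear cost of collecting, storing and processing n records."""
    return fixed_cost + cost_per_record * n

def optimal_sample_size(candidate_sizes):
    """Return the candidate n that maximizes utility = benefit - cost."""
    utilities = expected_benefit(candidate_sizes) - cost(candidate_sizes)
    best_idx = int(np.argmax(utilities))
    return int(candidate_sizes[best_idx]), float(utilities[best_idx])

if __name__ == "__main__":
    sizes = np.arange(100, 50_001, 100)
    n_opt, u_opt = optimal_sample_size(sizes)
    print(f"Optimal sample size: {n_opt} records (utility ~ {u_opt:.0f})")
```

In practice, the benefit curve would be estimated empirically, for example from learning curves obtained by training the predictive model on increasingly large samples, rather than assumed in closed form as above.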
Original language: English
Title of host publication: Enterprise Big Data Engineering, Analytics, and Management
Editors: Martin Atzmueller, Samia Oussena, Thomas Roth-Berghofer
Number of pages: 12
Place of publication: Hershey
Publisher: Business Science Reference
Publication date: 06.2016
Pages: 1-12
ISBN (print): 9781522502937
ISBN (electronic): 9781522502944
DOIs:
Publication status: Published - 06.2016