

doi:10.3926/jiem.2011.v4n2.p168-193

JIEM, 2011 – 4(2): 168-193 – Online ISSN: 2013-0953
Print ISSN: 2013-8423

The costs of poor data quality

Anders Haug, Frederik Zachariassen, Dennis van Liempd
University of Southern Denmark (DENMARK)
adg@sam…dk; frz@sam…dk; dvl@sam…dk

Received August 2010
Accepted January 2011

Abstract:

Purpose: The technological developments have implied that companies store
increasingly more data. … This paper argues that perfect data quality should
not be the goal, but instead the data quality should be improved to only a
certain level. …

Design/methodology/approach: The paper starts with a review of data quality
literature. … These propositions are investigated by a case study. … A case
study illustrates the usefulness of these propositions. … Future research may
build on these definitions. …


Practical implications: As illustrated by the case study, the definitions
provided by this paper can be used for determining the right data maintenance
effort and … In many companies, such insights may lead to significant
savings. … This represents an original contribution of value to future
research and practice. …
Poor quality data can, therefore, have significantly negative impacts on the
efficiency of an organization, while high quality data are often crucial to a
company's success (Madnick et al., 2009; Batini et al., …). However, several
industry expert surveys indicate that data quality is an area to which many
companies seem not to give sufficient attention or know how to deal with
efficiently (Marsh, 2005; Piprani & Ernst, 2008; Jing-hua et al., …).

Vayghan et al. … Master data are defined as the basic characteristics of
business entities, i.e. customers, products, employees, suppliers, etc. …
Transaction data describe the relevant events in a company, i.e. orders,
invoices, payments, deliveries, storage records, etc. … In this context
Knolmayer and Röthlin (2006) argue that capturing and processing master data
are error-prone activities, where inappropriate information system
architectures, insufficient coordination with business processes, inadequate
software implementations or inattentive user behaviour may lead to disparate
master data. …

In spite of the importance of having correct and adequate data in a company,
there seems to be a general agreement in literature that poor quality data is
a problem in many companies. …

On the other hand, Eppler and Helfert (2004) argue that although there is much
literature that claims that the costs of poor data quality are significant in
many companies, only very few studies demonstrate how to identify, categorize
and measure such costs (i.e. how to establish the causal links between poor
data quality and monetary effects). … The efforts have been directed to
investigating the effects of data errors on computer-based models such as
neural networks, linear regression models, rule-based systems, etc. …
According to Kim (2002), the types of damage that low quality data can cause
depend on the nature of data, the nature of the use of data, the types of
responses (by the customers or citizens) to the damages, etc. … Firstly,
companies incur costs when cleaning and ensuring high master data quality. …
The purpose of this paper is to provide a better understanding of the
relationship between such costs. …

In this context the paper argues that there is a clear trade-off relationship
between these two cost types and that the task facing the companies in turn is
to balance this trade-off. … Next, Section 3 proposes a model to determine the
optimal data maintenance effort and a classification of different types of
costs inflicted by poor quality data. … The paper ends with a conclusion in
Section 5. …

2. Data quality literature

Firstly, this section makes a clarification of the term 'data quality' and
then provides a fundamental understanding of the impacts of poor quality
data. …

2.1 …

Popular definitions of such terms have been made by Davenport and Prusak
(1998), who define data as "discrete, objective facts about events" and
information as data transformed by the value-adding processes of
contextualization, categorization, calculation, correction and
condensation. … (2002), who define data as "providing a record of signs and
observations collected from various sources" and information as when "data are
presented in a particular way in relation to a particular context of
action". … (…, 2002). …

Data quality is often defined as 'fitness for use', i.e. an evaluation of the
extent to which some data serve the purposes of the user (e.g. Lederman et
al., …). … Another way to understand the concept of data quality is by
dividing it into subcategories and dimensions. …

They argue that the accuracy dimension is the easiest to evaluate, as it is
merely a matter of analysing the difference between the correct value and the
actual value used. … As for the evaluation of the completeness of some data,
this can also be done relatively straightforwardly, as long as the focus is on
whether the data are complete or not, in contrast to defining the level of
completeness, e.g. the percentage of data completeness. …
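The two evaluations described above can be sketched in a few lines of code.
The customer records, field names and reference values below are invented for
illustration (they are not from the paper): accuracy compares each stored
value against a known-correct reference, while completeness counts the share
of cells that are filled in at all.

```python
# Hypothetical illustration of the accuracy and completeness evaluations
# described above; records, fields and reference values are invented.

def accuracy(records, reference, field):
    """Share of records whose `field` matches the known-correct value."""
    correct = sum(1 for r in records if reference.get(r["id"]) == r[field])
    return correct / len(records)

def completeness(records, fields):
    """Share of (record, field) cells that are non-empty."""
    filled = sum(1 for r in records for f in fields
                 if r.get(f) not in (None, ""))
    return filled / (len(records) * len(fields))

customers = [
    {"id": 1, "city": "Odense", "phone": "12345678"},
    {"id": 2, "city": "Kolding", "phone": None},
    {"id": 3, "city": "Odense", "phone": "87654321"},
]
correct_city = {1: "Odense", 2: "Esbjerg", 3: "Odense"}  # reference values

print(accuracy(customers, correct_city, "city"))    # 2 of 3 records match
print(completeness(customers, ["city", "phone"]))   # 5 of 6 cells are filled
```

As the paper notes, the accuracy check presupposes that a correct reference
value exists, which in practice is often the expensive part to obtain.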

The evaluation of consistency is a little more complex, since this requires
two or more representation schemes in order to be able to make a
comparison. … They limit their focus to intrinsic data qualities, of which
they define four intrinsic dimensions: completeness, unambiguousness,
meaningfulness and correctness. … (1995). … (1995) summarize the most often
cited data quality dimensions as shown in Table 1. …

Table 1. "Cited data quality dimensions" … (1995). …

For each category they define a set of dimensions, 18 in all. … (2009), who
argue that 'representational data quality' can be perceived as a form of
'accessibility data quality' instead of a category of its own. … (2009) define
three data quality categories: intrinsic, accessibility and usefulness. …
With a basis in this view of data as resources, Levitin and Redman discuss how
thirteen basic properties of organizational resources may be translated into
properties for data. …
2.2 Impacts of poor quality data

The development of information technology during the last decades has enabled
organizations to collect and store enormous amounts of data. … Since larger
and … Another often mentioned data related problem is that companies often
manage data at a local level (e.g. department or location) … (…, 2006; Smith,
2008; Vayghan et al., …). In this vein, Lee et al. … From a solution
perspective, ERP systems have been promoted as a panacea for dealing with the
lack of data integration by replacing inadequately coordinated legacy systems
(Davenport, 1998; Knolmayer & Röthlin, 2006). …

Poor quality data can imply a multitude of negative consequences in a
company … (…, 2004; Wang & Strong, 1996) … (…, 2003; Leo et al., …). Poor data
quality also increases operational costs, since time and other resources are
spent detecting and correcting errors. … Thus, poor data quality can have
negative effects on the organizational culture (Levitin & Redman, 1998; Ryu et
al., …). Poor data quality also means that it becomes difficult to build trust
in the company data, which may imply a lack of user acceptance of any
initiatives based on such data. … According to Redman (1998), … Additionally,
data quality research has not yet advanced to the point of having standard
measurement methods for any of these issues. … According to Redman (1998),
measured at the field level, the reported error rates are in the interval of
0… Furthermore, Redman (1998) claims that at least three proprietary studies
have yielded estimates in the 8-12% of revenue range, but informally 40-60% of
the expense of the service organization may be consumed as a result of poor
data. …

Häkkinen and Hilmola (2008) argue that marginal data inaccuracies (e.g. 1-5%)
may not necessarily represent a major problem in manufacturing, but that such
inaccuracies will have direct effects in terms of lost sales and operational
disruptions in the after-sales organizations. … These industry experts include
Gartner Group, Price Waterhouse Coopers and The Data Warehousing Institute,
which claim to identify a crisis in data quality management and a reluctance
among senior decision-makers to do enough about it (Marsh, 2005). …



- "According to Gartner, bad data is the number one cause of CRM system
failure"

- "Less than 50 per cent of companies claim to be very confident in the
quality of their data"

- "Business intelligence (BI) projects often fail due to dirty data, so it is
imperative that BI-based business decisions are based on clean data"

- "Only 15 per cent of companies are very confident in the quality of external
data supplied to them"

- "Customer data typically degenerates at 2 per cent per month or 25 per cent
annually"

- "Organisations typically overestimate the quality of their data and systems
and underestimate the cost of errors"

- "Business processes, customer expectations, source … compliance rules are
constantly changing …"
2.3 Data maintenance effort and costs inflicted by poor quality data

As mentioned in the introduction, although there seems to be agreement in
literature that the costs of poor data quality are significant in many
companies, only very few studies demonstrate how to identify, categorize and
measure such costs (Eppler & Helfert, 2004; Kim & Choi, 2003). …

Raman (2000) argues that evidence from previous studies shows that the quality
of point-of-sale data is often poor and that even at well-run retailers it
cannot be taken for granted. …
The focus of the … For the first-mentioned, the consequence of inaccurate data
is simply a subtraction of the sum of overpriced items from the sum of
underpriced items. … Raman recommends two steps to improve data quality, which
in headlines can be formulated as: (1) "companies should make greater use of
the data that they have stored"; and (2) "that companies start measuring data
quality to the extent possible". …

In relation to information quality assessment, among others, Ge and Helfert
classify typical information quality problems which are identified by previous
research, as shown in Table 2. …

Table 2. "Classification of information quality problems identified in
literature" …

On the issue of information quality management, Ge and Helfert (2007) argue
that this is an intersection between the fields of quality management,
information management and knowledge management … which include: database,
information manufacture system, accounting, marketing, data warehouse,
decision-making, healthcare, enterprise resource planning, customer
relationship management, finance, e-business, World Wide Web and supply chain
management. …

Table 3. "A data quality cost taxonomy" …


Eppler and Helfert (2004) review and categorize the potential costs associated
with low quality data. … To address the lack of literature on poor data
quality versus costs, according to Eppler and Helfert, "cost classifications
based on various criteria can be applied to the data quality field in order to
make its business impact more visible". … Additionally, Eppler and Helfert
identify 10 cost examples of assuring data quality: (1) information quality
assessment or inspection costs; (2) information quality process improvement
and defect prevention costs; (3) preventing low quality data; (4) detecting
low quality data; (5) repairing low quality data; (6) costs of improving data
format; (7) investment costs of improving data infrastructures; (8) investment
costs of improving data processes; (9) training costs of improving data
quality know-how; and (10) management and administrative costs associated with
ensuring data quality. … two major types: improvement costs and costs due to
low data quality. …

3. Proposition

This paper extends the literature on data quality costs, especially the work
of Eppler and Helfert (2004), by proposing:

(1) A definition of the optimal data maintenance effort
(2) A classification of costs inflicted by poor quality data

The two propositions are defined and discussed in the following subsections.
3.1 Defining the optimal data maintenance effort

The first proposition of this paper is shown in Figure 1. … The second and
horizontal axis deals with the quality of data. … The costs inflicted by poor
quality data are for example faulty decisions based on poor data quality,
whether these are of operational or strategic character. … The total costs
associated with data quality are the aggregated cost of the two explained
curves. … Firstly, during data maintenance the focus is on the most critical
data (i.e. the ones with the highest payoff per resources spent) before moving
on to less critical ones. … The second assumption is that the costs of the
efforts to ensure high data quality are not causally related to their
importance, i.e. focusing on a set of poor quality data with great impact on
costs is not necessarily cheaper than focusing on data with little impact on
costs. …


Figure 1. "Total costs incurred by data quality on the company" …

The central thesis here is that extensively cleaning data, thereby ensuring
high quality of the data, becomes less profitable at some point. …
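The trade-off behind Figure 1 can be illustrated numerically. The paper gives
no formulas, so the two curves below are assumed functional forms chosen only
to mimic the described shapes: assurance costs grow steeply as quality
approaches perfection, while costs inflicted by poor data fall as quality
improves; the optimal maintenance effort is then simply the quality level that
minimizes their sum.

```python
# Illustrative sketch of the Figure 1 trade-off. The functional forms and
# constants are assumptions made for this example; the paper specifies none.

def assurance_cost(q):
    # Cost of assuring data quality: rises steeply as quality q (0..1)
    # approaches perfection.
    return 20 * q / (1.001 - q)

def poor_quality_cost(q):
    # Costs inflicted by poor quality data: fall as quality improves.
    return 300 * (1 - q) ** 2

# Optimal maintenance effort: the quality level with the lowest total cost.
levels = [i / 100 for i in range(101)]
optimal = min(levels, key=lambda q: assurance_cost(q) + poor_quality_cost(q))

print(optimal)  # lands well below 1.0: perfect data is not cost-optimal
```

Whatever the true curve shapes in a given company, the qualitative conclusion
of the model survives: as long as assurance costs diverge near perfect quality
while inflicted costs stay bounded, the total-cost minimum lies strictly below
perfect data.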

Although Figure 1 seems to provide a very logical perspective on the
estimation of the optimal data quality maintenance efforts, there is still
some way to go. … The first (costs of assuring data quality) is relatively
easy to evaluate, since this is simply a question of registering resources
used on this work, i.e. internal hours spent, consultant fees, software,
etc. … To support the task of estimating the costs inflicted by poor quality
data, the next section looks closer at the nature of such costs. …

3.2 A classification of costs inflicted by poor quality data

The first dichotomy relates to how visible the costs are, namely direct versus
hidden costs … (…, 2001; Srinidhi, 1992), as well as data quality literature
(Kengpol, 2001). … For this reason, terms such as strategic activity based
costing (Kaplan & Cooper, 1998), total cost of ownership (Ellram & Siferd,
1993) and cost-to-serve (Braithwaite & Samakh, 1998) have been invented and
invested in to include all costs associated with a given action taken by a
company or department. … An example of such a cost could be the faulty
decisions stemming from not knowing the profitability of products. …
This could for example be faulty delivery addresses for registered customers,
resulting in wrong deliveries. … More specifically, the second dichotomy
refers to the fact that data can be viewed on both an operational and a
strategic level. … An example of operational data can be delivery addresses,
pricing of products and other order processing related data. … On a more
strategic level, data can be seen as a basis for making decisions in
companies, where the decisions can be regarded as having a relatively longer
time span when compared to operational data. … If the company is not able to
track and locate both its variable and its fixed costs, it will not be
possible for the company to determine a given price on a given product. …
If a company … Here, it should be made clear that operational and strategic
data can be one and the same. … As a result, while some data can be seen as
strategic in one company, other companies might regard them as operational. …
This level has not been included, as the purpose of this paper is to provide
an initial and better understanding of the relationship between such costs. …

In Figure 2, the two dichotomies are combined to provide some general
categories of costs of poor quality data. …

Figure 2 combines the two dichotomies into four cells:

- Hidden costs, effects on operational tasks: e.g. long lead times, data being
registered multiple times, employee dissatisfaction, etc.
- Hidden costs, effects on strategic decisions: e.g. …
- Direct costs, effects on operational tasks: e.g. manufacturing errors, wrong
deliveries, payment errors, etc.
- Direct costs, effects on strategic decisions: e.g. …

Figure 2. …


In Figure 2, it is highlighted that depending on the two dimensions of direct
costs versus hidden costs and operational data versus strategic data, four
types of costs incurred by poor data quality can be operationalized. … When
the cost can be classified as a direct cost with an operational view on data,
costs can for example be associated with poor order processing data. …
Another classical example is the … Contrarily, when the cost can be
categorized as a hidden cost, but still with an operational use of data, the
company will incur costs on a day-to-day level of which it is in fact not
aware. … A company that has been producing products with the same lead time
for a long period of time runs the risk of taking this for granted, not
realizing that the lead times could actually be shorter if the data were
corrected. …

When costs are direct but are instead considered from a strategic data
perspective, the costs incurred stem from operations which the company knows
are inefficient and which have a big impact on the strategic direction in
which the company is currently heading. … Not running the newly placed
strategic inventory location properly could be an example of costs incurred
due to data not being sufficiently cleaned and organized. … In this case, an
example would be a wrong allocation of costs (typically fixed costs) when
calculating individual product profitability. …
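The four cells of Figure 2 can be captured in a small lookup table, for
instance when tagging observed cost items during an assessment. The example
items are the ones named in this section; the data structure and function
names are a hypothetical illustration, not part of the paper's proposal.

```python
# Sketch of the Figure 2 matrix: visibility (direct/hidden) x data level
# (operational/strategic) -> example cost items mentioned in this section.

COST_MATRIX = {
    ("direct", "operational"): [
        "manufacturing errors", "wrong deliveries", "payment errors"],
    ("hidden", "operational"): [
        "long lead times", "data registered multiple times",
        "employee dissatisfaction"],
    ("direct", "strategic"): [
        "known inefficient operations affecting strategic direction"],
    ("hidden", "strategic"): [
        "wrong allocation of fixed costs in product profitability"],
}

def classify(visibility, level):
    """Return the example cost items for one cell of the matrix."""
    return COST_MATRIX[(visibility, level)]

print(classify("hidden", "operational"))
```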

3.3 …

The focus when using the two proposed models could for example be on item
data, sales order data, production planning data, etc. … On the other hand, if
using a scope that is too narrow, important data may be neglected. …

… common context. …

4. Case study

In this section, a case study illustrates the way in which a company attempted
to improve the quality of their data. … 000 Euro per year. … In a normal
situation, a new car is equipped with parts produced by the same auto
manufacturer. … Some parts in cars, vans and trucks are, however, more prone
to breaking compared to others. … Typical causes for the breakdown of these
car parts are normal wear and tear, but also (head-on) collisions with other
cars. … As this is a costly endeavour for the original car manufacturer, an
after sales market for car, van and truck spare parts exists. … Before turning
to the empirical data, a short section denoting the methodological choices
taken is given. …
4.1 Methodology

A qualitative and exploratory research design was undertaken in order to
investigate the level of master data quality by the focal company (Stake,
2000). … Using semi-structured interview protocols gave the interviewer the
flexibility to focus on what the company believed were the most important
problems as regards their current level of data quality. …
The single case study can be reported as being a holistic, representative …
The case is representative because the case company is typical of many other
major manufacturing companies, as the company has had problems in managing
their data quality, which is also the main sampling criterion. … A statistical
generalization is, therefore, not achievable, as this type of research can be
regarded as exploratory research. …

One researcher spent significant time in the actual focal company,
participating for 6 months both at official meetings as an observer and in
unofficial, unstructured interviews with the company's chief operating officer
(COO), business intelligence managers, supply chain manager and several sales
managers. … As one of the researchers participated in the meetings, the
researcher runs the risk of blurring his role as a researcher with his role in
the company. … With respect to qualitative validity criteria, credibility was
ensured by checking the authenticity of the case description with the case
company, after which any discrepancies were changed. … Even though only one of
the authors spent time at the case company, dependability was sought to be
ensured by comparing all three authors' interpretations of the results and
working out any disagreements on interpretation. …

4.2 …

In fact, the company currently has a stock-keeping unit (SKU) count of
approximately 8,500. … This creates a complex situation for the organization,
in which data to be managed are abundant, with pricing of products being a
particularly time-consuming … The company is currently employing two full-time
business intelligence managers whose sole task is to clean data and price
products. … As the market for the company's products can be seen as a
commodity market, precise pricing of the app. … During recent years, Chinese
competitors have entered the market, which has had the consequence that the
focal company has been put under pressure in terms of maintaining
profitability. …

The two business intelligence managers knew that the company incurred quite
heavy costs inflicted by relatively simple operational tasks. … Additionally,
many of the customers of the company had had individual pricing agreements
with the company, but these agreements were not systematized, which meant that
the sales people of the company used a lot of time on retrieving and
processing individual and unique customer data. … For example, both managers
would spend a lot of their time recording, retrieving, systematizing and
updating pricing information gathered from the company's nearest
competitors. … This updating of prices involved, however, many countries, with
many individual pricing lists being gathered from many different
competitors. … That is, data were at times registered twice. … The COO of the
company estimated that the costs of these unnecessary activities were the
equivalent of payroll costs for two full-time marketing employees. …

… update prices manually. … That is, the business intelligence managers, the
COO and several marketing related employees expressed that it would never be
possible nor expedient to obtain 100% correct prices. … Trying to obtain
perfect prices would mean a far too time-consuming data discipline, in which
the company was not interested. … Trying to maintain data quality over a
certain threshold will result in costs pertaining to data discipline
inexpediently exceeding the costs saved by better decision-making due to
better data quality. …

Direct costs were mainly associated with supply chain or logistical
operations. … Besides these, minor inventory locations were located in the
different countries to which the company was supplying. … These arguments
were, however, difficult to reach an agreement upon, since cost data
pertaining to the use of the inventory locations were either missing or
faulty. … This meant that costs regarding unnecessary transportation of goods,
not meeting delivery deadlines, and either stock-outs or limited capacity at
inventory settings were incurred by the company. … That is, the company
essentially had no calculations of customer profitability, but only had rough
guidelines such as the volume sold and contribution margin. …

Not knowing the costs of having products produced, the sales staff were also
quite often not capable of determining the optimal price that the customer
should pay for the product. … In order to improve data quality at this
strategic level, the company set out to gather information on costs related to
inventory capacity and transportation costs. …

… inventories, thereby removing several smaller inventories located especially
in Europe. … There were, however, also doubts as to whether this decision
actually would be the best for the company logistically, as the data collected
sometimes were not sufficiently reliable. … It was, however, judged by the
company that it would be too great a data exercise to gather precise
information on all inventory locations. …

In the case described, the proposed matrix (Figure 2) provided a perspective
on costs of poor data quality, which contributed to a better understanding of
this issue. …

5. Conclusions

This paper proposed a model for determining the optimal level of data
maintenance efforts from a cost perspective. … As the model shows, the optimal
level of data maintenance is not to achieve perfect data, but only a level
where the costs of the maintenance work do not exceed the savings from the
costs inflicted by poor quality data. … Different industries have different
characteristics, i.e. different relations between the costs of poor data
quality and the costs of assuring data quality. …

While the first dimension (i.e. costs of data maintenance) is rather
straightforward to calculate, the costs inflicted by poor quality data are
much more difficult to … To provide a better understanding of such costs, the
paper proposed four categories of such costs. … These four categories provide
a better understanding of how to estimate data quality related costs. … These
types of costs are difficult to track, and the company might not notice that
it is in fact incurring them. … Contrarily, examples of effects of poor data
quality on strategic decisions whose costs are hidden are a focus on wrong
customer segments and poor price policies. … However, estimates of costs
related to poor quality data would still be associated with great
uncertainties. … This was also empirically illustrated by the use of a single
case study. … This means that more detailed methods for evaluating the
different types of costs inflicted by poor quality data need to be defined. …
The focus of this research project is to understand how data quality is
related to the expenses of a company. … The ideas presented in this paper
represent the initial foundation for this work. … Although these contributions
are to be further elaborated on in future research, in their present form they
provide a better understanding of the topic, which hopefully aids companies in
their data quality work. …

References

Ballou, D., Madnick, S., … (2004). … Journal of Management Information
Systems, 20, 9-11. …

…, P. (1985). … Management Science, 31(2), 150-162. doi:10.1287/mnsc…

Batini, C., Cappiello, C., …, & Maurino, A. (…). Methodologies for Data
Quality Assessment and Improvement. …

Braithwaite, A., … (1998). … International Journal of Logistics Management,
9(1), 64-88. …

Davenport, T.H. (…). Putting the enterprise into the enterprise system. …

Davenport, T.H., & Prusak, L. (…). Working Knowledge: How Organizations Manage
What They Know. …

Ellram, L.M., & Siferd, S.P. (1993). … Journal of Business Logistics, 14(1),
163-184. …

Eppler, M., & Helfert, M. (2004). A classification and analysis of data
quality costs. …

Even, A., … (2009). … Journal of Computer Information Systems, 50(2),
127-135. …

Ge, …, & Helfert, M. (…). A Review of Information Quality Research - Develop a
Research Agenda. …

Haug, A., …, & Arlbjørn, J.S. (2009). … Industrial Management & Data Systems,
109(8), 1053-1068. doi:10.1108/02635570910991292

Häkkinen, …, & Hilmola, O-P. (…). ERP evaluation during the shakedown phase:
Lessons from an after-sales division. …

Joshi, S., …, & Lave, L. (…). Estimating the hidden costs of environmental
regulation. … doi:10…

Jing-hua, …, Kang, X., … (2009). … 16th International Conference on Management
Science & Engineering, September 14-16, 2009, Moscow, Russia. …

…, Strong, D., … (2003). … Communications of the ACM, 45, 184-192.
doi:10.1145/505248…

Kaplan, R.S., & Cooper, R. (1998). … Boston: Harvard Business School Press. …

Kim, W., … (2002). … Journal of Object Technology, 1(4), 39-47.
doi:10.5381/jot…c3

Kim, W., … (2003). … Journal of Object Technology, 2(4), 69-76.
doi:10.5381/jot…c6

Kengpol, A. (…). The Implementation of Information Quality for the Automated
Information Systems in the TDQM Process: A Case Study in Textile and Garment
Company in Thailand. In: Pierce, E., … Katz-Haas (Eds.), … 206-216, Boston. …

Knolmayer, …, & Röthlin, M. (…). Quality of material master data and its
effect on the usefulness of distributed ERP systems. … doi:10…

Lederman, …, Shanks, G., … Meeting privacy obligations: the implications for
information systems development. … European Conference on Information
Systems. … Retrieved June 29th, 2009, from: http://is2…ac…pdf

Lee, Y., …, Funk, J., … Journey to data quality. …

Leo, L., … Yang, W., …, & Wang, R., … (2002). … Communications of the ACM,
45(4), 211-218. …

Levitin, A.V., & Redman, T.C. (…). Data as a resource: Properties,
implications, and prescriptions. …

Madnick, S., …, & Xian, X. (…). The design and implementation of a corporate
householding knowledge processor to improve data quality. …

Marsh, R. (…). Drowning in dirty data? It's time to sink or swim: A four-stage
methodology for total data quality management. … doi:10…dbm…

…, B., …, A. (…). Qualitative Data Analysis: An Expanded Sourcebook. …

Newell, S., …, Scarbrough, H., … (2002). … Basingstoke: Palgrave-Macmillan. …

…, & Kusiak, A. (…). Enterprise resource planning (ERP) operations support
system for maintaining process integration. … doi:10…

Piprani, …, & Ernst, D. (…). A Model for Data Quality Assessment. … doi:10…

Raman, A. (…). Retail-data quality: evidence, causes, costs, and fixes. …
doi:10…

Redman, T.C. (1998). The impact of poor data quality on the typical
enterprise. … doi:10…269025

Ryu, K., …, Park, J., …, & Park, J., … (2006). … ETRI Journal, 28(2), 191-204.
doi:10.4218/etrij…0105…

… (2005). … London: Sage Publications. …

Smith, …, A., …, D. … Master data management: Salvation or snake oil? …

Srinidhi, B. (…). The hidden costs of specialty products. …

Stake, R. (2000). … In: …, K., … S. … (Eds.), The handbook of qualitative
research (pp. …). California: Sage Publications. …

…, K., …, P. (…). Examining data quality. … doi:10…269021

Vayghan, J., …, Garfinkle, S., …, Walenta, C., … (2007). … IBM Systems
Journal, 46(4), 669-684. doi:10.1147/sj…0669

Wand, Y., …, Y. (…). Anchoring data quality dimensions in ontological
foundations. … doi:10…240479

Wang, R., …, & Strong, D., … (…). Beyond accuracy: What data quality means to
data consumers. …

…, Y., …, C., …, P. (…). A framework for analysis of data quality
research. … doi:10…404034

Watts, S., …, & Shankaranarayanan, A., … (2009). … Decision Support Systems,
48, 202-211. doi:10.1016/j…012

Yin, R. (2009). … Los Angeles, LA: Sage Publications. …

Article's contents are provided under a … Creative Commons license
(…org/licenses/by-nc/3…). It must not be used for commercial purposes. …

