Data Quality Control and Assurance in Workflow Management
Description

Workflow is an important component in most flexible systems for business process management. As the technology for business process automation, Workflow Management Systems provide a platform for coordinating, controlling, and integrating business processes in a distributed work environment. Given that a workflow is designed to produce specified information output for a particular customer or market, it is critical to ensure that a workflow executes properly, guided by quality information and in conformance with the required business policies, thus creating value for both stakeholders and customers.

Even in a well-designed workflow, unintentional human errors, such as data entry errors, and intentional errors that may constitute fraud can still be introduced through human interaction with workflow systems. These errors can cause improper workflow executions and result in huge economic losses for an organization. For instance, in the case described in (Zur Muehlen and Rosemann 2005), a simple data entry mistake in a payroll date introduced by a staff member caused massive business interruption. Intentional errors such as insurance fraud can also have significant financial consequences: it has been estimated that more than one in three bodily injury claims from car crashes involves fraud, and that 11 to 30 cents of every property/casualty claim dollar is lost to fraud (McKendrick 2007). Ensuring data quality is therefore an urgent issue in workflow management. This has been echoed by the passage of the Sarbanes-Oxley Act, which requires public companies to ensure that their processes, especially those involving financial transactions, are strictly controlled.
In addition to incorporating data-check rules into the workflow engine, data quality review is a major effective means of detecting and correcting data errors in workflows: review can be applied at certain steps in a workflow, and all the critical data involved in the workflow can be sampled and scrutinized for errors. Because data quality review consumes resources such as labor and time, it is cost-effective to minimize the number of steps at which review is deemed necessary.

Part 1 of the project develops an analytical framework for ensuring data quality in workflow management. The analysis shows that, in a workflow containing both decision nodes and non-decision nodes, it is preferable to apply the data quality review at a decision node rather than at a non-decision node. Part 2 establishes an optimization model for allocating data quality reviews in a workflow. This model tackles two questions: first, where to apply the data quality review, and second, how many workflow cases to select at each review node, such that the workflow is the most cost-effective. Part 3 proposes a dynamic sampling method that adjusts the case sampling rates dynamically so that the overall selection rate meets the minimum requirement set by process auditors and various special control standards, with the sampling rates obtained in Part 2 serving as the initial rates. Part 4 applies the Statistical Process Control (SPC) methodology, which has been widely adopted in manufacturing, to monitoring variations in data quality in workflows and, moreover, presents different ways to implement the proposed data quality review method in workflow management.
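To make the dynamic sampling idea of Part 3 concrete, the sketch below shows one simple way per-node sampling rates could be adjusted so that the overall selection rate meets an auditor-set minimum. The uniform scaling rule, the function name, and the cap at 1.0 are illustrative assumptions, not the project's actual method.

```python
def adjust_sampling_rates(initial_rates, case_volumes, min_overall_rate):
    """Scale per-node case sampling rates upward, if needed, so that the
    overall selection rate across all review nodes meets a minimum set by
    process auditors. Hypothetical sketch; the project's actual adjustment
    rule is not specified in this description.
    """
    selected = sum(r * v for r, v in zip(initial_rates, case_volumes))
    total = sum(case_volumes)
    overall = selected / total  # current overall selection rate
    if overall >= min_overall_rate:
        return list(initial_rates)  # already compliant; keep initial rates
    scale = min_overall_rate / overall
    # Uniformly scale all node rates, capping each at 1.0 (full review).
    return [min(1.0, r * scale) for r in initial_rates]
```

For example, with initial rates of 10% and 20% at two nodes handling 100 cases each, the overall rate is 15%; a 20% minimum would scale both rates by 4/3. A more refined allocation could instead raise rates preferentially at the cheapest review nodes.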
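For Part 4, a standard SPC tool for monitoring a fraction-defective quantity, such as the share of sampled workflow cases found to contain data errors, is the p-chart with three-sigma limits. The sketch below computes those limits per review period; it is a minimal illustration of the general SPC technique, not the specific monitoring scheme developed in the project.

```python
import math

def p_chart_limits(error_counts, sample_sizes):
    """Center line and 3-sigma p-chart limits for the fraction of sampled
    workflow cases containing data errors, one (LCL, UCL) pair per period."""
    p_bar = sum(error_counts) / sum(sample_sizes)  # pooled error fraction
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)  # floor at 0 (a proportion)
        ucl = min(1.0, p_bar + 3 * sigma)  # cap at 1
        limits.append((lcl, ucl))
    return p_bar, limits

def out_of_control(error_counts, sample_sizes):
    """Flag review periods whose observed error fraction falls outside
    the control limits, signaling a possible shift in data quality."""
    _, limits = p_chart_limits(error_counts, sample_sizes)
    return [
        not (lcl <= x / n <= ucl)
        for x, n, (lcl, ucl) in zip(error_counts, sample_sizes, limits)
    ]
```

With error counts of [2, 3, 2, 12, 3] in five periods of 100 sampled cases each, only the fourth period (12% errors) exceeds the upper control limit and would trigger an investigation of the workflow.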
Effective start/end date: 1/04/09 → 30/06/09