Probabilistic Data Integration


van Keulen, M. (2009) Probabilistic Data Integration. In: 08421 Abstracts Collection - Uncertainty Management in Information Systems, 12 - 17 Oct 2008, Dagstuhl, Germany.

Abstract: In data integration efforts such as portal development, much development time is devoted to entity resolution. Often advanced similarity measurement techniques are used to remove semantic duplicates or solve other semantic conflicts. It proves impossible, however, to automatically get rid of all semantic problems. An often-used rule of thumb states that about 90% of the development effort is devoted to semi-automatically resolving the remaining 10% hard cases. In an attempt to significantly decrease human effort at data integration time, we have proposed an approach that strives for a 'good enough' initial integration which stores any remaining semantic uncertainty and conflicts in a probabilistic XML database. The remaining cases are to be resolved during use with user feedback.
We conducted extensive experiments on the effects and sensitivity of rule definition, threshold tuning, and user feedback on the integration quality. We claim that our approach indeed reduces development effort, rather than merely shifting it, by showing that setting rough safe thresholds and defining only a few rules suffices to produce a 'good enough' integration that can be meaningfully used, and that user feedback is effective in gradually improving the integration quality.
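The abstract's core idea — rough safe thresholds that split pairs into certain matches, certain non-matches, and an uncertain band stored probabilistically until user feedback resolves it — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the thresholds, the string similarity measure, and all function names are hypothetical stand-ins, and the "uncertain" tuple stands in for a choice node in a probabilistic XML database.

```python
# Hedged sketch of two-threshold entity resolution with deferred resolution.
# Pairs scoring above T_HIGH are merged automatically, pairs below T_LOW are
# kept distinct, and the band in between is stored as weighted alternatives
# that later user feedback collapses to a certain decision.
# All names, thresholds, and the similarity measure are illustrative.

from difflib import SequenceMatcher

T_LOW, T_HIGH = 0.5, 0.9  # rough "safe" thresholds


def similarity(a: str, b: str) -> float:
    """Simple string similarity standing in for an advanced measure."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def integrate(record_a: str, record_b: str):
    """Return a certain decision, or an 'uncertain' node with alternatives."""
    s = similarity(record_a, record_b)
    if s >= T_HIGH:
        return ("same", 1.0)       # merge automatically
    if s <= T_LOW:
        return ("distinct", 1.0)   # keep both records
    # Uncertain: keep both possibilities, weighted by the score.
    return ("uncertain", [("same", s), ("distinct", 1.0 - s)])


def apply_feedback(node, user_says_same: bool):
    """User feedback collapses an uncertain node to a certain decision."""
    if node[0] != "uncertain":
        return node  # already certain; nothing to resolve
    return ("same", 1.0) if user_says_same else ("distinct", 1.0)


print(integrate("John Smith", "John Smith"))   # clear duplicate: ('same', 1.0)
decision = integrate("J. Smith", "Jon Smyth")  # lands in the uncertain band
print(apply_feedback(decision, user_says_same=True))
```

The point of the sketch is that only the uncertain band ever reaches a human, which is how setting a few rough thresholds can shrink the manual effort to the genuinely hard cases.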
Item Type: Conference or Workshop Item
Faculty: Electrical Engineering, Mathematics and Computer Science (EEMCS)
Link to this item: http://purl.utwente.nl/publications/65438
Official URL: http://drops.dagstuhl.de/opus/volltexte/2009/1942

 
