DOI: 10.1145/2525314.2525346
research-article

GeoTruCrowd: trustworthy query answering with spatial crowdsourcing

Published: 05 November 2013

Abstract

With the abundance and ubiquity of mobile devices, a new class of applications, called spatial crowdsourcing, is emerging, which enables spatial tasks (i.e., tasks related to a location) to be assigned to and performed by human workers. However, one of the major challenges of spatial crowdsourcing is verifying the validity of the results provided by workers when the workers are not equally trusted. To tackle this problem, we assume every worker has a reputation score, which states the probability that the worker performs a task correctly. Moreover, we define a confidence level for every spatial task, such that the answer to the task is accepted only if its confidence exceeds a certain threshold. Thus, the problem we are trying to solve is to maximize the number of spatial tasks assigned to a set of workers while satisfying the confidence levels of those tasks. A unique aspect of our problem is that the optimal assignment of tasks heavily depends on the geographical locations of workers and tasks: every spatial task must be assigned to a sufficient number of workers so that their aggregate reputation satisfies the task's confidence level. Consequently, an exhaustive approach needs to compute the aggregate reputation score (using a typical decision-fusion aggregation mechanism, such as voting) for all possible subsets of the workers, which renders the problem complex (we show it is NP-hard). Subsequently, we propose a number of heuristics and, through extensive experiments on real-world and synthetic data, show that by exploiting our problem's unique characteristics we can achieve close-to-optimal performance at the cost of a greedy approach.
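
To make the voting-based aggregation and the greedy flavor of the heuristics concrete, here is a minimal Python sketch (not the paper's algorithm; the strict-majority fusion rule, the reputation-ordered greedy choice, and all function names are illustrative assumptions): it computes the probability that a strict majority of the selected workers answers correctly and keeps adding the highest-reputation available workers until a task's confidence threshold is met.

```python
def majority_confidence(reps):
    """Probability that a strict majority of independent workers answers
    correctly, given each worker's reputation (probability of answering
    correctly). Uses a dynamic program over the number of correct answers
    (a Poisson-binomial distribution)."""
    dist = [1.0]  # dist[k] = P(exactly k correct answers among workers seen so far)
    for p in reps:
        new = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            new[k] += prob * (1 - p)   # this worker answers incorrectly
            new[k + 1] += prob * p     # this worker answers correctly
        dist = new
    n = len(reps)
    return sum(dist[k] for k in range(n // 2 + 1, n + 1))


def greedy_assign(required_confidence, candidate_reps):
    """Greedily add the highest-reputation available workers until the
    voting-based aggregate confidence reaches the task's threshold.
    Returns the chosen reputations, or None if the threshold cannot be met."""
    chosen = []
    for p in sorted(candidate_reps, reverse=True):
        chosen.append(p)
        if majority_confidence(chosen) >= required_confidence:
            return chosen
    return None


if __name__ == "__main__":
    # A hypothetical task requiring 0.85 confidence, with five nearby candidate workers.
    print(greedy_assign(0.85, [0.6, 0.7, 0.8, 0.85, 0.65]))  # -> [0.85, 0.8, 0.7]
```

In this example the sketch selects the three most reputable candidates, whose majority vote is correct with probability about 0.88; an exhaustive method would instead evaluate this aggregate for every subset of candidates, which is what makes the problem hard.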




Information & Contributors

Information

Published In

SIGSPATIAL'13: Proceedings of the 21st ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems
November 2013
598 pages
ISBN: 9781450325219
DOI: 10.1145/2525314
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 05 November 2013

Permissions

Request permissions for this article.


Author Tags

  1. spatial crowdsourcing
  2. trustworthy query answering

Qualifiers

  • Research-article

Conference

SIGSPATIAL'13

Acceptance Rates

Overall Acceptance Rate: 257 of 1,238 submissions (21%)


Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months): 15
  • Downloads (Last 6 weeks): 0
Reflects downloads up to 05 Jan 2025


Cited By

  • (2024) Longer Pick-Up for Less Pay: Towards Discount-Based Mobility Services. IEEE Transactions on Knowledge and Data Engineering, 36(8): 3992-4006. DOI: 10.1109/TKDE.2024.3362893. Online publication date: Aug 2024.
  • (2024) Trajectory-Aware Task Coalition Assignment in Spatial Crowdsourcing. IEEE Transactions on Knowledge and Data Engineering, 36(11): 7201-7216. DOI: 10.1109/TKDE.2023.3336642. Online publication date: Nov 2024.
  • (2024) Task Assignment With Efficient Federated Preference Learning in Spatial Crowdsourcing. IEEE Transactions on Knowledge and Data Engineering, 36(4): 1800-1814. DOI: 10.1109/TKDE.2023.3311816. Online publication date: Apr 2024.
  • (2024) TrendSharing: A Framework to Discover and Follow the Trends for Shared Mobility Services. 2024 IEEE 40th International Conference on Data Engineering (ICDE), pages 4370-4382. DOI: 10.1109/ICDE60146.2024.00333. Online publication date: 13 May 2024.
  • (2023) Preference-Aware Group Task Assignment in Spatial Crowdsourcing: Effectiveness and Efficiency. IEEE Transactions on Knowledge and Data Engineering, 35(10): 10722-10734. DOI: 10.1109/TKDE.2023.3266735. Online publication date: 1 Oct 2023.
  • (2023) Worker-Churn-Based Task Assignment With Context-LSTM in Spatial Crowdsourcing. IEEE Transactions on Knowledge and Data Engineering, 35(9): 9783-9796. DOI: 10.1109/TKDE.2023.3249828. Online publication date: 1 Sep 2023.
  • (2023) A Trustworthiness-Aware Spatial Task Allocation using a Fuzzy-based Trust and Reputation System Approach. Expert Systems with Applications, 211: 118592. DOI: 10.1016/j.eswa.2022.118592. Online publication date: Jan 2023.
  • (2023) Assuring quality and waiting time in real-time spatial crowdsourcing. Decision Support Systems, 164(C). DOI: 10.1016/j.dss.2022.113869. Online publication date: 1 Jan 2023.
  • (2023) Robust and High-Accessibility Ranking Method for Crowdsourcing-Based Decision Making. Group Decision and Negotiation, 32(5): 1211-1236. DOI: 10.1007/s10726-023-09840-2. Online publication date: 27 Jun 2023.
  • (2023) Coalition-based task assignment with priority-aware fairness in spatial crowdsourcing. The VLDB Journal, 33(1): 163-184. DOI: 10.1007/s00778-023-00802-3. Online publication date: 7 Jul 2023.
