Results 1 - 10 of 289

Crowdsourcing user studies with Mechanical Turk

by Aniket Kittur, Ed H. Chi, Bongwon Suh - Proc. CHI 2008, ACM Press, 2008
"... User studies are important for many aspects of the design process and involve techniques ranging from informal surveys to rigorous laboratory studies. However, the costs involved in engaging users often requires practitioners to trade off between sample size, time requirements, and monetary costs. M ..."
Abstract - Cited by 461 (8 self) - Add to MetaCart
. Micro-task markets, such as Amazon’s Mechanical Turk, offer a potential paradigm for engaging a large number of users for low time and monetary costs. Here we investigate the utility of a micro-task market for collecting user measurements, and discuss design considerations for developing remote micro

Crowdsourcing graphical perception: using Mechanical Turk to assess visualization design

by Jeffrey Heer, Michael Bostock - Proceedings of the 28th International Conference on Human Factors in Computing Systems, ACM, 2010
"... ABSTRACT Understanding perception is critical to effective visualization design. With its low cost and scalability, crowdsourcing presents an attractive option for evaluating the large design space of visualizations; however, it first requires validation. In this paper, we assess the viability of A ..."
Abstract - Cited by 154 (8 self) - Add to MetaCart
of Amazon's Mechanical Turk as a platform for graphical perception experiments. We replicate previous studies of spatial encoding and luminance contrast and compare our results. We also conduct new experiments on rectangular area perception (as in treemaps or cartograms) and on chart size and gridline

Crowdsourcing document relevance assessment with Mechanical Turk

by Catherine Grady, Matthew Lease - Proc. NAACL HLT Workshop on Creating Speech and Language Data with Amazon’s Mechanical Turk, 2010
"... We investigate human factors involved in designing effective Human Intelligence Tasks (HITs) for Amazon’s Mechanical Turk1. In particular, we assess document relevance to search queries via MTurk in order to evaluate search engine accuracy. Our study varies four human factors and measures resulting ..."
Cited by 35 (3 self)

Quality Management on Amazon Mechanical Turk

by Panagiotis G. Ipeirotis, Foster Provost, Jing Wang
"... Crowdsourcing services, such as Amazon Mechanical Turk, allow for easy distribution of small tasks to a large number of workers. Unfortunately, since manually verifying the quality of the submitted results is hard, malicious workers often take advantage of the verification difficulty and submit answ ..."
Cited by 177 (8 self)
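
The quality problem described here is usually attacked with redundancy: collect several independent answers per task and aggregate them. As a toy illustration of that baseline (not the paper's own quality-management algorithm), a minimal majority-vote aggregator in Python might look like the following; the input format is hypothetical.

    from collections import Counter

    def majority_vote(labels_by_task):
        # labels_by_task: dict mapping task_id -> list of labels from
        # independent workers (hypothetical input format). Returns the
        # plurality label per task; redundancy limits the influence of
        # any single low-quality or malicious worker.
        return {task: Counter(labels).most_common(1)[0][0]
                for task, labels in labels_by_task.items()}

    # Three workers label two tasks; the single noisy answer is outvoted.
    print(majority_vote({"t1": ["spam", "spam", "ham"],
                         "t2": ["ham", "ham", "ham"]}))
    # -> {'t1': 'spam', 't2': 'ham'}

Majority voting breaks down when worker errors are systematic rather than random, which is exactly the gap more sophisticated quality-management schemes aim to close.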

Cheap and Fast — But is it Good? Evaluating Non-Expert Annotations for Natural Language Tasks

by Rion Snow, Daniel Jurafsky, Andrew Y. Ng
"... Human linguistic annotation is crucial for many natural language processing tasks but can be expensive and time-consuming. We explore the use of Amazon’s Mechanical Turk system, a significantly cheaper and faster method for collecting annotations from a broad base of paid non-expert contributors ove ..."
Cited by 247 (4 self)

CrowdDB: Answering queries with crowdsourcing

by M. J. Franklin, D. Kossmann, T. Kraska, S. Ramesh, R. S. Xin; J. Wang, T. Kraska, M. J. Franklin, J. Feng - In SIGMOD, 2011
"... ABSTRACT Many data management problems are inherently vague and hard for algorithms to process. Take for example entity resolution, also known as record linkage, the process to resolve records for the same entity from heterogeneous sources. Properly resolving such records require not only the synta ..."
Cited by 145 (11 self)
functions, and for matching, ranking, or aggregating results based on fuzzy criteria. The rise of microtask crowdsourcing platforms, e.g. Amazon's Mechanical Turk, provides a unique opportunity to integrate human inputs into the algorithmic data flow. There are two recent works, CrowdDB [1] and Crowd ...
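
To make "integrating human inputs into the algorithmic data flow" concrete, a hybrid resolver can decide clear-cut pairs algorithmically and route only the ambiguous middle band to workers. The sketch below illustrates that general pattern in Python; it is not CrowdDB's actual API, and ask_crowd is a hypothetical stand-in for posting a HIT.

    import difflib

    def same_entity(a, b, ask_crowd, hi=0.9, lo=0.5):
        # ask_crowd: callable (a, b) -> bool, standing in for a HIT
        # that asks workers whether two records name the same entity.
        sim = difflib.SequenceMatcher(None, a, b).ratio()
        if sim >= hi:            # clearly the same record
            return True
        if sim <= lo:            # clearly different
            return False
        return ask_crowd(a, b)   # ambiguous: spend human effort here

    # Example with a trivial stand-in "crowd" that always says yes:
    print(same_entity("Apple Inc.", "Apple Incorporated",
                      ask_crowd=lambda a, b: True))

The thresholds control cost: widening the (lo, hi) band buys accuracy on hard pairs at the price of more paid human judgments.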

Online Task Assignment in Crowdsourcing Markets

by Chien-ju Ho, Jennifer Wortman Vaughan
"... We explore the problem of assigning heterogeneous tasks to workers with different, unknown skill sets in crowdsourcing markets such as Amazon Mechanical Turk. We first formalize the online task assignment problem, in which a requester has a fixed set of tasks and a budget that specifies how many tim ..."
Cited by 33 (3 self)
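
A toy version of this online setting can be simulated with a greedy heuristic (for illustration only; this is not the algorithm the paper analyzes): workers arrive one at a time, the requester spends a fixed budget, and per-worker skill is estimated from observed outcomes, with untried tasks treated optimistically. All names and the skill table are hypothetical.

    import random
    from collections import defaultdict

    def online_assign(arrivals, tasks, budget, true_skill,
                      rng=random.Random(0)):
        # arrivals:   worker ids in arrival order
        # tasks:      list of task ids
        # budget:     total assignments the requester will pay for
        # true_skill: dict (worker, task) -> P(useful answer), used
        #             only to simulate outcomes
        succ, att = defaultdict(int), defaultdict(int)
        log = []
        for worker in arrivals:
            if budget == 0:
                break
            # Greedy choice: best empirical success rate for this
            # worker; untried tasks score 1.0 (optimism).
            task = max(tasks, key=lambda t: succ[worker, t] / att[worker, t]
                                            if att[worker, t] else 1.0)
            ok = rng.random() < true_skill[worker, task]
            att[worker, task] += 1
            succ[worker, task] += ok
            log.append((worker, task, ok))
            budget -= 1
        return log

    # "w1" is good at task "a", "w2" at task "b"; skills only drive
    # the simulated outcomes, the assigner never sees them directly.
    skill = {("w1", "a"): 0.9, ("w1", "b"): 0.2,
             ("w2", "a"): 0.3, ("w2", "b"): 0.8}
    print(online_assign(["w1", "w2"] * 5, ["a", "b"],
                        budget=6, true_skill=skill))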

Evaluating Amazon’s Mechanical Turk as a Tool for Experimental Behavioral Research

by Matthew J. C. Crump, John V. McDonnell, Todd M. Gureckis, 2013
"... Amazon Mechanical Turk (AMT) is an online crowdsourcing service where anonymous online workers complete web-based tasks for small sums of money. The service has attracted attention from experimental psychologists interested in gathering human subject data more efficiently. However, relative to tradi ..."
Cited by 48 (1 self)

CrowdForge: Crowdsourcing complex work

by Aniket Kittur, Boris Smus, Susheel Khamkar, Robert E. Kraut, 2011
"... ABSTRACT Micro-task markets such as Amazon's Mechanical Turk represent a new paradigm for accomplishing work, in which employers can tap into a large population of workers around the globe to accomplish tasks in a fraction of the time and money of more traditional methods. However, such market ..."
Cited by 101 (5 self)

Who are the crowdworkers?: shifting demographics in Mechanical Turk

by Joel Ross, Lilly Irani, M. Six Silberman, Andrew Zaldivar, Bill Tomlinson - In Proceedings of CHI 2010, Atlanta, GA, ACM, 2010
"... Amazon Mechanical Turk (MTurk) is a crowdsourcing system in which tasks are distributed to a population of thousands of anonymous workers for completion. This system is increasingly popular with researchers and developers. Here we extend previous studies of the demographics and usage behaviors of MT ..."
Cited by 127 (3 self)