CiteSeerX
Results 11 - 20 of 590

Get Another Label? Improving Data Quality and Data Mining Using Multiple, Noisy Labelers

by Victor S. Sheng, Foster Provost, Panagiotis G. Ipeirotis
"... This paper addresses the repeated acquisition of labels for data items when the labeling is imperfect. We examine the improvement (or lack thereof) in data quality via repeated labeling, and focus especially on the improvement of training labels for supervised induction. With the outsourcing of small tasks becoming easier, for example via Rent-A-Coder or Amazon’s Mechanical Turk, it often is possible to obtain less-than-expert labeling at low cost. With low-cost labeling, preparing the unlabeled part of the data can become considerably more expensive than labeling. We present repeated ..."
Cited by 252 (12 self)
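The repeated-labeling idea in this entry (ask several noisy labelers to annotate the same item, then aggregate their answers) can be sketched as a simple majority vote. This is a hypothetical illustration of the general technique, not the paper's exact estimator:

```python
from collections import Counter

def majority_label(labels):
    """Aggregate repeated noisy labels for one item by majority vote."""
    winner, _count = Counter(labels).most_common(1)[0]
    return winner

# Five low-cost labelers annotate the same item; two make a mistake.
votes = ["pos", "neg", "pos", "pos", "neg"]
print(majority_label(votes))  # pos
```

With odd numbers of labelers and independent errors, the majority label is more reliable than any single noisy label, which is why repeated labeling can pay off despite its extra cost.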

Conducting behavioral research on Amazon’s Mechanical Turk. Behav Res Methods 2012;44(1):1–23

by Winter Mason, Siddharth Suri
"... Amazon’s Mechanical Turk is an online labor market where requesters post jobs and workers choose which jobs to do for pay. The central purpose of this paper is to demonstrate how to use this website for conducting behavioral research and lower the barrier to entry for researchers who could benefit ..."
Cited by 136 (6 self)

Crowdsourcing graphical perception: using mechanical turk to assess visualization design. In:

by Jeffrey Heer, Michael Bostock - Proceedings of the 28th International Conference on Human Factors in Computing Systems, ACM, 2010
"... Understanding perception is critical to effective visualization design. With its low cost and scalability, crowdsourcing presents an attractive option for evaluating the large design space of visualizations; however, it first requires validation. In this paper, we assess the viability of Amazon's Mechanical Turk as a platform for graphical perception experiments. We replicate previous studies of spatial encoding and luminance contrast and compare our results. We also conduct new experiments on rectangular area perception (as in treemaps or cartograms) and on chart size and gridline ..."
Cited by 154 (8 self)

The Case of Amazon Mechanical Turk

by Lilly Irani
"... In 2006, Jeff Bezos announced a new labor service masquerading as ..."

Who are the crowdworkers?: shifting demographics in Mechanical Turk

by Joel Ross, Lilly Irani, M. Six Silberman, Andrew Zaldivar, Bill Tomlinson - In Proceedings of CHI 2010, Atlanta, GA: ACM, 2010
"... Amazon Mechanical Turk (MTurk) is a crowdsourcing system in which tasks are distributed to a population of thousands of anonymous workers for completion. This system is increasingly popular with researchers and developers. Here we extend previous studies of the demographics and usage behaviors of MT ..."
Cited by 127 (3 self)

Using the amazon mechanical turk for transcription of spoken language

by Matthew R. Marge, Satanjeev Banerjee, Alexander I. Rudnicky - In: Proc. ICASSP, 2010
"... We investigate whether Amazon’s Mechanical Turk (MTurk) service can be used as a reliable method for transcription of spoken language data. Utterances with varying speaker demographics (native and non-native English, male and female) were posted on the MTurk marketplace together with standard transc ..."
Cited by 55 (2 self)

Collecting image annotations using amazon’s mechanical turk

by Cyrus Rashtchian, Peter Young, Micah Hodosh, Julia Hockenmaier - In CSLDAMT , 2010
"... Crowd-sourcing approaches such as Amazon’s Mechanical Turk (MTurk) make it possible to annotate or collect large amounts of linguistic data at a relatively low cost and high speed. However, MTurk offers only limited control over who is allowed to participate in a particular task. This is particularly ..."
Cited by 80 (3 self)

Creating Speech and Language Data With Amazon’s Mechanical Turk

by Chris Callison-Burch, Mark Dredze
"... In this paper we give an introduction to using Amazon’s Mechanical Turk crowdsourcing platform for the purpose of collecting data for human language technologies. We survey the papers published in the NAACL-2010 Workshop. 24 researchers participated in the workshop’s shared task to create data for s ..."
Cited by 74 (3 self)

Active Learning with Amazon Mechanical Turk

by Florian Laws, Christian Scheible, Hinrich Schütze
"... Supervised classification needs large amounts of annotated training data that is expensive to create. Two approaches that reduce the cost of annotation are active learning and crowdsourcing. However, these two approaches have not been combined successfully to date. We evaluate the utility of active learning in crowdsourcing on two tasks, named entity recognition and sentiment detection, and show that active learning outperforms random selection of annotation examples in a noisy crowdsourcing scenario. ..."
Cited by 10 (0 self)
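The active-learning strategy this entry evaluates (let the current model choose which unlabeled examples the crowd annotates next, rather than sampling at random) is commonly realized as uncertainty sampling. The toy probability model below is an assumption for illustration only, not the classifiers used in the paper:

```python
def uncertainty(prob):
    """Uncertainty of a binary prediction: probabilities near 0.5 score highest."""
    return 1.0 - abs(prob - 0.5) * 2.0

def select_query(pool, predict_proba):
    """Pick the unlabeled example the current model is least sure about."""
    return max(pool, key=lambda x: uncertainty(predict_proba(x)))

# Toy model: probability of the positive class grows with the feature value.
predict = lambda x: min(max(x / 10.0, 0.0), 1.0)
pool = [1.0, 4.8, 9.0]
print(select_query(pool, predict))  # 4.8, whose predicted probability 0.48 is nearest 0.5
```

In a crowdsourcing setting the selected example would then be posted as a task, its (possibly noisy) labels collected, and the model retrained before the next query round.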

Financial incentives and the “performance of crowds”

by Winter Mason, Duncan J. Watts - Proc. HCOMP ’09
"... The relationship between financial incentives and performance, long of interest to social scientists, has gained new relevance with the advent of web-based “crowd-sourcing” models of production. Here we investigate the effect of compensation on performance in the context of two experiments, conducted on Amazon’s Mechanical Turk (AMT). We find that increased financial incentives increase the quantity, but not the quality, of work performed by participants, where the difference appears to be due to an “anchoring” effect: workers who were paid more also perceived the value of their work ..."
Cited by 192 (3 self)

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University