
Amazon Mechanical Turk

Micro-work service subsidiary of Amazon

Amazon Mechanical Turk (MTurk) is a crowdsourcing website through which businesses can hire remotely located "crowdworkers" to perform discrete on-demand tasks that computers are currently unable to do as economically. It is owned by Amazon and operated under Amazon Web Services. Employers, known as requesters, post jobs known as Human Intelligence Tasks (HITs), such as identifying specific content in an image or video, writing product descriptions, or answering survey questions. Workers, colloquially known as Turkers or crowdworkers, browse among existing jobs and complete them in exchange for a fee set by the requester. To place jobs, requesters use an open application programming interface (API), or the more limited MTurk Requester site. As of April 2019, requesters could register from 49 approved countries.

History

The service was conceived by Venky Harinarayan in a U.S. patent disclosure in 2001. Amazon coined the term artificial artificial intelligence for processes that outsource parts of a computer program to humans, for tasks that humans carry out much faster than computers. Jeff Bezos is said to have proposed the development of Amazon's Mechanical Turk to realize this process.

The name Mechanical Turk was inspired by "The Turk", an 18th-century chess-playing automaton made by Wolfgang von Kempelen that toured Europe, and beat both Napoleon Bonaparte and Benjamin Franklin. It was later revealed that this "machine" was not an automaton, but a human chess master hidden in the cabinet beneath the board and controlling the movements of a humanoid dummy. Analogously, the Mechanical Turk online service uses remote human labor hidden behind a computer interface to help employers perform tasks that are not possible using a true machine.

MTurk launched publicly on November 2, 2005. Its user base grew quickly. In early- to mid-November 2005, there were tens of thousands of jobs, all uploaded to the system by Amazon itself for some of its internal tasks that required human intelligence. HIT types expanded to include transcribing, rating, image tagging, surveys, and writing.

In March 2007, there were reportedly more than 100,000 workers in over 100 countries. This increased to over 500,000 registered workers from over 190 countries in January 2011. That year, Techlist published an interactive map pinpointing the locations of 50,000 MTurk workers around the world. By 2018, research demonstrated that while over 100,000 workers were available on the platform at any time, only around 2,000 were actively working.

Overview

A user of Mechanical Turk can be either a "Worker" (contractor) or a "Requester" (employer). Workers have access to a dashboard that displays three sections: total earnings, HIT status, and HIT totals. Workers set their own hours and are not under any obligation to accept any particular task.


Amazon classifies Workers as contractors rather than employees and does not pay payroll taxes. Classifying Workers as contractors allows Amazon to avoid providing minimum wage, overtime pay, and workers' compensation; this is a common practice among "gig economy" platforms. Workers are legally required to report their income as self-employment income.

In 2013, the average wage for the multiple microtasks assigned, if performed quickly, was about one dollar an hour, with each task averaging a few cents. However, calculating average hourly earnings on a microtask site is difficult. Several data sources show average hourly earnings in the $5–$9 range among a substantial number of Workers, while the most experienced, active, and proficient workers may earn over $20 per hour.

Workers can have a postal address anywhere in the world. Payment for completed tasks can be redeemed on Amazon.com via gift certificate (for workers outside the United States and India, gift certificates are the only payment option) or can be transferred to a Worker's U.S. bank account.

Requesters can ask that Workers fulfill qualifications before engaging in a task, and they can establish a test designed to verify the qualification. They can also accept or reject the result sent by the Worker, which affects the Worker's reputation. As of April 2019, Requesters paid Amazon a minimum 20% commission on the price of successfully completed jobs, with increased amounts for additional services. Requesters can use the Amazon Mechanical Turk API to programmatically integrate the results of the work directly into their business processes and systems. When employers set up a job, they must specify

  • how much they are paying for each completed HIT,
  • how many workers they want to work on each HIT,
  • the maximum time a worker may spend on a single task,
  • the deadline for completing the overall work,

as well as the specific details of the job they want completed.
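The required settings above map directly onto the parameters of the MTurk CreateHIT operation. A minimal sketch in Python, with illustrative values (the title, description, and numbers here are made up for the example):

```python
# Illustrative parameters for setting up a HIT. Field names follow the
# MTurk CreateHIT operation; the values are hypothetical.
hit_params = {
    "Title": "Categorize product images",
    "Description": "Choose the best category for each image.",
    "Reward": "0.05",                    # payment per completed HIT, in USD
    "MaxAssignments": 3,                 # how many workers may work on each HIT
    "AssignmentDurationInSeconds": 600,  # max time a worker may spend on one task
    "LifetimeInSeconds": 86400,          # how long the HIT remains available
}

def validate(params):
    """Basic sanity checks a requester might run before submitting a HIT."""
    required = {
        "Title", "Description", "Reward",
        "MaxAssignments", "AssignmentDurationInSeconds", "LifetimeInSeconds",
    }
    missing = required - params.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    # Amazon's minimum payment for a task is one cent.
    if float(params["Reward"]) < 0.01:
        raise ValueError("reward is below Amazon's one-cent minimum")
    return True
```

The `validate` helper is not part of any Amazon SDK; it simply illustrates how the four required settings, plus title and description, come together in one request.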

Location of Turkers

Workers have been primarily located in the United States since the platform's inception, with demographics generally similar to those of the overall Internet population in the U.S. Within the U.S., workers are fairly evenly spread across states, proportional to each state's share of the U.S. population. As of 2019, between 15,000 and 30,000 people in the U.S. complete at least one HIT each month, and about 4,500 new people join MTurk each month.

Cash payments for Indian workers were introduced in 2010, shifting the platform's demographics, though workers remained primarily located in the United States. A website tracking worker demographics showed that in May 2015, 80% of workers were located in the United States, with the remaining 20% located elsewhere, most of them in India. In May 2019, approximately 60% were in the U.S. and 40% elsewhere (approximately 30% in India). In early 2023, about 90% of workers were from the U.S., and about half of the remainder were from India.

Uses

Human-subject research

Since 2010, numerous researchers have explored the viability of Mechanical Turk to recruit subjects for social science experiments. Researchers have generally found that while samples of respondents obtained through Mechanical Turk do not perfectly match all relevant characteristics of the U.S. population, they are also not wildly misrepresentative. As a result, thousands of papers relying on data collected from Mechanical Turk workers are published each year, including hundreds in top-ranked academic journals.

A challenge with using MTurk for human-subject research has been maintaining data quality. A study published in 2021 found that the quality-control approaches researchers use (such as screening for bots, VPN users, or workers willing to submit dishonest responses) can meaningfully influence survey results; the authors demonstrated this through effects on three common behavioral and mental health screening tools. Although managing data quality requires effort from researchers, a large body of research shows how to gather high-quality data from MTurk. Because the cost of using MTurk is considerably lower than that of many other survey methods, many researchers continue to use it.
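Screening approaches like those described above are usually applied before analysis. A minimal sketch of such screening, assuming a hypothetical record format in which the survey platform supplies an attention-check answer, a completion time, and a VPN flag for each response:

```python
def screen_responses(responses, min_seconds=60):
    """Drop survey rows that fail basic quality checks.

    Hypothetical record format: each response is a dict with
    'attention_check' (expected answer 'agree'), 'duration_s'
    (seconds spent on the survey), and 'is_vpn' (suspected VPN use).
    """
    kept = []
    for r in responses:
        if r.get("is_vpn"):
            continue  # screen out suspected VPN users
        if r.get("attention_check") != "agree":
            continue  # failed the embedded attention check
        if r.get("duration_s", 0) < min_seconds:
            continue  # implausibly fast completion suggests careless responding
        kept.append(r)
    return kept
```

The field names and thresholds are assumptions for illustration; real studies tune these checks to the instrument, and (as the 2021 study found) the choice of checks itself can shift results.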

The general consensus among researchers is that the service works best for recruiting a diverse sample; it is less successful with studies that require more precisely defined populations or that require a representative sample of the population as a whole. Many papers have been published on the demographics of the MTurk population. MTurk workers tend to be younger, more educated, more liberal, and slightly less wealthy than the U.S. population overall.

Machine learning

Supervised Machine Learning algorithms require large amounts of human-annotated data to be trained successfully. Machine learning researchers have hired Workers through Mechanical Turk to produce datasets such as SQuAD, a question answering dataset.

Missing persons searches

Since 2007, the service has been used to search for prominent missing individuals. This use was first suggested during the search for James Kim, but his body was found before any technical progress was made. In early 2007, computer scientist Jim Gray disappeared on his yacht, and Amazon's Werner Vogels, a personal friend, arranged for DigitalGlobe, which provides satellite data for Google Maps and Google Earth, to put recent photography of the Farallon Islands on Mechanical Turk. A front-page story on Digg attracted 12,000 searchers, who worked with imaging professionals on the same data. The search was unsuccessful.

In September 2007, a similar arrangement was repeated in the search for aviator Steve Fossett. Satellite data was divided into 85-square-metre (910 sq ft) sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might be a crash site or other evidence that should be examined more closely. This search was also unsuccessful. The satellite imagery was mostly within a 50-mile radius, but the crash site was eventually found by hikers about a year later, 65 miles away.

Artistic works

MTurk has also been used as a tool for artistic creation. One of the first artists to work with Mechanical Turk was xtine burrough, with The Mechanical Olympics (2008), Endless Om (2015), and Mediations on Digital Labor (2015). Another work was artist Aaron Koblin's Ten Thousand Cents (2008).

Third-party programming

Programmers have developed browser extensions and scripts designed to simplify the process of completing jobs. Amazon has stated that it disapproves of scripts that completely automate the process and remove the human element, out of concern that the task-completion process (e.g., answering a survey) could be gamed with random responses, rendering the collected data worthless. Accounts using such automated bots have been banned. There are also third-party services that extend the capabilities of MTurk.

API

Amazon makes available an application programming interface (API) for the MTurk system. The MTurk API lets a programmer submit jobs, retrieve completed work, and approve or reject that work. In 2017, Amazon added support for the AWS Software Development Kits (SDKs), making nine new SDKs available to MTurk users. MTurk is accessible via API from Python, JavaScript, Java, .NET, Go, Ruby, PHP, and C++. Websites and web services can use the API to integrate MTurk work into other web applications, providing users with alternatives to the interface Amazon has built for these functions.
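Using the AWS SDK for Python (boto3), the submit/retrieve/approve cycle looks roughly like the sketch below. The survey URL is hypothetical, and the workflow function is not meant to run without AWS credentials; the sandbox endpoint lets requesters test HITs without paying real workers.

```python
# MTurk's sandbox endpoint, for testing HITs without real payments.
SANDBOX_ENDPOINT = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

def external_question_xml(url, frame_height=600):
    """Wrap a survey URL in MTurk's ExternalQuestion XML envelope."""
    return (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        f"<ExternalURL>{url}</ExternalURL>"
        f"<FrameHeight>{frame_height}</FrameHeight>"
        "</ExternalQuestion>"
    )

def run_requester_workflow():
    """Submit a HIT, list its results, and approve them (sketch only;
    requires boto3 and AWS credentials to actually execute)."""
    import boto3  # AWS SDK for Python; imported lazily so the sketch loads without it
    client = boto3.client("mturk", endpoint_url=SANDBOX_ENDPOINT)
    hit = client.create_hit(
        Title="Short survey",
        Description="Answer a few questions.",
        Reward="0.50",
        MaxAssignments=1,
        AssignmentDurationInSeconds=1800,
        LifetimeInSeconds=86400,
        Question=external_question_xml("https://example.com/survey"),
    )
    hit_id = hit["HIT"]["HITId"]
    # Retrieve submitted work and approve it, which releases payment.
    result = client.list_assignments_for_hit(HITId=hit_id)
    for assignment in result["Assignments"]:
        client.approve_assignment(AssignmentId=assignment["AssignmentId"])
```

Approving an assignment is the step that pays the worker; rejecting it instead (via `reject_assignment`) withholds payment and affects the worker's reputation, as described above.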

Use case examples

Processing photos / videos

Amazon Mechanical Turk provides a platform for processing images, a task well-suited to human intelligence. Requesters have created tasks that ask workers to label objects found in an image, select the most relevant picture in a group of pictures, screen inappropriate content, classify objects in satellite images, or digitize text from images such as scanned forms filled out by hand.

Data cleaning / verification

Companies with large online catalogues use Mechanical Turk to identify duplicates and verify details of item entries. For example: removing duplicates in yellow pages directory listings, checking restaurant details (e.g. phone number and hours), and finding contact information from web pages (e.g. author name and email).

Information collection

The diversity and scale of Mechanical Turk's workforce allow information to be collected at a scale that would be difficult to achieve outside a crowd platform. Mechanical Turk allows Requesters to amass a large number of responses to various types of surveys, from basic demographics to academic research. Other uses include writing comments, descriptions, and blog entries for websites, and searching for data elements or specific fields in large government and legal documents.

Data processing

Companies use Mechanical Turk's crowd labor to understand and respond to different types of data. Common uses include editing and transcription of podcasts, translation, and matching search engine results.

Research validity

The validity of research conducted with the Mechanical Turk worker pool has long been debated among experts. This is largely because questions of validity are complex: they involve not only questions of whether the research methods were appropriate and whether the study was well-executed, but also questions about the goal of the project, how the researchers used MTurk, who was sampled, and what conclusions were drawn.

Most experts agree that MTurk is better suited for some types of research than others. MTurk appears well-suited for questions that seek to understand whether two or more things are related to each other (called correlational research; e.g., are happy people more healthy?) and questions that attempt to show one thing causes another thing (experimental research; e.g., being happy makes people more healthy). Fortunately, these categories capture most of the research conducted by behavioral scientists, and most correlational and experimental findings found in nationally representative samples replicate on MTurk.

The type of research that is not well-suited for MTurk is often called "descriptive research." Descriptive research seeks to describe how or what people think, feel, or do; one example is public opinion polling. MTurk is not well-suited to such research because it does not select a representative sample of the general population. Instead, MTurk is a nonprobability, convenience sample. Descriptive research is best conducted with a probability-based, representative sample of the population researchers want to understand. When compared to the general population, people on MTurk are younger, more highly educated, more liberal, and less religious.

Labor issues


Mechanical Turk has been criticized by journalists and activists for its interactions with and use of labor. Computer scientist Jaron Lanier noted how the design of Mechanical Turk "allows you to think of the people as software components" in a way that conjures "a sense of magic, as if you can just pluck results out of the cloud at an incredibly low cost". A similar point is made in the book Ghost Work by Mary L. Gray and Siddharth Suri.

Critics of MTurk argue that workers are forced onto the site by precarious economic conditions and then exploited by requesters with low wages and a lack of power when disputes occur. Journalist Alana Semuels’s article "The Internet Is Enabling a New Kind of Poorly Paid Hell" in The Atlantic is typical of such criticisms of MTurk.

Some academic papers have obtained findings that support or serve as the basis for such common criticisms, but others contradict them. A recent academic commentary argued that study participants on sites like MTurk should be clearly warned about the circumstances in which they might later be denied payment as a matter of ethics, even though such statements may not reduce the rate of careless responding.

A paper published by a team at CloudResearch shows that only about 7% of people on MTurk view completing HITs as something akin to a full-time job. Most people report that MTurk is a way to earn money during their leisure time or as a side gig. In 2019, the typical worker spent five to eight hours per week and earned around $7 per hour. The sampled workers did not report rampant mistreatment at the hands of requesters; they reported trusting requesters more than employers outside of MTurk. Similar findings were presented in a review of MTurk by the Fair Crowd Work organization, a collective of crowd workers and unions.

Monetary compensation

The minimum payment that Amazon allows for a task is one cent. Because tasks are typically simple and repetitive, most pay only a few cents, though there are also well-paying tasks on the site.

Many criticisms of MTurk stem from the fact that a majority of tasks offer low wages. In addition, workers are considered independent contractors rather than employees, and independent contractors are not protected by the Fair Labor Standards Act or other legislation protecting workers’ rights. Workers on MTurk must also compete with one another for good HIT opportunities and spend uncompensated time searching for tasks and performing other unpaid actions.

The low payment offered for many tasks has fueled criticism of Mechanical Turk for exploiting and not compensating workers for the true value of the task they complete. One study of 3.8 million tasks completed by 2,767 workers showed that "workers earned a median hourly wage of about $2 an hour" with 4% of workers earning more than $7.25 per hour.

The Pew Research Center and the International Labour Office published data indicating people made around $5.00 per hour in 2015. A study focused on workers in the U.S. indicated average wages of at least $5.70 an hour, and data from the CloudResearch study found average wages of about $6.61 per hour. Some evidence suggests that very active and experienced people can earn $20 per hour or more.

Fraud

The Nation magazine reported in 2014 that some Requesters had taken advantage of Workers by having them do the tasks, then rejecting their submissions in order to avoid paying them. Available data indicates that rejections are fairly rare. Workers report having a small minority of their HITs rejected, perhaps as low as 1%.

In the Facebook–Cambridge Analytica data scandal, Mechanical Turk was one of the means of covertly gathering private information for a massive database. The system paid people a dollar or two to install a Facebook-connected app and answer personal questions. Despite appearances, the survey task was not part of a demographic or psychological research project; its purpose was to bait workers into revealing personal information about their identities that Facebook and Mechanical Turk had not already collected.

Labor relations

Others have criticized the marketplace for not allowing workers to negotiate with employers. In response to criticisms of payment evasion and lack of representation, a group developed a third-party platform called Turkopticon, which allows workers to give feedback on their employers, helping them avoid potentially unscrupulous jobs and recommend superior employers. Another platform, Dynamo, allows workers to gather anonymously and organize campaigns to improve their work environment, such as the Guidelines for Academic Requesters and the Dear Jeff Bezos Campaign. Amazon made it harder for workers to enroll in Dynamo by closing the requester account that provided workers with a code required for Dynamo membership. Workers have also created third-party plugins to identify higher-paying tasks, but Amazon updated its website to prevent these plugins from working. Workers have complained that Amazon's payment system occasionally stops working.

Related systems

Further information: Crowdsourcing

Mechanical Turk is comparable in some respects to the now-discontinued Google Answers service. However, Mechanical Turk is a more general marketplace that can potentially distribute any kind of work task around the world. The Collaborative Human Interpreter (CHI) by Philipp Lenssen also proposed using distributed human intelligence to help computer programs perform tasks that computers cannot do well; MTurk could serve as the execution engine for the CHI.

In 2014, the Russian search company Yandex launched Toloka, a similar crowdsourcing platform.


