Amazon Mechanical Turk Explained

Amazon Mechanical Turk (MTurk) is a crowdsourcing website through which businesses can hire remotely located "crowdworkers" to perform discrete on-demand tasks that computers are currently unable to do as economically. It is operated under Amazon Web Services and is owned by Amazon.[1] Employers, known as requesters, post jobs known as Human Intelligence Tasks (HITs), such as identifying specific content in an image or video, writing product descriptions, or answering survey questions. Workers, colloquially known as Turkers or crowdworkers, browse among existing jobs and complete them in exchange for a fee set by the requester. To place jobs, requesters use an open application programming interface (API) or the more limited MTurk Requester site.[2] Requesters could register from 49 approved countries.[3]

History

The service was conceived by Venky Harinarayan in a U.S. patent disclosure in 2001.[4] Amazon coined the term artificial artificial intelligence for processes that outsource some parts of a computer program to humans, covering tasks that humans carry out much faster than computers. It is claimed that Jeff Bezos was responsible for proposing the development of Amazon's Mechanical Turk to realize this process.[5]

The name Mechanical Turk was inspired by "The Turk", an 18th-century chess-playing automaton made by Wolfgang von Kempelen that toured Europe, and beat both Napoleon Bonaparte and Benjamin Franklin. It was later revealed that this "machine" was not an automaton, but a human chess master hidden in the cabinet beneath the board and controlling the movements of a humanoid dummy. Analogously, the Mechanical Turk online service uses remote human labor hidden behind a computer interface to help employers perform tasks that are not possible using a true machine.

MTurk launched publicly on November 2, 2005. Its user base grew quickly. In early- to mid-November 2005, there were tens of thousands of jobs, all uploaded to the system by Amazon itself for some of its internal tasks that required human intelligence. HIT types expanded to include transcribing, rating, image tagging, surveys, and writing.

In March 2007, there were reportedly more than 100,000 workers in over 100 countries. This grew to over 500,000 registered workers from over 190 countries by January 2011.[6] That year, Techlist published an interactive map pinpointing the locations of 50,000 MTurk workers around the world.[7] By 2018, research demonstrated that while over 100,000 workers were available on the platform at any given time, only around 2,000 were actively working.[8]

Overview

A user of Mechanical Turk can be either a "Worker" (contractor) or a "Requester" (employer). Workers have access to a dashboard that displays three sections: total earnings, HIT status, and HIT totals. Workers set their own hours and are not under any obligation to accept any particular task.

Amazon classifies Workers as contractors rather than employees and does not pay payroll taxes. Classifying Workers as contractors allows Amazon to avoid obligations such as minimum wage, overtime, and workers' compensation, a common practice among "gig economy" platforms. Workers are legally required to report their income as self-employment income.

In 2013, the average wage for the multiple microtasks assigned, if performed quickly, was about one dollar per hour, with each task averaging a few cents.[9] However, calculating average hourly earnings on a microtask site is difficult, and several sources of data show average hourly earnings in the range of $5–$9 per hour among a substantial number of Workers,[10][11][12][13] while the most experienced, active, and proficient workers may earn over $20 per hour.[14]

Workers can have a postal address anywhere in the world. Payment for completing tasks can be redeemed on Amazon.com via gift certificate (the only payment option available to international workers, apart from those in India) or can be transferred to a Worker's U.S. bank account.

Requesters can ask that Workers fulfill qualifications before engaging in a task, and they can establish a test designed to verify those qualifications. They can also accept or reject the results submitted by a Worker, which affects the Worker's reputation. Requesters pay Amazon a commission of at least 20% on the price of successfully completed jobs.[15] Requesters can use the Amazon Mechanical Turk API to programmatically integrate the results of the work directly into their business processes and systems. When employers set up a job, they must specify the details of the work they want completed.
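For illustration, the job's question is supplied to MTurk as an XML document. A minimal ExternalQuestion sketch, which points Workers at a page hosted on the requester's own site, might look like the following (the URL is a placeholder, and the schema namespace shown is the one MTurk has historically used):

```xml
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <!-- Page the Worker sees, served from the requester's own site -->
  <ExternalURL>https://example.com/survey.html</ExternalURL>
  <!-- Height, in pixels, of the frame MTurk embeds the page in -->
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
```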

Location of Turkers

Workers have been primarily located in the United States since the platform's inception,[16] with demographics generally similar to the overall Internet population in the U.S.[17] Within the U.S., workers are fairly evenly spread across states, proportional to each state's share of the U.S. population.[18] Between 15 and 30 thousand people in the U.S. complete at least one HIT each month, and about 4,500 new people join MTurk each month.[19]

Cash payments for Indian workers were introduced in 2010, which shifted worker demographics, though workers remained primarily located in the United States.[20] A website tracking worker demographics showed that in May 2015, 80% of workers were located in the United States, with most of the remaining 20% located in India.[21] In May 2019, approximately 60% were in the U.S. and 40% elsewhere (approximately 30% in India).[22] By early 2023, about 90% of workers were from the U.S., with about half of the remainder from India.[23]

Uses

Human-subject research

Numerous researchers have explored the viability of Mechanical Turk for recruiting subjects for social science experiments. Researchers have generally found that while samples of respondents obtained through Mechanical Turk do not perfectly match all relevant characteristics of the U.S. population, they are also not wildly misrepresentative.[24][25] As a result, thousands of papers that rely on data collected from Mechanical Turk workers are published each year, including hundreds in top-ranked academic journals.

A persistent challenge in using MTurk for human-subject research has been maintaining data quality. A study published in 2021 found that the quality control approaches researchers use (such as checking for bots, VPN users, or workers willing to submit dishonest responses) can meaningfully influence survey results, as demonstrated with three common behavioral and mental health screening tools.[26] Although managing data quality requires effort from researchers, a large body of research shows how to gather high-quality data from MTurk.[27] Because the cost of using MTurk is considerably lower than that of many other survey methods, many researchers continue to use it.

The general consensus among researchers is that the service works best for recruiting a diverse sample; it is less successful with studies that require more precisely defined populations or that require a representative sample of the population as a whole.[28] Many papers have been published on the demographics of the MTurk population.[29][30] MTurk workers tend to be younger, more educated, more liberal, and slightly less wealthy than the U.S. population overall.[31]

Machine learning

Supervised machine learning algorithms require large amounts of human-annotated data to be trained successfully. Machine learning researchers have hired Workers through Mechanical Turk to produce datasets such as SQuAD, a question answering dataset.[32]
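Because each item is typically labeled by several Workers, requesters commonly reconcile disagreements before training a model. A minimal sketch of one standard approach, majority-vote label aggregation (the item IDs and labels here are invented for illustration):

```python
from collections import Counter

def aggregate_labels(annotations):
    """Collapse multiple Workers' labels per item into one label by majority vote.

    annotations: dict mapping item_id -> list of labels from different Workers.
    Returns a dict mapping item_id -> the most common label (ties broken by
    first occurrence, per Counter.most_common ordering).
    """
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in annotations.items()}

# Three Workers label each image; disagreements are resolved by the majority.
raw = {
    "img_001": ["cat", "cat", "dog"],
    "img_002": ["dog", "dog", "dog"],
}
print(aggregate_labels(raw))  # {'img_001': 'cat', 'img_002': 'dog'}
```

More sophisticated schemes weight each Worker's vote by an estimate of their reliability, but a simple majority is a common baseline.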

Missing persons searches

The service has been used to search for prominent missing individuals. This use was first suggested during the search for James Kim, but his body was found before any technical progress was made. Shortly afterward, computer scientist Jim Gray disappeared on his yacht, and Amazon's Werner Vogels, a personal friend, arranged for DigitalGlobe, which provides satellite data for Google Maps and Google Earth, to put recent photography of the Farallon Islands on Mechanical Turk. A front-page story on Digg attracted 12,000 searchers, who worked with imaging professionals on the same data. The search was unsuccessful.[33]

In September 2007, a similar arrangement was repeated in the search for aviator Steve Fossett. Satellite data was divided into 85 m² sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might indicate a crash site or other evidence warranting closer examination.[34] This search was also unsuccessful: the satellite imagery covered mostly a 50-mile radius,[35] but the crash site was eventually found by hikers about a year later, 65 miles away.[36]

Artistic works

MTurk has also been used as a tool for artistic creation. One of the first artists to work with Mechanical Turk was xtine burrough, with The Mechanical Olympics (2008),[37] [38] Endless Om (2015), and Mediations on Digital Labor (2015).[39] Another work was artist Aaron Koblin's Ten Thousand Cents (2008).

Third-party programming

Programmers have developed browser extensions and scripts designed to simplify the process of completing jobs. Amazon has stated that they disapprove of scripts that completely automate the process and preclude the human element. This is because of the concern that the task completion process—e.g. answering a survey—could be gamed with random responses, and the resultant collected data could be worthless.[40] Accounts using so-called automated bots have been banned.

API

Amazon makes available an application programming interface (API) for the MTurk system. The MTurk API lets a programmer submit jobs, retrieve completed work, and approve or reject that work.[41] In 2017, Amazon launched support for AWS Software Development Kits (SDKs), making nine SDKs available to MTurk users: the API is accessible from Python, JavaScript, Java, .NET, Go, Ruby, PHP, and C++.[42] Websites and web services can use the API to integrate MTurk work into other web applications, providing users with alternatives to the interface Amazon has built for these functions.
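As a sketch of what using the API looks like in practice (not Amazon's documented example: the title, reward, and question file name here are hypothetical), a requester using boto3, the AWS SDK for Python, might assemble and submit a HIT roughly as follows:

```python
# import boto3  # AWS SDK for Python; uncomment to actually call the API

# MTurk offers a sandbox endpoint so requesters can test without paying Workers.
SANDBOX_ENDPOINT = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

def build_hit_params(question_xml: str) -> dict:
    """Assemble the parameters a requester must specify when creating a HIT."""
    return {
        "Title": "Label objects in an image",
        "Description": "Choose the category that best matches the image.",
        "Keywords": "image, labeling, categorization",
        "Reward": "0.05",                    # USD per assignment, as a string
        "MaxAssignments": 3,                 # distinct Workers per task
        "LifetimeInSeconds": 24 * 3600,      # how long the HIT stays listed
        "AssignmentDurationInSeconds": 600,  # time a Worker has to finish
        "Question": question_xml,            # XML question document
    }

# With credentials configured, the actual submission would look like:
# client = boto3.client("mturk", endpoint_url=SANDBOX_ENDPOINT)
# response = client.create_hit(**build_hit_params(open("question.xml").read()))
# print(response["HIT"]["HITId"])
```

The same client exposes calls for the rest of the cycle described above, such as retrieving submitted assignments and approving or rejecting them.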

Use case examples

Processing photos / videos

Amazon Mechanical Turk provides a platform for processing images, a task well-suited to human intelligence. Requesters have created tasks that ask workers to label objects found in an image, select the most relevant picture in a group of pictures, screen inappropriate content, classify objects in satellite images, or digitize text from images such as scanned forms filled out by hand.[43]

Data cleaning / verification

Companies with large online catalogues use Mechanical Turk to identify duplicates and verify details of item entries. For example: removing duplicates in yellow pages directory listings, checking restaurant details (e.g. phone number and hours), and finding contact information from web pages (e.g. author name and email).

Information collection

The diversity and scale of the Mechanical Turk workforce allow information to be collected at a scale that would be difficult to achieve outside a crowd platform. Mechanical Turk allows Requesters to amass a large number of responses to various types of surveys, from basic demographics to academic research. Other uses include writing comments, descriptions, and blog entries for websites, and searching for data elements or specific fields in large government and legal documents.

Data processing

Companies use Mechanical Turk's crowd labor to understand and respond to different types of data. Common uses include editing and transcription of podcasts, translation, and matching search engine results.

Research validity

The validity of research conducted with the Mechanical Turk worker pool has long been debated among experts.[44] This is largely because questions of validity[45] are complex: they involve not only questions of whether the research methods were appropriate and whether the study was well-executed, but also questions about the goal of the project, how the researchers used MTurk, who was sampled, and what conclusions were drawn.

Most experts agree that MTurk is better suited for some types of research than others. MTurk appears well-suited for questions that seek to understand whether two or more things are related to each other (called correlational research; e.g., are happy people more healthy?) and questions that attempt to show one thing causes another thing (experimental research; e.g., being happy makes people more healthy). Fortunately, these categories capture most of the research conducted by behavioral scientists, and most correlational and experimental findings found in nationally representative samples replicate on MTurk.[46]

The type of research that is not well-suited for MTurk is often called "descriptive research." Descriptive research seeks to describe how or what people think, feel, or do; one example is public opinion polling. MTurk is not well-suited to such research because it does not select a representative sample of the general population. Instead, MTurk is a nonprobability, convenience sample. Descriptive research is best conducted with a probability-based, representative sample of the population researchers want to understand. When compared to the general population, people on MTurk are younger, more highly educated, more liberal, and less religious.[47]

Labor issues

Mechanical Turk has been criticized by journalists and activists for its interactions with and use of labor. Computer scientist Jaron Lanier noted how the design of Mechanical Turk "allows you to think of the people as software components" in a way that conjures "a sense of magic, as if you can just pluck results out of the cloud at an incredibly low cost".[48] A similar point is made in the book Ghost Work by Mary L. Gray and Siddharth Suri.[49]

Critics of MTurk argue that workers are forced onto the site by precarious economic conditions and then exploited by requesters with low wages and a lack of power when disputes occur. Journalist Alana Semuels’s article "The Internet Is Enabling a New Kind of Poorly Paid Hell" in The Atlantic is typical of such criticisms of MTurk.[50]

Some academic papers have obtained findings that support or serve as the basis for such common criticisms,[51] but others contradict them.[52] A recent academic commentary argued that study participants on sites like MTurk should be clearly warned about the circumstances in which they might later be denied payment as a matter of ethics,[53] even though such statements may not reduce the rate of careless responding.[54]

A paper published by a team at CloudResearch shows that only about 7% of people on MTurk view completing HITs as something akin to a full-time job. Most people report that MTurk is a way to earn money during their leisure time or as a side gig. In 2019, the typical worker spent five to eight hours per week and earned around $7 per hour. The sampled workers did not report mistreatment at the hands of requesters; they reported trusting requesters more than employers outside of MTurk. Similar findings were presented in a review of MTurk by the Fair Crowd Work organization, a collective of crowd workers and unions.[55]

Monetary compensation

The minimum payment that Amazon allows for a task is one cent. Because tasks are typically simple and repetitive, the majority pay only a few cents,[56] though there are also well-paying tasks on the site.

Many criticisms of MTurk stem from the fact that a majority of tasks offer low wages. In addition, workers are considered independent contractors rather than employees, and independent contractors are not protected by the Fair Labor Standards Act or other legislation that protects workers' rights. Workers on MTurk must also compete with one another for good HIT opportunities and spend uncompensated time searching for tasks.

The low payment offered for many tasks has fueled criticism that Mechanical Turk exploits workers and does not compensate them for the true value of the tasks they complete.[57] One study of 3.8 million tasks completed by 2,767 workers found that "workers earned a median hourly wage of about $2 an hour", with only 4% of workers earning more than $7.25 per hour.[58]

The Pew Research Center and the International Labour Office published data indicating people made around $5.00 per hour in 2015.[59] A study focused on workers in the U.S. indicated average wages of at least $5.70 an hour,[60] and data from the CloudResearch study found average wages of about $6.61 per hour. Some evidence suggests that very active and experienced people can earn $20 per hour or more.[61]

Fraud

The Nation magazine reported in 2014 that some Requesters had taken advantage of Workers by having them do the tasks, then rejecting their submissions in order to avoid paying them.[62] Available data indicates that rejections are fairly rare. Workers report having a small minority of their HITs rejected, perhaps as low as 1%.

In the Facebook–Cambridge Analytica data scandal, Mechanical Turk was one of the means of covertly gathering private information for a massive database.[63] The system paid people a dollar or two to install a Facebook-connected app and answer personal questions. Despite appearances, the survey task was not part of a demographic or psychological research project; its purpose was instead to induce workers to reveal personal information about their identities that Facebook and Mechanical Turk had not already collected.

Labor relations

Others have criticized the marketplace for not allowing workers to negotiate with employers. In response to criticisms of payment evasion and lack of representation, a group developed a third-party platform called Turkopticon, which allows workers to give feedback on their employers, helping workers avoid potentially unscrupulous jobs and recommend superior employers.[64][65] Another platform, Dynamo, allows workers to organize anonymously and run campaigns to better their work environment, such as the Guidelines for Academic Requesters and the Dear Jeff Bezos Campaign.[66][67][68][69] Amazon made it harder for workers to enroll in Dynamo by closing the requester account that provided workers with a code required for Dynamo membership. Workers have also created third-party plugins to identify higher-paying tasks, but Amazon updated its website to prevent these plugins from working. Workers have complained that Amazon's payment system occasionally stops working.[70]

Related systems

Mechanical Turk is comparable in some respects to the now-discontinued Google Answers service. However, Mechanical Turk is a more general marketplace that can potentially distribute any kind of work task around the world. The Collaborative Human Interpreter (CHI) by Philipp Lenssen also proposed using distributed human intelligence to help computer programs perform tasks that computers cannot do well; MTurk could serve as the execution engine for the CHI.

In 2014, the Russian search company Yandex launched Toloka, a similar crowdsourcing platform.[71]


Notes and References

  1. Web site: Amazon Mechanical Turk, FAQ page. 14 April 2017.
  2. Web site: Overview | Requester | Amazon Mechanical Turk . Requester.mturk.com . 2011-11-28.
  3. Web site: Amazon Mechanical Turk. www.mturk.com.
  4. Multiple sources:
    • Web site: Hybrid machine/human computing arrangement. 2001. 28 July 2016. 12 June 2018. https://web.archive.org/web/20180612141325/http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=/netahtml/PTO/srchnum.htm&r=1&f=G&l=50&s1=7,197,459.PN.&OS=PN/7,197,459&RS=PN/7,197,459/. dead.
  5. News: Artificial artificial intelligence. . 2006-06-10.
  6. Web site: AWS Developer Forums. 14 November 2012.
  7. Web site: Tamir. Dahn. 50000 Worldwide Mechanical Turk Workers. techlist. September 17, 2014.
  8. Book: Djellel . Difallah . Filatova . Elena . Ipeirotis . Panos . Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining . Demographics and Dynamics of Mechanical Turk Workers . 2018 . 135–143 . 10.1145/3159652.3159661. 9781450355810 . 22339115 .
  9. http://www.utne.com/science-technology/amazon-mechanical-turk-zm0z13jfzlin.aspx "Amazon Mechanical Turk: The Digital Sweatshop"
  10. Berg . Janine . 2015–2016 . Income Security in the On-Demand Economy: Findings and Policy Lessons from a Survey of Crowdworkers . Comparative Labor Law & Policy Journal . 37 . 543.
  11. Web site: Geiger . Abigail . 2016-07-11 . Research in the Crowdsourcing Age, a Case Study . 2023-01-09 . Pew Research Center: Internet, Science & Tech . en-US.
  12. Web site: Amazon Mechanical Turk -Fair Crowd Work . 2023-01-09 . en.
  13. Moss . Aaron J . Rosenzweig . Cheskie . Robinson . Jonathan . Jaffe . Shalom Noach . Litman . Leib . 2020-04-28 . Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages . 10.31234/osf.io/jbc9d. 236840556 .
  14. Web site: 2019-11-18 . MTurk is the most ethical way to recruit crowd workers. . 2023-01-09 . Blog TurkerView . en.
  15. Web site: Mturk pricing. 2019. 16 April 2019. AWS. Amazon.
  16. News: Mechanical Turk: The Demographics. Panos Ipeirotis. March 19, 2008. New York University. 2009-07-30. Panos Ipeirotis.
  17. News: Turker Demographics vs Internet Demographics. Panos Ipeirotis. March 16, 2009. New York University. 2009-07-30.
  18. Book: Litman, Leib . Conducting online research on Amazon Mechanical Turk and beyond . 2020 . Jonathan Robinson . 978-1-5063-9111-3 . 1st . Los Angeles . 1180179545.
  19. Robinson . Jonathan . Rosenzweig . Cheskie . Moss . Aaron J. . Litman . Leib . 2019-12-16 . Sudzina . Frantisek . Tapped out or barely tapped? Recommendations for how to harness the vast and largely unused potential of the Mechanical Turk participant pool . PLOS ONE . en . 14 . 12 . e0226394 . 10.1371/journal.pone.0226394 . 1932-6203 . 6913990 . 31841534. 2019PLoSO..1426394R . free .
  20. News: The New Demographics of Mechanical Turk. Panos Ipeirotis. March 9, 2010. New York University. 2014-03-24.
  21. Web site: MTurk Tracker. demographics.mturk-tracker.com. 1 October 2015.
  22. Web site: MTurk Tracker. demographics.mturk-tracker.com. 2 May 2019.
  23. Web site: MTurk Tracker. demographics.mturk-tracker.com. 17 April 2023.
  24. Casey . Logan . Chandler . Jesse . Levine . Adam . Proctor . Andrew. Sytolovich. Dara. 2017 . Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection . SAGE Open . 7 . 2 . 215824401771277 . 10.1177/2158244017712774 . free .
  25. Levay . Kevin . Freese . Jeremy . Druckman . James N. Druckman . James. 2016 . The Demographic and Political Composition of Mechanical Turk Samples . SAGE Open . 6. 215824401663643 . 10.1177/2158244016636433 . free .
  26. Agley. Jon. Xiao. Yunyu. Nolan. Rachael. Golzarri-Arroyo. Lilian. 2021. Quality control questions on Amazon's Mechanical Turk (MTurk): A randomized trial of impact on the USAUDIT, PHQ-9, and GAD-7. Behavior Research Methods. 54 . 2 . 885–897 . en. 10.3758/s13428-021-01665-8. 34357539. 8344397. 1554-3528. free.
  27. Hauser . David . Paolacci . Gabriele . Chandler . Jesse J. . 2018-09-01 . Common Concerns with MTurk as a Participant Pool: Evidence and Solutions . 10.31234/osf.io/uq45c. 240258666 .
    • Clifford . Scott . Jerit . Jennifer . 2016 . Cheating on Political Knowledge Questions in Online Surveys: An Assessment of the Problem and Solutions . Public Opinion Quarterly . en . 80 . 4 . 858–887 . 10.1093/poq/nfw030 . 0033-362X. free .
    • Hauser . David J. . Moss . Aaron J. . Rosenzweig . Cheskie . Jaffe . Shalom N. . Robinson . Jonathan . Litman . Leib . 2022-11-03 . Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk . Behavior Research Methods . 55 . 8 . 3953–3964 . en . 10.3758/s13428-022-01999-x . 36326997 . 1554-3528. free . 10700412 .
  28. Chandler . Jesse. . Shapiro . Danielle . 2016 . Conducting Clinical Research Using Crowdsourced Convenience Samples . Annual Review of Clinical Psychology . 12 . 53–81 . 10.1146/annurev-clinpsy-021815-093623 . 26772208 . free.
  29. Huff . Connor . Tingley . Dustin . 2015-07-01 . "Who are these people?" Evaluating the demographic characteristics and political preferences of MTurk survey respondents . Research & Politics . en . 2 . 3 . 205316801560464 . 10.1177/2053168015604648 . 7749084 . 2053-1680. free .
  30. Clifford . Scott . Jewell . Ryan M . Waggoner . Philip D . 2015-10-01 . Are samples drawn from Mechanical Turk valid for research on political ideology? . Research & Politics . en . 2 . 4 . 205316801562207 . 10.1177/2053168015622072 . 146591698 . 2053-1680. free .
  31. Chandler . Jesse . Rosenzweig . Cheskie . Moss . Aaron J. . Robinson . Jonathan . Litman . Leib . October 2019 . Online panels in social science research: Expanding sampling methods beyond Mechanical Turk . Behavior Research Methods . en . 51 . 5 . 2022–2038 . 10.3758/s13428-019-01273-7 . 1554-3528 . 6797699 . 31512174.
  32. 1606.05250. SQuAD: 100,000+ Questions for Machine Comprehension of Text. cs.CL. Rajpurkar. Pranav. Zhang. Jian. Lopyrev. Konstantin. Liang. Percy. 2016.
  33. Inside the High-Tech Search for a Silicon Valley Legend. Steve Silberman. July 24, 2007. Wired magazine. 2007-09-16.
  34. Web site: AVweb Invites You to Join the Search for Steve Fossett . 8 September 2007 . Avweb.com . 2011-11-28.
  35. Web site: Official Mechanical Turk Steve Fossett Results. 2007-09-24. 2012-08-14.
  36. News: Hikers find Steve Fossett's ID, belongings. Jim Christie. October 1, 2008. Reuters. 2008-11-27. https://web.archive.org/web/20081220030716/https://www.reuters.com/article/peopleNews/idUSTRE4907G820081001. 20 December 2008. live.
  37. Web site: Let's Get Physical. 5 August 2008 .
  38. Web site: Mechanical Games, online sports video for turkers | Neural. 29 October 2010 .
  39. Web site: Jail Benches and Amazon.com at SanTana's Grand Central Art Center | OC Weekly . 2019-04-16 . https://web.archive.org/web/20150906074809/http://www.ocweekly.com/2015-05-28/culture/john-spiak-grand-central-art-center-santa-ana/ . 2015-09-06 . dead .
    • Project: http://www.missconceptions.net/mediations/
  40. Web site: Amazon Web Services Blog: Amazon Mechanical Turk Status Update . Aws.typepad.com . 2005-12-06 . 2011-11-28.
  41. Web site: Documentation Archive : Amazon Web Services . Developer.amazonwebservices.com . 2011-11-28 . dead . https://web.archive.org/web/20090410032147/http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=28 . 2009-04-10 .
  42. Web site: Amazon Mechanical Turk API Reference . Developer.amazonwebservices.com .
  43. Web site: Inside Amazon's clickworker platform: How half a million people are being paid pennies to train AI. TechRepublic. 16 December 2016 .
  44. Can I Use Mechanical Turk (MTurk) for a Research Study?. Industrial and Organizational Psychology. 8. 2. 2015. Landers. R. N.. Behrend. T. S..
  45. Web site: External Validity - Generalizing Results in Research . explorable.com.
  46. Coppock . Alexander . Leeper . Thomas J. . Mullinix . Kevin J. . 2018-12-04 . Generalizability of heterogeneous treatment effect estimates across samples . Proceedings of the National Academy of Sciences . en . 115 . 49 . 12441–12446 . 10.1073/pnas.1808083115 . 0027-8424 . 6298071 . 30446611. 2018PNAS..11512441C . free .
  47. Chandler . Jesse . Rosenzweig . Cheskie . Moss . Aaron J. . Robinson . Jonathan . Litman . Leib . 2019-10-01 . Online panels in social science research: Expanding sampling methods beyond Mechanical Turk . Behavior Research Methods . en . 51 . 5 . 2022–2038 . 10.3758/s13428-019-01273-7 . 1554-3528 . 6797699 . 31512174.
  48. Book: Jaron Lanier. Who Owns the Future? . 2013. Simon and Schuster. 978-1-4516-5497-4. Who Owns the Future? .
  49. Web site: Ghost Work . 2023-01-24 . Ghost Work . en-US.
  50. Web site: Semuels . Alana . 2018-01-23 . The Internet Is Enabling a New Kind of Poorly Paid Hell . 2023-01-24 . The Atlantic . en.
  51. Fort . K. . Adda . G. . Cohen . K.B. . 2011 . Amazon Mechanical Turk: Gold mine or coal mine? . Computational Linguistics . 37 . 2 . 413–420. 10.1162/COLI_a_00057 . 1051130 . free .
    • Williamson . Vanessa . January 2016 . On the Ethics of Crowdsourced Research . PS: Political Science & Politics . en . 49 . 1 . 77–81 . 10.1017/S104909651500116X . 155196102 . 1049-0965. free .
  52. Horton . John J. . 2011-04-01 . The condition of the Turking class: Are online employers fair and honest? . Economics Letters . en . 111 . 1 . 10–12 . 10.1016/j.econlet.2010.12.007 . 1001.1172 . 37577313 . 0165-1765.
  53. Agley . Jon . Mumaw . Casey . 2024-05-29 . Warning Crowdsourced Study Participants About Possible Consequences for Inattentive Participation Relates to Informed Consent, Regardless of Effects on Data Quality . Health Behavior Research . en . 7 . 2 . 10.4148/2572-1836.1236 . 2572-1836.
  54. Brühlmann . Florian . Memeti . Zgjim . Aeschbach . Lena F. . Perrig . Sebastian A. C. . Opwis . Klaus . 2024-01-18 . The effectiveness of warning statements in reducing careless responding in crowdsourced online surveys . Behavior Research Methods . en . 10.3758/s13428-023-02321-z . 1554-3528. free .
  55. Web site: Amazon Mechanical Turk -Fair Crowd Work . 2023-01-24 . en.
  56. Ipeirotis, P. G. (2010). Analyzing the amazon mechanical turk marketplace. XRDS: Crossroads, The ACM magazine for students, 17(2), 16-21.
  57. Book: 10.1109/CGC.2013.89. 531–535. 2013. Schmidt. Florian Alexander. The Good, the Bad and the Ugly: Why Crowdsourcing Needs Ethics . 2013 International Conference on Cloud and Green Computing. 978-0-7695-5114-2. 18798641.
  58. Book: Hara . Kotaro . Adams . Abigail . Milland . Kristy . Savage . Saiph . Callison-Burch . Chris . Bigham . Jeffrey P. . Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems . A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk . 2018-04-21 . https://doi.org/10.1145/3173574.3174023 . CHI '18 . New York, NY, USA . Association for Computing Machinery . 1–14 . 10.1145/3173574.3174023 . 978-1-4503-5620-6. 5040507 .
  59. Berg, J. (2015). Income security in the on-demand economy: Findings and policy lessons from a survey of crowdworkers. Comparative Labor Law and Policy Journal, 37, 543.
  60. Litman . Leib . Robinson . Jonathan . Rosen . Zohn . Rosenzweig . Cheskie . Waxman . Joshua . Bates . Lisa M. . 2020-02-21 . The persistence of pay inequality: The gender pay gap in an anonymous online labor market . PLOS ONE . en . 15 . 2 . e0229383 . 10.1371/journal.pone.0229383 . 1932-6203 . 7034870 . 32084233. 2020PLoSO..1529383L . free .
  61. Web site: 2019-11-18 . MTurk is the most ethical way to recruit crowd workers. . 2023-01-24 . Blog TurkerView . en.
  62. How Crowdworkers Became the Ghosts in the Digital Machine. Moshe Z.. Marvit. February 5, 2014. www.thenation.com.
  63. News: Cambridge Analytica and the Coming Data Bust. New York Times. April 10, 2018. April 13, 2018. The New York Times.
  64. Crowdsourcing grows up as online workers unite . Hal Hodson . February 7, 2013 . New Scientist . May 21, 2015.
  65. Web site: turkopticon.. turkopticon.ucsd.edu.
  66. News: 'Amazon's Mechanical Turk workers protest: 'I am a human being, not an algorithm' . Mark Harris . December 3, 2014 . The Guardian . October 6, 2015.
  67. 'Amazon's Mechanical Turk workers want to be treated like humans' . Jon . Fingas . December 3, 2014 . Engadget . October 6, 2015.
  68. Web site: Amazon's Mechanical Turkers want to be recognized as 'actual human beings' . James Vincent . December 4, 2014 . The Verge . October 6, 2015.
  69. WHAT DOES A UNION LOOK LIKE IN THE GIG ECONOMY? . Sarah Kessler . February 19, 2015 . Fast Company . October 6, 2015.
  70. Web site: Semuels . Alana . The Internet Is Enabling a New Kind of Poorly Paid Hell . The Atlantic . 25 April 2019 . 23 January 2018.
  71. Web site: Yandex.Toloka.