*******************************************
Title: "Incentivizing High Quality Crowdwork"
Speaker: Jennifer Wortman Vaughan, Microsoft Research, New York

We study the causal effects of financial incentives on the quality of crowdwork. We focus on performance-based payments (PBPs), bonus payments awarded to workers for producing high quality work. We design and run randomized behavioral experiments on the popular crowdsourcing platform Amazon Mechanical Turk with the goal of understanding when, where, and why PBPs help, identifying properties of the payment, payment structure, and the task itself that make them most effective. We provide examples of tasks for which PBPs do improve quality. For such tasks, the effectiveness of PBPs is not too sensitive to the threshold for quality required to receive the bonus, while the magnitude of the bonus must be large enough to make the reward salient. We also present examples of tasks for which PBPs do not improve quality. Our results suggest that for PBPs to improve work quality, the task must be effort-responsive: the task must allow workers to produce higher quality work by exerting more effort. We also give a simple method to determine if a task is effort-responsive a priori. Furthermore, our experiments suggest that all payments on Mechanical Turk are, to some degree, implicitly performance-based in that workers believe their work may be rejected if their performance is sufficiently poor. Finally, we propose a new model of worker behavior that extends the standard principal-agent model from economics to include a worker's subjective beliefs about his likelihood of being paid, and show that the predictions of this model are in line with our experimental findings. I will discuss these results in the context of developing more realistic foundations for algorithmic theory for crowdsourcing markets and human computation more broadly. This talk is based on joint work with Chien-Ju Ho, Alex Slivkins, and Sid Suri.

*******************************************
Title: "Bayesian Models for Powerful Crowdsourced Data Analysis"
Speaker: Edwin Simpson, Oxford University

Combining crowdsourcing with machine learning enables us to handle the unreliability of workers and the potentially large scale of datasets that require human analysis. By taking a Bayesian, probabilistic approach, we can aggregate information from the crowd intelligently, discover behaviour patterns that help us optimise the crowd, and assign tasks to increase the speed of learning from a crowd. In this talk I will present several pieces of work on Bayesian probabilistic models for intelligent aggregation and task assignment in crowdsourcing applications. In particular, I will examine efficient inference and extensions of Bayesian classifier combination. To kickstart the conversation about future opportunities for machine learning, I will discuss techniques to further optimise the deployment of crowds and learn to automate analysis tasks more rapidly with minimal initial training data.
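
(As a concrete illustration of the classifier-combination family mentioned above, the sketch below shows Dawid-Skene-style aggregation of noisy crowd labels using per-worker confusion matrices. It is not the speaker's implementation; the EM updates, the pseudo-count prior, and all names are illustrative assumptions.)

# Minimal Dawid-Skene-style EM for aggregating noisy crowd labels.
# Illustrative sketch only: priors, updates, and names are assumptions,
# not the speaker's model.
import numpy as np

def aggregate_labels(labels, n_classes, n_iters=50):
    """labels: iterable of (task_id, worker_id, label) triples."""
    labels = np.asarray(labels)
    n_tasks = labels[:, 0].max() + 1
    n_workers = labels[:, 1].max() + 1

    # Initialise per-task class posteriors from a soft majority vote.
    post = np.ones((n_tasks, n_classes)) / n_classes
    for t, _, l in labels:
        post[t, l] += 1
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iters):
        # M-step: class priors and per-worker confusion matrices,
        # with a small pseudo-count acting as a prior.
        prior = post.sum(axis=0) / n_tasks
        conf = np.full((n_workers, n_classes, n_classes), 0.1)
        for t, w, l in labels:
            conf[w, :, l] += post[t]
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: recompute task posteriors given worker reliabilities.
        log_post = np.tile(np.log(prior), (n_tasks, 1))
        for t, w, l in labels:
            log_post[t] += np.log(conf[w, :, l])
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)

    return post  # posterior probability of each class for each task

# Example: three workers label two tasks; one worker disagrees on task 0.
votes = [(0, 0, 1), (0, 1, 1), (0, 2, 0), (1, 0, 0), (1, 1, 0), (1, 2, 0)]
print(aggregate_labels(votes, n_classes=2).round(2))

(A fully Bayesian treatment such as independent Bayesian classifier combination would place Dirichlet priors on the confusion-matrix rows and class proportions and use variational or sampling-based inference instead of these point estimates.)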
*******************************************
Title: "World's Largest Online Workplace at Elance-oDesk"
Speaker: Panagiotis Papadimitriou and Ioannis Antonellis, Elance-oDesk

Elance-oDesk is the world's largest online workplace, with 8M+ contractors and total billings of more than $900M in 2014 alone. With tens of thousands of jobs filled on our platform every day, we have the world's best view of what it takes for a client to find the right contractor for a job, or for a contractor to find their ideal next job. In this talk we will provide an overview of how we leverage machine learning and human intelligence to create products that help clients and contractors succeed at Elance-oDesk and change their lives forever.

*******************************************
Title: "Incentivizing Users for Balancing Bike Sharing Systems"
Speaker: Andreas Krause, ETH Zurich

Human computation, a.k.a. crowdsourcing, aims to fuse human knowledge and expertise with computing to help solve problems that neither people nor machines can solve alone. The success of such human-powered computing systems depends heavily on the active and effective participation of their users. This talk explores research questions at the interplay of learning and incentives, with the goal of improving the overall effectiveness of such systems. In particular, we discuss the challenges that operators of bike sharing systems face from fluctuating and unpredictable demand, which leads to imbalance problems such as the unavailability of bikes or parking docks at stations. We present a crowdsourcing mechanism that incentivizes users to take part in the bike repositioning process by offering them alternative locations at which to pick up or return bikes in exchange for monetary rewards. We deployed the proposed mechanism through a smartphone app among users of a large-scale bike sharing system operated by a public transport company in a European city, and we present results from this experimental deployment.

*******************************************
Title: "Crowdsourcing the Facebook Entity Graph"
Speaker: Damien Peters and Julian Eisenschlos, Facebook's Crowdsourcing Team

During this talk we will give an overview of both the product and the algorithmic aspects of crowdsourcing at Facebook. Facebook's Entity Graph powers products such as Checkins, Search, Ads and various recommendation features. In addition to automated methods such as imports, web extraction and machine learning, a crowdsourcing system that empowers people to curate the graph is essential to making it representative of its worldwide audience and to keeping it rich, accurate and responsive to changes in the real world. We will start with the experience of making crowdsourced contributions on our platform, examining the motivations people have and the challenges of keeping them engaged while guiding them to provide quality information in the best possible format. In the core part of the talk we will dive into the data side of our work: we will follow the life of a crowdsourced contribution all the way until it is written to the graph, with a particular emphasis on assessing the quality of the input we receive from our users.

*******************************************
Title: "When Crowds Are Smarter Than Doctors"
Speaker: Jared Heyman, CEO, CrowdMed

CrowdMed harnesses the wisdom of crowds to solve even the world's most difficult medical cases online. Its patented crowdsourcing approach solves, in just weeks, cases that had stumped individual doctors for years, at a small fraction of the cost of the traditional medical system. Learn the mechanics of how CrowdMed works, including its unique process, incentive structure, reputation system, community moderation features, and communication and collaboration tools.

*******************************************