Subject: CFP: NIPS'14 Workshop on Crowdsourcing and Machine Learning

==============================================================

***Call for Contributions***

Workshop on Crowdsourcing and Machine Learning
at the Annual Conference on Neural Information Processing Systems (NIPS 2014)
Montreal, Canada
http://crowdml.cc/nips2014/

Workshop Date: Dec 13, 2014
Paper Submission Deadline: 20 Oct 2014, 11:59:00 PM PDT
Author Notification: 25 Oct 2014, 11:59:00 PM PDT

==============================================================

***Overview***

Crowdsourcing aims to combine human knowledge and expertise with computing to help solve problems and scientific challenges that neither machines nor humans can solve alone. Beyond a number of human-powered scientific projects, including GalaxyZoo, eBird, and Foldit, crowdsourcing is expanding the ability of academic researchers to build new systems and run new experiments involving people, and is also gaining wide use within industry for collecting training data for machine learning. There are a number of online marketplaces for crowdsourcing, including Amazon's Mechanical Turk, oDesk, and MobileWorks.

The fundamental question we plan to explore in this workshop is: how can we build systems that combine the intelligence of humans and the computing power of machines to solve challenging scientific and engineering problems? The goal is to improve the performance of complex human-powered systems by making them more efficient, robust, and scalable.

Current research in crowdsourcing often focuses on micro-tasking (for example, labeling a set of images), designing algorithms that solve optimization problems from the job requester's perspective under simple models of worker behavior. However, the participants are people with rich capabilities, including learning and collaboration, suggesting the need for more nuanced approaches that place special emphasis on the participants.
Such human-powered systems could involve large numbers of people with varying expertise, skills, interests, and incentives. This poses many interesting research questions and exciting opportunities for the machine learning community. The goal of this workshop is to foster these ideas and work towards this goal by bringing together experts from the fields of machine learning, cognitive science, economics, game theory, and human-computer interaction.

Topics of interest in the workshop include (but are not limited to):

*Social aspects and collaboration*
How can systems exploit the social ties of the underlying participants or users to create incentives for users to collaborate? How can online social networks be used to create tasks with a gamification component and engage users in useful activities? With an ever-increasing share of time on the Internet being spent on online social networks, there is a huge opportunity to elicit useful contributions from users at scale by carefully designing tasks.

*Incentives, pricing mechanisms and budget allocation*
How can we design incentive structures and pricing policies that maximize both the satisfaction of participants and the utility of the job requester for a given budget? How can techniques from machine learning, economics, and game theory be used to learn optimal pricing policies and to infer optimal incentive designs?

*Learning by participants*
How can we use insights from machine learning to build tools for training and teaching participants to carry out complex or difficult tasks? How can this training be actively adapted based on the skills or expertise of the participants and by tracking the learning process?

*Peer prediction and knowledge aggregation*
How can complex crowdsourcing tasks be decomposed into simpler micro-tasks? How can techniques of peer prediction be used to elicit informative responses from participants and incentivize effort?
Can we design models and algorithms to effectively aggregate responses and knowledge, especially for complex tasks?

*Privacy aspects*
The question of privacy in human-powered systems has often been ignored; we seek to understand the privacy aspects from the perspective of both the job requester and the participants. How can a job requester (such as a firm interested in translating legal documents) carry out crowdsourcing tasks without revealing private information to the crowd? How can systems negotiate access to participants' private information (such as GPS location in community sensing applications) in return for appropriate incentives?

*Open theoretical questions and novel applications*
What are the open research questions, emerging trends, and novel applications related to the design of incentives in human computation and crowdsourcing systems?

==============================================================

***Submissions***

Submissions should follow the NIPS 2014 format and are encouraged to be up to eight pages.

Submission site: https://cmt.research.microsoft.com/CROWDW2014/
Paper Submission Deadline: 20 Oct 2014, 11:59:00 PM PDT

More information is available at: http://crowdml.cc/nips2014/

==============================================================

***Organizers***

Adish Singla (ETH Zurich)
Chien-Ju Ho (University of California, Los Angeles)
David Parkes (Harvard University)
Nihar Shah (University of California, Berkeley)
Dengyong Zhou (Microsoft Research, Redmond)

==============================================================