The Data for Policy 2016 conference began with strong support from major institutions such as the University of Cambridge and the European Commission. Building on the success of the previous year, five major themes formed the basis for discussion throughout the conference: the revolution in government service provision using ‘Big Data’; data collection and its ethical implications; computational methods and technical challenges; methods of data collection and modelling; new and emerging sources of data; and finally, the application of data to policy areas both new and old. Now in its second year, the conference provided a space for ideas and people to mix and for healthy discussion to take place, allowing academics, policy makers and technical experts to interact with people from beyond their own disciplines.
In the words of Professor Anthony Finkelstein, Chief Scientific Advisor for National Security to the UK government:
“Data for Policy provides a unique opportunity to have a dialogue across many disciplinary boundaries, it’s a forum in which we can ask questions and move collectively towards solutions, we can share methods. I think it’s invaluable.”
Professor Enrico Giovannini from the University of Rome ‘Tor Vergata’ added that conferences like Data for Policy “are very important to share good practices and identify the latest developments.” Meanwhile, David Mair, a Head of Unit at the Joint Research Centre of the European Commission, appreciated the collaborative aspects of the conference:
“It’s fantastic to come here and take the temperature of what’s going on in this field and hopefully out of this we can begin to look ahead and see where the new connections are going to be made between ideas and data science.”
Attendees came from a wide range of disciplines and institutions, from Professor Andrew Blake of the Alan Turing Institute to Maive Rute, the Deputy Director-General of the Joint Research Centre at the European Commission. This close co-operation of policy makers and data scientists is unique in the UK, and indeed the world. Prestigious institutions such as the United Nations, the Office for National Statistics and the World Bank were represented among the attendees. Special sessions were also held throughout the conference by teams from Technopolis Group, GovLab, Leiden University and the Bank of England. Technopolis Group organised a session, chaired by Cristina Rosemberg, which discussed the use of ‘Big Data for Science and Innovation Policy’; the presenters for this group discussion were from the National Physical Laboratory, Nesta and Innovate UK. The Bank of England also organised a session, which discussed a number of different topics including the use of big data for systemic risk assessment and ‘History Dependence in the Housing Market’.
The main sessions which took place throughout the conference were formed through a rigorous peer-review process after a Call for Proposals was issued in March. A large number of submissions were received, of which only 50% were selected for presentation. The opening plenary session featured talks from a number of contributors. In her opening lecture, Helen Margetts, Director of the Oxford Internet Institute, discussed ‘Platform Government and a Data Science for Policy-making’, while Philip Treleaven, Professor of Computing and Director of the Financial Computing Centre at UCL, gave a talk entitled ‘Algorithmic Regulation: Using Blockchain smart contract technology to revolutionise Financial Regulation’. The main body of the conference was formed of eighteen regular panels discussing a variety of topics. Dr. Rayid Ghani, Director of Data Science and Public Policy at the University of Chicago, gave a lecture entitled ‘Doing practical data science for social good and public policy’. Quentin Palfrey, Executive Director of J-PAL at the Massachusetts Institute of Technology and a former senior advisor at the White House, highlighted the benefits of ‘Big Data’ in reducing poverty and inequality in his talk entitled ‘Translating Rigorous Evidence into Policies that Benefit the Poor’.
“One of the things that’s difficult in determining whether a social programme works is knowing what would have happened if you hadn’t had the intervention,” said Mr Palfrey, “so what we do is take the methods of scientific inquiry that have transformed medical science, such as randomised control trials and we apply them to large-scale social programmes. Conferences like Data for Policy are very helpful tools for increasing the dialogue between the scholarly community and policy makers about what works, about what resources exist to help inform policymaking and also to teach policy makers how to be effective producers and consumers of evidence.”
The conference ended with a closing plenary session which involved a number of different speakers. One such speaker was Barbara Ubaldi from the Organisation for Economic Co-operation and Development, whose lecture was entitled ‘A Data-driven Public Sector for Sustainable and Inclusive Governance’.
“I think the biggest challenge in the next five years…is moving from concepts to reality,” said Ms Ubaldi, “this means focusing on implementation – we need to implement these ‘nice’ ideas. This requires greater connectivity between the public and private sectors, as well as academia. Conferences like Data for Policy are good because on the one hand they raise awareness and on the other they bring together stakeholders that normally would not interact with one another, and they bridge the interactions – like how to make concrete use of data and technology to inform policy issues.”
A number of interesting points emerged over the course of the conference. Maive Rute from the European Commission drew attention to a “growing scepticism among the public towards scientific experts and the reliability of data and science in general in a ‘post fact’ climate” – a broad and cogent argument, especially given recent political developments around the world that have shown a growing unwillingness to heed the advice of experts. Similarly, in his talk entitled ‘Data for Policy: A Myth or a Must?’, Enrico Giovannini claimed that “the value added by statistics depends on their relevance, trust in the media and the consumers’ literacy” and went on to say, very powerfully, that “data is the lifeblood of democracy.” In his popular and witty presentation, Professor Jim Waldo of Harvard University highlighted the problem more succinctly when he said “in data science, the science is only as good as the data”. Building on this theme and focusing more on praxis, other speakers such as Sarah Geist of Leiden University posed the question: “why don’t computer scientists know how policy works?”, and Dr. Rayid Ghani from the University of Chicago spoke for many when he said that “training is needed on both sides – computer scientists and government officials – to use data effectively.”
Alongside the conference a poster competition was also run featuring presentations by students from University College London, Carnegie Mellon University, Birmingham University and many other institutions. The overall winner of the competition, which relied on votes from attendees over the course of the conference, was Maria De-Arteaga from Carnegie Mellon University for her poster entitled ‘Discovery of Complex Anomalous Patterns of Sexual Violence in El Salvador’.
The Data for Policy 2016 conference ran from 15 to 16 September. The conference took place in the pleasant surroundings of the University of Cambridge’s Computer Laboratory, making use of its modern lecture halls and superb catering facilities – attendees were well taken care of and had many opportunities to mingle informally and share ideas.
The conference also had extensive multimedia coverage, including video platforms and social media such as Twitter. Videos of all the keynote and plenary sessions can be found on the Data for Policy YouTube channel, alongside individual interviews with key participants; further information and real-time commentary are available on the Data for Policy Twitter account.
Data for Policy Twitter: https://twitter.com/dataforpolicy
Data for Policy website: https://dataforpolicy.org/
Data for Policy YouTube: https://www.youtube.com/channel/UCsJUrj-FJ4qT9eNuTOPyYNQ
(Report by Nathaniel Hayward)
Data for Policy 2015 – Policy-making in the Big Data Era: Opportunities and Challenges
15-17 June 2015, Cambridge
Current decision-making processes are far from optimal in representing the best interests of the public and stakeholders, as contemporary policy domains are complex, high-dimensional and subject to a large degree of uncertainty. The massive amounts of data captured in our physical world through sensors and electronic devices offer huge potential to advance these processes. With the availability of new technologies, new formulations are needed for fundamental questions such as how to conduct a census, how to produce labour statistics, or how to incorporate data mined from social media and administrative operations. Efficient procedures for drawing links between large-scale data-processing technologies and existing expert knowledge in major policy domains could make policy development processes more citizen-focused, taking into account public needs and preferences supported by actual experiences of public services. This, however, comes with serious privacy and security concerns, as intersecting various data sources could reveal unprecedented amounts of private information.
The conference committee invited contributions covering the following topics:
- Information and evidence in the digital age
- Policy-making mechanisms and modelling approaches
- Existing methodologies, case studies, best practices for use of Big Data in policy
- Data collection, storage, processing and access procedures
- Cumulative learning in digital environments, potentials in policy context, challenges and limitations
- Interaction of domain expertise with digital processing technologies; dealing with imperfect/uncertain data; psychology/behaviour of decision
- Security and privacy issues; ethics and law
Publications from the Data for Policy 2015 Conference:
Selected Papers for the Special Issue of Policy & Internet (Wiley):
Meyer, E. T., Crowcroft, J., Engin, Z. and Alexander, A. (2017), Data for Public Policy. Policy & Internet, 9: 4–6. doi:10.1002/poi3.147.
Guerrero, O. A. and López, E. (2017), Understanding Unemployment in the Era of Big Data: Policy Informed by Data-Driven Theory. Policy & Internet, 9: 28–54. doi:10.1002/poi3.136.
Malomo, F. and Sena, V. (2017), Data Intelligence for Local Government? Assessing the Benefits and Barriers to Use of Big Data in the Public Sector. Policy & Internet, 9: 7–27. doi:10.1002/poi3.141.
Piscopo, A., Siebes, R. and Hardman, L. (2017), Predicting Sense of Community and Participation by Applying Machine Learning to Open Government Data. Policy & Internet, 9: 55–75. doi:10.1002/poi3.145.
Longo, J., Kuras, E., Smith, H., Hondula, D. M. and Johnston, E. (2017), Technology Use, Exposure to Natural Hazards, and Being Digitally Invisible: Implications for Policy Analytics. Policy & Internet, 9: 76–108. doi:10.1002/poi3.144.
Multimedia from the Data for Policy 2015 Conference:
The Data for Policy 2015 Conference was held at the University of Cambridge.