A weekly curation of new findings and developments on innovation in governance.

Table of Contents

A Research Roadmap for Human Computation

Jun 05, 2015 02:49 am

Emerging Technology From the arXiv : “The wisdom of the crowd has become so powerful and so accessible via the Internet that it has become a resource in its own right. Various services now tap into this rich supply of human cognition, such as Wikipedia, Duolingo, and Amazon’s Mechanical Turk.

So important is this resource that scientists have given it a name; they call it human computation. And a rapidly emerging and increasingly important question is how best to exploit it.

Today, we get an answer of sorts thanks to a group of computer scientists, crowdsourcing pioneers, and visionaries who have created a roadmap for research into human computation. The team, led by Pietro Michelucci at the Human Computation Institute, points out that human computation systems have been hugely successful at tackling complex problems, from identifying spiral galaxies to organizing disaster relief.

But their potential is even greater still, provided that human cognition can be efficiently harnessed on a global scale. Last year, they met to discuss these issues and have now published the results of their debate.

They begin by pointing out the extraordinary successes of human computation, then describe the kinds of projects they want to create. They call one idea Project Houston, after the crowdsourced effort on the ground that helped bring back the Apollo 13 astronauts after an on-board explosion on the way to the moon.

Their idea is that similar help can be brought to bear from around the world when individuals on earth find themselves in trouble. By this they mean individuals who might be considering suicide or suffering from depression, for example.

The plan is to use state-of-the-art speech analysis and natural language understanding to detect stress and offer help. This would come in the form of composite personalities made up from individuals with varying levels of expertise in the crowd, supported by artificial intelligence techniques. “Project Houston could provide a consistently kind and patient personality even if the ‘crowd’ changes completely over time,” they say.

Another idea is to build on the way that crowdsourcing helps people learn. One example of this is Duolingo, an app that offers free language lessons while simultaneously acting as a document translation service. “Why stop with language learning and translation?” they ask.

A similar approach could help people learn new skills as they work online, a process that should allow them to take on more complex roles. One example is in the field of radiology, where an important job is to recognize tumors on x-ray images. This is a task that machine vision algorithms do not yet perform reliably…

Yet another idea would be to crowdsource information that helps the poorest families in America find social welfare programs. These programs are often difficult to navigate and represent a disproportionate hardship for the people who are most likely to benefit from them: those who are homeless, who have disabilities, who are on low income, and so on.

The idea is that the crowd should take on some of this burden freeing up this group for other tasks, like finding work, managing health problems and so on.

These are worthy goals but they raise some significant questions. Chief among these is the nature of the ethical, legal, and social implications of human computation. How can this work be designed to allow meaningful and dignified human participation? How can the outcomes be designed so that the most vulnerable people can benefit from it? And what is the optimal division of labor between machines and humans to produce a specific result?

Ref: “A U.S. Research Roadmap for Human Computation”

Full Post: A Research Roadmap for Human Computation

Nudges Do Not Undermine Human Agency

Jun 04, 2015 08:35 am

Cass R. Sunstein in the Journal of Consumer Policy: “Some people believe that nudges undermine human agency, but with appropriate nudges, neither agency nor consumer freedom is at risk. On the contrary, nudges can promote both goals. In some contexts, they are indispensable. There is no opposition between education on the one hand and nudges on the other. Many nudges are educative. Even when they are not, they can complement, and not displace, consumer education…(More)”

Full Post: Nudges Do Not Undermine Human Agency

Field experimenting in economics: Lessons learned for public policy

Jun 04, 2015 08:32 am

Robert Metcalfe at OUP Blog: “Do neighbourhoods matter to outcomes? Which classroom interventions improve educational attainment? How should we raise money to provide important and valued public goods? Do energy prices affect energy demand? How can we motivate people to become healthier, greener, and more cooperative? These are some of the most challenging questions policy-makers face. Academics have been trying to understand and uncover these important relationships for decades.

Many of the empirical tools available to economists to answer these questions do not allow causal relationships to be detected. Field experiments represent a relatively new methodological approach capable of measuring the causal links between variables. By overlaying carefully designed experimental treatments on real people performing tasks common to their daily lives, economists are able to answer interesting and policy-relevant questions that were previously intractable. Manipulation of market environments allows these economists to uncover the hidden motivations behind economic behaviour more generally. A central tenet of field experiments in the policy world is that governments should understand the actual behavioural responses of their citizens to changes in policies or interventions.

Field experiments represent a departure from laboratory experiments. Traditionally, laboratory experiments create experimental settings with tight control over the decision environment of undergraduate students. While these studies also allow researchers to make causal statements, policy-makers are often concerned subjects in these experiments may behave differently in settings where they know they are being observed or when they are permitted to sort out of the market.

For example, you might expect a college student to contribute more to charity when she is scrutinized in a professor’s lab than when she can avoid the ask altogether. Field experiments allow researchers to make these causal statements in a setting that is more generalizable to the behaviour policy-makers are directly interested in.

To date, policy-makers traditionally gather relevant information and data by using focus groups, qualitative evidence, or observational data without a way to identify causal mechanisms. It is quite easy to elicit people’s intentions about how they behave with respect to a new policy or intervention, but there is increasing evidence that people’s intentions are a poor guide to predicting their behaviour.
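The causal logic described above can be sketched in a few lines. The example below is a hypothetical illustration (the scenario, numbers, and effect size are all invented): because treatment is randomly assigned, a simple difference in mean outcomes estimates the average causal effect.

```python
import random
import statistics

random.seed(42)

# Hypothetical field experiment on charitable giving (all numbers invented).
# Each person is randomly assigned to treatment (a donation-ask letter) or control.
n = 500
treated = [max(0.0, random.gauss(12.0, 4.0)) for _ in range(n)]  # donations with the letter
control = [max(0.0, random.gauss(10.0, 4.0)) for _ in range(n)]  # donations without it

# Random assignment balances everything else on average, so the simple
# difference in means is an unbiased estimate of the average treatment effect.
ate = statistics.mean(treated) - statistics.mean(control)
print(f"estimated average treatment effect: {ate:.2f}")
```

Real field experiments add covariate adjustment, clustering, and power calculations, but random assignment is what licenses the causal reading of the difference in means.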

However, we are starting to see a small change in how governments seek to answer pertinent questions. For instance, the UK tax office (Her Majesty’s Revenue and Customs) now uses field experiments across some of its services to improve the efficacy of scarce taxpayers’ money. In the US, there are movements toward gathering more evidence from field experiments.

In the corporate world, experimenting is not new. Many of the current large online companies—such as Amazon, Facebook, Google, and Microsoft—are constantly using field experiments matched with big data to improve their products and deliver better services to their customers. More and more companies will use field experiments over time to help them better set prices, tailor advertising, provide a better customer journey to increase welfare, and employ more productive workers…(More).

See also Field Experiments in the Developed World: An Introduction (Oxford Review of Economic Policy)

Full Post: Field experimenting in economics: Lessons learned for public policy

Why Technology Hasn’t Delivered More Democracy

Jun 04, 2015 03:38 am

Collection of POVs aggregated by Thomas Carothers at Foreign Policy: “New technologies offer important tools for empowerment — yet democracy is stagnating. What’s up?…

The current moment confronts us with a paradox. The first fifteen years of this century have been a time of astonishing advances in communications and information technology, including digitalization, mass-accessible video platforms, smartphones, social media, billions of people gaining internet access, and much else. These revolutionary changes all imply a profound empowerment of individuals through exponentially greater access to information, tremendous ease of communication and data-sharing, and formidable tools for networking. Yet despite these changes, democracy — a political system based on the idea of the empowerment of individuals — has in these same years become stagnant in the world. The number of democracies today is basically no greater than it was at the start of the century. Many democracies, both long-established ones and newer ones, are experiencing serious institutional debilities and weak public confidence.

How can we reconcile these two contrasting global realities — the unprecedented advance of technologies that facilitate individual empowerment and the overall lack of advance of democracy worldwide? To help answer this question, I asked six experts on political change, all from very different professional and national perspectives. Here are their responses, followed by a few brief observations of my own.

1. Place a Long Bet on the Local By Martin Tisné

2. Autocrats Know How to Use Tech, Too By Larry Diamond

3. Limits on Technology Persist By Senem Aydin Düzgit

4. The Harder Task By Rakesh Rajani

5. Don’t Forget Institutions By Diane de Gramont

6. Mixed Lessons from Iran By Golnaz Esfandiari

7. Yes, It’s Complicated By Thomas Carothers…(More)”

Full Post: Why Technology Hasn’t Delivered More Democracy

Signal: Understanding What Matters in a World of Noise

Jun 04, 2015 01:51 am

Book by Stephen Few: “In this age of so-called Big Data, organizations are scrambling to implement new software and hardware to increase the amount of data they collect and store. However, in doing so they are unwittingly making it harder to find the needles of useful information in the rapidly growing mounds of hay. If you don’t know how to differentiate signals from noise, adding more noise only makes things worse. When we rely on data for making decisions, how do we tell what qualifies as a signal and what is merely noise? In and of itself, data is neither. Assuming that data is accurate, it is merely a collection of facts. When a fact is true and useful, only then is it a signal. When it’s not, it’s noise. It’s that simple. In Signal, Stephen Few provides the straightforward, practical instruction in everyday signal detection that has been lacking until now. Using data visualization methods, he teaches how to apply statistics to gain a comprehensive understanding of one’s data and adapts the techniques of Statistical Process Control in new ways to detect not just changes in the metrics but also changes in the patterns that characterize data…(More)”
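The control-chart idea Few adapts can be sketched as follows, using a made-up weekly metric. Classical Statistical Process Control estimates the limits from a baseline period (often via moving ranges), but simple three-sigma limits convey the core signal-versus-noise distinction:

```python
import statistics

# Minimal Statistical Process Control sketch: points inside the control
# limits are routine variation (noise); points outside are signals worth
# investigating. The metric values are invented for illustration.
metric = [102, 98, 101, 99, 100, 103, 97, 100, 99, 101, 135, 100, 98]

mean = statistics.mean(metric)
sd = statistics.pstdev(metric)
upper, lower = mean + 3 * sd, mean - 3 * sd

# Flag any observation outside the three-sigma control limits.
signals = [(i, x) for i, x in enumerate(metric) if x > upper or x < lower]
print(signals)  # only the spike at index 10 is a signal
```

Everything else in the series is noise; acting on those routine fluctuations would be exactly the mistake the book warns against.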

Full Post: Signal: Understanding What Matters in a World of Noise

CMS announces entrepreneurs and innovators to access Medicare data

Jun 04, 2015 12:26 am

Centers for Medicare and Medicaid Services Press Release: “…the acting Centers for Medicare & Medicaid Services (CMS) Administrator, Andy Slavitt, announced a new policy that for the first time will allow innovators and entrepreneurs to access CMS data, such as Medicare claims. As part of the Administration’s commitment to use of data and information to drive transformation of the healthcare delivery system, CMS will allow innovators and entrepreneurs to conduct approved research that will ultimately improve care and provide better tools that should benefit health care consumers through a greater understanding of what the data says works best in health care. The data will not allow the patient’s identity to be determined, but will provide the identity of the providers of care. CMS will begin accepting innovator research requests in September 2015.

“Data is the essential ingredient to building a better, smarter, healthier system. Today’s announcement is aimed directly at shaking up health care innovation and setting a new standard for data transparency,” said acting CMS Administrator Andy Slavitt. “We expect a stream of new tools for beneficiaries and care providers that improve care and personalize decision-making.”

Innovators and entrepreneurs will access data via the CMS Virtual Research Data Center (VRDC) which provides access to granular CMS program data, including Medicare fee-for-service claims data, in an efficient and cost effective manner. Researchers working in the CMS VRDC have direct access to approved privacy-protected data files and are able to conduct their analysis within a secure CMS environment….

Examples of tools or products that innovators and entrepreneurs might develop include care management or predictive modeling tools, which could greatly benefit the healthcare system in the form of healthier people, better quality, or lower cost of care. Even though all data is privacy-protected, researchers also will not be allowed to remove patient-level data from the VRDC. They will only be able to download aggregated, privacy-protected reports and results to their own personal workstation…(More)”

Full Post: CMS announces entrepreneurs and innovators to access Medicare data

The Data That’s Hiding in Plain Sight

Jun 03, 2015 01:51 pm

Beth Noveck in Governing: “What makes open data a powerful tool for governing better is the ability of people inside and outside of institutions to use the same data to create effective policies and useful tools, visualizations, maps and apps. Open data also can provide the raw material to convene informed conversations about what’s broken and the empirical foundation for developing solutions. But to realize its potential, the data needs to be truly open: not only universally and readily accessible but also structured for usability and computability.

One area where open data has the potential to make a real difference — and where some of its current limitations are all too apparent — is in state-level regulation of nonprofits. In May, a task force comprising the Federal Trade Commission together with 58 agencies from all 50 states and the District of Columbia filed a lawsuit against the Cancer Fund group of nonprofits and the individuals who run them. The complaint alleges that the groups are sham charities that spend “the overwhelming majority of donated funds supporting the Individual Defendants, their families and friends, and their fundraisers.” State officials spotted telltale signs of abuse and fraud by studying information the organizations had submitted in their federal nonprofit tax returns and state-by-state registration forms.

Nonprofit tax returns and registration forms are the public’s (and government’s) primary window into the workings of America’s enormous and economically impactful nonprofit sector. Every year in the United States, approximately 1.5 million registered tax-exempt organizations file a version of the federal Form 990, the tax return for tax-exempt organization, with the Internal Revenue Service and state tax authorities. These forms collect details on the organizations’ financial, governance and organizational structure to the end of ensuring that they are deserving of their tax-exempt status. All but 10 states also require that nonprofits file state-specific registration forms. The information these filings contain about executive compensation, fundraising expenses and donation activities can help regulators spot possible bad actors and alert each other to targets for further investigation.
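To illustrate why machine-readable filings matter, here is a hypothetical screen of the kind regulators could run automatically if filings were published as structured data. The organization names, figures, field names, and threshold below are all invented for illustration:

```python
# Hypothetical screen over nonprofit filing data (all records invented).
# With structured, machine-readable filings, a regulator could flag
# organizations whose fundraising costs consume most of what they raise.
filings = [
    {"name": "Helping Hands", "donations": 1_000_000, "fundraising_expenses": 120_000},
    {"name": "Cancer Fund A", "donations": 800_000, "fundraising_expenses": 680_000},
    {"name": "Youth Literacy", "donations": 500_000, "fundraising_expenses": 60_000},
]

THRESHOLD = 0.5  # flag when fundraising eats more than half of donations
flagged = [f["name"] for f in filings
           if f["fundraising_expenses"] / f["donations"] > THRESHOLD]
print(flagged)  # ['Cancer Fund A']
```

The point is not the particular ratio but that a screen like this is trivial when the data is structured and nearly impossible when it sits in scanned PDFs.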

Yet despite the richness and utility of the information contained in these filings, major barriers prevent regulators from efficiently sharing and analyzing the data…(More)”

Full Post: The Data That’s Hiding in Plain Sight

New ODI research shows open data reaching every sector of UK industry

Jun 03, 2015 01:42 pm

ODI: “New research has been published today (1 June) by the Open Data Institute showing that open data is reaching every sector of UK industry.

In various forms, open data is being adopted by a wide variety of businesses – small and large, new and old, from right across the country. The findings from Open data means business: UK innovation across sectors and regions draw on 270 companies with a combined turnover of £92bn and over 500k employees, identified by the ODI as using, producing or investing in open data as part of their business. The project included desk research, surveys and interviews on the companies’ experiences.

Key findings from the research include:

  • Companies using open data come from many sectors; over 46% from outside the information and communication sector. These include finance & insurance, science & technology, business administration & support, arts & entertainment, health, retail, transportation, education and energy.
  • The most popular datasets for companies are geospatial/mapping data (57%), transport data (43%) and environment data (42%).
  • 39% of companies innovating with open data are over 10 years old, with some more than 25 years old, proving open data isn’t just for new digital startups.
  • ‘Micro-enterprises’ (businesses with fewer than 10 employees) represented 70% of survey respondents, demonstrating a thriving open data startup scene. These businesses are using it to create services, products and platforms. 8% of respondents were drawn from large companies of 251 or more employees…
  • The companies surveyed listed 25 different government sources for the data they use. Notably, Ordnance Survey data was cited most frequently, by 14% of the companies. The non-government source most commonly used was OpenStreetMap, an openly licenced map of the world created by volunteers….(More)

Full Post: New ODI research shows open data reaching every sector of UK industry

5 cool ways connected data is being used

Jun 03, 2015 01:35 pm

At Wareable: “The real news behind the rise of wearable tech isn’t so much the gadgetry as the gigantic amount of personal data that it harnesses.

Concerns have already been raised over what companies may choose to do with such valuable information, with one US life insurance company already using Fitbits to track customers’ exercise and offer them discounts when they hit their activity goals.

Despite a mildly worrying potential dystopia in which our own data could be used against us, there are plenty of positive ways in which companies are using vast amounts of connected data to make the world a better place…

Parkinson’s disease research

Apple Health ResearchKit was recently unveiled as a platform for collecting collaborative data for medical studies, but Apple isn’t the first company to rely on crowdsourced data for medical research.

The Michael J. Fox Foundation for Parkinson’s Research recently unveiled a partnership with Intel to improve research and treatment for the neurodegenerative brain disease. Wearables are being used to unobtrusively gather real-time data from sufferers, which is then analysed by medical experts….

Saving the rhino

Connected data and wearable tech isn’t just limited to humans. In South Africa, the Madikwe Conservation Project is using wearable-based data to protect endangered rhinos from callous poachers.

A combination of ultra-strong Kevlar ankle collars powered by an Intel Galileo chip, along with an RFID chip implanted in each rhino’s horn allows the animals to be monitored. Any break in proximity between the anklet and horn results in anti-poaching teams being deployed to catch the bad guys….

Making public transport smart

A company called Snips is collecting huge amounts of urban data in order to improve infrastructure. In partnership with French national rail operator SNCF, Snips produced an app called Tranquilien to utilise location data from commuters’ phones and smartwatches to track which parts of the rail network were busy at which times.

Combining big data with crowdsourcing, the information helps passengers to pick a train where they can find a seat during peak times, while the data can also be useful to local businesses when serving the needs of commuters who are passing through.

Improving the sports fan experience

We’ve already written about how wearable tech is changing the NFL, but the collection of personal data is also set to benefit the fans.

Levi’s Stadium – the new home of the San Francisco 49ers – opened in 2014 and is one of the most technically advanced sports venues in the world. As well as a strong Wi-Fi signal throughout the stadium, fans also benefit from a dedicated app. This not only offers instant replays and real-time game information, but it also helps them find a parking space, order food and drinks directly to their seat and even check the lines at the toilets. As fans use the app, all of the data is collated to enhance the fan experience in future….

Creating interactive art

Don’t be put off by the words ‘interactive installation’. On Broadway is a cool work of art that “represents life in the 21st Century city through a compilation of images and data collected along the 13 miles of Broadway that span Manhattan”….(More)”

Full Post: 5 cool ways connected data is being used

The Missing Statistics of Criminal Justice

Jun 03, 2015 11:14 am

Matt Ford at the Atlantic: “An abundance of data has fueled the reform movement, but from prisons to prosecutors, crucial questions remain unquantified.

After Ferguson, a noticeable gap in criminal-justice statistics emerged: the use of lethal force by the police. The federal government compiles a wealth of data on homicides, burglaries, and arson, but no official, reliable tabulation of civilian deaths by law enforcement exists. A partial database kept by the FBI is widely considered to be misleading and inaccurate. (The Washington Post has just released a more expansive total of nearly 400 police killings this year.) “It’s ridiculous that I can’t tell you how many people were shot by the police last week, last month, last year,” FBI Director James Comey told reporters in April.

This raises an obvious question: If the FBI can’t tell how many people were killed by law enforcement last year, what other kinds of criminal-justice data are missing? Statistics are more than just numbers: They focus the attention of politicians, drive the allocation of resources, and define the public debate. Public officials—from city councilors to police commanders to district attorneys—are often evaluated based on how these numbers change during their terms in office. But existing statistical measures only capture part of the overall picture, and the problems that go unmeasured are often also unaddressed. What changes could the data that isn’t currently collected produce if it were gathered?…

Without reliable official statistics, scholars often must gather and compile necessary data themselves. “A few years ago, I was struck at how many police killings of civilians we seemed to be having in Philadelphia,” Gottschalk said as an example. “They would be buried in the newspaper, and I was stunned by how difficult it was to compile that information and compare it to New York and do it on a per-capita basis. It wasn’t readily available.” As a result, criminal-justice researchers often spend more time gathering data than analyzing it.

This data’s absence shapes the public debate over mass incarceration in the same way that silence between notes of music gives rhythm to a song. Imagine debating the economy without knowing the unemployment rate, or climate change without knowing the sea level, or healthcare reform without knowing the number of uninsured Americans. Legislators and policymakers heavily rely on statistics when crafting public policy. Criminal-justice statistics can also influence judicial rulings, including those by the Supreme Court, with implications for the entire legal system.

Beyond their academic and policymaking value, there’s also a certain power to statistics. They have the irreplaceable ability to both clarify social issues and structure the public’s understanding of them. A wealth of data has allowed sociologists, criminologists, and political scientists to diagnose serious problems with the American criminal-justice system over the past twenty years. Now that a growing bipartisan consensus recognizes the problem exists, gathering the right facts and figures could help point the way towards solutions…(More)”

Full Post: The Missing Statistics of Criminal Justice

Measuring ‘governance’ to improve lives

Jun 03, 2015 10:50 am

Robert Rotberg at the Conversation: “…Citizens everywhere desire “good governance” – to be governed well within their nation-states, their provinces, their states and their cities.

Governance is more useful than “democracy” if we wish to understand how different political rulers and ruling elites satisfy the aspirations of their citizens.

But to make the notion of “governance” useful, we need both a practical definition and a method of measuring the gradations between good and bad governance.

What’s more, if we can measure well, we can diagnose weak areas of governance and, hence, seek ways to make the weak actors strong.

Governance, defined as “the performance of governments and the delivery of services by governments,” tells us if and when governments are in fact meeting the expectations of their constituents and providing for them effectively and responsibly.

Democracy outcomes, by contrast, are much harder to measure because the meaning of the very word itself is contested and impossible to measure accurately.

For the purposes of making policy decisions, if we seek to learn how citizens are faring under regime X or regime Y, we need to compare governance (not democracy) in those respective places.

In other words, governance is a construct that enables us to discern exactly whether citizens are progressing in meeting life’s goals.

Measuring governance: five bundles and 57 subcategories

Are citizens of a given country better off economically, socially and politically than they were in an earlier decade? Are their various human causes, such as being secure or being free, advancing? Are their governments treating them well, and attempting to respond to their various needs and aspirations and relieving them of anxiety?

Just comparing national gross domestic products (GDPs), life expectancies or literacy rates provides helpful distinguishing data, but governance data are more comprehensive, more telling and much more useful.

Assessing governance tells us far more about life in different developing societies than we would learn by weighing the varieties of democracy or “human development” in such places.

A government’s performance, in turn, according to the scheme advanced in my book On Governance and in my Index of African Governance, is the delivery to citizens of five bundles (divided into 57 underlying subcategories) of political goods that citizens within any kind of political jurisdiction demand.

The five major bundles are Security and Safety, Rule of Law and Transparency, Political Participation and Respect for Human Rights, Sustainable Economic Opportunity, and Human Development (education and health)….(More)”

Full Post: Measuring ‘governance’ to improve lives

Remote Voting and Beyond: How Tech Will Transform Government From the Inside Out

Jun 03, 2015 10:41 am

Springwise: “…Technology, and in particular the internet, are often seen as potential stumbling blocks for government. But this perception acts as a brake on innovation in public services and in politics more generally. By embracing technology, rather than warily containing it, governments globally could benefit hugely. In terms of formulating and executing policy, technology can help governments become more transparent, accountable and effective, while improving engagement and participation from regular citizens.

On engagement, for instance, technology is opening up new avenues which make taking part in the political process far more straightforward. Springwise-featured Harvard startup Voatz is building a platform that allows users to vote, make campaign donations and complete opinion polls from their smartphones. The app, which uses biometric authentication to ensure that identities are comprehensively verified, could well entice younger voters who are alienated by the ballot box. Melding the simplicity of apps with sophisticated identity verification technology, Voatz is just one example of how tech can disrupt government for good.

From the Ground Up…

The potential for active participation goes far beyond voting. E-focus groups, online petitions and campaign groups have the power to transform the interaction between political establishments and citizens. From fact-checking charities enabled by crowdfunding such as UK-based Full Fact to massive national campaigns conducted online, citizens connected by technology are using their collective power to reshape government in democratic countries. Under other regimes, such as in the People’s Republic of China, vigilante citizens are circumventing extensive firewalls to shine a light on official misconduct.

…and the Top Down

As well as an abundance of citizen-led efforts to improve governance, there are significant moves from governments themselves to shake up public service delivery. Even HealthCare.gov, flawed though its roll-out was, marks a hugely ambitious piece of government reform underpinned by technology. Indeed, Obama has shown an unprecedented willingness to embrace technology in his two terms, appointing chief information and technology officers, promising to open up government data and launching the @POTUS Twitter account last month. Clearly, recognition is there from governments that technology can be a game changer for their headline policies.

While many countries are using technology for individual projects, there is one government that is banking its entire national success on tech – Estonia. The tiny, sparsely populated country in Eastern Europe is one of the most technologically advanced in the world. Everything from citizen IDs to tax returns and health records make use of technology and are efficient and ‘future-proofed’ as a result.

Whether as a threat or an opportunity, technology represents a transformative influence on government. Its potential as a disruptive, reshaping force has fed a narrative that casts technology as a looming threat and a destabiliser of conventional power structures. But harnessed properly and executed effectively, technology can remold government for the better, improving big public service projects, raising participation and engaging a young population whose default is digital….(More)”

Full Post: Remote Voting and Beyond: How Tech Will Transform Government From the Inside Out

Open data for competitive advantage: insights from open data use by companies

Jun 03, 2015 03:37 am

Anneke Zuiderwijk et al in the Proceedings of the 16th Annual International Conference on Digital Government Research: “Politicians have high expectations for commercial open data use. Yet, companies appear to challenge the assumption that open data can be used to create competitive advantage, since any company can access open data and since open data use requires scarce resources. In this paper we examine commercial open data use for creating competitive advantage from the perspective of Resource Based Theory (RBT) and Resource Dependency Theory (RDT). Based on insights from a scenario, interviews and a survey and from RBT and RDT as a reference theory, we derive seven propositions. Our study suggests that the generation of competitive advantage with open data requires a company to have in-house capabilities and resources for open data use. The actual creation of competitive advantage might not be simple. The propositions also draw attention to the accomplishment of unique benefits for a company through the combination of internal and external resources. Recommendations for further research include testing the propositions….(More)”

Full Post: Open data for competitive advantage: insights from open data use by companies

The Diffusion and Evolution of 311 Citizen Service Centers in American Cities from 1996 to 2012

Jun 02, 2015 09:10 am

PhD thesis by John Christopher O’Byrne: “This study of the diffusion and evolution of the 311 innovation in the form of citizen service centers and as a technology cluster has been designed to help identify the catalysts for the spread of government-to-citizen (G2C) technology in local government in order to better position future G2C technology for a more rapid rate of adoption. The 311 non-emergency number was first established in 1996 and had spread to 80 local governments across the United States by 2012. This dissertation examines: what factors contributed to the adoption of 311 in American local governments over 100,000 in population; how did the innovation diffuse and evolve over time; and why did some governments’ communications with citizens become more advanced than others? Given the problem of determining causality, a three-part research design was used to examine the topic, including a historical narrative, a logistic regression model, and case studies from Pittsburgh, Minneapolis and St. Louis. The narrative found that the political forces of the federal government, national organizations, and policy entrepreneurs (Karch, 2007) promoted the 311 innovation to solve different problems and that it evolved beyond its original intent.

The logistic regression model found that there was a statistically significant relationship between 311 adoption and the variables of higher population, violent crime rate, and the mayor-council form of government. The case studies revealed that mayors played a strong role in establishing citizen service centers in all three cities while 311 adopter Pittsburgh and non-adopter St. Louis seemed to have more in common in their G2C evolution due to severe budget constraints. With little written about the 311 innovation in academic journals, practitioners and scholars will benefit from understanding the catalysts for the diffusion and evolution of the 311 in order to determine ways to increase the rate of adoption for future G2C communication innovations….(More)”
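A dissertation summary necessarily compresses the model. As a rough illustration of the technique (not O’Byrne’s actual data or specification — every city, variable value and coefficient below is invented), a logistic regression of 311 adoption on city characteristics can be fitted with the standard library alone:

```python
import math
import random

# Illustrative only: synthetic cities, NOT the dissertation's data.
# Features: standardized population, standardized violent crime rate,
# and a 0/1 dummy for the mayor-council form of government.
random.seed(0)

def make_city():
    pop, crime, mayor = random.gauss(0, 1), random.gauss(0, 1), random.choice([0, 1])
    logit = 0.9 * pop + 0.6 * crime + 0.8 * mayor - 0.5  # assumed "true" model
    adopted = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [pop, crime, mayor], adopted

cities = [make_city() for _ in range(400)]

# Fit by plain gradient descent on the logistic log-loss.
w, b = [0.0, 0.0, 0.0], 0.0
for _ in range(1500):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in cities:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        for i in range(3):
            gw[i] += (p - y) * x[i]
        gb += p - y
    w = [wi - 0.1 * gi / len(cities) for wi, gi in zip(w, gw)]
    b -= 0.1 * gb / len(cities)

print("fitted coefficients:", [round(wi, 2) for wi in w])
```

With this synthetic setup, all three fitted coefficients should come out positive, mirroring the direction of the relationships the dissertation reports; a real replication would of course use the actual city-level data and report significance tests.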

Full Post: The Diffusion and Evolution of 311 Citizen Service Centers in American Cities from 1996 to 2012

How Twitter Users Can Generate Better Ideas

Jun 02, 2015 08:23 am

Salvatore Parise, Eoin Whelan and Steve Todd in MIT Sloan Management Review: “New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don’t already know — tend to generate better ideas…. A multitude of empirical studies confirm what Jobs intuitively knew. The more diverse a person’s social network, the more likely that person is to be innovative. A diverse network provides exposure to people from different fields who behave and think differently. Good ideas emerge when the new information received is combined with what a person already knows. But in today’s digitally connected world, many relationships are formed and maintained online through public social media platforms such as Twitter, Facebook and LinkedIn. Increasingly, employees are using such platforms for work-related purposes.

Studying Twitter Networks

Can Twitter make employees more innovative? In particular, does having a greater diversity of virtual Twitter connections mean that good ideas are more likely to surface, as in the face-to-face world? To answer this question, we used a technique called organizational network analysis (ONA) to create visual representations of employee Twitter networks. We studied ten employee groups across five companies in a range of industries….

….in analyzing the structure of each employee’s Twitter network, we found that there was a positive relationship between the amount of diversity in one’s Twitter network and the quality of ideas submitted. However, Twitter activity and size measures (such as the number of tweets, number of followers and number of people followed) were not correlated with personal innovation….(More)
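The study’s measure of diversity comes from organizational network analysis. As a crude stand-in (an illustrative proxy, not the authors’ actual metric), one can score a Twitter network’s diversity as the entropy of the professional fields its members belong to:

```python
import math
from collections import Counter

def network_diversity(followed_fields):
    """Shannon entropy (bits) of the professional fields in a user's Twitter
    network: 0 for a single-field silo, higher for a more varied network.
    An illustrative proxy, not the metric used in the study."""
    counts = Counter(followed_fields)
    n = len(followed_fields)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Two hypothetical employees: one follows accounts across many fields,
# one stays inside a single professional silo.
diverse = ["design", "finance", "biotech", "policy", "ai", "retail"]
siloed = ["finance"] * 6

print(network_diversity(diverse))  # log2(6) ≈ 2.58 bits
print(network_diversity(siloed))   # 0.0 bits
```

Note that this captures the study’s structural point: the siloed employee can have far more followers or tweets than the diverse one and still score zero, just as raw activity and size measures were uncorrelated with innovation.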

Full Post: How Twitter Users Can Generate Better Ideas

Governing methods: policy innovation labs, design and data science in the digital governance of education

Jun 02, 2015 07:31 am

Paper by Ben Williamson in the Journal of Educational Administration and History: “Policy innovation labs are emerging knowledge actors and technical experts in the governing of education. The article offers a historical and conceptual account of the organisational form of the policy innovation lab. Policy innovation labs are characterised by specific methods and techniques of design, data science, and digitisation in public services such as education. The second half of the article details how labs promote the use of digital data analysis, evidence-based evaluation and ‘design-for-policy’ techniques as methods for the governing of education. In particular, they promote the ‘computational thinking’ associated with computer programming as a capacity required by a ‘reluctant state’ that is increasingly concerned to delegate its responsibilities to digitally enabled citizens with the ‘designerly’ capacities and technical expertise to ‘code’ solutions to public and social problems. Policy innovation labs are experimental laboratories trialling new methods within education for administering and governing the future of the state itself….(More)”

Full Post: Governing methods: policy innovation labs, design and data science in the digital governance of education

Data (v.)

Jun 01, 2015 08:41 am

Jer Thorp in Journal 001 of The Office for Creative Research and Medium: “I data you, you data me. They data us, we data them.

As your Concise Oxford sails toward me from across the room, let’s take some time to consider the arguments:

The word data has been in pronounced flux over the last ten years, as its role and function have been redefined by technology and culture. A decade ago, data was firmly a plural noun. Specifically, it was the plural of datum: one datum, two data. Back then, you could point and laugh at the data amateurs because they would say ‘data is’ rather than ‘data are’. Of course, those data newbies went on to form companies, make software, build databases, write books and give TED talks. And slowly, data did turn into a particular kind of singular: it has become, commonly, a mass noun…..

Data is not inert, yet its perceived passivity is one of its most dangerous properties. When we are warned that a government is collecting data about its citizens, we may be underwhelmed specifically because this act of collection seems to be so harmless, so indifferent. But of course data is not collected and then left alone: it is used as a substrate for decision making, and as an instrument for differentiation, discrimination and damage. Putting an active form of the word data into common parlance could serve as a reminder that the systems of data collection and use are humming with capacity for influence, action and violence.

Making data a verb also exposes to us the power imbalances that have kept our collective endeavours drastically off-kilter. Grammatically speaking, data-as-verb would present a number of possibilities for subject/object combinations:

I data you. You data me. We data you. You data us. They data me. They data us. We data them.

Exposed to this rich possibility of cause and effect, the common usages of data today become strikingly narrow: in our lived data experiences we are objects, rather than subjects. Google reads our every e-mail, placing us ingloriously in marketing buckets based on what we write to our friends, colleagues and lovers. Uber’s algorithms note our late night voyages as records of romantic trysts. They data us, then they data us again.

Even the innocent fitness tracker, on paper an embodiment of ‘I data myself’, isn’t so much about the quantified self as it is about quantified selves, less a tool for individuals to track their own beating hearts than a system to find an aggregated 24-year-old Bay Area resident that can be marketed against. These devices are exciting toys for runners and walkers but also for lawyers, who have found in them a new way to argue against claims of personal injury.

Yet there is plenty of potential for us to data. Last year we built Floodwatch, a browser-based tool that allows users to track the web advertising profiles that are being authored about them, empowering individuals to track the trackers. Mapping Police Violence, a project by Ferguson activists @samsway, @Nettaaaaaaaa and @deray, keeps a record of every black American killed by police in the USA. In doing so, the project reminds us how powerful the simple act of data collection can be, particularly when that data is something that the powerful don’t want us to see.

These projects give us a glimpse of what can happen if we abandon our idea of data as an innocent, passive noun. By embracing the new verbal form of data, we might better understand its potential for action, and in turn move beyond our own prescribed role as the objects in data sentences.

In doing so, perhaps we can imagine a future perfect for data, where not only will they have dataed us, we will have dataed them. A future, perhaps, where we all data together….(More)”

Full Post: Data (v.)

Open data could save the NHS hundreds of millions, says top UK scientist

May 31, 2015 07:55 pm

The Guardian: “The UK government must open up and highlight the power of more basic data sets to improve patient care in the NHS and save hundreds of millions of pounds a year, Nigel Shadbolt, chairman of the Open Data Institute (ODI), has urged.

The UK government topped the first league table for open data (paywall) produced by the ODI last year, but Shadbolt warns that ministers’ open data responsibilities have not yet been satisfied.

Basic data on prescription administration is now published on a monthly basis but Shadbolt said medical practitioners must be educated about the power of this data to change prescribing habits across the country.

Other data sets that promise to make the NHS more accessible, such as trusts’ opening times, consultant lists and details of services, are not currently available in a machine-readable form.

“These basic sets of information about the processes, the people and places in the health system are all fragmented and fractured and many of them are not available as registers that you can go to,” Shadbolt said.

“Whenever you talk about health data people think you must be talking about personal data and patient data and there are issues, obviously, of absolutely protecting privacy there. But there’s lots of data in the health service that is not about personal patient data at all that would be hugely useful to just have available as machine-readable data for apps to use.”
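Shadbolt’s point about machine-readability is concrete: once a data set is published as plain CSV, a few lines of code suffice for an app to compare practices. The schema and figures below are invented for illustration and are not the actual NHS release format:

```python
import csv
import io
from collections import defaultdict

# Hypothetical monthly prescribing extract (invented columns, NOT the
# real NHS schema): per-practice item counts and costs for two statins.
sample = """practice,drug,items,cost_gbp
A,Atorvastatin,120,96.0
A,Rosuvastatin,40,480.0
B,Atorvastatin,150,120.0
B,Rosuvastatin,5,60.0
"""

per_practice = defaultdict(lambda: [0, 0.0])  # practice -> [items, cost]
for row in csv.DictReader(io.StringIO(sample)):
    per_practice[row["practice"]][0] += int(row["items"])
    per_practice[row["practice"]][1] += float(row["cost_gbp"])

# Cost per item exposes practices leaning on pricier branded drugs.
for practice, (items, cost) in sorted(per_practice.items()):
    print(practice, round(cost / items, 2))
```

In this toy extract, practice A’s heavier use of the pricier branded statin shows up immediately in its cost per item, which is exactly the kind of pattern in prescribing data that Shadbolt credits with identifying savings.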

The UK government has led the way in recent years in encouraging transparency and accountability within the NHS by publishing league tables. The publication of league tables on MRSA was followed by a 76-79% drop in infections.

Shadbolt said: “Those hospitals that were worst in their league table don’t like to be there and there was a very rapid diffusion of understanding of best practice across them that you can quantify. It’s many millions of pounds being saved.”

The artificial intelligence and open data expert said the next big area for open data improvement in the NHS is around prescriptions.

Shadbolt pointed to the publication of data about the prescription of statins, which has helped identify savings worth hundreds of millions of pounds: “There is little doubt that this pattern is likely to exist across the whole of the prescribing space.”…(More)”

Full Post: Open data could save the NHS hundreds of millions, says top UK scientist

New Technologies and Civic Engagement

May 30, 2015 10:12 am

Book edited by Homero Gil de Zuniga Navajas: “First, this book pays attention to the overall impact of the Internet and people’s use of digital media and new technologies to analyze civic life at large, reconceptualizing what citizenship is today. Secondly, and more specifically, contributors shed light on the intersection of a number of current new research agendas regarding some of the most rapidly growing technological advances (i.e., new publics and citizenship) and the emergence of sprouting structures of citizenship. The volume shows the implications that new technological advances carry with respect to the possibilities, patterns and mechanisms for citizen communication, citizen deliberation, the public sphere and civic engagement….(More)”

Full Post: New Technologies and Civic Engagement

The Open Seventeen

May 29, 2015 06:40 pm

Crowdsourcing the Verification of the Sustainable Development Goals with Open Data: In 2015, the United Nations is announcing seventeen Sustainable Development Goals (SDGs) for the world. Success at implementing the SDGs by 2030 could put the planet on the right course for the rest of the century. Failure could result in a breakdown of trust in global initiatives and cynical pursuit of self-interest by nations and corporations.

One way to ensure SDGs are achieved is to establish an independent means for verifying that all stakeholders – governments, corporations, NGOs and international organisations – live up to their promises. This requires harnessing the grassroots efforts of concerned citizens on a global scale.

To ignite this effort, ONE, in collaboration with the Citizen Cyberscience Centre and the Crowdcrafting platform for open research, is launching The Open Seventeen, a challenge to develop crowdsourcing projects that tackle SDGs using open data.

How does this challenge work?

You’ll find a big blue button further down this page. Use it to pitch a crowdsourcing project that tackles any of the 17 SDGs at a local, regional or global level, and tell us what open data set could be analysed for this purpose.

To inspire you, we’ve provided below some examples of crowdsourcing projects that have already been tackling different aspects of the SDGs, from deforestation to corruption, and from drought to disease. Projects proposed for the challenge should have clear and realistic goals, and build on existing open data sets.

ONE and its partners will select three proposals and create crowdsourcing projects based on these. The winners and their projects will be profiled by ONE in upcoming international events related to the launch of the SDGs. Your project could inspire the world….

What can you do with open data to help verify SDGs? Have a look at what citizens have already created using the open source technology PyBossa that powers the Crowdcrafting platform and other crowdsourcing projects….(More)”
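Crowdcrafting-style projects typically send the same task to several volunteers and accept an answer only when enough of them agree. A minimal sketch of that aggregation step in plain Python (not the actual PyBossa API; the threshold and labels are illustrative):

```python
from collections import Counter

def aggregate(task_runs, min_agreement=0.6):
    """Majority-vote aggregation over redundant answers to one crowdsourcing
    task -- the usual redundancy pattern in PyBossa-style projects, sketched
    here without the real PyBossA API. Returns (answer, agreement), with
    answer None when the crowd is too divided to trust."""
    votes = Counter(task_runs)
    answer, count = votes.most_common(1)[0]
    agreement = count / len(task_runs)
    return (answer, agreement) if agreement >= min_agreement else (None, agreement)

# Five volunteers classify the same satellite tile for a deforestation project.
print(aggregate(["deforested", "deforested", "intact", "deforested", "deforested"]))
# -> ('deforested', 0.8)
```

Redundancy plus a simple agreement threshold is what lets a verification project built by volunteers make claims robust enough to hold stakeholders to their SDG promises.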

Full Post: The Open Seventeen

The Governance Lab

Polytechnic School of Engineering
New York University
2 MetroTech Center
Floor 9
Brooklyn, NY 11201
