AI Localism

Oct 01, 2020 03:16 pm

Today, The GovLab is excited to launch AI Localism, a new platform that seeks to monitor, analyze and guide how AI is being governed in cities around the world.

AI Localism refers to the actions taken by local decision-makers to address the use of AI within a city or community. It often emerges to fill gaps left by incomplete state, national or global governance frameworks.

“AI Localism offers both immediacy and proximity. Because it is managed within tightly defined geographic regions, it affords policymakers a better understanding of the tradeoffs involved. By calibrating algorithms and AI policies for local conditions, policymakers have a better chance of creating positive feedback loops that will result in greater effectiveness and accountability.”

The initial AI Localism projects include:

The Ethics and Practice of AI Localism at a Time of Covid-19 and Beyond – In collaboration with the TUM School of Governance and the University of Melbourne, The GovLab will conduct a comparative review of current practices worldwide to better understand successful AI Localism in the context of COVID-19, and to inform and guide local leaders and city officials toward best practices.

Responsible AI at the Local Level – Together with the NYU Center for Responsible AI, The GovLab will seek to develop an interactive repository and a set of training modules on Responsible AI approaches at the local level.

Join us as we seek to understand and develop new forms of governance that guide local leaders toward responsible AI implementation, or share any effort you are working on to establish responsible AI at the local level, by visiting http://ailocalism.org


READ MORE

Situating Open Data: Global Trends in Local Contexts

Sep 30, 2020 07:37 pm

Open Access Book edited by Danny Lämmerhirt, Ana Brandusescu, Natalia Domagala & Patrick Enaholo: “Open data and its effects on society are always woven into infrastructural legacies, social relations, and the political economy. This raises questions about how our understanding and engagement with open data shifts when we focus on its situated use. 

To shed light on these questions, Situating Open Data provides several empirical accounts of open data practices, the local implementation of global initiatives, and the development of new open data ecosystems. Drawing on case studies in different countries and contexts, the chapters demonstrate the practices and actors involved in open government data initiatives unfolding within different socio-political settings.

The book proposes three recommendations for researchers, policy-makers and practitioners. First, beyond upskilling through ‘data literacy’ programmes, open data initiatives should be specified through the kinds of data practices and effects they generate. Second, global visions of open data implementation require more studies of the resonances and tensions created in localised initiatives. And third, research into open data ecosystems requires more attention to the histories and legacies of information infrastructures and how these shape who benefits from open data flows. 

As such, this volume departs from the framing of data as a resource to be deployed. Instead, it proposes a prism of different data practices in different contexts through which to study the social relations, capacities, infrastructural histories and power structures affecting open data initiatives. It is hoped that the contributions collected in Situating Open Data will spark critical reflection about the way open data is locally practiced and implemented. The contributions should be of interest to open data researchers, advocates, and those in or advising government administrations designing and rolling out effective open data initiatives….(More)”.


READ MORE

Why we must break the constraints of the industrial model of government

Sep 30, 2020 03:44 pm

Max Beverton-Palmer at the New Statesman: “…In practice, governments must shift from delivering what they always have to ensuring people’s needs are met in the best possible way. This should open up delivery to partners from both the private and charity sectors, where they can provide a better service that delivers better value to citizens, and much greater engagement with the public.

To manage this shift, leaders will need to resolve three key trade-offs.

First, states must be able to give up control to encourage innovation while protecting quality and in-house capacity. They must create new frameworks to assess where to encourage more open policymaking and delivery and where to double down on the competencies and infrastructure only they can provide. Technology can help here, creating new levers to protect the public interest by governing services’ access to government platforms and datasets akin to app store guidelines.

Second, states must reorganise around scale economies underpinned by technology while moving delivery closer to people’s lives. They should provide the foundations that allow new services to operate, while letting go of controlling the last mile of service delivery. A better way forward is a more collaborative approach that encourages communities, charities and companies to design more tailored services on top of public-controlled infrastructure, enabling people to choose those which best meet their needs.

Third, governments must be able to better listen to, engage with and adapt to people’s views without descending into mob rule. A core part of product and service design, both in business and in the public sector, is iterating delivery according to user needs, but comparable feedback loops in policymaking are all but non-existent. New tools can help leaders understand the plurality of public opinions and address the growing disconnect between public institutions and those they represent.

Getting from the status quo to this more open model will be challenging. But action in four priority areas should provide a starting point: infrastructure, organisation, competition and engagement….(More)”


READ MORE

Essential Requirements for Establishing and Operating Data Trusts

Sep 30, 2020 11:37 am

Paper by P Alison Paprica et al: “Increasingly, the label “data trust” is being applied to repeatable mechanisms or approaches to sharing data in a timely, fair, safe and equitable way. However, there is a gap in terms of practical guidance about how to establish and operate a data trust.

In December 2019, the Canadian Institute for Health Information and the Vector Institute for Artificial Intelligence convened a working meeting of 19 people representing 15 Canadian organizations/initiatives involved in data sharing, most of which focus on public sector health data. The objective was to identify essential requirements for the establishment and operation of data trusts. Preliminary findings were presented during the meeting then refined as participants and co-authors identified relevant literature and contributed to this manuscript.

Twelve (12) minimum specification requirements (“min specs”) for data trusts were identified. The foundational min spec is that data trusts must meet all legal requirements, including legal authority to collect, hold or share data. In addition, there was agreement that data trusts must have (i) an accountable governing body which ensures the data trust advances its stated purpose and is transparent, (ii) comprehensive data management including responsible parties and clear processes for the collection, storage, access, disclosure and use of data, (iii) training and accountability requirements for all data users and (iv) ongoing public and stakeholder engagement.

Based on a review of the literature and advice from participants from 15 Canadian organizations/initiatives, practical guidance in the form of twelve min specs for data trusts was agreed on. Public engagement and continued exchange of insights and experience are recommended on this evolving topic…(More)”.


READ MORE

Science and Scientists Held in High Esteem Across Global Publics

Sep 30, 2020 11:03 am

Pew Research: “As publics around the world look to scientists and the research and development process to bring new treatments and preventive strategies for the novel coronavirus, a new international survey finds scientists and their research are widely viewed in a positive light across global publics, and large majorities believe government investments in scientific research yield benefits for society.

[Chart: most value government investment in scientific research and being a world leader in science]

Still, the wide-ranging survey, conducted before the COVID-19 outbreak reached pandemic proportions, reveals that ambivalence about certain scientific developments – in areas such as artificial intelligence and genetically modified foods – often exists alongside high trust for scientists generally and positive views in other areas such as space exploration….

Scientists as a group are highly regarded, compared with other prominent groups and institutions in society. In all publics, majorities have at least some trust in scientists to do what is right. A median of 36% have “a lot” of trust in scientists, the same share who say this about the military, and much higher than the shares who say this about business leaders, the national government and the news media.

Still, an appreciation for practical experience, more so than expertise, runs deep across publics. A median of 66% say it’s better to rely on people with practical experience to solve pressing problems, while a median of 28% say it’s better to rely on people who are considered experts about the problems, even if they don’t have much practical experience….(More)”.


READ MORE

RESIST Counter Disinformation Toolkit

Sep 30, 2020 10:14 am

UK Government: “This toolkit will help support the dissemination of reliable, truthful information that underpins our democracy. RESIST stands for Recognise disinformation, Early warning, Situational Insight, Impact analysis, Strategic communication, Track outcomes.

This toolkit will:

  • build your resilience to the threat of disinformation
  • give you guidance on how to identify a range of different types of disinformation consistently and effectively
  • help you prevent and tackle the spread of disinformation
  • enable you to develop a response when disinformation affects your organisation’s ability to do its job or represents a threat to the general public.

The toolkit promotes a consistent approach to the threat and provides 6 steps to follow.

RESIST Disinformation: a toolkit

The purpose of this toolkit is to help you prevent the spread of disinformation. It will enable you to develop a response when disinformation affects your organisation’s ability to do its job or the people who depend on your services, or when it represents a threat to the general public.

What is disinformation?

Disinformation is the deliberate creation and/or sharing of false information with the intention to deceive and mislead audiences. The inadvertent sharing of false information is referred to as misinformation.

Who is this toolkit for?

Government and public sector communications professionals, as well as policy officers, senior managers and special advisers….(More)”


READ MORE

The Wisdom of the Crowd: Promoting Media Development through Deliberative Initiatives

Sep 30, 2020 10:05 am

Report by Craig Matasick: “…innovative new set of citizen engagement practices—collectively known as deliberative democracy—offers important lessons that, when applied to media development efforts, can help improve media assistance and strengthen independent media environments around the world. At a time when disinformation runs rampant, it is more important than ever to strengthen public demand for credible information, reduce political polarization, and prevent media capture. Deliberative democracy approaches can help tackle these issues by expanding the number and diversity of voices that participate in policymaking, thereby fostering greater collective action and enhancing public support for media reform efforts.

Through a series of five illustrative case studies, the report demonstrates how deliberative democracy practices can be employed in both media development and democracy assistance efforts, particularly in the Global South. Such initiatives produce recommendations that take into account a plurality of voices while building trust between citizens and decision-makers by demonstrating to participants that their issues will be heard and addressed. Ultimately, this process can enable media development funders and practitioners to identify priorities and design locally relevant projects that have a higher likelihood for long-term impact.

– Deliberative democracy approaches, which are characterized by representative participation and moderated deliberation, provide a framework to generate demand-driven media development interventions while at the same time building greater public support for media reform efforts.

– Deliberative democracy initiatives foster collaboration across different segments of society, building trust in democratic institutions, combatting polarization, and avoiding elite capture.

– When employed by news organizations, deliberative approaches provide a better understanding of the issues their audiences care most about and uncover new problems affecting citizens that might not otherwise have come to light….(More)”.


READ MORE

Amsterdam and Helsinki launch algorithm registries to bring transparency to public deployments of AI

Sep 29, 2020 05:49 pm

Khari Johnson at Venture Beat: “Amsterdam and Helsinki today launched AI registries to detail how each city government uses algorithms to deliver services, some of the first major cities in the world to do so. An AI Register for each city was introduced in beta today as part of the Next Generation Internet Policy Summit, organized in part by the European Commission and the city of Amsterdam. The Amsterdam registry currently features a handful of algorithms, but it will be extended to include all algorithms following the collection of feedback at the virtual conference to lay out a European vision of the future of the internet, according to a city official.

Each algorithm cited in the registry lists datasets used to train a model, a description of how an algorithm is used, how humans utilize the prediction, and how algorithms were assessed for potential bias or risks. The registry also provides citizens a way to give feedback on algorithms their local government uses and the name, city department, and contact information for the person responsible for the responsible deployment of a particular algorithm. A complete algorithmic registry can empower citizens and give them a way to evaluate, examine, or question governments’ applications of AI.
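
The fields Johnson describes amount to a small, structured schema. As a purely illustrative sketch — the field names below are assumptions, not Amsterdam’s or Helsinki’s published format — a single registry entry could be modeled like this:

```python
# Hypothetical sketch of an algorithm-registry entry; field names are
# illustrative assumptions, not the cities' actual schema.
from dataclasses import dataclass, field

@dataclass
class AlgorithmRegistryEntry:
    name: str                      # e.g. "parking permit fraud detection"
    department: str                # city department that owns the deployment
    contact: str                   # person accountable for responsible deployment
    description: str               # how the algorithm is used in the service
    training_datasets: list[str]   # datasets used to train the model
    human_oversight: str           # how staff use or override the predictions
    risk_assessment: str           # how potential bias or risks were assessed
    citizen_feedback: list[str] = field(default_factory=list)  # residents' comments
```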

In a previous development in the U.S., New York City created an automated decision systems task force in 2017 to document and assess city use of algorithms. At the time it was the first city in the U.S. to do so. However, following the release of a report last year, commissioners on the task force complained about a lack of transparency and inability to access information about algorithms used by city government agencies….

In a statement accompanying the announcement, Helsinki City Data project manager Pasi Rautio said the registry is also aimed at increasing public trust in the kinds of artificial intelligence “with the greatest possible openness.”…(More)”.


READ MORE

Private Sector Data for Humanitarian Response: Closing the Gaps

Sep 29, 2020 05:36 pm

Jos Berens at Bloomberg New Economy Forum: “…Despite these and other examples, data sharing between the private sector and humanitarian agencies is still limited. Out of 281 contributing organizations on HDX, only a handful come from the private sector. 

So why don’t we see more use of private sector data in humanitarian response? One obvious set of challenges concerns privacy, data protection and ethics. Companies and their customers are often wary of data being used in ways not related to the original purpose of data collection. Such concerns are understandable, especially given the potential legal and reputational consequences of personal data breaches and leaks.

Figuring out how to use this type of sensitive data in an already volatile setting seems problematic, and it is — negotiations between public and private partners in the middle of a crisis often get hung up on a lack of mutual understanding. Data sharing partnerships negotiated during emergencies often fail to mature beyond the design phase. This dynamic creates a loop of inaction due to a lack of urgency in between crises, followed by slow and halfway efforts when action is needed most.

To ensure that private sector data is accessible in an emergency, humanitarian organizations and private sector companies need to work together to build partnerships before a crisis. They can do this by taking the following actions: 

  • Invest in relationships and build trust. Both humanitarian organizations and private sector organizations should designate focal points who can quickly identify potentially useful data during a humanitarian emergency. A data stewards network that identifies and connects data responsibility leaders across organizations, as proposed by the NYU GovLab, is a great example of what such relationships could look like. Efforts to build trust with the general public regarding private sector data use for humanitarian response should also be strengthened, primarily through transparency about the means and purpose of such collaborations. This is particularly important in the context of COVID-19, as noted in the UN Comprehensive Response to COVID-19 and the World Economic Forum’s ‘Great Reset’ initiative…(More)”.

READ MORE

Metrics at Work: Journalism and the Contested Meaning of Algorithms

Sep 29, 2020 05:03 pm

Book by Angèle Christin: “When the news moved online, journalists suddenly learned what their audiences actually liked, through algorithmic technologies that scrutinize web traffic and activity. Has this advent of audience metrics changed journalists’ work practices and professional identities? In Metrics at Work, Angèle Christin documents the ways that journalists grapple with audience data in the form of clicks, and analyzes how new forms of clickbait journalism travel across national borders.

Drawing on four years of fieldwork in web newsrooms in the United States and France, including more than one hundred interviews with journalists, Christin reveals many similarities among the media groups examined—their editorial goals, technological tools, and even office furniture. Yet she uncovers crucial and paradoxical differences in how American and French journalists understand audience analytics and how these affect the news produced in each country. American journalists routinely disregard traffic numbers and primarily rely on the opinion of their peers to define journalistic quality. Meanwhile, French journalists fixate on internet traffic and view these numbers as a sign of their resonance in the public sphere. Christin offers cultural and historical explanations for these disparities, arguing that distinct journalistic traditions structure how journalists make sense of digital measurements in the two countries.

Contrary to the popular belief that analytics and algorithms are globally homogenizing forces, Metrics at Work shows that computational technologies can have surprisingly divergent ramifications for work and organizations worldwide….(More)”.


READ MORE

Automation in government: Harnessing technology to transform customer experience

Sep 29, 2020 03:48 pm

Matthias Daub, Tony D’Emidio, Zaana Howard, and Seckin Ungur at McKinsey: “Who knew that one could develop warm feelings for a German Federal Employment Agency chatbot? If you own a business and wish to apply for state funds to supplement your employees’ reduced salaries, then UDO will fill in the application form for you. “Let’s go!” the digital assistant declares, launching into a series of questions. The system displays reassuring expertise; the queries—about the size of your workforce, the extent of the reduction in working hours, and so on—are simple, clear, and sensitive to previous responses, and the interface offers soothing blue tones and rounded edges. UDO goes on to ask why the workers are on reduced hours: for economic reasons, such as the cancellation of a large order due to the coronavirus, or because of an unavoidable event, such as a measure to mitigate the spread of the pandemic? And by now, a powerful and comforting thought may well arise in the citizen’s mind: UDO really cares.
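
Under the hood, the experience described above is a branching intake flow: each next question depends on earlier answers, and the collected answers pre-fill the application. A toy sketch of that pattern (the questions and field names are invented for illustration, not UDO’s actual script):

```python
# Toy sketch of a branching intake flow like the one described above.
# Questions and fields are invented for illustration, not UDO's real script.
def intake():
    answers = {}
    answers["workforce_size"] = int(input("How many employees do you have? "))
    answers["hours_reduced_pct"] = int(input("By what percent were hours reduced? "))
    reason = input("Reason for reduced hours: (a) economic, e.g. a cancelled order, "
                   "or (b) an unavoidable event, e.g. pandemic measures? ")
    answers["reason"] = "economic" if reason.strip().lower().startswith("a") else "unavoidable"
    # The follow-up question depends on the previous answer.
    if answers["reason"] == "economic":
        answers["order_loss"] = input("Briefly describe the lost business: ")
    return answers  # would be used to pre-fill the application form

if __name__ == "__main__":
    print(intake())
```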

In this article, we argue that smart use of automation can enable governments to provide outstanding levels of customer experience, driven by innovations that are as sensitive to people as they are to technology. We begin by considering the challenges and rewards of enhancing customer experience for governments. Then we discuss the benefits to governments of using automation to improve customer experience. Finally, we turn from why to how, identifying three key practices common to successful automation initiatives in public services….(More)”.


READ MORE

An Open-Source Tool to Accelerate Scientific Knowledge Discovery

Sep 29, 2020 03:07 pm

Mozilla: “Timely and open access to novel outputs is key to scientific research. It allows scientists to reproduce, test, and build on one another’s work — and ultimately unlock progress.

The most recent example of this is the research into COVID-19. Much of the work was published in open access journals, swiftly reviewed and ultimately improving our understanding of how to slow the spread and treat the disease. Although this rapid increase in scientific publications is evident in other domains too, we might not be reaping the benefits. The tools to parse and combine this newly created knowledge have roughly remained the same for years.

Today, Mozilla Fellow Kostas Stathoulopoulos is launching Orion — an open-source tool to illuminate the science behind the science and accelerate knowledge discovery in the life sciences. Orion enables users to monitor progress in science, visually explore the scientific landscape, and search for relevant publications.

Orion

Orion collects, enriches and analyses scientific publications in the life sciences from Microsoft Academic Graph.

Users can leverage Orion’s views to interact with the data. The Exploration view shows all of the academic publications in a three-dimensional visualization. Every particle is a paper and the distance between them signifies their semantic similarity; the closer two particles are, the more semantically similar. The Metrics view visualizes indicators of scientific progress and how they have changed over time for countries and thematic topics. The Search view enables the users to search for publications by submitting either a keyword or a longer query, for example, a sentence or a paragraph of a blog they read online….(More)”.
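
Mozilla’s post doesn’t detail how Orion computes semantic similarity, but the core idea behind the Exploration view — embed each paper as a vector and read small distances as high similarity — can be sketched with an off-the-shelf TF-IDF embedding as a stand-in for whatever representation Orion actually uses:

```python
# Minimal sketch of "distance = semantic similarity"; TF-IDF is a stand-in
# for whatever embedding Orion computes over Microsoft Academic Graph.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "Deep learning predicts protein structure from amino acid sequence.",
    "Neural networks for protein structure prediction.",
    "Malaria transmission dynamics in sub-Saharan Africa.",
]
vectors = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
sim = cosine_similarity(vectors)  # sim[i, j] near 1 => papers i and j are similar
print(sim.round(2))               # papers 0 and 1 score well above paper 2
```

A three-dimensional layout like the Exploration view would then come from projecting such vectors down with a dimensionality-reduction method (UMAP, t-SNE, or similar).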


READ MORE

The secret to building a smart city that’s antiracist

Sep 29, 2020 10:51 am

Article by Eliza McCullough: “…Instead of a smart city model that extracts from, surveils, and displaces poor people of color, we need a democratic model that allows community members to decide how technological infrastructure operates and to ensure the equitable distribution of benefits. Doing so will allow us to create cities defined by inclusion, shared ownership, and shared prosperity.

In 2016, Barcelona, for example, launched its Digital City Plan, which aims to empower residents with control of technology used in their communities. The document incorporates over 8,000 proposals from residents and includes plans for open source software, government ownership of all ICT infrastructure, and a pilot platform to help citizens maintain control over their personal data. As a result, the city now has free applications that allow residents to easily propose city development ideas, actively participate in city council meetings, and choose how their data is shared.

In the U.S., we need a framework for tech sovereignty that incorporates a racial equity approach: In a racist society, race neutrality facilitates continued exclusion and exploitation of people of color. Digital Justice Lab in Toronto illustrates one critical element of this kind of approach: access to information. In 2018, the organization gave community groups a series of grants to hold public events that shared resources and information about digital rights. Their collaborative approach intentionally focuses on the specific needs of people of color and other marginalized groups.

The turn toward intensified surveillance infrastructure in the midst of the coronavirus outbreak makes the need to adopt such practices all the more crucial. Democratic tech models that uplift marginalized populations provide us the chance to build a city that is just and open to everyone….(More)”.


READ MORE

Can fake news really change behaviour? Evidence from a study of COVID-19 misinformation.

Sep 29, 2020 10:35 am

Paper by Ciara Greene and Gillian Murphy: “Previous research has argued that fake news may have grave consequences for health behaviour, but surprisingly, no empirical data have been provided to support this assumption. This issue takes on new urgency in the context of the coronavirus pandemic. In this large preregistered study (N = 3746) we investigated the effect of exposure to fabricated news stories about COVID-19 on related behavioural intentions. We observed small but measurable effects on some related behavioural intentions but not others – for example, participants who read a story about problems with a forthcoming contact-tracing app reported reduced willingness to download the app. We found no effects of providing a general warning about the dangers of online misinformation on response to the fake stories, regardless of the framing of the warning in positive or negative terms. We conclude with a call for more empirical research on the real-world consequences of fake news….(More)”


READ MORE

Why Modeling the Spread of COVID-19 Is So Damn Hard

Sep 28, 2020 05:51 pm

Matthew Hutson at IEEE Spectrum: “…Researchers say they’ve learned a lot of lessons modeling this pandemic, lessons that will carry over to the next.

The first set of lessons is all about data. Garbage in, garbage out, they say. Jarad Niemi, an associate professor of statistics at Iowa State University who helps run the forecast hub used by the CDC, says it’s not clear what we should be predicting. Infections, deaths, and hospitalization numbers each have problems, which affect their usefulness not only as inputs for the model but also as outputs. It’s hard to know the true number of infections when not everyone is tested. Deaths are easier to count, but they lag weeks behind infections. Hospitalization numbers have immense practical importance for planning, but not all hospitals release those figures. How useful is it to predict those numbers if you never have the true numbers for comparison? What we need, he said, is systematized random testing of the population, to provide clear statistics of both the number of people currently infected and the number of people who have antibodies against the virus, indicating recovery. Prakash, of Georgia Tech, says governments should collect and release data quickly in centralized locations. He also advocates for central repositories of policy decisions, so modelers can quickly see which areas are implementing which distancing measures.
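
Niemi’s call for systematized random testing is, at bottom, a sampling argument: with a genuinely random sample you can attach a confidence interval to current prevalence, which case counts from non-random testing cannot give you. A back-of-the-envelope sketch with invented numbers:

```python
# Back-of-the-envelope prevalence estimate from random testing (invented numbers).
# Uses the normal approximation to the binomial; fine when positives aren't rare.
import math

n_tested = 10_000    # people randomly sampled and tested
n_positive = 230     # of those, tested positive
p_hat = n_positive / n_tested
se = math.sqrt(p_hat * (1 - p_hat) / n_tested)
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"estimated prevalence: {p_hat:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```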

Researchers also talked about the need for a diversity of models. At the most basic level, averaging an ensemble of forecasts improves reliability. More important, each type of model has its own uses—and pitfalls. An SEIR model is a relatively simple tool for making long-term forecasts, but the devil is in the details of its parameters: How do you set those to match real-world conditions now and into the future? Get them wrong and the model can head off into fantasyland. Data-driven models can make accurate short-term forecasts, and machine learning may be good for predicting complicated factors. But will the inscrutable computations of, for instance, a neural network remain reliable when conditions change? Agent-based models look ideal for simulating possible interventions to guide policy, but they’re a lot of work to build and tricky to calibrate.
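
For concreteness, here is the “relatively simple” SEIR model in code — a minimal sketch whose parameter values are assumptions for illustration, not values fitted to COVID-19:

```python
# Minimal SEIR sketch; beta/sigma/gamma are illustrative assumptions,
# not parameters calibrated to real COVID-19 data.
import numpy as np
from scipy.integrate import odeint

def seir(y, t, beta, sigma, gamma, n):
    s, e, i, r = y
    new_infections = beta * s * i / n    # contacts between S and I
    return (-new_infections,             # dS/dt
            new_infections - sigma * e,  # dE/dt: exposed become infectious
            sigma * e - gamma * i,       # dI/dt: infectious recover
            gamma * i)                   # dR/dt

n = 1_000_000
beta, sigma, gamma = 0.5, 1 / 5.2, 1 / 10   # contact, incubation, recovery rates
y0 = (n - 10, 0, 10, 0)                     # start with 10 infectious people
t = np.linspace(0, 180, 181)                # days
s, e, i, r = odeint(seir, y0, t, args=(beta, sigma, gamma, n)).T
print(f"peak infectious: {i.max():,.0f} on day {t[i.argmax()]:.0f}")
```

The devil-in-the-details problem the researchers describe is exactly those rate constants: modest changes to beta, sigma, or gamma send the long-term forecast to very different places.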

Finally, researchers emphasize the need for agility. Niemi of Iowa State says software packages have made it easier to build models quickly, and the code-sharing site GitHub lets people share and compare their models. COVID-19 is giving modelers a chance to try out all their newest tools, says Meyers, of the University of Texas. “The pace of innovation, the pace of development, is unlike ever before,” she says. “There are new statistical methods, new kinds of data, new model structures.”…(More)”.


READ MORE

Accelerating AI for global health through crowdsourcing

Sep 28, 2020 05:27 pm

Poster by Geoffrey Henry Siwo: “The promise of artificial intelligence (AI) in medicine is advancing rapidly, driven by exponential growth in computing speed, data and new modeling techniques such as deep learning. Unfortunately, advancements in AI stand to disproportionately benefit diseases that predominantly affect the developed world because the key ingredients for AI – computational resources, big data and AI expertise – are less accessible in the developing world. Our research on automated mining of biomedical literature indicates that adoption of machine learning algorithms in global health, for example to understand malaria, lags several years behind diseases like cancer.

To shift these inequities, we have been exploring the use of crowdsourced data science challenges as a means to rapidly advance computational models in global health. Data science challenges involve seeking computational solutions for specific, well-defined questions from anyone in the world. Here we describe key lessons from our work in this area and the potential value of data science challenges in accelerating AI for global health.

In one of our first initiatives in this area – the Malaria DREAM Challenge – we invited data scientists from across the world to develop computational models that predict the in vitro and in vivo drug sensitivity of malaria parasites to artemisinin using gene expression datasets. More than 360 individuals drawn from academia, government and startups across 31 countries participated in the challenge. Approximately 100 computational solutions to the problem were generated within a period of 3 months. In addition to this sheer volume of participation, a diverse range of modeling approaches including artificial neural networks and automated machine learning were employed….(More)”.
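
Stripped to its essentials, the challenge task is a standard supervised-learning problem: gene expression features in, a drug-sensitivity score out. A sketch on synthetic data (the real challenge used the organizers’ expression datasets, and entries ranged from neural networks to automated ML):

```python
# Hedged sketch of the challenge task: predict artemisinin sensitivity from
# gene expression. The data here is synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))   # 200 parasite samples x 500 gene expression levels
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=200)  # synthetic sensitivity score
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```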


READ MORE

Digital Disruption

Sep 28, 2020 04:52 pm

Book by Bharat Vagadia on the implications and opportunities for economies, society, policy makers and business leaders: “This book goes beyond the hype, delving into real world technologies and applications that are driving our future and examines the possible impact these changes will have on industries, economies and society at large. It details the actions governments and regulators must take in order to ensure these changes bring about positive benefits to the public without stifling innovation that may well be the future source of value creation. It examines how organisations in a world of digital ecosystems, where industry boundaries are blurring, must undertake radical digital transformation to survive and thrive in this new digital world. The reader is taken through a framework that critically examines (i) Digital Connectivity including 5G and IoT; (ii) Data Capture and Distribution which includes smart connected verticals; (iii) Data Integrity, Control and Tokenisation that includes cyber security, digital signatures, blockchain, smart contracts, digital assets and cryptocurrencies; (iv) Data Processing and Artificial Intelligence; and (v) Disruptive Applications which include platforms, virtual and augmented reality, drones, autonomous vehicles, digital twins and digital assistants…(More)”.


READ MORE

Reassembling Scholarly Communications: Histories, Infrastructures and Global Politics of Open Access

Sep 28, 2020 04:23 pm

Book edited by Martin Paul Eve and Jonathan Gray: “The Open Access Movement proposes to remove price and permission barriers for accessing peer-reviewed research work—to use the power of the internet to duplicate material at an infinitesimal cost-per-copy. In this volume, contributors show that open access does not exist in a technological or policy vacuum; there are complex social, political, cultural, philosophical, and economic implications for opening research through digital technologies. The contributors examine open access from the perspectives of colonial legacies, knowledge frameworks, publics and politics, archives and digital preservation, infrastructures and platforms, and global communities.

The contributors consider such topics as the perpetuation of colonial-era inequalities in research production and promulgation; the historical evolution of peer review; the problematic histories and discriminatory politics that shape our choices of what materials to preserve; the idea of scholarship as data; and resistance to the commercialization of platforms. Case studies report on such initiatives as the Making and Knowing Project, which created an openly accessible critical digital edition of a sixteenth-century French manuscript, the role of formats in Bruno Latour’s An Inquiry into Modes of Existence, and the Scientific Electronic Library Online (SciELO), a network of more than 1,200 journals from sixteen countries. Taken together, the contributions represent a substantive critical engagement with the politics, practices, infrastructures, and imaginaries of open access, suggesting alternative trajectories, values, and possible futures…(More)”.


READ MORE

Solving Urban Infrastructure Problems Using Smart City Technologies

Sep 28, 2020 03:29 pm

Book edited by John R. Vacca: “… the most complete guide for integrating next generation smart city technologies into the very foundation of urban areas worldwide, showing how to make urban areas more efficient, more sustainable, and safer. Smart cities are complex systems of systems that encompass all aspects of modern urban life. A key component of their success is creating an ecosystem of smart infrastructures that can work together to enable dynamic, real-time interactions between urban subsystems such as transportation, energy, healthcare, housing, food, entertainment, work, social interactions, and governance. Solving Urban Infrastructure Problems Using Smart City Technologies is a complete reference for building a holistic, system-level perspective on smart and sustainable cities, leveraging big data analytics and strategies for planning, zoning, and public policy. It offers in-depth coverage and practical solutions for how smart cities can utilize residents’ intellectual and social capital, promote environmental sustainability, and increase personalization, mobility, and quality of life….(More)”


READ MORE

Public Sector Tech: New tools for the new normal

Sep 26, 2020 03:27 pm

Special issue by ZDNet exploring “how new technologies like AI, cloud, drones, and 5G are helping government agencies, public organizations, and private companies respond to the events of today and tomorrow…”


READ MORE
Have a new article, report, initiative or paper worth sharing with the audience of The Digest? Share it here!

Browse recent issues of The Digest at the Living Library or subscribe now to get our curation in your inbox every week.


Our mailing address is:

TheGovLab, Tandon School of Engineering, NYU
2 MetroTech Center, Floor 9
Brooklyn, NY 11201


