Every day, people around the world use data to make decisions. When heading out of town, most of us use weather apps to check the forecast anywhere in the world before packing our bags. However, when we travel to far-flung places, we may find ourselves packing food from home because we don’t know what may be available when we arrive. We have a global, comprehensive, open data set that enables weather forecasting. Why is there nothing similar for food and agriculture?
In his first public remarks as head of USDA, Secretary Sonny Perdue noted that “…we want to make decisions based on facts and evidence,” “we want to be data-driven,” and “I need good data, I need good sound science to make decisions on…”
USDA recognizes that farmers, ranchers, and consumers alike use data daily: deciding when to plant, harvest, or sell crops; when to turn cattle out to pasture; or where to buy fresh fruits and vegetables. This is why it is important that data be made open and accessible, to facilitate the best-informed decisions.
Around the world, a movement called the “open data revolution” is under way to make data available for public use. This movement is expected to generate new insights, drive better decision-making, and enable governments, civil society, and the private sector to better target interventions and programs.
All of this is why the U.S. Government, led by USDA, was a founding partner of the Global Open Data for Agriculture and Nutrition (GODAN) initiative and why it continues to support the advancement of open data for agriculture and nutrition around the world. Now with over 500 partners, GODAN continues to support the sharing of available, accessible, and usable open data for agriculture and nutrition to help ensure global food security.
If we’re going to feed over 9 billion people by 2050, we need open data policies to make decisions based on facts and evidence. This global perspective will help identify data already available and data gaps that exist, and sharpen the focus on how open data can foster innovation and collaborative research, creating whole new kinds of growth around the world.
Dear Open Data Enthusiasts,
Below is an abstract of “Liberating data for public value: The case of Data.gov,” International Journal of Information Management (2016).
Public agencies around the globe are liberating their data. Drawing on the case of Data.gov, we outline the challenges and opportunities that lie ahead for the liberation of public data. Data.gov is an online portal that provides open access to datasets generated by US public agencies and countries around the world in a machine-readable format. By discussing the challenges and opportunities faced by Data.gov, we provide several lessons that can inform research and practice. We suggest that providing access to open data in itself does not spur innovation. Specifically, we claim that public agencies need to spend resources to improve the capacities of their organizations to move toward ‘open data by default’; develop capacities of community to use data to solve problems; and think critically about the unintended consequences of providing access to public data. We also suggest that public agencies need better metrics to evaluate the success of open-data efforts in achieving their goals.
Rashmi Krishnamurthy, School of Public Affairs, Arizona State University, Phoenix, AZ, USA
Yukika Awazu, The Institute for Knowledge and Innovation South-East Asia, Bangkok University, Bangkok, Thailand
Understanding data can sometimes be daunting, especially if you’re looking at a spreadsheet with rows, upon rows, upon rows of data.
To make FEMA data easier to understand, a small project team and I have been working on an innovative data visualization tool designed to show how we support communities and to tell that story visually.
Our project team is really excited about bringing the visualization from concept to reality, and we’d love your feedback. Interact with it, learn more about your community, and let us know what you think. (The email address is on the visualization page.)
Transparency, participation, and collaboration form the cornerstone of an open government. That’s why we’ve worked hard to make our data free and open to everyone through OpenFEMA, our open government initiative.
To learn more, watch a short video about the visualization.
For developers and other technical folks, check out our Developer’s Resource page for info on our APIs and more.
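As a sense of what working with an open-data API like OpenFEMA's can look like, here is a minimal sketch in Python. The endpoint URL and field names below are illustrative assumptions; consult FEMA's Developer's Resource page for the actual current API and record schema.

```python
from collections import Counter

# Hypothetical endpoint, for illustration only; the real URL, versioning,
# and query syntax are documented on FEMA's Developer's Resource page.
OPENFEMA_URL = "https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"

def declarations_by_state(records):
    """Count disaster declarations per state from a list of record dicts
    shaped like an OpenFEMA-style JSON response (assumed 'state' field)."""
    return Counter(r["state"] for r in records)

# A small made-up sample standing in for a fetched API response:
sample = [
    {"state": "TX", "declarationType": "DR"},
    {"state": "TX", "declarationType": "EM"},
    {"state": "CA", "declarationType": "DR"},
]
print(declarations_by_state(sample))  # Counter({'TX': 2, 'CA': 1})
```

In practice you would fetch the JSON over HTTP and paginate; the point is that machine-readable responses make this kind of aggregation a few lines of code.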
President Obama launched the first U.S. Open Government National Action Plan in September 2011, as part of the Nation’s commitment to the global Open Government Partnership, a multilateral initiative to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. The first Plan laid out 26 concrete steps the United States would take to promote public participation in government, increase transparency, and manage public resources more effectively. … [T]he Obama Administration has committed to develop a second National Action Plan on Open Government: “NAP 2.0.”
In order to develop a Plan with the most creative and ambitious solutions, we need all hands on deck. That’s why we are asking for your input on what should be in the NAP 2.0.
Federal IT professionals estimate that government agencies can potentially save 14% of their budgets, or nearly $500 billion across the government, by successfully analyzing big data. But while nearly one-fourth of federal IT managers in a new government poll report that their agencies have launched at least one big-data project, only 31% believe their agency’s big-data strategy is sufficient to deliver on that potential.
The numbers come from a recent report, “Smarter Uncle Sam: The Big Data Forecast,” by government IT networking group MeriTalk. The report, sponsored by EMC Corporation, is based on a survey of 150 federal IT professionals.
New technology tools, combined with raised expectations among voters and stakeholders for government transparency, have sparked a movement toward “open government.” Championed by advocacy organizations and a few high-profile elected officials, the trend seeks to promote greater accountability and responsiveness for the systems of representative democracy. An area of particular opportunity — as well as potential concern — is the growing cache of large datasets of public information now available on the Internet. …
The White House under the Obama administration has been a leader in its approach to transparency and launched the website data.gov in 2009. To date, nearly 100,000 datasets are available on the site. Other countries soon followed: the U.K., Kenya, Brazil, India and more than 30 other countries have created portals for public data. The European Union Open Data Portal offers more than 6,000 datasets from its member countries. International organizations from the UN to the World Bank add their own repositories to the growing store of online information. The trend is also growing at the state and local level. Chicago apparently boasts the most public datasets (950) among cities. San Francisco has an extensive open data policy and is one of the first cities in the nation to hire a Chief Data Officer.
White House Deputy Chief Technology Officer Nick Sinai posted a call for comments this week, seeking input for an updated version of the Obama administration’s National Action Plan for Open Government.
The White House developed Project Open Data (See project OpenData on github) – this collection of code, tools, and case studies – to help agencies adopt the Open Data Policy and unlock the potential of government data. Project Open Data will evolve over time as a community resource to facilitate broader adoption of open data practices in government. Anyone – government employees, contractors, developers, the general public – can view and contribute.
Many technology professionals can trace their passion back to an early age. Federal Communications Commission Chief Data Officer Greg Elin is no exception. … “It’s clear that positions titled ‘chief data officer’ are going to start appearing at more agencies,” he said. “They might not always be called that and they might already be there under a different title, but I think absolutely we’re going to see more people in this role.”
The administration released additional guidance to push agencies to make government data more transparent, according to a blog post Aug. 16.
The guidance will help agencies meet Nov. 1 goals to create and maintain data inventories and to establish a process for making more data public, according to Nick Sinai, the administration’s deputy chief technology officer, and Dominic Sale, the supervisory policy analyst at the Office of Management and Budget.
A supplemental guide to the White House’s Open Data Policy has been released providing additional clarification and detailed requirements to assist agencies in carrying out the objectives laid out in a May 9 memo and executive order.
The administration’s document, posted on the GitHub website, focuses on near-term efforts agencies must take to meet the five initial requirements of OMB Memorandum M-13-13 (.pdf), which are due November 1, 2013.
Agencies are expected to deliver on several elements of President Barack Obama’s executive order on open data by Nov. 1, including creation and publication of a list of data assets. To steer these efforts, the Office of Management and Budget and the Office of Science and Technology Policy released guidance that gets into the weeds about how agencies can fulfill the directives and incorporate open data policy into everyday activities.
White House officials have announced expanded technical guidance to help agencies make more data accessible to the public in machine-readable formats.
Following up on President Obama’s May executive order linking the pursuit of open data to economic growth, innovation and government efficiency, two budget and science office spokesmen on Friday published a blog post highlighting new instructions and answers to frequently asked questions.
Since President Obama announced the Open Data Initiative in 2009 — emphasizing government transparency — its flagship site, data.gov, has grown from 47 data sets to hundreds of thousands, providing more fodder for entrepreneurs such as Nash.
“Start-ups like mine are starting to see that data as valuable for a product we can sell to customers, as a catalyst for innovation,” Nash said.
For now, says Jeanne Holm, data.gov’s evangelist at the General Services Administration, most developers seem focused on finding ways to liberate the data.
In May, the President signed an Executive Order to make government-held data more accessible to the public and to entrepreneurs and others as fuel for innovation, economic growth, and government efficiency. Under the terms of the Executive Order and a new Open Data Policy, all newly generated government data will be required to be made available in open, machine-readable formats, greatly enhancing their accessibility and usefulness, while ensuring privacy and security.
Today, we are building on this effort by releasing additional resources to help Federal agencies make data open and available in machine-readable form. Specifically, we are releasing additional guidance to agencies about how to inventory and publish their data assets, new FAQs about how open data requirements apply to Federal acquisition and grant-making processes, and a framework for creating measurable goals that agencies can use to track progress. All of this is openly available on the Project Open Data website, where additional case studies and free software tools for the agencies are also available.
This post originally appeared on the White House Blog on August 16, 2013.
Soon, the Open Data Policy will require federal agencies to publish /data and /data.json pages. These pages will tell the world what datasets the agency has, and where to find them (if they’re publicly accessible). The /data.json pages will be crawled by the new data.gov to dynamically build out a comprehensive up-to-date catalog of public data sets.
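A crawler like the new data.gov's consumes each agency's /data.json catalog. As a rough sketch of what that involves, the Python below parses a catalog and lists its publicly accessible datasets. The field names (`title`, `accessLevel`, and the optional `dataset` wrapper) follow the Project Open Data metadata schema as I understand it; the example catalog itself is made up.

```python
import json

def list_public_datasets(catalog):
    """Return titles of publicly accessible datasets from a parsed
    /data.json catalog. Accepts either a bare JSON array of dataset
    entries or an object wrapping them under a "dataset" key."""
    datasets = catalog.get("dataset", []) if isinstance(catalog, dict) else catalog
    return [d["title"] for d in datasets if d.get("accessLevel") == "public"]

# Illustrative catalog fragment (datasets are invented for the example):
catalog = json.loads("""
[
  {"title": "Farmers Markets Directory", "accessLevel": "public"},
  {"title": "Internal Audit Logs", "accessLevel": "non-public"}
]
""")
print(list_public_datasets(catalog))  # ['Farmers Markets Directory']
```

A real harvester would fetch each agency's /data.json over HTTP, validate it against the schema, and merge the results into one searchable catalog; this filtering step is the core of it.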
For most Americans, buying a home is the largest purchase of their lives. That’s where a company like Zillow comes in — helping families make informed decisions about buying a home and where to raise a family. Zillow is powered, in part, by open government data – including freely available data from the Bureau of Labor Statistics, Federal Housing Finance Agency, and the Census Bureau. Zillow uses these data sets to do things like help home buyers in a given region understand the break-even point, in years, at which buying a home becomes more financially advantageous than renting the same home.
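To make the break-even idea concrete, here is a deliberately simplified toy model (not Zillow's actual methodology, which is far more sophisticated): compare the cumulative cost of renting, with rent growing each year, against the cumulative cost of owning, counting principal payments as equity rather than cost. All inputs below are invented example numbers.

```python
def breakeven_year(monthly_rent, rent_growth, closing_costs,
                   monthly_ownership_cost, monthly_principal, max_years=30):
    """Return the first year in which the cumulative net cost of owning
    drops below the cumulative cost of renting, or None if it never does
    within max_years. A toy model: ignores appreciation, taxes on sale,
    opportunity cost of the down payment, etc."""
    rent_total = 0.0
    own_total = float(closing_costs)       # one-time transaction costs
    rent = monthly_rent
    for year in range(1, max_years + 1):
        rent_total += 12 * rent
        rent *= 1 + rent_growth            # rent rises each year
        own_total += 12 * monthly_ownership_cost  # interest, taxes, upkeep
        own_total -= 12 * monthly_principal       # principal builds equity
        if own_total < rent_total:
            return year
    return None

# Example: $2,000/mo rent growing 3%/yr vs. $20,000 closing costs,
# $1,800/mo non-equity ownership cost, $500/mo going to principal.
print(breakeven_year(2000, 0.03, 20000, 1800, 500))  # 3
```

Open data feeds exactly this kind of calculation: rent indices, house price indices, and regional cost data from BLS, FHFA, and the Census Bureau supply the inputs.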
Making government data resources publicly available in machine-readable form as fuel for new private-sector products and businesses is one example of how the President is working to make government smarter and more innovative for the American people.