As noted in the results of our initial audit, the primary differences between strong and weak plans include level of specificity, ready availability of information, thoughtful identification of key audiences and their needs, and the quality and sustainability of flagship initiatives. While some agency plans are strengthened by exceeding certain OGD requirements, most of the improvements come from agencies better meeting the requirements. Examples of how agencies address these particular weaknesses in the revised versions include:
Version 1.2 of the Department of Transportation’s (DOT) Open Government Plan includes specific milestones for publishing online high-value information that is not currently available, along with a description of the agency’s process for determining what constitutes “high-value information.” Version 1 of the plan included no information on what the agency would publish, or when.
Version 1.1 of the Department of Justice’s (DOJ) Open Government Plan incorporates a wealth of information about how the agency handles requests for information from the public and from Congress. Before the plan was updated, that information was buried a couple of clicks away on the agency’s /open page. Additionally, updated plans from a couple of agencies, including the Department of Homeland Security and the National Archives and Records Administration, fixed technical problems, such as broken links, in their revised plans, making it easier for the public to find information.
Version 1.1 of the Department of Health and Human Services’ (HHS) Open Government Plan includes a comprehensive list of stakeholders and their information needs. The agency also includes a category labeled “Accountability information” for every stakeholder activity, in recognition of its responsibility to disclose information about its performance. Version 1 of HHS’ plan only identified stakeholders at a very high level (e.g., the public, regulated industry, Congress).
Version 1.1 of the Social Security Administration’s (SSA) Open Government Plan includes specific tools and feedback mechanisms—including web metrics, people served, and user satisfaction surveys—the agency will use to continually improve and expand its flagship initiatives to meet the changing needs and preferences of the public. Version 1 of the plan discussed measuring performance of the flagship initiatives in the abstract.
Updated plans were also strengthened by the addition of specific internal management changes to increase transparency, participation, and collaboration.
Version 1.1 of the Department of Education’s Open Government Plan, for example, describes the creation of a formalized Open Government Steering Committee, and lays out the Committee’s agenda and deadlines, including the publication of an “Open Government Policy Document” to clarify how open government functions. Version 1 of the plan noted the agency was convening a working group to address “regulatory and statutory challenges.”
Evaluations of “extra open government plans” were completed between May 17 and May 24; informal feedback from these evaluations was provided to the agencies at the end of May. Re-evaluations of all updated open government plans were completed between June 25 and July 8. All evaluations were completed using new media tools that involved a number of evaluators from multiple organizations. As with the initial evaluations, the criteria for assessment were based on the requirements in the OGD. For each item assessed, evaluators gave a score of 0 if the plan did not address the item, 1 if the plan referenced the item but did not include a clear roadmap to achieve results, and 2 if the plan fulfilled the requirement. The maximum basic score was 60, except for agencies without original classification authority, where the maximum was 58 points.
Evaluators could also give an agency a bonus point for exceeding the requirement on each item. The highest number of bonus points received by any agency was 18. Bonus points were added on to the basic score, as a sort of extra credit.
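The scoring arithmetic described above can be sketched as a short function. This is an illustrative reconstruction, not the evaluators’ actual tooling: the item scores and bonus values below are hypothetical, and the item count (30, or 29 for agencies without original classification authority) is inferred from the stated maximums of 60 and 58 points at 2 points per item.

```python
# Hypothetical sketch of the OGD plan scoring described above.
# Item scores and bonus values here are illustrative, not real evaluation data.

def overall_score(item_scores, bonus_points, has_classification_authority=True):
    """Each item scores 0 (not addressed), 1 (referenced, but no clear
    roadmap), or 2 (requirement fulfilled). Bonus points for exceeding
    requirements are added on top of the basic score as extra credit."""
    assert all(s in (0, 1, 2) for s in item_scores)
    max_basic = 60 if has_classification_authority else 58
    basic = sum(item_scores)
    assert basic <= max_basic, "basic score cannot exceed the maximum"
    return basic + bonus_points

# A plan fulfilling every item (30 items at 2 points each) plus 3 bonus
# points would score above the maximum basic score of 60:
print(overall_score([2] * 30, bonus_points=3))  # 63
```

Because bonus points sit on top of the basic score, a plan can exceed the 60-point (or 58-point) ceiling, which is how the “model plans” discussed below outscore the maximum basic score.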
Evaluators used Google Docs to record their assessments. Evaluators were able to look at all assessments to compare with their own.
For more information about the methodology, click here.
Agencies’ updates of their plans greatly reduced the wide variation in the strength of plans noted by our initial audit. The list of “leading agencies” that have developed plans exceeding the requirements of the OGD in important and innovative ways has grown substantially. Final rankings are based on the overall score the most current version of the plan earned (including bonus points) out of the agency’s maximum possible basic score. To view the updated rankings, click here. To view each assessment of the most up-to-date version of agency plans, click on the agency on the left hand side of this webpage.
Three agencies developed plans that meet all of the requirements of the OGD: the Department of Health and Human Services, Department of Transportation, and Environmental Protection Agency. In addition to meeting the requirements, these plans were awarded bonus points for exceeding requirements.
They are joined on the list of “model plans” by several additional plans that, while not fulfilling all of the requirements, earned enough bonus points that their overall score exceeded the maximum possible basic score. These bonus points recognize specific achievements like HHS’ newly formed “innovation consulting team,” which identifies and removes barriers to participation and collaboration activities. To join the ranks of the leaders, agencies must implement such leading practices.
“Extra” Open Government Plans that have not yet been updated tend to be the weakest plans, even with bonus points for taking the initiative to develop plans despite not being required to do so (the exception being the Railroad Retirement Board). The four lowest scoring plans—Peace Corps, Broadcasting Board of Governors, National Endowment for the Arts, and Udall Foundation—all fall into this category. Despite the weakness of the plans, we applaud these agencies for opting into participation in the Directive. As with weak plans identified by our initial audit, many of the deficiencies in the plans are easily remedied, and we look forward to working with them to improve the plans.
The four weakest required plans are: Office of National Drug Control Policy (ONDCP), Small Business Administration (SBA), Office of the United States Trade Representative (USTR), and Department of Energy (DOE).
ONDCP, SBA, and USTR did not submit updated plans. While all three plans fell in the middle ground in our initial rankings, enough other agencies improved their updated plans that these three were knocked into the weakest pool. Agencies that treat their plans as “drafts” and “living documents” will continue to move this bar higher.
Despite producing an updated plan that makes substantive progress toward meeting the OGD requirements, Version 1.1 of the DOE plan still ranked among the weakest plans. However, between the time we evaluated Version 1.1 and published this report, DOE shared an updated plan, Version 1.2, with the evaluators. Version 1.2 shows substantial progress in almost all areas of weakness identified in the evaluation of Version 1.1. For example, the latest version of the plan describes public feedback mechanisms for the agency’s flagship initiatives, addresses how the initiatives will impact agency operations, and identifies a structure for measuring the value of the initiatives. Version 1.2 also lists DOE’s key stakeholders and their information needs, although this section could include greater detail about their specific data needs—information DOE could gather by reaching out to the stakeholders directly. The most recent plan also includes a strategy for identifying changes to administrative policy and current practices to increase participation and collaboration. Version 1.2 of the plan is strong enough to rank the agency well within the middle ground.