Open Legislative Data Report Card

Comparing how state legislatures make their data publicly available.
For more context, read this post and see our methodology.

(Note: Since the publication of this report card, several states have come to us with additional information or made changes that affect their score. Details are available below.)

State | Completeness | Timeliness | Ease of Access | Machine Readability | Standards | Permanence | Grade
Connecticut | 0 | 1 | 0 | 1 | 0 | 2 | A
Georgia | 0 | 0 | 0 | 2 | 0 | 2 | A
Kansas | 0 | 0 | 1 | 1 | 1 | 2 | A
New Hampshire | 0 | 0 | 0 | 2 | 0 | 2 | A
North Carolina | 0 | 1 | 0 | 1 | 0 | 2 | A
Texas | -1 | 1 | 1 | 2 | 0 | 2 | A
Washington | 0 | 1 | 0 | 2 | 0 | 2 | A
Alaska | 0 | 1 | 0 | 0 | 0 | 2 | B
Arkansas | 0 | 1 | 0 | 1 | 0 | 2 | A
Maryland | 0 | 1 | 0 | 1 | 0 | 1 | B
Mississippi | 0 | 0 | 0 | 1 | 0 | 2 | B
Nevada | 0 | 1 | 0 | 0 | 0 | 2 | B
New Jersey | 0 | 0 | -1 | 2 | 0 | 2 | B
New York | 0 | 1 | 0 | 1 | 0 | 2 | A
Ohio | 0 | 1 | -1 | 1 | 0 | 2 | B
Utah | 0 | 0 | 1 | 0 | 0 | 2 | B
Vermont | 0 | 1 | 0 | 0 | 0 | 2 | B
West Virginia | 0 | 1 | 1 | -1 | 0 | 2 | B
Arizona | 0 | 0 | -1 | 0 | 0 | 2 | C
Delaware | 0 | 1 | -1 | 0 | 0 | 2 | C
District of Columbia | 0 | 1 | -1 | -1 | 0 | 1 | D
Florida | 0 | 1 | 0 | -1 | 0 | 2 | C
Hawaii | -1 | 0 | 0 | 0 | 0 | 2 | C
Idaho | 0 | 0 | 0 | 0 | 0 | 1 | C
Illinois | 0 | 1 | 0 | -1 | 0 | 2 | C
Iowa | 0 | 1 | 0 | -1 | 0 | 2 | C
Michigan | 0 | 1 | 0 | 1 | 0 | 0 | C
Minnesota | -1 | 1 | 0 | 0 | 0 | 2 | C
Missouri | 0 | 0 | 0 | -1 | 0 | 2 | C
Montana | 0 | 1 | 0 | -1 | 0 | 2 | C
New Mexico | 0 | 0 | 0 | -1 | 0 | 2 | C
North Dakota | 0 | 0 | 0 | -1 | 0 | 2 | C
Oregon | 0 | -1 | 0 | 0 | 0 | 2 | C
Pennsylvania | 0 | 1 | 1 | 1 | 0 | 2 | A
South Carolina | 0 | 0 | 0 | 0 | 0 | 2 | C
South Dakota | 0 | 1 | 0 | 0 | 0 | 2 | B
Tennessee | 0 | 1 | 0 | 0 | 0 | 0 | C
Virginia | 0 | 1 | 0 | 1 | 0 | 2 | A
Wyoming | 0 | 0 | 0 | 0 | 0 | 2 | C
California | 0 | 0 | -1 | 1 | 0 | 0 | D
Indiana | -1 | 1 | 0 | -1 | 0 | 0 | D
Louisiana | 0 | 1 | -1 | -1 | 0 | 0 | D
Maine | 0 | 1 | -1 | 0 | 0 | 0 | D
Oklahoma | 0 | 1 | -1 | 0 | 0 | 0 | D
Wisconsin | 0 | 0 | 0 | 0 | 0 | 0 | D
Alabama | 0 | 1 | -2 | -1 | 0 | -1 | F
Colorado | 0 | 1 | 0 | -1 | 0 | 1 | C
Kentucky | 0 | 0 | 0 | -2 | -1 | 0 | F
Massachusetts | -1 | 1 | -2 | -2 | 0 | -1 | F
Nebraska | 0 | 0 | 0 | -1 | 0 | -1 | F
Rhode Island | 0 | 1 | 0 | 0 | 0 | -1 | D

Methodology

Each state was evaluated in six categories based largely on the Ten Principles For Opening Up Government Information. Each score is based on evaluations by at least two staff members and a volunteer during our state survey. Additionally, state legislatures were contacted (unless noted in their score) to ensure that our information on bulk data availability and timeliness was as accurate as possible.

The specific criteria for each category are as follows:

Completeness

We evaluated each state on the data collected by Open States: bills, legislators, committees, votes and events. We also took note when a state went above and beyond to provide other relevant contextual information, such as supporting documents, legislative journals and schedules. Points were deducted for missing data, most often roll call votes.

 0  State provides the full breadth of legislative artifacts Open States collects: bills, legislators, votes, and committees.
-1  State does not provide stand-alone roll call votes.
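
The check itself is nearly mechanical. A minimal sketch in Python, assuming an illustrative set of artifact names rather than an actual Open States schema:

    # Illustrative artifact names, not an actual Open States schema.
    REQUIRED = {"bills", "legislators", "votes", "committees"}

    def completeness_score(published):
        # 0 for the full breadth of artifacts; -1 if anything is missing,
        # most often stand-alone roll call votes.
        return 0 if REQUIRED <= set(published) else -1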

Timeliness

Legislative information is most relevant as it happens, and many states are publishing information in real time. Unfortunately, there are also states where updates are infrequent, showing up days after a legislative action took place. States were dinged if data took more than 48 hours to go online.

 1  Multiple updates throughout the day, in real time or as close to it as systems will allow.
 0  Site updates once or twice daily, typically at the end of the legislative day.
-1  Updates take longer than 24 hours to appear on the site, often up to a week.
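
For intuition, here is how that lag might be bucketed in code. The exact cutoffs are assumptions mirroring the criteria above; the real evaluation was a manual survey, not a fixed formula:

    from datetime import timedelta

    def timeliness_score(lag):
        # lag: time between a legislative action and its appearance online.
        if lag <= timedelta(hours=2):   # effectively real time
            return 1
        if lag <= timedelta(hours=24):  # end-of-day batch updates
            return 0
        return -1                       # a day or more, sometimes a week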

Ease of Access

Common web technologies such as Flash or JavaScript can cause problems when reviewing legislative data. We found that the majority of sites work fairly well without JavaScript, but some received lower scores for being extremely difficult to navigate, making it impossible to bookmark bills, or, in extreme cases, being completely unusable.

 1  Site was considered exceptionally well laid out by multiple evaluators, with no JavaScript issues.
 0  Site was deemed average by those who evaluated it and/or had minor JavaScript dependencies.
-1  Site was considered more difficult than average to use by staff or volunteers, or had more severe JavaScript dependencies.
-2  Site was considered extremely difficult to use, with heavy reliance on irregular browser behavior and JavaScript.
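
A quick way to spot a hard JavaScript dependency is to fetch a bill page without executing any scripts and check whether the content appears in the raw HTML. A sketch of that kind of spot check, with a hypothetical URL and XPath:

    import requests
    from lxml import html

    def usable_without_javascript(bill_url):
        # Fetch the raw HTML; no JavaScript runs here.
        page = requests.get(bill_url, timeout=30)
        tree = html.fromstring(page.content)
        # If the title only exists after client-side rendering, this
        # XPath comes back empty and the page fails the check.
        return bool(tree.xpath('//h1[@class="bill-title"]/text()'))

    # e.g. usable_without_javascript("https://legislature.example.gov/bills/HB1")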

Machine Readability

For many sites, the Open States team wrote scrapers to collect legislative information from the website code, a slow, tedious and error-prone process. We collected data faster and more reliably when it was provided in a machine-readable format such as XML, JSON, CSV or via bulk downloads. If a state posted PDF image files or scanned documents, it received the lowest score possible.

 2  Essentially all data can be found in machine-readable formats.
 1  Lots of data in machine-readable formats, but substantial portions still required scraping HTML.
 0  No machine-readable data, but standard screen-scraping techniques applied.
-1  Site had information that was much more difficult than average to collect (data only accessible via PDF, or that required the screen scraper to emulate JavaScript).
-2  Site had information that was inaccessible to Open States due to use of scanned PDFs.
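
The difference in effort is easy to see side by side. Both endpoints below are hypothetical: the scraper is tied to the page's markup and breaks whenever it changes, while the machine-readable path is a single line of parsing:

    import requests
    from lxml import html

    def bills_from_html(url="https://legislature.example.gov/bills"):
        # Screen scraping: slow, tedious and tied to fragile markup.
        tree = html.fromstring(requests.get(url, timeout=30).content)
        return [
            {"id": row.xpath("td[1]/text()")[0],
             "title": row.xpath("td[2]/text()")[0]}
            for row in tree.xpath('//table[@id="bills"]//tr[td]')
        ]

    def bills_from_json(url="https://legislature.example.gov/api/bills.json"):
        # Machine readable: a stable, documented structure.
        return requests.get(url, timeout=30).json()["bills"]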

Use of Commonly Owned Standards

Because our ability to access most of a state’s data is represented by the “Machine Readability” metric above, we decided to use this provision to measure how a state makes its bill text available. Making text available in HTML or PDF is the norm and was considered an acceptable commonly owned standard (PDFs are a commonly owned standard, but it would certainly be nice to see alternative options where bill text is only available via PDF). States that only make documents available in Microsoft Word or WordPerfect formats require an individual to purchase expensive software or rely on free alternatives that may not preserve the correct formatting. It is worth noting that all but two states met the common criterion of providing HTML and/or PDF: one state (Kansas) went above and beyond, and another (Kentucky) did not even meet this threshold.

 1  State made an effort to go above and beyond.
 0  State provided bills in PDF and/or HTML format and nothing better (plaintext, ODT, etc.).
-1  State only provided bills in a proprietary format.
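
In code, the check reduces to looking at which formats a state offers. A sketch with illustrative format labels:

    def standards_score(formats):
        # formats: a set of lowercase labels, e.g. {"pdf", "html"}.
        if formats & {"txt", "odt"}:    # above and beyond (cf. Kansas)
            return 1
        if formats & {"html", "pdf"}:   # the acceptable norm
            return 0
        return -1                       # proprietary formats only (cf. Kentucky)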

Permanence

Many states move or remove information when a new session starts, much to the dismay of citizens seeking information on old proposals and researchers who may have cited a link (e.g. https://somelegislature.gov/HB1 vs https://somelegislature.gov/2011/HB1) only to see it point to a different bill in the following session. Tim Berners-Lee, inventor of the World Wide Web, wrote an article declaring Cool URIs Don’t Change, and we agree.

This poses a particular challenge for us: every page on OpenStates.org points to the page we collected data from, so if a state changes its site, users lose the ability to check us against the original source. Most (but not all) states are good about at least preserving bill information, but few were as good about preserving information on out-of-office legislators and historical committees, equally important parts of the legislative process.
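
The URL pair above is the whole problem in miniature. A sketch of the session-qualified scheme that avoids it:

    AMBIGUOUS = "https://somelegislature.gov/HB1"       # reused every session
    PERMANENT = "https://somelegislature.gov/2011/HB1"  # pinned to one session

    def permalink(base, session, bill_id):
        # Including the session keeps the link stable as new sessions start.
        return f"{base}/{session}/{bill_id}"

    assert permalink("https://somelegislature.gov", "2011", "HB1") == PERMANENT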

 2  All information is available in a permanent location, and data goes back a reasonable amount of time (a decade or so).
 1  Almost all information has a permanent location, but a single data set doesn't (or a recent change to the site has wiped out historical links, but information appears to be preservable going forward).
 0  Some information (such as committees and legislators) lacks a permanent location, but most is acceptable.
-1  Ability to link to old information is badly damaged and/or there is less than a decade of historical information.
-2  Vital information like bills or versions lacks a permanent location.
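
Putting the six categories together, a state's letter grade tracks the sum of its scores. The cutoffs below are inferred from the published table rather than stated in the methodology, so treat them as an approximation:

    # Cutoffs inferred from the published table; not an official formula.
    CUTOFFS = [(4, "A"), (3, "B"), (1, "C"), (-1, "D")]

    def letter_grade(scores):
        total = sum(scores.values())
        for floor, grade in CUTOFFS:
            if total >= floor:
                return grade
        return "F"

    # Example: Texas (-1, 1, 1, 2, 0, 2) sums to 5, an A on this scale.
    texas = {"completeness": -1, "timeliness": 1, "ease_of_access": 1,
             "machine_readability": 2, "standards": 0, "permanence": 2}
    assert letter_grade(texas) == "A"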

Changelog

Since the initial publication of this report card on March 11th, 2013, some states have provided us with additional information or made changes in response that affected their score. These changes are reflected below and noted on the report card itself.

Rhode Island - On March 12th, 2013 we confirmed with Rhode Island IT staff that data is updated in real time, not weekly as we had initially been told. This information raised their score by 2 points, bringing them into the 'D' class.

New York - On March 12th, 2013 New York Senate staff reached out and clarified their update policy, raising their score by a point and putting them into the 'A' class. A better API was also pointed out to us, which may affect their machine readability score in the future.

Virginia - On March 22nd, 2013 Virginia Legislature staff clarified their update policy, raising their score by a point and putting them into the 'A' class. An upcoming change to their data availability may raise their score further.

Colorado - On March 22nd, 2013 it was determined through discussion with an IT manager from the Colorado Legislature that the missing bills were the result of site errors and that no actual data was affected. This substantially changed Colorado's Timeliness score, raising Colorado from an F to a C.

Pennsylvania - On December 4th, 2013 we evaluated the new Pennsylvania website that was unveiled in November 2013. This resulted in an increase in the timeliness, ease of access, and machine readability scores.

District of Columbia - On March 24, 2015 we evaluated the new District of Columbia website. The result was a decrease in ease of access and machine readability scores, and an increase in timeliness, lowering DC from a C to a D.