September 03, 2004

Federal Info Standards Under Review

A federal interagency committee on government information has drafted "Requirements for Enabling the Identification, Categorization, and Consistent Retrieval of Government Information." The document, dated August 5, 2004, comes from the Categorization of Government Information (CGI) Working Group and offers valuable insights for those in government working to make information more accessible through portals and search engines.

The group "seeks to improve the methods by which Government information, including information on the Internet, is organized, preserved, and made accessible to the public, as required under the E-Government Act of 2002, Section 207, Accessibility, Usability, and Preservation of Government Information."

This group, working under the U.S. Federal Interagency Committee on Government Information, has also drafted a "Recommendation for Search Interoperability." The recommendations focus on "how the U.S. Federal Government should adopt a search service standard to enhance interoperability among networked systems that aid in the discovery of and access to government information." Comments on the document can be sent until September 27, 2004 to its editor, Eliot Christian, U.S. Geological Survey.

Eliot, as many readers may know, was instrumental in founding the U.S. State Government Information Locator Services working group. gilsUtah has been a participating member since 1999.

Source: beSpacific (September 2, 2004)

Posted by Ray Matthews on September 03, 2004 at 09:19 AM | Comments (0)

October 01, 2003

Making Content Findable Through SEO

Most users arrive at government web sites through search engines, so it makes a lot of sense to prepare content so that search engines can find it. It's important enough that a cottage industry has grown up around the science and art of Search Engine Optimization (SEO).

Here at GilsUtah we've been crawling and indexing Utah state agency and local government websites for over a year now, and we've discovered some all-too-common practices that present barriers to spiders. We've found that the content of some agencies is almost entirely blocked. In other cases UtahGov Search, Google, and others can retrieve some content, but only after manual intervention.

Here are some common problems preventing public access:

(1) Linking within JavaScript. There are right ways and wrong ways to do this. Unfortunately, most of the content of an entire branch of government, an entire department, and many division sites is being missed because of this. The trend seems to be worsening. (See the first sketch after this list for a quick way to spot the problem.)

(2) Creating URLs with question marks. Some database and content management systems create dynamic URLs with question marks. While most search engines provide a workaround, the workaround can cause other problems. There are usually ways for webmasters to manipulate their scripts to create static-looking HTML URLs that are both search engine friendly and easier for users to remember and bookmark.

(3) HTTPS protocol. Search engine spiders cannot get at content served behind Secure Sockets Layer (SSL). Agencies sometimes use it for publications and other areas where they don't need to. Limit SSL to your financial transactions and other uses where encryption is necessary.

(4) File naming. You'd be surprised how often content creators include spaces in file names. The search engine retrieves them, but inserts "%20" as the escaped encoding for the US-ASCII space character. Users often find that the resulting links are bad or that the URLs have become cryptic and undecipherable. Spaces are best avoided altogether; if you need a separator, use underscores or other unreserved characters such as - ! and . , and avoid reserved characters like these: & : = / ; ? + and $. (A short demonstration appears after this list.)

(5) Directory hierarchies. Some agencies dump their entire content, including images and scripts, into a single directory. You should create subdirectories for administrative functions or programs that naturally lend themselves to being in their own directory. This aids search engine crawling and rule writing.

(6) No site map. It's amazing how many sites still lack site maps. Every site should have a site map linked (using a static A HREF link) from at least the homepage. This helps get around the JavaScript linking problem, and site maps can be used as crawl starting pages.

(7) Use robots.txt files appropriately. All major search engines respect robots.txt files. If you want directories and files excluded, use robots.txt (or .htaccess protection) instead of hiding resources or limiting them to your intranet. Be careful, though. I can think of at least one agency whose important services are inaccessible because of an improper use of robots.txt. (The last sketch below shows a quick way to check.)
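To make point (1) concrete, here is a minimal sketch of a crawler's-eye view of a page: it counts links an indexer can follow versus links buried in javascript: handlers. It uses only the Python standard library, and the URL is a placeholder rather than a real agency address.

```python
# Minimal sketch: flag links a crawler cannot follow because they rely on
# JavaScript. The URL below is only a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkAudit(HTMLParser):
    """Collect href values, separating crawlable links from javascript: ones."""

    def __init__(self):
        super().__init__()
        self.crawlable = []
        self.script_only = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if not href or href.lower().startswith("javascript:"):
            self.script_only.append(href)
        else:
            self.crawlable.append(href)


if __name__ == "__main__":
    page = urlopen("http://www.example.gov/index.html").read().decode("utf-8", "replace")
    audit = LinkAudit()
    audit.feed(page)
    print(f"{len(audit.crawlable)} crawlable links, {len(audit.script_only)} javascript-only links")
```

The usual fix is to put a real URL in the href attribute and layer any scripting on top of it, so both users and spiders can follow the link.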
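Point (4) is easy to demonstrate for yourself. The snippet below (plain Python, with made-up file names) shows what a space in a file name becomes once it is published as a URL, and why a hyphenated name survives intact.

```python
# Quick demonstration of what a space in a file name turns into in a URL.
# The file names are purely illustrative.
from urllib.parse import quote

print(quote("annual report 2003.pdf"))   # -> annual%20report%202003.pdf
print(quote("annual-report-2003.pdf"))   # -> annual-report-2003.pdf (unchanged)
```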
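And for point (7), a quick way to verify that a robots.txt file isn't blocking something important is to test a handful of key URLs against it. This sketch uses Python's standard robots.txt parser; the addresses are placeholders, not real agency pages.

```python
# Minimal sketch: confirm that robots.txt is not accidentally blocking
# important pages. Replace the placeholder URLs with your own.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("http://www.example.gov/robots.txt")
robots.read()

for url in [
    "http://www.example.gov/services/renew-license.html",
    "http://www.example.gov/publications/annual-report-2003.pdf",
]:
    allowed = robots.can_fetch("*", url)
    print(("allowed " if allowed else "BLOCKED ") + url)
```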

What we hope to do in the coming year is create a dialogue among agency webmasters and content creators to come up with best practices for optimizing our sites for search engines. We'll be offering workshops here at the Utah State Library and creating an easy-to-use, open knowledge base and code library so that we can share our discoveries and communicate.

Some of this gets technical beyond my experience, so I'll need your help. For starters, you can leave comments here with this story, or links to resources that you've found helpful. Please contact us at the Government Information Locator Service with suggestions or to let us know that you're someone we should be working with. You can also subscribe here to receive helpful news by email.

Posted by Ray_Matthews on October 01, 2003 at 10:24 AM | Comments (1)

September 25, 2003

Brown University E-Government Ratings

The newly released Urban E-Government, 2003 report from Brown University's Center for Public Policy ranks Salt Lake City 7th in the nation in their analysis of 1,933 city government websites. Salt Lake City ranked 37th in 2002.

Salt Lake City sites received high marks for their low reading level grade (1st place), the high percentage of services executable online, availability of publications and data on all sites, support of secure credit card transactions for most services, and links to security and privacy policies from nearly all pages.

Brown University's State and Federal E-Government study of 1,603 state government websites reports that the State of Utah, as a whole, fell to 17th after placing 10th last year.

Utah ranked highly in providing links to its security and privacy policies. The reviewers found that 97% of agency sites provided publications and 74% provided information in databases. The State ranked poorly, however, in reading level (grade 11.7 or 50th place), accessibility compliance (47th with only 14% of state sites in compliance), and online services (0.7 per website or 46th). The ranking difference between Utah and front-running Massachusetts is primarily because the Commonwealth provides links to their 48 online services in their state header. This is an easy fix that might take Utah Interactive all of about five minutes to put in place (hint, hint).

In terms of federal agencies, top-rated websites include FirstGov (the U.S. portal), Federal Communications Commission, Social Security Administration, Internal Revenue Service, Library of Congress, Postal Service, Dept. of Treasury, and Securities and Exchange Commission. The lowest-rated sites are the various federal circuit courts of appeals. The new Homeland Security Department scores in the lower third of federal agencies.

The evaluation paid more attention this year to online services, the handling of privacy and security, and offering disability access.

After reviewing the Center for Public Policy's evaluations, here are some simple suggestions that webmasters can follow to get our state and cities ranked higher next year:

  1. Use the Flesch-Kincaid tests that come with Word and WordPerfect to keep documents to an 8th-9th grade reading level (a sketch of the underlying formula appears after this list)
  2. Test your pages periodically with Bobby for W3C and Section 508 accessibility compliance
  3. Provide robust Privacy AND security policies (including policy on sharing personal information), and prominently link them from every page
  4. Continually evaluate and improve performances
  5. Identify your online services, list them (quantity does matter), categorize them, and make them easy to navigate to from the banner headers; for bonus points, include a few novel services
  6. Identify all the forms you have and make sure that all or most can be submitted online
  7. Support SSL and allow citizens to do financial transactions with credit cards (currently 71% for state agencies), and digital signatures (now 0%)
  8. Identify and provide easy access to publications and databases
  9. Provide foreign language features such as "En Espanol" links for all your key services and publications
  10. Lower the gross number and percentages of services that restrict access such as requiring user fees and passwords (Utah is currently in 42nd place with 29% of sites having restrictions)
  11. Go beyond email links (we rank high with 89%) with easy-to-use comment boxes and alert services to facilitate contact and public feedback from every page; currently 11% of state sites offer comment facilities, 17% offer updating, and 0% offer personalization
  12. Eliminate commercial advertising if you have any
  13. By all means, be sure you have standardized headers, a site map, and a site search
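On suggestion 1, the Flesch-Kincaid grade formula itself is simple, so you can spot-check a page outside of a word processor. The sketch below uses a crude vowel-group syllable count, so treat its output as a ballpark figure rather than the number Word or WordPerfect would report.

```python
# Rough sketch of the Flesch-Kincaid grade-level formula with a crude
# syllable estimate (count groups of consecutive vowels).
import re


def syllables(word):
    """Approximate syllable count for one word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def fk_grade(text):
    """Approximate Flesch-Kincaid grade level for a block of text."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    sylls = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (sylls / len(words)) - 15.59


if __name__ == "__main__":
    sample = "Apply online to renew your license. The office answers questions by phone."
    print(round(fk_grade(sample), 1))
```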


    In the "if you can't be 'em, learn from 'em" department, take a look at FirstGov and these five top-ranked cities:

    • Denver recognized for its clear layout and organization with easy navigation to all citizen services;
    • Charlotte with a plethora of fully executable services, a database that allows a user to access property information and demographic information for any address, a noteworthy security policy, and a comments box on every page that sends form data directly to the page author;
    • Boston noted for its uncluttered and aesthetic appearance, information organized into sections for residents, businesses and visitors, and site personalization and email updates;
    • Louisville cited for its easy navigation and compliance to Section 508 and W3C accessibility standards, Secure Sockets Layer protocol to safeguard personal information, and easy to use dropdown menu to city services; and
    • Nashville recognized for its content rich abundance of publications and databases.

    See also: State E-government press release | Urban E-government press release | PDF version of the full State E-Government report | PDF version of the full Urban E-Government report | Governing.com state report | Governing.com city report | Brown Policy Reports archives, 2001-03 | Government Computer News (GCN)

    Posted by Ray_Matthews on September 25, 2003 at 12:02 PM | Comments (2)

May 29, 2003

Utah.gov Search Queries Are Revealing

Mike Brown, the Utah State Library's programming whiz, has created a tool that reveals the 5,000 queries that visitors to Utah.gov submit each day. Mike's Query Snoop allows you to view the lists by date and by month.

Though we haven't yet created a ranking sort, it's clear that the search for "sex offenders" continues to top the list. While tax-related queries have expectedly dropped off since April, we're still finding from queries like "contact utah state tax commission" that those services are still in high demand. Poorly constructed queries such as "complain filed to utah devision of real estate" show why some people don't find what they're seeking. Queries such as "Not getting paid for work done" may suggest to an agency a topic for a needed FAQ.
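Until a proper ranking sort makes it into Query Snoop, a day's list can be tallied offline. The sketch below assumes a plain text export with one query per line, which is only a guess at the format; adjust it to however the queries are actually stored.

```python
# Minimal sketch: rank a day's queries by frequency.
# Assumes a plain text log with one query per line (hypothetical file name).
from collections import Counter

with open("queries-2003-05-29.txt", encoding="utf-8") as f:
    counts = Counter(line.strip().lower() for line in f if line.strip())

for query, n in counts.most_common(25):
    print(f"{n:5d}  {query}")
```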

Posted by Ray_Matthews on May 29, 2003 at 08:27 AM | Comments (1)