This is the second of an occasional series of interviews with local tech leaders.
Owen Ambur recently retired from the Fish and Wildlife Service. He talks about the Federal XML Work Group and his current work with AIIM's StratML Committee.
How did you come to work at the Fish & Wildlife Service?
After a 14-year stint on Capitol Hill with Congressman/Senator Abdnor, our election defeat became an opportunity for me to serve for 7 1/2 years as the Congressional liaison for an agency in whose mission I strongly believe. As chief of the Office of Legislative Services, I implemented an electronic document management system, and the Director of the agency asked me to take a special assignment in the Office of Information Resources Management (IRM) to expand the system agency-wide. After spending the first two-thirds of my career in Congressional affairs, that's how I ended up in a more technical line of work -- not because of a fascination with technology but because I needed it to do my job efficiently and effectively.
How did you get interested in XML?
My involvement in XML stems from my long-standing interest in document/records management and forms automation.
How was the xmlCoP formed?
Early in 2000 I sent a message to George Brundage of GSA highlighting two opportunities for the government to leverage the potential of XML. He posted my message on a listserv he was maintaining relating to information technology architecture. Martin Smith, who was then with the U.S. International Trade Commission (ITC), and more recently has worked at the Department of Homeland Security (DHS), suggested that I bring those ideas to the attention of the CIO Council (CIOC). I did so and Lee Holcomb, who was then CIO at National Aeronautics and Space Administration (NASA) and co-chaired what is now the CIOC's Architecture and Infrastructure Committee (AIC), commissioned Martin and me to form an ad hoc group and come back to his committee with a recommendation. That group recommended that a more formal working group be chartered under the auspices of the CIOC, and our first charter was approved in the fall of 2000.
The original message thread that led to the formation of the XML Working Group (XML WG) is available at http://xml.gov/documents/completed/genesis.htm. In September 2004, the XML WG was re-chartered as the XML Community of Practice (xmlCoP); the history of the group is available at http://xml.gov/documents/completed/history.htm. Coincidentally, our current charter expires on November 30: http://xml.gov/documents/completed/charter.htm
How did you come to co-chair the Federal XML Work Group?
In the ad hoc group that met prior to chartering of the XML WG, I was pushing for GSA and NIST to co-chair the group, in light of their missions, and Marion Royal of GSA was subsequently assigned to serve as co-chair. However, Martin and others pressed for me to accept the other slot, and it was an offer I could not refuse -- a chance to do something in which I truly believe.
What do you think the Federal XML Work Group achieved?
As I told Lee and others in the CIOC, XML was going to happen regardless of whether the CIOC did anything about it. It has taken longer than I had hoped, and we still have a long way to go to fully capitalize on its potential in an effective and well-coordinated manner on a government-wide basis. However, I do believe the fact that the XML WG/CoP was formally recognized by the CIOC, met virtually every month for six years straight, and still maintains the xml.gov site has fostered awareness, education, and the sharing of experiences and expertise. Hopefully, that has brought greater credibility to what is, to most people, a pretty esoteric subject and, in turn, has encouraged agencies to act more rapidly than might otherwise have been the case.
For example, it is somewhat ironic that the first time the xmlCoP was briefed on the Global Justice XML Data Model (GJXDM) was when Pat McCreary and Bob Greeves of the Department of Justice (DOJ) filled in at the last minute after our scheduled speaker was unable to appear due to the events of 9/11. The agenda and minutes from that meeting are available at http://xml.gov/agenda/20010919.htm and http://www.xml.gov/minutes/20010919.htm, respectively. Subsequently, the GJXDM morphed into the National Information Exchange Model (NIEM), which may be the single best success story for XML in government.
However, I cannot leave this topic without addressing what I consider to be our greatest failure: Congress's refusal to approve the President's budget request of $2.1 million for the XML registry -- despite a projected return on investment (ROI) in the range of 500 - 1400 percent. The history of that failure is documented at http://xml.gov/registries.asp. As a result, it remains far more difficult than it should be for agencies to discover and reuse XML data elements rather than reinventing them, needlessly and perhaps inconsistently.
What is the importance of the Federal Enterprise Architecture Technical Reference Model? What is its influence on the technology industry as a whole?
While most of the focus of the FEA has been placed on the Business Reference Model (BRM), on the theory that it is the "business" that is most important, I contend that it is impossible to truly understand our "business" without understanding the data (records) required to conduct it. Thus, I believe there is a reason the Data Reference Model (DRM) was the last of the FEA "models" to be drafted ... because it is the only one that truly matters.
However, from my perspective, the Technical Reference Model (TRM) is the second most important -- because unless and until the relevant technical specifications are fully supported in the IT products, components, and services used to conduct We the People's business, all the talk about "interoperability" is just that ... talk. It would be nice to think that IT vendors would "do the right thing" by coalescing around and implementing those standards. However, the reality is that they have every incentive to continue selling us proprietary stovepipe systems as long as we are stupid enough to keep wasting the taxpayers' money on them.
Can you tell us about ET.gov? Has it been a success? Do civil servants use it?
The history of the ET.gov site/process is documented at http://et.gov/history.htm. It was commissioned by the former co-chairs of the CIOC's AIC, John Gilligan, CIO of the Air Force, and Norm Lorentz, CTO at OMB, and it was declared a "success story" in the CIOC's strategic plan for FY2007-2009. See pages 13 & 14 (PDF pages 15 & 16) at
The ET.gov site has been up and running for several years. More than 100 components and specifications have been registered and are discoverable at http://et.gov/component_search.aspx as well as via IntelligenX's search service at http://etgov.i411.com/etgov/websearchservlet?toplevel=true&
About 15 of them have progressed to Stage 2 of the ET.gov process (http://et.gov/stage2.htm#CoPs), five have reached Stage 3 (http://et.gov/stage3.htm#CoPs), and three have "graduated" (http://et.gov/stage4.htm). Two of those -- PDF/A and X3D -- have been incorporated into the FEA TRM.
Considering the relatively small amount of money originally allocated by EPA (when Mark Day, EPA's Deputy CIO co-chaired the CIOC/AIC's ET Subcommittee) for development of the site and the fact that no additional funding has been provided since then, it might be fair to suggest the ET.gov site/process has been a success. However, in truth, the jury is still very much out on the questions of whether:
a) Federal agencies really do want to collaborate more efficiently and effectively to evaluate, demonstrate, prove, and implement emerging technologies, and if so,
b) they want to use the ET.gov site/process or some other as-yet-undetermined means to do so.
You once quoted one of your fellow civil servants as saying "We can't deal with vendors coming at us with intergalactic solutions." What should vendors know about approaching the federal government with an "intergalactic solution?"
Norm Lorentz, CTO at OMB, is the one who made that statement, when he and John Gilligan asked the ET Subcommittee to develop the ET.gov site and process. Now that he has gone over to the "dark side" again, Norm might be better qualified to answer your question than I. However, for my part, I must confess that it still seems to me that too many folks are too easily impressed with large, slick, fancy, so-called "solutions" that are too complex for anyone, including their proprietors, to fully understand, much less effectively support.
Although Frank Raines' name has been sullied in the intervening years, his name was affixed to some very good guidance given to Federal agencies more than a decade ago when he was the Director at OMB. My favorite among the eight points that became known as "Raines' Rules" was number seven, which directed agencies to implement IT in "phased, successive chunks as narrow in scope and brief in duration as practicable, each of which solves a specific part of an overall mission problem and delivers a measurable net benefit independent of future chunks."
Unfortunately, it still seems to me that it is possible to fool too many of the people too often and, thus, government agencies continue to waste far too much of the taxpayers' money on "intergalactic solutions." However, I remain hopeful that such foolishness may not continue indefinitely, if for no other reason than it is becoming clearer and clearer that "change is coming" ... because we can no longer afford "business as usual."
Small vendors tell me all the time how they cooperated with the federal government to develop an idea, often without compensation, only to see it handed off to one of the very large, well-known contractors. How can they avoid that unpleasant experience?
I don't think I'm qualified to answer this question, although I will say that I do believe the Federal procurement process is badly broken. I believe the process would be vastly improved by greater openness and transparency, in contrast to the current, highly centralized and secretive process. Perhaps the best I can suggest is that:
a) it doesn't cost anything to use the ET.gov site to identify emerging technology components, specifications, and services that may be of interest to .gov agencies, and
b) if someone uses the ET.gov process to identify something for which another vendor is subsequently paid to do work for Uncle Sam, at least the record would be clear as to who proposed it first and the government may feel some additional obligation to justify paying someone else to carry it out.
What are some of the things you would like to see the federal government do with the Web and RSS to make government more transparent and citizen centric?
First of all, as suggested in the EEIRS report (http://www.cio.gov/documents/EEIRS_RFI_Response_Analysis.pdf), agencies should post all of their public records on their Web sites so that they can be indexed by the search engines.
Second, they should specify XML schemas for all of the records, and they should post all of those schemas on their Web sites so that XML registry services can be built from the bottom up.
Third, consistent with the E-FOIA amendments, agencies should begin to create and maintain their records in XML format so that they can easily be made available in whatever formats they may be requested.
Fourth, agencies should participate in the finalization and use of the XML schema (XSD) for the FEA DRM: http://xml.gov/draft/drm20060105.xsd. In the FEA PMO's assessment of enterprise architecture programs, agency scores on the DRM performance element should be based upon the degree to which they have documented their data collections on their Web sites in conformance with the XSD for the DRM.
Fifth, as directed by subsection 202(b)(4) of the eGov Act, agencies should:
a) use AIIM's emerging StratML standard to explicitly identify the stakeholders for each of their strategic objectives,
b) embed in each of their records one or more metatags identifying the strategic objective(s) it supports and, thus,
c) enable the discovery of all of their records based upon the stakeholders to which they apply.
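The tagging-and-discovery idea in that fifth point can be sketched in a few lines. This is a hypothetical illustration only -- the element and attribute names (`Record`, `objectiveRef`, `Objective`, `Stakeholder`) are invented for the example and are not drawn from StratML or any agency schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical agency records, each carrying a metatag (here, an
# attribute) identifying the strategic objective it supports.
records_xml = """
<Records>
  <Record id="r1" objectiveRef="obj-1">Annual habitat survey</Record>
  <Record id="r2" objectiveRef="obj-2">Grant disbursement report</Record>
  <Record id="r3" objectiveRef="obj-1">Wetland restoration memo</Record>
</Records>
"""

# A hypothetical plan fragment linking each objective to its stakeholders.
plan_xml = """
<Plan>
  <Objective id="obj-1"><Stakeholder>Conservation groups</Stakeholder></Objective>
  <Objective id="obj-2"><Stakeholder>Grant recipients</Stakeholder></Objective>
</Plan>
"""

def records_for_stakeholder(stakeholder, plan_xml, records_xml):
    """Discover records via the stakeholders of the objectives they support."""
    plan = ET.fromstring(plan_xml)
    # Map objective id -> set of stakeholder names.
    objectives = {
        obj.get("id"): {s.text for s in obj.findall("Stakeholder")}
        for obj in plan.findall("Objective")
    }
    records = ET.fromstring(records_xml)
    return [
        rec.get("id")
        for rec in records.findall("Record")
        if stakeholder in objectives.get(rec.get("objectiveRef"), set())
    ]

print(records_for_stakeholder("Conservation groups", plan_xml, records_xml))
# ['r1', 'r3']
```

The point is simply that once the objective-to-stakeholder link exists in the plan and each record points at an objective, stakeholder-based discovery is a trivial join.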
What is StratML?
Strategy Markup Language (StratML) is an XML vocabulary and schema containing the elements that are common not only to the plans that U.S. federal agencies are required to compile and maintain under the Government Performance and Results Act (GPRA) but also to the strategic plans of all organizations.
The prospective purposes of the emerging StratML standard are outlined at http://xml.gov/stratml/index.htm#DefinitionPurposes. Under the auspices of AIIM, we aim to establish it as an international voluntary consensus standard for potential use by all organizations worldwide, thus enabling population of the *Strategic* Semantic Web.
In service of the notions of citizen-centricity and the government as a single "enterprise," as well as conformance with OMB Circular A-119, it would not be unreasonable to think that OMB might require U.S. federal agencies to post their GPRA plans on their Web sites in StratML format.
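To give a rough sense of the kind of document StratML envisions, here is a minimal sketch parsed with Python's standard library. The element names below (`StrategicPlan`, `Goal`, `Objective`, `Stakeholder`) are invented for this illustration and should not be taken as the committee's actual schema:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical strategic plan. Element names are invented
# for illustration and do not represent the actual StratML schema.
plan_xml = """
<StrategicPlan>
  <Name>Example Agency Strategic Plan</Name>
  <Goal>
    <Name>Conserve habitat</Name>
    <Objective>
      <Name>Restore 500 acres of wetland by 2010</Name>
      <Stakeholder>Conservation groups</Stakeholder>
      <Stakeholder>Local landowners</Stakeholder>
    </Objective>
  </Goal>
</StrategicPlan>
"""

plan = ET.fromstring(plan_xml)

# Because the plan is structured XML rather than prose locked in a PDF,
# questions like "which stakeholders does each objective serve?"
# become simple queries.
for objective in plan.iter("Objective"):
    name = objective.findtext("Name")
    stakeholders = [s.text for s in objective.findall("Stakeholder")]
    print(name, "->", stakeholders)
```

That machine-readability, across every organization's plan, is what would make a "Strategic Semantic Web" queryable from the bottom up.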
How did AIIM come to be involved?
I originally approached AIIM in December 2003 because:
a) I had been a member of AIIM since 1995, and
b) AIIM was then touting itself as the "strategic content management" association and I wanted to help them make that concept real, rather than merely a marketing slogan.
The full history of StratML is documented at
Why should private businesses that don't have significant federal work care about StratML?
Anyone who cares about the concept of "strategic alignment" has a stake in the success and widespread usage of StratML. There is nothing unique or "inherently governmental" about strategic planning. Any organization that wants to be effective must have a clear understanding about what it aims to do and how it will "align" its resources to achieve its objectives.
In times like these, it is more important than ever not only to use our own resources wisely but also to partner more efficiently and effectively with others with whom we share common objectives. That is the essence of StratML, whose purposes are more fully outlined at
Where do you see the XML industry going in the near future? Are we close to the semantic web?
There is little doubt that innovation will continue apace in the XML community. Just as the simplicity of HTML, HTTP, and IP enabled the explosion of the Web, the relative ease with which XML enables the sharing of data means that we haven't seen anything yet by comparison to what we will soon experience. Although the financial realities that have recently become too compelling to ignore will place increasing pressure on IT budgets, those same realities increase the need to apply IT more efficiently and effectively. Indeed, more and better usage of technology -- particularly information technology -- is the *only* way we can hope to avoid repeating the mistakes of the past, much less continue progressing into the future.
Regarding the semantic web, I'm not sure that the business case has been sufficiently well-established, much less that most people have any idea yet as to how they might contribute to and benefit from it. However, I do hope that the emerging StratML standard will foster the development of a worldwide Web of organizations and individuals who choose to lead mission/goal-directed lives and seek to pursue those goals in collaboration with others who share their objectives.