data sharing
02.03.2013
Budget, congress, criminal justice, Data, data sharing, Information sharing, justice, law enforcement, Law enforcement information sharing, leadership, LEIS, N-DEx, NIEM
If you want law enforcement agencies to share information, go to the source and help the Chiefs and Sheriffs push their data into the FBI’s National Data Exchange (N-DEx). Trying to impose information sharing through unfunded standards mandates will not work.
As someone who has been in the standards business since 1995, history has proven to me that:
- The business need must drive standards; standards can NEVER drive the business; and
- Trying to SELL the business on standards is a losing strategy.
Hi Congressman Reichert,
You won’t remember me, but a long time ago we were in meetings together in Seattle with the likes of John McKay, Dave Brandt, Scott Jacobs, Dale Watson, and others working on building the Law Enforcement Information Exchange (LInX); I was the technical guy on the project, working with Chief Pat Lee and our very dear lost friend Julie Fisher (may she rest-in-peace, I sure miss her).
A hell of a lot of water has gone under the bridge since then; it’s been nearly TWELVE YEARS. Looking back over this time, we have had so many bills, laws, strategies, policies, papers, speeches, conferences, proclamations, and other assorted attempts to prod law enforcement data loose from the nearly 18,000 agencies across our country. While we are far better off than we were back then, I think we can agree that we still have a long way to go.
Where we differ, I’m afraid, is in the approach to get there. A few days ago, you proposed legislation, the Department of Justice Global Advisory Committee Authorization Act of 2013, as a means to improve information sharing among law enforcement agencies. Do we really believe another “stick” will work to get agencies to share information? Do we really believe it’s a technology or data standards problem that’s preventing law enforcement data from being shared? As a technologist for 34 years, and someone who has been involved in law enforcement information sharing since the Gateway Project in St. Louis, MO in 1999, I can tell you it is neither.
While I applaud the work of the GAC, and I have many colleagues who participate in its work, I’m afraid that holding more meetings about information sharing, developing more standards, approving more legislation, and printing more paper will NOT get us to the level of information sharing we all want.
Instead, I want to propose a solution aimed at capturing the commitment of the people who can actually make law enforcement information sharing happen, virtually overnight (metaphorically speaking): the great men and women who lead our police and sheriffs’ departments across America.
Now, to be fair, many of these agencies are already contributing their records to a system I am sure you are familiar with called the National Data Exchange (N-DEx). Built by the FBI CJIS Division, this system has matured into a pretty respectable platform, not only for sharing law enforcement information, but also for helping cops and analysts do their respective investigative and analytic work.
Now, in case you are wondering, I do not own stock in any of the companies that built N-DEx, nor has the FBI signed me up as a paid informant to market N-DEx. I write to you of my own volition, as a result of my nearly six years of volunteer work as a member of the International Association of Chiefs of Police (IACP) Criminal Justice Information Systems (CJIS) Committee.
About two years ago I volunteered to lead a small sub-group of the committee whose members have built, led, or managed municipal, state, federal, or regional information sharing systems. Our charge was (and still is) to help CJIS take a look under the hood of N-DEx to see what’s in there (data-wise) and to help figure out what needs to be done to make it a more effective tool to help cops across America catch more criminals, and maybe, just maybe, even prevent criminals from acting in the first place.
While our work is far from done, I can tell you that one thing we need is more data – as you well know, be it N-DEx, LInX, RAIN, or any other information sharing system, it is only as good as the data that’s put into it.
Believe it or not, we already have the data standards in place to get the data into N-DEx. CJIS has developed two Information Exchange Package Documentation (IEPD) specifications that tell agencies exactly what to do and how to format and package their data so it can get to N-DEx. Additionally, CJIS has an extensive team ready to assist, and my colleagues over at the IJIS Institute hold BJA-sponsored NIEM training sessions to help agencies along the way. (A rough sketch of what a packaged record could look like follows the list below.)
These two IEPDs can help law enforcement agencies today to share the following law enforcement records:
- Service Call
- Incident
- Arrest
- Missing Person
- Warrant Investigation
- Booking
- Holding
- Incarceration
- Pre-Trial Investigation
- Pre-Sentence Investigation
- Supervised Release
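To make that concrete, here is a minimal sketch, in Python, of what packaging one of these records as NIEM-style XML might look like. To be clear, this is not the actual N-DEx IEPD: the element names and namespace below are illustrative placeholders, and the real IEPDs spell out the exact schemas, namespaces, and required content.

```python
# Illustrative only: packaging an incident record as NIEM-style XML.
# The namespace and element names are placeholders; the real N-DEx
# IEPDs define the exact schemas and required fields.
import xml.etree.ElementTree as ET

NC = "http://niem.gov/niem/niem-core/2.0"  # placeholder namespace URI
ET.register_namespace("nc", NC)

def build_incident(incident_id: str, activity_date: str, narrative: str) -> bytes:
    """Wrap basic incident fields in namespaced XML elements."""
    root = ET.Element(f"{{{NC}}}Incident")
    ET.SubElement(root, f"{{{NC}}}ActivityIdentification").text = incident_id
    ET.SubElement(root, f"{{{NC}}}ActivityDate").text = activity_date
    ET.SubElement(root, f"{{{NC}}}IncidentNarrativeText").text = narrative
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

print(build_incident("2013-000123", "2013-02-14", "Report of a burglary...").decode())
```

The point is that the “how do I format it” question already has an answer; what is missing is the funding and support to build and operate feeds like this across nearly 18,000 agencies.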
So what’s the holdup? Speaking only for myself, and I will be very straight with you: I believe the root cause for not getting more law enforcement data into N-DEx is the current piecemeal, politically charged, hit-or-miss grant funding process, which the Act you propose, if passed, will burden even further (see page 3, lines 17-25 and page 4, lines 1-6).
Instead, I ask that you please answer the following question…
If law enforcement information sharing is important enough to push through a Public Act, where is the nationwide project, with funding, to get all shareable law enforcement data loaded into the one system that would give ALL law enforcement officers and analysts access to the collective knowledge of the nearly 18,000 law enforcement agencies?
The immediate answer might be “we already have one: N-DEx.” However, N-DEx is only a piece of the answer; it’s, as they say, “one hand clapping.” And in all fairness to my friends and colleagues at the FBI CJIS Division, that program was only charged and funded to build the N-DEx bucket; it was never funded to actually go get the data to fill the bucket.
The strategy, for whatever reason back then, was relegated to a “build it and they will come” approach that, IMHO, has not worked very well so far and may take another 5-10 years to work. I should also note that the bucket isn’t totally empty; quite a number of agencies and regional projects, like LInX, have stepped up and are helping to fill it. However, if we want to expedite filling the bucket, focusing on mandating more standards is not the answer.
What I submit is the “other hand clapping”: we need to shift focus away from policy, standards, and technology, and establish a funded nationwide project that offers a menu of choices and support packages to the Chiefs and Sheriffs, enabling them to start sending as many of their shareable records as possible to N-DEx.
Some of the options/support packages could include:
- Provide direct funding to agencies and regional information sharing systems to develop N-DEx-conformant data feeds;
- Grant direct funding to RMS and CAD system providers to develop N-DEx-conformant data feeds from their software, with the stipulation that they must offer the capability at no additional cost to agencies that use their products;
- Establish a law enforcement data mapping assistance center, either bolted onto the IJIS NIEM Help Desk, as an extension of the NLETS menu of services, or through funding support at an existing information sharing project like the Law Enforcement Technology, Training, & Research Center, which works in partnership with the University of Central Florida.
At the end of the day, we all know that the safety and effectiveness of a law enforcement officer is greatly affected by the information he or she has at their fingertips when responding to a call.
Do you really want to leave it to chance that an officer’s life is taken, or a criminal or terrorist is let go, because his or her agency wasn’t “lucky enough” to win the grant lottery that year?
So, let’s empower the single most powerful force that can make sure the information is available – the Sheriff or Chief leading that agency. Let’s stop with the unfunded mandates, laws, standards, studies, point papers, etc., and let’s finally put a project in-place with the funding necessary to make it happen.
v/r
Chuck Georgo,
Executive Director
NOWHERETOHIDE.ORG
chuck@nowheretohide.org
05.04.2012
Data, data sharing, JIEM, Law enforcement information sharing, LEIS, NIEM
I posted this response to a question on the LinkedIn NIEM group where someone asked about the slow rate of NIEM adoption; I thought I would cross-post my response here.
What’s standing in the way of NIEM adoption?
- It’s about leadership.
- It’s about reducing complexity.
- It’s about getting the word out.
- It’s about opening up proprietary protocols.
- It’s about conformance.
- It’s even about standards.
What’s really standing in the way? Two things…a) utility and b) a market for it.
I think it would also be wise for us to take a few pages out of the eCommerce, EDI, ebXML world (and honestly, the internet as a whole). EDI became a standard because large companies said, “If you want to do business with me, then you will stop faxing me POs and start sending them to me in this new thing called EDI.” When XML appeared on the scene, the same companies converted their information exchanges to ebXML, and vendors and service providers followed suit. One might say that if it weren’t for the EDI-to-ebXML movement, we might not even be talking about GJXDM or NIEM today; ebXML was groundbreaking in its day.
So what’s in the way? I’ll look at this in terms of the two things I mentioned above:
- Utility of NIEM – The “Technology Acceptance Model” tells us that for increased adoption of a technology, it must be “useful and easy to use.” Today, however, we are having difficulty getting people to see the utility of NIEM, and it certainly has not proven itself to be easy to use either. Now, to be fair, NIEM started off life as a dictionary of common data elements (words, if you will) with working views of syntax and semantics. Then we have IEPDs. These are like sentences strung together by different authors, and the difficulty is that we don’t have a good way to know how well those sentences are strung together, whether we can assemble those sentences into comprehensible paragraphs, or even what “stories” (and where in those stories) those sentences might belong. In other words, I don’t think NIEM is coupled tightly enough to the business processes of Justice and Public Safety agencies. To become more useful, we must dust off JIEM, revalidate the inventory of Justice exchanges, and specifically tie them to NIEM IEPDs. And while we do this, we must clean up the inventory of IEPDs, remove ones that are troublesome, and reference those IEPDs back to the Justice business processes and exchanges in JIEM.
- A Market for NIEM – Unfortunately, reuse has NOT always resulted in cost savings. There are a number of examples where agencies have had bad experiences implementing NIEM, whether because of the integrator’s lack of skill, poor IEPD design, poor project planning, immature integration tools, or good old politics; so saying an agency will save money by using NIEM is not a strong position right now. To resolve this issue (after we join NIEM and JIEM), I think we must attack the problem at the root: the technology acquisition process. Stop with the buttons and bumper stickers and neat shirts (I have one too). What we need to do is drive NIEM use through RFPs and contracting processes. Of course, we have to first clean up the clearinghouse, but then we must help agencies craft RFP language they can use to call for the use of NIEM and “NIEM-enabled” web services to effect the information exchanges called out in the business processes to be supported by the new technology acquisition. While some vendors have demonstrated leadership in this area, the real driver (in a free market economy) is the contracting process; vendors will invest in their ability to adopt and integrate NIEM if it’s in their financial interest to do so. They do have payrolls to meet and investors to keep happy. A shining light in this quest is the effort by IJIS and others who are working hard to establish a “standards-like” process to clean up IEPDs and to help vendors demonstrate conformance (or compliance) to those standards for their products.
Your comments/thoughts are welcomed….r/Chuck
02.06.2011
computer security, cyber security, data sharing, Information sharing, law enforcement, Law enforcement information sharing, LEIS, security, security threats, Uncategorized
So it’s no great revelation that public safety has benefited greatly from public-private partnerships, and I’m cool with that, especially when we are dealing with technology that saves lives. However, a press release hit my email inbox today that made me think about the risks to security and privacy when we implement innovative technologies.
Before I get into the story, let me be v-e-r-y clear: I am NOT here to debate the effectiveness or morality of red-light/speed enforcement systems, nor am I here to cast aspersions on any of the organizations involved in the press release. This blog posting is strictly about using the Gatso press release to emphasize a point about security and privacy: when we engage in innovative law enforcement technology solutions, we need to take extra care to adequately address the security and privacy of personally identifiable information.
Here’s the press release from Gatso-USA:
GATSO USA Forms Unique, Strategic Partnership with Nlets
Earlier this month, GATSO USA was approved as a strategic partner by the Board of Directors of the National Law Enforcement Telecommunications System (Nlets). Nlets is… [general narrative about NLETS deleted]. The approval of GATSO is an exciting first for the photo-enforcement industry.
Nlets will be hosting GATSO’s back office and server operations within the Nlets infrastructure. GATSO will have access to registered owner information for all 50 states plus additional provinces in Canada. The strategic relationship has been described as a “win-win” for both organizations.
From Nlets’ perspective, there are key benefits to providing GATSO with hosted service. Most importantly, it virtually guarantees personal data security. Due to this extra step of storing personal data behind the DMV walls of Nlets, the public can be assured that security breaches — such as the recent incident with PlayStation users — are avoided.
From GATSO’s perspective, hosting the system with Nlets will provide a ruggedized, robust connection to comprehensive registered owner information, without the security issues faced by other vendors in this industry. Nlets was created over 40 years ago… [more about NLETS deleted].
The main points I took away from this press release were:
- Nlets is going to host the back-end server technology that GATSO needs to look up vehicle registration information of red-light runners;
- Gatso is going to have access to vehicle registration information for all vehicles/owners in ALL 50 states in the U.S. and (some) provinces in Canada; and
- Because it’s behind Nlets firewalls, security is not an issue.
Again, please don’t call me a party-pooper, as I am a huge advocate for finding innovative ways to use technology to make law enforcement’s job easier. However, I am also painfully aware (as many of you are) of the many security- and privacy-related missteps that have happened over the last few years with technology efforts that meant well but didn’t do enough to cover the bases on security and privacy. These efforts either had accidental leakage of personal information, left holes in their security posture that enabled direct attacks, or created opportunities for nefarious evil-doers with legitimate access to use that access to sensitive information for other-than-honorable purposes.
After I read the press release, I thought it would be a good case study for the topic of this blog: it involved innovative use of technology for law enforcement, a pseudo-government agency (Nlets), two foreign-owned private companies, and LOTS of PII sharing. Some might even say it had all the makings of a Will Smith movie. 🙂
To help set the stage, here are a few facts I found online:
- Gatso-USA is a foreign company, registered in New York State, operating out of Delaware; its parent company is a Dutch company, GATSOmeter BV.
- Gatso does not appear to vet all of the red-light/speed violations itself; it uses another company, Redflex Traffic Systems, to help with that (Redflex is not mentioned in the press release).
- Redflex seems to be a U.S. company, but it has a (foreign) parent company based in South Melbourne, Australia.
- Finally, there are no sworn officers involved in violation processing. Red-light/speed enforcement cameras are not operated by law enforcement agencies; they outsource that to Gatso, who installs and operates the systems for local jurisdictions (with Redflex) for free (Gatso/Redflex is given a piece of the fine for each violation).
There are no real surprises here either; many foreign companies provide good law enforcement technologies to jurisdictions across the U.S., and outsourcing traffic violations is not new. BUT what is new here is that a sort-of-government agency (Nlets) has now provided two civilian companies (with foreign connections) access to Personally Identifiable Information (PII), in the form of vehicle registrations, for the entire U.S. and parts of Canada. Should we be worried?
Maybe; maybe not. Here are nine questions I would ask:
- Personnel Security: Will Nlets have a documented process to vet the U.S. and overseas Gatso and Redflex staff who will have access to this information through direct or VPN access to Nlets systems?
- Data Security: Will Gatso or Redflex maintain working/test copies of any of the registration information outside of the Nlets firewall? If so, are there documented ways to make sure this information is protected outside the firewall?
- Data Access: Will Gatso/Redflex have access to the entire registration record, or will access be limited to certain fields? (See the sketch following this list.)
- Code Security: Will any of the code development or code maintenance be done overseas in the Netherlands or Australia? If so, will all developers be vetted?
- Network Security: Will overseas developers/site support staff have access to the data behind Nlets firewalls? What extra precautions will be taken to protect Nlets systems/networks from abuse/attack?
- Code Security: Will Nlets conduct any security testing on code loaded on the servers behind their firewalls?
- Stakeholder Support: Have all 50 U.S. states, and the provinces in Canada, been made aware of this new information sharing relationship? Do they understand all of the nuances of the relationship? And are they satisfied that their constituents’ personal information will be protected?
- Audit/Logging: Will all queries of vehicle registration information be logged? Is someone checking the logs? How will Nlets know if abuses of authorized access are taking place?
- Public Acceptance: How do states inform their constituents that their personal vehicle registration information is being made available to a foreign-owned company? Will they care?
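On the Data Access and Audit/Logging questions, here is a minimal sketch of the kind of field-level filtering and query logging I would hope to see. Every field, name, and value below is hypothetical; a real control would be enforced inside the Nlets infrastructure itself.

```python
# Sketch: limit a vendor's view of a registration record to approved
# fields and log every query. All names here are hypothetical.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("registration.audit")

VENDOR_ALLOWED_FIELDS = {"plate_number", "state", "owner_name", "mailing_address"}

def vendor_query(requestor: str, record: dict) -> dict:
    """Return only the fields a vendor is approved to see, and log the access."""
    audit_log.info("query by=%s plate=%s at=%s", requestor,
                   record.get("plate_number"),
                   datetime.now(timezone.utc).isoformat())
    return {k: v for k, v in record.items() if k in VENDOR_ALLOWED_FIELDS}

full_record = {
    "plate_number": "ABC1234", "state": "MD", "owner_name": "Jane Doe",
    "mailing_address": "123 Main St", "ssn": "xxx-xx-xxxx", "dob": "1970-01-01",
}
print(vendor_query("gatso-ops-01", full_record))  # ssn and dob filtered out
```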
How these questions are answered will determine whether or not we should worry…
Did I miss any other important questions?
Beyond this particular press release and blog posting, I suggest that you consider asking these kinds of questions whenever your agency is considering opening/connecting its data systems to outside organizations or private companies; it may just prevent your agency from becoming a headline on tonight’s news, like St. Louis: “St. Louis Police Department computer hacked in cyber-attack.”
The bottom line is that whenever you take advantage of opportunities to apply innovative technologies to public safety, make sure that you cover ALL the bases to protect your sensitive data and PII from leakage, direct attacks, or misuse and abuse.
As always, your thoughts and comments are welcome.
r/Chuck
28.01.2011
data sharing, Information sharing, Law enforcement information sharing, LEIS
Almost two years ago, I responded to a blog posting by Jeff Jonas entitled “Nation At Risk: Policy Makers Need Better Information to Protect the Country.” After a recent discussion about law enforcement information sharing with a colleague, I thought it might be worthwhile to re-run my response here. Read the posting below and let me know what you think…r/Chuck
March 17, 2009
Hi Jeff,
With sincere apologies to Sean Connery, I am dismayed that people are still bringing a knife to an information sharing gun fight. The importance of information sharing, data discoverability, security protections, metrics and incentives, and empowerment has been documented many times over since I became involved in information sharing in 1999, and all that documentation has proved to be of little value in making information sharing happen.
I believe a significant reason for this is that information sharing has been seen as the “main thing.” Information sharing should NEVER be seen as the main thing; it is simply a means to an end. I have never forgotten what Scott McNealy of Sun Microsystems said—“The main thing is to keep the main thing the main thing.” And, the main thing for government is safe streets, clean air and water, a strong economy, etc…NOT information sharing.
The “guns” that we need to bring to the information sharing table are simply engaged executive leadership and accountability for mission results.
Of the many significant information sharing projects around the country that I have been a part of, I can tell you that the most important ingredient for successful information sharing is: “An agency executive who actively communicates an operational imperative for mission success and then holds their managers accountable for using information sharing as a critical enabler for achieving desired mission results.” [I have a few blog posts on the subject at http://www.nowheretohide.org/wordpress]
While I agree that good security, good technology, good project management, good metrics and the like are necessary, none of this will matter if the need for information sharing is relegated two or three levels down the organization chart or is just seen as an edict from above—federal, state, and municipal agencies are already choking on multiple (and often conflicting and unfunded) mandates.
With my apologies to our President, the PM-ISE, and the Markle Foundation there is nothing more they can print on a sheet of paper to make information sharing happen—hundreds of executive orders, national strategies, task force reports, and security policies have been published—what more could they possibly say?
I believe it now comes down to the individual will of executive leadership in those federal, state, and municipal agencies that hold the information that should be made shareable, and their capacity to make it happen within their respective agencies. And that, Jeff, is the one area where I do believe President Obama and our Congress can help: by simply ensuring that the people they choose to lead those agencies a) truly embody the will, character, and leadership qualities to achieve the mission and b) understand the value that information sharing brings to help make that happen.
r/Chuck Georgo
chuck@nowheretohide.org
30.01.2010
data sharing, Information sharing, Law enforcement information sharing, privacy, security
The Ponemon Institute, considered the pre-eminent research center dedicated to privacy, data protection, and information security policy, released its 2009 “Cost of a Data Breach” study on January 29, 2010.
In the report, they published the results of their fifth annual study on the costs of data breaches for U.S.-based companies. They surveyed 45 companies representing 15 industry sectors; significant contributors were financial, retail, services, and healthcare companies.
Numbers-wise, the companies they interviewed lost between 5,000 and 101,000 records, at a cost range between $750,000 and $31 million.
What was really interesting was that the average per-record cost of the loss was determined to be $204.00. And how many records does your law enforcement/public safety agency hold?
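Out of curiosity, here is a quick back-of-envelope sketch using that per-record figure; the record counts below are made-up examples for illustration, not agency data.

```python
# Back-of-envelope breach-cost estimate using the Ponemon 2009 figure.
PER_RECORD_COST = 204.00  # 2009 average, direct plus indirect costs

for records_held in (5_000, 50_000, 101_000):
    estimate = records_held * PER_RECORD_COST
    print(f"{records_held:>7,} records -> roughly ${estimate:,.0f} in exposure")
# 5,000 records comes to roughly $1,020,000; 101,000 to roughly $20,604,000.
```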
Some factors they considered in computing the cost of the breach included:
- Direct costs – communications costs, investigations and forensics costs and legal costs
- Indirect costs – lost business, public relations, and new customer acquisition costs
The report also lists a number of causes for the data breaches, such as:
- 82% of all breaches involved organizations that had experienced more than one data breach
- 42% of all breaches studied involved errors made by a third party
- 36% of all breaches studied involved lost, misplaced or stolen laptops or other mobile computing devices
- 24% of all breaches studied involved some sort of criminal or other malicious attack or act (as opposed to mere negligence).
You can download the full report here: http://www.encryptionreports.com/download/Ponemon_COB_2009_US.pdf
Thoughts and comments welcomed…r/Chuck
02.01.2010
data sharing, Open Government, privacy, Processes, security, transparency
Following up on my comments and thoughts about the Open Government Directive and the Data.gov effort, I just posted five ideas on the “Evolving Data.gov with You” website and thought I would cross-post them on my blog as well…enjoy! r/Chuck
1. Funding – Data.gov cannot be another unfunded federal mandate
Federal agencies are already trying their best to respond to a stream of unfunded mandates. Requiring federal agencies to a) expose their raw data as a service and b) collect, analyze, and respond to public comments requires resources. The requirement to make data accessible to (through) Data.gov should be formally established as a component of one of the Federal strategic planning and performance management frameworks (GPRA, OMB PART, PMA), and each agency should be funded (resourced) to help ensure agency commitment to the Data.gov effort. Without direct linkage to a planning framework and an allocation of dedicated resources, the success of Data.gov will vary considerably across the federal government.
2. Strategy – Revise CONOP to address the value to American citizens
As currently written, the CONOP only addresses internal activities (means) and doesn’t identify the outcomes (ends) that would result from successful implementation of Data.gov. In paragraph 1, the CONOP states “Data.gov is a flagship Administration initiative intended to allow the public to easily find, access, understand, and use data that are generated by the Federal government,” yet there is no discussion of “what data” the “public” wants or needs to know about.
The examples given in the document are anecdotal at best and (in my opinion) do not reflect what the average citizen will want to see. All apologies to Aneesh Chopra and Vivek Kundra, but I do not believe (as they spoke in the December 8th webcast) that citizens really care much about things like average airline delay times, visa application wait times, or who visited the White House yesterday.
In paragraph 1.3, the CONOP states “An important value proposition of Data.gov is that it allows members of the public to leverage Federal data for robust discovery of information, knowledge and innovation,” yet these terms are not defined. What are they to mean to the average citizen (public)? I would suggest the Data.gov effort begin with a dialogue about the ‘public’ it envisions using the data feeds on Data.gov; a few questions I would recommend they ask include:
- What issues about federal agency performance are important to them?
- What specific questions do they have about those issues?
- In what format(s) would they like to see the data?
I would also suggest stratifying the “public” into the different categories of potential users, for example:
- General taxpayer public, non-government employee
- Government employee seeking data to do their job
- Government agency with oversight responsibility
- Commercial/non-profit organization providing voluntary oversight
- Press, news media, blogs, and mash-ups using data to generate ‘buzz’
3. Key Partnerships – Engage Congress to participate in Data.gov
To some, Data.gov can be viewed as an end-run around the many congressional committees who have official responsibility for oversight of federal agency performance. Aside from general concepts of government transparency, Data.gov could (should) be a very valuable resource to our legislators.
Towards that end, I recommend that Data.gov open a dialogue with Congress to help ensure that Data.gov addresses the data needs of these oversight committees so that Senators and Congressmen alike can make better informed decisions that ultimately affect agency responsibilities, staffing, performance expectations, and funding.
4. Data Quality – Need process for assuring ‘good data’ on Data.gov
On page 9 of the CONOP, the example of Forbes’ use of Federal data to develop the list of “America’s Safest Cities” brings to light a significant risk associated with providing ‘raw data’ for public consumption. As you are aware, much of the crime data used for that survey is drawn from the Uniform Crime Reporting effort of the FBI.
As self-reported on the “Crime in the United States” website, “Figures used in this Report are submitted voluntarily by law enforcement agencies throughout the country. Individuals using these tabulations are cautioned against drawing conclusions by making direct comparisons between cities. Comparisons lead to simplistic and/or incomplete analyses that often create misleading perceptions adversely affecting communities and their residents.”
Because Data.gov seeks to make raw data available to a broad set of potential users, how will Data.gov address the issue of data quality within the feeds it provides? Currently, federal agency Annual Performance Reports required under the Government Performance and Results Act (GPRA) of 1993 require some assurance of the accuracy of the data reported; will there be a similar process for federal agency data made accessible through Data.gov? If not, what measures will be put in place to ensure that conclusions drawn from the Data.gov data sources reflect the risks associated with ‘raw’ data? And how will we know that the data made available through Data.gov is accurate and up-to-date?
5. Measuring success of Data.gov – a suggested (simple) framework
The OMB Open Government Directive published on December 8, 2009 includes what are (in my opinion) some undefined terms and very unrealistic expectations and deadlines for federal agency compliance with the directive. It also did not include any method for assessing progress towards the spirit and intent of the stated objectives.
I would like to offer a simple framework that the Data.gov effort can use to work (collaboratively) with federal agencies to help achieve the objectives laid out in the directive. The framework includes the following five questions:
- Are we clear about the performance questions that we want to answer with the data to be made available from each of the contributing federal agencies?
- Have we identified the availability of the desired data and have we appropriately addressed security and privacy risks or concerns related to making that data available through Data.gov?
- Do we understand the burden (level of effort) required to make each of the desired data streams available through Data.gov and is the funding available (either internally or externally) to make the effort a success?
- Do we understand how the various data consumer groups (the ‘public’) will want to see or access the data and does the infrastructure exist to make the data available in the desired format?
- Do we (Data.gov and the federal agency involved) have a documented and agreed to strategy that prepares us to digest and respond to public feedback, ideas for innovation, etc., received as a result of making data available through Data.gov?
I would recommend this framework be included in the next version of the Data.gov CONOP so as to provide a way for everyone involved to a) measure progress towards the objectives of the OMB directive and b) provide a tool for facilitating the dialogue with federal agencies and Congress that will be required to make Data.gov a success.
29.12.2009
Analysis, Data, data sharing, Open Government, transparency
I just finished commenting on Data.gov on the NIEM LinkedIn Group and thought I would share what I wrote here on my blog.
I just finished watching a rerun episode of Tough Love on VH1 and I know some of you will think this is a bit odd, but the show led me to some thoughts about how to give the Data.gov project some focus and priority.
You’re probably wondering what Data.gov has to do with eight beautiful women looking for marriage and long-lasting love, but believe it or not, the show and Data.gov have a lot in common.
In this particular episode of the show, the “boot camp” director was focusing on communication skills. He made it very clear to the ladies that communication is very important in making a good first impression with a would-be suitor. He counseled the ladies that if they wanted to make a good impression, they would need to:
- Listen carefully to what their date is telling them about what’s important to them;
- Make the conversation about “them” on first contact and avoid bragging about themselves; and
- Resist the urge to reveal too much information about their own respective private lives.
While I will avoid speaking to the validity of this counsel as it applies to love, I would like to suggest that these three rules are also quite relevant in our efforts to have a more transparent, open and collaborative government.
Along these lines, I offer the following three suggestions for Data.gov’s first (transparent, open and collaborative) date with America:
- Ask the public (and Congress) what they specifically want to see on Data.gov and the forthcoming dashboard; all apologies to Aneesh Chopra and Vivek Kundra, but I do not believe (as they spoke in the December 8th webcast) that citizens really care much about things like average airline delay times, visa application wait times, or who visited the White House yesterday. I particularly suggest they work with Congressional oversight committees to make Data.gov a tool that Congress can (and will) use.
- Make Data.gov about demonstrating the good things that Federal agencies do that directly impact the general public. It’s no surprise that most agencies do a poor job of explaining to citizens what they do. I suggest reviving the OMB Performance Assessment Rating Tool (PART) Program (which appears to have died on the vine with the new administration) and use the performance measures in the Program Results/Accountability section to better communicate the relevant value these agencies deliver to citizens.
- Focus Data.gov data sources and the desire for openness on the critical few measures and metrics that matter to the public. Avoid the urge to just “get the data posted”; not many people will care about how many kilowatt-hours of hydroelectric power the Bureau of Reclamation is counting, how many FOIA requests the Department of Justice received, or the Toxic Release Inventory for the Mariana Islands. Information sharing is most successful when it is directly relevant to the person (or agency) with whom you are sharing.
I’ll let you know if the next episode is as enlightening as this was. 😉
r/Chuck
17.12.2009
Analysis, data sharing, Information sharing, Open Government
Before you send me hate mail, let me state that I am all for Federal agencies sharing data in the spirit of open government, but we have to do it in a smart way, making sure that:
- We fully understand why we want it and are clear about what we are really asking for;
- We understand the burden involved in achieving open government and that we fund the agencies to do it right;
- We are clear about the performance questions that we want the [transparent] data to answer;
- We have an understanding for how the public will want to see/access the information; and
- We are fully prepared to digest and respond to the public feedback we receive.
After reading the 3,185 words of the Office of Management and Budget (OMB) Open Government Directive (with attachment), I am very sorry to report that, IMO, none of the five criteria (conditions) listed above have been met by the language contained in the document. From what I read:
- It would appear that no one in the approval chain asked any hard questions about the language; much of the language used is very vague and leaves a lot of room for interpretation (or misinterpretation);
- There is no mention of how agencies will be funded to build the capacity to meet the additional workload that the requirements of the memorandum are certain to cause.
- The document’s focus on “getting agency data on the web” and “soliciting (direct) public feedback” appears to be totally out of context of any other strategic management, performance assessment, or planning framework. This appears to be an end-run around other oversight committees and organizations, like Congress. Will Federal agencies be able to deal with direct feedback from hundreds or thousands of citizens? I am reminded of the old adage “be careful what you ask for”…;
- The document tells agencies to “publish information online in an open format that can be retrieved, downloaded, indexed, and searched by commonly used web search applications;” however, this can be satisfied in many ways (.txt, .csv, .doc, .pdf, .html, .xml, etc.), and some formats will make it very cumbersome for the “public” to view, analyze, and understand the data (see the sketch following this list).
- Finally, the memorandum sets what I believe to be some very unrealistic expectations from both a performance and timeline perspective. For example, how can agencies be expected to review and respond to public input from the web when these same agencies are already overwhelmed with their current day-to-day tasks?
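To illustrate the formats point above: a truly open, machine-readable format lets a citizen analyze the data with a few lines of code, while the same table published only as .pdf or .doc forces re-keying or scraping before any analysis can begin. A minimal sketch, assuming a hypothetical CSV file and column names:

```python
# A CSV feed is usable with the standard library in a few lines;
# the file name and columns here are hypothetical.
import csv

with open("agency_spending_2009.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

total = sum(float(row["amount_usd"]) for row in rows)
print(f"{len(rows)} line items, ${total:,.2f} total")
```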
Here are a couple examples to ponder:
On Page 2 – “To increase accountability, promote informed participation by the public, and create economic opportunity, each agency shall take prompt steps to expand access to information by making it available online in open formats”
- Nowhere in the memorandum are the terms “accountability” or “informed participation” defined
- What does “create economic opportunity” really mean?
- It would appear that this mandate circumvents the established management processes (OMB, GAO, Congress) for holding Federal agencies accountable for efficient and effective performance.
On Page 3 – “Each agency shall respond to public input received on its Open Government Webpage on a regular basis…Each agency with a significant pending backlog of outstanding Freedom of Information requests shall take steps to reduce any such backlog by ten percent each year.”
- What do they mean by “respond to public feedback on a regular basis”?
- All feedback? Some feedback?
- What does “regular basis” mean? Within 24 hours? Weekly? Annually?
If we really want Federal agencies to be more “open” with their data and information, we must be willing to commit the effort required to:
- Be clear about what we really want them to do;
- Give them the funding to do it right;
- Drive data openness with specific questions we want answered;
- Present the data in a way that the public can easily understand it; and
- Be ready and willing to act on the feedback we’re sure to receive.
What are your thoughts and comments on this issue?
Thanks…r/Chuck
03.08.2009
data sharing, Evaluation, Information sharing, LEIS, NIEM, Performance Measures
I just finished reading of your appointment on the Federal News Radio website. As you begin your review of the state of information sharing and the ISE, I would like to offer some thoughts as someone who has been an information sharing evangelist for nearly a decade. Here are seven points to consider:
- Resist the urge to see information sharing as an outcome. Information sharing is a means to an end, not the end itself. Each federal agency, every state and regional fusion center, and all law enforcement intelligence units should have a clear set of information requirements (questions, if you will) that information sharing and the intelligence process should work to answer; hold agencies accountable for having clear and valid requirements. This has been a common practice in the intelligence community for decades and should be a practice for all information sharing elements.
- Build clear accountability into the information sharing process. Every federal agency, fusion center, and law enforcement agency should have one person, preferably an impassioned, well-respected leader, who can ensure that their agency’s requirements are well documented and communicated horizontally across federal boundaries and vertically to local, state, and municipal agencies, and (where applicable) private sector organizations.
- Establish clear linkage of information sharing to agency operational performance measures. Just as staffing, information technology, facilities, and utilities are seen as strategic resources in a performance-based budget, information sharing must be seen as a resource to be strategically used to help an agency achieve its mission. When measuring the success of information sharing, focus on the extent to which it helped achieve agency goals–just as counting cases in law enforcement is a misleading way to judge public safety success, counting RFIs, records shared, SARs submitted is not a good way to gauge information sharing success–successful information sharing can only be measured through the extent to which it helps agencies (at all levels) achieve their operational goals.
- Discourage agencies from using stovepiped portals for information sharing. All shareable data should be available as a “service” for consumer agencies to ingest into their systems, not through a dedicated portal that users need a discrete login to access. You can read my previous “Portal-mania” blog post for more detail, but all federal agencies should be required to make their data accessible through National Information Exchange Model (NIEM)-based web services. This will enable consumer agencies to integrate multiple data streams into their workflow and will reduce the number of websites and portals analysts must access to perform their work (see the sketch following this list).
- Give the same amount of attention to what is shared and how it is shared. Over the last few years, a significant amount of effort has gone into how information is shared at the expense of understanding the depth and breadth of information actually being shared. Many regional and national information sharing efforts still only contain basic levels of information, or worse are just pointer systems that require additional human effort to gain access to the actual record. Encourage agencies to communicate to each other what specific information is being shared, and what is not being shared, and help everyone understand the consequences of their decisions.
- Encourage maximum use of NIEM and the Information Exchange Package Documentation (IEPDs) contained in its clearinghouse. NIEM has emerged as the dictionary of shareable data elements. When you string together sets of these data elements to satisfy a specific business need, an IEPD is born. The NIEM IEPD clearinghouse contains more than 150 IEPDs, many of which apply to national security, law enforcement, and public safety missions. While many federal agencies have pledged their support for NIEM, more effort is needed to ensure that they first seek to use IEPDs already contained in the clearinghouse and do not develop one-off IEPDs designed to meet very narrow applications.
- Finally, foster a culture of transparency to help communicate an appreciation of personal civil rights and civil liberties. All information sharing and intelligence operations should engage in proactive efforts to help alleviate fears that individual privacy and liberties are being violated by any of the actions those agencies take. In my September 3, 2009 blog posting, I list ten questions a fusion center director should ask of their own intelligence operations. I’d like to offer these questions as a beginning framework for any information sharing or intelligence operation. They also serve as a good framework for evaluating the extent to which information sharing and intelligence operations are in fact seriously working to do the right thing.
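To make the fourth point concrete, here is a minimal sketch of what a consumer agency ingesting records from such a service might look like. The endpoint URL, namespace, and element names are hypothetical placeholders; the actual schema for any given exchange is defined by its IEPD.

```python
# Sketch: ingesting records from a NIEM-based web service instead of
# logging into a portal. URL, namespace, and element names are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

ENDPOINT = "https://example-agency.gov/services/incidents"  # hypothetical

def fetch_incidents(since: str) -> list[dict]:
    """Pull new incident records and map them into the local workflow."""
    with urllib.request.urlopen(f"{ENDPOINT}?modifiedSince={since}") as resp:
        root = ET.fromstring(resp.read())
    ns = {"nc": "http://niem.gov/niem/niem-core/2.0"}
    return [
        {"id": inc.findtext("nc:ActivityIdentification", namespaces=ns),
         "date": inc.findtext("nc:ActivityDate", namespaces=ns)}
        for inc in root.findall("nc:Incident", ns)
    ]
```

Because the data arrives as a service, each consumer can merge it into its own tools rather than juggling yet another login.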
In closing, I hope you can see how these seven points help frame how you might structure a results-oriented evaluation of information sharing across our federal agencies and with our state and regional fusion centers and private sector partners. Taken together, you will be able to report the extent to which agencies have:
- Documented their information sharing requirements – what needs to be shared;
- Someone who can be directly held accountable for effective and proper information sharing;
- Linked their need for information to specific operational goals and strategies;
- Implemented mechanisms that make it easy for other agencies to access their information;
- Ensured that they are sharing the right (most meaningful) information;
- Taken advantage of NIEM as a way to save money and expedite information sharing; and
- Taken measures to proactively defuse public (and media) perceptions of information misuse.
I wish you well in your new role as Senior Director for Information Sharing Policy.
Regards,
Chuck Georgo
chuck@nowheretohide.org
30.07.2009
Analysis, CJIS, data sharing, fusion center, intelligence center, Law enforcement information sharing, public safety
I had a conversation with a fusion center director yesterday about portals that really drove home a feeling I had about the recent plethora (read: boatload) of portals that the average analyst supporting public safety and homeland security has to log into in order to do their job.
I’m paraphrasing a bit, but he basically indicated that the state, local, and private sector organizations in his state told him that they “DO NOT want to have to log into multiple portals” to stay informed about criminal and terrorism threats to their state’s infrastructure.
When you take a closer look at the “Portal-mania” that exists, it seems that every agency, and multiple programs within a single agency, has to have its own portal for accessing the information and analytic tools that agency or program provides. Here’s a quick list of ones I am familiar with (feel free to email me the names of others you know about):
- DHS HSIN State and Local Community of Interest (SLIC)
- DHS Lessons Learned Information Sharing (LLIS)
- DHS Automated Critical Asset Management System (ACAMS)
- DOJ Regional Data Exchange (R-DEx)
- DOJ National Data Exchange (N-DEx)
- DOJ eGuardian
- DOJ Law Enforcement Online (LEO)
- DOJ InfraGard
- DOJ National Sex Offender Public Website (NSOPW)
- DOJ National Criminal Intelligence Resource Center (NCIRC)
- DOJ Regional Information Sharing Systems (RISS)
- Private Sector CyberCop
- [State] Criminal Justice Information System (CJIS)
- …add to this Department of the Treasury, Department of Transportation, and other federal agency portals
- …and about three-dozen other databases and private sector websites
This is nutz! Dedicated portals are so 1990s. We should be able to use the same technology I used to create this website and blog (WordPress and four different plug-in widgets) to make information and advanced analytic capabilities available to Fusion Centers and other public safety users. I would like to challenge the agencies and programs listed above to make the information and capabilities they offer available through widgets, web parts, and gadgets that Fusion Centers and other intelligence/information sharing users can integrate into THEIR portal of choice.
Whether it’s SharePoint, Oracle, or IBM WebSphere, state, local, and private sector organizations should be able to pick the information and capabilities they need from the portal list above and integrate them into THEIR selected portal environment; they should not have to access multiple, stovepiped portals as they do today.
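For the skeptics, here is a minimal sketch (standard-library Python, purely for illustration) of the widget-friendly alternative: expose a capability as a small JSON feed that any portal, whether a SharePoint web part or a WordPress widget, can pull and render in place. The path and payload are made up, and a real service would add authentication, filtering, and TLS.

```python
# Sketch: expose an alert feed as JSON over HTTP so any portal widget
# can render it in place. Path and payload are illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

ALERTS = [{"id": 1, "title": "Suspicious activity reported near port facility"}]

class AlertFeed(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/alerts.json":
            self.send_error(404)
            return
        body = json.dumps(ALERTS).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8080), AlertFeed).serve_forever()
```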
I’d like to know what you think about this…Thanks..r/Chuck Georgo