
Published on January 19th, 2016 | by Joshua New


5 Q’s for Johan Bos-Beijer, Director of Analytics Services at GSA’s Office of the Associate Administrator

The Center for Data Innovation spoke with Johan Bos-Beijer, director of analytics services at the Office of the Associate Administrator at the General Services Administration in Washington, D.C. Bos-Beijer discussed the “analytics as a service” model he developed, as well as how the government’s approach to data has changed over time.

This interview has been edited.

Joshua New: One of the roles of the Office of Citizen Services and Innovative Technologies is to help agencies make better use of technology. When it comes to data technologies, what are some of the common challenges that agencies face?

Johan Bos-Beijer: Agencies face diverse resource constraints. These can take the form of funding, technology, skills, or capacity limitations. We encounter situations where we help agencies balance expectations to drive mission support with the reality of conflicting technological priorities. Some technologies have been in use longer than others, which also means agencies are obligated to evaluate their sunk costs when considering any new technological approaches or capabilities. The information management maturity of an agency can also play a significant role in adoption. Federal and state agencies that we work with have treasure troves of untapped data and are understandably challenged with determining what value lies in those data assets.

We also see that we need to enable agencies to take full advantage of more adaptive technologies. With the implementation of the DATA Act, the Federal Information Technology Acquisition Reform Act, the Improper Payments Elimination and Recovery Improvement Act, and other data-oriented legislation, the landscape of technology and data investment has changed for the better. We help with everything from application programming interface (API) use, human-centric design, alignment of mobile and resident technologies, developing a challenge to fund and execute program objectives, and automating the capital planning and investment processes, to understanding the technology migration path for adopting agile methods and responsive frameworks. We understand that agencies may be interested in and willing to adopt these approaches but have to incorporate existing contractual obligations, statutory requirements, and policy-driven constraints. I often speak about the “navigation and negotiation” skills that are required for success in government. Just bringing forward thinking to the table is not enough, as agencies need to show results using these approaches and invest in change management. We also need to be a partner at the table, collaborating as they work on legal, policy, and organizational challenges. There is much more that agencies want to take on, but they need trusted, informed, and skilled advisors to do so. That is what we are here to provide.

New: Can you talk about the “analytics as a service” model you have developed?

Bos-Beijer: This service framework has been a wonderful and exciting journey. In 2009 I conceived a method to deliver analytics as a service based on what colleagues and I had seen since the initial days of data warehousing and data marts. Back in 1999, my team created and deployed a successful data warehouse for the Department of Education, which took massive amounts of data from disparate mainframe systems and made comparisons easier for program management, auditors, and reviewers. It was also an early-stage prediction scoring system. What was missing for us, and at other agencies, was the ability to determine the best data sources and applicable tools and to test these before putting a project into full production. I refreshed the current analytics as a service (AaaS) model in 2013, when I co-authored the Common Acquisition Platform program, a federal acquisition platform, with a very forward-thinking assistant commissioner who understood the AaaS value to our purchasers and providers. I went back to colleagues who are chief information officers, chief financial officers, program officials, oversight officials, and practitioners in a variety of agencies to validate that their needs were the core of this model and to ensure it was easy to understand and use.

There are four sections: a software and tools category where the analytics and data management tools reside, including lease, subscription, and other license options; an environment category where customers can purchase federally certified development and test space based on their actual use; a data category where existing government and third-party data sources reside, which has the potential to develop uniform data use or matching memoranda of understanding; and a service category where customers can acquire domain, integration, service, and other expertise to support their data analytics efforts. The framework is also built on three service delivery models: menu, modified, and managed. The menu model is self-service for an agency that knows what it needs to buy. The modified model is based on what an agency already has in place that it wishes to augment, change, or enhance. The managed model is full service, where an agency can use service sources to help it manage anything from developing a data analytics concept of operations to training materials to data or behavioral science expertise.
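As a rough illustration of the taxonomy described above — not GSA’s actual implementation, and with all identifiers chosen hypothetically — the four catalog categories and three delivery models could be sketched as a small data model:

```python
from dataclasses import dataclass
from enum import Enum

# The four AaaS catalog categories described in the interview.
class Category(Enum):
    SOFTWARE_AND_TOOLS = "analytics and data management tools (lease, subscription, license options)"
    ENVIRONMENT = "federally certified development and test space, billed on actual use"
    DATA = "existing government and third-party data sources"
    SERVICE = "domain, integration, and other expertise"

# The three service delivery models: menu, modified, and managed.
class DeliveryModel(Enum):
    MENU = "self-service: the agency knows what it needs to buy"
    MODIFIED = "augment, change, or enhance what the agency already has in place"
    MANAGED = "full service, from concept of operations to training materials"

@dataclass
class AaaSRequest:
    """A hypothetical agency request pairing a catalog category with a delivery model."""
    category: Category
    model: DeliveryModel

    def describe(self) -> str:
        return f"{self.category.name} via {self.model.name}"

# Example: an agency buying managed access to data sources.
request = AaaSRequest(Category.DATA, DeliveryModel.MANAGED)
print(request.describe())  # DATA via MANAGED
```

The point of the sketch is only that any purchase in the framework is a combination of one of the four categories with one of the three delivery models.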

Everything in this AaaS shared service is constructed and will be rolled out based on actual agency needs and requirements. It was validated with the provider and academic communities, as they play roles in the development and delivery parts of the model. When I initially proposed this concept in 2009 it was uniformly endorsed; however, we did not have the resources at the time to deliver the service. Now that the Common Acquisition Platform and other mechanisms are in place, there is a hub for delivery and an ability to align strong resources across the General Services Administration (GSA) and other agencies. We are also in an analytics-focused era in government, which means there is a greater interest in adoption.

New: Prior to GSA, you worked as a senior advisor at the Department of Health and Human Services (HHS) to develop the Consolidated Data Analysis Center of Excellence to help combat healthcare fraud. Can you explain this program?

Bos-Beijer: Credit must be given to some very innovative and outcome-oriented executives in HHS, primarily in the Office of the Inspector General. The program had the full support of the Inspector General as well as his principal deputy, and we also had help from large operational components of the agency that wanted to contribute to the effort for mutual benefit. Their approach was to look at optimum use of their massive data assets combined with flexible and responsive technology. As a result they explored a wide range of technology capabilities, starting with a fresh look at data analytics.

I was asked to join the agency as their executive management senior advisor to work on all aspects of this effort, including data sourcing, collaboration opportunities, data validation, analytical methods, technology migration, skills development, security, and defining outcome metrics to demonstrate value. Much of the work centered on fraud prevention while supporting multiple dimensions of evaluation and intervention, as well as prevention methods. The objective of the Center was to establish a foundational framework bringing together talent, technology, and data to enable improved, actionable information. This also supported the allocation of resources, which meant value targets for audits or investigations could be worked simultaneously, enabling the best use of limited resources. There are aspects of the program which I cannot discuss, but I can say that an approach to fraud prevention and program integrity that starts at the project or objective level, such as this one, has a higher success rate. Defining what outcome or question needs to be answered or achieved enables the best choices in technology, tools, and the people who do the work. Their record over the past several years has been commendable, as their efforts paid off in protecting taxpayer dollars.

New: You have a long resume of overseeing how the government works with data. Technological advancements aside, how do you think agencies differ from just a decade ago in how they approach data?

Bos-Beijer: It is enlightening and ever-fascinating to witness, and even five years ago the approach was different. The pace has quickened. One noticeable difference is that we are no longer detached from data; we create it and use it on a massive scale today—in our personal lives and in government as well. Not too long ago we logged into our computer at home or work to download mail or read news. Accessibility and time were accepted delay factors. But now, we don’t stop to think of the data magnitude of all the connection capabilities we use throughout our day. The approaches to data in government changed out of global cultural necessity. Torrents rapidly became a deluge of data, which has defined our data colloquy. Explaining apparently simple terms has been an important part of the maturity process of how agencies better utilize or share their data. For example, we have to ask, “Is this actionable information?” and “How do we differentiate between data assets and data debris?”

The role of the chief data officer is new as well and is still undergoing development and definition—organizational placement, authority, and contribution to the agency are all still taking shape.

Open data, like open source, has been substantial and transformative in the sense that it has obligated people to think differently, approach objectives from different perspectives, and orient operations toward citizens and customers of government, and it has enabled cross-discipline dialogue with and across agencies.

There is also a resilient difference in approaching data now that it has become dimensional—combining sophisticated people and technologies can give agencies incredibly powerful, readily available information that can help solve global problems. This is why I say this is the age of the data person. The landscape of positive opportunities is endless.

New: From your time in the federal government, is there a particular success story you were involved with, where a federal agency performed dramatically better after adopting better data management practices?

Bos-Beijer: I have been truly fortunate to contribute to multiple successes and learn from very diverse situations and challenges. I am conflicted when asked these questions, as it is like asking a musician what piece of music they like most. It is not a deflection to say the piece I am playing or learning now is my favorite or a success. The same applies to data management. I also highly respect and value the great relationships with a number of agencies, so I would be remiss in choosing one over another.

To answer your question, with no agency attribution, I can share an example of one success where data management and the understanding of information management completely changed a program’s success. One agency had many data sources in different formats from different servicers, with many of the same providers and beneficiaries. This is a common dilemma. It was the same universe of beneficiaries and the same universe of providers, and yet there were disparate outcomes when analytics were applied. Rather than starting with just the outcomes or the analytics methods used, we looked back to the data sources first and validated them for uniformity and quality by applying a consistent data management approach from ingest through outcome. Data management, in my experience, benefits most from an expeditious and simple lifecycle structure. Additionally, as this agency found, understanding what data can do to meet an objective also identifies the kind of data management to apply. The agency was able to use what it developed to work with two other agencies where the providers and beneficiaries were almost identical.

If data and methods will be used for prosecution or adjudication, the duration and applied methods must be repeatable and transparent to ensure neutrality of outcomes. If the intent is a process evaluation, such as the customer experience work we are presently doing at GSA, then the focus is on how you manage and use the data you collect from the process. In this agency’s case, they used a method I developed decades ago to determine data use objectives. Start with an apparently simple statement and fill in the blanks: “If I had ‘X,’ I would be able to ‘Y,’ which would result in ‘Z.’” It is a very difficult exercise, but it works effectively to focus the data management itself. The success at this agency came from figuring out how to fill in those blanks, demonstrating the methods used to select data sources, creating a repeatable process, having a transparent and objective process, and dramatically reducing the time and resources needed to create tangible results that could be acted upon. In this case, the agency’s success prevented billions of dollars of fraudulent spending. Examples like this energize me for the next data opportunity or analytics challenge.
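The fill-in-the-blank statement above can be captured as a trivial template; the example values below are purely hypothetical and are not drawn from any agency’s actual objectives:

```python
def data_use_objective(x: str, y: str, z: str) -> str:
    """Render the fill-in-the-blank data use statement described above.

    x: the data asset that would be needed
    y: the capability it would enable
    z: the resulting outcome
    """
    return f"If I had {x}, I would be able to {y}, which would result in {z}."

# Hypothetical example, for illustration only.
print(data_use_objective(
    "validated provider claims data",
    "flag anomalous billing patterns before payment",
    "prevented fraudulent spending",
))
```

The difficulty, as the interview notes, is not in the template but in filling in the blanks well: choosing and validating data sources so that Y and Z actually follow from X.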



About the Author

Joshua New is a policy analyst at the Center for Data Innovation. He has a background in government affairs, policy, and communication. Prior to joining the Center for Data Innovation, Joshua graduated from American University with degrees in C.L.E.G. (Communication, Legal Institutions, Economics, and Government) and Public Communication. His research focuses on methods of promoting innovative and emerging technologies as a means of improving the economy and quality of life. Follow Joshua on Twitter @Josh_A_New.



