posts

Cloudbusting

2021.02.28

[Image: retro 80s video game (pixel art) style portrait of Kate Bush in front of a car. "Get in loser, we're going cloudbusting!" Image by Bill Hunt.]

The "Cloudbuster" was a device invented by Wilhelm Reich to create clouds and rain by shooting "energy" into the sky through a series of metal rods. Although Reich was paid by many desperate farmers to produce rain, the device was never proven to work.

It's been ten years since the Office of Management and Budget (OMB) released the original Federal Cloud Computing Strategy. I had the opportunity to update that strategy two years ago, when I served as the Cloud Policy Lead at OMB. Having spent 20 years in the private sector building bleeding-edge cloud infrastructure for some of the best-known companies in the world, I was able to bring that practical experience to the creation of the 2019 Federal Cloud Computing Strategy, "Cloud Smart." During my work at OMB, I spoke with hundreds of practitioners, policy experts, and Chief Information Officers (CIOs) across government. From this vantage point, I had an intimate view into the entire Federal technology portfolio, and I learned that many myths about cloud computing were being accepted as truth.

In this article, I'll debunk key myths about cloud adoption and explain why - and when - cloud is appropriate for government. These myths are framed around civilian Federal agencies of the United States, but the recommendations below apply to any public sector organization - and even some private organizations as well. In part two, I'll discuss some strategies for overcoming the pitfalls discussed here. Both guides are available to download as a single PDF.


Myth 1: Cloud Is Cheaper

The main reason Federal agencies cite for moving to commercial cloud is the promise of cost savings. This myth originated with vendors and was repeated by Congress, eventually becoming a common talking point for Executive Branch agencies. Unfortunately, it is based on false premises and poor cost analyses. In practice, the government almost never saves actual money by moving to the cloud - though the capabilities gained from that investment usually deliver greater value.

At a glance, it can appear that moving applications to the cloud will be cheaper than leaving them in a data center. But in most cases, a Federal agency will not see much, if any, cost savings from moving to the cloud. More often than not, agencies end up spending many times more on cloud than on comparable workloads run in their own data centers. Experts have known this was a myth for at least a decade, but the lobbyists and salespeople were simply louder than those who had done the math.

First, it's important to note that most Federal agencies own outright the facilities their data centers are located in. In the 1980s and 1990s, agencies began repurposing existing office space for use as data centers, adding advanced cooling and electrical systems to support their growing compute needs. This changes the total-cost-of-ownership equation, because the facilities are already built and can be run relatively cheaply, though they may be partially or fully staffed by contractors due to the constant push to outsource work. The government has also built a few best-in-breed data centers, such as the Social Security Administration's flagship facility, that can compete with some of the most efficient commercial facilities in the world, with solar collectors for electricity generation and advanced heat management systems to reduce energy usage. However, these super-efficient facilities represent only a handful of the more than 1,500 data centers the government owns and operates, and they cost half a billion dollars each to build.

Second, agencies routinely run their servers and equipment well past end-of-life to save money. There are no Federal requirements to update hardware. In fact, until recently, Federal data center efficiency requirements measured server utilization by time spent processing, which disincentivized agencies from upgrading - older hardware runs slower and thus reports higher utilization for a given task than a newer, more efficient server that completes the task quickly. During a budget shortfall, an agency with a data center can skip a hardware refresh cycle or cut staff to make up the deficit; an agency that is all-in on cloud loses this option, as it must continue paying for licenses, operations, and maintenance. As a result, agencies will need to future-proof their plans in more innovative ways, or better communicate funding priorities to OMB and Congress.

Finally, it's important to realize that once the government buys hardware, it owns that hardware outright. When you move your application to a commercial cloud, you pay a premium for data storage even if it's just sitting around and not being actively used - and for large amounts of data, cloud costs quickly skyrocket.
The government maintains decades' worth of massive data sets - NASA generates terabytes of data per day, and even a tiny agency like the Small Business Administration has to maintain billions of scanned loan documents going back to its inception sixty years ago. This is why some major companies have moved away from commercial cloud and built their own infrastructure instead.

I would also note that the idea of workload portability - moving a service between cloud vendors, generally to get a cheaper price - is largely a myth. The cost to move between services is simply too great, and the time spent building in this flexibility will not realize any savings. Moreover, every cloud vendor's offering is just slightly different from its peers, and if you're only using the most basic offerings that are identical - virtual servers and storage - you're missing out on the full value that cloud offers.
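To make the storage point concrete, here is a rough back-of-the-envelope comparison. Every price and size below is an illustrative assumption, not a quote from any vendor or agency budget, and real numbers vary wildly with egress, redundancy, and refresh cycles.

```python
# Back-of-the-envelope comparison of storing a large, mostly idle archive in
# commercial object storage versus on owned hardware in an agency-owned facility.
# All prices and sizes are illustrative assumptions, not actual quotes.

PETABYTES = 10                      # assumed archive size
GIGABYTES = PETABYTES * 1_000_000
YEARS = 5

# Assumption: roughly $0.02 per GB-month for standard object storage (egress excluded).
cloud_monthly = GIGABYTES * 0.02
cloud_total = cloud_monthly * 12 * YEARS

# Assumption: ~$100 per TB of raw disk, tripled for redundancy and spares,
# plus a flat annual power/cooling/staff overhead in a facility already owned.
on_prem_hardware = PETABYTES * 1_000 * 100 * 3
on_prem_overhead = 200_000 * YEARS
on_prem_total = on_prem_hardware + on_prem_overhead

print(f"Cloud object storage over {YEARS} years:  ${cloud_total:,.0f}")
print(f"Owned hardware over {YEARS} years:        ${on_prem_total:,.0f}")
```

Even with generous assumptions for on-premise overhead, an archive that mostly sits idle keeps the cloud meter running every single month.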

Myth 2: Cloud Requires Fewer Staff

Another promised source of cloud cost savings is that an agency no longer has to keep data center engineers on staff. These practitioners are usually comparatively cheap to employ in government, rarely reaching a grade above GS-13 ($76K-$99K annual salary). Agencies moving applications to IaaS or PaaS will instead employ comparatively expensive DevSecOps practitioners, site reliability engineers, and cloud software engineers to replace them. These types of staff are extremely difficult to hire into government, as they command very high salaries in the private sector - well in excess of the highest end of the General Schedule pay scale (GS-15: $104K-$138K) - even assuming an agency has the budget and open slots to create a GS-15 position in the first place. Due to the many flaws in the government hiring process, it can also be very difficult to recruit these people into government, even with the new OPM hiring authorities intended to streamline the process.

An agency that chooses to outsource these skills will often find that contractors cost even more than hiring capable staff. The agency will still need staff with cloud experience to actively manage those contractors, and contracts will need to be carefully crafted around concrete outcomes so that the agency is not fleeced by a vendor.

Another overlooked cost is training. New solutions aren't always easy for agencies to adopt - whether that's a fancy software development tool or something as simple as a video chat platform. Personally, a day doesn't go by that I don't find myself explaining to a customer some aspect of Teams or SharePoint they don't know how to use. Agencies often must provide formal training, and of course there's inevitably a loss of productivity while teams get up to speed on the new tools and solutions. Since many SaaS vendors roll out new features extremely rapidly, this can present a challenge for slow-to-adapt agencies. Although vendors provide some training for free, it rarely suffices for all of an agency's needs, so in most cases further training will have to be purchased.

Myth 3: Cloud Is More Secure

A constant refrain is that cloud is safer and more secure because servers are patched automatically - key security updates are installed immediately, rather than waiting for a human to find time to roll them out. For a large enterprise, patching has historically been a very time-consuming manual process, which automation has improved dramatically. However, the same tools that major corporations use for patching in the cloud are largely open source and free, and they can be used in an agency's own data center.

Moreover, it's important to note that cloud does not remove complexity; it only hides it in places that are harder to see. This is especially true for security, as organizations must adapt to highly specialized security settings that are not always easy to find, particularly in IaaS offerings. These settings are also constantly changing because of the vendors' constant patching - all too often with little notice, in the case of SaaS offerings. This double-edged sword has resulted in a number of high-profile cloud-related breaches over the last few years, affecting the public and private sectors alike as we learn best security practices the hard way.

Cloud vendors have also been… less than enthusiastic about meeting government security and policy requirements, unless the government is willing to pay a very high premium for the privilege of security. (I talked about this contentious relationship more in my post on Automation Principles.) For instance, as of today, no major cloud vendor completely meets the government requirements for IPv6, which have been around for 15 years and which OMB recently revised to try to get vendors to move faster.
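To illustrate that the patching itself is not the hard part, here is a minimal sketch of scheduled patch automation an agency could run against hosts in its own data center, assuming Debian/Ubuntu machines reachable over SSH. The host names are hypothetical, and a real deployment would more likely use an existing open-source configuration tool (Ansible, for example) than raw SSH calls.

```python
# A minimal sketch of scheduled, automated patching for on-premise Linux hosts.
# Assumes Debian/Ubuntu machines reachable over SSH with sudo rights; host
# names are hypothetical placeholders.
import subprocess

HOSTS = ["app01.agency.internal", "app02.agency.internal"]  # hypothetical hosts

def patch(host: str) -> None:
    """Refresh package metadata and apply available updates on one host."""
    for cmd in ("sudo apt-get update -q",
                "sudo apt-get upgrade -y -q"):
        subprocess.run(["ssh", host, cmd], check=True)

if __name__ == "__main__":
    for host in HOSTS:
        patch(host)
        print(f"Patched {host}")
```

Run from cron or a CI job, this is the essence of what "automatic patching" buys you; the discipline and scheduling matter far more than where the servers live.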

Myth 4: Cloud Is More Reliable

This one is less a myth and more an overpromise, or a fundamental misunderstanding of the underlying technology. For a long time, one of the main pitches of cloud has been self-healing infrastructure: when one server or drive fails, a new one is spun up to replace it. Although this can be implemented in the cloud, it's definitely not the default. For IaaS solutions in particular, you have to build that into your application - you don't get it for free.

Relatedly, many agencies assume that any application put into the cloud will automatically scale to meet any demand. If your agency's website gets mentioned by the President, let's say, you wouldn't want it to collapse under its newfound popularity. Without infrastructure designed to handle this, simply being "in the cloud" will not solve the problem. However, solving it in the cloud will likely be faster than waiting for physical servers to be purchased, built, shipped, and installed - assuming you have staff on hand who can handle the tasks.

It is important to keep in mind that cloud is, by definition, ephemeral. Servers and drives are often replaced with little to no notice. I've frequently had virtual machines simply become unresponsive, requiring them to be rebooted or rebuilt entirely. When you're building in the cloud, you should assume that anything could break without warning, and you should have recovery procedures in place to handle the situation. Tools like Chaos Monkey can help you test those procedures.

One issue that even the most seasoned practitioners often miss is that all cloud providers have hard limits on the resources they are able to sell you. After all, they are just running their own data centers, and they have a fixed number of servers on hand. I have often hit these limits in practical, seemingly simple use cases. For instance, I've built applications that needed high-memory virtual servers, where the provider didn't have enough instances to sell us. During the pandemic response, I also discovered that cloud-based email inboxes have hardcoded technical limits on the volume of mail they can receive. I had assumed we could simply buy more capacity, but this was not the case, requiring a "Rube Goldberg machine" of routing rules to handle the massive increase associated with a national disaster. There is no question that scalability is a huge benefit - until the practical limits turn your assumptions into a liability.
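Here is a minimal sketch of that "assume anything can break" posture: a generic retry-with-backoff wrapper around a flaky dependency. The fetch_report() function is a hypothetical stand-in for any cloud API or backend call that might disappear without warning.

```python
# A minimal sketch of designing for failure: retry a flaky call with
# exponential backoff and jitter, then fail over cleanly if it never recovers.
# fetch_report() is a hypothetical stand-in for any network or cloud dependency.
import random
import time

def call_with_retries(fn, attempts=5, base_delay=1.0):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:                 # real code should catch narrower errors
            if attempt == attempts - 1:
                raise
            # Exponential backoff with jitter so retries don't stampede the service.
            delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.5)
            time.sleep(delay)

def fetch_report():
    # Placeholder for a call that fails when an instance quietly disappears.
    raise ConnectionError("backend unavailable")

if __name__ == "__main__":
    try:
        call_with_retries(fetch_report)
    except ConnectionError:
        print("Dependency never recovered; falling back to cached data.")
```

None of this is cloud-specific, which is exactly the point: resilience is something you build, not something you rent.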

Myth 5: Cloud Must Be All-or-Nothing

Many organizations assume that the goal is to move everything to a commercial cloud provider. Both the Government Accountability Office and Congress have stated that the government needs to "get out of the data center business." However, this is simply not a realistic goal in the public sector - government couldn't afford such a massive move given its very restricted budgets.

We also must clarify the concept of "legacy systems," another frequent talking point. Most Federal agencies that have been around for more than 30 years still have mainframes, often still running older programming languages such as COBOL, Fortran, and Pascal. Many major industries in the private sector use these same technologies - most notably, banking is still heavily dependent on these legacy systems. Regardless of the hype about cloud and blockchain for moving money around, 95% of credit card transactions still use a COBOL system, probably running on a mainframe behind the scenes. These systems are not going away any time soon.

These mainframes are usually not dusty old metal boxes that have been taking up an entire basement room for decades. Often they're cutting-edge hardware that's incredibly efficient - with all the shiny plastic, glowing lights, and advanced cooling systems you'd expect to see on a gamer's desktop computer. Dollar for dollar, modern mainframe systems can be more cost-effective than cloud for comparable workloads over their lifecycle. It's also worth noting that they are about a thousand times less likely to be attacked or exploited than cloud-based infrastructure. The code running on these mainframes, on the other hand, is likely to be very old, and it has almost certainly been written such that it cannot be virtualized or moved to the cloud without being partially or entirely rewritten at great expense. Modern programming languages come with their own risks, so finding a sustainable middle path between the ancient and the bleeding-edge is important for a successful modernization effort.

Due to the considerations above, the future of government infrastructure will remain a hybrid, multi-cloud environment - much to the consternation of cloud vendors.

“… I just know that something good is gonna happen”

Instead of these myths, the best reason to use cloud is for the unrivaled capabilities that these tools can unlock:
  • Agility: quickly spinning up a server to try something new is much easier in the cloud, especially if you have not already built on-premise virtualized infrastructure. Cloud.gov, an offering from the General Services Administration (GSA) that bundles many Amazon Web Services (AWS) offerings in a government-friendly "procurement wrapper," can make this even easier for agencies.
  • Scalability: the main hallmark of cloud is using this agility to respond quickly to sudden increases in requests to websites and applications. During the COVID-19 pandemic especially, agencies have taken advantage of this capability to deal with the dramatic increase in traffic to benefit applications and other services. However, it is critical to note that most cloud services do not scale automatically (another of the myths covered above).
  • Distribution: most Federal agencies have staff in field offices all over the country, and of course their customers are both at home and abroad. Since the cloud is really just a series of distributed data centers around the world, this can dramatically reduce the latency between the customer and the service. For instance, agencies are using cloud-based virtual private network (VPN) solutions to securely connect their staff to internal networks. Those that have moved to cloud-based email, video chat, and document collaboration tools see an additional speed boost from keeping all of these services in the same cloud.
Of course, we all know that "cloud is just someone else's data center," but the government should not be held back by fear, uncertainty, and doubt about someone else holding its data. Cloud technologies have huge potential to improve Federal technology when approached with a full understanding of the complexity and costs. Cloud is not a replacement for good management, however - you can't buy your way out of risk. Until the government invests in its workforce to make sure that IT can be planned, acquired, implemented, and maintained effectively, we will not see any improvement in the services provided to the American people. Now Congress just needs to be convinced to fully fund some of these improvements.

Next week I'll share part two, where I will discuss several key strategies for a successful cloud implementation in a government agency.


Login.gov for Everyone!

2021.02.18 – A little over two years ago, I was walking out of the New Executive Office Building by the White House when I ran into Robin Carnahan, who said to me, "Bill, we should be able to provide Login to cities and states." (If you haven't met Robin, let me just make it clear for the narrative here that she's super-smart, and anything she says you should just agree with immediately, because she knows what she's talking about.) As soon as I got back to my desk at the Office of Management and Budget (OMB), I started sending out emails to figure out why the General Services Administration (GSA) was preventing this excellent service from being used by smaller governments.

For those of you who don't know about this hidden gem, Login.gov is a GSA service that tackles the difficult problem of verifying that a person is who they say they are in order to receive a government benefit, and it also provides a way to log in to government websites. It was created through the combined efforts of USDS and 18F - the two most prominent digital service teams in all of government - and is in use by many Federal agencies, providing access to government services for over 27 million people!

Today, GSA announced that Login.gov is available for use by local and state governments! (To be clear, I had effectively nothing to do with the actual permission being granted here - sending stern emails had little effect. The victory belongs entirely to the wonderful, amazing, fantastic team at Login and the bureaucrats who were willing to push to make it happen.)

There are, however, still a few restrictions for city and state use. To be eligible, the government agencies must be using Login for a "federally funded program." This is an arbitrary addition by GSA that, in my opinion, misinterprets the original intent of the legal authority - but I'm not a lawyer and am no longer responsible for these sorts of policy decisions. I am hopeful that this restriction will be removed in the future and this incredible service will be open to all who want it!

Moreover, as I've written in the past, it is my hope that OMB will mandate the use of Login for all Federal agencies. This is already mandated by law, but OMB is not enforcing the requirement. The most expensive part of the tool is the identity verification step - however, once an identity has been proven, it does not need to be re-proven when the customer uses any other service that relies on Login. This means that as more organizations sign up for Login, the cost to each decreases. By allowing Federal agencies to maintain their own independent login systems, the costs remain high. Moreover, this presents customers with an inferior experience, as they must sign up for a new account for each website or application.

It's also important to note that most identity verification behind the scenes uses data sources that the government controls and gives to private companies, who then sell the government back its own data in the verification process at a very high premium. Eventually, it would be smarter to allow agencies to exchange the necessary information themselves, cutting out the middleperson, which would decrease the cost to almost nothing. (Congress, of course, could speed this along with the right legislation.) I've heard that the Login team has also been working on a pilot to allow customers to prove their identity in person at a government facility, which has been shown to improve the success rate of the verification process.
The Department of Veterans Affairs (VA) uses such a process to help Veterans set up their online accounts right in the lobby of many VA health clinics. The US Postal Service also ran a similar pilot several years ago, where anyone could stop by a post office and have their documents reviewed - or even have their postal carrier perform the review while dropping off the day's mail - allowing the program to reach almost every single person in the country!

Detractors still complain about the cost of Login.gov and consider that a reason not to require it, even though the cost would drop if it were mandated. Even so, if the Federal government agrees that this is the tool agencies should be using, then it should be treated like a public good - like a library or a park. To that end, Congress could pass appropriations dedicated to funding this critical program, for instance as part of President Biden's proposal for TTS funding.

However, I would caution agencies against implementing identity requirements beyond what is absolutely necessary! The Digital Identity Guidelines from the National Institute of Standards and Technology (NIST) are the baseline that most Federal agencies use; in my personal opinion, they set too high a bar. The government must provide critical services to at-risk and economically disadvantaged groups, and by setting requirements that individuals in these groups cannot meet, agencies are not serving people equitably. For instance, the VA serves Veterans who may be homeless, may not have a credit card, may be partially or fully blind, may have trouble remembering or recalling information, may not have fingerprints, and so on. Since the standard methods of identity verification and authentication may present an impossible barrier for the very people the VA serves, it is in the best interest of these people not to implement NIST's high standards as written. (And I told NIST the same thing.)

If you're a city or state government interested in a world-class identity solution, I'd recommend reaching out to GSA about Login.gov! Even if you don't meet the requirement mentioned above, it's definitely worth getting in touch with GSA anyway - as we've learned, policies change every day.


Presenting EOPbot

2021.01.25 – If you’re like me, you may be having trouble keeping up with all the new Executive Orders and OMB Memos that the Biden Administration is putting out. To help, I’ve created a little bot to look for changes on specific pages of the White House website: @EOPbot!


Principles for Automation in Government

2020.12.20 – This article is part three in a series on IT policy recommendations. A PDF of the full recommendations may be downloaded here.

Artificial Intelligence (AI), Machine Learning (ML), Robotic Process Automation (RPA)¹, and other related predictive-algorithm technologies continue to gain attention. However, at the moment their promises are far greater than the reality, and instead of successes we continue to see the worst of ourselves reflected back. Vendors also continue to oversell the functionality of these tools while glossing over major expenses and difficulties, such as acquiring and tagging training data. The Trump Administration, rather than increasing scrutiny and oversight of these technologies, only sought to reduce barriers to their usage. The Biden Administration will need to **create stronger protections for the American people through better governance of the use of these solutions in government.**

The problem is that humans have written our biases into our processes, and automation only expedites and amplifies those biases. (The book Automating Inequality explains this better than I ever could.) As a technologist, I become concerned when I hear of government agencies implementing these technologies for decision-making, as our unequal systems will only lead to greater inequity. It's all too easy to "blame the algorithm" to avoid liability, but it is humans who create the algorithms. Simply put, the Federal government cannot have racist chatbots. The government must not exacerbate the problem of minorities not receiving benefits they deserve. And the government should not be using tools that can reinforce existing racism and sexism while remaining willfully ignorant of these topics. Yet with all of these failures, we still see organizations running gleefully towards toxic ideas such as predictive policing and facial-recognition technology.

Fundamentally, this is a question of ethics. Although in government we have extensive ethics laws and regulations in regard to finances and influence, there is almost no actual guidance on ethical practices in the use of technology. And in the U.S. there exists no standard code of ethics for software engineering, no Hippocratic Oath for practicing technology. However, we do have a series of regulatory proxies for ethics, in the form of security and privacy requirements aimed at protecting the data of the American people.

[Diagram: the balance between human versus computer decision-making and the impact to human life and livelihood.]

By requiring a series of controls — not unlike those that we use for IT security — we can increase the safety of the usage of these tools. Similar to the current National Institute of Standards and Technology (NIST) classifications for Low, Medium, and High security systems, artificial intelligence systems should be classified by their impact on people, and the level of automation allowed must be guided by that impact. And like the NIST security controls, these must be auditable and testable, to make sure systems are functioning within the expected policy parameters. For instance, a robot vacuum cleaner presents very little risk to life and can cause only some inconvenience if it misbehaves, so very few controls and little human oversight would be required.
But automation in the processing of loans or other benefits can disastrously impact people's finances, so stronger controls must be implemented and more human engagement should be required.

Most notable among these controls must be explainability in decision-making by computers. When a decision is made by a machine — for instance, the denial of a benefit to a person — we must be able to see exactly how and why the decision was made, and improve the system in the future. This is a requirement that megacorporations have long railed against due to the potential legal liabilities they may face in having to provide such documentation, but the Administration must not yield to these private interests at the expense of The People.

Another key control will be transparency in the usage of these systems: all Federal agencies must be required to notify the people when such a system is in use. This should be done both through a Federal Records Notice, similar to the ones required for new information systems, and on the form, tool, or decision letter itself, so that consumers are aware of how these tools are used. Standard, plain-language descriptions should be created and used government-wide.

Related to that control, any system that makes a determination, on a benefit or similar, must have a process for the recipient to appeal the decision to an actual human in a timely fashion. This requirement is deliberately burdensome, as it will actively curtail many inappropriate uses in government, since overtaxed government processes won't be able to keep up with too many denied benefits. For instance, the Veterans benefit appeals system is currently entirely manual, with a delay of a year or more, and some Veterans have been waiting years for appeals to be adjudicated; if a system is seeing an unreasonably large number of appeals of benefit denials, that's a good indicator of a broken system. Moreover, the result of that appeal must become part of the determining framework after re-adjudication, and any previous adjudications or pending appeals should be automatically reconsidered retroactively.

There also exists a category of uses of Artificial Intelligence that the government should prohibit entirely. The most extreme and obvious example is the creation of lethal robots for law enforcement or military usage — regardless of what benefits the Department of Defense and military vendors try to sell us. Although there's little fear of a science-fiction dystopia of self-aware murderbots, major ethical considerations must still be taken into account. If we cannot trust even human officers to act ethically under political duress, we certainly cannot expect robots devoid of empathy to protect our citizens from tyranny when they can be turned against people with the push of a button.

Similarly, the government must also be able to hold private companies liable for their usage of these technologies, both in government and in the private sector. If something fails, the government legally owns the risk, but that does not mean that private companies should escape blame or penalties. The increase in companies creating self-driving cars will inevitably lead to more deaths, but these companies continue to avoid any responsibility.
The National Highway Traffic Safety Administration's recommendations on autonomous vehicles do not go nearly far enough, merely "request[ing] that manufacturers and other entities voluntarily provide reports." In short, the government must take a stand to protect its people instead of merely serving the interests of private companies — it cannot do both. For further reading, the governments of Canada and Colombia have released guidance on this topic, providing an excellent starting point for other governments.
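To make the tiered-control idea above a bit more concrete, here is an illustrative sketch of what an impact classification and an explainable, appealable decision record might look like. The tier names, required controls, and record fields are my own assumptions for illustration - not NIST classifications or OMB policy.

```python
# An illustrative sketch of tiered controls for automated decision systems:
# classify a system by its impact on people, derive the minimum required
# oversight, and log an explainable, appealable record for each determination.
# Tier names, controls, and fields are assumptions, not NIST or OMB policy.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Impact(Enum):
    LOW = "low"            # e.g., a robot vacuum: inconvenience at worst
    MODERATE = "moderate"
    HIGH = "high"          # e.g., benefit or loan determinations

REQUIRED_CONTROLS = {
    Impact.LOW: {"public_notice"},
    Impact.MODERATE: {"public_notice", "explainability", "human_appeal"},
    Impact.HIGH: {"public_notice", "explainability", "human_appeal",
                  "human_review_before_denial", "periodic_audit"},
}

@dataclass
class DecisionRecord:
    system: str
    impact: Impact
    outcome: str
    reasons: list[str]                 # the explainability requirement
    appeal_route: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def missing_controls(self, implemented: set[str]) -> set[str]:
        """Which required controls has this system not yet implemented?"""
        return REQUIRED_CONTROLS[self.impact] - implemented

record = DecisionRecord(
    system="loan-eligibility-screener",
    impact=Impact.HIGH,
    outcome="denied",
    reasons=["reported income below assumed threshold"],
    appeal_route="human caseworker review within 30 days",
)
print(record.missing_controls({"public_notice", "explainability"}))
```

The point is not the specific fields but that the controls are machine-checkable, so auditors can test whether a system is operating inside its policy envelope.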

  1. Some of us technologists have referred to RPA as "Steampunkification" instead of IT modernization, as the older systems are still left in place while newer tech is just stuck on top, increasing rather than decreasing the technical debt of an organization — much as Steampunks glue shiny gears onto old hats as fashion.


Reskilling and Hiring for Technology in Government

2020.12.19 – This article is part two in a series on IT policy recommendations. A PDF of the full recommendations may be downloaded here.

The nature of business is change — we move, refine, and combine goods and services and data, which generates value — and this is true in both the public and the private sector. Technology is just one of the ways that we manage that change. The organizations that do best at managing change are often the best equipped to deal with the relentless pace of transformation within the IT field itself. Government, however, tends to resist change because of misaligned value incentives which prioritize stability and avoid risk, though these elements do not necessarily need to be at odds with one another.

Since the Reagan era, government agencies have outsourced more and more IT tasks to contractors and vendors, under the false promise of reduced risk and increased savings for taxpayers. There's an infamous joke that we've done such a good job of saving money through IT over the last decade that we've reduced the IT budget from $2 billion to $40 billion. Yet almost all of that spending has gone to private companies, instead of increasing Federal staff and providing needed training, and the government has astonishingly little positive progress to show for it — systems and projects continue to fail. This effort has lobotomized government by eliminating subject matter experts, reducing its ability to manage change, and as a result has greatly increased — rather than reduced — the risk for Federal agencies.

Agencies have tried to "buy their way out" of their risk by leveraging vendors and IT products to "absorb" the risk. Unfortunately, government doesn't work that way — agencies are solely responsible for risk, and if something fails, the agency, not the vendor, is the one on the hook for any lawsuits or Congressional hearings that result. The only practical way for agencies to deal with their risk and begin paying down the government's massive technical debt is to hire and train experts inside of government who can address these problems directly and begin to facilitate change management. In the Cloud Smart strategy, OMB states that "to harness new capabilities and expand existing abilities to enable their mission and deliver services to the public faster … instead of 'buy before build', agencies will need to move to 'solve before buy,' addressing their service needs, fundamental requirements, and gaps in processes and skillsets."

Although there has been a major effort to hire and train cybersecurity professionals in government, technology literacy needs to be improved in all job roles. Technology will always be a core function of government, and to be successful, government must have expertise in its core functions; to do otherwise is to deliberately sabotage that success. Efforts such as GSA's 18F team and the US Digital Service (USDS) have proven that there is a need for this expertise, and the government must continue and expand on those efforts by teaching agencies "how to fish." Beyond these short-term hires via the Digital Service/Schedule A and Cybersecurity/2210 authorities to augment staff temporarily, agencies need to invest in permanently expanding their knowledge, skills, and capacity.

Increase Training Opportunities for Federal Government Employees

First, there needs to be a **governmentwide approach to increasing training**, starting with additional funding in the President's budget dedicated to improving IT skills. Financial and leave award incentives could also be used to encourage staff to participate in more training outside of their immediate job roles. The Federal Cybersecurity Reskilling Academy, created as part of the Cloud Smart strategy, was a good start, but it didn't go far enough. It's impossible to fully train a practitioner in everything they need to know about cybersecurity — or any other complex technology — in just a few short weeks. A real apprenticeship program, in the form of agency rotation and detail programs that place staff into more IT-mature agencies, would have a major impact by allowing staff to learn skills on the job in a hands-on way. Many of these skills are impossible to learn meaningfully from a book or seminar; in general, most technical certifications — instead of being required — should be met with skepticism.

Almost all policy decisions today have some aspect of technology involved. To address the rapidly aging Federal IT infrastructure and make smart investments with taxpayer dollars, all of our leaders need to be equipped with knowledge of modern systems beyond just the sales pitches they receive from vendors. Ongoing training in technology must be made a priority and part of every Senior Executive Service (SES) performance plan.

Create a new IT Job Series

Although many technologists have been willing to work for a short term of 2–4 years in government at a massive pay cut just out of a feeling of civic duty, this sort of “holiday labor” is not a sustainable path for long-term success. A new Administration will need to address the massive pay disparity for government IT jobs, which acts as a barrier to both hiring and retaining staff. The White House will need to direct the Office of Personnel Management (OPM) to establish a proper IT job series or extend the 2210 CyberSecurity role definition, and create a special rate that reduces this gap particularly at the top end of the scale (GS-13 through GS-15). Ideally this pay should be competitive with the private sector by locale, or as close to the standard rates as possible. And this pay must be made available to staff as they are retrained, not just to outsiders coming in to government with lucrative salaries from the private sector. Without this key step, the work done to reskill our staff will be lost as they use their new skills to find better-paying employment outside of government. Also, this job series should include not only security personnel, software engineers, and graphic designers, but also non-traditional (but very important) members of government technical teams such as program & product managers, contracting officer representatives (CORs), customer experience experts, and content designers.

Leverage Modern Hiring Techniques to Bring in Skilled Personnel

Third, agencies must be directed to aggressively move away from older hiring processes and switch to techniques which evaluate if candidates can actually do the job. OPM, in coordination with USDS, has already done a lot of work towards this, including eliminating education requirements and moving to knowledge-based hiring techniques, but agencies largely have not yet implemented this new guidance. The White House will need to apply more pressure for these changes if agencies are expected to adopt them. Initiatives such as Launch Grad and the Civic Digital Fellowship could also provide a pipeline for potential candidates with critical skills into government service.

Improving Diversity in the Senior Executive Service

Finally, major improvements must be made to the Senior Executive Service (SES) hiring process. These staff represent the senior leaders at Federal agencies, and as noted above, almost all policy decisions today have some aspect of technology involved; these leaders need to be equipped with knowledge of modern systems beyond just the sales pitches they receive from vendors. In addition to the need to increase the technical knowledge of these key decision-makers, the lack of diversity in this group has gone woefully unaddressed even after years of critical reports. Since sitting SESs are on the boards that hire other SESs, and many of these leadership roles are filled through tacit political connections rather than the candidates' skills, it is unlikely that diversity will improve organically from this in-group. The entire hiring process needs to be reconsidered to level the playing field.

The Executive Core Qualifications (ECQs) were a good idea to set a baseline for expertise in senior management, but they have largely become an expensive gatekeeping exercise. This has given rise to a cottage industry of writers who churn out government resumes at a price tag of thousands of dollars. I know of very few SES staff who were not either hand-picked for their first SES role or who paid to have their resume written by a professional. This limits these positions to those who can "pay to play" — either with literal dollars or political influence — severely narrowing the candidate pool. On the reviewer's end, it's long been known that overtaxed human resources staff often just search resumes for keywords from the job posting as a first pass, which eliminates anyone who may have missed a specific word or phrase. Government expertise and education also appear to be given higher standing than outside experience. And once your ECQs have been approved, you don't need to have them re-reviewed for each job, further narrowing the list of candidates who are considered.

There is no single, easy solution to the systemic problems in this process. Expanding training opportunities for senior General Schedule employees (GS-14 and GS-15) beyond the outdated and time-consuming Candidate Development Program would be a first step. A new Administration could also make diversity a key priority in the President's Management Agenda, setting goals for hiring and new initiatives for recruiting under the Chief Human Capital Officers Council (CHCOC).

In Closing: Countering Bias Through Diversity

Our country is changing, and so is the nature of government. Diversity is critical for all technology roles in government, not just leadership. Addressing systemic bias in the tools that agencies are implementing will require attention from all levels of staff. Our benefit systems must provide services equitably to all, but this will be impossible without acknowledging these biases. However, due to a recent Executive Order, training around bias has largely been halted in the Federal government, reducing our ability to tackle this challenge. As the government begins to close gaps around technology skills, it is critical that we’re building a workforce that reflects the people we serve, so that we can better address these issues at their root.


Federal IT Policy Recommendations: 2021-2024

2020.12.18 – This article is part one in a series on IT policy recommendations. A PDF of the full recommendations may be downloaded here.

Executive Summary

The work of improving technology in government through policy initiatives over the last twelve years has been very successful; however, there will always be more work to be done. Today, there are several key steps that the Biden Administration could immediately address and work on over the next four years to continue to build trust and drive technology maturity across government to "Build Back Better" — not just at the Federal level, but at the state and local levels as well. These steps include:
  1. Renew the Commitment to Open Data & Transparency
  2. Focus on Outcomes, not Box-Checking
  3. Drive Customer Experience & Human-Centered Design
  4. Solve Identity Once and for All
  5. Increase Attention to Small Agencies and
  6. Manage Risk through Security
I’ve spent the last ten years working on civic tech from local to Federal levels, inside and outside of government, and have been excited to see incredible gains in the government’s ability to deliver services to constituents. After the Obama Presidency, the work to drive innovation in government didn’t suddenly stop — the Trump Administration pursued an aggressive agenda of IT Modernization. This included a major effort to update a very large amount of outdated government technology guidance, laying the critical foundation for many modern technology practices and ideas. From 2017–2019, I served in the Office of Management and Budget (OMB) in the Office of the Federal Chief Information Officer (OFCIO), where I worked on the new Federal Cloud Computing Strategy, ”Cloud Smart.” I designed this strategy to drive maturity across the Federal Government by updating a variety of older, interrelated policies on cybersecurity, procurement, and workforce training. At the time, we had no idea that many of these initiatives, such as the update to the Trusted Internet Connections policy (TIC), would be critical to enabling government-wide mission continuity during the COVID-19 response just a few months later. From the past 4 years spent in government, I have been able to see many opportunities for improvements that did not get as much attention as they deserve. What follows are a few policy areas that I believe would build trust and improve service delivery to the American people. These aren’t all major innovations, but these efforts are needed to Move Carefully and Fix Things.

1. Renew the Commitment to Open Data & Transparency

Before joining the Federal Government, I spent years working for government transparency organizations including the Sunlight Foundation and the OpenGov Foundation. Although those and many other transparency organizations have shut their doors over the last four years, the need for transparency has never been greater. However, I no longer hold the naive belief that sunlight is the best disinfectant. As it turns out, disinfectant is a better disinfectant, and regularly putting in the work to keep things clean in the first place is critically important. Transparency is an active process, not an end in and of itself — and care will have to be given to rebuilding some of the atrophied muscles within government.

Share Data on the Fight Against COVID-19

First and foremost, to heal the country a new Administration will need to deal with not only the COVID-19 virus, but also the disinformation virus. To do so effectively will require addressing public trust around information quality and availability. The Administration should focus on providing timely, accurate information, including infection rates from Health and Human Services (HHS), job numbers from the Department of Labor (DOL), housing data from Housing and Urban Development (HUD), and loan data from the Small Business Administration (SBA). By utilizing the new Chief Data Officers installed across government as part of the Open, Public, Electronic and Necessary (OPEN) Government Data Act, signed into law in 2019, the Biden Administration would be able to gather and centralize this critical recovery data. Everyone loves shiny dashboards, but I would instead propose sharing the raw data to allow independent analysis, which would be vastly more valuable than Yet Another Dashboard.

Revise the National Action Plan

My work on the Fourth National Action Plan for Open Government (NAP4) — and the challenges the Trump Administration faced in delivering this plan — are matters of public record. As we look towards the Fifth National Action Plan, it will be critical to improve engagement with the public and open government groups. Since most of the country has quickly become accustomed to remote collaboration due to the pandemic, I would recommend hosting a variety of virtual forums beyond the DC area to maximize input and idea-generation outside of the beltway. In addition to bringing in more stakeholders from across the country, this would also aid in empowering grassroots-initiated activities towards anti-corruption practices as well. I’d also recommend starting this process as early as possible to develop and gain traction around high-quality, ambitious commitments. There are also more than a few initiatives that civil society has proposed over the last decade that are worthy of reconsideration, including these from the NAP4.

Revise Agency Open Government Plans

As part of this work, OMB will need to update the long-neglected Agency Open Government Plans guidance, which has not been revised since 2016. Although most agencies have updated their Open Government plans since then, more ambitious efforts to publish data are needed. Notably, the Department of Veterans Affairs (VA) has not updated its plan since 2010, even though it has received more scrutiny from Congress during this time. The VA Inspector General also previously identified that the VA had been actively working to undermine efforts to measure its progress on improving patient wait times, by simply not recording data on the topic. With the new $5 billion Electronic Health Records (EHR) system being implemented today, it is even more urgent that the VA improve its transparency.

However, all Federal agencies should be directed to publish data more aggressively and proactively, instead of only in response to Freedom of Information Act (FOIA) requests. Throughout the Trump Administration, key datasets were removed from government websites. The new Administration can both better tell its story and build confidence in the American people using government services by working to restore key data and increasing the volume of information that is actively shared.

Rebuild The Office of Science and Technology Policy

The Office of Science and Technology Policy (OSTP), headed by the Federal Chief Technology Officer, was previously the center of open government work under the Obama Administration, but this office and its authority were dramatically reduced over the last four years, with staff cut from 150 to less than 50. As a result, major reconstitution of OSTP and other offices will need to be done to drive these efforts.

2. Focus on Outcomes, Not Box-Checking

Narrow Oversight Focus to High-Impact Projects

Transparency goes hand-in-hand with oversight. The Office of Management and Budget is the primary oversight organization within the Executive Branch (other than the Inspectors General), and is organized into smaller, domain-specific offices. Staff in these program offices act as "desk officers," focusing primarily on the 24 large CFO Act agencies. In smaller offices, a single individual may be tasked with oversight of several agencies' billion-dollar budgets. OMB's OFCIO is one such smaller office that has been stretched thin in this oversight duty while simultaneously fulfilling a variety of policymaking roles. However, the primary role of this office is to oversee technology implementation across government to ensure the success of projects.

Given the few remaining staff, rather than being stretched thin on meaningless compliance, these resources would be better spent focusing primarily on the top five or ten major technology projects in government and making sure they do not fail in the way we saw happen with Healthcare.gov. Projects such as the State Department's passport and visa modernization, the Department of Veterans Affairs' new EHR system, and other similar initiatives could greatly benefit from closer scrutiny. By investing in hiring subject matter experts with skills in technology and in managing massive projects, the government could save taxpayers billions of dollars while simultaneously improving services. OFCIO should also collaborate closely with the Office of Performance and Personnel Management (OPPM), which oversees the Customer Experience initiative across government, to make sure that these projects also meet the needs of the American people.

Restore and Expand The Office of the Federal Chief Information Officer

Moreover, OFCIO shares its limited budget with the U.S. Digital Service’s (USDS) core operations via the Information Technology Oversight and Reform (ITOR) Fund, which was slashed dramatically under the Trump Administration. More than just paying for staff salaries, this fund is used to fund a variety of key technology oversight projects, such as the government’s software code sharing initiative, code.gov. Cuts to this fund have caused OFCIO to eliminate programs like pulse.cio.gov, which monitored and evaluated the maturity and security of agency websites. Moreover, this fund is flexible and can be used by OMB to fund interesting technology initiatives at other agencies. The new Administration should restore the ITOR budget. It would also be useful to further supplement this fund by taking the step of working with Congress to set appropriations to ensure the future of OFCIO and USDS. Like OSTP, OFCIO has experienced large setbacks. The constant budget cuts and toxic culture have decimated the office, and most of the talented & passionate subject matter experts I served with have since left. Reversing the course on this office, and investing in hiring experts with practical experience in technology in government — not just Silicon Valley thought leadership solutionism — in these offices and beyond will be critical for the success of Federal IT for the next four years. This will improve both the quality of policy that is created as well as the outcomes of IT projects governmentwide.

3. Drive Customer Experience & Human-Centered Design

The government routinely spends hundreds of millions of dollars on major IT projects. However, very little work is typically done to make sure that the right thing is being built — or that the right problem is even being solved. And sadly, newer systems are not always better systems. However, initiatives on Human-Centered Design (HCD) — a process that engages service recipients as stakeholders in the design and implementation of those services and systems — that were started under the Obama Administration were built upon over the last four years. For instance, common private sector practices like user research and testing were previously considered difficult in government because of review and approval requirements under the Paperwork Reduction Act, but through streamlined processes and blanket-permission requests these barriers have largely been eliminated for most agencies. These efforts need continued attention and support to maintain momentum.

Drive Commitment to Human-Centered Design Across OMB

At OMB, the Office of Information and Regulatory Affairs and the Performance & Personnel Management office worked to institutionalize much of this work over the last four years, including new governmentwide Customer Experience (CX) metrics guidance and a related Cross-Agency Priority Goal as part of the President’s Management Agenda. These metrics should be considered table stakes for driving customer experience, and much more work must be done in this area. For instance, every major (and possibly even minor!) IT project should have CX metrics defined as part of its requirements, and these should be tracked throughout the life of the project. For existing projects, these should be created retroactively — starting with the highest-impact public-serving systems — with adequate baselines so that agencies don’t just receive an “easy A.” The recent General Services Administration (GSA) Playbook on CX may provide a great starting point for most agencies.
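As a sketch of what "CX metrics defined as part of the requirements" could look like in practice, the snippet below tracks a few measures against baselines, so progress is judged against where the service started rather than handed an easy A. The metric names, numbers, and on-track rule are purely illustrative assumptions, not any agency's actual guidance.

```python
# A hypothetical sketch of project-level CX metrics with baselines.
# Metric names, targets, and the on-track rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CxMetric:
    name: str
    baseline: float   # measured before the project starts
    target: float
    current: float

    def on_track(self) -> bool:
        # "On track" here means the remaining gap to the target is at most
        # half of the original gap measured at baseline.
        return abs(self.current - self.target) <= abs(self.baseline - self.target) * 0.5

renewal_service = [
    CxMetric("task completion rate (%)", baseline=62.0, target=90.0, current=78.0),
    CxMetric("median time to complete (min)", baseline=45.0, target=15.0, current=31.0),
    CxMetric("customer satisfaction (1-5)", baseline=2.8, target=4.2, current=3.4),
]

for m in renewal_service:
    print(f"{m.name}: {'on track' if m.on_track() else 'needs attention'}")
```

The value is less in the arithmetic than in forcing every project to name its measures, record a baseline, and report against both for its entire life.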

Fix the Definition of Agile

Of course, this customer experience work is not a new idea — in fact, this sort of Human-Centered Design is a core tenet of Agile software development. Unfortunately, the Federal Government has completely missed the forest for the trees on the principles of Agile, and almost all law and regulation focuses entirely on one area: incremental development, delivering software in small, working chunks over time, instead of delivering a full solution at the end of a lengthy development process. However, the real value of Agile is not in these small chunks, but rather in regular testing – both automated as well as having actual members of the public using the service directly involved in the development process to give feedback as the project progresses. In this way, teams can make sure their software works and is actually solving problems for people using the service, instead of assuming what the people served want. In the private sector we joke that you’ll have testing either way — would you rather do it before your product launches when you can get ahead of the issues, or after when it’s a public embarrassment? Currently, agencies are required to report on their major IT investments and state if these projects are developed “incrementally,” defined in guidance at the depressingly-low rate of once every six months. OMB could refine their guidance to add additional Agile characteristics, including the requirement that software is tested throughout the development process with real customers. This alone would dramatically decrease the number of failed projects in government, saving potentially billions of dollars.

Fund Great Customer Experience

However, all of this work requires expertise to be done well, and expertise comes at a cost. Champions such as Matt Lira have called for the creation of Chief Customer Experience Officers (CXOs) within agencies, which would be an excellent next step. However, we must not repeat the mistake made in creating the Chief Data Officer (CDO) roles, where no additional funding was dedicated for these new roles or their staff — as a result, at most agencies this became yet another hat for the CIO to wear. Agencies will need increased funding in the President's Budget both to hire new CX experts and to fund contracts supporting these CX efforts government-wide.

4. Solve Identity Once and for All

Accurately verifying a person’s identity to satisfy Federal requirements, as well as creating a secure environment to allow them to login to Federal websites & tools, is a difficult and expensive task for all agencies. This also remains one of the biggest challenges for both agencies and the people accessing government services today. Most agencies have multiple login systems, each specifically tied to an individual service and without sharing information. For instance at the Department of Veterans Affairs until very recently there were nearly a dozen different login systems. Each of these systems would require you to prove that you are who you say you are separately as well.

Mandate Login.gov

Meanwhile, GSA’s Login.gov is an easy solution to this problem, and has been an overwhelming success for many agency services, including USAJobs, the website for most Federal job postings and application processes. Login.gov provides a simple answer to the very expensive problem of checking the identity of a member of the public and allowing them to log in to a government website or application — to receive government benefits, register their small business, or any number of other services. This identity-proofing step is typically the most expensive part of the process, requiring the use of independent, private data sources like those used by our national credit bureaus. With Login.gov, once you’re verified on one site you’re verified on them all, so the cost to taxpayers is dramatically reduced. Although some agencies are starting to move to this platform, a new administration should mandate that all agencies use Login.gov, and require them to provide a plan to transition to the service within 5 years. In fact, usage of Login.gov is already required by law, but the law is simply not being followed (6 U.S.C. 1523(b)(1)(D)). Instead of just an unfunded mandate, the President’s Budget should include a request for Congress to provide appropriations directly to GSA to fund these efforts, ensuring this product is sustainable well into the future.
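To make the shared-login model concrete, here is a minimal sketch of how an agency application hands sign-in off to a central identity provider. Login.gov supports standard protocols such as OpenID Connect, so the agency’s side largely reduces to a redirect like the one below; the issuer URL, client ID, and redirect URI shown are hypothetical placeholders for illustration, not actual Login.gov configuration values.

```python
# Minimal sketch: sending a visitor to a shared identity provider using the
# standard OpenID Connect authorization-code flow. All endpoint and client
# values below are hypothetical placeholders, not real Login.gov settings.
import secrets
from urllib.parse import urlencode

ISSUER_AUTHORIZE_URL = "https://idp.example.gov/openid_connect/authorize"  # hypothetical
CLIENT_ID = "urn:gov:agency:benefits-portal"                               # hypothetical
REDIRECT_URI = "https://benefits.example.gov/auth/callback"                # hypothetical

def build_sign_in_url() -> str:
    """Build the URL an agency application redirects a visitor to for sign-in."""
    state = secrets.token_urlsafe(16)  # anti-CSRF value, stored in the user's session
    nonce = secrets.token_urlsafe(16)  # replay protection, echoed back in the ID token
    params = {
        "response_type": "code",   # authorization-code flow
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid email",   # identity attributes the agency app requests
        "state": state,
        "nonce": nonce,
    }
    return f"{ISSUER_AUTHORIZE_URL}?{urlencode(params)}"

if __name__ == "__main__":
    print(build_sign_in_url())
```

The design point is that the expensive identity-proofing happens once, at the shared provider, and every participating agency site simply consumes the result.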

Use USPS for In-Person Identity Proofing

At the VA we also learned that many people have trouble with identity proofing over the internet for a number of reasons, including not having suitable cameras for capturing information from IDs, memory issues that preclude standard address-verification methods, and more. However, we found that people were much more likely to be successful when their identity was validated by humans in person at VA hospitals. The US Postal Service (USPS) has successfully piloted a service to check people’s identity in person, both at USPS locations and at people’s homes, using the portable tablets its carriers already use for mail delivery. By working with Congress to help fund this service, identity verification could become a solved problem, while also providing a sustainable additional revenue stream for the desperately underfunded USPS.

Share these Services with State & Local Governments

Moreover, these services should be offered to state and local governments, which are incredibly eager for these solutions coupled with the expertise of the Federal government. For instance, the same login that you use for USAJobs could be used to log in to your local DMV, once again making government easier and friendlier for everyone. To date, GSA leadership has not actively allowed sales to these governments, even though such sales are explicitly permitted under law and have been allowed for similar services, such as Cloud.gov. The White House should direct GSA to provide this service to any government agency that wants it — and even to the private sector where appropriate! Recent bills in Congress have also prioritized security for state and local governments, so it would not be unreasonable to go even further and work with Congress to set appropriations to provide this identity service to them as well. Working closely with the Cybersecurity and Infrastructure Security Agency (CISA), GSA could turn this from a small project into a national program.

5. Increase Attention to Small Agencies

There are nearly a hundred smaller independent agencies that are not situated under the President’s Cabinet, and as a result they are largely ignored. However, they still have critically important missions, and they also interface with the bigger agencies to exchange data, presenting a number of potential security concerns and operational risks. Although a focus on projects and outcomes — not just compliance — is critical, OMB needs to pay more attention to these smaller agencies. For instance, the U.S. Securities and Exchange Commission is a small independent agency of only about 4,000 people, but it is tasked with protecting investors and the nation’s securities markets, a mission created in response to the stock market crash of 1929. As such a small agency, it doesn’t have nearly the IT and cybersecurity budget of the large agencies. However, since it exchanges data with the Department of the Treasury, it acts as a backdoor into the larger agency. This sort of attack, exploiting a softer target to gain access to a more secure one, is extremely common at smaller scales and will inevitably become a focus for hostile nation-states in the future.

Fund Small Agencies’ IT

These smaller agencies will need additional resources to be able to deal with these threats while also keeping their services up-to-date. OMB can take the much-needed step of **requesting larger IT budgets for these agencies.** Furthermore, to date no small agencies have been selected for Technology Modernization Funds — a “loan program” for agencies to fund IT projects — to help them improve their IT. Meanwhile, massive organizations such as U.S. Customs and Border Protection (CBP) — who have an annual budget of 17 billion dollars *and are not in any way short of money* — have received an *additional* 15 million dollars from this fund to update their legacy financial systems. Providing access to further funds for smaller agencies would give them an opportunity to improve their systems.

Drive Shared Service Use

Shared IT services are even more important for these smaller agencies. In many cases the Chief Information Officer (CIO) wears many hats — acting as Chief Information Security Officer (CISO), Chief Data Officer (CDO), and other roles. Being successful while stretched so thin means that staff must take advantage of the capabilities of the bigger agencies to fill their gaps, such as the Department of Justice’s Security Operations Center-as-a-Service offering. The idea of a “CIO in a Box” for the smaller agencies, providing information, services, and resources to these organizations, has been brought up several times. However, very little movement has been made on this initiative, and it is a large opportunity for further work and investment. Other shared services, including the aforementioned Login.gov and Cloud.gov, would also provide major benefits to smaller agencies, especially if the President’s Budget included dedicated additional funding to GSA for these projects for small agencies, so that they don’t have to scrape the money together out of their own limited budgets.

6. Manage Risk through Security

The common theme here is that cybersecurity remains one of the greatest challenges for technology in government today. The Federal Information Security Management Act (FISMA) sets many of the legal requirements for cybersecurity in government, and in practice this has transformed risk management into risk avoidance, reducing agencies’ overall risk tolerance and freezing any interest in trying new things. There is little hope of Congress fixing FISMA in the near future, and the attempts to date will only make things worse. In the meantime, the Biden Administration could supplement ongoing initiatives for security automation with additional resources, and implement the resulting best practices as official policy governmentwide.

Continuous Security Authorization of IT Systems

At the center of IT security in government is the Authorization to Operate (ATO) process. If you’ve ever worked for the government, I’m sure you groaned just having to read that phrase. FISMA requires that for all IT systems, agencies must implement a series of “security controls” — measures defined by the National Institute of Standards and Technology (NIST) to enhance security. This is an extremely laborious process, and a new product may take months to meet the requirements of a security review. The process generates a lot of paperwork — enough to stop bullets, but paper isn’t very effective at keeping out nefarious attackers. Many agencies only have a three-year cycle of re-assessing products for these security controls — basically only checking to see if the door is locked once every few years. Moreover, the interpretation and implementation of these controls differ wildly between agencies.

Several agencies have started separate pilots to improve the consistency and speed of this process. For instance, some agencies are working to implement a “lightweight authorization to operate” (LATO) or a “progressive authorization to operate” process, where only a subset of the security controls must be reviewed to begin developing on a platform, with further controls added along the way before launching the application for public use. Others are moving to “continuous authorization,” a concept similar to continuous integration for software testing, by using standard tools to automatically check the various security controls on an ongoing basis — providing real-time visibility into the security of the systems. Still other agencies are working to standardize security plan language, or to use natural language processing (NLP) as a means of reviewing paperwork-heavy controls faster. These efforts also relate to NIST’s work to standardize controls via a machine-readable structure called OSCAL, which is now being used by GSA’s FedRAMP program. Some of this work was previously being replicated via the CIO Council, but with the exodus of OFCIO staff those efforts have stalled. They should be spread across government via additional funding, staffing, and more pilots.
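To illustrate what “continuous authorization” looks like in practice, here is a toy sketch: a script reads a simplified, OSCAL-style machine-readable list of controls and evaluates each one on every run of a CI pipeline, failing the build when a control check fails. The control entries and check functions are illustrative assumptions only; they are not the real OSCAL schema or any agency’s actual tooling.

```python
# Toy sketch of a continuous-authorization check: evaluate machine-readable
# security controls on every CI run instead of once every three years.
# The control list is a simplified, OSCAL-style structure for illustration
# only; it is not the real OSCAL schema.
import json
import sys

CONTROLS_JSON = """
[
  {"id": "ac-2",  "title": "Account Management",                "check": "no_stale_accounts"},
  {"id": "sc-28", "title": "Protection of Information at Rest", "check": "storage_encrypted"}
]
"""

def no_stale_accounts() -> bool:
    # Placeholder: a real pipeline would query the identity system for
    # accounts that have gone unused beyond the agency's threshold.
    return True

def storage_encrypted() -> bool:
    # Placeholder: a real pipeline would query the cloud provider's API to
    # confirm encryption at rest is enabled on every storage volume.
    return True

CHECKS = {
    "no_stale_accounts": no_stale_accounts,
    "storage_encrypted": storage_encrypted,
}

def main() -> int:
    failures = []
    for control in json.loads(CONTROLS_JSON):
        passed = CHECKS[control["check"]]()
        status = "PASS" if passed else "FAIL"
        print(f"{control['id']:>6}  {control['title']:<35}  {status}")
        if not passed:
            failures.append(control["id"])
    # A nonzero exit code fails the CI job, surfacing the control gap immediately.
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```

The value of this approach is the feedback loop: instead of a paperwork snapshot every few years, the system’s security posture is re-checked every time it changes.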

Conclusion

These are just a few of the policy areas in government technology that need attention. There are other agency-specific projects needing further work that I haven’t covered here. However, these specific areas of focus will continue to build back better technology in government, and equip us with the necessary tools for the next decade or two.
