Why Is It So Hard For Government Programs To Deliver Results?

March 16, 2022 | 00:47:23

I See What You Mean
Show Notes

The US Federal government runs more than 3,500 programs, each designed to deliver a benefit to citizens or other customers in the US and around the world. Each is a collaborative effort with state and local governments, NGOs, and sometimes other countries. And each relies on actors at the end of many delivery chains, far from where programs are planned and managed. Teachers. Inspectors. Law enforcement. War fighters. Diplomats. VA doctors and nurses. And many more.

 

So how can the Federal government ensure its programs deliver the right benefit to the right beneficiary, in the right way - plus continuously innovate to stay relevant to citizens and partners?

 

I know of no one better able to navigate a challenge of this magnitude than Mark Forman. With 30 years' experience inside government and out, in the US and around the world, Mark can conceive of solutions to big problems and see how they work where people implement them, where the rubber meets the road. In this episode we discuss objectives and key results - OKRs - as a framework for getting people on the same page to deliver results and continuously improve return on program investments. Here are a few of my a-ha! moments:

 

2:06 - The OKR model as a framework for return on investment, especially of investments in transformations

3:11 - OKRs shift the focus from paying for activities to buying outcomes

4:19 - The relationship between OKRs and KPIs

5:38 and on - OKR use in the Federal government with 3,500 programs

11:21 - The crux of modernizing many of the Federal government's systems is the difference between the user experience and the user interface

19:04 - The failures in applying agile development globally - which was supposed to be a breakthrough for modernization - stem from using technology to simplify legacy approaches to work, rather than using technology to rethink work

26:35 - OKR conversations in 3,500 programs are a daunting endeavor. How could it be done at that scale?

31:05 - Using Federal Enterprise Architecture reference models to frame OKR conversations within and across programs

41:05 - Meeting the toughest innovation challenge - transforming programs as they conduct business as usual, which they must

43:05 - We can't boil the ocean so where do we start?


Episode Transcript

Speaker 1 00:00:07 Welcome to I See What You Mean, a podcast about how people get on the same page, or don't, or perhaps shouldn't. Today my guest is Mark Forman. Mark's our friend and federal government consulting colleague with a career of government IT transformation, including senior leadership roles in the federal government. Mark, welcome to the show. Speaker 2 00:00:26 Good to be here, Lou. Speaker 1 00:00:26 Thanks, Mark. I'm looking forward to our conversation. Why don't you give listeners a short bio about yourself? Speaker 2 00:00:32 Thanks, Lou. I've spent over 30 years inside and out of government and industry, in a variety of roles in IT reform, both on congressional staff and in the White House as a presidential appointee, as the first administrator for e-government, also known as the federal CIO. More recently I've worked with governments all around the world on management reform and IT transformation, and testified not only in front of the US Congress, but for state legislatures and in Australia as well. Speaker 1 00:01:07 Hmm. And one of the things I like about your background is you're an old operations research guy. Speaker 2 00:01:13 That's my original training. I'm not a techie by training, but you know, to do any kind of management reform in government, you have to understand the flow of information. And information resources management, as it used to be called, includes the technology used to take advantage of that, whether it's workflows or data analytics. So much in government revolves around information, increasingly so. Speaker 1 00:01:40 It is called the chief information officer for a reason. We talked about OKRs, objectives and key results, when we were doing our prep call, so let's start there. You mentioned them as a framework for government to get a better return on IT spending. So let's start with, tell us what OKRs
are, how the federal government could use them, and what using them would get people on the same page about, Mark? Speaker 2 00:02:06 Oh, those are great questions. The OKR model is really a framework for return on investment. It's got three parts to it. An objective that is transformational: you want to have a big impact, and that objective should carry you as an organization for a year or two years. And of course, most of the government's major IT investments are at least two years, so they're intended to have a big impact. When government does a major IT investment, it's $50 million to $2 billion; it should have a big impact. Under that are quarterly key results. And in the way this is being used in Silicon Valley, the venture capitalists, the people making these investments, are looking to the software companies to not just reach a milestone or fund an activity, but actually create something that a customer buys. So it's useful. It's an outcome that matters, and it matters to their customer. And the third element is the activity. Speaker 2 00:03:11 Now, the way the government IT investment capital planning and investment control system was set up, you measured the cost of the IT investment, you measured the schedule for the IT investment, and what became the performance measurement was milestones: did you complete an activity? And it's because a lot of folks in government feel that they don't really deliver an outcome; they can only deliver what they can control, which is the activity. The OKR model would force them to look not just at what activity they're doing, but at the relationship between that activity and a performance improvement, an outcome that matters to the citizen, or to the state and local governments that are supposed to work with the federal government, right? So that's the real value. It shifts the focus from paying for activities to buying outcomes.
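The three-part model Mark lays out, a transformational objective carrying an organization for a year or two, quarterly key results, and the activities beneath them, can be sketched as a small data model. This is an editorial illustration only; the class names, the example objective, and the numbers are hypothetical, not anything from the episode.

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    """A measurable quarterly outcome, not a completed activity."""
    description: str
    target: float   # e.g. 80.0 = 80% of applications approved within 30 days
    actual: float = 0.0

    def met(self) -> bool:
        return self.actual >= self.target

@dataclass
class Objective:
    """A transformational goal meant to carry a program for a year or two."""
    description: str
    key_results: list[KeyResult] = field(default_factory=list)

    def on_track(self) -> bool:
        # The objective counts as met only if every key result is achieved.
        return all(kr.met() for kr in self.key_results)

obj = Objective("Reduce the time tax on benefit applicants")
obj.key_results.append(
    KeyResult("Share of applications approved within 30 days (%)", target=80, actual=55)
)
print(obj.on_track())  # False: the quarterly key result missed its target
```

The `on_track` rule encodes the convention discussed later in the episode: key results must be worded so that achieving all of them means the objective is met.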
And the real transformation is it gets the folks in government who are doing these transformation projects to understand which activities actually lead to a better outcome. Speaker 1 00:04:19 Thanks, Mark. And OKRs have a pretty storied history at Intel and Google, so it's not a new thing, and we're not just breaking the learning curve on it. But how are they different, just briefly, from management by objectives or KPIs, which we're all quite familiar with? Speaker 2 00:04:38 Well, clearly you can have a KPI, a key performance indicator, that can be a measure of an outcome. And in that case, you could use that KPI as part of your KR, your key result, for a quarter. So you don't necessarily get rid of KPIs. It's really what are you measuring? And the title of John Doerr's book, Measure What Matters, is the heart of the issue, right? Yeah. Speaker 1 00:05:09 And how would some larger organizations have used these? But the federal government's large, right? And there's large and there's larger. Speaker 2 00:05:18 Biggest IT spending in the world, you know, over a hundred billion dollars a year of spend, right? Speaker 1 00:05:26 You mentioned a number to me which I've forgotten now. How many programs did you say this administration is setting some objectives for? Is there a count of federal programs? Speaker 2 00:05:38 There is a count of federal programs, and there's a focus on improving customer experience. There are about 3,500; I think it's actually over 3,500 programs. These programs are things like the food stamp program, the Head Start program. So 3,500, quite a few programs. And the president has laid out, in the president's management agenda and in what he calls the customer experience executive order, that each of these programs should modernize to improve the customer experience. So the question is, can you have 3,500 different customer experiences?
If you're one person dealing with multiple programs, do you really want each program to have a different customer experience? Speaker 1 00:06:24 Right now they could. It still boggles the mind, but they could have their own objectives, and they certainly could have their own key results. But when they were used, as I've read about at Intel and Google, they were established up and down the chain of command. Some of them were top down, but many were bottom up. And in fact, some of those leaders whose names we know, Andy Grove and the Google CEOs, thought it was very important for teams to be able to establish their own OKRs, not all of them, but a subset of them. So how could you imagine the federal government going about doing that? Speaker 2 00:07:05 Let's take a look at the customer experience area. The president in his executive order on customer experience asks the agencies to focus on one key measure of success. In the commercial world we call it response time or cycle time; the president called it the time tax. How long does it take, if you are deserving of a benefit, to actually get that benefit approved? And it could be months; in some cases it's over a year. Yeah. And so part of the issue is you've got these legacy approaches where you fill out a form. It used to be paper; now it may be paper or a website, and there's still a lot of paper in the federal government. But you fill out this form, the form then goes into a system, the system runs through some checks.
And as we've seen with recent reports on the PPP, the Paycheck Protection Program, and some pandemic relief, the way these had been automated, regardless of whether it was true or not, if you figured out how to put the right data and the right elements into the form, you got the benefit, and people who were legitimate, trying to figure out the forms or fill in the forms, didn't get the benefits. Speaker 2 00:08:30 Part of that was, maybe 80% of the cases fit these traditional simple business rules, and you can run them through an automated workflow, and isolate the 20% of the people where you need a caseworker, where you need a more sophisticated approach, where you need somebody to look at the situation and apply some judgment. How we treat applying for benefits, or for that matter state and local governments that get grants, university research centers that get grants, environmental and other regulatory compliance, has to become more sophisticated. We're in the 21st century now, and it can't be stuck in queues. So the question is, if you're in one of these agencies or departments and you're building an IT project, are you improving the customer experience of a form that was built in an industrial era, in a workflow or processes decades out of date? Are you doing something that really gets to the heart of this time tax issue, this response time or cycle time mission, that the president's called out? Speaker 1 00:09:45 Well, one of the things that I think is good about OKRs is that if every one of those, say, 3,500 programs had an objective that said improve, let's say it's called time to benefit, the time from when somebody applies for a benefit to when he or she receives it, each of them would have their own appropriate activities and they'd have their own appropriate measures, right, the key results.
So that's useful, because then government-wide you have one objective, and by program, or let's say by department, you've got appropriate key results. The theory with OKRs is that the key results have to be worded such that if they're all achieved, the objective is met. That's a great sort of rule in the system, which I like a lot. So let's say some smart and experienced government officials at different levels of an organization put their brains together and come up with those. Speaker 1 00:10:45 And you mentioned quarterly, I think, quarterly reviews. Yes, it's the measuring quarterly, which is probably a good pace. Tell me how you see some of the mechanics working, like a workflow, the process. If they've established those key results, and then they turn to their key activities, what are we going to do to accomplish those things? Tell me how you see that process going, and how would that jibe or not jibe, and it's okay if it doesn't, with what they're doing now? What would have to be done differently, maybe what would have to be abandoned, let go of? Speaker 2 00:11:21 Yeah, so I think those are great questions. And I think the crux of the issue in modernizing so many of these government systems is really the difference between the user interface and the user experience. The user interface, maybe the website, is easier to use. But what about the people who don't have access to the web? What about the people in state and local government who have their own systems and can't interface with these highly customized federal systems? So there are things on the technology side that relate to the processing of the data, and then there's also the process. So I'll give you an example. You're in Florida, right? Okay.
Hurricanes come. The federal government gives money to the state government, and the state government makes some money available for the local communities. Those local governments could be townships, could be a variety of local organizations. That local organization can't just go off and spend that money. Speaker 2 00:12:39 They've got to fill out this federal form called a project worksheet, I believe it's called. And that project worksheet goes through the state to FEMA to get things approved. Now, that project worksheet is basically a spreadsheet, and it's got a lot of terminology and costing and categories that you fill in that were built back in the 1980s. In the last iteration, the government spent a lot of money simplifying that project worksheet interface. And then they went out and they trained, this was after Hurricane Maria, they went out and trained a bunch of the local communities across Florida, and still nobody could follow the process. In that case, they made it easy to fill in the form, but they didn't fix the underlying process. Most of the people filling out that form were people in the township; maybe they were part-time, maybe they were volunteers, maybe they were with the volunteer fire department or an actual EMT. Speaker 2 00:13:48 And they didn't know the terminology. So it was easy to fill out the form, but they didn't fix the underlying binding constraint on making that form simpler to use, or the process easier to work. In the OKR model, if you adopt the way that Silicon Valley addresses this, you would have seen that nobody was using the output of this investment, in this case a simplified form. And so if your metric is I built this, and the key result is a 10% increase in the use of this form and this process, and you saw a 0% increase, that would become feedback to the team saying, right, we did the wrong thing. We did it the right way,
we used agile development, we just did the wrong thing. Speaker 2 00:14:51 So this is very much, I think for government, a requirement to focus on the concept of binding constraints, or the bottlenecks of a process. And it forces you to look beyond the technology to actually see what the process issues are. What's the relationship between simplifying the process and the application of technology? Maybe there aren't enough people in that agency to run that process, so you build up queues: you made it simpler, a lot of people used the process, but only 10% get through the queues, because you made it too simple, too fast, right? I think the value here is it shifts the focus of these modernization initiatives from an IT project to a true customer experience, looking across people, process, and technology. Speaker 1 00:15:50 Okay, a shift from IT project to a broader look at, as you said, people, process, and technology. Here's the thing though, Mark. Part of that, I think one of the things that OKRs should do, is foster or prompt or require a different conversation between parties, right? The objective can be set from the top, but you need to have a different conversation to know what the key results should be and the activities that would achieve them. So imagine federal officials, let's say FEMA, sat down with some state officials, maybe they were emergency management people, maybe they were some others, and said, here's what we need, here's our form. And let's just say that it had a bunch of cost accounting kinds of things in it, right? Because FEMA has to report something to OMB and then report something to Congress. Speaker 1 00:16:45 And sometimes somebody from FEMA gets called up to Congress. There's got to be cost accounting. Fair enough. But that's not the objective of the program.
That's an administrative requirement, perhaps, but it's not the objective of the program, if the program is to get federal money into the hands of local communities for, let's say, emergency preparation, disaster prevention, mitigation, and recovery. In the right conversation, and this has happened before, it isn't like government officials never have these conversations, but in the right conversation people would say, look, we understand you have to account for the money. Let's talk about how to do that without making the form all about that, or making the process bottleneck on that. Because, and here's where I pick up your point, we've got volunteer firemen filling these out. We've got small communities along the coast filling these out. Speaker 1 00:17:40 It's not always Tampa, right, with a government infrastructure. It's not always a city; sometimes it's counties, and sometimes small jurisdictions, and they're not going to get it. They're not going to know what you're saying. They're not going to know what to do. And they're going to think they're going to go to jail if they get it wrong. Right? You can cause a bottleneck because you're well-intentioned. You want to be able to account for federal money, you think that part of the process of giving out the money and approving its expense, its distribution, should be to have a form filled out, and you cause problems you don't foresee, because you didn't have those conversations. I think it would be important, and I want to get your thoughts on this question, that depending on the program, 3,500 or 4,000 programs, they should be thinking about who to have those conversations with. Speaker 1 00:18:33 Maybe it's just within the federal family sometimes. Maybe it's to state and local officials sometimes. Maybe it's to other entities in the country.
Because really, if you think about the way OKRs were used in companies, those were smaller operations, no matter how big they are, and they had those conversations up and down the chain of command. And sometimes it was what people at the top couldn't know, that came from the bottom, to say, here's also what we should do, what we should measure, and that rounded out the OKR. Speaker 2 00:19:04 That's absolutely correct. And Lou, of course, some of this is inside baseball, because you and I did do that trip around the US, meeting with these various regional offices. And what struck me was how so many of the central-office-determined IT investments or IT modernization initiatives were well-intended but actually were constraints on the ability of the field to do the job, right? So very much this needs to be a bottom-up understanding of the requirements of the field. But I'll tell you, globally, the failures in applying agile development, which was supposed to be a breakthrough for modernization, were directly a result of pulling together focus groups and not letting the people in the focus group see what technologies could actually simplify the workload or simplify the practice, but merely asking how do we simplify the legacy approach to doing the work. Speaker 2 00:20:15 I think the two have to go hand in hand in using the focus group. And that's why I think this OKR model is so powerful in the way Silicon Valley did it. It's not what activity was completed; it's who used what you built, and did they renew their use, did they increase their use, were more people using it, right? If you did that with a lot of these federal programs, you'd see a good number of them were not being used; people were forced to use them. And I think the government and the central offices would see what we saw in the field.
People were doing workarounds, doing their daily work with a spreadsheet or paper and then putting it into the system when they were told they had to, but the system wasn't supporting their work. Speaker 1 00:21:06 It was, they were supporting... Speaker 2 00:21:09 Supporting the headquarters, or the people who wanted the systems. But the people were doing their work around it, and the systems were another burden on them to get their work done. Right. Speaker 1 00:21:21 We've got to fix that. It was, ironically, on one of those trips that you turned me on to the theory of constraints. The Goal was the title. We were driving to dinner, I was sitting in the backseat, I downloaded it on my phone, bought it, and began to read it. And I've read it since then, and several others about the theory of constraints. So you made a connection I didn't see coming. Say a little bit more about that. Speaker 2 00:21:49 Sure. You know, the book The Goal highlighted that you could use computers or computerized machinery in a business process, and each component could be processing at the max performance level, and the overall business process could still be failing: errors, queues that slowed things down, inefficiencies, the typical process metrics. And the point was that you had to understand what was the outcome or the objective you were trying to achieve from the process. And then you could manage the flow of work through that process, and manage what you applied and how you applied your technology, to maximize the performance measures for that process, be it quality measures, cycle time measures, cost measures. But you had to do it from a comprehensive look across the process, as opposed to looking at each piece of that process. Speaker 1 00:22:56 Right, right, right. And isn't it, sorry, go ahead. Isn't it also true,
isn't it always the case, that there will be a constraint somewhere in there? Speaker 2 00:23:05 Right, or a bottleneck. Speaker 1 00:23:06 There will be a point where something slows down, always. And the goal was to do something about that constraint, but even if you do, another bottleneck will be somewhere in the system; there always will be. So I think part of what I got out of it was, for process improvement, or for good processes, or for working better with customers, you should always be having those conversations about what's working, what's not working, where is something getting hung up, where are you having to work around something. Right? Speaker 2 00:23:38 And understanding the flow through that process, right. So the equivalent in the OKR model is called value stream analysis. And the concept is that if you're looking at achieving the OKR and you find out that you're not achieving your key results for a quarter, you ought to be able to dig into it and understand where the process broke down in the value stream. There's a company called Tasktop, and the CEO has done a lot of work on this. In fact, he wrote a book about this concept, specifically around agile development, but it applies to other things. The example would be, you have a software company, maybe it's mid-stage, and the VCs, the venture capitalists, would put performance measures on its OKRs, where they're looking at things like renewals and sales growth. Speaker 2 00:24:38 And they have a quarter where the renewals are 20%, which means basically 80% of your customers are going away, if they're not renewing. You've got to be able to look into your operations and your activities and see what's breaking down. And in the example, you could see things like the developer staff's unhappy.
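The theory-of-constraints point in this exchange, that every stage can run at its maximum performance while the end-to-end process still underdelivers, and that relieving one bottleneck only surfaces the next, can be illustrated in a few lines. The pipeline stages and capacities below are invented for illustration, not taken from the episode.

```python
# Hypothetical benefits-approval pipeline: applications each stage
# can process per day. Stage names and numbers are invented.
stages = {
    "intake form": 500,
    "eligibility check": 120,
    "caseworker review": 60,
    "payment": 400,
}

def bottleneck(stages: dict[str, int]) -> tuple[str, int]:
    # End-to-end throughput is capped by the slowest stage, even if
    # every other stage runs at its maximum performance level.
    name = min(stages, key=stages.get)
    return name, stages[name]

print(bottleneck(stages))  # ('caseworker review', 60)

# Relieving one constraint surfaces the next: there is always a bottleneck.
stages["caseworker review"] = 300
print(bottleneck(stages))  # ('eligibility check', 120)
```

This is the same reason optimizing a single component's metrics, a simpler form, a faster website, can leave the overall cycle time unchanged.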
And why is the developer staff unhappy? Well, they're not able to focus on creating new features and functions. Why can't they do that? Well, they've got to focus on putting in place security controls, because they never built the security controls in from the beginning. So rather than fixing the product, they've got all these security control issues that clients want and need, and they had never put them into the product. So that's a value stream model applied to systems development and the achievement of OKRs in IT development. Speaker 1 00:25:43 So imagine again someone from all of those federal programs who's going to take this on. They're going to use OKRs, and start a process of using them to begin the conversations with users, I would hope. Mark, I would hope that adopting OKRs as a framework would cause programs to have conversations, or to learn about their customers, in ways that they aren't right now, so they know what the next value added should be about, right? We know what the program's supposed to do; that's in legislation, that's in regulation, that's in a program. But the customer's life is changing all the time, and innovation in the commercial world is to be on top of that, so that you know what needs are emerging in a customer's life, so you can offer a value proposition to meet those needs. Speaker 1 00:26:35 You can maybe create new value, innovation. I would hope that OKRs, which set objectives over a manageable period of time, like you said, a year or two, with key results on, let's say, a quarterly basis, would make people begin to understand the need: why we'd better be gathering information we're not gathering right now, why we'd better be talking to people we're not talking to often enough, or maybe in forums that are different from meeting them at a conference, which is legitimate and perhaps valuable.
But maybe we need to add to the communication we have with customers or partners in this system of benefit delivery, and change those conversations. Those seem like daunting things to do. How do we do it? Do we hire facilitators? Do we, you know, rent some hotel space? How have you seen it done? How could it be done? 3,500 programs ought to be thinking about those kinds of things if they're to use OKRs, just for the customer experience objective. Speaker 2 00:27:36 Well, there are some breakthroughs sometimes across multiple programs. There was a law passed a few years ago called the 21st Century Integrated Digital Experience Act, 21st Century IDEA is the acronym, and it requires the White House Office of Management and Budget to identify high-impact service providers. These are some kinds of programs, but a lot of it is the organization that has multiple programs, like the Veterans Benefits Administration in the Veterans Affairs Department, and there are lots of programs under that. They have to take and report survey data, and there is a set of specific questions about: does the process or service you received today help improve your trust that your needs are being taken care of, help improve your trust in the federal government? And it's tied to whether you come in via the web or you come in via a mobile app. Speaker 2 00:28:43 That's fine if you're coming in through electronic means, but if you're coming through manual means, right, I think that's kind of the next wave. But when you look across all these different programs, I think it's really incumbent on the White House, on the Office of Management and Budget and the e-government office, my old office, to bring forth that citizen-centric approach. One of the biggest constraints in the federal government is there is no office of citizen services.
There's the Department of Labor; that's labor programs. There's HHS, which gives grants to the states, where you get case management and social services from state governments. There's the food stamp program in Agriculture, SNAP. There are multiple other programs that are a mixture of you coming to the federal government or coming to state government. We created an initiative when I was at OMB called GovBenefits. Speaker 2 00:29:45 And that was the concept that you ought to be able to get an integrated benefit form, or be able to apply once, right? And so much of this is tied to the states. So, you know, you've got roughly 50-plus states and territories, depending what state you're in. That's not rocket science with the computers anymore. You ought to be able to fill out an integrated form, it ought to be able to route to all of these different benefits programs, and you ought to be treated like an individual by the government, not like 30 or 40 different individuals because you're able to access 30 or 40 Speaker 1 00:30:23 different programs. Right, right, right. Yeah. Speaker 2 00:30:26 So of course, absent that integrated customer care case management, like you'd get at Amazon or a Facebook marketplace, you become the integrator; you've got to figure all this out yourself. Right. And for a lot of people, the studies have shown, it's just too confusing. You'll go to the easy ones, but the ones that could really help you, you just don't go to if it's too hard. The verbiage of the federal bureaucracy is really hard to understand, even for people working in the federal bureaucracy. Speaker 1 00:31:05 Yeah, it is. That raises an interesting question about where the borders or the boundaries should be. If we've got, let's say, 3,500 defined, identifiable programs right now, that might not be the best way to divide things up, to think about the OKR questions.
Years ago I had some reason to look at the federal enterprise architecture, Mark, and I always thought that the business reference model was pretty smart. It had a nice organization of the business functions of the federal government. And it always seemed to me that if you set OKRs by those functions at a high level, you could cascade them down. Some of them would go to different departments, but that's the point. We have created that much variety of programs across the federal government, but we want to manage them better, and you said, you know, a better framework for ROI. Somewhere in planning and execution and review and budgeting, we have to be able to roll up the results to see what they are at a business function level or some higher level. How are we doing with healthcare, healthcare for veterans or healthcare for children? How are we doing with transportation, transportation safety? So many things cross over. It wouldn't make sense to have a lot of OKRs that work wholly separate, completely separate, by departments or even by parts of one department. Somewhere you have to roll them up. Do you see that happening? And then Congress is going to get involved in that. Congress would have a lot to say
So the question at the top that you're asking is what we call the Performance Reference Model. People should know that every spring, the federal government makes a report to Congress: for the money that's been budgeted, what were the performance results for each of these 3,500 programs? And under the Results Act, the agencies are supposed to work with Congress to set the strategy and the targets for the next year.

Speaker 2 00:33:50 For example, if it's the school lunch program or the food stamp program, how much are we going to reduce hunger? That would be the discussion. Now, the Business Reference Model says we've got dozens of lines of business and dozens of activities in the federal government. What are the lines of business that have to improve in order to achieve that goal? And under that, what are the business processes that support those lines of business? Those run through the grant programs to the state governments. So what are the business processes for the grants? And the grant programs are done using systems and various reviews. It's a very bureaucratic system where an increasing share of the dollars that are supposed to help people actually goes to administering the process. So it could be as simple a thing as: we want 50% less of the funds spent by the states on administering the program, we want that 50% to go into reducing the amount of hunger, and we measure all of that. That would then drive the IT initiative.

Speaker 2 00:35:15 Now, when you get to the state government, the state government generally has local governments doing the casework. If it's state programs, it's generally through a county social worker or caseworker, and the place where the reforms are occurring is in what caseworkers call practice models. So there's a slew of innovation going on at that local level in how to improve the practice model.
And I think one of the great questions is: how do we take the innovations out in the field and propagate those all the way back up to the top? So it actually does free up the funds, and it frees up the funds not just in the area doing the innovating, but across the US. And I think when you start to focus on leveraging innovation from local governments and one community across the US, then it becomes easier to set the OKR.

Speaker 1 00:36:18 The hierarchy of reference models you mentioned, and the process of setting objectives and measures to report on annually, presumably that process would become an OKR conversation.

Speaker 2 00:36:35 So back to your discussion of the cascading issue. You figure a caseworker at the local government is probably looking across 20 to 40 programs to figure out the best combination for the client, but any one program only sees and focuses on its own specific IT investment. At that Business Reference Model layer, really at OMB, you have the opportunity to look across programs. So the OKR can cascade down to those 20 or 40 programs, and innovation can be shared across those 20 or 40 programs. You can use the OKR process to drive innovation from one group across the country and across the other programs. That's the way this cascading would work in the federal government.

Speaker 1 00:37:27 I think it could be used profitably, and I think it would change the investment picture. It ought to change the conversations that people have about investments. This is an age-old problem: a lot of programs want their own IT, and it's a bit stovepiped. It's their investments, their hardware, their software, their solutions that they apply to problems in order to deliver the mission of the program.
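The cascading and roll-up idea Mark describes can be sketched in code. This is a minimal, hypothetical illustration, not any actual federal system: program names, numbers, and the averaging rule are all invented here. It shows a shared objective cascaded to several programs, each with its own key results, then rolled up by business function the way the BRM/Performance Reference Model discussion suggests.

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    name: str
    target: float
    actual: float = 0.0

    @property
    def progress(self) -> float:
        # Fraction of the target achieved, capped at 1.0
        return min(self.actual / self.target, 1.0) if self.target else 0.0

@dataclass
class ProgramOKR:
    program: str                # one of the ~3,500 federal programs
    business_function: str      # a BRM-style line of business
    objective: str              # shared objective cascaded from the top
    key_results: list = field(default_factory=list)

def roll_up(okrs):
    """Average key-result progress per business function: the 'roll-up'
    to a higher level that the conversation describes."""
    totals = {}
    for okr in okrs:
        scores = [kr.progress for kr in okr.key_results]
        totals.setdefault(okr.business_function, []).extend(scores)
    return {fn: sum(s) / len(s) for fn, s in totals.items()}

# Hypothetical data: two programs sharing one objective
snap = ProgramOKR("SNAP", "nutrition assistance", "Reduce hunger",
                  [KeyResult("households reached (thousands)", 100, 80)])
lunch = ProgramOKR("School Lunch", "nutrition assistance", "Reduce hunger",
                   [KeyResult("meals served (millions)", 50, 30)])

print(roll_up([snap, lunch]))  # roughly 0.7 for 'nutrition assistance'
```

The point of the sketch is that each program keeps its own key results, while OMB (or whoever owns the business-function layer) can read one rolled-up number per line of business instead of 3,500 disconnected reports.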
But we're talking about needs from a citizen's standpoint, from any federal government customer's standpoint. From their standpoint, they don't care where the IT came from, where the IT investment was made, or who made it. But people own some of those IT investments and they guard them. So how would the conversation change if we could look across programs and say, you know, there are 10 or 15 or 20 or a couple of dozen programs that each have a piece of some OKRs? Maybe the objective is shared. We'll have some different key results, and we'll have some separate key activities, but maybe we also integrate or combine some, or collaborate on some. That has big implications for changing the conversation about how an investment is made in IT in the next cycle or two.

Speaker 2 00:38:45 I think you're right. The question you're getting at is how do we inject the voice of the customer across programs, as opposed to one by one? Because in the one-by-one model, you run the risk of sub-optimization. In some of the recent President's Management Agendas and executive orders, and I think this really goes all the way back to the Bush administration and the E-Government Act, there was a clear focus on taking a portfolio approach. So, much like an Amazon, there are things you're going to do between your business and the other businesses that are part of your environment, and there are things you're going to do across those businesses between you and your customer. Your ideal is that the customer is not just coming once, but coming to your environment for multiple things, right?

Speaker 2 00:39:52 That's a little different in the sense that the government doesn't want to build dependency, right? The government wants to be like a safety net and build independence, not dependence, I believe.
But then you have other aspects in the regulatory environment, environmental regulation, for example. For many years, Congress has had hearings and GAO has reported that differing regulatory language causes confusion among the regulated, because they're not regulated by one agency; multiple agencies are regulating generally the same thing, or at least overlapping things. So you have to take a look at this. There needs to be a structure that looks at this from a portfolio approach, what you could call customer-centric. Obviously in the regulatory environment you're really regulating businesses, but is the business the real customer? It's a public good; the public is the customer. You have to look across these systems and programs, and that's really the job of OMB.

Speaker 1 00:41:01 Fair enough. Is there anything else that has come to mind that you want to talk about? Sure.

Speaker 2 00:41:05 A couple of last points on this. The OKR model is really good for transformation initiatives, but the agencies still have to run their business as usual. I really like Christina Wodtke's Radical Focus, the book she wrote. She talks about a cadence where you have a quarterly assessment of your OKRs, but every week you're looking at your activities: are you getting close to achieving the key results and the objective? And while you do that, you're also tracking: are we keeping our eye on the business-as-usual metrics? She calls them health checks or health metrics. So agencies need to focus on OKRs for the transformation initiatives, but KPIs for their operating plans and their operations.

Speaker 1 00:42:04 You know, Mark...

Speaker 2 00:42:06 Yeah, so that's one key point.

Speaker 1 00:42:09 Please comment on that before you go to the second one. Yeah.
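The cadence Mark credits to Radical Focus, quarterly OKRs, a weekly check on key-result confidence, plus health metrics that guard business as usual, can be sketched as a simple weekly status check. Everything here is a hypothetical illustration: the key results, confidence scores, metric names, and the 0.5 confidence threshold are all invented for the example.

```python
def weekly_check(key_results, health_metrics):
    """One weekly review: flag key results whose confidence has dropped
    below 0.5, and health metrics that have slipped below their floor."""
    at_risk = [name for name, confidence in key_results.items()
               if confidence < 0.5]
    unhealthy = [name for name, (value, floor) in health_metrics.items()
                 if value < floor]
    return {"at_risk_key_results": at_risk, "unhealthy_metrics": unhealthy}

# Hypothetical transformation OKR: confidence (0..1) that each key result lands
key_results = {
    "50% less spent on administration": 0.7,
    "integrated benefits form live in 10 states": 0.4,
}

# Business-as-usual health metrics as (current value, acceptable floor)
health_metrics = {
    "claims processed per week": (9500, 10000),
    "calls answered within target": (0.97, 0.95),
}

status = weekly_check(key_results, health_metrics)
print(status)
```

The design point mirrors the conversation: the OKRs drive the transformation, while the health metrics make sure the agency notices if business as usual starts to suffer while everyone chases the stretch goal.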
Don't forget the second one. That makes me think of something we've talked about before, which is innovation in government, innovation in any organization. Innovation in companies that live on it is difficult, because their operations are set up every day to create and deliver today's value to today's customers, and they want to do that as efficiently as possible. And yet the customer's world is changing. So the organization also wants to do different things, to create new value for the customer, whether existing customers or new customers. Current customer and current value, future customer and future value: that's hard to do in one organization. Even the best organizations in the commercial world are challenged by how to do that, and there are books written on different ways to do it. To me, it's the weakest part of the innovation literature.

Speaker 1 00:43:03 There really needs to be more study on how one organization delivers today's value and creates new value at the same time. Government could have a real hard time doing that for the same reasons. They have set up programs, the programs have intended benefits and beneficiaries, and they are expected to be efficient in how they deliver those, whether they are or not. So they've got metrics they're trying to hit for operational efficiency, to get the benefit to today's customers. But we've talked for most of the hour about change, about the government continuously creating new value for customers. So what I heard you say was that a combination of things, OKRs being one, could help with these two different conversations and perhaps these two different organizational endeavors.

Speaker 2 00:43:50 Right. I think you can't boil the whole ocean, especially at the start, right? The focus today on rewarding activity, essentially rewarding actions rather than results, has to change.
And I think an organization has to learn how to pick the right activities, the ones that generate outcomes. So I think one approach is to get agencies to learn by picking just one or two of their IT investments. Out of the roughly hundred billion dollars, somewhere around $15 billion is spent on what's called development, modernization, and enhancement. These are your modernization initiatives, and every agency has multiple ones. How much change can a cabinet agency or cabinet department take on? I think it's a great question, and I don't think there's any perfect answer yet. So start with something very small and focused, a stretch. The second point I was going to make is that it's a learning process. It's not a rigid rule where, if you do this, success magically occurs the first quarter. Right. You have to learn which activities generate the outcomes. If you've never focused on outcomes, and you just focus on funding the completion of activities, it's a random walk: maybe you'll achieve it, maybe you won't. Learning which activities generate outcomes is almost an iterative approach. And so I think that's a key for agencies to be successful here.

Speaker 1 00:45:40 Yeah, you're right. And the literature about OKRs emphasizes that the organizations that came to do them well had to learn. They didn't do them well out of the gate. They had to learn how to have those conversations, who to involve in them, when to bail on something, when to double down. You have to learn those things. And I think the only safe way to do that, if we do need some safety in it, is like you said: pick one or two. Maybe pick an objective or two, pick IT systems related to it. You're almost experimenting with it yourself to see how to apply it, to see what works and what doesn't work.
And before you roll it out, before you start tying investments to it, before something really consequential is done with it, right?

Speaker 2 00:46:26 Oh no, I think that's right. I think we've covered a lot of the ground. Okay.

Speaker 1 00:46:29 In my notes here, I had fun going back and looking at Doerr's book and some other things, and thinking about this, especially as it applies to the federal government. Mark, thanks for joining the podcast. You know, whether we were working or golfing, I always had a lot of fun talking with you and learned a lot. So thank you very much for taking the time to do this with me today.

Speaker 2 00:46:51 Likewise, frankly, this was great. Thank you. All right,

Speaker 1 00:46:53 my friend. Appreciate it. All right. Take care. You too. Bye-bye. And that's how we see it, my friends. I want to thank Mark for recording today's episode. You can find it at iseewhatyoumean.com, plus all the usual places. Send questions and suggestions through the app. Subscribe and give me a five-star rating, unless you can't, in which case, let me know why. Do join me next week, when we'll take another look at how to get on the same page and stay there. Until the next show.
