In 2015, a study released by Bootstrap Solutions with help from the University of Nevada-Reno outlined the economic benefits brought to Idaho by the Idaho Army National Guard in the Treasure Valley. Results from this study found that the annual activities of the Idaho Army National Guard supported 2,800 direct and indirect jobs and contributed more than $155 million to the local economy. The study appeared sound and was very well-received. That’s when things got interesting.
Kevin E. Cahill, Ph.D., wrote a guest opinion piece for the Idaho Statesman in which he prescribed a framework for interpreting the results of the study and provided a wider social context for the economic benefits it described. Last year, Jenny Thorvaldson (IMPLAN's Chief Economist) and I got to talk with Dr. Cahill about his opinion piece and about how publishers of economic impact studies can qualify their results to account for the non-economic factors at play in a study region.
Tim French: What motivated you to write the piece? Do you regularly look out for studies like this to write about for the opinion section or what was the genesis of all of this?
Kevin Cahill: I wasn't on the lookout for anything in particular with respect to IMPLAN. I'm a labor economist and an applied econometrician. My field is actually the economics of aging. I work for Boston College and for ECONorthwest, and my work typically has very little to do with IMPLAN. I moved from Boston to Boise in 2010, and since I moved here I've been intrigued by this issue of the F-35 fighter jet being based here in Boise. Personally, I'm not happy about it because I think that it would really have a negative impact on quality of life. But as an economist, I think from a public policy perspective, if this area wants to invite something like the F-35 into town, there should be some sort of cost/benefit analysis that's done well. Back in 2012, the Air Force issued an Environmental Impact Statement (EIS) and they used IMPLAN, and somebody asked me to take a look at it through an economist's lens and to comment on it. I did, and my critique was essentially the same as what you saw in this one: IMPLAN only gets you so far. It does a very good job at quantifying certain kinds of impacts, but when you want to use it in the context of a public policy decision, and (implicitly) as a cost/benefit analysis, it falls short because of the negative externalities that might exist. So that's why I wrote the op-ed: Because this topic resurfaced, and the Department of Commerce here started using the results from the 2015 impact analysis that was done by the National Guard to talk about the F-35. As an economist, that is just wrong, and I wanted to express that viewpoint.
TF: Are there a certain number of impacts, or is there a set of impacts, that should always be included in any kind of reporting on projects like this? There's obviously the economic impact, but also I think you'd mentioned the quality-of-life impact, the social impact—are there others that should always be considered as a standard set?
KC: Great question, Tim. I was testifying in Seattle last month on a case that involved a composting facility, and the facility has all these positive social benefits: It reduces waste in the landfill, it helps with water runoff, and all the positives that come with composting. But some of the residents in the area complained that when the wind blows a certain direction, the foul odors from the composting facility impact their quality of life. I was called on by the composting facility to do an analysis; one that involved the economic impact to the community. And then there was a public policy side too, where I talked about the benefits to the community. I was on the stand and I actually used IMPLAN and I talked about the economic impacts. In the cross-examination, the other side's attorney said exactly what you said, which is "When is it that you need to include offsets? Because you have a pretty big mouth, Dr. Cahill, and you talked about the F-35 and you talked about the negative impact, but in this case you're not talking about it." And, you know, I think it comes down to judgment. I'm a researcher and economist. How large does a negative impact have to be before you'd say you actually need to go through these offsets? In that case I made a professional judgment that those certainly are negative, having a fairly foul odor every now and then, but that it's not enough... it's almost a footnote, you know? You'd say "here are some negatives that you might want to consider in addition to some positives that were not quantified," so in that case I didn't use offsets. But with the F-35, I think the negatives are massive. Another good example: I testified in South Dakota for the Sioux tribe about what I thought about the State Department's EIS. The State used IMPLAN, and that was one where I think the negative consequences of an oil spill right next to your water supply are not trivial and need to be taken into account.
So, I've been on both sides of this, and I do think it comes down to professional judgment: how do we prioritize, how do we think about the negatives, and are they or aren't they enough to warrant an analysis where you have to consider offsets? Sometimes they are and sometimes they're not.
TF: When you were writing the opinion piece, how much did you consider the readership? Were you hoping to reach other economists or were you really writing for the local legislators who were considering taking action based on the impact analysis?
KC: I was writing to the educated layperson. Someone who sees the numbers thrown out in these economic impact studies, like the 2015 study, and they just float around... it's almost like they're in the air; if you put your ear up, you'll hear them. I wanted to make a point about what those numbers really are and how to interpret them properly. So, it wasn't to the legislators (although I hope they're listening), it wasn't to the governor (I hope he's listening), and it wasn't to the mayor (although I hope he's listening, too). It was to the broader community. I wanted to say, "Here's something very important to us all, and if you're going to consider it from a public policy standpoint, we want to do it right, so here's how you would do it right."
TF: Why do you think people generally conflate economic and social quality-of-life impacts? I think the average person (myself included) reads that something is going to have an economic impact of whatever and I think that a lot of people take that at face value. Why do you think that is? Or is it a reporter's mistake to not include some qualifiers when publishing those numbers?
KC: I think it's just overall confusion. Jenny, you know economics, too: when we think about these decisions, we think about them from the "social optimum." That's the basis. Is it socially optimal to move forward with a policy action? And that includes the things you mentioned, Tim. It includes the "economic" (like expenditures and jobs), it includes social factors, it includes quality-of-life factors, and I think an economist thinks broadly in terms of what is socially optimal and includes all these things. And so, when you start hearing what people report, I think it's just natural to get confused and not really know what's going on. You hear the word "economic" and people kind of think they know what it means, so they run with it. I don't think there's anything mischievous necessarily; it's just that there's confusion as to what the socially optimal outcome is and how the "economic outcome" fits into it. I'm sure Jenny has run into this throughout her career as well. You know, there is a really strong misunderstanding among the general public of what the field of economics is. We think about the efficient allocation of scarce resources. You take it for granted that that's what it is. But I think people don't quite understand what economics is. That often plays a part here, too.
Jenny Thorvaldson: I also think that sometimes a challenge is that it's really hard to compare social impacts with economic impacts—especially when the social impacts don’t always have a dollar sign assigned to them or it’s difficult to do that type of research. Are there methods that currently exist that would allow for quantification of these types of social impacts and can they have dollar values or would you not recommend that?
KC: What I like to say in these kinds of situations is that it is difficult to quantify some of these negatives or offsets (or however you want to characterize them), but by not quantifying them, we're implicitly assigning them a zero dollar value, and we know that is flat-out wrong. So, what I always like to say is, there may be imperfect approaches to quantifying (in dollar terms) an offset, but what we know for sure is that it's not zero. If we ignore them, we're assigning them a zero dollar value, and that's just wrong. Then the question is, "Okay, well, zero might be wrong, but what we get on the other hand is just some rough dollar amount." I think some of these studies on hedonic damages are insightful. One example is the impact on housing values (when talking about the F-35). How much would noise impact the value of the property? There are several studies that are pretty good that talk about how much a house's value might go down with each additional decibel of noise in the surrounding area, measured in lots of different ways. It gives you a hook: "Okay, if noise increases by 10 dB or by X percent, the econometrics say you could expect a particular house to decline in value by Y dollars." If you can use the noise contours that are available, then you can take the number of houses affected and multiply it by the expected increase in noise and the per-decibel price effect, and you get some measure of the dollar amount by which property values have gone down. That's one approach, and that's just one negative. Beyond declining property values, for quality of life there have been studies that use surveys to find out how you would value not having a park in your neighborhood—things like that. Those are notoriously squishy and not very reliable, but they give you numbers you could use to come up with some kind of range for the offset.
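The back-of-the-envelope hedonic calculation Dr. Cahill describes can be sketched in a few lines. Every number below is an illustrative assumption (house count, average value, noise increase, and per-decibel discount are all hypothetical, not figures from any study or noise contour):

```python
# Back-of-the-envelope hedonic damage estimate: houses in a noise contour
# times the expected noise increase times a per-decibel price discount.
# All inputs here are illustrative assumptions, not real study figures.

def hedonic_offset(num_houses, avg_house_value, noise_increase_db,
                   decline_per_db):
    """Estimated total property-value loss from added noise.

    decline_per_db is the fraction of a house's value lost per additional
    decibel (hedonic studies report a range of such estimates).
    """
    per_house_loss = avg_house_value * decline_per_db * noise_increase_db
    return num_houses * per_house_loss

# Hypothetical contour: 5,000 homes, $300k average value,
# a 10 dB increase, and 0.6% of value lost per dB.
offset = hedonic_offset(5_000, 300_000, 10, 0.006)
print(f"Ballpark property-value offset: ${offset:,.0f}")
# prints: Ballpark property-value offset: $90,000,000
```

As the interview notes, the point of a sketch like this is not precision; it is that the resulting ballpark (here $90M) is clearly not zero, which is what ignoring the offset would implicitly assume.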
So, for the F-35, if the impact is $150 million (or whatever it is), knowing whether the offset is $10 million means something different than if it's $100 million. So, at least you get a ballpark figure of what you're talking about when you're thinking of offsets. I think that's where my perspective comes from: just ballpark how big that offset is, and if it's $5 million, you'd say, "Well, that's trivial. Does it matter?" But I think with the F-35 the offset is much, much larger, and if you were to do some kind of analysis (I'm just guessing here), I would guess you may get a negative number. You have positive impacts, but they may be completely offset by the negatives, just given where this airport sits (if you guys are familiar with the Boise airport, it is 3 miles from downtown). If you have something that is as loud as an F-35, that's really gonna have a massive impact on downtown.
JT: I'm curious if you've gotten any feedback after publishing this opinion, and if anyone you know of is trying to quantify those offsets?
KC: Yeah, you know, I’ve gotten a lot of kind emails. People saying, “Thank you for writing that, the message needs to get out”. I think there are lots of people talking about impacts on wildlife but nobody is really looking at it from the economics of it all. In my past I worked at a firm called Analysis Group in Boston and the resources that it takes to do this right... it’s a lot. It just costs a lot of money. So there are just not the dollars here I think to fund such a study. I haven't heard of one and I’m not surprised either.
TF: It would be interesting to me to consider the research on the impact of noise on the environment or cattle grazing or what-have-you... I mean, the levels and layers of research required to do it right can spin out of control, right?
KC: It's difficult. But just because it's difficult doesn't mean it's zero. I'll tell you a little anecdote here about the study I did for the Sioux tribe. I can't believe the State Department said this, but they did have it in writing that quantifying the consequences of an oil spill is beyond the scope of what the State Department can measure. Of course, I took that nice little quote, which was buried in a footnote, and I put it front and center. Just because it's hard doesn't mean you don't do it, and you certainly don't call it zero dollars. So I agree that it's difficult, but again, just because it's difficult doesn't mean the impact is zero.
TF: Is there something like an ethics of economics conversation that should be affecting how economic impact analyses are done (especially when you talk about governments)?
KC: There's not. I think economists believe that in the marketplace of ideas, with the free exchange of ideas, the good ones will rise to the top and the bad arguments will sink. But I am not aware of any ethics training in economics. There's certainly no certification to be an economist. I have a Ph.D. in economics, and I consider that a bit of a basis to call myself an economist. But my wife is a clinical psychologist, and she had to get certified, she had to get licensed, and she has to get relicensed or recertified every two years, I think. There's nothing like that in economics. So, sadly, the answer is no. There's really no ethics training.
TF: What advice would you give to decision-makers who are presented with impact analyses about how to make the right assumptions or equitably weight the significance of the economic impact relative to the other types of impacts that might be available or researchable in a region?
KC: So this is squishy. This is that intersection of politics and economics, right? I'm not so sure that the decision-makers really want to know what the answer is, because there's the incentive for politicians to invite the F-35, and so you have this conflict between political careers and what's socially optimal. There's a tension there, but you would think that the answer is you get a good economist, you get someone with integrity to stand up and say, "If you want to do this right and you want to know what the socially optimal outcome is, here is how you go about doing it." IMPLAN is a fantastic tool. It's fabulous. It's reliable, it's rock-solid, and it gets you part of the way there. And for some questions, it gets you all the way there. But for other questions, it's only part of the puzzle; it's not the whole puzzle. So you need to have an economist with integrity (and obviously knowledge about what they're doing) to lay out what needs to be done.
TF: Or at the very least, have a really good economist friend that can check your work for you.
KC: Yeah, auditing is always good. That’s why we have peer review, right?!
JT: Sometimes we see that there is a pretty solid study out there, but it may be misconstrued by a reporter who doesn't have economic training. I'm curious if you've come across that in your own work and if there's any remedy that you see for something like that?
KC: It happens a lot, and not just with IMPLAN work. It happens a lot in my economics-of-aging work. It's funny; it hasn't happened often with the national papers. I've been quoted in them, and those reporters (at least for me) have never botched anything. They're really good, so it doesn't seem to be a problem with the reporters at the national level, at the New York Times and the Wall Street Journal. They seem to get it. They're pretty sharp. But I've worked with other reporters, you know, kind of local (not here in Idaho, but local in Boston), and I got quoted saying something like "the increase in self-employment among older workers is one of the most notable trends of the last hundred years," and I called the reporter and I said, "I'm not sure where you got that, but I'm certain it wasn't from my mouth." So it happens. But I don't know if there's much to do about it except, in that instance, to call the reporter back, be very nice, and just say that it can't stand because it's just not true. Sometimes the issues are complicated and sometimes the message just gets distorted.
JT: Do you have any other advice that you would recommend to an analyst conducting economic impact studies or to a journalist reporting on study results that might help people package results to be better understood?
KC: There's one topic that really has nothing to do with the op-ed I did, but it is something that comes up a lot. It has to do with the kinds of projects that are being assessed with IMPLAN. So, there's an organization that's used IMPLAN to look at Intel. [Intel] wanted to invest in a new facility in a certain area and asked what the impact on the local economy would be. For that, IMPLAN just fits like a glove. It's the exact tool you'd want to use to examine that. And I've done some work with a private institution that's doing great work introducing new schools throughout the country. They're building those schools from scratch, and they wanted to know the impact that building a school has on the local economy. Again, IMPLAN is absolutely ideal for that kind of analysis, where you're injecting private resources into an economy and you want to know what the multiplier is, you want to know what the job impacts are; there's nothing better than IMPLAN for that. The issue comes about when it's a local government that's introducing a project, and you're looking at what the economic impact of a particular public project is. It gets a little bit cloudy because, really, the impact that you're after is not just the project that you're funding and the jobs and the multipliers associated with it. It's the impact relative to the next best use of those funds. It's almost like if there were two projects in mind, you could use IMPLAN on both of them and then assess the net impact of project A versus project B. But oftentimes what happens is that IMPLAN is used to assess the use of those public dollars on a particular project, end of story, without really thinking about the best alternative use of those funds. It could be giving them back to the taxpayers. Those dollars aren't just going away; that project isn't their only possible use.
So when you have a chance to talk to reporters or you’re educating users of IMPLAN, maybe just highlight that it’s a different animal when it’s public sector projects versus a private sector one.
JT: Agreed, we see that a lot as well: Reporting the positive side and not the negative potential that comes with it. I appreciate your article and its pointing out some of the limitations. IMPLAN can't necessarily answer every question for every study. We appreciate it when more people know how to properly scope a study, define its context, and understand what you can and can't do to make a study better by looking at different aspects of the situation, including weighing the negatives against the positives. I appreciate when someone straightens this type of conversation out. Great, great article.