Evaporation losses


Trix

Keeper of San Juan Secrets
I'm hoping I can pose this question to our resident Water Wizards without violating site guidelines and satisfy my curiosity.
A recent UCLA study was the basis for mainstream news articles with headlines along the lines of (paraphrasing): an additional 10 trillion gallons, enough to fill Lake Mead, evaporated from the Colorado River basin over the period 2000 to 2021. This was attributed to higher temperatures. Ignoring the attribution issue, can you gentlemen who have the "official" evaporation estimates handy (the ones you show us in the calculations of river flows in acre-feet) put this 10 trillion gallon figure in context? What has been the "normal" evaporation over this roughly 20-year period used for water calculations? Or is there some other way to understand the study's conclusions?
If this makes it to the board, thanks Bart, and thanks for your responses.
 
I can't fathom how your question would violate site guidelines, since we dig into this kind of stuff all the time. I will provide a quick answer though.

My take: The USBR data doesn't necessarily corroborate what the studies say, but that doesn't mean the USBR is right either, particularly since they aren't the atmospheric authority.
Into the data...
10 trillion gallons is equal to 30 MAF. USBR's official evaporation data from 2000 to date for Lake Powell has a total of 9.1 MAF of evaporation. Therefore, the 30 MAF number is 3x the evaporation of Lake Powell.
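
For anyone who wants to check the unit math, here's a quick sketch (plain Python; the 325,851 gallons per acre-foot conversion is the standard one):

```python
# Quick unit check: convert the headline figure to acre-feet.
GALLONS_PER_ACRE_FOOT = 325_851  # standard US conversion

def gallons_to_maf(gallons: float) -> float:
    """Convert gallons to millions of acre-feet (MAF)."""
    return gallons / GALLONS_PER_ACRE_FOOT / 1e6

headline_loss_maf = gallons_to_maf(10e12)   # the "10 trillion gallons" figure
powell_evap_maf = 9.1                        # USBR Lake Powell evaporation, 2000 to date (cited above)

print(f"10 trillion gallons = {headline_loss_maf:.1f} MAF")                               # ~30.7 MAF
print(f"That's roughly {headline_loss_maf / powell_evap_maf:.1f}x Powell's evaporation")  # ~3.4x
```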

If we look at the evaporation from Lake Powell over its whole record, you see big variations in annual evaporation, but they are largely tied to lake level rather than any other phenomena:
[Attached chart: Lake Powell annual evaporation vs. lake level]
Of course, USGS has taken this question very seriously and has developed means and methods of both measuring and modeling evapotranspiration and evaporation from the Colorado River basin (see Actual Evapotranspiration (Water Use) Assessment of the Colorado River Basin at the Landsat Resolution Using the Operational Simplified Surface Energy Balance Model).

If you don't want to read that article, they have established a means of measuring evaporation and evapotranspiration using updated Landsat satellite sensing. They validated the sensing using field based measurements and found reasonable correlation. They have used both to create new models for predicting evaporation rates from these areas.

Then, USGS spent some more money and worked to develop even better modeling with better satellite data (see A hydrological simulation dataset of the Upper Colorado River Basin from 1983 to 2019 - Scientific Data). When they do this there is necessarily a difference in what you see from the old model and the new model. Eventually somebody compares data generated using the old model and the new model and decides there is a fundamental shift in the system.

If you look at the 20-year period in question, undoubtedly there was an increase in evaporation due to higher overall regional temperatures and, at times, lower regional humidity from periodic failure of storm flows/systems. Quantifying this volume is nearly impossible, though. They are relying on models developed after the fact, calibrated against measurements that may not span the same range of conditions as the period being modeled, and then essentially modeling what happened in the past. If we assume the model is a reasonable facsimile of the real-world condition, that is OK, but you will also have the modeling error, and for a basin as massive as the one being modeled, the error buildup over the course of 20 years will be a massive number.

Part of the reason we have had such erratic predictions in stream flow is that the precipitation has been way off the chart and there was insufficient data to really drive the model appropriately. The soil was too dry, the precipitation too low, and the temperatures too high to allow the models to function appropriately. Sure, they provided a value after the model was run, but was there any hope of it being accurate? No. Models don't extrapolate well. That's part of what made chaos theory popular for a while: small shifts in specific parameters can make models swing wildly, and that makes for major issues.
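
To make the error-buildup point concrete, here's a toy sketch; the per-year flux and bias numbers are purely illustrative, not taken from any study or dataset:

```python
# Toy illustration of how a small, persistent model bias accumulates over two decades.
# Both numbers below are hypothetical, chosen only to show the scale of the effect.
annual_flux_maf = 10.0    # hypothetical modeled annual basin flux (runoff/ET term), MAF
bias_fraction = 0.02      # hypothetical 2% systematic bias in the model

years = 20
cumulative_error_maf = annual_flux_maf * bias_fraction * years
print(f"A {bias_fraction:.0%} bias on {annual_flux_maf:.0f} MAF/yr adds up to "
      f"~{cumulative_error_maf:.0f} MAF over {years} years")   # ~4 MAF
```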

I think what we are seeing in these latest reports is perhaps data refinement, a look at a period that may be unrepresentative, and perhaps misinterpretation of models. Did evapotranspiration increase by 10 trillion gallons over a 20 year period? It isn't out of the question. Would that have solved all the Lake Powell/Mead shortage issues? Potentially. Of course, in a world where evapotranspiration in the basin decreases by 10 trillion gallons, would there be the demand for the water from either Lake Powell or Lake Mead that we have seen? Undoubtedly no.
 
Recognizing your question was not pointed at me, I felt compelled to offer a few quick thoughts. I mean no offense. Intriguing!

1) There's lies, damn lies...and STATISTICS...lol;

2) I like to also consider evaporative losses in feet....about 6' for Mead and maybe slightly less for Lake Powell. Obviously, dependent upon average lake elevation for a given year, the translation to acre feet is proportional and variable;

3) Not part of your question, but I always get a kick out of folks who directly or indirectly get up in arms about evaporation losses. Why? Because it doesn't matter... AT ALL... because it's simply a cost-of-doing-business. That doesn't mean it isn't cool-guy to understand it, though. I digress.

Notwithstanding, assuming rough losses of 6' per year, 21 years approximates 130'. Even with a 20% error, that's about 155 feet to date. Big deal (see "3)", above)!

Trix's original post cited the Colorado River "basin". Evaporative losses in the basin (i.e., the entire land area) are orders of magnitude greater than those from the area occupied by streambeds and reservoirs alone. Therefore 10,000,000,000,000 gallons, plus-or-minus a couple of gallons, is not at all unreasonable considering the hundreds of thousands of square miles which comprise Arizona, California, & Nevada.
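
To put "orders of magnitude" in rough numbers: the basin is commonly cited at about 246,000 square miles, and the combined reservoir surface area below is just a ballpark assumption for illustration:

```python
# Rough scale comparison of basin land area vs. reservoir surface area.
ACRES_PER_SQUARE_MILE = 640

basin_area_acres = 246_000 * ACRES_PER_SQUARE_MILE   # ~157 million acres (commonly cited basin size)
reservoir_area_acres = 200_000                        # ballpark Powell + Mead surface area, assumed

ratio = basin_area_acres / reservoir_area_acres
print(f"Basin land area is roughly {ratio:,.0f}x the combined reservoir surface area")  # ~800x
```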

I look forward to others' thoughts on the matter.



 
My hesitancy in asking: I didn't want the discussion to wade into a climate change debate. Got to get to burgers on the BBQ at the neighbors'; looking forward to digesting your info.
 
So I just read the whole UCLA study, which is very interesting. And really what it's focused on is this thesis: warming in the past 20 years in areas that usually hold snowpack has reduced runoff in the watershed from 2000-2021 by 32.5 maf (or about 1.5 maf/year on average). (The comparison to the capacity of Lake Mead is just to illustrate the magnitude of the loss, not to point to the lake as part of the problem.) The study also says the river flow is about 10.3% less than it has been in the past.

The study is not really about evaporative loss per se, or the relative responsibilities of the reservoirs. It does touch on evaporation, noting that it has risen (see Figure 5), but does not dive deeply into that topic, nor does it quantify the magnitude of evaporative loss in terms of acre feet or gallons.

So I have no basis to argue with any of its conclusions, which seem reasonable on the face of it. I think the measurably diminished snowpack and runoff over the past 20 years appear consistent with what is described in the study.

Although not discussed in the UCLA study, I think it's interesting to look deeper into the topic of evaporation as it relates to temperature, since that's the nominal topic of this thread. And since on this forum people tend to care about Lake Powell (or Lake Mead), I want to focus on those facilities for a minute to make a point. According to USBR numbers, here's the annual average evaporation in different timespans:

2011-22

Mead - 0.55 maf
Powell - 0.37 maf

2018-22

Mead - 0.53 maf
Powell - 0.33 maf

2021-22

Mead - 0.50 maf
Powell - 0.24 maf

Two things jump out at you from this. One, Mead evaporates a lot more water than Powell. Just look at the average volumes in each reservoir from 2011-22, as measured on October 1:

Mead - 10.4 maf
Powell - 11.9 maf

Powell on average has held more water than Mead in the past decade, and yet has had less evaporation. That means on average, Powell’s evaporation rate is much less than Mead’s. Do the math, and here's the rate of evaporation based on volume for each reservoir, from 2011-22:

Mead - 5.3% per year
Powell - 2.5% per year

That tells me that temperature is a much more important factor in evaporation rate than lake surface area or volume, since Mead is located in a much hotter area than Powell. It also tells me that if I have to store water in one of those reservoirs and minimize evaporation, it's Powell. Not even close. And those evaporation rates are largely consistent for any of the timescales shown above, if I did the math right.
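
Here's roughly the arithmetic behind those percentages, using the 2011-22 averages quoted above; note the Powell number comes out nearer 3% with the single October 1 snapshots, so the exact storage series used shifts things a bit:

```python
# Evaporation as a fraction of stored volume, from the 2011-22 averages quoted above.
avg_evap_maf    = {"Mead": 0.55, "Powell": 0.37}   # average annual evaporation (USBR)
avg_storage_maf = {"Mead": 10.4, "Powell": 11.9}   # average October 1 storage

for lake in avg_evap_maf:
    rate = avg_evap_maf[lake] / avg_storage_maf[lake]
    print(f"{lake}: ~{rate:.1%} of storage evaporated per year")
# Mead: ~5.3%, Powell: ~3.1% -- broadly in line with the 5-6% and 2.5-3% ranges noted below
```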

But I said two things jump out at you. Here's the second one, and it's related to the first: As volumes in the reservoirs go down, so does the evaporation from them. And taken together, those two things mean this:

Powell evaporation is typically around 2.5-3% of its volume, and Mead is typically around 5-6%, no matter the volume. And the trend for each appears to be slightly rising over the past 10 years.

The attached figure illustrates this. So it figures that there's a way to correlate evaporative loss in the reservoirs with temperature and model that more precisely; I'm sure somebody's done that already. The figure I generated does seem to indicate that the evaporation rates in both reservoirs are generally rising, and that probably correlates with rising temperatures. Yes, surface area makes a difference, but generally, it looks like evaporative loss in the watershed increases with temperature, which is true both in terms of time and geography, measurable as you go farther downstream into warmer climes (i.e., the evaporation rate at Mead is double that of Powell). And so it stands to reason that if temperature goes up even just a bit over the watershed, that would have an enormous effect on evaporation throughout the system. And of course, that's a big component of water availability, and, it would seem, a significant component of diminished river flow.

What I'm wondering, to tie this all back to the UCLA study, is how much the rate of evaporation in the two reservoirs has increased (if any) with rising temperatures.

The graph below is one place to start…

[Attached chart: Powell and Mead evaporative loss as a percentage of volume]
 
I have not yet had a chance to examine in detail the recent UCLA study that was the basis for news reports on evaporative losses, but my impression is that they were calculating the aggregate loss in river discharge over time based on increasing evaporative demand and atmospheric vapor pressure deficits across the entire Colorado River basin. Although the reservoir evaporative losses are part of that overall dynamic, there is a lot more going on, and their results largely seem to align with a previous NOAA study that addressed the same issues. The underlying conclusions in both cases were that there is going to be less water in the river going forward, no matter where exactly we decide to hold it.
 
Other considerations would be wetland and forest losses due to development, aridification, and burn scars from large fires. To the extent any ground moisture has been lost and would otherwise be replaced by rains and snows (and then snowmelt), this is a topic I would call ground buffering of moisture. If you speed up how fast moisture runs off instead of sticking around, you can get an idea of how much moisture would be held in the ground and gradually released vs. how much comes off immediately. When forests are burned, that buffering is lost as the topsoil erodes and any forest detritus is washed away or burned up.

Increasing soil moisture by restoring damaged areas, getting more beavers back on the smaller contributing streams (where possible, since beavers won't survive on bare rock or above certain elevations), working on reforestation, etc., would all help. Pretty much anything which cools off the ground and provides protection from the desiccating winds will decrease evaporative losses.
 
Ok, the OP is back. As usual you gentlemen never disappoint on delving into our water issues. Question posed to anyone with opinions or knowledge or both. You too, Oracle! Key points for me:
-- Did evapotranspiration increase by 10 trillion gallons over a 20 year period? It isn't out of the question.
-- Therefore, 10,000,000,000,000 gallons, plus-or-minus a couple of gallons, is not at all unreasonable considering the hundreds of thousands of square miles which comprise Arizona, California, & Nevada
-- the measurably diminished snowpack and runoff over the past 20 years appear consistent with what is described in the study.
-- their results largely seem to align with a previous NOAA study that addressed the same issues.
John and Nzaugg show us that TOTAL evaporation for Powell and Mead over the 20 year period is roughly 18 MAF. The published study conclusion is an ADDITIONAL 30 MAF due to a warmer climate (don't know if they are attributing it to 2 degrees F since preindustrial times, circa 1880, or the 1 degree F in past 40 years). I think that means our lake evaporation losses are a pretty small portion of total basin losses.
Very interesting data and chart on Powell vs. Mead and implications on different temps. I think I was always assuming surface area versus capacity was the big driver.
I need to study what ya all gave us, oddly that's fun for me. Cheers!
 
...TOTAL evaporation for Powell and Mead over the 20 year period is roughly 18 MAF. The published study conclusion is an ADDITIONAL 30 MAF due to a warmer climate (don't know if they are attributing it to 2 degrees F since preindustrial times, circa 1880, or the 1 degree F in past 40 years). I think that means our lake evaporation losses are a pretty small portion of total basin losses.
I don't think the study is concluding that there's an additional 30 maf in evaporative loss in the basin in that 20-year period. It's saying this is the total reduction in runoff during that period. Evaporation plays a part in that, but I'd guess it's a relatively small amount. Here's what the study actually says relevant to this point on page 9:

"From this analysis, we find that long term anthropogenic warming since the pre-industrial era has led to a total reduction in runoff or water availability 32.5 MAF during the 2000–2021 megadrought, exceeding the total storage of Lake Mead (32.24 MAF)."

The study only mentions evaporation a couple of times, and other than saying it's increased because of warming, it does not really quantify it, other than in a sort of vague way on Figure 5 (page 8).
 
I have not read the study, obviously misinterpreted the news article. I thought it stated warmer climate caused additional evaporation losses of 10T. Should have invested more time before posting, busy weekend, which is a rarity. Enjoyed the discussion.
 
Lake evaporation is a component of the hydrologic cycle

Yeah, I've always considered (in my simple mind) that water is in a zero-sum game. What we lose to evaporation is returned somewhere else. And conversely, someone else's loss becomes our gain. There's really no controlling for it. We can only continue to adapt our behaviors to live within what the system provides.
 
Yeah, I've always considered (in my simple mind) that water is in a zero-sum game. What we lose to evaporation is returned somewhere else. And conversely, someone else's loss becomes our gain. There's really no controlling for it. We can only continue to adapt our behaviors to live within what the system provides.
It's a good way to simplify things, but in reality there are two pools: freshwater and saltwater or brackish water. There are transfers between the two all the time, but with different driving gradients. Hotter seawater (El Niño) results in a higher evaporation gradient and consequently more storm activity. Conversely, when excessive freshwater is withdrawn from the system, its salinity increases every time it is touched by people or as it flows through the environment, resulting in eventual retirement of that freshwater to the brackish category unless effort is made to reclaim it. As we exploit pools of freshwater (i.e., groundwater) beyond the capacity of the hydrologic cycle to replenish them, we wind up transferring a balance of that water to the brackish pool as well. There are also massive pools of brackish groundwater, usually deep beneath the surface, that are nearly entirely unexploited and will have to be used in the near future if we don't adequately fix our water balance issues. This will cost megabucks.
 
Very interesting data and chart on Powell vs. Mead and implications on different temps. I think I was always assuming surface area versus capacity was the big driver.
Yes, I think it feels intuitive that surface area should be a driver of evaporation, so I did a deeper dive into the BOR data to see what I could figure out. I compared the annual evaporation from Mead and Powell since 2011, and compared that to the surface area of each reservoir on October 1 of each year, which is possible to derive from the existing reports that correlate surface area with volume or elevation.

From this, it's possible to measure the evaporative loss as a function of surface area. Conventional wisdom would say that the larger the surface area, the greater the evaporative loss. And that turns out to be true, no surprise--but only if the temperature is the same at the two locations. But what's not so obvious until you look at the following graph is that the amount lost through evaporation for a given surface area is much greater at Mead than at Powell. This of course relates to (and confirms) the predominant role of air temperature in driving evaporation.

Here's the short version, illustrated here:

[Attached chart: Powell and Mead evaporative loss vs. surface area]

Lake Mead -

The lake loses a very consistent 6-7 acre feet per acre of surface over the course of a year, no matter the elevation or volume of the lake. Another way to look at that is that the lake drops 6-7 feet per year as a result of evaporation alone. The average from 2011-2022 was 6.6 acre feet per acre of surface area. This also implies that when the lake is closer to full, the surface area increases, so it loses more volume to evaporation. More on that later on.

Lake Powell -

The lake loses considerably less in evaporation than Lake Mead as a function of surface area--only about 3.8 acre feet on average per acre of surface area. That's a huge difference, and likely largely attributable to temperature. It's cooler at Powell than Mead (on average), so it loses less water.
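
A minimal sketch of what those per-acre rates mean side by side; the 80,000-acre surface area is just an assumed example, not a measured value for either lake:

```python
# Annual evaporation volume at the same assumed surface area, using the per-acre
# rates above (6.6 acre-feet per acre for Mead, 3.8 for Powell).
RATE_AF_PER_ACRE = {"Mead": 6.6, "Powell": 3.8}

def annual_evaporation_af(surface_area_acres: float, rate_af_per_acre: float) -> float:
    """Annual evaporation volume (acre-feet) for a given surface area."""
    return surface_area_acres * rate_af_per_acre

area_acres = 80_000  # assumed example surface area
for lake, rate in RATE_AF_PER_ACRE.items():
    print(f"{lake} at {area_acres:,} acres: ~{annual_evaporation_af(area_acres, rate):,.0f} af/yr")
# Mead:   ~528,000 af/yr
# Powell: ~304,000 af/yr
```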

Okay, so that's an important piece to understanding the puzzle, but it's only part of the picture. It would be useful to know something about the bathymetry of each lake. Intuitively, it seems as if Powell has a larger surface area, at least when closer to full. Does that mean it is subject to more evaporation, even if temperatures there are lower? The chart below begins to untangle that question.

[Attached chart: Powell and Mead surface area vs. volume]



Here you can see that when the lakes are very low, they have a very similar surface area--in fact, Mead is slightly larger when the lakes are below about 7.5 maf--a figure that includes water held below "dead pool", not just live storage. But as the lakes store more than 7.5 maf and begin to spread out, the many side canyons of Lake Powell really begin to assert themselves and lead to a lake with substantially greater surface area than Mead. At 15 maf, Powell is 12,000 acres larger than Mead--nearly 20 square miles. At 20 maf, Powell is 16,000 acres larger. So it seems as the lakes hold more water, Powell may be more responsible for more evaporation.

Let's look at the next chart to put all this together. This one takes the evaporation rate of each lake (3.8 af/acre of surface area for Powell, and 6.6 for Mead), then applies that to varying surface areas for each lake.

[Attached chart: Powell and Mead evaporation vs. volume]

This is the whole story. In spite of having a larger surface area when the lakes are nearly full, Powell still evaporates considerably less water than Mead--and that's because it's in a cooler location. The difference ranges anywhere from 150,000 to 350,000 af depending on the lake level, but Powell performs better at any surface elevation. This puts to rest the argument from Fill Mead First advocates that Powell's evaporation is a negative compared to Lake Mead. Intuitively, this also suggests that holding water at even higher elevation locations (where temperatures are even cooler) would result in even less overall evaporation in the system.

One final thought here. Some in this thread have noted (correctly) that evaporation happens all the time over any body of water. That's true. But in the case of the Colorado Basin, here's how we need to think about that. As Powell and Mead both demonstrate, the rate of loss as a function of surface area is very consistent in their particular locations. For Powell, it's about 3.8 acre-feet annually per acre of surface area. For Mead, it's higher--6.6. (By the same logic, I'd guess that in Flaming Gorge, it's a little less than in Powell.) That all suggests that if there were no reservoirs, the evaporation rate of the river in those locations would be similar, because it's mostly a function of temperature at a given location. More water evaporates farther downstream where it's hotter. It's true that the total evaporation might be less without reservoirs, but that's only because with no reservoirs, there would be less water in the system at any given time. That is to say, "evaporation loss", as an argument against having reservoirs, doesn't hold water, so to speak...
 
Ah hah! Now I know why my pool leveler can’t keep up at 115.
But seriously folks, ain’t John the greatest?! At higher levels Powell’s surface area exceeds big ol’ Mead basin, . . who‘d a thunk it? Now, if I can just convince my golf buddies this is interesting stuff.
 
Thanks Trix. Good luck getting your golf buddies onboard... I think it would be easier to make the green from the tee on a dogleg par 5...

But if you want to give them some more food for thought, here's a concrete example of what all this means. Let's say there was only 25 maf to distribute between Lake Mead and Powell, and you have to decide where to put it. And that includes the water below dead pool, which for Mead is 2.5 maf, and for Powell is 1.7 maf. Well, if you're the GCI, you'd put all you can in Mead first. But let's say they're in a generous mood and concede you need to leave enough in Powell to generate some electricity through the dam. So in that scenario, you'd distribute the water this way:

Scenario 1; "Fill Mead First"

Mead - 18 maf (elevation 1140; 15.5 maf live storage)
Powell - 7 maf (elevation 3522; 5.3 maf live storage)

Then there's scenario 2, the approach advocated by some people here, to keep as much water in Powell as possible, while keeping a minimum amount in Lake Mead to generate power there. So let's just flip the numbers:

Scenario 2: "Fill Powell First"

Mead - 7 maf (elevation 1000; 4.5 maf live storage)
Powell - 18 maf (elevation 3652; 16.3 maf live storage)

Here's the annual evaporation for each scenario:

Scenario 1:

Mead - 0.71 maf
Powell - 0.21 maf
TOTAL - 0.92 maf

Scenario 2:


Mead - 0.37 maf
Powell - 0.47 maf
TOTAL - 0.84 maf

So in this example, by keeping the bulk of water in Powell and not Mead, in a year you save about 80,000 af not lost to evaporation...
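
A minimal sketch of that arithmetic; the surface areas below are rough assumptions consistent with the elevations quoted, not values pulled from the official area-capacity tables:

```python
# Rough reconstruction of the two scenarios using the per-acre evaporation rates
# from earlier in the thread (6.6 af/acre Mead, 3.8 af/acre Powell).
RATE_AF_PER_ACRE = {"Mead": 6.6, "Powell": 3.8}

# Assumed surface areas (acres), roughly consistent with the storage/elevation figures above.
scenarios = {
    "Fill Mead First":   {"Mead": 108_000, "Powell": 55_000},
    "Fill Powell First": {"Mead": 56_000,  "Powell": 124_000},
}

for name, areas in scenarios.items():
    total_af = sum(areas[lake] * RATE_AF_PER_ACRE[lake] for lake in areas)
    print(f"{name}: ~{total_af / 1e6:.2f} maf/yr of evaporation")
# Fill Mead First:   ~0.92 maf/yr
# Fill Powell First: ~0.84 maf/yr  (roughly 80,000 af/yr less)
```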

...food for thought on any golf course, especially one that gets its water from the Colorado River....
 
Yeah, I've always considered (in my simple mind) that water is in a zero-sum game. What we lose to evaporation is returned somewhere else. And conversely, someone else's loss becomes our gain. There's really no controlling for it. We can only continue to adapt our behaviors to live within what the system provides.

I really wish I could reply at length to this, as what I've found over the past 20 years is that even one person in a limited capacity can make a lot of difference...
 
Just to add a bit of supporting theory to JFRCalifornia's as always insightful analysis: According to the 2nd law of thermodynamics, the water vapor-holding capacity of the atmosphere is, assuming a steady source of evaporating water, very strongly sensitive to changes in air temperature alone. In the temperature ranges we are talking about (a desert on Earth), an increase of nearly 2deg F means an increase in the amount of water vapor that can be evaporated into the air by about 7%. That means evaporation is strongly controlled by small differences in temperature. Using Las Vegas temperature as a proxy for Mead, and Page AZ for Powell, on average Mead is a good 7-8 deg F warmer than Powell day/night, summer/winter as JFR suggested. So just considering temperature alone, you are looking at a ballpark increase in evaporation (Mead vs Powell) of order 30% in an average year, current climate. In the southwest US for most of the year, much of that extra water vapor gets transported out of the region before it turns to cloud and rain. I'm attaching graphs (weatherspark.com) of the average annual range of max/min temperature in Las Vegas and Page.
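
For anyone who wants the back-of-envelope version of that scaling, here's a sketch; the ~7% per degree C figure is the usual Clausius-Clapeyron rule of thumb, and the 7.5 F gap is just the midpoint of the range mentioned above:

```python
# Back-of-envelope: saturation vapor pressure (and hence evaporative demand) rises
# roughly 7% per degree C of warming. Apply that across the assumed Mead-Powell gap.
PER_DEG_C_INCREASE = 0.07      # ~7% more vapor-holding capacity per deg C (rule of thumb)

delta_t_f = 7.5                # assumed Mead-minus-Powell average temperature difference, deg F
delta_t_c = delta_t_f * 5 / 9  # ~4.2 deg C

relative_increase = (1 + PER_DEG_C_INCREASE) ** delta_t_c - 1
print(f"~{relative_increase:.0%} higher evaporative demand at Mead than Powell")  # on the order of 30%
```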
 

Just to add a bit of supporting theory to JFRCalifornia's as always insightful analysis: According to the 2nd law of thermodynamics, the water vapor-holding capacity of the atmosphere is, assuming a steady source of evaporating water, very strongly sensitive to changes in air temperature alone. In the temperature ranges we are talking about (a desert on Earth), an increase of nearly 2deg F means an increase in the amount of water vapor that can be evaporated into the air by about 7%. That means evaporation is strongly controlled by small differences in temperature. Using Las Vegas temperature as a proxy for Mead, and Page AZ for Powell, on average Mead is a good 7-8 deg F warmer than Powell day/night, summer/winter as JFR suggested. So just considering temperature alone, you are looking at a ballpark increase in evaporation (Mead vs Powell) of order 30% in an average year, current climate. In the southwest US for most of the year, much of that extra water vapor gets transported out of the region before it turns to cloud and rain. I'm attaching graphs (weatherspark.com) of the average annual range of max/min temperature in Las Vegas and Page.
The 2nd law is tough to apply cleanly as truly closed systems are hard to find. But fundamentally, law #2 says heat never flows from a colder region to a hotter region. So assuming a constant atmospheric temperature, more energy (in this case evaporating water) will flow from the region of higher average temperature, given the increasing entropy that follows. Very much like an extensive explanation from JFRCalifornia on most any Powell/water topic - it may appear to be decreasing the disorder of the topic through his explanations and conclusions, but in fact the amount of energy he must input for every explanation he provides substantially increases the overall entropy involved. So for JFRC: ΔS >> 0.
 