Chapter 6—
Coping with Salinity
by Jan van Schilfgaarde and J.D. Rhoades
Abstract
Four independent management strategies are identified to cope with increasing levels of salinity. First, saline springs or other point sources of saline water can be intercepted; the water then can be evaporated, desalted, and reused, or diverted for use in industrial applications. Second, the amount of water applied for irrigation can be reduced, thus lessening the amount that seeps through the soil and reducing the salt load in return flows. Third, somewhat saline water can be used as a source of irrigation water for salt-tolerant crops. Such use reduces the amount of brackish water needing disposal and provides a substitute water supply. Fourth, the cropping pattern can be changed by choosing tolerant rather than sensitive crops. Through plant breeding more tolerant varieties of common species may become available.
None of these options is without cost, and all have socioeconomic as well as technical aspects.
The primary theme of this volume is how to deal with water shortages. Water supplies, however, cannot be viewed solely in terms of quantity; clearly, the quality of the water must be considered as well. Water quality is nevertheless such a broad subject that a restricted definition must be chosen for this discussion. The leading water quality parameter affecting irrigated agriculture is no doubt salinity; we will therefore focus our analysis on salinity-irrigation interactions and associated water management options.
Clearly agriculture is affected by, and affects, water quality in terms of pesticides, nitrates and phosphorus, heavy metals, and sediment. The list could be extended, but none of these is uniquely tied to irrigation in the semiarid West, or is directly affected by diminishing water supplies. Salinity, however, is. As water supplies become more limiting, the use of sewage water for irrigation becomes a more important option; in fact 25 percent (10⁹ m³/yr) is reused in California already.[1] Therefore, we treat this subject briefly in the context of alternate water supplies. Industrial development, such as the extraction of oil from shale, not only would compete with agriculture for water supplies, but also could well result in quality degradation of remaining waters; such degradation, however, would be dominated by an increase in salinity.
Although salinity problems are indeed aggravated directly as irrigation water supplies are diminished, salinity per se is not restricted to irrigated lands. Salinity problems are widespread across the world. Although the literature is extensive, good statistics on the extent are hard to find. Szabolcs,[2] for example, implied that there are some 30 million hectares of salt-affected soils in Europe; and Shalhevet and Kamburov[3] estimated, from a mail survey, that worldwide 50 million hectares of cultivated land are salt affected, exclusive of the USSR. For the U.S., a recurring estimate is that up to one third of irrigated land is salt-affected, but reliable data are lacking.[4] Severe salinity problems are encountered along the Pecos River and the Rio Grande; it has been estimated that over one-third of the salt load of the Colorado River can be attributed to irrigation; and in California, salinity is a fact of life in the Imperial Valley, the lower San Joaquin, and elsewhere. Thus salinity problems are widespread, even if exact statistics are not available. The thesis of this discussion is that salinity is closely tied to water conservation measures.
Before delving into options for coping with decreasing quantities of increasingly saline irrigation water, it seems appropriate to define some terms and to orient readers who may be less than fully conversant with salinity issues.
The term "salinity", when applied to water, refers to inorganic ions (or compounds) in solution. Though an appropriate method for expressing salinity is to list the concentrations of the primary cations and anions in, say, mol L–1, a common shorthand is to use the concentration of total dissolved solids on a mass basis, in mg L–1. With all its obvious shortcomings, this custom emphasizes the view that, as a first approximation, plants (but not soils) respond to total salt concentration more than to its specific constituents. For reasons of analytical convenience, an equally common unit is electrical conductivity, in S m–1. A rough conversion (rough because it depends on ionic composition and concentration) is 1 dS m–1 = 650 mg L–1.
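The rough conversion just cited can be captured in a few lines. The function names are ours, and the factor of 650 is only an approximation that varies with ionic composition:

```python
def ec_to_tds(ec_ds_per_m, factor=650.0):
    """Approximate total dissolved solids (mg/L) from electrical
    conductivity (dS/m) using the rough rule 1 dS/m = 650 mg/L.
    The factor depends on ionic composition and concentration."""
    return ec_ds_per_m * factor

def tds_to_ec(tds_mg_per_l, factor=650.0):
    """Inverse conversion: approximate EC (dS/m) from TDS (mg/L)."""
    return tds_mg_per_l / factor
```

So water measuring 2 dS m–1 would be estimated at roughly 1,300 mg L–1 of dissolved salts.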
A similar usage has developed for soils, where the variable of interest is the salt concentration of the soil solution. Unfortunately, the soil water content changes all the time, and so does the soil solution composition. In an attempt to standardize "soil salinity", the electrical conductivity of an extract of a saturated paste made from a soil sample was introduced as a reference measure. The primary point here is that soil salinity is not an easily defined, single-valued parameter. Furthermore, soil properties are affected by the composition of the soil solution and thus, in turn, of the irrigation water.
Though the interactions among soil properties and the salts in solution are numerous—and dependent on mineralogy—it is sufficient here to stress the adverse effects of sodium. At high levels of sodium relative to divalent cations in the soil solution and thus, at equilibrium, on the exchange complex, clay minerals in soils tend to swell, and aggregates tend to disperse under conditions of low total salt concentration. Whether from swelling or from dispersion, the soil hydraulic conductivity is reduced, and the surface tends to crust. Thus the ability of the soil to infiltrate and transmit water can be severely reduced. It is the relative amount of sodium on the soil and the total amount of salt in the water that are important. High total electrolyte concentrations tend to increase a soil's stability; thus we distinguish between saline soils and sodic soils, saline waters and sodic waters.
Salinity is an inescapable concomitant of irrigation in arid areas. The water in pure mountain streams picks up salts as it moves over and through rocks. Part of the water diverted for irrigation evaporates, leaving the salts in a smaller volume of drainage water. This drainage water, in turn, may dissolve or displace salts of geologic origin. Evaporation also takes place from open water surfaces, such as lakes and storage reservoirs. In all, natural processes are accelerated by man's impact, and water generally becomes more and more saline as one moves downstream in the hydrologic system.
To avoid a continuing increase in salinity in the soil water, there must be adequate soil drainage to remove the excess salts accumulating from irrigation. Drainage is also required to avoid water logging, or a high water table, which in turn increases the salinity problem. Hence the concept of salt balance: the amount of dissolved salt brought into an area in irrigation water must be matched by that removed by the drainage system, if salination is to be avoided. The salt balance concept is often misused and
misinterpreted.[5] Qualitatively, however, it is sound—and helpful in visualizing the situation.
The reason we irrigate is to increase the production of agricultural crops, but salinity tends to decrease crop yield. A definitive description of what is meant by tolerance of plants to salinity is difficult to construct. Nonetheless, some crop plants are more tolerant to salt-induced stress than others. As with other stresses, plants expend energy to overcome the stumbling blocks they encounter. For example, it requires more energy to extract water out of a concentrated soil solution than out of a dilute solution, and for the plant cell there is a cost associated with manufacturing (or secreting) the compounds needed for osmoregulation. Though the details are complicated—if understood at all—a helpful working hypothesis for operational purposes is that plants respond to the total potential of the water in the rootzone, i.e., the sum of the osmotic potential (salt stress) and the matric potential (soil water deficit).
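The working hypothesis just stated can be sketched in a few lines; the function and the illustrative numbers are ours, not from the chapter:

```python
def total_water_potential(osmotic_potential, matric_potential):
    """Working hypothesis from the text: the stress a plant experiences
    is the sum of the osmotic potential (salt stress) and the matric
    potential (soil water deficit). Both are conventionally negative
    quantities (e.g., in bars)."""
    return osmotic_potential + matric_potential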
The tolerance of crops to salinity is most readily expressed in terms of a threshold salinity (preferably in the soil solution) below which no adverse effect on yield is noted, and a rate of decrease in yield with increasing salinity beyond the threshold.[6] Though these indices imply an absolute tolerance, we prefer to think of them as relative values useful in ranking the tolerance of various crops. The values obtained depend on the management practices used and the environment in which the crops are grown; they do not take account of differences in sensitivity at various growth stages, such as germination versus grain filling.
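The threshold-and-slope formulation described here can be sketched as follows; the crop values in the usage note are illustrative, not taken from this chapter:

```python
def relative_yield(soil_salinity, threshold, slope):
    """Relative yield (percent) under a threshold-slope tolerance model.
    soil_salinity: salinity of the rootzone (e.g., saturation-extract EC, dS/m)
    threshold: salinity below which no yield decline is observed
    slope: percent yield loss per unit salinity above the threshold"""
    if soil_salinity <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (soil_salinity - threshold))
```

A hypothetical crop with a threshold of 6 dS m–1 and a slope of 7 percent per dS m–1 would show full yield at 5 dS m–1 but about a 28 percent loss at 10 dS m–1. As the text cautions, such values are best read as relative rankings, not absolutes.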
Aside from tolerance to unspecified (and presumed mixed) salts, we must sometimes be concerned with toxicity of specific ions. Some reports of Na toxicity may well have been misinterpretations of Ca deficiencies or salinity excesses. However, sensitivities of woody plants to chlorides and of almost all plants to boron are factors of concern.
Returning to our main theme, we recognize that increased demands on a limited water supply tend to increase the salinity in the system. Because of increased use of water out of the Colorado, the contribution from saline springs is a larger percentage of the remaining flow; recycling groundwater in a closed basin increases salinity; expanded irrigation in the Central Valley of California increases the need for disposal of saline drainage water. Many different situations are encountered, yet the end result is generally the same: water development in arid regions leads to increasing salinity problems. The question is, what
options exist for reducing the threat of salination? We shall consider several types of situations and attempt to illustrate them with some examples.
Management Options
A number of options are open to us to minimize the adverse effects of salinity in irrigation in particular, and in water resource use more generally. For purposes of discussion, we here group these options into four classes, recognizing that they are not fully independent or, for that matter, truly parallel.
Diversion or Desalting
First, we consider intercepting brackish water and diverting it out of the system, or desalting it for reuse. The latter part of this option, desalting, is one we can dispose of briefly. Though technically clearly feasible, and no doubt appropriate under special circumstances, we do not see desalting, now or in the foreseeable future, as a viable option for obtaining water for irrigation. Desalting is planned for the drainage water from the Wellton-Mohawk Irrigation District in Arizona, but the decision in that case was not based on best resource use or on economics.[7] The Department of Interior also has plans for desalting brackish water at LaVerkin Springs in Utah and Glenwood-Dotsero Springs in Colorado at costs estimated to be three times the benefits.[8] In California, there is continued interest in use of desalting technology in the Central Valley. Although we are not adequately informed to assess the progress, it is doubtful that the economics would come out very differently from those experienced by the U.S. Bureau of Reclamation.
The other half of this option, diverting brackish water out of the system, offers some interesting possibilities. Several years ago, the Kern County Water Agency considered the use of brackish drainage water (around 6,000 mg L–1 ) for use as cooling water in a power plant. This option evaporated when, for other reasons, plans for the power plant were dropped. More recently, in the special report just cited, USBR engineers concluded that use of brackish water for power plant cooling offered substantial opportunities for reducing salinity in the Colorado River. Another interesting option that emerged was transport of coal in a slurry or—an intriguing concept—hydraulic transport of bagged coal through a pipeline. California hardly has the option to transport
coal to its shores, but Colorado well may be able to use some of its brackish water in this manner.
None of these uses deals directly with agriculture. Their implementation would affect agriculture by reducing the salinity of the water remaining, or by reducing the volume of drainage water needing disposal. Still, they are not central to the consideration of agricultural water quality. Thus we will not elaborate on them any further.
Decreasing Irrigation Water Use
Claims are often made that irrigation water is used wastefully and that irrigation efficiencies can be increased substantially. Such claims must be put in proper perspective. Since on-farm water conservation is the topic of the next several chapters, we consider here only those aspects that deal specifically with water quality.
The inefficiency most often observed is excessive tailwater. Though tailwater may well have other negative impacts, it is not likely to significantly affect the salinity of the receiving waters. On the other hand, excessive seepage (in-field deep percolation or seepage from ditches and laterals), while less obvious, can and often does affect downstream water quality in one of several ways. It may displace saline groundwater that has accumulated over time. This probably is the case in the southwestern part of the Palo Verde Irrigation District, called the Palo Verde Subarea. It has been estimated that an increase in on-farm irrigation efficiency to 60 percent initially would reduce the salt discharge from the 4,000 hectares in this area by about 60,000 tonnes annually, and the salinity at Imperial Dam by about 8 mg L–1.[9]
In other cases reducing deep seepage may reduce the dissolution of salts from underlying formations. In the Grand Valley of Colorado, studies indicate that the salt loading of the Colorado River can be reduced by about 400,000 tonnes annually by decreasing the amount of irrigation return flow and conveyance system seepage moving through underlying saline substrata. Such a reduction would result in a decrease of the salinity of the river at Imperial Dam of approximately 43 mg L–1 .[10] Since current estimates, based on detailed economic studies, give the impact of a change of 1 mg L–1 at Imperial Dam as approximately $500,000 per year,[11] the economic impact of the Grand Valley project is indeed substantial.
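Combining the two figures in this paragraph gives a sense of the project's scale; the multiplication is ours, using the cited estimates:

```python
# Estimates cited in the text for the Grand Valley project:
salinity_reduction_mg_per_l = 43        # decrease in salinity at Imperial Dam
damage_per_mg_per_l_per_year = 500_000  # dollars per year per 1 mg/L change

annual_benefit = salinity_reduction_mg_per_l * damage_per_mg_per_l_per_year
# roughly $21.5 million per year in avoided downstream damages
```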
Changes in leaching fraction—i.e., the fraction of the irrigation water infiltrated that becomes deep seepage—can affect the
salt regime in another fashion. As the electrolyte concentration of soil water is increased by evapotranspiration, the tendency of water to dissolve salts shifts towards a tendency to precipitate salts. Thus a reduction in leaching fraction—which leads to increased salinity of the drainage fraction—tends to reduce the total salt load in the drainage water. This principle, illustrated in earlier papers,[12] was applied to a set of hypothetical river and groundwater basins by Rhoades and Suarez[13] and by Suarez and van Genuchten.[14] They demonstrated that the effect of reduced leaching on the salt regime depends greatly on the nature of the irrigation water, on whether the receiving water is a groundwater or surface water and on certain hydrogeologic conditions. Reduced leaching is no panacea, but in the proper circumstances, it can have significant impact on the quality of the receiving waters; it always reduces the salt load of the drainage water that percolates below the rootzone. In the short run, before equilibrium conditions are obtained, the effects of salt precipitation can be far greater than predicted in the above studies.[15]
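The leaching-fraction bookkeeping behind this argument can be sketched as a steady-state salt balance. Note that this simple form ignores the salt precipitation and dissolution that, as the text stresses, can dominate in practice; the function names are ours:

```python
def leaching_fraction(depth_drained, depth_infiltrated):
    """LF: the fraction of infiltrated irrigation water that passes
    below the rootzone as deep seepage."""
    return depth_drained / depth_infiltrated

def drainage_concentration(irrigation_conc, lf):
    """Steady-state salt balance with no precipitation or dissolution:
    all applied salt leaves in the drainage fraction, so the drainage
    water is concentrated by a factor of 1/LF."""
    return irrigation_conc / lf
```

With 650 mg L–1 irrigation water and an LF of 0.2, the drainage water would reach about 3,250 mg L–1; halving the LF doubles the drainage concentration, which is what pushes the system toward salt precipitation and hence a lower total salt load.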
Attempts have been made to extend the modeling of these systems to take into account irrigation scheduling and crop response. An example is the work of Yaron et al.[16] Unfortunately, the data base available is not yet sufficient to make such models very useful.
The concept of increasing irrigation efficiency by reduced leaching is being applied effectively in various areas. Examples are the Wellton-Mohawk area of Arizona and the Grand Valley of Colorado cited above. To put the matter in perspective, however, a number of reservations must be addressed. For example, though it is expected that reducing the water applied in the 4,000-hectare Palo Verde Subarea (cited above) would reduce the salinity at Imperial Dam, "improving" the currently very low water application efficiency in the remainder of the 36,000-hectare District is not expected to affect downstream water quality because there is no evidence of salt stored underground in that area and salt should not precipitate from the applied water in this case.[17] Presumably, the economic efficiency of water management at present is quite high there.
Reduced leaching implies a lower margin for error in providing adequate salinity control. Thus it calls for a more uniform, better managed system of irrigation and, especially when pushed to the limit, requires some system of monitoring to avoid crop yield reductions or adverse salt buildup in the soil. Precise and innovative irrigation management is aided by recent developments in
irrigation technology, such as linear-move sprinklers, more reliable trickle systems, and laser-graded level border systems. Regular monitoring of salinity status is made feasible by developments in instrumentation and techniques originating at the U.S. Salinity Laboratory in Riverside.[18]
Reduced leaching and controlled seepage will reduce the drainage requirement and, in the absence of adequate drainage, postpone the day of reckoning. As an example of the first situation, it is likely that the alleged need for increased drainage intensity in Imperial Valley now compared to 20 years ago is due to excess canal seepage; the second is illustrated by the concerns in Fresno and Kern counties, California, with rising water tables.[19]
Use or Reuse of Salty Water
In the western U.S., water supplies have been relatively plentiful and generally of excellent quality. As the pressure on water resources increases, there is increasing reason to consider use of more saline water in agriculture. Rhoades[20] pointed out that most of the typical drainage waters in the U.S. have potential value for irrigation, and presented results of detailed calculations to illustrate this observation. Our interest, however, is not so much in drainage waters per se, but in water with increasing levels of salinity.
The number of documented reports on the successful use of brackish water for irrigation is relatively limited. Claims that seawater can be used for crop production[21] are far from convincing. Some other claims, such as a 10-ton-per-hectare yield of alfalfa with 12,500 mg L–1 water in the USSR,[22] may well be tainted by poor translation or misunderstanding. Data on cotton irrigation (p. 166) are more consistent with U.S. experience; comparing long-term irrigation in Uzbekistan with drainage water (5,000-6,000 mg L–1 TDS), mixed water (2,000-3,000 mg L–1), and canal water (<1,000 mg L–1), yields were as shown in Table 6.1.
Paliwal[23] gives a number of examples of irrigation in India with waters of relatively high salinity. Shalhevet and Kamburov[24] in their worldwide survey of irrigation and salinity cited earlier, suggested that waters up to 6,000 mg L–1 were often classed as acceptable and indeed used. Dhir[25] reported the use of water ranging from 5 to 15 dS m–1 for wheat production in India, but these areas receive annual monsoons. Hardan[26] irrigated pear trees with water ranging up to 4,000 mg L–1 without yield reduction. Frenkel and Shainberg[27] and Keren and Shainberg[28]
[Table 6.1: cotton yields in Uzbekistan under long-term irrigation with drainage, mixed, and canal water; table data not legible in this copy.]
reported that cotton is grown commercially in Israel with water having an electrical conductivity of 4.6 dS m–1 .
In the U.S., extensive areas are irrigated in the Arkansas Valley of Colorado with water containing more than 1,500 mg L–1 and up to 5,000 mg L–1 .[29] Alfalfa, grain sorghum, and wheat are grown with these waters. In the Pecos Valley, water averaging 2,500 mg L–1 but ranging far higher, has been used for years.[30] Jury et al.[31] grew wheat in lysimeters with water up to 7.1 dS m–1 without effect on yield. Ayers et al.[32] were able to grow barley without yield reduction with 20,000 mg L–1 in the irrigation water, as long as a better quality water was used for stand establishment.
Most of these data and observations can easily be misinterpreted, since very often the crops were grown in climates where rainfall made a significant contribution; also, the absolute yields may not always be as high as desired. Just the same, it is evident that water containing substantial levels of salt still has agronomic value.
Rhoades, with colleagues, has established two elaborate field experiments to assess more explicitly the potential of using brackish water for crop production in California. The first, in Kern County, concerns the use of drainage water (8 dS m–1 and 5.5 mg L–1 boron) to grow cotton.[33] Comparison treatments use "aqueduct" water (0.6 dS m–1) and a 50-50 mix. Results to date have been encouraging. Notwithstanding a series of difficulties encountered during the experiments, it has been shown that highly respectable yields can be obtained, with good management, even on the extremely difficult soils of the site, with 8 dS m–1 water, especially when the stand is established using aqueduct water. The second experiment, started at the beginning of 1982, has somewhat different (and probably more ambitious) objectives. This experiment is located in Imperial Valley and is intended to investigate the potential of using, one after the other, saline water (3,000 mg L–1 from the Alamo River) and normal water (860 mg L–1 from the Colorado River) to grow a rotation of crops including both tolerant and sensitive species. If indeed one can grow a crop of sugarbeet or cotton with "poor" water for all but stand establishment, and follow it immediately with lettuce using "good" water, then two serious concerns of farmers will have been resolved: that recovery after use of brackish water will be slow or impossible, and that use of such water will thereafter restrict the land to a few tolerant crops unless cropping is foregone during a long period of reclamation.
Both of these experiments are based on the premise that water not now used because it is deemed too salty can in fact be used effectively for irrigation if properly adapted management practices are applied; and that such use would benefit the landowners and the general public. In Imperial Valley, much of the drainage water now discharged to the Salton Sea could become available for irrigation either as an additional supply or as a substitute supply. In the Lower San Joaquin Valley, use of drainage water for irrigation would convert a waste product into an asset, reducing the volume of drainage water needing export.
Another source of water, in lieu of current supplies or in addition thereto, is sewage effluent. Secondary sewage waters typically show increases in total dissolved salts, carbonate, boron, and nitrogen, and in the proportion of sodium relative to calcium and magnesium.[34] However, the degree of increase is such that, in general, if the initial quality of domestic water is suitable for irrigation, then so is its secondary sewage effluent, as assessed using conventional criteria and standards.[35]
Secondary sewage waters are enriched in some plant nutrients, especially nitrogen, and have value as a source of plant nutrients. In fact, reclaimed sewage water has been advocated as a satisfactory hydroponic growth medium for plants.[36] Much research on the effect of sewage effluent on plants and environment has been conducted at sewage disposal sites where high-rate effluent application has been used primarily for purposes of disposal, purification, and groundwater recharge.[37] Lower rate applications for irrigation should pose less hazard of groundwater pollution and make more effective use of the effluent as a source of water and nutrients for growing crops. Results of research and practice using low-rate applications are promising, though such use is not without its problems and potential hazards. Numerous crops have been produced over long periods of time using secondary sewage waters for irrigation.[38] Secondary sewage effluents typically contain nitrogen in the concentration range 20-40 mg L–1 of N.[39] According to Ayers and Tanji,[40] cropping problems due to excess nitrogen concentration occur in the range 5-30 mg N L–1 , and severe problems can occur when the level of N exceeds 30 mg N L–1 . Reduction in sugar content in sugarbeets, reduction of starch in potatoes, increased crop lodging and reduced milling and baking quality of small grains, reduction in yield and lodging of cotton, and reductions in yield and quality of citrus, avocados, and various other fruit crops have been reported to occur from
irrigating with sewage waters.[41] To cope with a high N supply in sewage waters, the farmer may have to substitute other water supplies at times.[42] The high organic matter loads of secondary sewage effluent also may reduce soil permeability through plugging of pores.[43]
Thus, though one must impose some limitations, there appears to be good potential for reusing secondary sewage waters for irrigation. Such use would add to the immediate water supply, reduce or delay need for developing new supplies, decrease need for synthetic fertilizers, and reduce need for disposal facilities. However, potential health and environmental hazards associated with use of sewage waters prevent us from advocating their widespread general use for irrigation. Even though sewage waters have been used for irrigation in California for more than forty years and elsewhere for similar long periods without any confirmed case of disease resulting from their use, there still exists the potential for contamination of crops, soils, and groundwaters with viruses, bacteria, parasites, trace organic compounds, nitrate, and heavy metals.[44] These health and environmental issues do not relate to salinity—our area of expertise—and hence we will not expand on them. Suffice it to say that the reuse of sewage effluents on croplands will not generally cause serious salinity problems, but the potential of health and environmental problems must be recognized. In any case, the use of sewage waters on croplands should always be evaluated on the basis of site-specific soil, crop, hydrologic, climatic, and management conditions, and the benefits weighed against the potential long-term consequences.
Aside from reuse of drainage or sewage water, learning to use brackish water effectively opens up new supplies of hitherto untapped groundwater. For example, Bahr estimated that New Mexico has in storage about 18 × 10¹² m³ of brackish groundwater.[45] If techniques were developed to use only 10 percent of that amount, this would supply the state for 500 years at current rates of use.
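Bahr's figure implies the following back-of-the-envelope supply rate; the arithmetic is ours:

```python
storage_m3 = 18e12                # estimated brackish groundwater in storage
usable_m3 = 0.10 * storage_m3     # if only 10 percent could be used
years_of_supply = 500
implied_annual_use_m3 = usable_m3 / years_of_supply
# about 3.6e9 m3/yr, the implied current statewide rate of use
```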
Changing the Cropping Pattern
Another approach towards "learning to live with salinity" is to make changes in agronomic management that overcome some of the problems. To some extent, it may be feasible to change management within an existing cropping pattern as salinity increases, as by increasing planting density of cotton to compensate for smaller plants.[46] More likely, one will need to substitute
more tolerant species. This, of course, involves a cost in terms of loss of flexibility—an opportunity cost.
Alternatively, one may attempt to breed more salt-tolerant varieties of desired species. Though not a new idea, it has only been in recent years that much activity has taken place in this area. Because of the nature of the selection process, the degree of intraspecific variation in salt tolerance tends to be small. Nevertheless, such variation has been observed in a number of species.[47] Without attempting to document recent progress, we suggest that breeding for improved salt tolerance deserves high priority in research, but one should be prepared for a relatively long gestation period. While it is reasonable to expect success in terms of 10 or 15 percent increases in tolerance for a given species, it appears unrealistic to anticipate increases that will permit the use of seawater for conventional crops.
A third alternative is the selection for cultivation of species not currently used in agriculture. Numerous halophytic species are known that, by definition, tolerate high salinity stresses. Some exhibit reasonable growth rates, but they tend to produce biomass at a rate that is low compared to that of typical agronomic crops.[48] Yet to the extent a market can be identified for these crops or their derivatives, they may offer the potential of a new agricultural industry that does not depend on fresh water supplies. It is also possible that disposal of brackish water by evaporation can be profitably coupled, under certain circumstances, with the production of halophytes that enhance or establish wildlife habitat.
Unfortunately, some of the recent advocates for new plants for new uses have overlooked some fundamental difficulties. For example, the introduction of guayule as a substitute source of rubber may well be highly desirable. However, guayule appears to be fairly sensitive to salt and, although it may well survive under droughty conditions, it appears to require substantial amounts of water for maximum production. Thus guayule could become an important new crop; one may be able to grow it with limited irrigation in, say, the Texas High Plains; but it will not produce much salable product without adequate water. Far more extreme have been claims about euphorbia as a source of petroleum. Aside from the question whether the energy balance in producing petroleum from euphorbia is sufficiently favorable to warrant further investigation, it is clear that euphorbia can only be grown, at reasonable production rates, with adequate irrigation. Thus many of the claims for potential new crops overlook
some fundamental relations between water use and dry matter production.
Options in Perspective
The objective of this paper is to explore means to sustain agricultural production in the face of deteriorating water supplies. We briefly considered, then dismissed as either impractical or outside the purview of agriculture, the interception of brackish water and its disposal or desalination.
We explored, in a bit more detail, the consequences of changes in irrigation efficiency, stressing the potential reduction in salt returned to receiving water bodies by reducing subsurface flows or by inducing salt precipitation in the rootzone. In the right circumstances, these reductions can be very substantial. It would be wrong, however, to deduce that minimal leaching is likely to affect total water use significantly. As long as biomass production is not reduced, the amount of water evapotranspired by the crop will remain approximately the same; any water saving would have to be associated with reductions in phreatophyte use or in unrecoverable deep seepage losses.
Increased use of brackish water for irrigation, our third option, presents technical challenges with obvious socioeconomic and institutional consequences. Whether one accepts the estimates of the Interagency Drainage Project[49] or the much higher ones by Dudek and Horner,[50] all parties agree that the existing drainage problems in the San Joaquin Valley will become more extensive and more serious. IDP estimates that, in 1990, 153,000 hectares will require artificial drainage; Dudek and Horner's estimate is 690,000 hectares. Use of drainage water for irrigation would reduce the volume needing export out of the San Joaquin Valley and, presumably, open up new disposal options. It would also raise questions of equity, and require establishment of dual pricing policies and resolution of water rights questions.
In view of the high marginal cost of developing new water supplies for Los Angeles—whatever the exact figure—the potential for reducing the total diversions to Imperial Valley without reducing the area irrigated opens up interesting questions for speculation: what if the Metropolitan Water District were to purchase some of the Colorado River water currently allocated to Imperial Valley? Since the drainage into the Salton Sea exceeds 1.2 × 10⁹ m³ per year, the amount of water potentially available is far from inconsequential. The complications, legal and otherwise, also could be substantial.
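To put that drainage volume in the acre-foot units conventional in western water allocation, a simple back-of-the-envelope conversion (a sketch, not a figure from the chapter) is enough:

```python
# Convert the cited Salton Sea drainage volume to acre-feet.
# 1 acre-foot = 1,233.48 m^3 (standard conversion factor).
M3_PER_ACRE_FOOT = 1233.48

drainage_m3_per_yr = 1.2e9  # lower bound cited in the text
acre_feet = drainage_m3_per_yr / M3_PER_ACRE_FOOT
print(f"{acre_feet:,.0f} acre-feet per year")  # roughly 970,000 acre-ft/yr
```

Nearly a million acre-feet per year, which is why the amount is characterized as far from inconsequential.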
Our fourth option, the introduction of new crops and the breeding for higher salt tolerance, we see as important and deserving of far greater attention than it has received. However, we counsel against extreme optimism. We do not anticipate practical use of seawater for irrigation, nor high levels of production per unit area from halophytes. We do anticipate, over time, modest but significant increases in the salt tolerance of several crops.
Conclusions
As the pressures on water supplies mount, water quality—especially salinity—will develop into an increasingly severe constraint. Several options do exist to reduce the adverse effects of salinity on agricultural and other water uses. There are ways to reduce the amount of salt returned to the rivers and ways to make more effective use of waters with relatively high levels of salinity.
None of these options is without cost. Nor does any one substitute for another. For each set of circumstances, a specially tailored set of management practices can be developed. However, whatever the technical merit of such a plan in a given set of circumstances, the degree to which it is implemented will often depend greatly on considerations other than technical ones. Without adjustments in institutions, without acceptance by the water user, without government participation, we do not foresee extensive reuse of drainage water or reductions in leaching volumes. We may hope that socioeconomic adjustments will occur in step with technical developments.
Discussion:
James W. O'Leary
Whenever the issue of water and agriculture is discussed, a couple of important points must be kept in mind. First, crop productivity, or dry matter production, per unit of irrigated land in the western United States is much greater than crop productivity per unit of nonirrigated land. The difference is so great that it recently has been claimed that irrigated agriculture has considerable land-conserving implications.[1] It has been calculated that if the 18.5 million irrigated acres of corn, sorghum, wheat, and cotton were dryland-farmed, it would take an additional seven million acres of land to make up the loss in yield.
The second important point to keep in mind, however, is that the high productivity is accomplished at the cost of high water consumption by the crops. Even though some plants are more efficient than others in their use of water, it is impossible for plants to assimilate carbon dioxide from the air and produce dry matter without losing water in the process.
The great concern for conserving water in agriculture has led to the increasing promotion of new crops that use less water. Some of these potential new crops are well-known drought-resistant plants, such as guayule and euphorbia. As van Schilfgaarde and Rhoades have pointed out, some fundamental relations between water use and dry matter production often are overlooked in these cases. This point is too important to be lost. Even though these plants, and many others, can survive and even grow on quantities of water so small that contemporary crops could not begin to survive, they do not have higher productivity than conventional crops when moderate or greater amounts of water are applied. The most efficient use of water by many of these plants occurs at growth rates that are far less than maximal. At growth rates that give high dry matter production, they are no more efficient in their use of water and, in fact, often are less efficient than conventional crops.
Thus, the important question that emerges from this brief analysis is: if forced to choose, which is more important, high productivity per unit of water used or high productivity per unit of land used? I think it is the latter. With the increasing loss of good farmland to urbanization, the demand for high productivity per unit of land area increases.
The only solution to this dilemma is an increased use of lower-quality water in agriculture. Van Schilfgaarde and Rhoades have discussed several very appropriate ways in which this could be accomplished, such as multiple use of irrigation water. We should be prepared to see the use of drainage water from irrigated fields to irrigate additional fields in the near future. The use of blended drainage water and fresh water, or even undiluted drainage water, for irrigation in places like the Imperial Valley in California and the Wellton-Mohawk area in Arizona should be feasible. It already is being done on a limited basis. Similarly, we should be able to use treated sewage effluent, as has been suggested. Blending with other water sources, such as brackish water, may be an effective way to dilute the high nitrogen content of the sewage water and the high salt content of the brackish water.
It will be necessary to continue breeding efforts toward increasing salt tolerance in contemporary crops, but it must be pointed out that progress will be slow and there is an inevitable upper limit to tolerance, far below seawater concentration. I agree with the authors that it is unrealistic to anticipate increases in salt tolerance in conventional crop plants that will permit the use of seawater; however, I think I can be more optimistic than they are about the potential for obtaining high productivity from halophytes irrigated with seawater. The conclusion one arrives at when comparing productivities often depends on the figures selected to represent the crops or crop types being compared. We have grown several species of halophytes irrigated with seawater, and the most productive ones yielded from 895 to 1,365 g DW m⁻² yr⁻¹.[2] This compares very favorably with yields from such conventional freshwater crops as alfalfa (760 g DW m⁻² yr⁻¹—U.S. average 1977).[3] This is an area of research that has been active for only a short time and in relatively few places, so there is not an abundance of good data yet. Nevertheless, I am extremely optimistic about the prospects of developing valuable crop plants from halophytes that could be
used in either seawater-based or brackish-water-based agriculture. With brackish water, the productivity should be even higher, due to the energy subsidy given to the plants. It is important to emphasize that this is a site-specific solution. Brackish or saline water can be used without detrimental effects on the site if the proper soil type is present and if proper management techniques are used.
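The yield comparison above is easier to judge when the per-square-meter figures are restated in tonnes per hectare (a unit conversion only; the yields are those cited in the text):

```python
# Convert g DW m^-2 yr^-1 to t DW ha^-1 yr^-1:
# 1 ha = 10,000 m^2 and 1 t = 1,000,000 g, so divide by 100.
def g_per_m2_to_t_per_ha(yield_g_m2):
    return yield_g_m2 / 100.0

halophyte_low, halophyte_high = 895, 1365  # seawater-irrigated halophytes
alfalfa = 760                              # U.S. average, 1977

print(g_per_m2_to_t_per_ha(halophyte_low))   # 8.95 t/ha
print(g_per_m2_to_t_per_ha(halophyte_high))  # 13.65 t/ha
print(g_per_m2_to_t_per_ha(alfalfa))         # 7.6 t/ha
```

On this basis the best seawater-irrigated halophytes out-yielded average alfalfa by roughly 20 to 80 percent.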
In general, I am in strong agreement with van Schilfgaarde and Rhoades regarding the options they have discussed and the emphasis they have placed on the need for other than technical considerations in order to see them implemented.
Discussion:
J. Eleonora Sabadell
Van Schilfgaarde and Rhoades give a clear description of the causes of salinity and the main management strategies to cope with the problem. I would like to point out other management options and some causes and effects of salination that may affect agriculture directly as well as indirectly.
As has been said, salinity is due to salt loading and/or salt concentration. Any proposed remedial or preventive measure will have to deal with either or both phenomena. The increase in salt loading is due not only to saline springs and farming, but also to other land uses by which salts, natural or man-made, are added to the hydrological system. Some examples of these activities are strip-mining, livestock grazing, urbanization, processing of fuel and nonfuel minerals, recreation (off-road vehicles), and disposal of industrial waste. These activities disturb the soil, releasing salts into the system or adding to the total chemical load, and in turn affect every land use in a feedback mode. The point is that to control salinity, it is necessary to adopt measures not only on the farm but also off the farm.
Agriculture today is highly mechanized and statistics show that the number of farm workers has declined steadily, even if acreage in crops has not changed substantially; and farmers derive income from other activities besides farming. This fact, coupled with strong growth in the population of the western states and with adopted policies on energy development, indicates that the economic base in the West has diversified significantly, and that the number of land users and intensity of use has increased. Even with existing environmental legislation and regulations, the control of the synergistic and cumulative impacts on water resources of intense and multiple use of drylands has not yet been successful.
Agriculture is nevertheless the number one industry in the country and in the western states. California, for example, produces more than 250 agricultural commodities and is the largest net exporter of food, grossing over $15 billion in 1980. Any change in the productive capacity of the region caused by declining water quality will be felt by the whole economic system. Along these lines, the Bureau of Reclamation has estimated the direct damage to an area at roughly $400,300 (in 1982 dollars) per mg/L of increased salinity, and the direct-plus-indirect damage at $534,000 (in 1982 dollars) per mg/L increase. If these costs continue to rise, they will impact the farming community and its financial ability to cope with its own salinity problems, to say nothing of the rest of the community and all other activities. Hence, a holistic approach to development and a recognition of the relationships among land uses and their consequences seem to be in order.
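The Bureau of Reclamation figures translate directly into damage estimates for any projected salinity increase. As a hedged illustration (the 10 mg/L increment below is hypothetical, not a value from the text):

```python
# Bureau of Reclamation damage estimates, in 1982 dollars per mg/L
# of increased salinity, as cited in the discussion.
DIRECT_PER_MG_L = 400_300
TOTAL_PER_MG_L = 534_000   # direct plus indirect

def damage(increase_mg_l, rate_per_mg_l):
    """Linear damage estimate for a given salinity increase."""
    return increase_mg_l * rate_per_mg_l

print(damage(10, DIRECT_PER_MG_L))  # 4,003,000 dollars
print(damage(10, TOTAL_PER_MG_L))   # 5,340,000 dollars
```

Even a modest 10 mg/L increase thus implies damages on the order of several million dollars per year, which underlines the discussant's point about the financial burden on the farming community.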
The "do-nothing" and the "yield to competition" options have already been adopted in some parts of the West, resulting in a shift from irrigated agriculture to some other more profitable land use. The impact on remaining agricultural activities and on the natural system involved is not yet known. The Supreme Court decision in the Sporhase v. Nebraska case will probably bring about more of these changes. The consequences and costs of such shifts should be determined and appraised before they occur.
Another relationship not often considered, but significant for the solution of the salinity problem, is that between water quantity and water quality. This is especially true for irrigated agriculture, as the authors have mentioned in describing the physical aspects of and the technological solutions to salination. Unfortunately, legal, political, and institutional entities view water resources from two separate perspectives, quantity or quality. The result has been separate bodies of law and institutional arrangements, and responsible agencies, scientific communities, technical staffs, and managers that seldom communicate with one another.
On-farm practices to control salinity, specifically the third and fourth schemes cited by the authors, are based on the natural or bred tolerance of plants for salt. The idea is to use as commercial crops plants that can be irrigated with saline water. The concern to be stressed is that the higher loads of salt brought into the farming operation can eventually overwhelm the tolerance threshold of even the new plants. The cost of rehabilitating soil and water resources from this more constrained situation would be higher than at present, and the possibility of developing ever more tolerant plants is questionable.
Available options for on-farm salinity control include building underground drains and drainage wells; adopting leaching irrigation, sprinkler irrigation, and special planting and bedding practices; using soil additives; and increasing irrigation frequency. Not all of these measures are applicable in all instances, nor is there a single solution for a given salinity problem. In practice there are constraints on the adoption of some of these control techniques, e.g., financial limitations on the building of structures, limited water supplies for leaching or for more frequent applications, bedding practices limited to specific crops, or lack of local cooperation. Salinity control schemes are in general a combination of procedures that complement each other and are appropriate for the selected cultivation practices. Nevertheless, there is still room to develop more efficient methods. Some of the characteristics to be considered in devising the best control options are: local physical and economic conditions; the time frame in which each procedure works; available local capabilities; interaction among individual farms, irrigation districts, and other users; technical compatibility; and the cumulative impact of practices, if any. The economic benefit of the optimal use of resources is self-evident.
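One of the listed options, leaching irrigation, is usually sized with a leaching-requirement rule of thumb. The formula below is the widely used Rhoades (1974) approximation adopted in FAO Irrigation and Drainage Paper 29, not a formula given in this chapter, and the example salinities are hypothetical:

```python
# Leaching requirement LR = ECw / (5 * ECe - ECw), where ECw is the
# electrical conductivity of the irrigation water and ECe the
# saturation-extract salinity the crop can tolerate, both in dS/m.
# LR is the fraction of applied water that must pass below the root
# zone to hold soil salinity at the crop's threshold.
def leaching_requirement(ec_w, ec_e):
    return ec_w / (5.0 * ec_e - ec_w)

# Hypothetical example: brackish water at 1.5 dS/m applied to a crop
# tolerating ECe = 6 dS/m.
lr = leaching_requirement(1.5, 6.0)
print(f"leaching fraction: {lr:.3f}")  # about 0.053
```

The small fraction in this example illustrates why, for a salt-tolerant crop, minimal leaching can suffice; the constraint cited above, limited water supplies for leaching, bites hardest with sensitive crops or saltier water, where the required fraction grows quickly.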