Visualization in the Earth Sciences

David DiBiase

Department of Geography
The Pennsylvania State University

Foreword to the Web Edition

I wrote this article at the request of Judy Kiusaalas, editor of Earth and Mineral Sciences, shortly after I joined Penn State in 1989. I'm grateful to Judy for granting permission to republish the article here. (A condensed version was also reprinted in Geotimes v. 36, n. 7, in July 1991.) Earth and Mineral Sciences is the research publication of Penn State's College of Earth and Mineral Sciences, an eclectic group of Earth science, materials science, and social science departments. About 19,500 copies of each issue are distributed worldwide -- primarily to current and former students and libraries.

The audience I meant to reach was faculty members of the college. My purpose was to challenge the perception that the cartography laboratory I was hired to direct was the graphical equivalent of a typing pool. I hoped to persuade researchers in the college to recognize that graphics are important in every stage of a research project, and that graphical expertise is valuable not just at a project's conclusion (when "figures" need to be "drafted" for publication), but right from the beginning. This idea had its origins in the work of John Tukey and Jacques Bertin.

Within the college, the article did not have the impact I'd hoped for. Although the Deasy GeoGraphics Laboratory has thrived, it has done so by focusing on education and service more than research. My interest in visualization has broadened to include instructional design and delivery. The widely reprinted "swoopy" diagram (Figure 1) did, however, seem to help cartographers here and elsewhere to articulate a renewed conception of the field that encompasses visual thinking as well as visual communication. Alan MacEachren is responsible for whatever notice this work has received, since he adopted and expanded upon the ideas it contains in many other publications. John Krygier gave the diagram its endearing nickname. The text of the version published here is unchanged from the original.

The term "visualization" has taken its place in the lexicon of science primarily by virtue of the stunning color imagery it is associated with. Rapid advances in computer graphics technology are making high-resolution color slides and video as common in presentations at professional conferences as they have been in pitchmen's booths at computer trade shows. The present explosion of demand for sophisticated presentation graphics is not unique to the scientific community, of course. A market-driven emphasis on cleverly designed color visuals pervades the popular media and business communities as well. But just as the emphasis on flashy graphics in newspapers like USA Today has coincided with a trend toward shallow journalism, many scientists are concerned that the integrity of the scientific enterprise may suffer if too much emphasis is placed on what skeptics like to call "pretty pictures."

Concerns about validity notwithstanding, visualization is attracting widespread attention in the earth sciences, as well as in fields like biochemistry, medicine, and astrophysics. Professional societies like the Institute of Electrical and Electronics Engineers (IEEE) and the Association for Computing Machinery's Special Interest Group on Computer Graphics (ACM-SIGGRAPH) sponsor entire conferences on the subject. A new periodical, Pixel: the Magazine of Scientific Visualization, published its first issue early this year. At Penn State, a Scientific Visualization Interest Group was formed in October 1989, and meets twice a year to discuss the local implications of developments in the field. In a major new book, Friedhoff (1989) observes that "visualization, because of the computer, is emerging as a distinctive new discipline."

The National Science Foundation commissioned an extensive report on visualization in 1987. "Visualization in Scientific Computing" (ViSC) remains the authoritative statement of the terminology, scope, and goals of scientific visualization. The report defines visualization as "a method of computing . . . a tool both for interpreting image data fed into a computer, and for generating images from complex multi-dimensional data sets" (McCormick and others 1987). The importance of the scientist's ability to interact with data by manipulating visual representations is stressed over the production of visual artifacts. "The most exciting potential ... of visualization tools is not the entrancing movies produced, but the insight gained and the mistakes understood by spotting visual anomalies while computing." The goal of visualization in scientific computing is "to leverage existing scientific methods by providing new scientific insight through visual methods." The report predicts that the development of visualization tools will "enhance human productivity and improve hardware efficiency," thereby bolstering American industrial competitiveness.

Some observers criticize the ViSC report for portraying visualization as a technological innovation that is detached from traditional non-computerized visual methods. Penn State geographer Alan MacEachren and others (in preparation) argue that "Visualization . . . is definitely not restricted to 'a method of computing,' ... [it is] first and foremost an act of cognition, a human ability to develop mental representations that allow us to identify patterns and create or impose order." The contributions of visual imagination and problem solving are well documented in the history of science. Copernicus' recognition of the dual components of apparent planetary motion, Kekule's vision of a snake biting its tail as a visual metaphor for the ringed structure of benzene, and Wegener's explanation of the similar shapes of the facing coasts of Africa and South America with a theory of continental drift are just a few examples of scientific insight through visual methods. Visualization in scientific computing should be conceived as a special case of visualization in science. "As we develop new techniques for data exploration . . . we must not lose sight of how scientists reason and what has been effective for centuries" (Ganter and MacEachren 1989).

The potential power of visualization has more to do with biological evolution than technological innovation. The complex and dangerous savannah environments in which our hominid ancestors evolved favored stereoscopic vision with acuity over distance. Individuals who reacted quickly and appropriately to subtle visual cues thrived. The development and use of tools extended the proto-humans' competitive advantage and stimulated evolution of the brain toward larger size and greater complexity. Verbal communication emerged as the behavior that most clearly differentiates Homo sapiens from other animals. We communicate among ourselves mostly through speech, but vision is our primary connection with the rest of the world.

Human visual perception actively seeks meaningful patterns, occasionally imposing them where they do not exist. Since Plato, who warned of the illusory nature of sensory images, western educational systems have stressed fluency with words and numbers as the legitimate modes of reasoning. Graphicacy (fluency with images), which depends on visual perception, is less valued because perception has been assumed not to involve thought. But as perception has become better understood, the distinction between it and cognition has blurred. There is a kernel of intelligence in the eye's proclivity for pattern detection. Art psychologist Rudolf Arnheim has observed that "an abstractive grasp of structural features is the very basis of perception and the beginning of all cognition" and suggests that the cultural bias against graphicacy represents "an unwholesome split which cripples the training of reasoning power" (Arnheim 1969).

Figure 1. The range of functions of visual methods in an idealized research sequence.

Despite this bias, visual methods are common and perform a range of functions in scientific research. Figure 1 idealizes the research process as a sequence of four stages: exploration of data to reveal pertinent questions, confirmation of apparent relationships in the data in light of a formal hypothesis, synthesis or generalization of findings, and presentation of the research at professional conferences and in scholarly publications. The process typically begins in a private realm of one or a few specialists who are intimately acquainted with the subject of the research. As the project comes to the attention of a wider circle of peers, the researcher's emphasis gradually shifts from answering his or her own questions to communicating ideas to others. Finally, the research is disseminated into the public realm of scholarly communications. The intent of visualization evolves in parallel with the progression from the private to the public realms. Visual thinking implies the generation of ideas through the creation, inspection, and interpretation of visual representations of the previously non-visible, while visual communication refers to the effective distribution of ideas in visual form. Existing visual methods, including microcomputers equipped with inexpensive design software, are already able to produce impressive information graphics for publication. The greatest potential contribution of new computer-based visualization tools may be in the private realm, where the emphasis is not so much on generating images, but using images to generate new ideas.

Exploratory visual methods

As the provocative statistician John Tukey (1980) has stressed, "Science ... does not begin with a tidy question." It is a truism that "finding the question is often more important than finding the answer," but scientific education generally emphasizes the latter, perhaps because it is easier to teach. Tukey observes that questions are generated mainly "by quasi-theoretical insights and the exploration of past data." His 1977 text Exploratory Data Analysis (EDA) introduced a number of robust techniques for making sense of complex data sets. More than a collection of techniques, EDA is in essence an attitude: "a willingness to look for what can be seen, whether or not anticipated." Revelation of patterns and anomalies through visual representations of data is the primary mode of exploratory analysis, since "the picture-examining eye is the best finder we have of the wholly unanticipated." Tukey's arguments for EDA may not be well known to earth science researchers at Penn State, but the attitude he advocates seems to be widespread.

Figure 2. Visualization is a way of seeing, not just a method of computing. This sketch map reveals an unexpected pattern of correlations between winter temperatures and a particular synoptic weather regime. The dark swath represents weak correlations (original in colored pencil).

The revelation of unexpected patterns through visual methods does not necessarily require elaborate computer graphics. A map rendered with the simplest visualization tools, colored pencils and paper, raised fruitful questions for associate professor of geography Brent Yarnal and graduate students Daniel Leathers and Michael Palecki (now faculty members at the University of Colorado at Boulder and the State University of New York at Buffalo, respectively). The map (Figure 2) displays correlations between winter temperatures and the Pacific North American (PNA) teleconnection, an index of the presence or absence of a particular synoptic weather regime. Strong correlations for both the northwest and southeast United States are revealed. A similar pattern was demonstrated for correlations between the PNA index and annual temperatures. The pattern forced Yarnal and his colleagues to question the conventional wisdom that strong PNA patterns are exclusively a winter phenomenon. They were able to show that the diagonal swath of relatively weak correlations was explained by the blocking influence of west coast topography. Links among strong PNA patterns, El Niño events, and polar circulation patterns were subsequently demonstrated, with significant implications for the prediction of regional effects of global climate change.
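For readers curious about the mechanics behind such a map, the sketch below shows, using purely hypothetical data, how a correlation field of this kind can be computed cell by cell with the NumPy library. It is a generic illustration of correlating a teleconnection index with gridded temperatures, not the procedure Yarnal and his colleagues actually used.

    import numpy as np

    # Hypothetical data: 40 winters of mean temperature on a 30 x 50 grid,
    # and a PNA index value for each of those winters.
    rng = np.random.default_rng(0)
    temps = rng.normal(size=(40, 30, 50))   # (years, rows, cols)
    pna = rng.normal(size=40)               # (years,)

    # Pearson correlation between the index and temperature at every grid cell.
    t_anom = temps - temps.mean(axis=0)                # temperature anomalies
    p_anom = pna - pna.mean()                          # index anomalies
    cov = (t_anom * p_anom[:, None, None]).mean(axis=0)
    corr = cov / (temps.std(axis=0) * pna.std())       # (rows, cols) correlation map

    # A quick summary, analogous to shading the swath of weak correlations:
    weak = np.abs(corr) < 0.3
    print("fraction of grid with weak correlation:", weak.mean())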

John Louie, assistant professor of geosciences, uses computer-based visualization tools to explore the structure of large, three-dimensional seismic data sets. The solid volume depicted in Figure 3 represents expected travel times of seismic waves from survey points on the surface of the Earth to subterranean points surrounding an earthquake fault in southern California.

Figure 3. A computer-generated representation of seismic wave travel times from surface survey points to underground points surrounding an earthquake fault in southern California. The gradation from white to black equals 2.5 seconds of travel time.

Figure 4. The transmission of seismic waves represented as layers at half-second intervals.

The gradation from white to black symbolizes 2.5 seconds of travel time. In Figure 4, Louie has dissected the solid volume, revealing a series of layers representing the transmission of seismic waves at half-second intervals. The bulges and ripples on these surfaces indicate variations in seismic velocity that affect how well fault structure can be inferred from the data. Figure 5 is the geologic model interpreted from the data. Subsurface faults and basins are visible through partially transparent recent sediments. Louie remarks that his work follows from a tradition of "high resolution seismology" in which "people have always looked at their data as imagery." Computer-based visualization techniques are becoming more important as seismologists come to grips with the increasingly massive data sets required to explore subtle crustal structures.

Figure 5. A geologic model interpreted from the data set explored in Figures 3 and 4.
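The travel-time "layers" in Figure 4 are, in general terms, surfaces of constant value extracted from a gridded volume. The sketch below illustrates that general idea on a synthetic travel-time volume, using the marching cubes routine in the scikit-image library; it is an assumption-laden stand-in, not Louie's software or data.

    import numpy as np
    from skimage import measure  # scikit-image

    # Synthetic travel-time volume (seconds): times increase with distance
    # from a hypothetical source near one corner of a 60 x 60 x 60 grid.
    z, y, x = np.mgrid[0:60, 0:60, 0:60]
    velocity = 4.0                                   # km/s, uniform in this toy case
    spacing = 0.1                                    # km between grid points (assumed)
    travel_time = np.sqrt(x**2 + y**2 + z**2) * spacing / velocity

    # Extract one triangulated isosurface per half second of travel time.
    surfaces = []
    for level in np.arange(0.5, travel_time.max(), 0.5):
        verts, faces, _, _ = measure.marching_cubes(travel_time, level=level)
        surfaces.append((level, verts, faces))
        print(f"{level:3.1f} s isosurface: {len(verts)} vertices, {len(faces)} faces")

    # Each (verts, faces) mesh could then be rendered and inspected for the
    # bulges and ripples that indicate velocity variations.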

The sheer volume of data becoming available to scientists, particularly Earth system scientists, is the strongest justification for research and development of new visualization tools. NASA's Earth Observing Satellite program (EOS), for example, will launch five new polar orbiting platforms in the next decade carrying some fifteen high-resolution sensing instruments. A recent estimate suggests that the image data generated by these sensors will accumulate at a rate of a terabit (10^12 bits) per day, "which is equivalent to about a million pictures every day or five billion in the course of the project. This colossal volume of data must be converted, reduced, catalogued, and then distributed" (Soffen 1990). It must also be explored and analyzed by an international cadre of scientists, including researchers at Penn State's Earth System Science Center (ESSC), a major EOS research site. ESSC's "share" of the expected daily data flow, based on the approximate proportion of the total EOS budget earmarked to support research at Penn State, is more than 100 gigabytes (8 x 10^11 bits) per day. By comparison, the total amount of digital mass storage currently available on the ESSC computer network is only approximately 1 gigabyte. Extensive visualization hardware and software tools will be indispensable in the exploration of this vast collection of digital imagery.
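The arithmetic behind these figures is easy to restate. The short calculation below simply converts the round numbers quoted above (a terabit per day, a million images per day, five billion images over the project); it is a back-of-the-envelope check, not an official EOS estimate.

    # Back-of-the-envelope arithmetic for the EOS data stream described above.
    bits_per_day = 1e12                              # "a terabit per day"
    gigabytes_per_day = bits_per_day / 8 / 1e9
    print(f"{gigabytes_per_day:.0f} GB of imagery per day")               # 125 GB/day

    # "About a million pictures every day" implies an average image size of:
    images_per_day = 1e6
    print(f"{bits_per_day / 8 / images_per_day / 1e3:.0f} kB per image")  # ~125 kB

    # "Five billion in the course of the project" implies a project span of:
    print(f"{5e9 / images_per_day / 365:.1f} years")                      # ~13.7 years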

Confirmatory visual methods

Once a vague insight has been transformed into an explicit question, a scientist turns from an exploratory to a confirmatory mode of analysis. Confirmatory techniques, which allow researchers to confirm or reject hypotheses about a population on the basis of a sample with an estimate of the probability of an erroneous choice, are "one of the great intellectual products of our century" (Tukey 1977). Confirmatory visual methods are particularly important to modelers, for whom visual representations are windows not only to the realities they attempt to simulate, but to the workings of the models themselves.

Scientists concerned with global environmental change often rely on numerical models to simulate the behavior of unobservable physical processes. The sensitivity of model runs to altered initial conditions is a reflection of a model's robustness. Robust models inspire greater confidence in simulated outcomes. According to the geographer Cort Wilmott (1984), "visual display of sensitivity analyses can significantly enhance a researcher's understanding of a model when they are used in conjunction with the appropriate quantitative indices." An example of this application of visual methods is found in the work of Eric Barron and Bill Peterson (1989) of the Earth System Science Center, who have used a general circulation model to simulate mid-Cretaceous ocean circulation patterns. Finding that their results contradicted earlier simulations, they conducted experiments to determine the sensitivity of their model to different assumptions about bathymetry and climatic conditions (Barron and Peterson 1990). Figure 6 maps the simulated mid-Cretaceous surface ocean circulation. Model sensitivity was assessed by visually comparing the vector patterns on this map to others generated with different bathymetric and climatic conditions assumed. The sensitivity experiment maps were observed to be largely indistinguishable from the original simulation, indicating that their simulation is robust with regard to the assumptions considered.

Figure 6. Map of computer-simulated mid-Cretaceous surface ocean circulation. Comparison of several simulated maps based on different assumptions enables researchers to assess the influence of the assumptions on the outcome of the simulations.
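The visual comparison described above can be paired with the kind of quantitative indices Wilmott recommends. The sketch below is a generic illustration with hypothetical vector fields, not Barron and Peterson's procedure: it compares a control simulation with a sensitivity run using a root-mean-square vector difference and a mean directional agreement.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical surface-current fields (u, v components on a lat/lon grid)
    # from a control run and a sensitivity run with altered bathymetry.
    u0, v0 = rng.normal(size=(2, 40, 80))
    u1 = u0 + rng.normal(scale=0.05, size=(40, 80))
    v1 = v0 + rng.normal(scale=0.05, size=(40, 80))

    # Root-mean-square difference between the two vector fields.
    rmsd = np.sqrt(np.mean((u1 - u0)**2 + (v1 - v0)**2))

    # Mean cosine of the angle between corresponding vectors (1.0 = identical directions).
    dot = u0 * u1 + v0 * v1
    mag = np.hypot(u0, v0) * np.hypot(u1, v1)
    direction_agreement = np.mean(dot / np.where(mag == 0, 1, mag))

    print(f"RMS vector difference: {rmsd:.3f}")
    print(f"mean directional agreement: {direction_agreement:.3f}")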

Numerical models designed to simulate the behavior of environmental systems may consist of hundreds of thousands of lines of computer code. The more intricate these models become, the more difficult it is to effect small but significant adjustments in model performance. Computer-based visualization tools may soon allow scientists "to steer, or dynamically modify, computations while they are occurring" by manipulating visual representations of model parameters in real time (McCormick and others 1987). One Penn State researcher who would make use of this capability is meteorology professor Tom Warner. Warner and his colleagues have implemented a numerical model for mesoscale weather prediction in the eastern United States. The model's predictions take the form of maps overlaid with linear symbols representing atmospheric phenomena. The fidelity of the model's predictions to actual weather patterns has been impressive, but Warner reports that occasionally some phenomena that he was certain should appear on predicted maps do not. In conversation he has expressed the need for software that would allow him to use a light pen to sketch in missing features on a computer display. Most important, his modifications of the display should alter the equations of the model itself, so that when the model was run again, the previously missing features would appear as expected. Such software does not yet exist except in experimental form at a few advanced computer graphics research laboratories. When the capability to interactively steer numerical models through a graphic interface finally does become available, one of the most ambitious goals of the proponents of scientific visualization will have been realized.

Synthetic and presentational visual methods

A research project is incomplete until it is communicated. Communication requires that ideas be made explicit. The construction of visual representations can be helpful both in clarifying ideas and conveying them to others.

The transposition of a tentative personal investigation into a cogent public expression is a synthetic process. Synthesis, in this sense, entails summarizing and generalizing the results of exploratory and confirmatory analyses, and articulating a new, integrated conception of how the components of the research problem interrelate. It is a bridge from the private to the public realms. A remarkable example of visual synthesis is Figure 7, meteorology researcher Nelson Seaman's sketch of the major atmospheric phenomena contributing to an archetypal east coast cyclogenesis event (Lapenta and Seaman 1990). Seaman has described in conversation how a conceptual model like Figure 7 follows from "visualizing in your own mind . . . how a maddeningly complex system is solving its own problems." Computer-based numerical models and simulations contributed to the knowledge base which gave rise to this visualization, but a keen visual imagination was the most essential instrument of all.

Figure 7. Visual synthesis of the processes contributing to an East Coast cyclogenesis event.

Effective visual communication in the public realm requires expertise in graphic design and production in addition to the imagination needed for visual thinking. It is quite a different matter to compel attention and understanding in a diverse, hurried, skeptical population of readers than to communicate with an eager, familiar group of associates. The main objective of presentation graphic design in science has been most concisely expressed as 'graphical excellence': "that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space" (Tufte 1983). Despite some thirty years of cartographic research dedicated to codifying a set of rules to maximize the effectiveness of maps (arguably the most complex form of visual communication), graphical excellence cannot be assessed objectively, much less generated automatically by a computer program. Since vision cannot be shared directly, the design of presentation graphics must involve subjective judgment.

Professional communication in science is highly competitive. It is not enough for a research report to be substantial and coherent, it must be convincing as well. Figure 8 is a rendering of Seaman's design produced for publication at the Deasy GeoGraphics Laboratory of the Department of Geography, a science graphics design studio that supports research and instruction in the College of Earth and Mineral Sciences. Figure 8 is not only more 'realistic' than the sketch on which it is based, and therefore more accessible to unfamiliar readers, it also imparts an atmosphere of professionalism that asserts the authority of the investigators, the institution they represent, and the conference or publication in which the research report appears.

Figure 8. Presentation graphic based on the sketch in Figure 7 produced for publication in a scholarly journal.

Overhead transparencies and 35mm slides are the standard forms of graphic presentation at professional conferences. Like figures in printed publications, these are essentially static, two-dimensional forms in which the illusion of a third dimension can be achieved. Animated "computer movies" allow earth scientists to exploit a fourth visual dimension, time, to represent dynamic processes occurring in Earth's atmosphere, in its interior, or across its surface. The widespread availability of VHS tape players and television monitors has made it practical for scientists to present animated visualizations on video tape at professional conferences. Enthralling videos like the well-known "flyby" of the computer-simulated surface of Mars by the Jet Propulsion Laboratory have aroused serious interest in science moviemaking. At Penn State, Rob Fisher, artist-in-residence in the College of Engineering, has coordinated the production of an animated video which promotes an international solar sail competition in 1992. In a presentation at the Scientific Visualization Interest Group's April 1990 meeting, Fisher proposed that the Visual Engineering Laboratory he supervises be established as a campus-wide center for visualization research and applications, on the model of the prestigious National Center for Supercomputing Applications at the University of Illinois. In the College of Earth and Mineral Sciences, John Louie and Tom Warner's mesoscale model team have produced animated videos based on their research. The planned addition of four-dimensional graphic design and video production capabilities will position the Deasy Lab to assist EMS researchers in preparing animated presentations for professional conferences. So equipped, the Deasy Lab will also serve the College as a locus of visualization research, in cooperation with similar centers across campus.

Publication in peer-reviewed scholarly journals is required for academic promotion and tenure in research institutions like Penn State. The printed form of the journal exerts a strong influence on the design of visual representations in the public realm. The ViSC report complains that the static medium of print is inadequate for scientific communication. Others claim that "Print is moving toward extinction. The digital electronic medium is rapidly supplanting it" (Seiler 1989). There is much evidence to the contrary. It has been estimated that nearly 5,000 scientific and technical journals were in publication in the United States in 1985, and a market analysis suggests that this number is likely to increase (Duke 1985). The rapid growth of on-line data base services notwithstanding, the printed form of scientific journals seems likely to endure, at least until computer terminals can be made as portable, self-contained, legible, and inexpensive as books. Color printing is becoming more common among scientific publications with large circulations, but economies of scale dictate that black-and-white graphics will remain an important medium of presentation graphics in science for the foreseeable future.

Figure 9. Composite map of a synoptic flow type designed for publication in a scholarly journal. Seven spatially distributed variables are displayed simultaneously.

The more information one attempts to pack into a small black-and-white graphic, the more crucial graphic design expertise becomes. Figure 9 is a noteworthy example designed by John Lanicci, graduate student of meteorology, and staff of the Deasy GeoGraphics Lab. It represents a "composite mean analysis for a synoptic flow type that is both favorable for the formation of a capping inversion and severe thunderstorms over the southern Great Plains in early spring" (Lanicci and Warner, in preparation). Seven overlapping phenomena are depicted over the geographic base: mean sea-level isobars (thick solid contours); mean 500 mb heights (thick dashed contours); a mean surface 55° F dewpoint isopleth (thin dot-dashed contour); a mean 700 mb 6° C isotherm (thin dashed contour); areas of greater than 50 percent frequency of unstable buoyancy (shading with thin, nearly vertical hatches); the presence of a capping inversion with a well-mixed layer above (shading with thick, nearly horizontal hatches); and the region of overlap, where the capping inversion overlies buoyantly unstable air (shading with thick cross-hatches).

Figure 10. The same data shown in Figure 9, plotted without the intervention of a designer. Graphics software alone contributes no more to the quality of graphics than word processors do to the quality of literature.

The same seven phenomena are shown in Figure 10 in an approximation of the appearance of the graphic if it were plotted "automatically," without the intervention of a designer. Figure 9 is more likely to help readers visualize the synoptic type because each of the seven phenomena appears to occupy its own separate layer, giving the impression that the graphic could be peeled apart and studied piece by piece. This appearance of good visual order is achieved by applying the 'visual variables' (size, value, texture, color, orientation, and shape) according to general graphic design principles informed by a clear conception of the subject matter. The process typically involves a content expert (the author) and a cartographer, who iteratively rethink and revise the graphic until the author is satisfied, or runs out of time or funding.
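The layering effect described above can be approximated in ordinary plotting software by giving each variable its own combination of line weight, dash pattern, and hatching. The sketch below is only a schematic built on synthetic fields (the variable names are invented), meant to show how distinct visual variables keep overlaid layers separable in black and white; it is not a reproduction of Figure 9.

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic fields standing in for the layers of a composite analysis.
    x, y = np.meshgrid(np.linspace(0, 10, 120), np.linspace(0, 6, 80))
    slp = 1012 + 6 * np.sin(x / 2) * np.cos(y / 2)                # "sea-level pressure"
    z500 = 5600 + 80 * np.cos(x / 3) + 40 * np.sin(y)             # "500 mb heights"
    capped = np.cos((x - 5) / 2)**2 * np.exp(-((y - 3) / 2)**2)   # "capping inversion"

    fig, ax = plt.subplots(figsize=(7, 4))

    # Each layer gets its own combination of visual variables (size, value,
    # texture, orientation) so it reads as a separate plane of information.
    ax.contour(x, y, slp, levels=8, colors="black", linewidths=1.6)
    ax.contour(x, y, z500, levels=8, colors="black", linewidths=1.6, linestyles="dashed")
    ax.contourf(x, y, capped, levels=[0.5, 1.0], colors="none", hatches=["///"])

    ax.set_title("Schematic multi-layer composite (synthetic data)")
    plt.savefig("composite_sketch.png", dpi=200)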

Fine-quality presentation graphics are beyond the reach of too many research scientists. Design studios like the Deasy Lab provide an important service, but for some the costs of professional graphic design and production are prohibitive. Ideally, scientists ought to feel as confident in designing a graphic as they do composing prose or calculating statistics. Scientific analysis and artistic design involve distinctly different cognitive styles, yet many scientists have a well-developed visual design sense and produce impressive graphics when afforded sufficient time and the appropriate tools. The graphics capabilities of widely available database, statistics, and mapping software packages are steadily improving, and are able to produce simple graphs and maps that meet the quality standards of most scholarly journals. However, none of the software scientists regularly use offers the flexibility to combine multiple information layers from different sources in a coherent composite graphic like Figure 9. The Deasy Lab relies on microcomputer-based graphic design software developed for the desktop publishing industry. Comparably flexible software has not yet been developed for the workstation and mainframe environments in which much scientific computing takes place, partly because the science market is too small to attract the attention of developers of graphic design software. Recognizing this problem, Bill Peterson, a research associate with the Earth System Science Center, and David DiBiase are incorporating new presentation design capabilities into the NCAR graphics package of the National Center for Atmospheric Research, a mapping and graphing program used by thousands of earth system scientists around the world. The development of general purpose 'graphics processing' software of comparable utility to existing word processing packages is a worthwhile but distant goal.


Visualization performs a range of functions in earth science research at Penn State, and the importance of visual methods seems certain to increase. Computer-assisted visualization techniques will be the only effective way to explore the enormous data sets amassed by the new generation of earth observing satellites. Interactive model steering through graphic interfaces may become indispensable as numerical models and simulations are extended to incorporate increasingly detailed knowledge of the dynamics of environmental systems. Black-and-white print graphics will continue to be an important medium for the synthesis and communication of future research findings even as alluring new visual forms, including animated video, become available. New computer-based graphics tools will enable more scientists to create sophisticated presentation graphics on their own, and the expertise of visual design specialists will be applied to the visualization of complex multivariate data sets. Visualization is an integral activity among the several earth science disciplines, and may soon emerge as a distinct interdisciplinary research focus in the College of Earth and Mineral Sciences.


References

Arnheim, Rudolf. Visual Thinking. University of California Press, Berkeley, 1969.

Barron, Eric J. and William H. Peterson. "Model simulation of the Cretaceous ocean circulation." Science 244, 12 May 1989.

Barron, Eric J. and William H. Peterson. Mid-Cretaceous ocean circulation: results from model sensitivity studies. (1990, in preparation)

Duke, Judith S. The Technical, Scientific and Medical Publishing Market. Knowledge Industry Publications, Inc., White Plains, New York, 1985.

Friedhoff, Richard M. Visualization, the Second Computer Revolution. Harry N. Abrams, Inc., New York, 1989.

Ganter, John and Alan M. MacEachren. "Cognition and the design of visualization systems." Presented at the first meeting of the Scientific Visualization Interest Group, Penn State University, October 3, 1989.

Lanicci, John M. and Thomas T. Warner. "A synoptic climatology of the elevated mixed-layer inversion over the southern plains in spring. Part II: The life cycle of the lid." (in preparation)

Lapenta, William M. and Nelson L. Seaman. "A numerical investigation of east-coast cyclogenesis during the cold-damming event of 27-28 February 1982. Part I: Dynamic and thermodynamic structure." (in preparation)

McCormick, Bruce H., Thomas A. DeFanti, and Maxine D. Brown, eds. "Visualization in scientific computing." Computer Graphics 21:6, ACM-SIGGRAPH, New York, 1987.

MacEachren, Alan M., in collaboration with Barbara P. Buttenfield, James B. Campbell, David DiBiase, and Mark Monmonier. "Visualization." in Ronald Abler, Judy Olson, and Melvin Marcus, eds., Geography's Inner World. Association of American Geographers, Washington D.C. (in preparation)

Seiler, Lauren H. "The future of the scholarly journal." Academic Computing, September 1989.

Soffen, Gerald A. "NASA's mission to planet earth." NASA Tech Briefs 14:1. National Aeronautics and Space Administration, New York, 1990.

Tufte, Edward R. The Visual Display of Quantitative Information. Graphics Press, Cheshire, Connecticut, 1983.

Tukey, John W. Exploratory Data Analysis. Addison-Wesley, Reading, Massachusetts, 1977.

Tukey, John W. "We need both exploratory and confirmatory." The American Statistician 34:1, 1980.

Wilmott, Cort J. "Evaluation of model performance." in Gary L. Gaile and Cort J. Wilmott, eds., Spatial Statistics and Models. D. Reidel, Dordrecht, The Netherlands, 1984.