In this essay we will discuss the recent trends that set forth a paradigm for geography.
Recent Trends in Geography
Essay Contents:
- Essay on the Paradigms in Geography
- Essay on the Quantitative Revolution
- Essay on the Systems Approach
- Essay on Systems Analysis
- Essay on Entropy and Information Theory
- Essay on Models and Analogue Theory
- Essay on Reaction to the Quantification
- Essay on Positivism
Essay # 1. Paradigms in Geography:
In Kuhn’s terminology, geography was in the pre-paradigm phase until the time of Darwin. Kant did not set forth a paradigm for geography, and thus he was not the founder of a school of geography, but clarified the role of the subject and its position in relation to other contemporary sciences.
However, Darwinism did not completely discredit the major ideas of contemporary geography, and the study of development over time continued to be regarded as important. It was largely from the Darwinian tradition that Friedrich Ratzel led the subject into the first phase of professionalism, and the ‘deterministic school’ he founded represented the first paradigm phase in geography.
An interesting aspect of the development of geography during the latter half of the nineteenth century was the way in which research workers attempted to develop the subject as a nomothetic science to a greater extent than might have been expected. It was especially after Darwin that scientists were looking for the governing laws of nature (and the materially conditioned social laws) and to a considerable extent adopted a nomothetic or law-making approach.
Inductive explanation was replaced by the hypothetico-deductive method, which was especially a characteristic of the natural sciences. Research workers, starting from inductive arrangements of their observations or from intuitive insights, tried to devise a priori models of the structure of reality.
These were used to postulate a set of hypotheses which could be confirmed, corroborated or rejected by testing against empirical data. A large number of confirmations led to the verification of a hypothesis, which was then established as a law. Such a law stands until its eventual rejection as a result of later research.
Scientific knowledge, obtained through hypothetico-deductive explanation, is a kind of controlled speculation. An increasing number of human geographers sought to apply such a procedure during the 1950s. The underlying philosophy, known as positivism, was developed by a group of philosophers working in Vienna during the 1920s and 1930s. It is based on a conception of an objective world in which there is order, waiting to be discovered.
That order—the spatial patterns of variation and co-variation in the case of geography—exists and cannot be contaminated by the observer. A neutral observer, on the basis of either observation or his reading of the research of others, will derive a hypothesis about some aspect of reality and then test it; the verification of hypotheses leads to the building of laws. A key concept of this philosophy is that laws must be proved through objective procedures, and not accepted simply because they seem plausible.
The hypothetico-deductive method has led human geography to develop into a model-building theoretical science because it deals with quantifiable phenomena which seem to have a known situation in time and space. However, Minshull (1970) observed that the hypothetico-deductive method was not applied in a strict sense in the subject: the determinists and geomorphologists stated their generalisations first and then gave a few highly selective examples as proof.
Geographers could not test hypotheses by verification procedures involving a number of repeated experiments, as physicists could. Statistical tests that might play the same role as such experiments were not then sufficiently developed to cope with the complex geographical material.
The ruling deterministic paradigm of Ratzel-Semple-Huntington was challenged by a new possibilist-regional paradigm, developed by the French School of geography led by Paul Vidal de la Blache. He stressed that man has free will and participates in the development of each landscape in a unique historical process. The possibilist paradigm presented a model of man perceiving the range of alternative uses to which he could put an environment and selecting that which best fitted his cultural dispositions.
Geographers were trained to concentrate on the study of the unique single region. This paradigm necessitated the adoption of the method of ‘participant observation’, which involved field study of the region in order to identify the uniqueness of the man-environment relationship.
Although possibilism and the regional geographical school established a new paradigm, it did not immediately displace its predecessor. It was because of the strong position of geomorphology and physical geography that the deterministic explanatory model survived side by side with possibilism.
Debate over environmental determinism and possibilism continued into the 1960s and was actively pursued in the United Kingdom in the first decade after the Second World War; Spate (1957), for example, proposed a middle ground with the concept of ‘probabilism’. Perhaps the major reaction against the possibilist paradigm came from the Australian geographer Griffith Taylor, who felt that the possibilists built their arguments in temperate environments, such as that of north-western Europe, which do indeed offer several viable alternative forms of human occupance. But temperate environments are rare; in most of the world (as in Australia) the environment is much more extreme and its control over man is accordingly much greater.
He used the phrase ‘Stop-and-Go’ Determinism to describe his view that man might attempt whatever he wished with regard to his environment, but that in the long run nature’s plan would ensure that the environment won the battle and forced a compromise with its human occupant.
The regional paradigm which was dominant in the years before and just after the Second World War was an attempt at generalisation, but at generalisation without structured explanation. It was thus of a very different type from the increasingly discredited law-making attempts of the previous paradigm.
The main focus of the paradigm was on areal differentiation, on the very character of the Earth’s surface, especially of the inhabited parts. Its picture of that variation was built on parallel topical studies of different aspects of the physical and human patterns observed.
The regional paradigm flourished in France, Great Britain and the United States. Richard Hartshorne’s name is associated with the regional paradigm. By the 1950s, disillusionment with the empiricist philosophy was growing and slowly the topical specialisms came to greater dominance and gradually the regional paradigm lost much of its earlier acceptability.
Carl Ritter was probably the first geographer to have provided a clear description of his method, which appears to have approximated the Baconian Classical Model of how a scientist works.
The inductive Baconian route derives its generalisations out of observations: a pattern is observed and an explanation developed from and for it. This begins with an observer perceiving patterns in the world; he then formulates experiments, or some other kind of tests, to prove the veracity of the explanations which he has produced for those patterns. Only when his ideas have been tested successfully against data, other than those from which they have been derived, can a generalisation be produced.
Ritter used the inductive method as a framework for his presentation of data and as a means to arrive at some simple empirical generalisation. He offered a teleological explanation which sought to explain that everything that happened was directed to a purpose whose controlling conditions were laid down by God. Such a teleological philosophy cannot be tested empirically; and therefore does not qualify as a scientific explanation.
It does, however, have the qualities of a paradigm as defined. Because of Ritter’s teleological philosophy, it is difficult to identify the professionalism phase in geography. It is academically important to note that the Ritterian school of geography, which was the first actively founded school, did not lead the subject into its first paradigm phase; that development is instead attributed to the growth of Darwinism.
Paradigm Shift:
It was Schaefer who brought about a paradigm shift, when he criticised the exceptionalists’ claims made for the regional paradigm. He claimed that in geography the major regularities which are described refer to spatial patterns and hence geography should be conceived as the science concerned with the formulation of the laws governing the spatial distribution of certain features on the surface of the Earth.
It is these spatial arrangements of phenomena, and not the phenomena themselves, about which geographers should be seeking to make law-like statements. He assigned the nomothetic (law-producing) philosophy to geography as a spatial, social science, recognised the problems of experimentation and of quantification, and favoured a methodology based on map correlations.
Schaefer, with his spatial organisation paradigm, initiated what may be called the quantitative and theoretical revolutions in geography. However, the major advances towards a unifying methodological and philosophical basis of the quantitative school were made in the 1960s by Peter Haggett, Richard Chorley and David Harvey. Haggett’s book Locational Analysis in Human Geography (1965) called for a debate within the subject on this paradigm shift.
In Models in Geography (1967), both Chorley and Haggett stated that they had looked at the traditional paradigm in geography and had found that it was only classificatory and under severe stress. They held the view that geography should adopt an alternative model-based paradigm and made it clear that the new developments within the subject not only represented a wider range of methods but demanded a fundamental paradigm shift.
Model building was set up as the aim of geographical investigation and inquiry, a task to be done with the help of quantitative methods. Geographers were thus left with the option of choosing between the traditional and the new model-based paradigm.
Webber (1977) identified an entropy-maximising paradigm which focuses on location models (the probability of an individual being in a particular place at a particular time), interaction models (the probability of a particular trip occurring at a particular time), and on combined location/interaction models.
The entropy-maximising paradigm asserts that though the study of individual behaviour may be of interest, it is not necessary for the study of aggregate social relations. The patterns predicted by the models are functions of the constraints (which are the information provided at the meso-state), so that knowledge of these constraints means that the entropy-maximising paradigm is capable of yielding meaningful answers to short-run operational problems, and is thus of immense value for immediate planning purposes.
According to Webber, much of the economic system is variable: the constraints and the spatial form of the urban region may change.
The research task facing the entropists is:
(1) To identify the constraints which operate upon urban systems, which is partly an economic problem;
(2) To deduce some facts of the economic relations among the individuals within the system from the use of the formalisms; and
(3) To construct a theory which explains the origin of constraints. Only when the third task has been attempted, may the paradigm be adequately judged.
Since the early 1950s, the discipline of geography has been subject to frequent paradigm shifts as geographers have attempted to explore a multiplicity of philosophical avenues and research strategies to deal with a multitude of problems and observations. Berry (1978) suggests that pluralism and inter-paradigm conflict have been much more common than periods of normal science.
He observes: ‘A diversity arising from the mosaic quality of modern geography, they have moved from one paradigm to another, and in the last decade [1960s] they have been extremely dynamic…. With multiple ideas and origins, modern geography could rightly be characterized as a mosaic within a mosaic.’ Geography at present is in the throes of a paradigm crisis because of the accumulation of various kinds of problems which cannot be solved within the frameworks of the ruling paradigms.
Essay # 2. The Quantitative Revolution:
The quantitative revolution in geography was largely concerned with giving geography a scientific approach along with the application of statistical methodology to geographical research. Anglo-American geography underwent a radical transformation of spirit and purpose in the 1950s and 1960s, replacing an earlier ‘idiographic concern’ with areal differentiation by a ‘nomothetic’ search for models of ‘spatial structure’.
The revolution involved the acceptance of those elements in positivism which had previously been disregarded, namely, the concept that there is one science and one methodology which extends from the natural into the human sciences.
The quantitative schools set out to discover universals, to build models and establish theoretical structures into which geographical realities may be fitted. The quantitative methods and models have, to a greater extent, been developed because of their predictive values.
The movement which led the revolution in geography was started by physicists and mathematicians, and has expanded to transform first the physical and then the biological sciences. It has come largely as a result of the impact of work by non-geographers upon geography, a process shared by many other disciplines where an established order has been overthrown by a rapid conversion to a mathematical approach.
A strong reaction to environmental determinism served to delay the coming of the quantitative movement to geography and postponed the establishment of the scientific basis for the discipline that the quantifiers hoped to provide. It is not surprising, therefore, that the quantitative revolution was resisted most strongly by American geographers, for it was in the United States that the reaction to environmental determinism was strongest.
Characteristically, the source of strongest opposition is now the source of greatest support and the American geographers achieved notable distinction in quantitative techniques. The report of a National Academy of Sciences—National Research Council (1965) committee on ‘The Science of Geography’— discussed geography’s problem and method with the statement – ‘Geographers believe that correlation of spatial distributions, considered both statistically and dynamically, may be the most ready keys to understanding of developing life systems, social systems or environmental changes. In the past, the progress was gradual, however, because geographers were few, the methods for analysing multivariate problems were rigorous and systems concepts were developed only recently.’
Space:
The word ‘space’ appears to be closely associated with the quantitative revolution. It was only in the late 1940s and early 1950s that the conception of space became a recognisable tradition of inquiry in modern geography when Schaefer (1953) declared that spatial relations alone are the ones that matter in geography. Space has now become the basic organizing concept of the geographer.
Blaut distinguishes between ‘absolute’ space and ‘relative’ space. He suggests that space in the absolute conception ‘is a distinct, physical and eminently real or empirical entity in itself’, and space in the relative conception ‘is merely a relation between events or an aspect of events, and thus bound to time and process’. This distinction bears on the development of general theorems of spatial organisation, because if geography is to generalise it must be able to replicate cases, and for that it has to use relative space.
The shift to a relative spatial context is still in progress and is probably the most fundamental change in the history of geography as it opens an almost infinite number of new worlds to explore and map. Space becomes relative because it is related to the perception of an individual.
In location theory, the builders of spatial models are also trying to use measures of relative rather than absolute location. Forer (1978) introduces the concept of ‘plastic space’—that is, space which is continuously changing its size and form as a result of socio-economic demands and technological progress, making it dynamic and truly relative.
The quantitative revolution has led to the identification of two major approaches to the study of geography:
1. Spatial Analysis:
It refers to the quantitative (mainly statistical) procedures and techniques applied in locational analytic work, although such procedures and techniques are not particular to the philosophy of positivism.
2. Spatial Science:
The presentation of human geography as that component of the social sciences which focuses on the role of space as a fundamental variable influencing the organisation and operation of society and the behaviour of individuals. It was formulated during the quantitative revolution and is usually closely related to the philosophy of positivism.
The following aspects of the conception of ‘space’ under the above two approaches appear to be the central features of geography:
(a) Spatial Interaction:
It refers to interdependence between geographic areas. This interdependence is complementary to the society-environment interdependence within a single area and, therefore, it is a major focus of quantitative inquiry.
(b) Spatial Structure:
It refers to the resulting arrangements of phenomena on the Earth’s surface. It could be defined most sharply by interpreting structure as geometrical, from which it followed that the science of space (geography) finds the logic of space (geometry) a sharp tool. The revival of this classical geometric tradition appeared to be a central feature of the quantitative revolution in geography and its reconstitution as spatial science.
William Bunge in his Theoretical Geography (1962) sought to extend the arguments of Schaefer to the effect that geography is the ‘science of spatial relations and interrelations; geometry is the mathematics of space; and so geometry is the language of geography’.
The Hartshornian chorological paradigm, emphasising the nature of and interrelationships between specific places or regions, was rejected in favour of a geography (paradigm) based on spatial analysis which emphasised the geometric arrangement and the pattern of phenomena. The study of the whereabouts of things, their spatial distribution, which the regional-chorological paradigm considered as a deviation, formed the core of the geographic enterprise.
The development and the philosophical discussion which followed the debate between Schaefer and Hartshorne during the 1950s led to an early acceptance of theory-building and modelling. The account which Edward A. Ackerman presented in 1958, regarding the theoretical discussion, controversy and revolution, attempted to encourage younger generations to concentrate attention on systematic branches of geography enabling the development of more refined theories and models.
The acceleration of theoretical works was especially marked in institutions led by geographers who had studied the natural sciences, especially physics and statistics, and who had good contacts with developments in theoretical economic literature. The frontier between economics and geography became productive of new ideas and techniques during the 1950s at several American Universities. In the United States, there developed four schools of quantitative geography.
Three schools were developed in the departments of geography in the Universities of Iowa, Wisconsin and Washington, with Washington as the most prominent centre of innovation. The fourth quantitative school, known as the ‘Social Physical School’, developed independently at Princeton University, under John Q. Stewart and William Warntz, who sought to draw inspiration from physics rather than economics.
‘By far the largest volume of work in the spirit of Schaefer’s and McCarty’s proposals that was published during the 1950s came from the University of Washington, Seattle. The leader of the group of workers at Seattle was W. L. Garrison … (he) was influenced by Schaefer’s paper, although the dates of his earliest publications indicate that he was involved in applying the positivist method to systematic studies in human geography before 1953. Also involved was E. L. Ullman, who moved to Seattle in 1951 and who had already done pioneering research into urban location patterns and transport geography.’
Garrison and his associates at Seattle were mainly interested in urban and economic geography, into which they introduced location theory based on concepts from economics with associated mathematical methods and statistical procedures.
Outside the United States, the department of geography at Lund University in Sweden, developed into a centre of quantitative and theoretical geography, attracting scholars from many countries. A logical academic contact developed between Lund and Seattle from the very beginning of the quantitative revolution.
Hagerstrand, the Swedish geographer, taught at Seattle in 1959, and R. L. Morrill, one of Garrison’s associates at Seattle, studied with him in Lund, where his work on migration and the growth of urban centres was presented.
The systems approach and systems analysis treat the object of geographical study as a set of entities with a specification of the relationships between them and their environment. A system usually contains a large number of entities, and the relationships between them imply a high degree of interdependence. The study of systems, therefore, is associated with the study of complex structures.
Essay # 3. Systems Approach:
Geographers have used forms of systems concepts since the dawn of the subject. Despite its venerability, the systems approach tended to remain a philosophical concept rather than providing guidelines for practical research.
No methods and techniques had been developed to enable the accurate analysis of complex systems before the Second World War, when systems concepts were invoked mainly in descriptive contexts, with particular reference to the notion of a balance in nature.
The concept of systems is often associated with particular theorizing styles, i.e. positivism or functionalism. However, Williams (1983) observes a relationship between systems concepts and structuralism.
Ludwig von Bertalanffy (1950) is credited with the development of general systems theory. It aims to provide theoretical properties of different types of systems. It reflects an attempt to unify science via perspectivism instead of the more usual division of science through reductionism. Its focus is on isomorphism, the common features among the systems studied in different disciplines.
Its subject matter is the formulation and derivation of these principles, which are common for systems in general. Many of the basic ideas have a long history, but in geography their formal incorporation into a ‘metalanguage’ occurred during the 1960s. This involved a set of recognisably scientific procedures which could be connected up to those of the quantitative revolution. This also involved a set of concepts which offered the prospect of a theoretical integration of physical and human geography.
R. J. Chorley was the first geographer to introduce general systems theory into geography. His paper ‘Geomorphology and General Systems Theory’ (1962) was the first major work devoted exclusively to a systems approach within the framework of general systems theory. A major part of the paper is devoted to the application of the concept of open and closed systems to geomorphology.
Many of the later works appear to have used the formal procedures and concepts to identify isomorphisms between different types of geographical systems. Ray and Others (1974) applied the concept of allometry to show that the growth rate of a component of an organism is proportional to the growth of the whole.
Berry (1964) pointed out that cities are open systems in a steady state, as exemplified by the stability of their behaviour-describing equations. Woldenberg and Berry (1967) drew analogies between the hierarchical organisation of rivers and of central place systems.
The most explicit recognition of the general systems theory has come from macro-geography, where Warntz (1973) claimed that, properly understood, geography is general spatial systems theory. The advantages of the general systems theory to human geography lie in its inter-disciplinary approach, its high level of generalisation, and its concept of the steady-state of an open system.
But it is realised that geography’s strong empirical tradition means that it has more to contribute than to take from the general systems theory. However, to Chisholm (1967), the general systems theory seems to be an irrelevant distraction.
Essay # 4. Systems Analysis:
It is a methodological framework for investigating the structure and function of a system. As a method, systems analysis concerns abstraction rather than truth. The system must be seen as a useful abstraction or model which enables a particular form of analysis to be made. The keystone of the study of systems is connectivity.
As Harvey (1969) points out, reality is infinitely complex in its links between variables but systems analysis provided a convenient abstraction of the complexity in a form which maintains the major connections.
A system comprises three components (Fig. 14.1):
(a) A set of elements;
(b) A set of links (relationships) between those elements; and
(c) A set of links between the system and its environment.
In a system, the elements have volumetric qualities and material flows along the links. As the system operates, so the various quantities may change. Every system has three basic aspects: structure, function and development.
The structure is the sum of the elements and the connections between them. Functions concern the flows (exchange relationships) which occupy the connections. Development represents the changes in both structure and function which may take place over time.
The structure of the systems can be treated in two separate frameworks—closed systems and open systems. Closed systems have definable sealed boundaries across which no input or output of energy occurs. Such systems are rare in geographical studies. Open systems, on the other hand, are those systems which have both inputs and outputs of energy to maintain the system (Fig. 14.2).
Closed systems are extremely rare in reality, but are frequently created, either experimentally or, more usually in human geography, by imposing artificial boundaries, in order to isolate the salient features of a system. An open system interconnects with its surroundings. All real systems (such as landscapes) are open systems.
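The skeleton of such a system can be sketched in a few lines of code. The sketch below is purely illustrative; the elements, links and boundary exchanges are invented to show the three components listed above and the open/closed distinction.

```python
# A minimal, purely illustrative sketch of a system: a set of elements, a set of
# internal links, and links crossing the boundary to the environment (an open system).
# All names and quantities here are invented for demonstration.
system = {
    "elements": {"population": 5000, "employment": 2200, "housing_stock": 1800},
    "links": [                       # internal relationships along which flows occur
        ("population", "employment"),
        ("employment", "housing_stock"),
        ("housing_stock", "population"),
    ],
    "environment_links": [           # inputs and outputs across the system boundary
        ("input", "in-migration"),
        ("output", "out-migration"),
    ],
}

# A closed system would simply have no environment_links.
is_open = bool(system["environment_links"])
print("open system" if is_open else "closed system",
      "with", len(system["elements"]), "elements and",
      len(system["links"]), "internal links")
```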
Chorley and Kennedy (1971) characterise systems in relation to their structures and flows within these structures.
In fact, they identify four types of systems:
1. Morphological systems, which consist solely of the physical properties of their components and in which the relationships between them are expressed through a web of statistical correlations. Much work in spatial analysis outlines such morphological systems.
2. Cascading systems, which consist of a chain of sub-systems linked by a cascading throughput, such that the output of one sub-system forms the input for the next. Berry (1966) linked two examples of cascading systems—Haggett’s nodal regions and Isard’s input matrix representation of an economy—in his inter-regional input-output study of the Indian economy. Factories, too, can be portrayed as cascading systems, in that the output of one factory is in many cases the input for another.
3. Process response systems, which are formed by the interaction of morphological and cascading systems. The ecosystem is a process-response system concerned with the flows of energy through biological environments, most of which include, or are affected by, man.
4. Control systems, which are process-response systems structured by the intervention of decision-making agencies at certain key points (valves), so as to alter the disposition of the throughputs in the cascading system and hence change the equilibrium relationships in the morphological system.
The ecosystem is also a control system in that the living components act as regulators of the energy flows. They further represent a major point at which human control system must interact with the natural world.
Langton (1972), on the other hand, classifies systems into two types: Simple action systems and feedback systems. Simple action systems are unidirectional in their nature; a stimulus in X produces a response in Y, which in turn may act as a stimulus to a further variable, Z. Such a causal chain is merely a reformulation of the characteristic cause and effect relation with which traditional science has dealt.
In another language, it is a process law. More important, and relatively novel to human geography, is the feedback system, which is the property of a system or sub-system such that, ‘when change is introduced via one of the system variables, its transmission through the structure leads the effect of the change back to the initial variable, to give a circularity of action’.
The relationships between entities of the system and its environment can often be characterised as feedbacks. The feedbacks are of two types- positive and negative. Positive feedbacks are self-enhancing mechanisms which seek to enhance, accentuate and reinforce the changes in a system.
Such feedbacks push the system farther and farther away from its initial condition. Negative feedbacks, on the other hand, are self-regulating mechanisms which counter -balance changes in a system and bring the system back to the state of equilibrium or to a situation which existed prior to the change.
With the positive feedbacks, the system is characterised as morphogenetic, changing its characteristics as the effect of B on C leads to further changes in B, via D. But with negative feedback the system is maintained in a steady state by a process of self-regulation known as homeostatic or morphostatic.
Feedback may be either direct—‘A’ influences ‘B’ which in turn influences ‘A’—or it may be indirect, with the impulse from ‘A’ returning to it via a chain of other variables. The concept of feedback, with the associated notions of homeostasis and morphogenesis, provides the nucleus of a systems theory of change. As a consequence, the nature of feedback should be a focus of geographical study.
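The contrast between the two feedback types can be illustrated with a minimal numerical sketch. The equilibrium value, the gain coefficients and the number of steps below are invented for demonstration: the same perturbation is either damped back towards equilibrium (negative feedback) or amplified away from it (positive feedback).

```python
# Illustrative only: a single variable perturbed away from an assumed equilibrium,
# then repeatedly adjusted by feeding a fraction (the gain) of the deviation back in.

def simulate(x0, equilibrium, gain, steps=10):
    """Return the trajectory of the variable under the given feedback gain."""
    x = x0
    trajectory = [round(x, 2)]
    for _ in range(steps):
        x = x + gain * (x - equilibrium)   # feedback of the current deviation
        trajectory.append(round(x, 2))
    return trajectory

print(simulate(12.0, 10.0, gain=-0.5))  # negative feedback: deviation damped, homeostasis
print(simulate(12.0, 10.0, gain=+0.5))  # positive feedback: deviation amplified, morphogenesis
```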
Bennett (1975) attempted to apply systems theory to a problem in human geography concerning the dynamics of location and growth in north-west England. Having represented the system—its elements, links and feedback relationships—Bennett estimated the influence of various external (i.e. national) events on the system’s parameters, isolated the effects of government policy (industrial development certificates) on the system’s structure and produced forecasts of the region’s future spatio-temporal morphology.
The most comprehensive attempt to forge a systems approach to geographical study has been done by Bennett and Chorley in their book entitled Environmental Systems: Philosophy, Analysis and Control (1978) with the intention of providing a unified multi-disciplinary approach to the interfacing of ‘man’ with ‘nature’.
The book was prepared with these major aims:
‘First, it was desired to explore the capacity of the systems approach to provide an inter-disciplinary focus on environmental structures and techniques.
Secondly, to examine the manner in which a systems approach aids in developing the interfacing of social and economic theory, on the one hand, with the physical and biological theory, on the other.
Thirdly, to explore the implications of these inter-facings in relation to the response of man to his current environmental dilemmas.
It is hoped to show that the systems approach provides a powerful vehicle for the statement of environmental situation of ever-growing temporal and spatial magnitudes, and for reducing the areas of uncertainty in our increasingly complex decision-making arenas.’
The use of systems analysis is based on the assumption (usually implicit) underlying much positivist work in human geography, that valid analogies can be drawn between human societies on the one hand, and both natural phenomena complexes and machines on the other. Individual elements in a system have pre-determined roles and can act and change in certain restricted ways only— depending on the structure of the system and its interrelationships with the environment.
As a descriptive device, this analogy allows the structure and operation of society and its components to be portrayed and analysed, and it provides a source of ideas from which hypotheses can be generated. Once a system has been defined and modelled, it can also be used as a predictive tool, to indicate the nature of the elements and links following certain environmental changes, such as the introduction of new elements and links.
Systems analysis has wide applications in both human and physical geography, along with the interface where man and environment interact. Once a system has been successfully modelled, it can be manipulated using control theory, a dynamic optimisation technique which permits optimal allocation over time horizons and shifts the emphasis from mere model construction to model use.
Such a combination of models describing systems with a theory of systems control has a wide range of potential application in such fields as pollution control, catchment management, inter-area resource allocation and urban planning. It suggests a commonality of interest, focused on methods, between applied physical geography and applied human geography.
Gregory (1978) criticises both systems analysis and general systems theory on the ground that they are intrinsically associated with positivism. The concept of one systems theory which is relevant for all the sciences may be seen as a fruit of the positivist concept of one science, one method. He further fears that the prominence given to control systems may lead to instrumentalism.
Essay # 5. Entropy and Information Theory:
Weaver distinguishes between systems of disorganised complexity, where the interrelationships are relatively weak, and those of organised complexity, where they are stronger. Traditional science has usually been concerned with simple systems containing a relatively small number of entities or variables.
The ‘entropy-maximizing methods’ can be applied to systems of disorganised complexity, and the concepts and methods of bifurcation and catastrophe theory can be applied to systems of organised complexity.
The entropy of a system is an index of uncertainty. It is a measure of the amount of uncertainty in a probability distribution or a system subject to constraints. In the second law of thermodynamics, an increase in system entropy is an increase in system uncertainty.
Thermodynamic entropy relates to the most probable configuration of the elements, within the constraints of the system’s operations. The concept of entropy has been used in a wide variety of contexts, notably in information theory, and as the basis for the ‘entropy-maximising model’ of spatial interaction.
Information theory is a mathematical approach in communication science which attempts to analyse the information content or degree of organisation of a system. The theory and its associated methods have been used to describe settlement and population distributions in geographic space. In information theory, entropy refers to the distribution of the elements across a set of possible states, and is an index of element dispersion.
One can be completely certain about a distribution, in terms of predicting where one element will be, if all elements are in the same state. Conversely, one will be most uncertain when elements are equally distributed through all possible states, so that prediction of the location of any element is most difficult.
The information-theory measure of entropy is another descriptive index, and can be developed in a variety of ways:
(1) As a series of indices of variation in population distribution;
(2) As an index of redundancy in a landscape, where redundancy relates to a regular sequence, so that it is possible to predict the land use of a place, for example, from knowledge of the land uses at neighbouring places; and
(3) As a series of measures of reactions to situations in states of uncertainty.
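The first of these uses can be illustrated with a short sketch of the Shannon entropy measure H = -Σ p_i log p_i. The settlement counts below are invented; they simply show that entropy is zero when all elements occupy one state (complete certainty) and maximal when they are spread evenly across all states (greatest uncertainty), as described above.

```python
from math import log

def entropy(counts):
    """Shannon entropy H = -sum(p_i * log p_i) of a distribution of elements over states."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in probs)

# Hypothetical distributions of 100 settlements over four regions.
print(entropy([100, 0, 0, 0]))    # all elements in one state: H = 0
print(entropy([25, 25, 25, 25]))  # even spread: H = log(4), the maximum
print(entropy([70, 20, 5, 5]))    # intermediate concentration
```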
The usage of entropy as developed in statistical mechanics rather than in information theory was introduced to the geographical literature by Wilson in his book Entropy in Urban and Regional Modelling (1970).
He points out – ‘The application of the concept of entropy in urban and regional modelling, that is in hypothesis development, or theory building (Model and hypothesis are used synonymously, and a theory is a well-tested hypothesis) … the entropy maximizing procedure enables us to handle extremely complex situations in a consistent way.’
Wilson’s initial example was a flow matrix. The number of trips originating in a series of residential areas, and the number ending in each of a series of workplace areas, are known, but the entries in the cells of the matrix—which people move from which residential area to which workplace area—are unknown. What is the most likely flow pattern? Even with only a few areas and a relatively small number of commuters, the number of possible arrangements is very large.
He defines three states of the system. The first is the ‘macro-state’, comprising the number of commuters at each origin and the number of jobs at each destination. The second is the ‘meso-state’, comprising a particular flow pattern: five people may go from zone A to zone X, for example, and three from the same origin to zone Y, but it is not known which five are in the first category and which three in the second.
The third is the ‘micro-state’, which is a particular example of a flow pattern—one of the many possible configurations of eight people moving from zone A, five to X and three to Y.
Entropy-maximising procedures find the meso-state with the largest number of micro-states associated with it. In other words, it is the meso-state about which the analyst is least certain as to which configuration produces the pattern. The entropy-maximising approach thus uses entropy as the basis for finding the most likely meso-state of a system subject to specific constraints.
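The logic can be illustrated with a small, invented example: for every feasible meso-state (flow matrix) consistent with a given macro-state, the number of micro-states is W = T!/Π T_ij!, and the entropy-maximising solution is the meso-state for which W is largest. The zone totals below are hypothetical and chosen only to keep the enumeration small.

```python
from math import factorial, prod

# Hypothetical toy macro-state: two residential zones and two workplace zones.
origins = [8, 4]        # commuters living in zones A and B (row totals)
destinations = [5, 7]   # jobs in zones X and Y (column totals)
T = sum(origins)

def micro_states(flows):
    """Number of micro-states W = T! / prod(T_ij!) for a given meso-state (flow matrix)."""
    return factorial(T) // prod(factorial(t) for row in flows for t in row)

best = None
# Enumerate every feasible meso-state consistent with the macro-state.
for t11 in range(0, min(origins[0], destinations[0]) + 1):
    t12 = origins[0] - t11
    t21 = destinations[0] - t11
    t22 = destinations[1] - t12
    if min(t12, t21, t22) < 0:
        continue
    flows = [[t11, t12], [t21, t22]]
    w = micro_states(flows)
    if best is None or w > best[1]:
        best = (flows, w)

print("Most likely meso-state:", best[0], "with", best[1], "micro-states")
```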
Webber has extended Wilson’s argument that the purpose of entropy-maximising models is to draw conclusions from a data set which are ‘natural’, in that they are functions of that data set alone and contain no interpreter bias.
On the basis of Wilson’s entropy-maximising model, Webber (1972) developed an entropy-maximising paradigm which focuses on locational models, interaction models, or both. The entropy-maximising model acts not only as a ‘black box’ (representation of an element) forecasting device, but also as a hypothesis.
If the operation of the systems described is to be understood, the axioms—the constraints—must themselves be explained. Given the nature of the constraints (in Wilson’s initial example, why people live where they do, why people work where they do and why they spend a certain amount of time, money and energy on transport), the task is a major one; entropy-maximising models aim to clarify it and indicate the most fruitful avenues for investigation.
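Adding a travel-cost constraint to the same reasoning yields the familiar doubly constrained interaction model, T_ij = A_i B_j O_i D_j exp(-β c_ij), whose balancing factors A_i and B_j can be found iteratively. The sketch below is illustrative only; the zone totals, the cost matrix and the parameter β are invented, and the code is not taken from Wilson’s or Webber’s texts.

```python
import numpy as np

# Illustrative doubly constrained entropy-maximising interaction model (invented data).
O = np.array([8.0, 4.0])           # trips originating in each residential zone
D = np.array([5.0, 7.0])           # trips ending in each workplace zone
c = np.array([[1.0, 2.0],          # assumed travel-cost matrix c_ij
              [2.0, 1.0]])
beta = 0.5                         # assumed cost-sensitivity parameter

f = np.exp(-beta * c)              # deterrence function
A = np.ones(len(O))                # balancing factors, found by iteration
B = np.ones(len(D))

for _ in range(100):
    A = 1.0 / (f @ (B * D))        # A_i = 1 / sum_j B_j D_j f_ij
    B = 1.0 / (f.T @ (A * O))      # B_j = 1 / sum_i A_i O_i f_ij

T = (A * O)[:, None] * (B * D)[None, :] * f
print(T)                                    # predicted flows T_ij
print(T.sum(axis=1), T.sum(axis=0))         # reproduce the origin and destination totals
```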
Thus, the mathematical formulation of information theory exhibits a close relationship with the mathematics of entropy in statistical thermodynamics and with entropy-maximising models in geography and planning, and is central to the quantitative revolution.
Essay # 6. Models and Analogue Theory:
A model is defined as an idealised and structured representation of the real, or a simplified representation of reality which seeks to illuminate particular characteristics. The concept is a wide one, and a model is either a theory or a law or an hypothesis or a structured idea.
A model or theory, when tested and validated, provides a miniature of reality and therefore is a key to many descriptions. It can offer insights towards understanding the real world. There is a single master key instead of the loaded key-ring. Model-building has a long history in many sciences, but its incorporation into geography is of comparatively recent origin (the 1950s and 1960s) and depends on so-called analogue theory.
An analogue theory is a formal theory of model-building which provides for the selective abstraction of elements from an empirical domain and their translation into a simplified and structured representation of a particular system. The theory has important connections with positivism, where models are typically used in tests which are intended to allow for the validation of hypothesis and thus the extension of general theories.
These procedures were often formalised in statistical and mathematical terms, and as such played a vital role in the search for general theorems of spatial organisation. Haggett speaks of essential links between model-building and the quantitative revolution, which caused a move from an idiographic to a nomothetic geography.
The use of analogy has long been recognised as a powerful tool both in reasoning process and in throwing new light on reality. Reasoning by analogy involves the assumption of a resemblance of relations or attributes between some phenomenon or aspect of the real world and an analogue or model.
The basic assumption involved is that two analogues are more likely to have further properties in common than if no resemblance existed at all, and that additional knowledge concerning one consequently provides some basis for a prediction of the existence of similar properties in the other.
A model or analogue must, therefore, belong to a more familiar realm than the system to which it is applied. In other words, the problem is translated into more familiar or convenient terms, such that a useful model involves a more simplified, accessible, observable, controllable, rapidly developing, or easily formulated phenomenon from which conclusions can be deduced which, in turn, can be reapplied to the original system or real world.
In general, any two things, whether events, situations, creatures or objects, can be said to be analogues if they resemble each other to some extent in their properties, behaviour or mode of functioning. In practice, the term analogue is used rather loosely to cover a wide range of degrees of resemblance. This wide range of resemblance may lead the model builder to employ analogues as varied as a past situation or happening in the real world, a mathematical simplification, or a construction built of string and wire.
‘One of the most striking characteristics of geographical analysis which this subject has in common with the other natural and social sciences is the high degree of ambiguity presented by its subject matter…. Where such ambiguity exists, scholars commonly handle the associated information either by means of classifications or models. Much has been written on the subject of geographical classifications, which usually result from the accumulation of a back-log of information which is then dissected and categorised in some convenient manner – model building, which sometimes may even precede the collection of great deal of data, involves the association of supposedly significant aspects of reality into a system which seems to possess some special properties of intellectual stimulation. This is not to imply that classifications and models do not share common ground and, indeed, the genetic classifications of national economic stages and of shorelines form something of a link between them. However, in their extreme forms, classifications and models are sharply differentiated’.
A model has two basic functions as a representation of the real world:
(a) A scale model, a map or a series of equations, and some other analogue; and
(b) An ideal map, a representation of the world under certain constrained conditions.
Both are used in the positivist method to operationalise a theory, as a guide to the derivation of testable hypothesis. A model becomes a theory about the real world only when a segment of the real world has been successfully mapped into it. As a theory (model), it can be accepted or rejected on the basis of how well it works.
As a model, it can only be right or wrong on logical grounds. A model must satisfy only internal criteria; a theory must satisfy external criteria as well. Reasoning by analogy is a step towards building of theories, such that a promising model is one with implications rich enough to suggest some hypothesis and speculations in the primary field of investigation.
Models need to be objective but ‘tend to be subjective approximations, in that they do not include all associated observations or measurements, but as such they are valuable in obscuring incidental detail (noise) and in allowing fundamental aspects of reality to appear. This selectivity means that models have varying degrees of probability and a limited range of conditions over which they apply. The most successful models possess a high degree of probability application and a wide range of conditions in which they seem appropriate. Indeed, the value of a model is often directly related to its level of abstraction’.
Many of the early models, e.g. the Von Thunen model of agricultural land use and the central place models of Christaller and Losch, were attempts to represent the abstract geometry of an idealised economic landscape through various translations of concepts of ‘accessibility’ and ‘distance decay’.
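The notions of accessibility and distance decay in such models can be illustrated with a simple bid-rent sketch in the spirit of von Thunen. The crops, yields, prices and freight rates below are entirely invented, and the land use assumed to occupy each ring of distance is simply the one offering the highest locational rent there.

```python
# Illustrative bid-rent curves loosely in the spirit of a von Thunen land-use model;
# all figures are invented for demonstration.
crops = {
    # name: (yield per km^2, market price per unit, production cost per unit,
    #        transport cost per unit per km)
    "market_gardening": (200, 12.0, 6.0, 0.50),
    "forestry":         (400,  4.0, 2.0, 0.04),
    "grain":            (100,  8.0, 3.0, 0.05),
}

def bid_rent(crop, distance):
    """Locational rent R = Y*(p - c) - Y*f*d, falling with distance to the market."""
    y, p, c, f = crops[crop]
    return y * (p - c) - y * f * distance

# The crop offering the highest rent occupies each ring around the market centre.
for d in range(0, 60, 10):
    winner = max(crops, key=lambda name: bid_rent(name, d))
    print(f"{d:>2} km: {winner} (rent {bid_rent(winner, d):.0f})")
```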
Further, since the itinerary of abstraction → generalisation → disclosure which their construction involved was supposed to provide for further extension, they represented necessary moments in the ‘puzzle solving’ activity required for the foundation of a new scientific paradigm. It was no accident, therefore, that Chorley and Haggett’s early advocacy of a ‘model-based paradigm’ for modern geography should have been phrased in such resolutely Kuhnian terms.
Chorley (1964) provides a diagrammatic model for models. The diagram is simply composed of a series of ‘steps’ each of which contains some aspect of real world, model, observation, or conclusion. These are connected, sometimes in a very loose and varied manner, by ‘transformations’ whereby the reasoning process is advanced or checked upon.
Each of these transformations introduces (in the language of information theorist) the possibility of ‘noise’ (incidental detail) in so far as the processes of simplification and translation involve, on the one hand, the discarding of information and, on the other the possibility of the introduction of new information which is irrelevant. The most useful employment of analogues is that which succeeds in causing the least noise.
This sequence of steps represents the most difficult type of model building because, in developing a simplified but appropriate model for a given object, system or segment of the real world, huge amounts of available information are being discarded and therefore much noise (incidental detail) is being potentially introduced. The transformation of ‘idealization’ is especially difficult, and very much depends on the experience and genius of the model builder.
It involves extracting from the mass of information about the real world those aspects which are held to be especially significant, in that they seem to fit together into some sort of pattern. One of the major sources of noise in this transformation is introduced by the phenomenon of ‘feedback’.
Idealisation largely involves reaching some conclusion about the relationships in the real world which really matter, so that irrelevant information can be set aside to expose the significant relationships, which are then susceptible to further examination.
A conceptual model is derived by such a means, which, having some basis of observed facts and regularities, contains a mental image of the significant ‘web of reality’. The mental image may have come quite unambiguously from the simplification of previous empirical knowledge or may appear to have come out largely from imagination.
Many attempts at model building depart from formality at this point and pass by direct reasoning to a hypothesis or some conclusion about the real world which, if successfully appraised against the real world, may form the basis of a theory.
The conceptual model appears to be too complex to handle. It requires further simplification by removing/discarding still more extraneous information, and retaining only the simplest and most significant aspects of the basic matter. The transformation is less noisy than idealisation, because both ends of it are more completely realized.
In practice it may represent a whole series of simplifications, the aim of which is to retain the simplest and most significant aspects of the problem, while removing any irrelevant material which might obscure the fundamental relationships and prevent a satisfactory solution. The final product of abstraction is often a ‘simplified model’ which has been reduced to a condition where the fundamental symmetries and relationships can be exposed for precise definition and further treatment.
In geography, most attempts at model building by abstraction have met with little success. It is notable that those currently judged as most successful are the conceptual models in which excessive simplification has been avoided, rather than the drastically simplified models. The simplified economic models of von Thunen and Weber, with their assumptions of fixed markets and sources of raw materials, seem less attractive than the freer and more complex models of Christaller and Losch.
However, most of the earlier models in geography (such as Ritter’s notions regarding the development of peoples under differing geographical conditions, Mackinder’s ‘Heartland Theory’, Malthus’ population growth model, Frederick Jackson Turner’s ‘frontier hypothesis’ of national development, and Suess’ eustatic theory) belong to this group of simplified models in which too much truth has been sacrificed for too much simplicity.
It is interesting to note that not all simplified models produce detrimental results. But some simplification may be significant or successful as it transforms a segment of the real world into a new dimension which is merely different and interesting.
The derivation of some form of simplified model enables structures and relationships to appear which are capable of further exploitation, commonly in such a manner that prediction can be attempted. This exploitation can be achieved by means of mathematical, experimental, or natural models.
1. Mathematical Models:
These models are the transformation of the ideas of the simplified models into the formal, symbolic logic of mathematics. They include those models which involve the adoption of a number of idealizations of the various details of the phenomena and in ascribing, to the various entities involved, some strictly defined properties.
The essential features of the phenomena are then analogous to the relationship between certain abstract symbols, and the observed phenomena resemble closely something extremely simple, with very few attributes.
The resemblance is so close that the equations are a kind of working model, from which one can predict features of the real thing which have never been observed. The construction of a mathematical model involves the language transformation from the words of the simplified model to the mathematical symbols (mathematization) so as to produce a mathematical system.
Mathematical systems or models can be of two broad kinds, ‘deterministic’ and ‘stochastic’:
(a) In deterministic models, if specified conditions exist at a particular moment of time, it is possible to specify what conditions existed at an earlier time or will exist at a later time. Such models are based on the classic mathematical notion of cause and effect, and consist of a set of mathematical assertions from which consequences can be derived by logical mathematical argument. Commonly, the reasoning exploits the assumed simple or multiple relationships between a number of interlocked factors which have been identified in simplified manner.
In physical geography, such mathematical deterministic models have been developed in large numbers so as to analyse/predict various aspects of the morphology of the Earth’s surface. The most popular geomorphic application of deterministic models has been in the deduction of the forms accompanying slope recession, summarised by Scheidegger.
Jeffreys deduced theoretically the form of the resulting peneplain by developing a deterministic mathematical model. Miller and Zeigler applied the approach to predict the expected patterns of coastal sediment size and sorting. Most mathematical models of the atmospheric circulation are of the deterministic type, as are those involved in numerical forecasting.
Deterministic models have also been developed in human geography, but they are not so common. Beckman applied the theory of the hydrodynamic equation of continuity to investigate the cost of inter-local commodity flows, Lighthill and Whitham used the principle of kinematic waves to investigate the flow and concentration of traffic on crowded arterial roads, and Richard applied the mathematics of fluid flow to investigate traffic flow.
Thus, in deterministic models the development of some system in time and space can be completely predicted, provided that a set of initial conditions and relationships is known.
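A minimal sketch of such a deterministic model is the hillslope-diffusion equation dz/dt = K d²z/dx², solved here by explicit finite differences. The initial scarp profile, the diffusivity K and the time step are assumed values chosen only for illustration, not taken from the studies cited above; given those initial conditions, the resulting profile is fully determined.

```python
# Illustrative deterministic model: diffusive decline of a hillslope profile.
K = 0.01      # assumed diffusivity (m^2 per year)
dx = 1.0      # horizontal spacing (m)
dt = 10.0     # time step (years); K*dt/dx**2 = 0.1 satisfies the stability limit of 0.5

# Initial profile: a sharp scarp 10 m high.
z = [10.0 if i < 25 else 0.0 for i in range(50)]

for _ in range(1000):                        # 10,000 simulated years
    z = [z[0]] + [
        z[i] + K * dt / dx**2 * (z[i + 1] - 2 * z[i] + z[i - 1])
        for i in range(1, len(z) - 1)
    ] + [z[-1]]

print([round(v, 1) for v in z[20:30]])       # the scarp has been smoothed deterministically
```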
(b) Stochastic models specify sequences of events within a certain range of probability. Some authors distinguish between a ‘probabilistic model’, in which the outcome of individual trials is predicted, and a stochastic model, in which the development of a whole series of outcomes is modelled.
A stochastic model may, therefore, include those situations in which the probabilities of an outcome of a particular trial are determined by the outcome of preceding trials. Stochastic models have been increasingly developed in human geography rather than in physical geography. Isard developed a spatial economic model of a statistical character.
Hagerstrand’s ‘innovation wave’ model is similarly based, as is the mathematical theory of population clusters by Neyman and Scott, which was inspired by the kinetic theory of gases. Garrison suggested the employment of an electronic computer to develop a stochastic mathematical model of city growth.
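A stochastic counterpart can be sketched as a simple Monte Carlo diffusion loosely in the spirit of Hagerstrand's innovation-wave simulations. The grid size, contact probability and neighbourhood rule below are invented; unlike the deterministic case, different random seeds produce different (though statistically similar) outcomes.

```python
import random

# Illustrative stochastic model: adoption of an innovation spreads by chance
# contacts with nearby adopters on a small grid (all parameters invented).
random.seed(1)
SIZE, P_CONTACT, GENERATIONS = 15, 0.3, 8
adopted = {(SIZE // 2, SIZE // 2)}            # a single initial adopter at the centre

for _ in range(GENERATIONS):
    new = set()
    for (r, c) in adopted:
        for dr in (-1, 0, 1):                 # examine the surrounding cells
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < SIZE and 0 <= nc < SIZE and random.random() < P_CONTACT:
                    new.add((nr, nc))
    adopted |= new
    print("adopters after this generation:", len(adopted))
```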
Both deterministic and stochastic mathematical models must be susceptible to logical mathematical argument, a transformation which is virtually noise free and involves the solution of the equations forming the basis of the mathematical system so as to provide a logical mathematical interpretation as conclusions about the real world. It is important to recognise that, of themselves, mathematical models do not provide explanations of the real world but merely allow conclusions to be drawn from the original mathematical assumptions.
2. Experimental Models:
Experimental models involve the substantiation of a simplified model so that certain phases of its operation can be examined or predictions attempted. Through this process the concepts of the simplified model are reproduced as tangible structures by a transformation which is inherently very noisy. Experimental models can be further divided into the ‘scale model’ and the ‘analogue model’.
(a) Scale Models are closely imitative of the segment of the real world which they resemble in some respects (e.g. being composed mostly of the same types of materials), and the resemblance may sometimes be so close that the scale model can be considered merely a suitably controlled portion of the real world.
The most significant characteristics of scale models are the high degree of control which can be achieved over the simplified experimental condition and the manner in which time can be compressed. Scale models have been successfully developed in physical geography.
Friedkin developed a scale model to investigate the phenomenon of meandering, and Rouse applied scale-model techniques in meteorological research. However, the fundamental problem attending their construction is that changes of scale affect the relationships between certain properties of the model and of the real world in different ways.
(b) Analogue Models:
A more abstract refinement of the scale model is the analogue model. Analogue models involve a radical change in the medium of which the model is constructed. They have a much more limited aim than scale models in that they are intended to reproduce only some aspects of the structure or web of relationships recognised in the simplified model of the real-world segment.
The analogue model is therefore a great potential source of noise (incidental detail). Analogue models abound in physical geography; although they usually provide only the basis for plausible hypotheses, their use has proved valuable.
Some of the analogue models so far developed in physical geography include the hydraulic analogue model constructed to simulate the freezing and thawing of soil layers, Lewis and Miller’s use of a kaolin mixture to simulate some features of the deformation and crevassing of a valley glacier, Reiner’s rheological models for the simulation of different types of deformation, and Starr’s dishpan analogue model for investigating the gross features of the atmospheric circulation in a hemisphere.
It is, however, in the fields abutting on human geography that analogue experimental models have been most strikingly applied. The spatial economists have shown their interest in the use of analogue models.
Whether a scale or an analogue experimental model is used, its construction is followed by experimentation leading to a set of experimental observations, which then have to be interpreted in terms of the real world in a manner similar to the theoretical interpretation of the conclusions from a mathematical model. Very occasionally the results from a mathematical model are transformed into an experimental model (a very noisy process) as a further step in testing their agreement with the real world.
3. Natural Models:
Natural models are those in which the simplified model is further exploited, as a basis for analysis or prediction, by translating it into some analogous natural circumstance which is believed to be simpler and more readily observable. Two types of natural model can conveniently be differentiated: the historical natural model and the analogue natural model.
(a) The historical natural model implies a translation of the simplified model into a different time and/or place, on the assumption of a historical repetition of events. As such it is an obvious source of noise or incidental detail. The nineteenth-century German and French geographers were attracted to historical-geographical analogue concepts. The most obvious geographical analogue applied to a different time is presented by the analogue forecasting methods developed in meteorology.
(b) Analogue natural models involve the translation of the simplified model into a different natural medium, a process of the noisiest kind. Problems of social geography are being attacked on the basis of physical analogues: Garrison developed an ice-cap analogy of city growth, and the social physicists have naturally been attracted to analogue natural models. Such models are also used in physical geography.
The construction of either a historical or an analogue natural model is followed by appropriate observation and may yield observed natural results which, through reapplication, may lead to testable conclusions about the real-world situation.
Chorley (1964) observed that some of the reasoning in geography, leading to appropriate conclusions about real-world situations, can be achieved through a process of more or less ‘direct reasoning’. The value of this method is, however, obviously determined by the creative genius of the operator. Successful appraisal implies checking the conclusions derived from model building against the real world, so that a hypothesis can be developed.
And if successive checks of this kind lead to similar conclusions, then a ‘theory’ comes into being or may be developed.
The value of model-building in geographical studies is critically important because it helps to build a refined and systematic approach to understanding the man-nature relationship in its various forms and contextual frames of reference.
Most branches or sub-fields of geography are replete with models of varying order and character. Models in geography can be categorised in many ways, taking into account the nature of the geographic concern with the man-nature relationship and its development in various forms and stages.
When one talks of models in geography it is necessary to add that models and their conceptualisation are intimately connected with quantification in geographical inquiry: models tend to be systematically precise, and for precision a mathematical frame of reference is desirable, although not absolutely necessary.
The rapid development of model building and the use of quantitative techniques could not have taken place without computers, but the computers did not determine the development of either model building or quantitative methods.
Model building preceded the invention of the computer in many sciences, but in a discipline like geography, which handles large quantities of data, it would hardly have been possible to develop operational models worthy of the name without the computer. This technological development gave the subject new possibilities which researchers had no hesitation in exploring.
Johnston (1986) observes that the claims and concepts regarding the accountability of the models, and their subsequent application in geographical inquiry, as developed by Chorley and Haggett, have been subjected to re-evaluation on three levels:
(1) First, the models themselves have been subject to a (limited) reformulation. In the second edition of ‘Locational Analysis’, the authors admitted that the present stock of models may be ‘unprepossessing enough’. The limited advance was in part because the autonomy of location theory had been compromised and more inclusive models of the space economy had been constructed outside its traditional domain.
Those who continued to work within the orthodox framework were thus concerned more with developing statistical and mathematical models which could break open complicated data-structures (e.g. space-time forecasting, Q-analysis) rather than with seeking to establish in any direct way the theorems of a general spatial science.
(2) Second, therefore, the original claim for model-building as the object of geographical enquiry was displaced and research efforts were directed towards methods seen as means rather than as ends.
(3) Third, even this was attacked by some critics, who insisted that there was such a clear-cut connection between model-building and the epistemology of positivism that the whole enterprise ‘ought to be rejected’.
Essay # 7. Reaction to the Quantification:
Though the bulk of Anglo-American geographers of the 1950s and 1960s were enthusiastic about the quantification of the discipline, there were a number of people who opposed the scientific method being followed in geography.
Two related issues were the main foci of contention:
(1) ‘Whether quantification was sensible in geographical research’, and
(2) ‘Whether law making was possible in geography’.
Dudley Stamp (1966) was one of the principal opponents of quantitative geography. He pointed out that there are many fields of enquiry in which quantification may stultify rather than aid progress, because there will be a temptation to discard information which cannot be punched onto a card or fed onto magnetic tape. There is also a danger that ethical and aesthetic values will be ignored.
Broek (1965) observed that since massive quantitative data on human behaviour are available only for the advanced countries, and then only for at best a century, the theorists tend to construct their models from facts of the ‘here and now’, virtually ignoring former times and other cultures.
The procedure becomes invidious when one projects the model derived from one’s own surroundings over the whole world as a universal truth and measures different situations in other countries as a ‘deviation’ from the ‘ideal’ construct. He therefore pointed out that the search for general laws, at a high level of abstraction, goes against the grain of geography because it removes place and time from the discipline.
Minshull (1970) felt that the landscape was rapidly becoming a nuisance to some geographers, and that many of the models will only apply to a flat, featureless surface. He warned that there was a real danger that these ideal generalisations about the spatial relationships could be mistaken for statements about reality itself. He also suspected that there would be an overriding temptation not to test and destroy one’s beautiful hypothesis or model but to prove it in a subjective way.
Lukermann (1958) reacted sharply to the attempts by the social physics school of Princeton University to establish analogies with physics. He pointed out that hypotheses derived by analogy could not be tested; falsification was impossible. Statistical regularities and isomorphisms with other subject matter do not provide explanations, so that hypotheses derived from such models test only the models themselves.
Criticising Bunge’s and Haggett’s assertion that geography is a spatial science and that geometry is the language of geography, Sack (1972) pointed out that space, time and matter cannot be separated analytically in a science which is concerned with providing explanations.
The geographical landscape is continuously changing. The processes which have left historical relics and which are creating new inroads all the time must be taken into account as important explanatory factors. The laws of geometry are, however, static—they have no reference to time. They are sufficient to explain and predict geometries, so that if geography aimed only at analysis of points and lines on maps, geometry could be sufficient as our language. But, geometry alone cannot answer geographic questions.
There is certainly a danger that models based on research within the western cultural experience may be elevated into general truths. It has been concluded that a universal urban geography does not exist, and that urbanisation cannot be dealt with as a single universal process.
There are several fundamentally different processes that have developed out of differences in culture and time, and four such universes can be identified: North America and Australia; western Europe; the Third World; and the Socialist bloc. Each has its own urban geography, which again changes through time. Owing to this, generalisations made on the basis of quantitative techniques may be misleading, or may prove useless and inconsistent.
Burton (1963) identified five categories of critics who were opposed to the quantification:
(a) Those who felt geography was being led in the wrong direction;
(b) Those who felt geographers should stick to their proven tool—the map;
(c) Those who felt that quantification was suitable for certain tasks only;
(d) Those who felt that means were being elevated over ends, and there was too much research on methods for methods’ sake; and
(e) Those who objected not to quantification itself, but to the quantifiers’ attitudes.
However, he believed that quantification had been proven to be more than a fad and that geography would develop out of a stage of testing relatively trivial hypotheses with its new tools so that ‘the development of theoretical, model-building geography is likely to be the major consequence of the quantitative revolution’.
‘Whereas the adherents of the quantitative school could admit a certain lopsidedness in their approach by the late 1970s, their approach and argumentation had been far more orthodox at the end of the 1960s when it was thought that a definite choice between paradigms had to be made’.
Whatever may be the opposition to quantification in geography, the discipline may still be regarded as ‘a science concerned with the rational development, and testing of theories that explain and predict the spatial distribution and location of various characteristics on the surface of the Earth’.
The discipline is in the process of changing from an idiographic to a nomothetic science that requires the development and testing of theories and models through hypothetic-deductive methods in order to develop geographical laws.
Essay # 8. Positivism:
Scientific and philosophical discussion in the 1950s and 1960s had produced two chief categories of ‘meta-theory’ (superior theories or theories about theories): positivism and critical theory. ‘Positivism’ is connected with the naturalistic-pragmatic trend in modern thought and ‘critical theory’ with phenomenology and hermeneutics. Positivism has tended to dominate the English speaking world and Scandinavia, while critical theory prevailed in Germany, France and the Spanish-speaking countries.
Positivism as a philosophy was originally proposed by Auguste Comte in the 1820s and 1830s, but it drew upon the earlier ideas of Saint-Simon. The primary purpose was to distinguish science from both metaphysics and religion. The concept of positivism began as a polemical weapon against the ‘negative philosophy’ prevalent before the French Revolution.
This was a romantic and speculative tradition which was more concerned with emotional than with practical questions and which sought to change society by considering Utopian alternatives to existing situations. The positivists regarded such speculation as ‘negative’ since it was neither constructive nor practical; it showed that philosophy was an ‘immature’ science. Philosophers, like other scientists, should not concern themselves with such speculative matters, but should study things they could get to grips with: material objects and given circumstances. This approach was recommended as the positive approach.
Comte himself wanted to direct the development of society, but stated that the nature of positivism is not to destroy but to organise. An organised development should replace the disorder created by the revolution.
‘Free speculation or systematic doubt … was identified by Comte as the metaphysical principle…. Metaphysics was later redefined as that which lies outside our sense perceptions or is independent of them. Comte regarded metaphysical questions as unscientific, and held that in a positive society scientific knowledge would replace free speculation or make it unnecessary’.
Comte’s lectures on the philosophy of positivism were published as a book in 1829.
His philosophy of positivism incorporated the following five basic precepts:
1. All scientific knowledge must be based on direct experience of reality, since direct observation is the surest guarantee that the knowledge acquired is scientific.
2. The direct experience of reality should be complemented by ‘la certitude’, that is, the unity of the scientific method. This implied that the different branches of knowledge were distinguished by their object of study (i.e. the subject matter) and not by their method; in other words, sciences differ from one another in ‘what’ they study rather than in ‘how’ they study it.
3. The concept of unity of the scientific method required ‘le precis’, that is, a common scientific goal of formulating testable theories. This implied that there was no place for subjective value-judgements in scientific inquiry since, being based on ethical assertions, value judgements are not products of scientific observation, and are as such, not verifiable;
4. The positivist view of science incorporated the principle of ‘l’utile’, which means that all scientific knowledge must serve some useful purpose: it should be utilisable, a means to an end, and a tool for social engineering.
5. The fifth precept was ‘la relative’, which means that scientific knowledge is essentially unfinished and relative, because knowledge keeps progressing by gradual unification of theories which in turn enhance man’s awareness of social laws. Greater awareness demands more comprehensive theory.
Positivism, in some contexts also called ‘empiricism’, thus holds the central thesis that science can concern itself only with ‘empirical questions’ (those with a factual content) and not with ‘normative questions’ (questions about values and intentions). Empirical questions are questions about how things are in reality.
In this context, ‘reality’ is defined as the world which can be sensed and experienced. This means that science is concerned with ‘object’ in the world. The subject or the subjects, for whom there is a world, or worlds, are excluded from the field of interest.
‘Positivism holds that … we cannot investigate such things as moral norms with our senses; we should keep away from normative questions; we cannot justify our tastes scientifically. Science can describe how things are, and experimentally or by some other measurement, discover the association of causes which explain why things are as they are…. Ideally, science is value-free, neutral, impartial and objective.’
In seventeenth-century England, acute political strife intermittently favoured independent research. Francis Bacon formulated the scientific method of induction based on factual data derived from the evidence of the senses. John Locke formulated the basic positivist principle that all knowledge is derived from the evidence of the senses: what is not derived from the evidence of the senses is not knowledge.
Reliable knowledge can only come from the basic observation of actual conditions. To be scientific is to be objective, truthful and neutral. Comte, who later defined positivism as a scientific ideal in line with Locke’s principles, believed that alongside the natural sciences there should be a science of social relationships to be developed on the same principles.
As natural sciences discovered the laws of nature, scientific investigation of communities would discover the laws of society. He, however, admitted that social phenomena are more complex than natural phenomena, but believed strongly that the laws governing society would eventually be discovered and that subjective elements in research would be eradicated.
This belief is central to Comte’s proposition that social development takes place in three stages:
(1) theological, when men explain everything by God’s will;
(2) metaphysical; and
(3) positive, when causal connections are discovered between empirically observed phenomena.
The great stress laid by positivism on empirical data and replicable research methods enabled a marked development of science during the nineteenth century. Because metaphysical questions came to be regarded as unscientific, science developed its own objectives which were apparently free of belief and value postulates. Positivism tends to be anti-authoritarian in so far as it requires us not to believe in anything until there is empirical evidence for it and it can be investigated by controlled methods.
In the 1920s, a group of scientists known as the ‘logical positivists’ was formed in Vienna. The leading philosopher of this ‘Vienna Circle’ was Rudolf Carnap. The group pursued a modern development of positivism, extending its fundamental principle by arguing that formal logic and pure mathematics, as well as the evidence of the senses, provide sure knowledge. The basis of logical positivism was a distinction between:
1. analytical statements, a priori propositions whose truth was guaranteed by their internal definitions (i.e. tautologies). These constituted the domain of the formal sciences, logic and mathematics, which were clearly vital in maintaining the coherence of:
2. synthetic statements, whose truth still had to be established empirically, through conventional hypothesis testing. These revisions provided empirical inquiry with a much more secure basis than the Comtean model, and the new geography was readily recommended within their framework. However, the principle of verification was supposed to be the hallmark of the ‘factual sciences’, although it was subsequently challenged by Karl Popper’s principle of falsification.
As a set of scientific principles, logical positivism is concerned with acquisition of knowledge in the form of general statements obtained through accepted scientific procedures of observation and analysis that can be used in manipulating phenomena with a view to bringing about the desired results. This was the general concept of science that the positivists believed in.
This concept incorporated a set of three related doctrines:
(a) ‘Scientism’, i.e. the claim that the positivist method is the only true method for obtaining knowledge;
(b) ‘Scientific politics’, which means that positivism is the key to the rational solution of all the problems of society and the fundamental basis for social engineering; and
(c) ‘Value-freedom’, which means that scientific judgements are objective and independent of moral or political commitments.
The logical positivists were, in fact, opposed to everything that smacked of metaphysics and unverifiable phenomena. They, therefore, became bitter opponents of Nazism, which they saw as a mixture of irrational prejudice and ideological dogma.
‘Positivist’ became a term of abuse in Nazi Germany, and was applied to Alfred Hettner, among others. The leader of the Vienna Circle, Moritz Schlick, was murdered by the Nazis, and its other members were driven abroad.
Johnston (1981, 1986) attempted to show how positivism determined the scientific status of its statements through:
1. Their grounding in a direct, immediate and empirically accessible experience of the world (Phenomenalism) which gave observation statements a peculiar privilege over theoretical ones, i.e. empiricism, and which guaranteed their generality through:
2. a unitary scientific method, accepted and routinely drawn upon by the entire scientific community; this depended on:
3. the formal construction of theories capable of empirical verification, their successful proof would serve to identify universal laws which had:
4. a strictly technical function, in that they revealed the effectivity or even the necessity (but not desirability) of specific conjunction of events — thus value judgements and ethical utterances were ruled out of the scientific world because they could not be empirically tested, and the statements which remained could be brought together by:
5. the progressive unification of scientific laws into a single and incontrovertible system.
The cumulative effect of these five claims was to move from the immediate through the unitary to the universal – to close the system around a particular version of the present and to refuse admission to alternative ways of being in and acting on the world. These alternatives were in fact extremely important to geography, whose founders often accepted and advocated conceptions of science which were denied by positivism.
Both Kant and Humboldt, for example, rejected brute empiricism on which Comte’s system was built and elaborated sophisticated philosophical systems in its place … yet, paradoxically, much of the subsequent history of the subject was dominated by the acceptance… admittedly… of some or all of these assumptions, so that when they were finally formalised during the quantitative revolution of the 1950s and 1960s the so-called ‘New Geography’ which resulted was ‘less of a radical departure than a logical extension of ideas which were already accepted by many geographers. In so far as they were in any sense philosophically novel, they derived their originality from a commitment to logical positivism’. Geography, since then, became ‘positivist-led’.
Human geography was one of the social sciences that adopted the positivist approach most extensively. Geography accepted the doctrine of positivism in so far as it formulated a ‘systematic’ approach as opposed to the regional approach, and a number of scholars introduced this new paradigm into the fields of urban and economic geography and into various branches of physical geography. On this view, everything we witness in the phenomenal world, perceived through ‘sense perception’, can be verified.
The New Geography of the 1950s and 1960s came to be known as ‘Quantitative Geography’. It aimed at analysing spatial data, developing spatial theory, and constructing and testing mathematical models of spatial processes, reflecting a paradigm shift from the earlier regional, inductive approach to a systematic, deductive, nomological approach.
‘Schaefer’s (1953) paper on ‘Exceptionalism’, in fact, had opened the door to the premise of (logical) positivism; their admission was, however, slow and, as it were, through several side entrances rather than up the main steps’.
Thus, as a principle, positivism or logical positivism was acceptable. The greatest contribution of positivism was that it served the very useful purpose of bringing a scientific approach into the discipline. The predictive aspects of physical and social phenomena were highlighted in modern contemporary geography, founded in the post-Second World War period on a nomothetic base.