3: Lowry and his legacy

A brief history

The ‘science of cities’ has a long history. The city was the market for von Thünen’s analysis of agriculture in the early 19th Century. There were many largely qualitative explorations in the first half of the 20th Century. However, cities are complex systems and the major opportunities for scientific development came with the beginnings of computer power in the 1950s. This coincided with major investment in highways in the United States, and computer models were developed to predict both current transport patterns and the impacts of new highways. Plans could be evaluated, and the formal methods of what became cost-benefit analysis were developed around that time. However, it was always recognised that transport and land use were interdependent and that a more comprehensive model was needed. There were several attempts to build such models but the one that stands out is I. S. (Jack) Lowry’s model of Pittsburgh, published in 1964. This was elegant and (deceptively) simple – represented in just 12 algebraic equations and inequalities. Many contemporary models are richer in detail and have many more equations, but most are Lowry-like in that they have a recognisably similar core structure.[1]

So what is this core and what do we learn from it? How can we add to our understanding by adding detail and depth? What can we learn by applying contemporary knowledge of system dynamics? What does all this mean for future policy development and planning? The argument is illustrated and referenced from my own experience as a mathematical and computer urban modeller but the insights work on a broader canvas.

The Lowry model

Lowry[2] started by defining the urban economy in terms of two broad categories: ‘basic’ – mainly industry and, from the city’s perspective, exporting; and ‘retail’, broadly defined to mean anything that serves the population[3]. He then introduced some key hypotheses about land. For each zone of the city he took the total land, identified unusable land, allocated land to the basic sector, and then argued that the rest was available to retail and housing, with retail having priority. Land available for housing, therefore, was essentially a residual.

A model run then proceeds iteratively. Basic employment is allocated exogenously to each zone – possibly as part of a plan. This employment is then allocated to residences and converted into total population in each zone. This link between employment zones and residential zones can be characterised as ‘spatial interaction’ manifested by the ‘journey to work’. The population then ‘demands’ retail services and this generates further employment which is in turn allocated to residential zones. (This is another spatial interaction – between residential and retail zones.) At each stage in the iteration the land use constraints are checked and if they are exceeded (in housing demand) the excess is reallocated. And so the city ‘grows’. This growth can be interpreted as the model evolving to an equilibrium at a point in time or as the city evolving through time – an elementary form of dynamics.
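To see how the iteration works in practice, here is a minimal sketch in Python for a toy three-zone city. Everything here – the data, the parameter values, the allocate function – is my own illustrative assumption rather than Lowry’s specification, and the land-constraint step simply caps population where Lowry reallocated the excess.

```python
import numpy as np

# Toy three-zone data: all values are illustrative assumptions
cost = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])            # inter-zonal travel costs
basic_emp = np.array([1000.0, 0.0, 0.0])      # exogenous basic employment by zone
pop_per_job = 2.5                             # population supported per worker
retail_per_person = 0.2                       # retail jobs generated per resident
housing_cap = np.array([1500.0, 4000.0, 4000.0])  # residential land constraint
beta = 1.0                                    # distance-deterrence parameter

def allocate(origins, cost, beta):
    """Spread each origin total over destination zones, weighting nearer
    zones more heavily: a simple spatial-interaction rule."""
    w = np.exp(-beta * cost)
    return origins @ (w / w.sum(axis=1, keepdims=True))

employment = basic_emp.copy()
population = np.zeros(3)
for _ in range(100):                          # iterate: the city 'grows'
    new_pop = allocate(employment, cost, beta) * pop_per_job   # journey to work
    new_pop = np.minimum(new_pop, housing_cap)   # check land constraints
    # (Lowry reallocated the excess to other zones; capping is a simplification)
    retail_emp = allocate(new_pop, cost, beta) * retail_per_person  # home to retail
    employment = basic_emp + retail_emp
    if np.allclose(new_pop, population):      # increments have died away
        break
    population = new_pop
```

Because each round of induced retail employment is smaller than the last, the loop settles down – the equilibrium (or elementary-dynamics) interpretation described above.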

The essential characteristics of the Lowry model which remain at the core of our understanding are:

  • the distinction between basic (outward serving) and retail (population serving) sectors of the urban economy;
  • the ‘spatial interaction’ relationships between work and home and between home and retail;
  • the demand for land from different sources, and in particular housing being forced to greater distances from work and retail as the city grows. This has obvious implications for land value and rents.

Towards realism

In the half century since Lowry’s work was published, depth and detail have been added and the models have become realistic, at least for representing a working city and for short-run forecasting. The longer run still provides challenges as we will see. It is now more likely that the Lowry model iteration would start with some ‘initial conditions’ that represent the current state. The model would then represent the workings of the city and could be used to test the impact of investment and planning policies in the short run. The economic model and the spatial interaction models would be much richer in detail and while it remains non-trivial to handle land constraints, submodels of land value both help to handle this and are valuable in themselves.

Specifically:

  • the key variables can all be disaggregated – people, for example, can be characterised by age, sex, educational attainment and skills and so be better matched to a similarly disaggregated set of economic sectors – demanding a variety of skills and offering a range of incomes;
  • population analysis and forecasting can be connected to a fully developed demographic model;
  • the economy can be described by full input-output accounts and the distinction between basic and retail can be blurred through disaggregation;
  • the residential location component can be enriched through the incorporation of utility functions with a range of components, and house prices can be estimated from measures of ‘pressure’, thus facilitating the effective modelling of which types of people live where;
  • this all reinforces the idea that the different elements of the city are all interdependent.

‘Housing pressure’ will be related to the handling of land constraints in the model. In the Lowry case, this was achieved by the reallocation of an undifferentiated population when zones became ‘full’. With contemporary models, because house prices can be estimated (or some equivalent), it is these prices that handle the constraints.
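A minimal sketch of this mechanism, under assumptions of my own – a constant-elasticity demand curve and a simple excess-demand price adjustment – shows how prices can do the constraining work:

```python
import numpy as np

# Illustrative data: desired housing demand at unit price, fixed stock,
# and a price elasticity; none of this is from an operational model.
base_demand = np.array([1200.0, 900.0, 400.0])
stock = np.array([1000.0, 1000.0, 1000.0])
elasticity = 0.5
price = np.ones(3)                 # the 'housing pressure' indicator

for _ in range(500):
    demand = base_demand * price ** (-elasticity)
    excess = (demand - stock) / stock
    if np.all(np.abs(excess) < 1e-4):
        break
    price *= 1.0 + 0.1 * excess    # prices rise where demand exceeds stock
# At convergence demand matches stock in every zone: prices, not an
# explicit reallocation rule, are absorbing the land constraint.
```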

While the Lowry-type models remain comprehensive in their ambition, sectoral models – particularly in the transport and retail cases – are usually developed separately and in even greater depth, and as such they can be used for short-run forecasting. Supermarket companies, for example, routinely use such models to estimate the revenue attracted to proposed new stores, which supports the planning of their investment strategies.[4]

Understanding behaviour

The models as described above are essentially statistical averaging models[5]: for large populations they predict ‘trip bundles’ rather than individual behaviour, and they work well precisely because the averaging process takes out the idiosyncrasies of individuals. They use the mathematics – though with a different theoretical base – developed by Boltzmann in physics in the late 19th Century. But what can we then say about individual behaviour? Two things: we can interpret the ‘averaging models’, and we can seek to model individual behaviour directly.

In the first case, there are elements of the models that can be interpreted as individual utility functions. In the retail case, for example, it is common to estimate the perceived benefits of shopping centre size and to set these against the costs of access (including money costs and estimated values of different kinds of time). What the models do through their averaging mechanism is to represent distributions of behaviour around average utilities. This is much more realistic than the classic economic utility-maximising models, as goodness-of-fit measures show.
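To make this concrete, here is the production-constrained retail model in a few lines of Python: each residential zone’s spending is spread over centres in proportion to an attractiveness term – centre size raised to a power alpha, the perceived benefit of size – discounted by a negative exponential in travel cost, the cost of access. The numbers are illustrative; in practice alpha and beta are calibrated against observed flows.

```python
import numpy as np

# Illustrative data: zonal spending power, centre sizes and travel costs
spend = np.array([100.0, 80.0, 60.0])   # spending from each residential zone i
size = np.array([50.0, 20.0, 10.0])     # W_j: size of retail centre j
cost = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])      # c_ij: cost of access from i to j
alpha, beta = 1.2, 0.8                  # benefit-of-size and deterrence parameters

# S_ij proportional to size_j**alpha * exp(-beta * c_ij), normalised over j so
# each zone's spending is a distribution across centres, not an all-or-nothing choice
attract = size ** alpha * np.exp(-beta * cost)
flows = spend[:, None] * attract / attract.sum(axis=1, keepdims=True)
revenue = flows.sum(axis=0)             # D_j: revenue attracted by each centre
```

The normalisation is the averaging mechanism of the text: spending is distributed around the average utility rather than all flowing to a single ‘best’ centre.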

The second case demands a new kind of model, and these have been developed as so-called agent-based models (ABMs). A population of individual ‘agents’ is constructed along with an ‘environment’. The agents are then given rules of behaviour and the system evolves. If the rules are based on utility maximisation on a probabilistic basis, then the two kinds of model can be shown to be broadly equivalent.
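A minimal sketch of that equivalence, again under my own illustrative assumptions: give each agent the same size and cost terms as utilities and a logit rule – choose centre j with probability proportional to exp(U_ij) – and the aggregated choices of a large population approach the flows of the averaged model above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents = 10_000
home = rng.integers(0, 3, n_agents)   # the 'environment': agents live in 3 zones

# Same illustrative terms as the averaged model: log-size benefit minus access cost
size = np.array([50.0, 20.0, 10.0])
cost = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])
utility = 1.2 * np.log(size) - 0.8 * cost   # U_ij for an agent living in zone i

# Behavioural rule: probabilistic utility maximisation (logit choice)
probs = np.exp(utility) / np.exp(utility).sum(axis=1, keepdims=True)
choice = np.array([rng.choice(3, p=probs[h]) for h in home])

# Aggregate individual choices into a flow matrix: with many agents this
# approaches the averaged model's trip bundles, since exp(U) = size**1.2 * exp(-0.8*c)
flows = np.zeros((3, 3))
np.add.at(flows, (home, choice), 1)
```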

The argument to date has been essentially geo-economic, though with some implicit sociology in the categorisation of variables when the models are disaggregated. There is more depth to be added, in principle, from sociological and ethnographic studies, and if new findings can be clearly articulated, this kind of integration can be achieved.

The harder challenges: dynamics and evolution

The models described thus far represent the workings of a city at a point in time – give or take the dynamic interpretation of the Lowry model. There is an implicit assumption that if there is a ‘disturbance’ – an investment in a new road or a housing estate, for example – then the city returns to equilibrium very quickly, and so this can be said to characterise the ‘fast dynamics’. It does mean that these models can be used, and are used, to estimate the impacts of major changes in the short term. The harder challenge is the ‘slow dynamics’: to model the evolution of the slower-changing structural features of a city over a longer period. This takes us into interdisciplinary territory now known as ‘complexity science’.

When the task of building a fully dynamic model is analysed, it becomes clear that there are nonlinear relationships – for example, as retail centres grow, there is evidence of positive returns to scale. Technically, we can then draw on the mathematics of nonlinear complex systems, which shows that we can expect to find path dependence – that is, dependence on initial conditions – and phase changes – that is, abrupt changes in form as ‘parameters’ (features such as income or car ownership) pass through critical values. In mathematical terms, these models bear a family relationship to the Lotka-Volterra models originally designed for ecological systems in the 1920s but now seen as having a much wider range of application[6].
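A minimal sketch of the slow dynamics in the retail case, of the kind set out in the paper at note 5, with illustrative parameters of my own: each centre’s size W_j grows where the revenue it attracts exceeds its running costs and shrinks where it does not, with revenues computed from the spatial interaction model above.

```python
import numpy as np

# Illustrative data as before; alpha > 1 builds in positive returns to scale,
# the nonlinearity that makes path dependence and phase changes possible
spend = np.array([100.0, 80.0, 60.0])
cost = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])
alpha, beta = 1.2, 0.8
k, eps, dt = 1.0, 0.01, 0.1       # running cost per unit size, response rate, time step
W = np.array([30.0, 30.0, 30.0])  # initial conditions: the city's 'DNA'

for _ in range(5000):
    attract = W ** alpha * np.exp(-beta * cost)
    D = (spend[:, None] * attract / attract.sum(axis=1, keepdims=True)).sum(axis=0)
    W = np.maximum(W + eps * (D - k * W) * W * dt, 1e-6)  # Lotka-Volterra-like growth
# Re-running from different initial W, or slowly drifting beta (cheaper travel),
# can settle the system into a qualitatively different pattern of centres:
# path dependence and phase changes respectively.
```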

These ideas can be illustrated in terms of retail development. In the late 1950s and early 60s, corner-shop food retailing was largely replaced by supermarkets. By the standards of urban structural change this was very rapid, and it can be shown that it arose through a combination of increasing incomes and car ownership – hence, in effect, increasing accessibility to more distant places. This was a phase change. Path dependence is illustrated by the fact that if a new retail centre is developed, its success in terms of the revenue it attracts will depend on the existing pattern of centres – the initial conditions – and again this can be analysed using dynamic models.

This leads us to two fundamental insights: first, it is impossible to forecast for the long term because of the likelihood of phase changes at some point in the future; and second, the initial structure of the city – the ‘initial conditions’ – might be thought of as the ‘DNA’ of the city, and this will in substantial part determine what futures are possible. Attempts to plan new and possibly more desirable futures can then be thought of as ‘genetic planning’, by analogy with genetic medicine.

Given these insights, how can we investigate the long term – 25 or 50 years ahead? We can construct a range of possible futures as scenarios; we can deploy Lowry-Boltzmann-like models to test how each would perform; and we can use the fully dynamic Lotka-Volterra models to explore the possible paths, giving insights into what has to be done to achieve these futures.

Alan Wilson, March 2015


[1] Alan Wilson (ed.) (2013) Urban modelling, 5 vols, Routledge.
[2] I. S. Lowry (1964) A model of metropolis, RAND Corporation; accessible via Google.
[3] This was a very crude form of economic base model which can be elaborated as an input-output model.
[4] Alan Wilson (2013) The science of cities and regions, Springer
[5] Alan Wilson (2008) Boltzmann, Lotka and Volterra and spatial structural evolution: an integrated methodology for some dynamical systems. Journal of the Royal Society Interface, 5, pp. 865-871.
[6] Alan Wilson (2008), op. cit.
