The Butler 1944 Education Act: both milestone and millstone

Chris Game

A ‘Legislative game-changer’ was what we were asked for. Or was it ‘Legislative Game-changer’?  No matter; this one was both. It’s exactly a half-century since the summer of ’63: Profumo and Keeler, Philby, the Great Train Robbery, the Beatles, Sindy dolls, and my leaving the boys’ grammar school, to which I’d ‘won’ a place seven years earlier by passing the compulsory 11-plus exam, and going to university. Statement of fact, but also declaration of interest.

The 1944 Education Act more than changed my life; it shaped it. It shaped me, like millions of others, into an entirely different person from the one I’d have become, born even 15 years earlier. Its shaping of me was, I like to think, positive, and certainly I’d be judged a successful product of the Act and the educational system it established. That shaping, though, included the acquisition of a disinclination to accept even good things entirely without question, which is why this account of the Act differs rather from the ‘Can’t-we-just-be-proud-that-we-created-an-entire-education-system-in-the-middle-of-wartime?’ story my father would undoubtedly have preferred.

That 11-plus exam, my result in which was far and away the most celebrated present I ever gave my parents (who believed it a measure of the discredited Cyril Burt’s Intelligence Quotient), had a flip side. It created roughly 80% of publicly defined 11-year-old ‘less intelligent failures’ – like, four years later, my younger sister.

The 11-plus (we even called it the ‘Grading Test’) and the ‘tripartite’ or effectively bipartite system to which it was the key – aptitude-differentiated grammar, secondary modern, and a few technical schools – opened up new social divisions each year and reinforced old ones. Its General Certificate of Education (GCE) exams, normally taken at 16 and 18, disqualified a majority of the nation’s children from qualifications and ensured that full participation in secondary education remained, to quote one critic, “very much a minority pursuit”. And that’s not to mention (yet) the Act’s failure even seriously to challenge the enduring wormcans of church/faith and independent/private schools. So yes, millstone as well as milestone.

That the Act was a milestone, landmark, etc. there can be no doubt. It replaced almost all previous education legislation, belatedly raised the school-leaving age to 15, and made secondary education free and universal. It established the famous three-cornered partnership for education in England and Wales: central government (Ministry of Education), with legal responsibility to set a national framework and allocate adequate resources; local education authorities (LEAs), with knowledge of local needs and responsibility for provision; and teachers with their professional expertise and responsibility for the curriculum. In setting this framework, moreover, the Act consciously sought to address pupils’ personal as well as academic development and the needs of the wider community.

The Act’s date and milestone-ness, though, were also its problems. It was a wartime coalition measure, passed by a Conservative-dominated Parliament, and at no point, therefore, did it engage with the more progressive agendas that had emerged in the inter-war years: for example, a genuinely unified, national, publicly controlled educational system; a single multilateral, comprehensive/common secondary school; or a school leaving age of 16, rather than 15. Indeed, the 1943 White Paper was welcomed by some as a Tory project that could divert some attention from the more radical Beveridge Report. It was very different, then, from what a 1946 Education Act might (possibly) have looked like, yet it was treated, even by Labour, as a systemic and at least generational settlement, requiring no more of post-war politicians than possibly some marginal tidying-up.

So whose milestone was it? Received wisdom (received by me, anyway) is that youngish, liberalish Conservative Education minister, R A Butler, skilfully persuaded a reluctant Churchill that legislating for major educational reform in wartime was a good idea – which is at best only part of the story.

First, Butler’s arrival in July 1941 as President of the Board of Education was less the promotion of a rising social reformer than a sideways exit from the Foreign Office of a discredited appeaser. Secondly, much of the Act, including the main decisions about secondary education, was set out in the Board’s ‘Green Book’, Education after the War, produced by civil servants under previous Board President, Herwald Ramsbotham (less German than his first name might suggest). If we sideline the always confusing label ‘policy makers’, it was these Board officials who were the principal authors of what was more a civil service Act than a political one. The role of the chief politician, Butler, was that of indispensable legislative facilitator.

Indispensable because Churchill was not so much reluctant as resistant: opposed to legislation, as a distraction from the ‘War project’, and uninterested in its content, apart from insisting that under no circumstances must it stir up divisions in the country. Butler’s singular achievement, and a huge tribute to his parliamentary and personal skills, was to get the legislative show on the road and, by evading, placating and defusing protest, keep it going through to the end.

At INLOGOV we’re hard-wired to sniff out underlying authoritarianism in any central-local relationship, and there were those who saw the 1944 Act as strengthening central control. Perhaps, but if that really were the framers’ objective, they could, like their 1980s’ and subsequent successors, have gone a heck of a lot further. Nor, surely, would they have created an LEA as powerful and potentially bothersome as the London County Council.

The Minister did have “the duty to secure the effective execution by the local authorities, under his control and direction, of the national policy for providing a varied and comprehensive education service in every area”. But she [the Act used entirely male pronouns, for pupils and the Minister, failing to anticipate the first post-war Minister being Ellen Wilkinson] did not provide and equip schools and colleges or employ teachers; that was the LEAs, the county and county borough councils. She did not set curricula or prescribe textbooks; that, at least until 1988, was the teachers.

The fact that easily the biggest section of the Act (Part II) was that setting out how the new statutory system would be locally administered by the LEAs means, in itself, little: it might be stuffed full of controls and constraints. That’s not, though, how it reads. Each LEA would have an Education Committee of elected councillors, and would appoint a Chief Education Officer to head the salaried officers of the authority. They sound almost like self-contained mini-empires separate from the rest of the local authority, and often were. The LEAs were to build and maintain the county (state) schools and the one-third of schools provided by voluntary, mostly religious, bodies. They would usually appoint and always pay the teachers. They would allocate resources, including staff, buildings, equipment and materials.

It should be emphasised here that none of the terms and concepts mentioned in my opening paragraphs – 11-plus, selection, tripartite system, grammar schools, secondary moderns – appeared per se in the Act. It did require, however, the provision of opportunities for all pupils “in view of their different ages, abilities and aptitudes, and of the different periods for which they may be expected to remain at school”, and the tripartite system of grammar schools for the most able, secondary moderns for the majority, and secondary technical schools for those with a technical or scientific aptitude, was how it came to be interpreted.

Returning to LEAs, they would not have detailed control of the curriculum but were to “contribute towards the spiritual, moral, mental, and physical development of the community by securing that efficient education … shall be available to meet the needs of the population of their area”. They were to provide sufficient places for 5- to 15-year-olds – and for 16-year-olds as soon as the further rise in school-leaving age became ‘practicable’, which proved not to be until 1972. Within this framework, LEAs had and exercised in practice considerable autonomy, developing distinctive styles of administration and forms of school organisation, including the pioneering of comprehensive secondary schools.

The one part of the curriculum that was prescribed in the 1944 Act was religious/faith education, as a crucial part of the settlement negotiated between Butler and the mainly Christian church leaders. Like Tony Blair 60 years later, the authors of the 1944 Act – this time definitely including Butler – took the view that religious education was a public good, whose responsibility should be shared between state-run and religious schools, and they legislated an unevenly balanced ‘dual system’ to accommodate it, apparently indefinitely.

Offered the choice of ‘aided’ or ‘controlled’ status, two-thirds of religious schools opted for the former and one-third – far more than expected by either Butler or the Archbishop of Canterbury – for the somewhat greater LEA control and considerably greater cash.  In exchange, all state schools would provide non-denominational religious education, and each school day would begin with an act of collective worship.

Which brings us to Part III of the Act and Independent Schools.  If only from the point of view of my long-passed word limit, it’s perhaps a good thing there was no extended debate over whether these, like religious schools, should somehow be integrated into the state system or simply abolished. In truth, there was no debate at all. The sole demanding imposition of Part III was that they be registered – and registered is what they remain today.


Chris Game is a Visiting Lecturer at INLOGOV interested in the politics of local government; local elections, electoral reform and other electoral behaviour; party politics; political leadership and management; member-officer relations; central-local relations; use of consumer and opinion research in local government; the modernisation agenda and the implementation of executive local government.

The Housing Acts of 1980: a watershed in housing policy

Alan Murie and Christopher Watson

The Housing Act 1980 and the Tenants’ Rights etc. (Scotland) Act 1980 mark a watershed in housing policy.  In the aftermath of the First World War and the slogan ‘Homes fit for heroes to live in’, the introduction of exchequer subsidy for new housebuilding in 1919 resulted in sixty years of steady growth of council housing.  Council housing, along with the expansion of home ownership, had transformed the condition of and access to good quality housing.  By the late 1970s some 1 in 3 households were council tenants. But the election of 1979 and the new legislation passed in 1980 saw a change in the long-established cross-party support for council housing, ended the period of growth in the sector, and heralded a period of deregulation and privatisation.

The Housing Acts operated in the context of reduced public expenditure on housing and introduced the ‘Right to Buy’, the ‘Tenants’ Charter’, a new subsidy system for council housing and changes to the Rent Acts.  They led directly to the decline of council housing, rapid growth in home ownership, a new and enlarged role for housing associations, and an eventual revival of private renting following a century of decline.

The Conservative Manifesto at the General Election of 1979 echoed ‘Homes for heroes’ in its emphasis on ‘Homes of our Own’, ‘The Sale of Council Houses’, and ‘Reviving the Private Rented Sector’.  While the primacy given to home ownership was not new, the specific policies designed to achieve it marked a break with previous policy and were a challenge to local autonomy.

When the Conservative Party won the 1979 election, they saw their housing policies and the ‘right to buy’ in particular as factors contributing to their electoral success.  Throughout the subsequent period the government continued to regard its initial policy stance as an electoral asset.  It was also advantageous fiscally – delivering the largest capital receipts of any privatisation programme: though none of the capital could be spent on replacing the council housing that was sold.

The right to buy in 1980 did not introduce the sale of council houses for the first time: discretionary powers enabling sale had always existed.  These were replaced in 1980 by a statutory RTB.  It applied to almost all secure tenants with three or more years’ tenancy and to almost all properties where the landlord was a council, a new town, or a non-charitable housing association.  A statutory procedure for sale was laid down to limit local variation over implementation, and the Secretary of State was given very strong powers to monitor and intervene in local administration.  Generous sale discounts were introduced, rising from 33% of market value to a maximum of 50% depending on the length of tenancy; and these were further increased under later legislation, to 60% for houses and 70% for flats.

The RTB was highly publicised and made more attractive to tenants because of a related policy to steadily increase council rents.  After some initial nervousness on the part of building societies and other lenders, these institutions adopted the RTB with enthusiasm and more than nine out of every ten sales under the scheme were financed with private sector loans.  By 1990 some 1.8 million council, new town and housing association dwellings had been sold into owner occupation in Great Britain and sales continued thereafter, making it the most successful privatisation ever.  With reduced funding for new council housing the sector went into sharp decline.

The 1980 Act was a game changer not only in its own right but also for the future changes it signalled.  Despite many protests, the Act subjugated local government to the will of central government.  In this respect, the government’s approach was brazen, unlike the less transparent later attempts at privatisation in health and education.

The decline in the proportion of council housing from 33% in 1980 to 8% today has speeded the residualisation of the sector, moving council housing towards an American-style welfare housing sector, as intended by the Thatcher governments of the 1980s.  The decline of council housing has been made more dramatic because of the transfer of a large part of its role to housing associations: in many parts of the country, associations now are more important housing providers than local authorities, further weakening the direct role of local government, especially district councils.

But with the combined housing stock of housing associations and local authorities, we still have a social rented sector in the UK which, at 18% of the stock, remains one of the highest in the world and which would now be difficult for governments of a neo-liberal persuasion to further challenge, especially in today’s situation of housing shortage.

The Labour governments of 1997 to 2010 were criticised by some for their continuation of the policy of council house sales but their encouragement of the further transfer of council housing to other registered providers (i.e. housing associations) has served to protect the provision of social housing, even if at the same time it has further weakened the direct role of local government.  For these reasons, it can be concluded that the long term consequences of the Housing Act 1980 have profoundly changed the role and responsibilities of local government and weakened the position of council housing within the UK housing system.  What remains, however, is a tradition of publicly provided not-for-profit housing and an organisational structure which continues to provide an essential alternative to the private housing sector.

Alan Murie and Chris Watson are former Directors of the Centre for Urban and Regional Studies at the University of Birmingham.  Alan Murie is Emeritus Professor of Urban and Regional Studies and Chris Watson is Honorary Senior Lecturer.  Both are members of the Housing and Communities Research Network in the University’s School of Social Policy.

Direct Payments Act 1996: a legislative game-changer on a slow burn

Catherine Needham

Some legislative game-changers have a high-profile passage through Parliament, with much media fanfare about how things will never be the same again: gay rights legislation, for example, fits into this category. Other game-changers proceed more quietly, with their immediate implications limited to a relatively small number of people. This latter type can be characterised as ‘valve’ legislation, in the sense that once passed there is no going back, even if this is not fully appreciated at the time.

The Community Care (Direct Payments) Act 1996 is an example of valve legislation. Its passage followed from the persistent and passionate campaigns of people with disabilities to gain more control over their support. It made legal the transfer of cash to people eligible for local authority funding. Some local authorities had been finding ways to make such payments for years, with a wary glance at the apparent ban on such activity in the National Assistance Act 1948. But the passage of the Act gained little media interest beyond the Society Guardian, and it was assumed by government that the payments would only be taken up by a minority of younger people with physical disabilities. Indeed, people over 65 weren’t eligible for the payments. The Act gave local authorities the power but not the duty to grant the payments, meaning that access to them was heavily dependent on a supportive social services department.

Nearly twenty years later the English government is committed to getting a direct payment or managed budget (where the local authority or third party holds the money on your behalf) to 70 per cent of people receiving local authority-funded social care – more if possible. In a succession of modifications to the law and its regulations, direct payments are now expected to be the default funding mechanism for people with physical disabilities, people with learning disabilities, older people and people using mental health services, and are available to carers. They have expanded to parents with disabled children and are being proposed for children with special educational needs and for adoptive parents. They are being introduced for aspects of NHS care as personal health budgets, which constitutes a radical change to health funding albeit on a small scale at present. The policy has cross-party support, being pursued as assiduously by the Coalition government as by their New Labour predecessors.

Why was the 1996 legislation able to trigger such a systemic change? Here are four suggestions.

  1. It was a simple idea: give the money straight to the user. Although the implementation has been enormously complex, it was an easy idea to explain, helping its proliferation and popularity.
  2. It fitted the political mood, both to expand choice to people as consumers of public services (seeing that as the best way to improve outcomes) and to break down barriers for people with disabilities on a rights basis. Governments since the 1980s have promoted both strands of legislation, even though there are tensions between them.
  3. It didn’t seem to cost anything, since money was simply being allocated differently. This aspect of the policy appealed to the Conservative government that first introduced the policy and to subsequent New Labour and Coalition governments. It is particularly appealing in a period of public spending austerity. Evidence for cost-savings has been harder to establish in practice, however.
  4. It created a wedge that could be used by policy entrepreneurs to push for further change. Once the principle was established that disabled adults under 65 were eligible for such payments it was very hard to argue on a principled basis that they should not be extended to other people in receipt of local authority support. Organisations such as In Control pushed at the boundaries of the legislation to broaden its range and built national alliances of supporters to agitate for its extension.

All these factors created a permissive legislative and policy context in which devolved budgets have come to be seen as the way to respond to a whole range of social issues. However behind the simplicity and potency of the idea lie two challenges which have not yet been resolved, and will continue to pose issues for future government.

The first is implementation. The simple insights of personal budgets and direct payments have proved very difficult to apply to a hugely complex, variable and underfunded social care system. Personal health budgets may help to integrate health and social care provision but it is difficult to do this when social care funding is means-tested and health funding is not.

Second, there is a normative challenge: if people are better at spending their own money than the state is at spending it on their behalf, what is the state for (aside perhaps from channelling money from rich consumers to poor ones)? This vision of a voucher state has long been cherished by some on the right of the political spectrum. Raising questions about the need for a welfare state was not the vision of any of the advocates of the Direct Payments Act 1996 but such debates affirm that the full consequences of that Act are not yet known.


Catherine Needham is Reader in Public Policy and Public Management at the Health Services Management Centre, University of Birmingham, and is developing research around public service reform and policy innovation. Her recent work has focused on co-production and personalization, examining how those approaches are interpreted and applied in frontline practice. Her most recent book, published by the Policy Press in 2011, is entitled, Personalising Public Services: Understanding the Personalisation Narrative.

The Health Act 2006: Behaviour change in action?

Catherine Staite

The Health Act 2006 is a very dull title for an Act of Parliament which has had such a profound and universally beneficial impact on all our lives.  It enacted the ban on smoking in enclosed places to which the public have access.

When I was training to be a solicitor in 1976, I shared an unventilated basement office with an etiolated Welshman who chain-smoked Gauloises, and I went home every night with a bad headache, smelling like a kipper.  His right to smoke – and the social acceptance of smoking – trumped my right to breathe. How things have changed! But why have they changed so much?

In the 1950s the UK had one of the highest rates of smoking and consequently one of the worst rates of death from lung cancer in the world.  However, smoking began to decline in the 1960s and death rates began to fall from 1965.  In 1979, 45% of the population smoked, but by the 1990s that number had fallen to 30%.  Between the introduction of the smoking ban in 2007 and 2010, it fell a further 9%.

There was much controversy at the time with dire predictions of damage to businesses, particularly pubs. Smokers argued that their human rights were being attacked. The tobacco industry complained that it was leading to a reduction in the number of cigarettes smoked and a significant rise in the number of people quitting. Fancy that!

So why has the smoking ban been such a success?  Firstly, the time was right.  Research at the time showed that there was very strong public support for the ban.  It has been largely self-policing; note how quickly people react if anyone breaches the ban. That is because the reasons for the regulations are well-understood and the benefits are now clear, in the same way our air is now clear.

The smoking ban did change behaviour but it achieved it by building on and reinforcing longer running changes in behaviour and attitudes.  It made it clear that the right to breathe trumps the right to smoke.  In 1976 I didn’t feel able to assert my right to breathe clean air in our dank little office.  In 2013, I don’t need to, because Parliament championed and legitimised my right not to be harmed over the rights of others to harm me.

At INLOGOV we are very interested in behaviour change and how changing public expectations and behaviour can impact, both positively and negatively, on public services.  Behaviour change has come to be seen as a ‘quick fix’ for all sorts of perceived ills.  The experience of the smoking ban shows that it is all much more subtle and complex than that.  It also demonstrates that the right legislation, at the right time, can work with the grain of changing social attitudes and can help both to change the behaviour of the unwilling and to embed that changed behaviour in new social norms.

Catherine Staite is the Director of INLOGOV. She provides consultancy and facilitation to local authorities and their partners, on a wide range of issues including on improving outcomes, efficiency, partnership working, strategic planning and organisational development, including integration of services and functions.

Catherine has recently co-authored INLOGOV’s latest book, Making Sense of the Future: Do We Need a New Model of Public Services? The chapter ‘Beyond Nudge’ by Catherine Mangan and Daniel Goodwin deals specifically with behaviour change.