In tech we trust: A teacher’s perspective on INLOGOV’s e-learning (r)evolution

Dr. Abena F. Dadze-Arthur

 

INLOGOV’s first online Masters

It was a historic moment for INLOGOV – even by the standards of the Institute’s long and eventful history. For the first time ever, INLOGOV was to design and deliver an online International Masters in Public Administration (MPA). The new MPA was to be delivered wholly online, with students completing all their learning activities at a distance from their school or college, supported by interactive technology tools. The programme was to be targeted at adult learners across the world, specifically those already working in the public sector who wish to study part-time while maintaining their career paths. It marked INLOGOV’s accession to the new club of educational institutes partaking in an evolution of a rather revolutionary nature, interchangeably termed e-learning, distance learning or online education.

A worldwide (r)evolution

Indeed, in offering the new MPA, along with two other online postgraduate programmes, the University of Birmingham joined the ranks of other prestigious tertiary educational institutes that have embraced the challenge of delivering e-learning courses. After all, the business case is compelling. Online programmes are rapidly becoming not only an inevitable but also rather lucrative part of mainstream education. The worldwide market for e-learning already reached $35.6 billion in 2011, and generated estimated revenues of some $51.5 billion in 2016, boasting growth rates of 17.3% in Asia, 16.9% in Eastern Europe, 15.2% in Africa and 14.6% in Latin America.

International organisations, such as UNESCO and the International Council for Open and Distance Education, conclude that the revolutionary explosion of e-learning is down to two key factors: First, and perhaps most obviously, the technological advances of the Internet and web-based technologies offer learners and teachers a considerable range of affordable tools and resources. These enable novel approaches to networked learning, change the ways in which knowledge is being imparted and open up new means of engagement. Second, online education allows professionals all over the world to upskill and pursue further qualifications while continuing to work in their jobs. Moreover, working professionals from Africa and Asia are now able to overcome the inadequacies and asymmetries of local educational provisions by enrolling in e-courses delivered by internationally renowned universities.

But who teaches the teachers?

Indeed, a whole new way of learning that is free from the constraints of time, space and pace – but also a whole new way of providing education! Helpfully, from the learner’s perspective, there is a lot of information out there about the ways in which this type of education differs from traditional ‘brick-and-mortar’ programmes, and what to consider when registering for e-courses. However, from the teacher’s perspective, surprisingly little has been published about designing and creating online education, and finding practical solutions for pedagogical and technical challenges. Tasked with authoring and tutoring the very first module of INLOGOV’s online MPA, my co-convener and I felt like two fish out of water. Although, as university teachers and researchers at INLOGOV, we had much experience of designing and delivering public management programmes for mid-career public servants on a ‘face-to-face’ basis (both in the classroom ‘on-campus’ and ‘in-house’ for sponsoring client organisations in the public and voluntary sectors), the new online MPA seemed a daunting endeavour. Preparing and providing an ‘online’ distance-learning module for a more diverse international group of practitioners, drawn from a wider range of public service contexts and experiences, certainly raised new and partly unexpected challenges for us that called for fresh approaches. Since then we have delivered the module twice – and learned something about how ‘to do’ online education.

‘Doing’ online education from a teacher’s perspective

Working in partnership with the international education and publishing group Deltak-Wiley, we realized early on the need to research and write much of the MPA programme anew. Our existing PowerPoints, although helpful in visually highlighting or synthesizing complex arguments presented in a classroom lecture, were unsuited to this mode of teaching and overly reductionist. Hence, we spent weeks producing fully scripted learning materials of a high quality and publishable standard, which also featured animated videos and interactive diagrams, timelines and theoretical models that could be expanded or collapsed at the click of a mouse.

Mindful of the international nature of the student group for whom the programme is intended, we recognized the need to ‘internationalise’ our curriculum. This was achieved by several means: we added new literature on international public management and governance in the reading lists; we included a variety of contemporary examples of public management from around the world; we produced a series of short, BBC-documentary-style videos featuring practitioners and researchers from across the globe who discussed their particular experiences of public management and governance in their respective home countries; and we used an array of photo images to portray global diversity in public service delivery.

Encouraging critical reflection in relation to the students’ own experiences of working in public management posed another challenge that required fresh thinking. We tackled this issue by including weekly formative assignments, which asked the participants to share and discuss issues and examples from their own country contexts in ‘Discussion Forums’. These forums enabled us to get course participants to engage critically in the activities, rather than simply to absorb ideas from the text, animated videos, or short film clips. In addition, they firmly placed the students at the heart of learning, thus achieving learner-centricity.

Not having the classroom interaction meant that we needed to find different ways by which the students could develop rapport with, and respect for, one another and so learn from each other. We solved this challenge by making the ‘Discussion Forums’ interactive. In order to attract a good grade, our online students were not only expected to ‘post’ their contributions (by particular deadlines) on the ‘Discussion Forums’, but they also had to respond to the ‘posts’ of at least two others (by further deadlines). For the formal assessment of these postings, we chose to use two criteria: a) ‘intellectual contribution’ to the discussion, and b) ‘contribution to the learning community’, focusing particularly on responsiveness to colleagues’ ‘posts’. We thereby incentivized the learners to read each other’s contributions carefully and to offer thoughtful and thought-provoking feedback, constructive advice and mutual support – all of which led to the development of a strong learning community. We also built in two synchronous sessions, which are specified times when students and instructors hold virtual ‘meetings’ online in real time. However, what can be done so easily in a face-to-face classroom environment proved rather difficult online. Following initial difficulties in identifying meeting dates and times that suited every student across three different continents and time zones, we encountered even more serious problems during the meetings with both the audio (delays and echoes) and the webcams (requiring too much bandwidth). Clearly, online instructors are not the only ones who still need to mature – the technology does too!

The most important lesson

With our first cohort of students soon due to graduate from INLOGOV’s online MPA, we are expecting systematic feedback and more lessons on what worked, or not, about our online teaching from the participants’ perspective. Not to mention that, as we mature as online teachers and are given the opportunity to tweak and adjust our online classes, and deliver them to new and different student cohorts, our insight and understanding as e-learning providers is bound to increase. However, since I first set out, armed with skepticism and furrowed brows, to join the e-learning (r)evolution, the biggest lesson I personally have learned is how rewarding an online course can be, and how high a quality of education it can deliver.

 

If you are interested in more details about our authoring and tutoring of the first module of INLOGOV’s online Masters in Public Administration (MPA), please download our chapter by clicking here.

 


Following a ten-year professional career as a public policy specialist working for various governments across the world, Dr Abena F. Dadze-Arthur switched to an academic career in public administration, and is currently a lecturer at INLOGOV. Abena teaches courses on various aspects of public management and governance. Her research mainly focuses on non-western and post-western public management approaches.

First, do no harm – An assessment of the Housing and Planning White Paper

Anthony Mason gives an initial assessment of the white paper on housing and planning in England

First impressions are not always very reliable.  When Sajid Javid replaced Greg Clark as Secretary of State for Communities and Local Government following the post-vote governmental putsch last year (sorry, change of Prime Minister following the referendum), local government figures were very wary.  Clark had, and still has, a reputation for understanding local government and can connect the local to the national in discussions around the cabinet table in a way that few of his colleagues are able to. Javid, however, was an unknown quantity – said to feel that the DCLG role was a demotion and giving every indication that he was unexcited by the move.

Yet, for those of us specifically interested in housing policy, Clark – alongside his spiky and confrontational housing minister, Brandon Lewis – presided over some rotten housing policies, as I suggested in this place last year.  Indeed, the Housing and Planning Act 2016 will, I suspect, go down in legislative history as the Dangerous Dogs Act of housing policy in England.  Gratifyingly, a number of the craziest measures enabled by that Act have proved so difficult to implement that the “new” government has simply shunted them into a siding and (we hope) left them there to rot.

And now comes the first comprehensive white paper on housing policy in England for almost a generation, bearing the less-than-poetic title-as-ambition of Fixing our broken housing market.  Javid and his refreshingly rounded housing minister, Gavin Barwell, set out in 104 pages and many supporting papers their ambitions to do just that.  To their great credit, Javid and Barwell have spent many weeks on careful consultation with local government, sector interests, and Number 10 before getting this far, delaying the publication of the white paper somewhat while doing so.  They have even persuaded the PM to pen a lengthy introduction to the paper – presumably in the hope of corralling rural Conservative NIMBYs into line.

The white paper sets out many proposals and poses 38 carefully framed policy questions for response (by 2nd May 2017, if you’d like to contribute).  But in quick summary, it:

  • Acknowledges that England needs around 250,000 new homes each year going forward. This was expressed as “between 225,000 and 275,000 homes” – and is up from the oft-quoted 200,000 previously accepted (but never consistently realised)
  • Proposes that each local authority will have to draw up and regularly review an “honest assessment” of local housing need – methodology to follow.
  • Says that developers could be forced to build within two years of planning consent, or see that consent lapse. At the moment, permission usually lapses after three years.  The paper also proposes new compulsory purchase powers for councils where sites lie undeveloped – details to follow.
  • Suggests an expanded and more flexible affordable homes programme, for housing associations and local authorities, with £7.1bn of (already announced) funding. It drops the “old” government’s fixation with starter homes in favour of a more balanced approach.
  • Encourages building rates at higher density – including of higher buildings – to make best use of land (and to avoid having to give a view on releasing green belt).
  • Dodges the question of future housing association and council rent levels after George Osborne’s compulsory rent reductions. “We will provide clarity over future rent levels. In return, we expect them to build significantly more affordable homes over the current parliament” is what ministers promise.
  • Says that smaller building firms will be given assistance to expand, including support for off-site construction (where components are fabricated off-site and factory-assembled). It also encourages “build to let” where private companies build large-volume rental flats for tenants.
  • Continues a focus on leaseholds, proposing what it calls “an end to leasehold abuse” where home buyers are locked into leases with spiralling ground rents.

Most of us acknowledge the general need for new homes while protesting loudly if those homes are to be built near to us – and for years, housing policy in England has tried not to upset voters and yet deliver new homes.  And the white paper has had to throw titbits in all directions to keep sector interests at bay.  Local authorities are both excoriated for planning failures and mildly encouraged to build new homes.  Those who worship at the altar of home ownership will be pleased that there is a threat to close a loophole that has allowed councils building homes through wholly-owned companies to avoid the right to buy.  Those who see renting as the most realistic way forward will be pleased that much of the white paper acknowledges this reality and makes gentle proposals for longer tenancies.  Big developers are both criticised for not building out sites and encouraged by some anti-planner language.

But ministers have failed to resolve some longstanding conundrums – and a couple of new ones – in their paper:

  • Successive governments have tried to combine bottom-up and top-down policies on housing which appear to conflict in their efforts to encourage and coerce. For example, communities and parishes have been given more control over developments and yet principal councils are still required to provide new homes.  Housing associations should develop more and yet have no control over the rents they can charge for these new homes.
  • Government has long had an intellectual tendency to support developers over planners – even though planning consents have been running ahead of homes built for some time. This white paper at last begins to recognise that not all is well with our developers, while avoiding the obvious response: councils’ potential contribution to building at scale.
  • There is a cherished belief that brownfield sites can provide the majority of our new homes, but these sites no longer match need. Not surprisingly, they are disproportionately in cities, but not all housing need is city-based.  The white paper avoids the question of building on the green belt, even though, in our own city, we’ve faced a highly charged debate about this topic.
  • A further concern is around labour and skills. We’ve long worried that not enough UK youngsters express any desire to work in the building industry.  This is now compounded by fears of the actual or apparent impact of Brexit on the non-UK workforce.

The fundamental question that the paper avoids is whether any combination of our present arrangements for building can ever deliver the amount of housing we need, as the answer to that question may be too hot to handle.  It’s old evidence now, but the Calcutt review of the housebuilding industry, commissioned a decade ago, set out a straightforward graphic showing who has built what in the UK in the years since the Second World War (see figure one on page 10).

This evidence was summarised in a beautifully simple graphic (above) by the University of Sheffield School of Architecture.  It shows that the three-decade-long gap in our housing provision is simply because we’ve stopped building council houses.  The answer to the fundamental question would seem to be to let councils (and housing associations) build again at some scale, in order to supplement the relatively fixed-but-declining contribution of private developers.

The title to this post is a common misquotation of the Hippocratic Oath.  It suggests that a first duty for medics is not to do harm (“primum non nocere”) – and the new white paper seems to pass that test, at least.  If a second duty is “then to do good”, then I’m not yet convinced that the paper will achieve that in any significant way.

Anthony Mason

 

 

Anthony Mason is an Associate at INLOGOV and works mostly on local government systems and organisation and on improving public sector partnerships.  His early career was in local government followed by more than 20 years in PwC’s public sector consultancy team.

 

 

Troubled Families: Two Secrets to Great Evaluations

Jason Lowther

In this blog last week I explored the (rather flimsy) evidence base available to the developers of the original Troubled Families Programme (TFP) and the potential for “theory of change” approaches to provide useful insights in developing future policy. This week I return to the formal TFP evaluation and look at the lessons we can learn in terms of the timing and data quality issues involved.

The first secret of great evaluation: timing

The experience of the last Labour Government is very instructive here. New Labour appeared as strong advocates of evidence-based policy making, and in particular were committed to extensive use of policy evaluation. Evaluated pilots were completed across a wide range of areas, including policies relating to welfare, early years, employment, health and crime. This included summative evaluations of their outcomes and formative evaluations whilst the pilots were underway, attempting to answer the questions “Does this work?” and “How does this work best?”

Ian Sanderson provided a useful overview of Labour’s experience at the end of its first five years in power[i]. He found that one of the critical issues in producing great evaluations (as for great comedy), is timing. Particularly for complex and deep-rooted issues (such as troubled families), it can take a significant time for even the best programmes to have an impact. We now know the (median) time a family remained on the TFP programme was around 15 months.

It can also take significant time for projects to reach the “steady state” conditions under which they would operate once fully implemented. Testing whether there are significant effects can require long-term, in-depth analysis. This doesn’t fit well with the agenda of politicians or managers looking to learn quickly and sometimes to prove a point.

Nutley and Homel’s review[ii] of lessons from New Labour’s Crime Reduction Programme found that “projects generally ran for 12 months and they were just starting to get into their stride when the projects and their evaluations came to an end” (p.19).

In the case of the Troubled Families Programme, the programme started in April 2012, and most of the national data used in the evaluation relates to the 2013-14 financial year. Data on exclusions covered only those starting in the first three months of the programme, whereas data on offending, benefits and employment covered families starting in the first ten months of roll-out.

We know that 70% of the families were still part-way through their engagement with the TFP when their “outcomes” were counted, and around half were still engaged six months later.

It’s now accepted by DCLG that the formal evaluation was run too quickly and for too short a time. There just wasn’t time to demonstrate significant impacts on many outcomes.

The second secret: data quality

Another major element of effective evaluation is the availability of reliable data. Here the independent evaluation had an incredibly difficult job to do. The progress the evaluators have made is impressive – for the first time matching a wide range of national data sets, local intelligence and qualitative surveys. But ultimately the data underpinning the evaluation is in places poor.

The evaluation couldn’t access data on anti-social behaviour from national data sets, as this is not recorded by the police. This is unfortunate given that the strongest evidence on the effectiveness of TFP-like (Family Intervention) programmes in the past concerns reducing crime and anti-social behaviour[iii].

A chunk of data came from the 152 local authorities. This data was more up to date (October 2015), although only 56 of the councils provided data – which enabled matching to around one quarter of TFP families. The evaluation report acknowledges that this data was “of variable quality”. For example, the spread of academy schools without a duty to co-operate meant there are significant gaps in school attendance data. This will be a serious problem for future evaluations unless academies’ engagement with the wider public service system is assured.

In summary, the TFP evaluation covered too short a period and, despite heroic efforts by DCLG and the evaluators, was based on data of very variable quality and completeness.

Next time we will explore the “impact” evaluation in more detail – looking at how designing a more experimental approach into this and future programmes could yield more robust evaluation conclusions of what works where.

[i] Sanderson, Ian. “Evaluation, policy learning and evidence‐based policy making.” Public administration 80.1 (2002): 1-22.

[ii] Nutley, Sandra, and Peter Homel. “Delivering evidence-based policy and practice: Lessons from the implementation of the UK Crime Reduction Programme.” Evidence & Policy: A Journal of Research, Debate and Practice 2.1 (2006): 5-26.

[iii] DfE, “Monitoring and evaluation of family intervention services and projects between February 2007 and March 2011”, 2011, available at: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/184031/DFE-RR174.pdf

 

 


 

Jason Lowther is a senior fellow at INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies.  Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther

 

So: does the Troubled Families Programme work or not? – Part Two

Jason Lowther

In this blog last week I outlined results of the “impact evaluation” element of the Troubled Families Programme (TFP) and the rather limited pre-existing evidence base the TFP had to be built upon. How can government build on existing evidence in designing its initiatives, and what can we do when there isn’t much in the evidence cupboard?

Many government programmes have the luxury of a relatively strong evidence base on which to build. The previous Labour government’s National Strategy for Neighbourhood Renewal and Sure Start programmes, for instance, could draw on decades of research (collated through the 18 Policy Action Teams) on urban initiatives and the impact of early years experiences on achievements in later life. These sometimes honoured the extant evidence more in the theory than in practice[i], but at least they had foundations on which to build.

As evaluations of the Labour government’s Crime Reduction Programme found[ii], it is a difficult task to translate evidence that is often “fragmented and inconclusive” into practical government programmes. People skilled at this task are in short supply in central government.

But in the case of the TFP, the most robust element of the existing evidence base was a single evaluation using a “control” of 54 families and focussed on addressing anti-social behaviour through Family Intervention Projects. What can government do when the evidence base is thin?

One strong tradition, particularly around medicine and around welfare policies in the USA, has been the idea of “experimental government” using social experiments to determine whether (and if so how) innovative approaches work in practice. For example, in the last three decades of the 20th century, America’s Manpower Demonstration Research Corporation (MDRC) conducted 30 major random assignment experiments involving nearly 300,000 people.

Historically, randomised controlled trials (RCTs) were viewed by many as the “gold standard” of evaluation by allowing statistically robust assessments of “causality” – whether observed changes are due to the intervention being evaluated. More recent thinking emphasises that evaluations need to be designed in the best way to create robust evidence and answer specific questions. Often this will involve a mixture of methods, both quantitative and qualitative. The TFP evaluation used a mixture of methods but without building in a “control” group of “troubled families” not yet receiving the TFP interventions.

Granger[iii] argued (for area-based initiatives) that the range and variety of initiatives and the scale of change in government mean that a strict statistical “control” is unfeasible. She argued that it is “virtually impossible” to achieve precise and clear-cut causal attribution, and that we need clear, strong theories as a basis for counterfactual reasoning and causal inference.

The TFP evaluation did not develop or test a “theory of change” for the programme. This is a pity, because rigorously testing a theory can help illuminate where and how programmes do (or don’t) have real impact.

There are several other lessons we can learn from the existing literature on evaluation in government, for example the importance of timing and data quality. We’ll look at these next time.

[i] Coote, Anna, Jessica Allen, and David Woodhead. “Finding out what works.” Building knowledge about complex, community-based initiatives. London: Kings Fund (2004), esp. pp. 17-18.

[ii] Nutley, Sandra, and Peter Homel. “Delivering evidence-based policy and practice: Lessons from the implementation of the UK Crime Reduction Programme.” Evidence & Policy: A Journal of Research, Debate and Practice 2.1 (2006): 5-26.

[iii] Granger, R. C. (1998) ‘Establishing causality in evaluations of comprehensive community initiatives’, New approaches to evaluating community initiatives, 2, pp. 221-46.

So: does the Troubled Families Programme work or not?

Jason Lowther

In this blog last week I outlined the roller coaster trajectory of the Troubled Families Programme in the media, from saviour of all England’s most “troubled families”, to a wasteful and failed £1bn vanity project in under five months.  This despite independent evaluators finding the programme has radically transformed support for these families, and the families themselves saying that it has worked for them.

In most government evaluations, that is where the story would stop.  Yet another tremendously successful project from Whitehall.  But the DCLG (with a little encouragement from Treasury) were much braver.  They wanted to know how many of these improvements would have happened in any case, even without the Troubled Families Programme (TFP).  This is a dangerous question to ask.  And quite a tough one to answer.


Troubled Families: Is Lily The Pink Dead?

Jason Lowther

Whatever happened to the Troubled Families Programme (TFP)?  Three weeks before last year’s general election purdah period, Communities Secretary Eric Pickles heralded the Government’s three year programme as “a triumph…[that will] turn around the lives of 120,000 of this country’s hardest to help families…[saving] the taxpayer over a billion pounds”.  After the election, Prime Minister Cameron announced the success rate had been 99% which had “saved as much as £1.2 billion in the process”.

Fast forward a few months and the headlines have taken a rapid reverse:
