Making Random Assignment Happen

Evidence from the UK Employment Retention and Advancement (ERA) Demonstration

By Robert Walker, Lesley Hoggart, Gayle Hamilton

Published by the UK Department for Work and Pensions

Random assignment is one of the most powerful tools available to researchers to determine whether a social policy works. By dividing people randomly into those who receive services (the programme group) and those who do not (the control group), any difference in outcomes observed between the groups — the programme's effect or 'impact' — can be confidently attributed to the new policy.

Despite its potential, this methodology has not been widely used in Britain to develop social policy. However, it is currently being applied in a six-district demonstration test of the Employment Retention and Advancement (ERA) programme. This initiative aims to help low-income lone parents and long-term unemployed people remain employed and advance in the labour market once they find work.

Over the course of about 15 months, Jobcentre Plus randomly assigned over 16,000 people, making the ERA evaluation the largest random assignment test of a social policy's effectiveness in the UK to date. What did it take to implement this methodology within local offices? How well did the random assignment process work? And what lessons does the ERA experience hold for the application of this technique to future social policy evaluations in Britain? This report, based on a special case study undertaken as part of the overall ERA evaluation, addresses these questions. It uses qualitative data collected primarily through on-site observations at Jobcentre Plus offices, telephone and in-person interviews with customers and staff, and staff focus groups.

The ERA Programme and Evaluation

ERA was conceived as a 'next step' in welfare-to-work policy, designed to help break the 'low-pay-no-pay cycle' common among low-wage workers. Whereas the current New Deal programmes concentrate on placing unemployed people who are receiving benefits into work and on short-term job retention, ERA aims to keep them working longer and help them advance.

ERA commenced on 27 October 2003 in six Jobcentre Plus districts (covering 60 local offices) in Scotland, Wales, London, North West England, East Midlands, and North East England. Together these areas encompass about 6 per cent of the UK's workforce. Three groups known to have difficulties in getting and keeping full-time work are eligible for ERA:

  • Out-of-work lone parents entering the New Deal for Lone Parents (NDLP)
  • Long-term unemployed people entering the New Deal 25 Plus (ND25 Plus)
  • Lone parents working less than 30 hours a week and receiving Working Tax Credit (WTC).

ERA programme group members can receive a combination of employment-related assistance from an Advancement Support Adviser (ASA) — a new advisory position in the six Jobcentre Plus pilot districts — for 33 months, including in-work support. ASAs are to help their customers find suitable work, solve work-related problems, and advance in their jobs. Customers working 30 hours a week or more may receive a work retention bonus that pays up to £2,400 over their time in ERA. More financial support is available for training and to help with emergencies that might compromise an individual's ability to stay in work. 

The ERA research will eventually determine whether people in these three target groups who are randomly assigned to the programme group remain longer in paid work and enjoy better pay and conditions compared with their counterparts in the control group. In addition to this assessment of the programme's effectiveness, the evaluation includes a detailed process study to understand how ERA is implemented, a cost study to establish the resources required to operate the programme, and a cost-benefit analysis to determine value for money. The evaluation is being conducted by a research consortium, which is working closely with the UK Department for Work and Pensions, the evaluation sponsor. The consortium includes three British organisations — the Policy Studies Institute (PSI), the Office for National Statistics (ONS), and the Institute for Fiscal Studies (IFS) — and MDRC, a US-based non-profit social policy research firm that leads the consortium.

Key Findings on Random Assignment

Although the scale of ERA random assignment, conducted across 60 Jobcentre Plus offices, was unprecedented in the UK, the process proceeded well. This key finding helps to ensure that the results produced by the ERA evaluation will be trustworthy. More generally, it establishes that random assignment is practical in a UK context, and thus has encouraging implications beyond ERA.

While there were no significant problems with ERA random assignment, the process was not without glitches, as might be expected given its scale. In the following sections, the issues encountered in ERA are used to point to ways that future random assignment studies in the UK can be improved.

  • ERA successfully met two critical random assignment challenges: (1) creating a sufficiently large research sample of virtually identical programme and control groups, and (2) doing so with most customers and staff viewing the process as justified and fair.

For both New Deal target groups — the focus of this case study — the staff presented the ERA offer and explained random assignment during the same intake interviews in which they presented information on the New Deal programme. In the course of these interviews, staff were expected to encourage — but not coerce — eligible customers to agree to the random allocation procedures giving them a 50-50 chance of gaining entry to ERA. The fact that more than 16,000 people took up this offer attests to the appeal of ERA and indicates that random assignment did not dissuade them from applying for the programme. 

A comparison of measured background characteristics of the people in the programme and control groups shows that, as intended, their characteristics do not systematically differ and provides evidence that the allocation of people to the two groups was, in fact, random. Some staff reported feeling disappointed when random assignment designated a customer to the control group whom they felt would benefit from ERA. In general, though, staff acknowledged the potential value of assessing ERA using this method. Notwithstanding some initial scepticism, they also believed that the assignment of people to the two research groups was truly random and thus fair. Similarly, many customers assigned to the control group felt some disappointment that they could not participate in ERA, but, overall, members of the control group believed that the process was fair.
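The balance check described above can be illustrated with a small simulation. This is a hypothetical sketch only (the evaluation's actual assignment software and data are not described here): it randomly splits a sample 50-50 and compares the mean of one made-up background characteristic, age, across the two groups.

```python
import random
import statistics

random.seed(7)

# Hypothetical sample of 16,000 customers, each with one measured
# background characteristic (age). The values are invented for illustration.
customers = [{"id": i, "age": random.randint(18, 60)} for i in range(16000)]

# 50-50 random assignment to the programme or control group
for c in customers:
    c["group"] = "programme" if random.random() < 0.5 else "control"

programme_ages = [c["age"] for c in customers if c["group"] == "programme"]
control_ages = [c["age"] for c in customers if c["group"] == "control"]

# With a sample this large, the mean background characteristic should be
# nearly identical in the two groups, which is what makes any later
# difference in outcomes attributable to the programme itself.
print(statistics.mean(programme_ages))
print(statistics.mean(control_ages))
```

In a real evaluation this comparison would cover many characteristics and use formal statistical tests rather than an eyeball check, but the principle is the same: randomisation, not matching, is what produces the near-identical groups.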

  • There were tensions between ERA and the normal Jobcentre Plus job entry targets.

The broader institutional context in which random assignment operates can potentially affect who within the new policy's target group flows into the random assignment funnel. In the case of ERA, the Jobcentre Plus job-entry targets for New Deal customers had the potential to work at cross-purposes with the goal of recruiting as many customers as possible into the study. For example, New Deal Personal Advisers conducting intake stood to retain or lose customers who might help or hurt their ability to meet job-entry goals. Thus, an Adviser might be tempted to de-emphasise the benefits of ERA to a job-ready customer if joining ERA would have meant moving that customer onto another staff member's caseload, because the original New Deal Adviser would not receive credit when the customer went to work. There is little reason to believe that this was more than a small problem or that it fundamentally compromised the extent to which the study's sample is representative of customers in ERA's target populations.

  • Integrating ERA discussions into intake interviews covering many other topics, plus the staff's interest in minimising the control group's disappointment, often left customers with an incomplete understanding of the ERA offer.

Staff performing New Deal intake interviews who understood ERA well, especially ASAs, generally took a very positive view of the policy and found it easy to promote. However, given all that staff had to accomplish during the intake interviews, they found it difficult to spend as much time on ERA and random assignment as they would have liked. Giving relatively full explanations of all topics in a single interview resulted in intake sessions lasting 90 minutes or more. More often, staff curtailed the time they spent explaining ERA or the time that customers spent reading an ERA fact sheet and/or the consent form for random assignment. In some instances, staff gave ERA only cursory coverage, sometimes introducing it only in the last five minutes of an intake interview.

The 'information overload' commonly experienced by customers during intake interviews, and the often-truncated explanations of ERA, left many customers with less than a full understanding of what exactly they were being offered at intake. Adding to this lack of clarity, staff had to balance 'selling' ERA in order to maximise take-up against minimising the disappointment of those allocated to the control group. This led in particular to the belief among most staff early in the recruitment period that they should not mention the programme's financial incentives component.

  • Although staff observed the formalities of the informed consent process, many customers did not fully understand to what they had consented. However, the design of the study was such that this did not put them at risk of obvious harm.

Although staff in the focus groups felt that they imparted a great deal of ERA-related information in intake interviews, this was not the case in all the intake interviews observed. While the researchers confirmed that usually the formalities of informed consent were scrupulously observed, post-intake research interviews suggest that many customers remained unclear about what ERA offered when they signed the consent form. Many could not articulate the different consequences of being assigned to the programme or control groups. Most likely, this was because explaining the New Deal often took up the most time. Other reasons suggested by staff included the complexity of the information that they needed to convey, customer indifference and antipathy, and inadequate training for some New Deal staff. 

From an ethical standpoint, the minimum information that a potential study participant must know before consenting to take part in a research study depends on the level of risk involved in participating. For ERA, that level was very low, because consenting to join the study and then being randomly assigned to the control group would put people in a situation not substantially different from what they would have faced had they refused to participate. Thus, customers' incomplete understanding was probably sufficient to protect their rights as research subjects. While very little research has been done in this area, there is no reason to believe that the degree of pre-consent understanding in ERA was lower than in other research studies of this type.

  • Customers rarely objected to the idea or process of random assignment, and those assigned to the control group were not deeply troubled by this outcome.

While many customers may not have clearly understood the consequences of the outcome of random assignment, they rarely objected to the idea of it. Overall, only 8.4 per cent of all ERA-eligible customers to whom the random assignment offer was made during New Deal intake interviews refused to take up the invitation to apply for ERA. In general, customers understood that they had a 50-50 chance of being assigned to the programme. And among those allocated to the control group, few protested, displayed anger, or were visibly upset by that result.

  • Implementing random assignment presented staff with many challenges for which they felt they had not always been fully prepared.

All staff received basic training in the procedural aspects of the ERA intake, as well as information about the programme and the ERA evaluation, including random assignment and why ERA is a demonstration project. ASAs received an additional day's training on the content of the programme. With any type of training, however, information is often understood and absorbed less fully than intended, and in this regard ERA was no exception. Staff involved in ERA intake were often uncertain about how to market ERA and respond to customers' questions; some were also uncertain about how to manage customers' reactions to the result of random assignment.

  • Computer difficulties complicated the random assignment process.

Technological problems often made it difficult for staff to complete the ERA intake interview in a timely manner. The Jobcentre Plus Internet system was relatively slow even before ERA was launched, and during the early part of the demonstration it could not reliably support access to the online ERA form used to collect customers' demographic information. The resulting very slow response times complicated efforts both to collect information for random assignment and to make the assignment itself. If the Internet was unavailable, staff could contact another office or a Technical Adviser to continue the random assignment process, but this contingency, which often required a lengthy phone call, was time-consuming. These problems were reduced substantially later in the demonstration, following upgrades to the Jobcentre Plus Internet system.


Acknowledging the challenges faced in conducting random assignment in ERA, while recognising the overall success of the process, the report offers the following recommendations:

  • In designing a voluntary random assignment study, take steps to ensure that — as much as possible — normal institutional performance targets do not undermine study participant recruitment.

  • Provide clear guidance and careful training to staff on how to encourage participation in the study while respecting a person's right of refusal.

  • Determine the minimum amount of information that candidates must know in order for them to give informed consent, and design procedures to ensure uniform communication of this information. (Further research into the degree to which participants in other studies understand what they are being asked to consent to, and why, would be valuable background.)

  • Provide frontline staff with training that incorporates 'real-life' situations they are likely to encounter in explaining random assignment to customers, and address their questions and concerns about the ethics of the process.

  • Whenever feasible, conduct a pilot test of random assignment procedures before launching full-scale random assignment. This would allow for corrective actions to be taken to improve systems, procedures, or staff training — before many people are enrolled in the study.

  • Conduct ongoing systematic and comprehensive reviews of intake and random assignment procedures in all offices, to ensure that the random assignment process is working properly.

  • Ensure that the IT applications and Internet access that frontline staff depend on are reliable and fast.

Walker, Robert, Lesley Hoggart, and Gayle Hamilton. 2006. Making Random Assignment Happen. New York: MDRC.