
The Information-seeking Behavior of Recruitment Professionals (Part 1)

A look at how recruitment professionals search vast candidate datasets, how machine learning could disrupt the industry, and the results of our survey.

A few months ago I published a post describing our InnovateUK-funded research project investigating professional search strategies in the workplace. I’m pleased to say that the project has now finished, and we are currently analyzing the results. As you may recall, we surveyed a number of professions, but the one we are examining first is (cue drum roll)… recruitment professionals.

Yes, I know it’s a profession that information retrieval researchers haven’t traditionally given much thought to (myself included), but it turns out that these individuals routinely create and execute some of the most complex search queries of any profession, and deal with challenges that most IR researchers would recognise as wholly within their compass (such as query expansion and optimization, results evaluation, etc.).

What follows is the first of a series of posts summarising those results. So here’s Part 1, which focuses on the research methodology and background to the study. As usual, comments and feedback are welcome – particularly so from the recruitment community who are uniquely placed to provide the qualitative insight needed to accurately interpret this data.

1. Introduction

Research into how people find and share expertise can be traced back to the 1960s, with early studies focusing on knowledge workers such as engineers and scientists and the information sources they consult [1]. Since then, the process of finding human experts (or expertise retrieval) has been studied in a variety of contexts, both academic and industrial, and has become the subject of a number of organised evaluation campaigns (e.g. the TREC Enterprise track 2005-2008 [2] and the TREC Entity Track [3]). This has facilitated the development of numerous research systems and lab-based prototypes, and led to significant advances in performance, particularly against a range of system-oriented metrics [4].

However, in recent years there has been a growing recognition that the effectiveness of expertise retrieval systems is highly dependent on a number of contextual factors [Hoffman et al, 2010]. This has led to a more human-centred approach focused on the process of expertise seeking, where the emphasis is on how people search for expertise in the context of a specific task. These studies have typically been performed in an enterprise context, where the aim is to utilize human knowledge within an organization as efficiently as possible (e.g. [5], [6]).

However, there is a more ubiquitous form of expertise retrieval that exists outside of the enterprise, and embodies expert finding in its purest, most elemental form: the work of the professional recruiter. The job of recruitment professionals is to find people who are the best match for a client brief, and return a list of qualified candidates in the shortest possible time. They may not have privileged access to the research prototypes and systems referred to above, but their work involves the creation and execution of some of the most complex Boolean expressions of any profession. These include nested, composite structures such as the following:

Java AND (Design OR develop OR code OR Program) AND ("* Engineer"
OR MTS OR "* Develop*" OR Scientist OR technologist) AND (J2EE OR
Struts OR Spring) AND (Algorithm OR "Data Structure" OR PS OR
Problem Solving)
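To illustrate how such a nested expression behaves, here is a minimal Python sketch that evaluates a clause tree of this shape against a CV's text. The clause encoding and wildcard handling are our own simplification for illustration, not any particular search engine's semantics:

```python
import re

def matches(text, clause):
    """Recursively evaluate a nested Boolean clause against a document.

    A clause is either a string (a term, with '*' treated as a wildcard)
    or a tuple ('AND' | 'OR', [subclauses]).
    """
    if isinstance(clause, str):
        pattern = clause.replace("*", r"\w*")
        return re.search(r"\b" + pattern + r"\b", text, re.IGNORECASE) is not None
    op, subclauses = clause
    results = (matches(text, c) for c in subclauses)
    return all(results) if op == "AND" else any(results)

# The first example query above, encoded as a clause tree
query = ("AND", [
    "Java",
    ("OR", ["Design", "develop", "code", "Program"]),
    ("OR", ["* Engineer", "MTS", "* Develop*", "Scientist", "technologist"]),
    ("OR", ["J2EE", "Struts", "Spring"]),
    ("OR", ["Algorithm", "Data Structure", "PS", "Problem Solving"]),
])

cv = "Java engineer who can design and code scalable services using Spring; solid algorithm knowledge"
print(matches(cv, query))  # the sample CV satisfies every AND branch
```

The recursive structure mirrors the nesting the recruiter types by hand, which is part of why these queries are so laborious to build and debug.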

Or exhaustive enumerations of related terms such as:

("looking for" OR "in search of" OR "open to" OR "new job" OR
"actively pursuing" OR "pursuing new" OR "searching for" OR "new
opportunity" OR "new opportunities" OR "available for" OR "in
transition" OR unemployed OR "immediately available" OR "currently
seeking" OR "seeking new" OR "seeking a new" OR "interested in")

Or expressions containing index field lookups:

site:ca.linkedin.com "network engineer" "ccnp" "wan" "lan" "vancouver"
-intitle:"profiles" -inurl:"dir/ " -inurl:job|jobs|jobs2
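The `site:`, `-intitle:` and `-inurl:` operators restrict where and what the search engine matches. A rough simulation of their effect over hypothetical result records (the record fields and sample data are invented for illustration) might look like:

```python
def xray_filter(results, site=None, exclude_title=(), exclude_url=(), phrases=()):
    """Keep results on `site`, drop those whose title or URL contains an
    excluded term, and require every quoted phrase to appear in the body."""
    kept = []
    for r in results:
        if site and not r["url"].startswith("https://" + site):
            continue
        if any(t in r["title"].lower() for t in exclude_title):
            continue
        if any(u in r["url"].lower() for u in exclude_url):
            continue
        if not all(p.lower() in r["body"].lower() for p in phrases):
            continue
        kept.append(r)
    return kept

results = [
    {"url": "https://ca.linkedin.com/in/alice", "title": "Alice - Network Engineer",
     "body": "Network Engineer, CCNP certified, WAN/LAN design, Vancouver"},
    {"url": "https://ca.linkedin.com/pub/dir/smith", "title": "Smith profiles",
     "body": "network engineer ccnp wan lan vancouver"},
    {"url": "https://www.linkedin.com/jobs/view/123", "title": "Network Engineer job",
     "body": "network engineer ccnp wan lan vancouver"},
]
hits = xray_filter(results, site="ca.linkedin.com",
                   exclude_title=["profiles"], exclude_url=["dir/", "job"],
                   phrases=["network engineer", "ccnp", "wan", "lan", "vancouver"])
```

Only the first record survives: the second is a directory page and the third sits outside the target site, which is exactly the noise the negated operators are there to remove.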

Over time, many recruiters create their own collection of queries and draw on these as a source of intellectual property and competitive advantage. Moreover, the creation of such expressions is the subject of many social media forums (e.g. [7]), and the discussions that ensue involve topics that many IR researchers would recognise as wholly within their compass (such as query expansion and optimisation, evaluation, etc.). However, despite these shared interests, the recruitment profession has been largely overlooked by the IR community; and their search needs, behaviours and preferences remain relatively unknown. Even recent systematic reviews of professional search behaviour make no reference to this profession [8].

This paper seeks to address that omission. We report on a survey of 64 recruitment professionals, examining their search tasks, behaviours and preferences, and the types of functionality that they value. Wherever possible, we compare the results to those of previous surveys of search behaviour, notably Joho et al [9] and Geschwandtner et al [10]. The former is particularly salient as it concerns another profession which relies on the use of complex Boolean queries.

The rest of this paper is structured as follows. We first provide a brief overview of the candidate sourcing process and related studies of expertise seeking (Section 2). Then in Section 3 we describe the current study and in Section 4 present the results of the survey. The issues raised in Section 2 are reviewed in the context of these findings, and the implications for systems development are discussed, before concluding and summarizing the work in Section 5.

2. Background

In this section, we provide a brief overview of the different recruitment tasks and their related search and information retrieval challenges. As mentioned above, there is very little prior work investigating the recruitment profession from an information seeking perspective. However, there are surveys of other professions that share some of the characteristics of recruitment. For this reason, we compare our findings with Joho’s [9] survey of patent search users, as they also employ highly optimised, complex queries that have a repeatable effect when applied to a given database, and need to dynamically balance precision with recall for different search tasks. We also compare our findings with Geschwandtner’s [10] survey of medical professionals, as this constitutes a further recent, large scale survey of professional information seeking behaviour. In addition, we review the literature on related topics, such as web-based people search and expert finding within an enterprise setting.
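The precision/recall trade-off mentioned above can be made concrete with a toy example (the candidate counts are invented for illustration): a broad query retrieves more of the relevant CVs but more noise, while a narrow query does the opposite.

```python
def precision_recall(retrieved, relevant):
    """Standard IR definitions: precision = fraction of retrieved items
    that are relevant; recall = fraction of relevant items retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

# Suppose CVs 1-8 are the truly relevant candidates.
broad = precision_recall(retrieved=range(1, 21), relevant=range(1, 9))   # 20 hits
narrow = precision_recall(retrieved=range(1, 5), relevant=range(1, 9))   # 4 hits
print(broad, narrow)  # (0.4, 1.0) (1.0, 0.5)
```

A recruiter sourcing for a scarce skill will tolerate the broad query's low precision to avoid missing anyone; screening a flood of applicants pushes them the other way.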

Recruitment Tasks

Recruitment is the process of finding and attracting capable applicants for employment. It can be proactive (performing outbound activities to facilitate hiring) or reactive (managing inbound responses to specific job postings). In this study, we focus on the former activity, which is often referred to as sourcing.

A recruitment professional may spend approximately 27% of their time actively searching for candidates [19], and needs to rapidly evaluate candidate CVs [16]. On average they can be expected to place around two candidates per month [20]. The activities of recruitment professionals range from directly searching for candidates using job boards through to investigating profiles on social networks to make connections with candidates, as well as gaining broader market intelligence on behalf of clients.

Sourcing is a skill that is to some extent emulated by expert finding recommender systems, where machine learning is used to select the best-suited individual to perform a particular task [21]. Modelling a person’s ability to complete tasks is also a key factor in crowdsourcing platforms such as CrowdFlower or Mechanical Turk [22]. These techniques have been extended to much larger and noisier datasets on social networks, where a person’s connections can be used as selection variables [23]. This is essentially a microcosm of the sourcing task: find the individual(s) with the skills that best match the job description. However, recruiters must also take into account other variables such as availability, previous experience, remuneration, etc.
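As a deliberately simplified sketch of the matching step at the heart of sourcing, the snippet below ranks hypothetical candidates by the Jaccard overlap between their skills and a client brief. The names and skill lists are invented, and real systems such as those in [21]-[23] use far richer models:

```python
def skill_match(job_skills, candidate_skills):
    """Jaccard similarity between required and offered skill sets --
    a toy stand-in for learned matching models."""
    job = {s.lower() for s in job_skills}
    cand = {s.lower() for s in candidate_skills}
    return len(job & cand) / len(job | cand) if job | cand else 0.0

candidates = {
    "A": ["Java", "Spring", "SQL"],
    "B": ["Python", "Django"],
    "C": ["Java", "Spring", "J2EE", "Algorithms"],
}
brief = ["Java", "Spring", "Algorithms"]
ranked = sorted(candidates, key=lambda c: skill_match(brief, candidates[c]),
                reverse=True)
print(ranked)  # best skill overlap first: ['C', 'A', 'B']
```

The extra variables the paragraph mentions (availability, remuneration, experience) would enter as further features or filters on top of a score like this.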

Sourcing is also similar to people search on the web where the goal is to search greater volumes of unstructured and noisy data to return a list of individuals who fit specific criteria [24]. Likewise, the recruiter must apply additional factors to select a smaller, more manageable group of qualified candidates, with returned results needing to be normalised and disambiguated [2]. The gold standard for evaluation in this instance is recommending one or more candidates that successfully fulfil a client brief.

The Recruitment Industry

The recruitment industry is estimated to be worth nearly £30 billion per year in the EU [25] and nearly $100 billion in the US [26]. However, in these regions, as well as in the high-value industries of India and the Middle East, there is a surprisingly high level of client dissatisfaction, with 76% of businesses reporting that they do not get value for money from external recruitment companies [26]. Unfilled vacancies also have a high impact on the economy, costing the UK £18bn annually [27]. With 1,400 new recruitment agencies being set up in 2014 alone [28], the industry is becoming increasingly competitive. Recruiters need to improve their performance to match the expectations of their clients if they are to secure high-value placements.

This survey investigates the search behaviour of recruitment professionals and highlights some of the key commonalities and differences between related professional sectors.

3. Method

The survey instrument consisted of an online questionnaire of 40 questions divided into five sections. The survey was designed to align wherever possible with that of Joho et al [9], to facilitate comparisons between the two sectors (patent search and CV search). It also incorporated elements of Geschwandtner et al [10], so that some comparisons with health professional search would also be possible. The five sections were as follows:

  1. Demographics: The background and professional experience of the respondents, including age, gender, education, role, job title, and client type.
  2. Search tasks: The types of search task that respondents perform in their work, how often they perform them, and what resources they use.
  3. Query formulation: How they construct the search queries and what types of functionality they find valuable.
  4. Evaluation: How they assess and evaluate the results of their search tasks, and the challenges this entails.
  5. Your ideal search engine: Their views on any other features and functions additional to those described above.

The survey was designed to be completed in approximately 15 minutes. Prior to administering it, we engaged in a series of qualitative interviews with professional recruiters to inform how best to customise the survey instrument for the recruitment profession. We also piloted the survey with two recruitment professionals prior to launch, who provided valuable feedback and advice on its content and presentation.

To obtain a large and representative sample we sent out the survey to various interest groups via social media (e.g. LinkedIn) and also engaged the services of SurveyMonkey Audience, who administered it to their panel of HR professionals based in North America. In both cases, we included a qualifying question at the beginning (“Is your primary job function to recruit and hire professionals for your organization or for clients?”) so that non-recruiters could be excluded from the results. Answering no to this question would trigger immediate closure of the survey.

The survey opened on 09 June 2015 and closed on 01 August 2015. In total, we received 416 responses, of which 69 were complete. The majority of incomplete responses were due to failure to pass the qualifying question, so these cases contained no usable data. Five further responses were eliminated due to contradictory or nonsensical answers, leaving 64 complete responses. Because the number of individuals reached by the survey promotion is unknown, the participation rate cannot be determined.

4. Results

4.1 Demographics

We began our analysis by looking at the demographics of the recruitment profession. Of the 64 respondents, 69% were female and 31% male, with 54% of respondents aged between 25 and 45 years – a profile that is more female-oriented and younger than the patent and medical search survey respondents. In particular, there is a noticeable spike in the 25-31 age bracket:

[Figure 1: Age distribution of survey respondents]

The educational background of recruiters surveyed revealed that most were qualified with Bachelor’s degrees (60%) followed by Master’s degrees (29%). The most common degree subjects were professional/vocational (32%) and social science (19%).

Most respondents worked full time (91%), and the clients that they worked for were predominantly external (48%), i.e. outside of their organisation. The rest were either internal (34%) or both (17%). This contrasts sharply with patent searchers, whose clients were predominantly internal (88%).

Job titles for recruitment professionals differ greatly, with variations on recruitment, human resources, talent acquisition and personnel being typical. The most common single job title was recruiter (15%) followed by HR Manager (8%) and HR Generalist (7%).

The following figure shows the respondents’ experience as a recruiter along with their experience within the recruitment industry more generally. Most respondents have several years’ experience as a recruiter, with a median of around 10 years, which aligns with that of the patent searchers.

[Figure 2: Respondents’ experience as a recruiter and in the recruitment industry]

That’s it for this week. If you have comments or feedback, please share below. Next week we’ll cover:

  • Search tasks: The types of search task that respondents perform in their work, how often they perform them, and what resources they use.

  • Query formulation: How they construct the search queries and what types of functionality they find valuable.

Published at DZone with permission of Tony Russell-Rose, DZone MVB. See the original article here.
