
From the advent of the eight-factor analysis, availability analyses have been a central part of all federal affirmative action plans. More than ten years ago, the U.S. Department of Labor’s Office of Federal Contract Compliance Programs (OFCCP) moved from an eight-factor analysis to a two-factor analysis, one that examines only the external population that may be available for jobs and the internal population that may be able to enter jobs. However, the central idea behind any availability analysis remains the same: it should provide a reasonably accurate picture of the percentage of minorities and females who are available for positions in any particular job group.
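The two-factor computation itself is simple arithmetic: an external availability percentage (Factor 1) and an internal availability percentage (Factor 2) are combined using value weights that sum to 100%. A minimal sketch, using hypothetical numbers and a function name of our own invention:

```python
def final_availability(external_pct, internal_pct, external_weight):
    """Combine Factor 1 (external) and Factor 2 (internal) availability.

    The two value weights must sum to 1.0; both percentages refer to the
    same demographic group (e.g., females) within one job group.
    """
    internal_weight = 1.0 - external_weight
    return external_pct * external_weight + internal_pct * internal_weight

# Hypothetical job group: 38.0% female in the external labor market,
# 22.0% female in internal feeder pools, 70% of openings filled externally.
print(round(final_availability(38.0, 22.0, 0.70), 1))  # 33.2
```

The arithmetic is trivial; as the rest of this article argues, the difficulty lies in the inputs: the census figures behind Factor 1, the feeder pools behind Factor 2, and the weights themselves.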

As the U.S. Census Bureau enters the final stages of preparing for the release of new census data that will be used in availability analyses, it is worth taking a moment to ask an important question:

  • Do the availability analyses found in affirmative action plans actually provide reasonably accurate information?

Having prepared availability analyses for thousands of affirmative action plans (AAPs), we would suggest that they do not. This is despite the fact that OFCCP has rarely raised any kind of question about the availability analyses in the AAPs we have prepared during hundreds of OFCCP compliance reviews. Rather than reaffirming the idea that availability analyses are an effective tool for understanding a company’s demographics, our experience has led us to believe that these analyses contain a number of inherent flaws that make them virtually useless for any kind of meaningful statistical study.

Issues with Census Data

Many of the inherent flaws in availability analyses are associated with issues in the census data made available from the U.S. Census Bureau. The census data currently in use in federal affirmative action plans is data from the 2000 census. EEO data from the 2010 census has not yet been made available.

As had been done after previous decennial censuses, the Census Bureau made demographic data from the 2000 census on employment in specific job categories available in a special EEO file. Here are some of the issues associated with the data in that special EEO file.

  • The most obvious issue with any of the data from that 2000 census file is that it is now extremely old. Major demographic shifts in the American population have resulted in a demographic picture that is far different from the picture portrayed in the 2000 data.
  • The 2000 EEO file is an abstraction of data taken from census surveys. Only a portion of the population was asked specific questions about employment, and from these results the Census Bureau extrapolated results for the entire U.S. population. While this form of statistical extrapolation may work well for a large, relatively homogeneous population, it leads to less accurate results when there are a multitude of variables that may affect the data. In the case of the EEO file, there are multiple race categories, various undefined jobs and job types, multiple geographic areas, and other variables that add to the complexity of the data.
  • There were 472 census categories used in the 2000 EEO file to report on all types of jobs in the United States. All positions recorded by respondents who completed the employment portion of the 2000 census were compiled into one of these 472 categories. While this may seem like a large number of categories, there are actually far too few categories to allow for meaningful analyses. This is evident in areas such as manufacturing, where, for example, census category 814 (Welding, Soldering, and Brazing Workers) contains a huge number of different types of welding positions. Contractors that employ welders have routinely complained that this category fails to differentiate the many welding specialties and sub-specialties that exist, and fails to reflect the experience level required for various types of welding positions. Similarly, census category 775 (Miscellaneous Assemblers and Fabricators) contains a huge variety of dissimilar jobs that have fundamentally different requirements and demographic patterns. Small plastics assembly positions, where there tend to be far more minorities and females, and large vehicle assembly positions, where there tend to be few qualified minorities and females, are both included in this census category. Among professional positions, census category 143 (Industrial Engineers, Including Health and Safety) includes a wide variety of positions that may or may not require an industrial engineering degree. Further complicating the use of this census category, companies may use the job title “Industrial Engineer” to reflect many different types of job duties. Thus, the percentage of minorities and females in census category 143 may greatly overstate or understate the actual availability of minorities and females depending on the nature of the open position, the persons who were assigned to this census category, and so on.
  • Federal contractors are allowed to select from various geographic areas when determining what census data to use. There is no concrete way to define what the proper recruitment area is for many jobs. With today’s use of the internet as a recruitment tool, applicants from distant geographic areas may express interest in any job, including entry-level jobs. However, it may not make sense to use a broad-based geographic area for comparison purposes when a huge majority of employees are drawn from local markets. OFCCP at times will argue that recruitment areas should either be broadened or narrowed to reflect a demographic population that has a greater number of minorities and females, leaving contractors in a difficult position.
  • The 2000 EEO file allowed federal contractors to choose industry-specific data for use in availability analyses, but only under certain circumstances. Contractors were only allowed to choose the United States as a whole, certain counties in certain states, and certain major cities when using industry-specific data in an availability analysis. There was no effective way to use industry-specific data on one state, a combination of states, or on a region within a state that did not have a large central city. Industries were aggregated into 88 categories, so that it was almost impossible to gather industry-specific data on one particular industry. Thus, even industry-specific data, which should have been among the most refined and relevant data, had flaws.
  • Finally, census data was not helpful for the circumstances when a major employer was centered in a small geographical area. It is not unusual for one employer to be dominant within a city or county. When this is the case, census data for that city or county will reflect employment levels at the one large company, creating a situation where goals based on external census data become meaningless.

We currently do not know exactly how EEO data from the 2010 census will be released. However, initial reports suggest that some of these inherent flaws from the 2000 census will persist in the 2010 EEO data. Based on information received from the Census Bureau, we have significant concerns about the ability of federal contractors (and OFCCP compliance officers) to effectively use the EEO data coming out of the 2010 census for ANY purpose.

Issues with Internal Feeder Groups and Value Weights

The problems associated with the use of census data are not the only issues with availability analyses. There are also serious problems with the determination of the feeder groups to be used in Factor 2 of the availability analysis.

  • Many federal contractors routinely use entire job groups as feeders for other job groups. While this is a quick and simple way to assess internal feeder pools, it is rare that one job group as a whole feeds the positions in another job group. For example, some but not all employees from a Professionals job group may feed positions in a Managers job group. Using the entire job group may skew the data.
  • Even when federal contractors try to attach individual positions to other positions, the data may give an inaccurate picture of the persons who are proper candidates for open positions. It is not unusual for one or more persons in a job title to be excellent candidates for advancement to another position, while one or more persons in that same job title are unacceptable candidates. A truly accurate availability analysis would show only the best qualified candidates as feeders.
  • Every analysis of internal availability is affected by the many changes that occur within an organization in any given year. Positions are eliminated; employees change positions; employees leave and new people are hired. Any of these actions may have a significant effect on the demographics and the proper feeders associated with Factor 2.
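The skew described above is easy to see with hypothetical numbers: whether Factor 2 counts an entire feeder job group or only the genuinely promotable subset can change the internal availability figure dramatically. A sketch (all counts invented for illustration):

```python
# Hypothetical feeder data for a Managers job group. Each tuple is
# (employee_count, female_count); the numbers are illustrative only.
whole_job_group = [(40, 18)]     # the entire Professionals job group
qualified_subset = [(12, 3)]     # only those actually promotable to Manager

def internal_availability(pools):
    """Percent female across the chosen feeder pools (Factor 2)."""
    total = sum(count for count, _ in pools)
    female = sum(f for _, f in pools)
    return 100.0 * female / total

print(round(internal_availability(whole_job_group), 1))   # 45.0
print(round(internal_availability(qualified_subset), 1))  # 25.0
```

In this invented example, treating the whole job group as the feeder nearly doubles the Factor 2 figure relative to the pool of employees who are realistic candidates, illustrating how the choice of feeder definition, not the underlying workforce, can drive the result.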

Federal contractors can do their absolute best to develop proper statistics for use with Factors 1 and 2 in an availability analysis, and then run into a final problem when assigning value weights to these factors. Here are the types of questions associated with assigning value weights:

  • Should employers rely on a historic assessment of how positions are filled in determining value weights?
  • What if there have been significant changes in the workforce?
  • What if the company has decided to change its historical approach to filling open positions?
  • What if there have been significant changes in the qualifications for the positions in any particular job group?

Lurking behind all these questions is an important question that should have been evaluated before the preparation of any availability analysis was started: Are the company’s availability analyses associated with a good and viable job group structure that will withstand scrutiny by OFCCP and place the company in the best possible light during an OFCCP compliance review?

Considerations for Federal Contractors

If it is true, then, that there are multiple inherent flaws in availability analyses, what steps should federal contractors take? First and foremost, federal contractors should treat the preparation of availability analyses as a series of strategic decisions. Contractors should ask the following types of questions:

  • “Have we developed an effective job group structure?” If the job group structure is ineffective, the availability analysis will, by definition, be unhelpful. Developing job groups should be looked on as a strategic process where the federal contractor is trying to create a structure that both reflects and protects the company.
  • “Can we defend the data we have included in our availability analyses?” Decisions on which census categories and census areas are used for Factor 1, what internal feeder groups are used for Factor 2, and how final value weights are calculated should be supported by information about the nature of the contractor’s positions and data the contractor has collected about these positions.
  • “What kind of picture of this company are we trying to portray to OFCCP?” Availability analyses can be manipulated to increase or decrease the percentage of minorities and females through various legitimate means, and contractors then have multiple tests to use to establish placement goals. Availability analyses that lead to more placement goals in an AAP may demonstrate to OFCCP a contractor’s desire to increase the number of minorities and females in the workforce, and thus may work to the contractor’s benefit. Contractors that are making extensive outreach efforts and that effectively consider minority and female candidates should not be afraid of having placement goals. However, more placement goals may also suggest to OFCCP that the contractor has failed to make sufficient outreach efforts.
  • “How are we portraying our availability analyses to individuals who may review these analyses?” There are multiple audiences that may examine a contractor’s availability analyses. Managers and other employees inside the company may be skeptical of availability numbers that do not correspond to their experience in finding candidates. OFCCP compliance officers may question availability figures that they believe are too low. Plaintiff’s lawyers may use availability analyses to suggest that there is some form of discrimination occurring. It is important to explain to each audience what the numbers mean, how they are used, and why availability analyses are inherently flawed.

One final point worth considering: many federal contractors believe (and many AAP vendors suggest) that availability analyses and the placement goals derived from them are the most important part of any affirmative action plan. They are not. As discussed above, there are many inherent problems with even the best availability analyses. More important is the recognition that OFCCP is very focused on other subjects: outreach, compensation, veterans, and persons with disabilities. Only outreach efforts have any kind of tie to availability, and even that tie is tenuous when one considers that placement goals are set only for minorities and females, while companies are expected to make outreach efforts to find minorities, females, veterans, and persons with disabilities.

Did you know… that OFCCP compliance officers will occasionally compare applicant pools to census data to suggest possible discrimination in recruiting or selection? In light of the significant problems associated with the 2000 census data, especially the age of the data, federal contractors should strenuously object to any such comparison.

For more information on conducting availability analyses or for information on the various flaws in availability analyses, contact Bill Osterndorf at [email protected]. Additional information about the 2010 census data that will be used in affirmative action plans can be found at

Please note: Nothing in this article is intended as legal advice or as a substitute for any professional advice about your organization’s particular circumstances. All original materials copyright © HR Analytical Services Inc. 2012


