A group of companies that advertised job opportunities through Facebook’s ad-serving platform discriminated against older members of the applicant pool, claims a proposed class action filed in the U.S. District Court for the Northern District of California. The filing suggests potential liability for any employer that posts jobs via ads targeting recipients based on demographic data.


As Facebook’s roughly two billion active users view, like, and share content, they give Facebook concrete information about their preferences and behaviors. Facebook’s ad platform leverages this data by allowing advertisers to reach the users most likely to find their ads relevant. Because Facebook also collects information on its users’ demographic characteristics, such as age, race, and gender, critics have noted the potential for discriminatory ad targeting. In a December 20 filing, a proposed class of older job seekers on Facebook argued that this discriminatory potential came to fruition when a group of employers (including Amazon, Cox, and T-Mobile) used an age-filtering feature for their job postings to target younger cohorts and screen out older ones, in violation of the Age Discrimination in Employment Act (ADEA).


Largely in response to concerns about the opacity of its ad targeting, Facebook offers a feature on each ad that allows users to ask, “Why am I seeing this ad?” Based on the targeting information revealed through such postings, class members claim they were screened off from job postings that reached younger Facebook users. Facebook, which was not named as a defendant, responded that its ads can be part of a broader media campaign by hiring employers and that targeting is a permissible part of a diversified hiring strategy. Facebook further noted that its ads are no different from TV and magazine ads, which inherently reach different demographics by virtue of their viewer and subscriber bases.


Several important implications from this filing:


  • Targeting may not equal discrimination, but it can get you sued.


The defendant employers likely share Facebook’s view that targeting is not per se discrimination. Whether or not this argument prevails, the filing shows that applicants scrutinize potential discrimination in posting criteria as well as in the hiring decision itself. The plaintiffs argue that Facebook ads are so ubiquitous and pervasive that being screened off from them is to be effectively eliminated from the applicant pool. Courts will have to decide whether targeting gives rise to ADEA liability, but before that question is settled, employers accused of targeting will be dragged into expensive, broad-ranging suits like this one if their postings facially favor a certain age group.


  • This is bigger than Facebook postings.


Facebook is not the only platform with targeted ads. A December 20, 2017 ProPublica and New York Times report highlighting potentially discriminatory employment ads found that Google’s AdSense and LinkedIn’s ads had the same age-filtering capability (LinkedIn has since eliminated this function). Going forward, employers posting through these and other, smaller ad-serving platforms can expect the same scrutiny from potential plaintiffs and their lawyers. The proposed class presently includes the Communications Workers of America, an international union representing 700,000 workers in the telecom, cable, IT, media, education, and public service sectors, but the groups affected by these targeted online ads putatively include job seekers and employers in every conceivable industry.


  • This age filter was just the one the plaintiffs could see—the next frontiers of litigation are the ones we can’t yet see.


This suit focuses on an overt age filter, one visible to the applicant-user. The direct connection between the targeting factor and the impermissible criterion (age) presents an ideal argument for the plaintiffs’ lawyers: where defendants affirmatively selected an age range to target, an intent to discriminate based on age can be inferred. The immediate result of this filing will likely be the removal of age-filtering options (following LinkedIn’s example), but this will not end the inquiry into targeted job postings. Ad-serving platforms hold a wealth of data that gives advertisers a subtler capacity to discriminate, and plaintiffs will dig deeper into the nuances of Facebook’s and other platforms’ algorithms to uncover more sophisticated discrimination, such as the use of permissible targeting factors as proxies for impermissible ones.


The diversity of the data Facebook collects creates the ability to use other factors as proxies for age. For example, if a tech company sought to favor millennial applicants and exclude older ones, it might target users who liked, viewed, and shared content about fidget spinners, vape pens, and avocado toast (or any number of factors that correlate strongly with millennial status), rather than flagging its intent by selecting the 18-38 age range. The filtering effect on who receives the ads might be the same, but the intent would be far harder to discern. Ads could also use proxies to target factors other than age, such as race, gender, or sexual orientation; targeting on those factors would by its nature be more likely to rely on coded methods than overt ones. Building a “proxy-factor” claim requires an understanding of an ad-serving platform’s targeting algorithm. If this class action proceeds to discovery, the key struggle will be between the plaintiffs’ push to lay bare the criteria of that algorithm and Facebook’s push to protect its and its advertisers’ proprietary information. The stakes of that struggle are the viability of future claims along these lines.


In the meantime, employers posting jobs online should avoid applying overt age filters to targeted ads on platforms like Facebook. Other sites with embedded ads may offer similar functionality, so double-check the targeting criteria you have selected before posting.