Best Practices to Avoid Discrimination When Using AI-Based Digital Ads

Three kinds of AI ad tools can get you into trouble when applied to housing.

Savvy use of digital media can be a powerful marketing tool for owners, but it can also get them into fair housing trouble. The same artificial intelligence (AI) and machine learning algorithmic technologies that empower you to streamline and target your marketing can also be used, whether deliberately or inadvertently, to exclude groups the fair housing laws protect.

Last month, HUD issued important new guidance making clear that it's watching and intends to hold owners accountable for digital discrimination. In the last issue, we discussed the trend of third-party screening companies using AI to give owners a score or recommendation on whether to lease to an applicant. In this issue, we'll cover the potential for misuse when relying on AI platforms to market and advertise housing availability at your site. The new HUD guidance says that an "advertiser" at risk includes not just the operator of an ad platform (that is, a website, mobile application, or other channel) but also "entities or individuals placing advertisements" for rental housing on the platform. That includes owners and management companies.

Potential liability for discriminatory advertising is broad in scope. Problems occur when AI platforms fail to meet fair housing standards and, as a result, produce discriminatory effects on protected groups. We'll outline three kinds of AI tools that can get you into trouble when applied to housing. And we'll lay out five best practices you can follow to minimize your liability risks for discriminatory digital advertising.

How Digital Marketing Can Lead to Fair Housing Liability

The fundamental problem is that AI platforms may be wired to do things that violate fair housing rules, including targeting housing ads toward some groups and away from others that the discrimination laws protect. While this can be done deliberately, it can also happen through automated systems that are designed not around fair housing compliance but around making ad delivery more efficient in meeting an advertiser's objectives.

For example, because of the way it's programmed, a system may conclude that men are more likely than women to click on certain housing ads. Accordingly, it may direct those ads only to people it believes to be men, in violation of fair housing laws banning discrimination on the basis of sex.
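
To see how a small engagement gap can snowball into wholesale exclusion, consider the minimal sketch below. It simulates a delivery algorithm that keeps showing an ad to whichever group has clicked it most often so far; the click rates, group labels, and greedy-with-exploration logic are illustrative assumptions, not any real platform's system.

```python
# A minimal sketch of engagement-optimized ad delivery; all numbers are
# illustrative assumptions, not data from any real platform.
import random

random.seed(0)

TRUE_CTR = {"men": 0.030, "women": 0.025}   # assumed small engagement gap
clicks = {"men": 0, "women": 0}
impressions = {"men": 1, "women": 1}        # start at 1 to avoid divide-by-zero
EXPLORE = 0.05                              # occasional random exploration

for _ in range(100_000):
    if random.random() < EXPLORE:
        group = random.choice(list(TRUE_CTR))
    else:
        # Greedily show the ad to whichever group has the higher
        # observed click-through rate so far.
        group = max(TRUE_CTR, key=lambda g: clicks[g] / impressions[g])
    impressions[group] += 1
    if random.random() < TRUE_CTR[group]:
        clicks[group] += 1

print(impressions)  # men end up with the overwhelming majority of
                    # impressions: a half-point CTR gap becomes near-exclusion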

The HUD guidance lists other ways that ad targeting may constitute illegal discrimination against protected groups, including by:

  • Denying them information about available housing opportunities;
  • Discouraging or deterring them from applying;
  • Quoting them different and less favorable prices or rental conditions;
  • Steering them to particular neighborhoods or buildings; and
  • Targeting them for predatory products or services.  

As HUD acknowledges, the kind of fair housing transgressions that AI platforms commit can happen "without the advertiser's direction or knowledge, and can even frustrate an advertiser's intention that an ad be distributed more broadly." Even so, as advertisers, owners are legally responsible for these violations.

Digital Ad Technologies to Beware of

The HUD guidance discusses the discrimination risks associated with different tools that digital platforms typically use to enable advertisers to target the audience for their ads, as well as the systems they use to deliver them.

1. Audience categorization tools. Audience categorization tools allow for the segmentation and selection of potential audiences for an ad by gender, age, income, location, interests, activities, connections, and other categories. As the guidance explains, categorization tools come in different forms, such as drop-down menus, toggle buttons, search boxes, or maps. For example, the ad placement interface may display a toggle button prompting the advertiser to select “men” or “women” as the potential audience. Such tools may allow for both inclusion and exclusion of categories.
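
To make the mechanics concrete, here's a minimal sketch of what a targeting spec with inclusion and exclusion categories looks like, along with a guard that flags protected classes. The field names and the PROTECTED set are illustrative assumptions, not any real platform's schema.

```python
# A hypothetical targeting spec and a compliance guard; illustrative only.
PROTECTED = {"sex", "race", "color", "religion", "national_origin",
             "familial_status", "disability"}

ad_spec = {
    "listing": "2BR apartment, Springfield",
    "include": {"location": "Springfield metro", "sex": "men"},   # problem
    "exclude": {"familial_status": "families_with_children"},     # problem
}

def flag_protected_targeting(spec):
    """Return every include/exclude key that names a protected class."""
    return [(rule, key)
            for rule in ("include", "exclude")
            for key in spec.get(rule, {})
            if key in PROTECTED]

print(flag_protected_targeting(ad_spec))
# [('include', 'sex'), ('exclude', 'familial_status')]
```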

The guidance warns that using tools to segment and select audiences for ads on the basis of protected characteristics would violate fair housing laws by limiting protected groups' access to information about housing opportunities. In addition, using categorization tools to show different content to different groups on the basis of protected characteristics could "result in steering, pricing discrimination, or other discriminatory outcomes."

The HUD guidance explains how digital ad platforms generate the personal data they need to categorize people by race, gender, or other personal characteristics. In some cases, consumers self-identify and disclose this personal information when signing up for a product, making a purchase, or signing into their browser. In other cases, ad platforms infer personal characteristics from available data about a consumer's purchase or browsing history, activities, movements, and the people with whom the consumer interacts. HUD cites the example of a platform inferring that a consumer is female based on the person's past purchases of women's clothing.
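
HUD's clothing example is easy to express in code. The sketch below is an illustrative assumption of the simplest possible inference, keyword matching on purchase history; real platforms use far richer models, but the point stands: the characteristic is inferred, never self-reported.

```python
# A minimal sketch of inferring a protected characteristic from behavior.
# The keyword lists and scoring rule are illustrative assumptions.
FEMALE_SIGNALS = {"women's clothing", "maternity", "makeup"}
MALE_SIGNALS = {"men's clothing", "beard trimmer", "aftershave"}

def infer_sex(purchase_history):
    f = sum(item in FEMALE_SIGNALS for item in purchase_history)
    m = sum(item in MALE_SIGNALS for item in purchase_history)
    if f == m:
        return "unknown"
    return "female" if f > m else "male"

print(infer_sex(["women's clothing", "makeup", "groceries"]))  # 'female'
```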

2. Custom & mirror audience tools. The HUD guidance also cautions owners to be careful when using platforms that enable advertisers to deliver ads only to “custom” or “mirror” audiences.

Custom audience tools allow owners and other advertisers to deliver ads only to people included in their customer database. Owners typically obtain these lists from a data broker or other source and then upload them to the ad platform. Some custom audience tools identify consumers who've taken a specific action tracked by an advertiser or ad platform, such as visiting a particular website or making a particular purchase.

Mirror audience tools enable advertisers to find consumers who are similar to, or "mirror," the consumers on a customized list, called the "source audience." Using custom audience tools to advertise housing may violate the FHA when the underlying list is compiled on the basis of protected characteristics. Similarly, using mirror audience tools becomes problematic when mirroring is designed "to introduce, replicate or enhance" discriminatory exclusions or limitations in the source audience.

For example, suppose an owner whose current tenants are all white posts an ad and furnishes a source list of those tenants to an ad platform with an audience-mirroring application. The platform then generates an expanded list of other people who resemble the source list based on their online behavior. As a result, the mirror list includes only white people, denying non-white consumers access to information about the housing opportunity.
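
Here's a minimal sketch of how that mirroring can work under the hood, assuming a toy similarity-based expansion. Race is never an input to the matching, yet because the source audience's behavioral features correlate with race, the output reproduces the source's homogeneity. All data and the similarity rule are illustrative assumptions.

```python
# A minimal sketch of mirror ("lookalike") audience expansion.
from math import dist

# Behavioral features only (e.g., [golf_content, luxury_retail]); race is
# carried along purely to report the outcome, never used for matching.
source = [[0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]  # current (all-white) tenants
pool = [
    {"id": 1, "features": [0.88, 0.82], "race": "white"},
    {"id": 2, "features": [0.15, 0.20], "race": "black"},
    {"id": 3, "features": [0.82, 0.90], "race": "white"},
    {"id": 4, "features": [0.20, 0.10], "race": "hispanic"},
]

# Rank the wider pool by similarity to the source audience's centroid.
centroid = [sum(v) / len(source) for v in zip(*source)]
mirror = sorted(pool, key=lambda p: dist(p["features"], centroid))[:2]
print([p["race"] for p in mirror])  # ['white', 'white']
```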

3. Algorithmic delivery functions. In addition to audience selection, using AI to advertise housing creates potential fair housing problems stemming from how the advertising content is actually delivered. Ad platforms rely on sophisticated algorithms to determine which of the eligible ads to actually deliver, passing along the ads that they believe are the most likely to achieve the advertiser’s objective. The problem is that these algorithmic delivery functions may deliver ads based on a consumer’s race, sex, religion, or other protected characteristics.

The guidance notes that discrimination issues may also arise where ad platforms charge higher prices to show ads on the basis of protected characteristics, such as for advertising to women because of their purchasing history. Thus, an owner that thinks it’s just opting for lower rates may inadvertently end up directing its ads to men and away from women.
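
One way to spot this is to compute what a campaign actually paid per thousand impressions (CPM) for each demographic slice of its delivery report, as in the sketch below; the figures and the report format are illustrative assumptions, since platforms expose delivery data in varying detail.

```python
# A minimal sketch of a pricing-differential check; illustrative data only.
records = [
    {"sex": "female", "impressions": 2_000, "spend": 44.0},
    {"sex": "male",   "impressions": 8_000, "spend": 96.0},
]

for r in records:
    cpm = 1000 * r["spend"] / r["impressions"]
    print(f'{r["sex"]}: ${cpm:.2f} CPM')
# female: $22.00 CPM vs. male: $12.00 CPM. Women cost nearly twice as much
# to reach, so a spend-optimizing campaign will drift toward men.
```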

Best Practices to Prevent Digital Discrimination

HUD issued the guidance simply to ensure that owners are aware of their responsibility for the content of their advertising and of the glitches that can cause digital ads to violate the Fair Housing Act. As long as you understand the risks, there are things you can do to manage them.

The name of the game is to ensure that whatever digital advertising technology you use isn’t wired to limit or exclude groups on the basis of protected characteristics. Here are five best practices you can follow to achieve that objective and keep your digital ads compliant. 

Best Practice #1: Select the right digital platform. It’s not you, it’s the technology. More precisely, it’s how AI and machine learning systems are wired that can cause problems when applied to the advertisement of housing. And because those systems come from the platform, it’s imperative to select platform providers that you can trust.

According to HUD, “before using an ad platform, [owners] should ensure that they obtain necessary information and disclosures from the ad platform regarding how the platform mitigates risks” of liability for discriminatory advertising. Specifically, select a platform that runs ads for housing in a separate and specialized interface that’s designed to guard against discrimination in audience selection and ad delivery and offers audience targeting options for housing ads that don’t directly describe, relate to, or serve as proxies for FHA-protected characteristics. Also verify that the ad platform provider:

  • Performs regular end-to-end testing of its systems to ensure effectiveness in detecting discriminatory outcomes, for example, by running multiple ads for equivalent housing opportunities at the same time and comparing the demographics of the delivery audience (see the sketch after this list);
  • Proactively identifies and adopts less discriminatory alternatives for AI models and algorithmic systems;
  • Takes steps to ensure that algorithms are similarly predictive across protected groups and makes necessary adjustments to correct for disparities;
  • Ensures that ad delivery systems aren’t generating differential charges on the basis of protected characteristics or charging more to deliver ads to a nondiscriminatory audience; and
  • Documents, retains, or publicly releases in-depth information about ad targeting functions and internal auditing.
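
As a rough illustration of the end-to-end test in the first bullet, the sketch below compares who actually saw two equivalent housing ads run at the same time; the delivery counts are illustrative assumptions.

```python
# A minimal sketch of an end-to-end delivery audit; illustrative data only.
from collections import Counter

delivery_a = Counter({"men": 4_800, "women": 5_200})   # ad A's audience
delivery_b = Counter({"men": 8_900, "women": 1_100})   # ad B's audience

def share(counts, group):
    return counts[group] / sum(counts.values())

for ad, counts in [("A", delivery_a), ("B", delivery_b)]:
    print(f"ad {ad}: women saw {share(counts, 'women'):.0%} of impressions")
# ad A: women saw 52% of impressions
# ad B: women saw 11% of impressions. Equivalent housing ads shouldn't
# diverge this sharply; the gap signals a skewed delivery algorithm.
```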

Best Practice #2: Ensure audience selection tools don't discriminate. If the digital platform offers audience categorization functionality, ensure that those tools don't segment audiences based on consumers' protected characteristics or on "close proxies" that enable the platform to infer those characteristics. Consider using platforms, such as Meta and Google, that flag housing ads so that advertisers aren't offered audience categorization tools that pose potential FHA problems. Ask the platform provider whether it has this flagging function and regularly audits it.

Best Practice #3: Ensure custom & mirror audience tools don’t discriminate. HUD recommends that owners “carefully consider the source, and analyze the composition, of audience datasets used for custom and mirror audience tools for housing-related ads.” In other words, ensure that custom and mirror data isn’t compiled on the basis of protected characteristics. The guidance also advises making “considered use of any tools provided by the ad platform for evaluating the projected demographics of a targeted audience.” Seek out platform providers that are aware of the risks and have taken steps to disable these functions to guard against potential liability.
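
In practice, "analyzing the composition" of an audience dataset can be as simple as comparing each group's share of the list against a market baseline before uploading it. The sketch below assumes hypothetical baseline shares and a hypothetical 10-point tolerance; a real review would use local market demographics.

```python
# A minimal sketch of a pre-upload audience composition review;
# the baseline shares and tolerance are illustrative assumptions.
market_baseline = {"white": 0.55, "black": 0.25, "hispanic": 0.20}
custom_list = ["white"] * 180 + ["black"] * 12 + ["hispanic"] * 8

n = len(custom_list)
for group, expected in market_baseline.items():
    actual = custom_list.count(group) / n
    if abs(actual - expected) > 0.10:  # assumed tolerance threshold
        print(f"{group}: list {actual:.0%} vs market {expected:.0%} - review")
# white: list 90% vs market 55% - review
# black: list 6% vs market 25% - review  (hispanic is flagged too)
```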

Best Practice #4: Make sure the ad platform knows you're advertising housing. Ad platforms wired to minimize FHA liability risks should have some mechanism to alert the system that a proposed ad relates to housing and requires special treatment. So be careful to follow the ad platform's instructions to ensure that the mechanism and its special risk-mitigation measures activate.
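
For example, Meta's Marketing API requires housing advertisers to declare a special ad category, which switches the campaign into a restricted targeting mode. The sketch below shows the idea with a generic, hypothetical campaign payload; only the special_ad_categories field name is borrowed from Meta's documented requirement, and the validation helper is our own illustration.

```python
# A minimal sketch of flagging a campaign as housing-related.
# The payload shape is hypothetical; special_ad_categories mirrors
# Meta's documented housing declaration.
campaign = {
    "name": "Maple Court 2BR availability",
    "objective": "REACH",
    "special_ad_categories": ["HOUSING"],  # activates the platform's
                                           # housing risk-mitigation mode
}

def validate(campaign):
    # A housing ad without the declaration should never be submitted.
    assert "HOUSING" in campaign.get("special_ad_categories", []), \
        "Housing campaigns must be declared as such before launch."

validate(campaign)
```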

Best Practice #5: Monitor your own housing ads & campaigns. If possible, owners should monitor their own ads and advertising campaigns “to identify and mitigate discriminatory outcomes.” Thus, for example, a digital ad to which only white prospects from wealthy neighborhoods respond could be a signal that something is amiss in the audience selection or ad delivery function.  
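
A minimal sketch of that kind of self-monitoring, assuming hypothetical market demographics and respondent counts:

```python
# Compare who responds to an ad against the demographics of the market
# the ad should reach; all figures are illustrative assumptions.
market = {"white": 0.55, "nonwhite": 0.45}      # assumed market makeup
respondents = {"white": 58, "nonwhite": 2}      # who actually inquired

total = sum(respondents.values())
for group, expected in market.items():
    actual = respondents[group] / total
    print(f"{group}: {actual:.0%} of respondents vs {expected:.0%} "
          f"expected ({actual - expected:+.0%})")
# white: 97% of respondents vs 55% expected (+42%)
# nonwhite: 3% of respondents vs 45% expected (-42%)
# A skew this large is the "something is amiss" signal the guidance means.
```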
