Good AI governance supports tangible business objectives through four pillars. In this series of articles we're looking at each of those pillars in turn to establish a sketch of what good AI governance looks like:
- AI System Inventory
- AI Use Case Inventory
- AI Regulatory Crosswalk
- AI Governance Roadmap
We dove into the first pillar, how to approach an AI System Inventory, in our previous article. Today, we’ll deep-dive into how to approach an AI Use Case Inventory.
AI Governance Can Borrow From Privacy Compliance
While AI presents new and unprecedented challenges, it’s also amenable to tried-and-true approaches from related domains such as information governance, e-discovery and cybersecurity. The most relevant of these for your AI use case inventory efforts, however, is privacy compliance, which often governs the use of AI even when AI isn’t specifically mentioned.
In particular, U.S. state-level and global privacy regulations have a lot to say about automated and algorithmic decision-making, and those provisions aim squarely at organizations’ use of AI, whether or not AI is mentioned by name in the regulations, rules or laws.
It’s only natural, then, that we turn to our privacy efforts to guide our AI efforts and ask, “How are we using AI to achieve business goals?” In privacy, this is captured in a processing activity inventory. For AI, we might demystify it a bit and call it an “AI Use Case Inventory”: What are all the ways the business is using, or plans to use, AI to meet business objectives?
Given this, we should follow a playbook analogous to our privacy efforts: Identify the areas of the business where AI poses the highest risk, map their processes at a high level and document the AI-specific risks associated with them. Whether this lives in a spreadsheet, a Word doc or a SaaS privacy/GRC system is irrelevant; the important thing is that you make the effort and document it.
Get Specific With Your AI Use Case Inventories
One of the challenges with your AI use case inventory efforts will be the tendency to rely on generalizations; that is, documenting a use case at too high a level to be tied to specific requirements in laws, regulations and rules.
To combat this, get specific. Instead of “Hiring,” drill down a level: Do you mean candidate evaluation, interview assessments, post-interview selection, job offer tuning or something else?
If you’re having trouble getting specific, reach out to your privacy team to see if they’ve inventoried the processing activities for the functional area you’re working on. If they have, you can start from their processing activity inventory to develop your AI use case inventory. If they haven’t, you can enlist them to help in the effort: After all, if the use cases you’re documenting involve sensitive data or algorithmic/automated decision-making (or both), they would be in scope for your privacy team.
Once you do this, you’ll immediately notice that even seemingly unified use cases can be broken down into sub-use cases that will not only help you document them for your inventory, but also determine your regulatory and legal AI governance exposure.
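To make this concrete, here is a minimal sketch of what one row of such an inventory might look like as structured data. The schema, field names and the `needs_privacy_review` helper are all illustrative assumptions, not a prescribed format; the same information could just as easily live in a spreadsheet, as noted above.

```python
from dataclasses import dataclass, field

@dataclass
class AIUseCase:
    """One entry in an AI use case inventory (illustrative schema, not a standard)."""
    function: str                  # business function, e.g. "Hiring"
    use_case: str                  # specific sub-use case, e.g. "Candidate evaluation"
    description: str = ""
    involves_automated_decision: bool = False   # algorithmic/automated decision-making?
    sensitive_data: bool = False                # sensitive personal data involved?
    ai_systems: list = field(default_factory=list)  # links back to the AI system inventory

def needs_privacy_review(entry: AIUseCase) -> bool:
    """Flag entries in scope for the privacy team: anything touching
    sensitive data or algorithmic/automated decision-making."""
    return entry.sensitive_data or entry.involves_automated_decision

# "Hiring" broken down into specific sub-use cases rather than one vague row
inventory = [
    AIUseCase("Hiring", "Candidate evaluation",
              involves_automated_decision=True, sensitive_data=True),
    AIUseCase("Hiring", "Interview assessments",
              involves_automated_decision=True),
    AIUseCase("Hiring", "Job offer tuning"),
]

flagged = [e.use_case for e in inventory if needs_privacy_review(e)]
```

Note that once “Hiring” is split into sub-use cases, the first two entries are flagged for privacy review while the third is not, which is exactly the kind of distinction a single “Hiring” row would hide.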
Now What?
Once your AI systems and use cases are inventoried, you can determine your legal and regulatory obligations vis-à-vis AI. Yet as we’ve seen, there is a disconnect between typical corporate use-case descriptions and the applicability of laws, regulations and rules. You’ll need to dig deeper to get the use case details needed to determine accurate compliance obligations. Consider partnering with your privacy team to do so, as their work on processing activities will almost certainly contribute positively to your more AI-focused inventory efforts.
As above, “Hiring” is too general: Is AI being used for candidate evaluation, applicant sorting, interviewee scoring, hiring decisions or job offer specifics? Each of these can subject your organization to different laws, regulations and rules, and to different sections of each. Without knowing these specifics, you can’t take the next step of defining the compliance requirements that apply to your specific uses of AI.