AI has quickly become indispensable for efficiency and productivity across industries. Yet, a recent CX Trends 2025 report highlights a largely overlooked issue: the unauthorized and uncontrolled use of AI tools by employees, a phenomenon known as Bring Your Own AI (BYOAI). The findings are alarming: a 188% surge in the use of unapproved tools within just five sectors, often without companies’ knowledge or consent. This trend places organizations in the United States and abroad at significant and unpredictable risk.
Based on surveys with more than 5,000 consumers and 5,500 executives across 22 countries, the research reveals a troubling picture: nearly half of customer service agents already rely on unauthorized AI tools in their daily work. While often driven by the pursuit of greater efficiency, this behavior creates substantial vulnerabilities for organizations. Much like the Bring Your Own Device (BYOD) trend once challenged corporate security, BYOAI now emerges as the next frontier in risk and compliance management.
The sector-level data underscores the scale of the problem, and frequency of use is equally concerning: 52% of respondents report regular use of unauthorized AI, 41% report frequent use, and only 7% say they rarely turn to it.
The root of the problem lies in the governance gap. Generative AI tools are advancing at extraordinary speed, widely available to employees while corporate policies, training, and approved alternatives lag behind. In this vacuum, BYOAI thrives, presenting an attractive but high-risk option that undermines security, compliance, and trust.
Unlike the European Union, the United States has not yet enacted a comprehensive federal law on AI governance. However, this absence of regulation should not be mistaken for an absence of risk. Companies remain fully exposed under existing U.S. legal frameworks, including data protection, consumer protection, cybersecurity, and intellectual property laws, as well as under state-level regulations such as the California Consumer Privacy Act and its amendments (CCPA/CPRA).
The risks posed by BYOAI are substantial: exposure of customer and confidential data to third-party models, violations of data protection and consumer protection laws, cybersecurity vulnerabilities introduced by unvetted tools, and the loss or contamination of intellectual property.
For U.S. businesses, the absence of a federal AI law is no excuse for inaction. On the contrary, it underscores the need for companies to take the lead in building robust internal governance frameworks. A comprehensive governance program should function as a corporate ecosystem of control, oversight, and accountability across the lifecycle of these technologies.
Key steps include mapping where and how employees already use AI, establishing a clear acceptable-use policy, providing approved and vetted alternatives, training staff on the risks of unauthorized tools, and assigning accountability for ongoing oversight.
In today’s business environment, failing to address BYOAI is not merely an operational oversight but a serious governance gap with legal, financial, and reputational consequences. While the United States still lacks a dedicated federal AI law, regulators and courts are already applying existing laws to risks arising from the use of these technologies. The message is clear: organizations must act now.
By implementing proactive governance, businesses can transform BYOAI from a shadow risk into a strategic advantage, fostering innovation while safeguarding compliance, trust, and long-term resilience.
Talk to an expert with proven experience who can help you identify your company’s data privacy needs.
Taking the first step matters. From the outset, an expert can help you determine which data privacy project best fits your company’s needs and which methodology to apply, avoiding wasted time and money.