Shadow AI Is Already Inside Your Fort Worth Business. Here Is How to Find It.
There is a good chance your employees connected an AI tool to your company's Microsoft 365 this week. They did not ask permission. They did not read the fine print. They were trying to write a proposal faster, summarize meeting notes, or clean up a spreadsheet. They clicked "Allow" on an OAuth prompt and moved on with their day.
That single click may have given a third-party AI application permission to read their email, access their OneDrive files, and pull data from your SharePoint. For a Fort Worth construction company, that is a mess to clean up. For a Fort Worth healthcare practice handling patient records, it is a potential HIPAA violation that could trigger an investigation.
This is shadow AI, and according to Gartner, 40% of organizations will experience a security or compliance incident caused by it before 2030. The problem is not theoretical. It is already happening in Fort Worth businesses right now, and most owners have no idea.
What Shadow AI Actually Looks Like in a Small Business
It Is Not What You Think
When most people hear "shadow AI," they picture an employee sneaking around doing something they should not. That is not what we see. What we see, every time we audit a new client's environment, is well-meaning employees who found a tool that made their job easier and connected it without realizing what they gave it access to.
Here are the most common ones we find in Fort Worth businesses:
Browser extensions that use AI to write "smart replies" in Outlook. These extensions typically request permission to read all email in the account. That means every message, including ones with client financials, employee records, or protected health information, is being processed by a server the business does not control.
Meeting transcription apps that record calls, send audio to a third-party server, and generate AI summaries. In a healthcare practice, those meetings often include discussions about patient care, diagnoses, and treatment plans.
ChatGPT plugins and similar tools that employees connect to OneDrive or Google Drive so they can "ask questions about documents." The AI gets read access to every file in the connected account.
AI scheduling tools, AI writing assistants, AI note-taking tools. The list grows every week.
The OAuth Problem
The mechanism behind all of this is OAuth, the permission system that lets third-party apps access your Microsoft 365 or Google Workspace without needing your actual password. When an employee clicks "Allow" on one of these prompts, they are granting the app a set of permissions that persist until someone revokes them.
Most employees do not read what they are granting. Most IT providers are not monitoring what gets granted. The result is a growing pile of third-party applications with access to your company's data, and nobody keeping a list.
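To make this concrete, here is a minimal sketch of what enumerating those lingering grants looks like programmatically, using the Microsoft Graph "oauth2PermissionGrants" endpoint (which lists delegated consent records). It assumes you already have a Graph access token with Directory.Read.All, held in a hypothetical GRAPH_TOKEN environment variable; acquiring the token is out of scope here.

```python
"""Sketch: enumerate delegated OAuth grants in a Microsoft 365 tenant.

Assumes a Microsoft Graph access token with Directory.Read.All in the
(hypothetical) GRAPH_TOKEN environment variable.
"""
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def fetch_grants(token: str) -> list[dict]:
    """Pull the tenant's oauth2PermissionGrants (delegated consents)."""
    req = urllib.request.Request(
        f"{GRAPH}/oauth2PermissionGrants",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

def summarize_grants(grants: list[dict]) -> list[dict]:
    """Flatten each grant into the fields an auditor cares about."""
    out = []
    for g in grants:
        out.append({
            "client": g.get("clientId"),      # service principal object id
            "consent": g.get("consentType"),  # "AllPrincipals" = tenant-wide
            "scopes": (g.get("scope") or "").split(),
        })
    return out

if __name__ == "__main__":
    import os
    token = os.environ["GRAPH_TOKEN"]
    for row in summarize_grants(fetch_grants(token)):
        print(row)
```

Each row in the output is one standing consent: which app holds it, whether it applies to every user or just one, and exactly which permissions it carries. That list is the inventory nobody has been keeping.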
Research across 500 companies, cited by Gartner, found that 68% of employees now use unauthorized AI tools at work, up from 41% in 2023. Adoption is accelerating.
How to Find Shadow AI in Your Environment
Start With Your Microsoft 365 Tenant
If your Fort Worth business runs Microsoft 365 (and most do), the first place to look is the Enterprise Applications blade in Microsoft Entra ID (formerly Azure Active Directory). This shows every third-party app that has been granted OAuth consent by any user in your organization.
Here is what to check:
App permissions inventory. Go to Microsoft Entra ID, then Enterprise Applications, then filter by "All Applications." Sort by date added. Anything you do not recognize needs investigation. Pay special attention to apps with "Mail.Read," "Files.ReadWrite.All," or "User.Read.All" permissions.
User consent settings. By default, Microsoft 365 allows any user to grant consent to third-party apps. For regulated businesses, this should be locked down so that only admins can approve new app connections. If your tenant still allows user consent, every employee is one click away from granting an AI tool access to company data.
Conditional access policies. Even with admin consent required, you need conditional access policies that restrict which apps can access sensitive data based on user role, device compliance, and location. CISA's guidance on securing cloud business applications is a good starting point for understanding what these controls should look like.
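As a rough first pass over that app inventory, the broad permissions called out above can be flagged automatically. A minimal Python sketch (the scope list here is illustrative, not an exhaustive policy):

```python
# Sketch: flag the broad Microsoft Graph permissions called out above.
# RISKY_SCOPES is an illustrative starting list, not a complete policy.
RISKY_SCOPES = {
    "Mail.Read",            # read every message in the mailbox
    "Mail.ReadWrite",
    "Files.Read.All",       # read all files the user can reach
    "Files.ReadWrite.All",
    "User.Read.All",        # read every user profile in the directory
    "Sites.Read.All",       # read SharePoint site content
}

def flag_risky(scope_string: str) -> list[str]:
    """Given a space-separated OAuth scope string (the format Microsoft
    Graph returns), return the scopes that warrant investigation."""
    return sorted(s for s in scope_string.split() if s in RISKY_SCOPES)

# Example: a typical grant from an AI "smart reply" extension
print(flag_risky("openid profile Mail.Read offline_access"))  # prints ['Mail.Read']
```

Anything this flags is worth pulling up in the portal to see which users consented and when the app was last active.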
If this sounds like a lot of work, it is. But it is also table stakes for any business with compliance obligations.
Then Check Browser Extensions
Browser extensions are the blind spot most IT providers miss entirely. An employee can install a Chrome or Edge extension that reads every page they visit, including Outlook Web, SharePoint, and any web-based application your business uses.
Most endpoint management tools can audit installed browser extensions across managed devices. If your IT provider is not doing this, ask them why.
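For a sense of what that audit checks, here is a sketch that scans a single Chrome profile for extensions requesting broad host access, meaning they can read every page the user visits. The profile path shown is an assumption (the Linux default; it differs on Windows and macOS), and a real fleet audit would run through your endpoint management tool rather than machine by machine:

```python
"""Sketch: scan one local Chrome profile for extensions that can read
every page. The profile path is an assumption (Linux default shown)."""
import json
from pathlib import Path

# Host patterns that grant access to all sites
BROAD = {"<all_urls>", "*://*/*", "http://*/*", "https://*/*"}

def broad_host_extensions(profile: Path) -> list[tuple[str, str]]:
    """Return (extension id, name) for manifests requesting broad host access."""
    hits = []
    for manifest in profile.glob("Extensions/*/*/manifest.json"):
        data = json.loads(manifest.read_text(encoding="utf-8"))
        # "permissions" covers Manifest V2; "host_permissions" covers V3
        requested = set(data.get("permissions", [])) | set(data.get("host_permissions", []))
        if requested & BROAD:
            ext_id = manifest.parent.parent.name
            hits.append((ext_id, data.get("name", "?")))
    return hits

if __name__ == "__main__":
    profile = Path.home() / ".config/google-chrome/Default"  # assumed path
    for ext_id, name in broad_host_extensions(profile):
        print(ext_id, name)
```

An extension with broad host access plus a network connection to its vendor is, functionally, a data pipeline out of your environment, whether or not the vendor markets it that way.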
Need help finding shadow AI in your environment? IT Integrations runs free Shadow AI audits for Fort Worth businesses. We scan your Microsoft 365, identify every third-party AI app with access to user accounts, and give you a clear report. Call us at (817) 808-1816 or contact us for a free assessment.
Why This Matters More for Fort Worth Healthcare Practices
HIPAA Does Not Have an AI Exception
If a third-party AI tool is processing protected health information (PHI) without a Business Associate Agreement (BAA) in place, that is a HIPAA violation. Period. It does not matter that the employee was trying to be productive. It does not matter that the tool's marketing page says it is "HIPAA compliant." If there is no signed BAA between your practice and that vendor, and that vendor's AI is touching patient data, you have a problem.
The HIPAA Journal reported that in January 2026 alone, 46 data breaches affecting over 1.4 million individuals were reported to the HHS Office for Civil Rights. Hacking and IT incidents caused 36 of those 46 breaches. The average cost of a healthcare data breach remains the highest of any industry.
The proposed HIPAA Security Rule updates, expected to finalize in 2026, point toward more prescriptive technical controls. The days of "we have a policy document" being enough are ending. Regulators are going to start asking what controls you actually have in place, and "we did not know about the AI tools" is not going to be an acceptable answer.
Fort Worth Home Health and Hospice Are Especially Exposed
We work with home health, hospice, and assisted living operations across the Fort Worth area, and these businesses have a unique shadow AI risk. Their staff are mobile. They use personal phones and tablets. They work from patient homes, not from a locked-down office network.
When a home health nurse installs an AI note-taking app on their personal phone and uses it during a patient visit, the practice may never know. But the data is out there, on a server the practice has no agreement with, no audit rights over, and no ability to delete from.
This is why HIPAA compliance in 2026 has to include an AI governance component. If your compliance program does not account for AI tools, it has a hole in it.
What We See When We Audit Fort Worth Businesses
The Patterns Are Consistent
After running shadow AI audits for Fort Worth businesses over the past several months, the findings are remarkably consistent. Almost every business we audit, regardless of size or industry, has at least a few unauthorized AI applications with OAuth access to their Microsoft 365 environment.
The most common findings:
Between 3 and 12 unauthorized AI-connected apps per organization. Smaller businesses tend to have fewer, but the ones they have often have broader permissions because nobody configured consent policies.
At least one app with permission to read all user email. This is the permission that should make you uncomfortable, because it means the app can see everything every user sends and receives.
Meeting transcription tools installed by one or two users but processing data from every meeting attendee, including external participants who never consented.
AI browser extensions that predate the business's current IT provider and have been silently running for months.
No documentation of any of these tools anywhere. No BAAs. No risk assessments. No audit trail.
The Fix Is Not Complicated
The good news is that finding and remediating shadow AI is not a massive project. For most Fort Worth small businesses, it takes a few hours to inventory, a conversation about policy, and some configuration changes in Microsoft 365 to prevent it from happening again.
Here is what the process looks like with us:
First, we scan your Microsoft 365 tenant and enumerate every third-party app with OAuth consent. We check permissions, last activity, and whether a BAA exists (for healthcare clients).
Second, we review browser extensions across managed endpoints.
Third, we produce a report showing what is there, what risk it represents, and what to do about it.
Fourth, we help you set up the controls that prevent new unauthorized apps from being added: admin consent workflows, conditional access, and endpoint policies for browser extensions.
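The first of those controls, requiring admin consent, comes down to one setting on the tenant's authorization policy: emptying the list of grant policies that end users are allowed to exercise. A sketch of that change via Microsoft Graph, assuming a token with the Policy.ReadWrite.Authorization permission (pair it with Microsoft's admin consent workflow so user requests get routed for approval rather than silently blocked):

```python
"""Sketch: turn off end-user app consent tenant-wide via Microsoft Graph.

Assumes a Graph token with Policy.ReadWrite.Authorization. An empty
permissionGrantPoliciesAssigned list means only admins can consent."""
import json
import urllib.request

def consent_lockdown_body() -> dict:
    """Payload that removes every user-consent grant policy."""
    return {"defaultUserRolePermissions": {"permissionGrantPoliciesAssigned": []}}

def disable_user_consent(token: str) -> int:
    """PATCH the tenant authorization policy; returns the HTTP status code."""
    req = urllib.request.Request(
        "https://graph.microsoft.com/v1.0/policies/authorizationPolicy",
        data=json.dumps(consent_lockdown_body()).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The same setting is reachable in the Entra admin center under Enterprise Applications, Consent and permissions, if you prefer clicking to scripting.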
Fifth, for businesses that want to use AI productively (and you should), we help you deploy approved AI tools with proper guardrails. That might mean configuring Microsoft 365 Copilot with DLP policies, deploying a local AI instance for sensitive data processing, or building custom AI workflows that keep regulated data inside your environment.
The point is not to ban AI. Your employees are going to use AI whether you approve it or not. The point is to give them an approved path that does not create compliance exposure.
Frequently Asked Questions
What is shadow AI?
Shadow AI refers to any artificial intelligence tool or application that employees use for work without the knowledge or approval of the IT department or business owner. This includes AI browser extensions, ChatGPT plugins connected to work accounts, AI-powered meeting transcription apps, and any other AI tool that accesses company data through OAuth permissions. The term parallels "shadow IT," which has described unauthorized technology use for years, but shadow AI moves faster and touches more sensitive data because of how AI tools request broad access to email, files, and calendars.
Is shadow AI a HIPAA violation?
It can be. If an AI tool processes protected health information (PHI) and there is no Business Associate Agreement (BAA) between your practice and that tool's vendor, that is a HIPAA violation regardless of intent. The employee did not mean to violate HIPAA. They were trying to work faster. But the regulation does not distinguish between intentional and accidental exposure of PHI to an unauthorized third party. Any Fort Worth healthcare practice should treat shadow AI discovery as a compliance priority, not just an IT housekeeping task.
How do I know if my employees are using unauthorized AI tools?
The fastest way is to check the Enterprise Applications section of Microsoft Entra ID if you use Microsoft 365, or the OAuth apps section of your Google Workspace admin console. These will show every third-party application that any user has granted access to. If you see app names you do not recognize, especially ones with permissions to read email or access files, those need investigation. For a more thorough review, a shadow AI audit that also covers browser extensions and mobile devices will give you the full picture.
Can I just ban AI tools entirely?
You can try, but it usually does not work well. Employees use AI tools because the tools genuinely make them more productive. Banning them without providing an approved alternative means employees either stop being productive or find ways around the ban (which makes the shadow AI problem worse, not better). The better approach is to set up approved AI tools with proper security controls, lock down unauthorized app consent in your Microsoft 365 or Google Workspace tenant, and give your team a clear path to using AI without creating risk.
Next Steps
Shadow AI is not a future problem for Fort Worth businesses. It is a current one. The tools are already connected. The data is already flowing. The only question is whether you know what is there.
Gartner predicts AI governance spending will reach $492 million in 2026 and exceed $1 billion by 2030. That spending is being driven by exactly the kind of unauthorized AI exposure we find in businesses every week.
You do not need to spend a fortune to get this under control. You need someone to look, tell you what is there, and help you fix it.
Ready to find out what AI tools are running in your environment? IT Integrations provides free Shadow AI audits and AI governance services for Fort Worth businesses and the surrounding DFW metro. We have been doing IT for Fort Worth since 2003, and we know what regulated businesses need. Call (817) 808-1816 or schedule a free IT consultation today.