By Yatin Miglani, EA | Sophicor | April 2026 | ~1,800 words | Reading time: 7 min
If you are a CPA or Enrolled Agent running a tax firm in 2025, you have almost certainly been pitched at least one AI-powered document extraction tool in the past year. The sales decks all look similar. The accuracy claims all sound the same. But buried in the architecture of every one of these tools is a question that almost nobody is asking:
When your client's W-2 is being read by the AI — whose server is it sitting on?
The answer to that question determines your liability exposure, your compliance posture under IRC §7216, and your ability to make good on the data security promises you make to your clients every time they sign an engagement letter with you.
This article explains the actual difference between private cloud and shared cloud tax automation — not in marketing language, but in operational terms that matter to practicing tax professionals.
A shared cloud platform processes your clients' documents on the vendor's own servers. When you upload a W-2 or 1099 to a shared cloud tool, that file — and everything in it — travels from your client's environment to the vendor's infrastructure. The AI reads it there. The data sits there during processing. The vendor controls the encryption keys, the access logs, and the security perimeter.
Most tax automation tools on the market today operate on this model. GruntWorx processes documents on their servers in New Hampshire. Juno, which integrates with TaxDome, processes extractions on its own cloud infrastructure. This is not inherently wrong — but it carries implications that firm owners need to understand.
A private cloud platform installs the processing engine directly inside your own existing infrastructure. The AI runs in your environment. Your documents never leave the cloud account you already control. In the context of tax firms, this typically means installing inside your Google Workspace or Microsoft 365 tenant — cloud accounts you already pay for and already govern.
The distinction is not a marketing preference. It is an architectural reality with direct legal and operational consequences.
IRC §7216 governs the disclosure and use of tax return information. As a tax preparer, you are already required to comply — but the compliance obligation extends to anyone you share that data with.
When you send a client's W-2 to a shared cloud vendor for AI processing, you are sharing tax return information with a third party. This is permissible under §7216 with proper consent — but the question is whether your current engagement letters and consent forms actually cover this specific use. Many do not.
More critically: when the vendor is processing your client data on their servers, the vendor becomes a data custodian. If they experience a breach, your clients may hold you responsible — because your firm name is on the engagement letter, not the vendor's.
A data breach at your vendor is not the vendor's client problem. It is your client relationship problem.
With a private cloud architecture, the AI processing happens inside your own Google Workspace or infrastructure. The vendor never holds your data. There is no third-party data custody. Your existing security policies, encryption standards, and access controls govern the environment from the first byte to the last.
Social Security Numbers are the single most sensitive data element in a tax document. Every W-2 contains one. Every 1040 contains multiple. In a typical 500-return firm, you are handling thousands of SSNs per tax season.
On a shared cloud platform, those SSNs travel to the vendor's servers for processing. The vendor may encrypt them in transit and at rest — but they hold the encryption keys. Your SSN data exists in their environment, however briefly, during extraction.
The risk is not necessarily that the vendor is careless. The risk is that you have no visibility into or control over what happens to that data once it leaves your environment.
A properly architected private cloud system handles SSNs differently. In NEXFILE™ PRO, for example, SSNs are stripped from documents and encrypted in Google Secret Manager the moment a file is uploaded — before extraction even begins. An air gap is created between the raw document and the sensitive identifier. The AI reads the document; it never sees the SSN in plain text.
This is not a feature. It is an architectural decision that eliminates an entire category of exposure.
Beyond liability, the private cloud vs. shared cloud distinction affects your day-to-day operations in two concrete ways:
When a client asks "where does my data go when I send you my documents?" — a shared cloud firm has to explain that it goes to a third-party vendor. A private cloud firm can say, truthfully: "It goes into our secure Google Workspace environment. We control it from intake to output."
That is a fundamentally different conversation. For high-net-worth clients, business owners, and anyone who has experienced identity theft, the private cloud answer is materially more reassuring.
Shared cloud tools typically charge per return or per document because they are running compute on their infrastructure for each transaction. Every return processed is a cost to them — and they pass it to you.
Private cloud tools install the engine once. You run it inside your own Google Workspace, on Google's compute infrastructure, at flat cost. The marginal cost of each additional return approaches zero. This is why flat licensing is economically natural for private cloud architecture and structurally difficult for shared cloud vendors to match.
At 500 returns per season, a $30/return shared cloud tool costs $15,000 in extraction fees alone — before staff, before software. A flat-licensed private cloud tool has the same cost whether you process 50 returns or 500.
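To make the arithmetic explicit, here is a minimal Python sketch of the two pricing models. The $30 per-return fee and 500-return volume are the article's example figures; the $5,000 flat license fee is a hypothetical placeholder used only to illustrate the break-even calculation, not a quoted price.

```python
def shared_cloud_cost(returns: int, per_return_fee: float) -> float:
    """Per-transaction pricing: total cost scales linearly with volume."""
    return returns * per_return_fee

def private_cloud_cost(returns: int, flat_license: float) -> float:
    """Flat licensing: total cost is constant regardless of volume."""
    return flat_license

# $30/return and 500 returns are the article's example figures; the
# $5,000 flat license is a hypothetical placeholder.
shared = shared_cloud_cost(500, 30.0)    # 500 x $30 = $15,000
flat = private_cloud_cost(500, 5000.0)   # same at 50 or 500 returns

# Break-even: above this volume, flat licensing is cheaper per season.
break_even_returns = 5000.0 / 30.0       # roughly 167 returns
```

Under these placeholder numbers, any firm above roughly 167 returns per season comes out ahead on flat licensing, and the gap widens with every additional return.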
If you are evaluating or currently using any AI document extraction tool, here are the questions that will tell you whether you are on shared or private cloud:
• Where exactly are my client documents processed? On your servers or in my own cloud environment?
• Who holds the encryption keys for my client data during processing?
• If your infrastructure experiences a breach, what is my firm's liability exposure?
• Does my use of your platform require §7216 consent from my clients? Do you provide that documentation?
• Can you provide your SOC 2 Type II audit report on request?
A vendor that cannot answer these questions directly and specifically is a shared cloud vendor. The evasion is itself informative.
Private cloud and shared cloud are not different tiers of the same service. They are fundamentally different architectures with different liability profiles, different pricing structures, and different answers to the question your clients will eventually ask.
The tax AI market is moving fast. Most of the tools being marketed to Drake firms, TaxDome firms, and independent practices today are shared cloud tools — because shared cloud is easier to build and easier to scale. Private cloud requires more architectural discipline and a deeper partnership with the firm's existing infrastructure.
But for a firm that takes data stewardship seriously — and every firm that signs engagement letters with real clients should — the architecture question is not a technical detail. It is a business decision.
Your data stays in your cloud, not ours. That is not a marketing line. It is the whole architectural premise.
Yatin Miglani is an Enrolled Agent and the founder of Sophicor, a private cloud tax automation platform built for firms using Drake Tax. NEXFILE™ PRO installs directly inside your Google Workspace. Learn more at sophicor.com.
Tags: private cloud, shared cloud, §7216, tax automation, Drake Tax, CPA data security, AI tax software, NEXFILE PRO
By Yatin Miglani, EA | Sophicor | April 2026 | ~1,500 words | Reading time: 6 min
Every licensed tax preparer in the United States operates under IRC §7216. Most know the name. Far fewer understand the specific obligations it creates — and almost nobody is applying it correctly to the new generation of AI-powered document tools their firms are adopting.
This article is a plain-language explanation of what §7216 actually requires, how it applies to the use of third-party AI extraction tools, and what your firm should be doing — and asking — before you upload your first client document to any external platform.
This is not legal advice. For specific compliance guidance, consult a tax attorney. This article is written to inform practicing tax professionals about how §7216 commonly applies to AI document tools.
IRC §7216 prohibits tax preparers from knowingly or recklessly disclosing or using tax return information for any purpose other than preparing the return — unless the taxpayer provides written consent.
The statute carries criminal penalties: up to one year in prison and a fine of up to $1,000 per violation. Civil penalties under the companion provision, IRC §6713, are additional: $250 per unauthorized disclosure, capped at $10,000 per calendar year.
The key phrase is "disclosure." Under the regulations, you disclose tax return information any time you share it with a third party — including a software vendor processing that information on your behalf.
When you upload a client's W-2, 1099, or other tax document to an AI extraction tool, you are transferring tax return information to a third party. Under §7216 and the related Treasury regulations, this is a disclosure — and it requires either an exception or written taxpayer consent.
Treasury Reg. §301.7216-2 provides a list of permitted disclosures that do not require client consent. One of the most commonly relied upon: disclosure to a third party for the purpose of "tax return preparation" — meaning the processing directly serves the preparation of that client's return.
Most AI extraction tools fall within this exception, provided: (1) the data is used only for preparation purposes, (2) the vendor is subject to adequate data handling requirements, and (3) the disclosure is not broader than necessary.
Here is where firms get into trouble: they assume the exception applies automatically and without conditions. It does not.
Under the regulations, even permitted disclosures must satisfy several conditions that many firms have not formalized:
• You must have a written agreement with the vendor that specifies how the data will be used, stored, and protected.
• The vendor must be contractually prohibited from using the data for any purpose other than the specific tax preparation function.
• You must be able to demonstrate that the disclosure was no broader than necessary — meaning you cannot send entire client files when only a specific document is needed.
• You remain liable for the vendor's misuse of the data if you did not take reasonable steps to ensure compliance.
Look at your current contract with your AI extraction vendor. Does it contain all of these provisions? In most cases, a SaaS vendor's standard terms of service do not come close.
A private cloud architecture — where the AI processing engine runs inside your own Google Workspace or equivalent infrastructure — changes the §7216 analysis in an important way.
If the vendor's software installs inside your environment and never holds or processes your client data on their own servers, there may be no disclosure to a third party in the first instance. The data never leaves your infrastructure. The processing happens on Google's compute resources, inside your Google account, under your encryption keys.
This does not eliminate all §7216 considerations — you still need appropriate client consent language in your engagement letters for the use of AI tools generally. But it substantially changes the data custody question and eliminates the most significant exposure: third-party breach liability.
If your vendor never holds your data, your vendor cannot expose your data. The liability calculus changes entirely.
Regardless of whether you use shared or private cloud tools, your engagement letters should be updated to reflect the use of AI-assisted document processing. The direction of regulatory guidance is clear: in many contexts, the use of AI in tax preparation is an activity that should be disclosed to clients.
At minimum, your engagement letter should include:
• A general disclosure that the firm uses technology tools, including AI, to assist in document processing and return preparation.
• A statement that client documents may be processed by third-party technology systems operating on the firm's behalf for tax preparation purposes.
• If using shared cloud tools: specific disclosure that documents are transmitted to third-party servers, with the vendor's name or a description of the category of vendor.
• If using private cloud tools: confirmation that all processing occurs within the firm's secure cloud environment and client data does not leave the firm's infrastructure.
Again: consult a tax attorney to draft or review your specific engagement letter language. The above is a general framework, not legal advice.
Here is the uncomfortable reality of the current AI tax tool market: most firm owners adopting these tools have not reviewed the §7216 implications at all. They have evaluated accuracy rates, pricing, and TaxDome integration — and signed up without asking the questions that their professional license requires them to answer.
The questions every firm should be asking any AI document tool vendor before adopting:
• Are you §7216 compliant? Provide the documentation.
• Do you have a Data Processing Agreement (DPA) available? Is it included in your standard terms, or does it require separate negotiation?
• Do your standard terms prohibit you from using client data for model training, product improvement, or any purpose beyond the specific service?
• Have you completed a SOC 2 Type II audit? Can you share the report?
• In the event of a data breach affecting client data held on your servers, what is your notification timeline and what remediation do you provide?
A reputable vendor answers these questions without hesitation. A vendor that deflects, redirects to marketing materials, or says "our legal team will follow up" is telling you something important.
§7216 compliance in the age of AI document tools is not complicated — but it does require intentionality. The statute was written before AI existed, but the underlying principle is the same: your clients trusted you with their most sensitive financial information, and you are legally and professionally obligated to treat it accordingly.
That obligation extends to every vendor you bring into contact with that data. It does not end at your firm's door.
The firms that will navigate the AI transition cleanly are the ones that ask the right questions now — before something goes wrong — rather than the ones that wait for a breach or an IRS examination to discover they were out of compliance.
The question is not whether AI belongs in your practice. It does. The question is whether you control the environment it runs in — or someone else does.
Yatin Miglani is an Enrolled Agent and the founder of Sophicor, a private cloud tax automation platform. NEXFILE™ PRO runs inside your Google Workspace — your data never leaves your environment. Learn more at sophicor.com.
Tags: §7216 compliance, IRC 7216, tax return information, AI tax software, data security, CPA compliance, private cloud, tax automation