Alex is Sprintlaw’s co-founder and principal lawyer. Alex previously worked at a top-tier firm as a lawyer specialising in technology and media contracts, and founded a digital agency which he sold in 2015.
- What Does The Privacy Act Require From NZ Businesses?
- A Practical Privacy Act Checklist For Businesses Using ChatGPT
- 1) Map The Personal Information You Handle (And Where AI Touches It)
- 2) Set A Rule: Don’t Input Personal Information Unless It’s De-Identified (Or You’ve Checked It’s Appropriate)
- 3) Be Clear With Customers: Use A Privacy Policy And Collection Notice
- 4) Put Internal Guardrails In Place (So Your Team Uses AI Safely)
- 5) Check Whether You’re Disclosing Personal Information To A Service Provider
- 6) Have A Process For Privacy Requests (Access And Correction)
- 7) Prepare For Breaches (Because They’re Not Always “Hackers”)
- Key Takeaways
AI tools like ChatGPT can be a huge productivity boost for small businesses. You can draft emails faster, summarise meeting notes, brainstorm marketing copy, and even build internal processes without hiring a full team from day one.
But there’s a catch: the moment you feed real customer, client, employee, or supplier information into an AI tool, you’re dealing with privacy compliance.
In New Zealand, the Privacy Act 2020 (often simply called the Privacy Act or the NZ Privacy Act) sets the rules for how your business collects, uses, stores, and shares personal information. If you’re using AI in your workflows, it’s worth putting some clear privacy steps in place early, so you’re protected from day one.
This article provides general information only and does not constitute legal advice. If you need advice on your specific situation, it’s best to get tailored legal guidance.
Below, we’ll walk through practical privacy steps for NZ businesses using ChatGPT (and similar AI tools), what the Privacy Act expects, and how to set up your internal guardrails in a way that’s realistic for busy business owners.
What Does The Privacy Act Require From NZ Businesses?
The Privacy Act 2020 applies to most organisations in New Zealand (including small businesses) as soon as they handle personal information.
Personal information is information about an identifiable individual. Common examples in a small business include:
- Customer names, phone numbers, emails, delivery addresses
- Client notes and service history
- Invoices tied to individuals (especially sole traders)
- Employee records (payroll details, performance notes, leave records)
- CCTV footage where someone can be identified
- IP addresses or device identifiers if they can reasonably identify someone
The Privacy Act is built around 13 information privacy principles (often called the IPPs). You don’t need to memorise them, but you do need to run your business in a way that matches their intent. In plain terms, it’s about:
- Only collecting what you actually need for a lawful purpose
- Being transparent with people about what you collect and why
- Using the information only for appropriate purposes (and not surprising people)
- Keeping it safe from loss, misuse, and unauthorised access
- Allowing access and corrections when people ask
- Thinking carefully before sharing information, especially overseas
Even if you’re a small team, these obligations matter. Privacy issues can lead to complaints, investigations, reputational damage, and significant time spent responding to the Privacy Commissioner.
When you introduce AI into your processes, you’re basically adding a new “touchpoint” where personal information might be handled. The key is to control that touchpoint properly.
Where Does ChatGPT Fit Into Privacy Act Compliance?
From a Privacy Act point of view, the big question isn’t “is ChatGPT allowed?” It’s:
- What information are you putting into it?
- Is any of that information personal information?
- Where does that information go next?
- Are you allowed to use or disclose it in that way?
- Have you taken reasonable steps to keep it secure?
Common Risk Scenarios For Small Businesses
Here are some realistic examples where good intentions can turn into a privacy problem:
- Customer service: You paste a customer email complaint (including name, address, order details) into ChatGPT and ask it to draft a response.
- HR: You paste performance concerns about an employee into ChatGPT to “word it professionally”.
- Health/wellbeing services: You paste client case notes into ChatGPT to summarise or create a treatment plan template.
- Sales: You paste a lead list into ChatGPT and ask it to segment by industry and write outreach messages.
- Operations: You upload a spreadsheet containing names and bank details to “clean up” formatting or categorise transactions.
In many of these situations, you may be disclosing personal information to a third-party tool, and you may also be creating privacy and security risk if your team does it inconsistently.
Overseas Storage And Disclosure
One privacy issue that often gets missed is that AI tools can involve overseas storage or processing. Under the Privacy Act, disclosing personal information outside New Zealand can be lawful, but you need to think carefully about:
- Whether the overseas recipient is subject to comparable privacy safeguards, or whether you have appropriate contractual protections in place (and whether those protections are actually enforceable in practice)
- Whether the disclosure is necessary for your purpose (or whether you can avoid it by removing identifiers)
- What you told people in your privacy communications about where their data may go
It’s also important to remember that what’s “reasonable” can depend on the specific tool you’re using (including its settings, where it hosts or processes data, whether it uses inputs to improve models, and the terms you’ve agreed to). This is why “we didn’t mean to” isn’t a great strategy. It’s much easier to set the rules internally before your team builds AI into everyday workflows.
A Practical Privacy Act Checklist For Businesses Using ChatGPT
Here’s a simple goal: use AI in a way that avoids personal information unless you have a clear and controlled reason to include it. For most small businesses, that means setting up an “AI hygiene” process that reduces privacy risk by default.
1) Map The Personal Information You Handle (And Where AI Touches It)
Start with a quick audit. List:
- What personal information you collect (customers, clients, employees, leads, suppliers)
- Where you store it (email, CRM, accounting software, cloud drives)
- Who accesses it (staff, contractors, virtual assistants)
- Where AI might be used (marketing, support, admin, HR, summarising calls/notes)
This doesn’t need to be a 40-page document. Even a one-page map is enough to identify the high-risk areas.
2) Set A Rule: Don’t Input Personal Information Unless It’s De-Identified (Or You’ve Checked It’s Appropriate)
For many businesses, the safest approach is a simple internal rule: don’t enter personal information into ChatGPT (or other AI tools) unless it’s properly de-identified, or you’ve deliberately decided it’s appropriate given your purpose, your tool configuration, and your contractual terms.
- No names, emails, phone numbers, addresses, bank details, ID numbers, or customer IDs
- No sensitive details (health information, biometric info, detailed employee performance issues, disciplinary info)
- Use placeholders like “Customer A”, “Employee B”, “Client X”
You’ll still get most of the value from AI by focusing on structure, tone, and templates. For example:
- “Write a polite response to a customer who received a delayed delivery” (no personal details)
- “Create a performance improvement plan template for lateness” (no employee history)
- “Turn these bullet points into a professional email” (with identifying details removed)
This is also consistent with the Privacy Act principle of collecting/using only what you need.
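If your team works with text at any volume, a simple pre-prompt redaction step can make the placeholder rule harder to forget. Below is a minimal, illustrative Python sketch (the patterns, placeholder labels, and `deidentify` function are our own inventions, not part of any AI tool). Simple regexes will miss many identifiers, such as names and addresses, so treat this as a starting point rather than a compliance control.

```python
import re

# Illustrative only: replace a few common identifiers with placeholders
# before text is pasted into an AI tool. Regexes like these will miss
# plenty of personal information (names, street addresses, client IDs),
# so a human check is still needed.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    # Rough NZ phone shape: leading 0 or +64, then 7-11 more digits/spaces/dashes
    (re.compile(r"\b(?:\+?64|0)[\s-]?\d[\d\s-]{6,10}\b"), "[PHONE]"),
    # NZ bank account format: BB-bbbb-AAAAAAA-SS(S)
    (re.compile(r"\b\d{2}-\d{4}-\d{7}-\d{2,3}\b"), "[BANK_ACCOUNT]"),
]

def deidentify(text: str) -> str:
    """Swap common identifiers for placeholders before prompting."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

prompt = "Customer jane.doe@example.com on 021 555 0199 asked about order refunds."
print(deidentify(prompt))
# → Customer [EMAIL] on [PHONE] asked about order refunds.
```

A script like this can sit in whatever internal tooling your team already uses, but it supplements (rather than replaces) the “no personal information” rule above.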
3) Be Clear With Customers: Use A Privacy Policy And Collection Notice
One of the easiest Privacy Act wins is transparency. If you collect personal information through your website, onboarding forms, email, or booking systems, you should tell people:
- What you collect and why
- Who you share it with (including service providers)
- Whether information may be stored or processed overseas
- How people can request access or correction
For many businesses, that starts with a properly drafted Privacy Policy and a tailored Privacy Collection Notice.
If you are using AI tools as part of delivering your service (rather than purely internal drafting support), it’s even more important that your privacy communications are accurate and not overly vague.
4) Put Internal Guardrails In Place (So Your Team Uses AI Safely)
Most privacy slip-ups happen when:
- Different team members have different assumptions about what’s “okay”
- Someone is rushing and copies/pastes without thinking
- A contractor uses their own tools outside your systems
To manage this, you can implement a simple internal policy that covers:
- Approved AI tools and approved accounts (avoid personal logins for business work)
- What information must never be entered
- When you can use de-identified information
- Who to ask if unsure
- How to store AI outputs (and not to paste them back into customer records if they include personal info)
This can sit alongside your existing IT or workplace rules, like an Acceptable Use Policy or a dedicated Generative AI Use Policy, depending on how your business is structured.
5) Check Whether You’re Disclosing Personal Information To A Service Provider
If your business shares personal information with third party providers (for example, cloud storage providers, CRMs, outsourced admin support, IT providers, marketing platforms, analytics tools, or AI tools), you should think about what contractual protections you have in place.
In some situations, it can be appropriate to have a Data Processing Agreement (especially where a supplier is processing personal information on your behalf).
This helps you set expectations around confidentiality, security, breach notification, subcontracting, and data handling. It’s also useful evidence that you’ve taken reasonable steps to protect personal information (which is a major theme under the Privacy Act).
6) Have A Process For Privacy Requests (Access And Correction)
Under the Privacy Act, individuals can request access to personal information you hold about them, and they can request corrections. If you’re using AI to generate summaries or notes that end up attached to a customer file, those records may become disclosable as part of an access request.
That doesn’t mean you can’t keep internal notes. It does mean you should be careful about:
- What you record
- Whether it’s accurate
- Whether it’s necessary
- Whether it contains sensitive or subjective commentary that could create risk if disclosed
It helps to have a standard internal workflow for handling requests, including a consistent way to intake the request and verify identity. Some businesses use a template like an Access Request Form to keep things tidy and reduce response time.
7) Prepare For Breaches (Because They’re Not Always “Hackers”)
When people think “privacy breach”, they imagine a major cyberattack. In reality, small businesses often face privacy breaches like:
- Staff emailing a document to the wrong person
- Sharing the wrong attachment with a customer
- Posting a screenshot that includes personal info
- Entering identifiable info into a tool that shouldn’t receive it
- Losing a laptop or phone without adequate security
New Zealand has mandatory privacy breach notification rules in certain cases. If a breach has caused (or is likely to cause) serious harm, you may need to notify affected individuals and the Privacy Commissioner.
Having a clear plan matters, even if you’re a small team. A Data Breach Response Plan can help you move quickly, preserve evidence, and make sensible decisions under pressure.
When Can You Use ChatGPT With Personal Information (If Ever)?
Sometimes, businesses genuinely want to use AI with real-world data. For example, you might want to summarise customer interactions at scale, triage incoming support tickets, or help staff draft responses that include order details.
Whether that’s appropriate under the Privacy Act depends on your exact setup (including the tool, your settings, where data is processed, and the contractual terms you’re using), but here’s the practical way to think about it.
Ask These Questions First
- Is it necessary? Or can you get most of the benefit by using de-identified data?
- Is it consistent with what you told customers? If your privacy communications suggest limited internal use, AI processing might be a stretch.
- Are you disclosing it to an overseas provider? If yes, do you have safeguards and permissions in place?
- Is it sensitive information? Health info and HR issues should be treated with extra care.
- Can you limit access? For example, restrict AI use to one trained person or a specific team.
- Can you log and supervise use? A policy without implementation tends to fail.
A “Template-First” Approach Works Well For Most SMEs
If you’re time-poor (like most founders), a workable middle ground is to use ChatGPT to create:
- Email templates and tone guidelines
- FAQ drafts and support macros
- Customer service scripts
- Employment document templates (non-personalised)
- Internal process checklists
- Marketing content frameworks
Then your team fills in the personal details manually inside your own systems.
This approach keeps you productive while significantly reducing privacy risk.
What Privacy Documents Should Your Business Have In Place?
Privacy compliance isn’t just about “having a policy on your website”. It’s about creating a system where the legal rules match what your business actually does day to day.
Depending on your business model, these are common legal and operational documents to consider:
Privacy-Facing Documents
- Privacy policy: Sets out what you collect, why, who you share it with, and how requests are handled. A tailored Privacy Policy helps align your marketing, sales, and operational reality.
- Collection notice: A shorter notice shown at the point of collection (for example, on forms). A clear Privacy Collection Notice is often what customers actually read.
Internal Documents
- AI use rules: A Generative AI Use Policy can define what your team can and can’t input into AI tools.
- Acceptable use: An Acceptable Use Policy sets expectations for devices, systems, accounts, and security behaviours.
- Breach plan: A Data Breach Response Plan sets out who does what if something goes wrong.
Supplier/Contract Documents
- Data processing terms: A Data Processing Agreement can help control how service providers handle personal information, especially where they process data on your behalf.
The right mix depends on your business. If you’re a solo operator with a basic mailing list, your privacy framework will look different to a business handling health information or running an online platform.
If you’re not sure what applies to you, getting tailored advice early can save a lot of clean-up later (especially once you’ve hired staff, scaled your marketing, or started handling higher volumes of customer data).
Key Takeaways
- The Privacy Act 2020 applies to most NZ businesses as soon as you collect or hold personal information.
- Using ChatGPT can create privacy risk if you input identifiable customer, client, employee, or supplier information into the tool.
- A practical default rule is to avoid entering personal information into AI tools and use de-identified prompts and templates instead (unless you’ve made a deliberate decision that using personal information is necessary and appropriately safeguarded).
- Make sure your external privacy communications match reality, including a properly drafted Privacy Policy and Privacy Collection Notice.
- Put internal guardrails in place (like a Generative AI Use Policy) so your team uses AI consistently and safely.
- Have a plan for privacy requests and privacy breaches, including a clear Data Breach Response Plan, so you can respond quickly if something goes wrong.
If you’d like help setting up your privacy compliance (including policies, collection notices, and AI-use guardrails), you can reach us at 0800 002 184 or team@sprintlaw.co.nz for a free, no-obligations chat.


