This is the most important section in this module, and I am going to be direct about it.
When you type something into an AI platform, you are sending that information to a server operated by a third party. The major platforms have privacy policies, data handling commitments, and in many cases the option to opt out of your conversations being used to train future models. We covered those settings in Module 2 and I hope you made the choices that are right for your practice.
But regardless of what the platform’s policy says, there are categories of client information that should never be entered into a public AI tool. Not because the platforms are necessarily irresponsible with it, but because the professional standard for handling client data does not depend on a third party’s privacy policy. It depends on you.
Here is the line. Anything that identifies a specific client by name, combined with information about their travel plans, their financial details, their personal circumstances, or their contact information, does not belong in a prompt.
Let me make that concrete with examples that are specific to how travel advisors work.
A client’s passport number, identity number, or date of birth: never. Their credit card or payment details: never. Their home address combined with the dates they will be away: never, and think about why. Medical information shared in confidence because it affects their travel requirements: never. The names of their children combined with school details or travel dates: never.
Some of those are obvious. Others require you to think about what information becomes sensitive in combination. A client’s name on its own is one thing. A client’s name combined with their travel dates, their accommodation details, and the fact that their house will be empty for three weeks is something else entirely. Context creates sensitivity.
The practical approach that works is this: anonymise before you prompt. If you are using AI to draft a pre-departure email for a specific client, you do not need to include their surname or their passport number in the prompt. Use a first name or initials if it helps the tone. Include the destination, the travel style, the emotional register you want. Leave out anything that, if it appeared in a data breach, would identify a specific person and their specific plans.
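If part of your workflow is scripted, the anonymise-before-you-prompt habit can be backed up with a simple automated pass. The sketch below is a minimal illustration, not a complete PII scrubber: the `anonymise_prompt` name and the regex patterns are assumptions for this example, and pattern matching will not catch names or context-sensitive details, so the human check described above still applies.

```python
import re

# Illustrative patterns only (assumed formats); a real workflow would tune
# these to the documents you actually handle and still review by hand.
PATTERNS = {
    "PASSPORT": re.compile(r"\b[A-Z]{1,2}\d{6,8}\b"),   # common passport-style tokens
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),      # card-like digit runs
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),    # d/m/yyyy dates
}

def anonymise_prompt(text: str) -> str:
    """Replace obviously identifying tokens with placeholders before prompting."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

draft = "Client (passport A1234567, dob 12/03/1984) flies to Rome; email t.n@example.com."
print(anonymise_prompt(draft))
```

Note what this does not catch: the client's name, their address, the fact that the house will be empty. That is exactly why the rule in this section is a habit of briefing, not a filter you can delegate to software.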
For advisors in South Africa, this is also a matter of law. The Protection of Personal Information Act requires you to process personal information responsibly, with the knowledge and, where applicable, the consent of the person it belongs to. Entering identifiable client data into a third-party AI platform without considering your obligations under POPIA is a risk you do not need to take, particularly when the simple habit of anonymising your prompts eliminates it entirely.
The rule is simple: brief the tool on the situation, not the identity. You lose nothing in output quality. You protect everything that matters.