How colleagues can use AI in their role
What is Microsoft Copilot?
Microsoft Copilot is the tool we use for generative AI chat. It aligns with our security controls and our ethical and data considerations, offering:
- Data Encryption: Technical security controls are in place to encrypt and protect any digital data transfers.
- Data Privacy: Data is kept private; Microsoft does not store data entered into Copilot or use it to train its models.
- Copilot Safeguards: Protects against harmful content and provides protected material detection that highlights any copyrighted information in chat responses.
This is the open version of the tool, which means colleagues should adhere to the same mandatory guidelines that apply to other open AI tools such as ChatGPT. This will ensure the confidentiality, integrity, and availability of our data.
Do’s and Don’ts of generative AI tools
- Do not enter any personally identifiable information (PII) or sensitive information about clients, colleagues or stakeholders into generative AI prompts.
- Do not enter any commercially sensitive information or company information into generative AI applications.
- Do not enter any company performance reports or statistics into generative AI prompts to rewrite, create or summarise these datasets.
- Do not enter any trade secrets into generative AI tools.
- Use AI programmes responsibly and for legitimate business purposes only, and only for the purposes for which they were designed and intended.
- Be transparent about your use of generative AI. Discuss with your manager the intended work purpose before using this technology.
- Always verify data produced by generative AI. It should never be used as a sole source of truth, only to supplement other research or writing methods.
These do’s and don’ts form part of GC’s AI policy, which you can access [here].
Where can I access Microsoft Copilot?
The open version of Microsoft Copilot is available to the majority of colleagues via https://copilot.microsoft.com or via the Copilot app on myapps.microsoft.com.
We are currently testing a closed version of Microsoft Copilot with a small number of colleagues to assess its benefits, and we hope to release it in Q4 of this financial year.
Can I use other generative AI tools like ChatGPT?
Within the current technology landscape, there are many generative AI models such as ChatGPT, Claude 3, Perplexity AI, Google Gemini, and ChatSonic. Microsoft Copilot is the tool we have requested colleagues to use for generative AI chat.
Should you need to use other open AI tools, you must:
- Inform your line manager what the intended work purpose is
- Adhere to GC’s 8 key principles and our mandatory guidelines for using AI as set out in GC’s AI policy
- Disclose where AI has been used to generate documents or inform contributions.
While we’re keen to foster innovation and harness the potential of these technologies, it is important that we take a responsible approach. Colleagues are reminded to refer to GC’s 8 key principles for using AI which, alongside the practical do’s and don’ts set out in our AI policy, guide our organisational approach.
What should I do if I want to use an AI tool for work?
Be transparent about your use of generative AI. Discuss with your manager the intended work purpose before using this technology.
GC maintains a register of closed and open AI tools used within the business. Please flag any generative AI tools you would like to use with your manager and ai@growthco.uk.
Can I use AI notetaker tools (e.g. Otter AI and Fathom) for meetings/events?
Microsoft Teams has recording and transcribing functions built in, which some colleagues are already using. Colleagues are reminded to apply the principles of transparency and privacy by asking other participants in the meeting before using these functions.
What should I do if I realise I’ve shared sensitive information on an AI platform?
If you think you have shared sensitive information on an AI platform, please contact the IT department in the first instance at itservices@growthco.uk. They will review the issue and advise on next steps.
What sources should I use to check information and outputs provided by AI solutions?
You should always attempt to verify the information generated by AI tools. You can do this by cross-referencing data and facts with multiple credible sources such as trusted websites, reputable databases, or academic research.
Is there a form of words I should use to indicate that I’m using content generated by AI?
A simple line that says ‘this content was partly generated by AI’ should be enough to let clients, customers and colleagues know that you have used AI to help create the information you are sharing with them.
Are there any penalties for inappropriate use of AI in the workplace?
Colleagues should always behave in line with GC’s code of conduct which sets out the professional standards and behaviour GC prides itself on, aligned with our values. The code of conduct is available here.
GC’s Information Security Incident and Event Reporting Procedure outlines the immediate steps colleagues should take to report the sharing of sensitive or personal data, how these breaches will be investigated, and the possible outcomes.
We want to support colleagues in navigating the use of AI responsibly and with confidence. We will continue to update these FAQs and guidance, and we welcome your feedback and further questions (ai@growthco.uk) to make sure we are supporting you as best we can.
Do I need technical expertise to use and implement AI?
Though technical skills are helpful, many AI tools today are designed to be user-friendly and do not require extensive technical knowledge. Despite AI's longstanding presence, its rapid growth means we are all learning together. We’ll continue to update colleagues with the latest knowledge, guidance and relevant training tools as appropriate.
I'm not good with new technology; will there be training on AI tools?
We know that new technology can be both exciting and challenging. We are committed to making sure everyone feels comfortable with any new tools or approaches we implement. Training sessions will be provided to help everyone learn how to effectively use AI tools. The first of these will be trialled with managers before the end of the year. Additionally, ongoing support from the IT team will be available for any questions or assistance needed along the way.