Description of support service for the implementation, adoption and maintenance of Microsoft 365 Copilot
Comprehensive and flexible support in the implementation of Microsoft 365 Copilot - tailored to the realities of your organization.
We guide the company step by step through all the key stages of implementation:
- Technical and data security analysis,
- Preparing teams and building awareness,
- A pilot based on real-life scenarios,
- Broad adoption of the tool across the company,
- Development of user competence and implementation of prompts,
- Ongoing support services.
The offer is modular - you can take the full implementation path or choose specific modules, such as just a security audit, a prompt workshop or a pilot.
No matter what stage you are at today, you can enrich your team with expertise and implement AI consciously - for the benefit of people, data and the business as a whole.
Stage 1: Environment and data security audit
One of the most important and often underestimated steps in implementing Copilot is auditing the security environment and configuration. Many organizations proceed to use Copilot without first preparing the infrastructure, which can lead to uncontrolled access to data, known as oversharing - a situation in which users gain access to information that should not be available to them.
Copilot works on data the user already has access to - documents, messages, calendars or SharePoint sites. Improperly configured permissions can therefore result in the accidental disclosure of sensitive data, such as financial, HR or contract information. In practice, Copilot sometimes returns data the user should not even know existed - because it "saw" that data in sources to which the user had unknowingly been granted access.
What do we do at this stage?
- Audit of the environment and access policies
- We identify key resources (e.g., HR, finance, management) and analyze the current authorization level of users and groups.
- We perform access reports for SharePoint sites, OneDrive libraries, mailboxes, Teams and private channels.
- We check the configuration of roles and permissions, including redundant access (oversharing) and permission inheritance.
- Cooperation with the organization's security department. We work closely with internal IT security and compliance teams to jointly:
- Identify potential areas of risk,
- Conduct an analysis of Copilot's impact on compliance with security policies,
- Prepare joint recommendations on access control and data classification.
Thanks to this cooperation, the audit is not just a technical review, but a full-fledged element of the organization's data security strategy, in line with the risk management policy.
- Microsoft tools and controls. We utilize advanced features of Microsoft 365, including:
- Microsoft Purview (Information Protection, DLP, Compliance Manager) - for data classification and protection,
- Microsoft Entra and PIM - to manage privileged and timed access,
- Defender for Cloud Apps - to monitor activity and detect unauthorized access,
- Sensitivity Labels, Access Reviews, Conditional Access - to provide control over what Copilot can process.
- Recommendations and action plan
- We provide a report with the detected risks and a proposal for changes in authorization policies, role configuration and data security.
- We point out specific actions to be taken before enabling Copilot for users, such as access restrictions, content reviews, implementing protection labels or creating DLP policies.
- Technical and business workshop. We organize a workshop session with IT department and business representatives (about 1.5 h), during which:
- we discuss the results of the audit,
- we present risks and good practices,
- we explain how Copilot's operation can affect data visibility,
- we explain what not to do to avoid leaking information.
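To illustrate the audit logic, the sketch below shows a simplified least-privilege check over a hypothetical permission export. In a real audit this data would come from Microsoft Purview or Microsoft Graph permission reports; the resources, labels and groups here are invented for illustration only:

```python
# Illustrative sketch only: a real audit pulls permission reports from
# Microsoft Purview / Microsoft Graph. All data and policy below are hypothetical.

# Which groups are allowed to see each sensitivity label (None = unrestricted)
POLICY = {
    "Confidential-Finance": {"Finance", "Management"},
    "Confidential-HR": {"HR", "Management"},
    "General": None,
}

# Simulated permission export: (resource, label, groups that currently have access)
permissions = [
    ("SharePoint:/sites/Budgets/2024.xlsx", "Confidential-Finance",
     {"Finance", "Marketing"}),  # Marketing inherited access historically
    ("SharePoint:/sites/HR/salaries.docx", "Confidential-HR", {"HR"}),
    ("SharePoint:/sites/Intranet/news.docx", "General", {"All Employees"}),
]

def find_oversharing(perms, policy):
    """Return (resource, offending groups) pairs that violate the label policy."""
    findings = []
    for resource, label, groups in perms:
        allowed = policy.get(label)
        if allowed is None:
            continue  # unrestricted label, nothing to check
        extra = groups - allowed
        if extra:
            findings.append((resource, extra))
    return findings

for resource, extra in find_oversharing(permissions, POLICY):
    print(f"Oversharing: {resource} accessible to {sorted(extra)}")
    # prints: Oversharing: SharePoint:/sites/Budgets/2024.xlsx accessible to ['Marketing']
```

In this toy run, only the budget file is flagged: the Marketing group has access that the "Confidential-Finance" label does not allow - exactly the inherited-permission pattern described in the example below.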
This step typically takes 2-3 business days, involving analysis, a workshop and a recommendation report. This is an absolutely crucial step for organizations that want to introduce AI consciously - without compromising on data security.
With this:
- You will make sure that Copilot will only operate within secure access limits.
- You will minimize the risk of accidentally sharing classified information.
- Your IT and security teams will be actively engaged and prepared to manage AI in a manner consistent with the organization's policies.
Example:
At one large services company, it was revealed during a board meeting that marketing staff had access to the company's financial resources stored in SharePoint - even though no one had knowingly granted them this access. It turned out that the permissions had been inherited from a historical directory structure and had gone unreviewed over time.
After turning on Microsoft 365 Copilot, employees began receiving responses containing excerpts from budgets and cost statements that Copilot "picked up" as context for prompts.
An audit of the environment helped identify redundant privileges, fix the access structure and implement minimum access (least privilege) policies before the actual leak occurred.
Stage 2: AI awareness workshop - opportunities and limitations
Before employees start using Microsoft 365 Copilot, it is crucial that they understand how the tool works, its limitations and the risks of improper use. We know from experience that users often have overly high expectations, fail to verify generated content, or unknowingly share sensitive data - for example, through ill-considered prompts.
What do we do at this stage?
We conduct an AI awareness workshop in two variants:
- Basic level - Introduction to Microsoft 365 Copilot. For people who are new to Copilot or have not had contact with AI:
- We show how Copilot works in Microsoft 365 applications (Word, Excel, Outlook, Teams, PowerPoint, Loop),
- We discuss practical use cases for various departments: HR, sales, finance, operations,
- We explain the limitations of AI:
- hallucinations (inventing non-existent data),
- lack of permanent memory,
- language errors and lack of historical context,
- We point out the risks of ill-considered prompting - e.g., copying sensitive data into prompts, over-reliance on AI, or non-compliance with the GDPR.
- Advanced level - Effective and safe use of AI at work. For more advanced users who want to make better use of Copilot:
- We teach how to write effective and safe prompts (e.g., for meeting summaries, data analysis, presentation preparation),
- We show how to assess the reliability of Copilot's response and when it needs verification,
- We discuss good practices for working with AI in teams: creating prompt bases, sharing good examples,
- We review ethical and legal boundaries - e.g., where the data goes and whether Copilot may analyze invoice scans, HR data or R&D data.
We organize the workshop in a formula that involves both IT/Security teams and representatives of key business departments. This approach is key, as IT is responsible for the security and integration of the tools, while the business decides on their practical application in everyday work. Through the joint session, an understanding of the real capabilities and limitations of AI is created, which is the foundation for effective and informed implementation.
The workshop can be conducted both onsite and remotely, and upon request the session can be recorded for further internal use. For larger organizations, we also offer a series of workshops dedicated to specific departments - for example, HR, finance or sales - to better tailor the content to the specifics of the teams' work. In addition, we provide educational materials to help consolidate knowledge, such as checklists, ready-made sets of prompts and procedure shortcuts.
With this:
- Users better understand how Copilot works and what to realistically expect from it.
- Awareness of legal, technical and data risks is growing.
- IT and business are beginning to speak a common language about AI.
- The organization is ready to pilot and further adopt Copilot.
Example:
At one trading company, the accounting team was using ChatGPT for initial analysis of invoice scans, unaware that the data being entered could end up in systems outside European jurisdiction. It wasn't until the workshop that it was discovered that the company was violating its own data protection policies.
These types of cases show how important it is for users to understand where the data goes, how Copilot differs from ChatGPT, and the consequences of unconscious use of AI.
Stage 3: Analysis of business needs (Discovery)
In this phase, we explore with representatives from your organization where Microsoft 365 Copilot can realistically support day-to-day work - to the benefit of people, processes and business goals. Through direct departmental involvement, we gain a clear picture of the needs that can be addressed by AI. This is the foundation of a well-designed pilot and successful adoption.
What do we do at this stage?
- We conduct Discovery workshops with business teams
We hold meetings with representatives of selected departments (e.g., HR, marketing, finance, sales, operations) to learn about their tasks, challenges and work context. We use methods such as design thinking to effectively identify real needs.
- We map processes and define Copilot application scenarios
Together we determine in which areas Copilot can bring specific results - such as speeding up data analysis, creating offers, preparing reports or automating communications.
- We define goals and measures of success
By establishing measurable indicators (e.g., number of active users, time savings in specific processes), we create a basis for assessing the value of the implementation.
- We create a plan for piloting and further adoption
We segment user groups, prioritize activities and recommend the order in which to implement features and create prompts.
This stage usually lasts from 1 to 5 working days - depending on the number of departments included in the analysis. Each workshop lasts up to 8 hours and can be conducted onsite or online. Upon completion, we prepare a document with recommendations, scenarios for applying Copilot, measures of success and a proposal for next steps.
With this:
- You will learn in which areas of your organization Copilot will bring the most value.
- You will gain a thoughtful and tailored pilot plan.
- Your employees will be realistically involved in the AI implementation process.
- You will establish common goals, metrics and priorities for implementation - which will increase the effectiveness of subsequent stages.
- The organization will be ready to conduct a pilot based on real work scenarios.
Example: At one company, the marketing department reported a need to streamline the preparation of invitations for recurring webinars. Previously, each person created them manually - from scratch - which led to inconsistent communication and wasted time on repetitive tasks.
The Discovery workshop developed dedicated prompts for Copilot with clear guidelines: tone of communication, structure of content (headline, call-to-action, information block) and language to match the company's style.
After implementing the prompts, the time to create one invitation was reduced from about 40 minutes to 10-15 minutes. In addition, teams were able to use a ready-made prompt template - which improved the quality of the content, increased the consistency of the communication and relieved the marketing team of reproduction work.
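Time savings like these translate directly into the success metrics defined during Discovery. The sketch below is purely illustrative - the monthly volume and the 12.5-minute average (midpoint of the reported 10-15 minutes) are assumptions, not measurements:

```python
# Hypothetical illustration of a time-savings success metric for a pilot.
# The per-item times echo the invitation example; the monthly volume is invented.

minutes_before = 40          # average time per invitation before Copilot
minutes_after = 12.5         # assumed average with a tailored prompt (midpoint of 10-15)
invitations_per_month = 30   # assumed team volume

saved_per_item = minutes_before - minutes_after
saved_per_month_h = saved_per_item * invitations_per_month / 60

print(f"Saved per invitation: {saved_per_item} min")   # 27.5 min
print(f"Saved per month: {saved_per_month_h:.1f} h")   # 13.8 h
```

Even with conservative assumptions, a single well-designed prompt can free up more than a working day per team per month - which is why measurable indicators are set before the pilot starts.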
Stage 4: Pilot testing - Microsoft 365 Copilot in practice
Before Copilot is deployed throughout the organization, we run it on a limited basis - among a selected group of users. This is the stage when we test the tool in realistic working conditions, but in a safe, controlled environment. This gives us practical information about how Copilot performs in specific scenarios, as well as what needs to be fine-tuned before a wider deployment.
What do we do at this stage?
- Together with the client, we select one or more pilot groups - usually teams from different departments, representing different roles and tasks.
- Based on the results of the Discovery stage, we prepare specific work scenarios and sets of tailored prompts for these teams.
- Users receive Copilot licenses and instructions on how to use the tool and materials.
- During the pilot, we monitor users' experiences and their use of Copilot in daily work.
- We collect feedback - both in sessions with users and from telemetry data.
- We optimize prompts and scenarios and prepare conclusions and recommendations for further adoption of Copilot in the organization.
This stage usually lasts from 2 to 4 weeks - depending on the number of participants and the range of scenarios tested. At the end, we prepare a summary of the pilot and recommendations for full-scale implementation.
With this:
- You'll test Copilot's performance under real-world conditions - without the risk of a full-scale deployment.
- You will catch possible problems with permissions, prompt quality or scenarios before a wide rollout.
- You will gain concrete conclusions for further adoption - based on real feedback and usage data.
- You will see a quick return on investment on a small, representative group of users.
- You will build engagement with teams who will become internal ambassadors for AI.
- You'll refine prompts and scenarios - before they go out to the entire organization.
Example: At one company, the Copilot pilot began in the sales department. Users worked with prepared prompts for creating offers. After just a few days, it became clear that some prompts needed simpler language and a different order of steps to be easier to follow. With this feedback, the IT team quickly refined the scenarios, which were later used throughout the organization - without the need to test everything from scratch.
Stage 5: Adoption - Microsoft 365 Copilot training and implementation across the organization
After the pilot is complete, we move to the widespread implementation of Copilot in the organization. Our goal is to provide users with the knowledge and materials that will allow them to use the tool in a way that is efficient, secure and tailored to their daily tasks.
What do we do at this stage?
- We organize training courses and workshops for teams - either onsite or online.
- We prepare implementation materials: recordings, checklists, ready-made prompts and procedures.
- We run a series of educational sessions for various user groups - including sessions dedicated to specific departments.
- We support the implementation of a tool for sharing prompts and best practices.
- We support the IT department and the people responsible on the customer side for the ongoing maintenance of the tool.
This stage usually lasts from 2 to 6 weeks - depending on the scale of the organization and the number of teams involved in the implementation.
With this:
- End users will know how to use Copilot in their daily work.
- AI errors and concerns will be reduced.
- Copilot will begin to bring real value at the level of the entire organization.
- A knowledge base and prompts will be created that more teams can use.
- The IT team will be prepared to continue maintaining and developing the tool.
Example:
In one organization, after the pilot was completed, we prepared two sets of implementation materials. For all employees - short "How to get started with Copilot" guides, in the form of 3-minute videos and simple PDF instructions, accessible via the intranet. For leaders and key people in teams - expanded training materials, including best practices, sample scenarios, and tips for supporting Copilot adoption in their departments. As a result, the widespread implementation went smoothly, and local leaders played an active role in adapting teams to work with AI.
Stage 6: Prompting training - working effectively with AI
To get the most out of Copilot, users need to know how to create effective and secure prompts. At this stage, we conduct practical workshops where participants learn how to talk to AI, ask pertinent questions and optimize communication with Copilot.
What do we do at this stage?
- We conduct practical workshops on writing effective prompts for Copilot.
- We teach how to formulate clear, specific and understandable instructions.
- We show how to iteratively improve prompts to increase the accuracy of answers.
- We work with examples from participants' daily work - we test prompts "live".
- We analyze why some answers are wrong or inaccurate and how to counteract this.
- Together we create a set of good practices that can be used in further internal work.
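One technique commonly taught in such workshops - structuring an instruction into explicit sections such as role, context, task and output format - can be sketched as follows. The section names and sample content are hypothetical illustrations, not a Copilot API:

```python
# Illustrative sketch of a structured-prompt technique: splitting an
# instruction into named sections. Section names and wording are hypothetical.

def build_prompt(role, context, task, output_format):
    """Assemble a structured prompt from four named sections."""
    sections = [
        f"Role: {role}",
        f"Context: {context}",
        f"Task: {task}",
        f"Output format: {output_format}",
    ]
    return "\n".join(sections)

prompt = build_prompt(
    role="You are an assistant for the sales team.",
    context="The customer asked about renewal terms for their annual license.",
    task="Draft a short, polite reply summarizing the renewal options.",
    output_format="An e-mail of at most 120 words, with a clear call to action.",
)
print(prompt)
```

Splitting a request into explicit sections like this makes prompts easier to review, reuse and iteratively improve - the same qualities the workshop exercises aim to develop.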
This stage usually lasts from 1 to 2 working days per group. It can be carried out once or cyclically for different departments.
With this:
- Users get noticeably better results from working with Copilot.
- The number of wrong or unhelpful answers decreases, reducing frustration and increasing confidence in the tool.
- The organization develops specific competencies related to the effective use of Copilot in daily work.
Example:
At one client, after implementing Copilot, many employees reported difficulty getting accurate responses. We organized a series of prompting workshops where users learned how to break down commands into steps and use context. Just one week after the training, there was a noticeable increase in the number of tasks completed using Copilot without the need to correct it. In an internal survey, the tool's effectiveness rating increased from 2 to 4 (on a scale of 1-5), and more than 80% of participants said they were more willing to use Copilot on a daily basis.
Stage 7: Repository of prompts - sharing knowledge within the team
In this phase, we focus on creating and expanding an internal library (repository) of prompts for Copilot. The goal is to give all employees access to proven prompts that can be used immediately in their daily work. This way, every user benefits from best practices and avoids the tedious trial-and-error development of effective prompts. The shared repository becomes a living knowledge base - a place where prompts are collected, evaluated and improved based on the experience of the entire organization.
What do we do at this stage?
- We help you launch or customize your prompt repository platform (e.g. SharePoint, OneNote, Teams).
- We design a clear structure - with a name, description, example, application and tags that make it easy to search.
- We create a starter library based on proven prompts used in the organization.
- We work with selected individuals (e.g., team leaders, Copilot Champions) to verify and develop content.
- We show how to manage the repository in practice - conduct reviews, updates and promote knowledge sharing.
- We support internal communication to make the repository a permanent part of daily work with Copilot.
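The entry structure described above (name, description, example, application, tags) can be sketched as a simple data model. In practice the repository would live in SharePoint, OneNote or Teams; the entries and tags below are hypothetical:

```python
# Illustrative sketch of the repository structure: each prompt entry carries
# a name, description, example, application area and search tags.
# The entries themselves are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class PromptEntry:
    name: str
    description: str
    example: str
    application: str
    tags: set = field(default_factory=set)

repository = [
    PromptEntry(
        name="Offer draft",
        description="Drafts a first version of a sales offer.",
        example="Draft an offer for <product> highlighting <benefits>...",
        application="Sales",
        tags={"sales", "offer", "word"},
    ),
    PromptEntry(
        name="Meeting summary",
        description="Summarizes a Teams meeting into action points.",
        example="Summarize this meeting into decisions and action items...",
        application="All departments",
        tags={"teams", "meeting", "summary"},
    ),
]

def search(repo, tag):
    """Find repository entries carrying the given tag."""
    return [entry for entry in repo if tag in entry.tags]

for entry in search(repository, "sales"):
    print(entry.name)   # prints "Offer draft"
```

A consistent schema like this is what makes reviews, updates and tag-based search practical once the library grows beyond a handful of prompts.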
This stage usually lasts from 1 to 2 weeks. It can be carried out on a one-time basis - as support for the launch of the repository - or developed periodically as part of internal activities and ongoing support.
With this:
- Users have quick access to verified prompts and save time.
- Copilot's response efficiency and satisfaction with its use are increasing.
- A culture of collaboration and development of AI competencies within the team is being created
Example: In the sales department of a technology company, salespeople often tried to use Copilot to prepare quotes, follow-ups and analysis emails - each in their own way, with varying results. After launching a repository of prompts, they gained access to ready-made, tested prompts created by teammates. Instead of starting from scratch, they used the library, tested the prompts and added comments with suggestions for changes.
As a result, prompts began to be developed collaboratively - as practical tools tailored to the role, rather than one-off experiments.
Stage 8: Ongoing support, maintenance and development
Once Copilot is implemented, we provide the organization with ongoing, expert care - tailored to changing needs, technologies and processes. This stage is not just about maintaining the tool, but more importantly its conscious development within the company's structure.
As part of our long-term support, ISCG becomes a permanent point of contact for IT, compliance and business teams - we advise, update, inspire. We support the organization in maintaining a high level of Copilot usage, developing the prompt base, testing new scenarios, and implementing changes driven by Microsoft 365 updates or regulatory guidelines.
What do we do at this stage?
- We provide a permanent advisory team - which knows the context of the organization and supports it on an ongoing basis.
- We monitor Copilot usage and identify areas of low or high usage.
- We propose new scenarios for working with AI, tailored to roles, departments and business processes.
- We provide information about Copilot's news and recommend what to test.
- We support the development of user competence - through training, consultation and Q&A sessions.
- We help maintain compliance - updating policies, documentation and data access.
- We organize reviews of the repository of prompts - supporting their development and quality.
- We develop an AI culture within the organization - through internal meetings, materials and best practices.
With this:
- You have a trusted partner to support the organization in its daily work with AI.
- Users remain engaged and use the tool consciously and effectively.
- The organization gains up-to-date knowledge of Copilot's capabilities - without having to search for it on its own.
- It is possible to respond quickly to changes - technical, organizational, regulatory.
- Copilot is growing with the company - supporting more departments and business processes.
This stage is continuous and carried out on an ongoing basis - in monthly or quarterly cycles.
We organize cooperation in recurring blocks, such as:
- Monthly training sessions on Copilot novelties or inspiring use scenarios,
- Usage review - analysis of telemetry data and insights from departments,
- Testing new use cases and developing a repository of prompts,
- Onboarding new AI users and leaders,
- 1:1 consulting and advisory services for business or technical departments.
Example:
In a multinational organization that implemented Copilot in the finance and HR departments, ISCG served as a permanent point of contact. As part of regular cooperation, we proposed new scenarios for the purchasing and R&D teams, prepared a set of prompts, and organized tests and workshops. In parallel, we updated the documentation and repository in accordance with the new organizational guidelines. As a result, the company was able to safely and effectively expand the use of Copilot - without having to launch a new implementation project.
