Microsoft Purview: the solution to secure Copilot deployment

Posted by Steve Brown on Apr 17, 2024 11:30:00 AM

Surging interest in AI (and, more specifically, Microsoft's new AI companion, Copilot) is putting cyber risk under a new microscope as businesses figure out how best to introduce it securely. Microsoft Purview holds many of the answers.

Purview plays a critical role in enabling responsible AI development and deployment with Copilot. As an end-to-end data estate insights solution, Purview provides capabilities to catalogue, map, and monitor sensitive data across the organisation's entire data landscape. This gives developers, engineers, and SOC teams the visibility and control needed to assess and mitigate potential AI ethics risks when building and launching Copilot-powered applications.

Purview's data catalogue maps where PII, financial information, healthcare data, and other sensitive data resides across on-prem, multi-cloud, and SaaS environments. This inventory allows identification of datasets that could lead to bias, fairness, or confidentiality issues if used to train Copilot models. Purview also applies sensitivity labels to tag and categorise the sensitive data it identifies, helping teams treat datasets appropriately when developing Copilot applications and anonymise or synthesise them where needed to remove ethical concerns.
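To make that concrete, the catalogue can be queried programmatically to build such an inventory. The sketch below is illustrative only: it assumes Python with the azure-identity and requests packages and Purview's discovery (search) REST API, and the account name, api-version, and classification are placeholders; the exact endpoint path may differ between Purview API releases.

```python
# Minimal sketch: list catalogue assets carrying a given classification,
# e.g. to inventory sensitive datasets before they feed Copilot workloads.
# Assumes azure-identity + requests and read access to a Purview account.
import requests
from azure.identity import DefaultAzureCredential

PURVIEW_ACCOUNT = "contoso-purview"          # placeholder account name
CLASSIFICATION = "MICROSOFT.PERSONAL.EMAIL"  # example built-in classification

# Acquire a data-plane token for Purview.
credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

# Discovery/search query filtered by classification (api-version may vary).
url = (
    f"https://{PURVIEW_ACCOUNT}.purview.azure.com"
    "/catalog/api/search/query?api-version=2022-03-01-preview"
)
body = {
    "keywords": None,
    "limit": 50,
    "filter": {
        "classification": CLASSIFICATION,
        "includeSubClassifications": True,
    },
}
resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

for asset in resp.json().get("value", []):
    print(asset.get("qualifiedName"), "->", asset.get("classification"))
```

Run it with credentials that have data-reader rights on the Purview account, and swap the classification for whichever built-in or custom classifications matter to your Copilot use case.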

Moreover, Purview's data lineage feature provides visibility into data flows from source to consumption, revealing how various data sources are interconnected and used across an organisation. For instance, lineage lets users check whether any sensitive data is entering downstream AI development processes, highlighting potential areas of risk. Combined, the catalogue and lineage give development teams full visibility into data pedigree before launching Copilot-enabled apps.
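As an illustration, lineage for a single catalogue asset can be pulled over Purview's Atlas-compatible API and walked back towards its sources. This is a minimal sketch reusing the same placeholder account and token setup as above; the asset GUID is a placeholder you would take from a catalogue search result.

```python
# Sketch: fetch lineage for one asset and list the links feeding into it,
# so sensitive inputs to Copilot-related datasets can be spotted early.
# Assumes the Atlas v2 lineage endpoint exposed by Microsoft Purview.
import requests
from azure.identity import DefaultAzureCredential

PURVIEW_ACCOUNT = "contoso-purview"                  # placeholder
ASSET_GUID = "00000000-0000-0000-0000-000000000000"  # placeholder GUID from a search result

token = DefaultAzureCredential().get_token("https://purview.azure.net/.default").token
url = (
    f"https://{PURVIEW_ACCOUNT}.purview.azure.com"
    f"/catalog/api/atlas/v2/lineage/{ASSET_GUID}?direction=INPUT&depth=3"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
lineage = resp.json()

# guidEntityMap holds every entity in the lineage graph; relations link them.
entities = lineage.get("guidEntityMap", {})
for rel in lineage.get("relations", []):
    src = entities.get(rel["fromEntityId"], {}).get("displayText", rel["fromEntityId"])
    dst = entities.get(rel["toEntityId"], {}).get("displayText", rel["toEntityId"])
    print(f"{src}  -->  {dst}")
```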

In production, Purview's continuous scanning and monitoring capabilities maintain oversight of the AI data estate, with any new sensitive data immediately flagged through automated classification and labelling. Purview also features trainable classifiers, allowing customisable identification of sensitive data types beyond the out-of-box patterns: by training a classifier on sample files, organisation-specific data such as product codes, customer IDs, or proprietary content can be quickly detected, ensuring comprehensive data governance across structured, unstructured, and custom data sources. In addition, scan results can trigger notifications to data owners if undesired data is detected in Copilot's training environment, allowing rapid remediation to keep the AI ethical and compliant. Similarly, integrating Purview with Power BI gives data governance teams AI ethics dashboards for ongoing monitoring.
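A rough sketch of that notification loop is shown below. It reuses the discovery query from earlier; the collection ID and the notify_owner helper are hypothetical placeholders, and in practice this logic would more likely run as a scheduled job, Logic App, or Power Automate flow wired to email or Teams.

```python
# Sketch: flag assets that have picked up a sensitive classification inside a
# collection used for Copilot data, and notify the data owner.
# The collection ID and notify_owner() are hypothetical placeholders.
import requests
from azure.identity import DefaultAzureCredential

PURVIEW_ACCOUNT = "contoso-purview"        # placeholder
COPILOT_COLLECTION = "copilot-grounding"   # hypothetical collection ID
WATCHED = ["MICROSOFT.PERSONAL.EMAIL", "MICROSOFT.FINANCIAL.CREDIT_CARD_NUMBER"]


def notify_owner(asset: dict) -> None:
    # Placeholder: wire this up to email, Teams, or an ITSM ticket.
    print(f"ALERT: {asset.get('qualifiedName')} classified as {asset.get('classification')}")


token = DefaultAzureCredential().get_token("https://purview.azure.net/.default").token
url = (
    f"https://{PURVIEW_ACCOUNT}.purview.azure.com"
    "/catalog/api/search/query?api-version=2022-03-01-preview"
)
for classification in WATCHED:
    body = {
        "keywords": None,
        "limit": 50,
        "filter": {
            "and": [
                {"classification": classification, "includeSubClassifications": True},
                {"collectionId": COPILOT_COLLECTION},
            ]
        },
    }
    resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    for asset in resp.json().get("value", []):
        notify_owner(asset)
```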

Purview's integration with Azure Policy enables centralised governance guardrails over the AI data estate. Policies can enforce controls around data access, retention, geography, encryption and acceptable use based on sensitivity. Any policy violations trigger alerts for investigation, ensuring continued oversight of the data used by Copilot in production.
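As one hypothetical example of such a guardrail, a custom Azure Policy restricting where data resources can live could be defined and assigned with the Azure SDK for Python. This is a sketch under stated assumptions: the azure-mgmt-resource package, a placeholder subscription ID, and an illustrative UK-only location rule; real estates would typically apply a broader policy set and manage it as code through Bicep or Terraform.

```python
# Sketch: create and assign a custom Azure Policy definition that denies
# resources deployed outside approved regions, as one example of a
# governance guardrail over the data estate that Copilot draws on.
# Assumes the azure-mgmt-resource package; IDs below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import PolicyClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = PolicyClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Define the policy: deny anything created outside the approved UK regions.
definition = client.policy_definitions.create_or_update(
    policy_definition_name="deny-non-uk-locations",
    parameters={
        "policy_type": "Custom",
        "mode": "All",
        "display_name": "Deny resources outside approved UK regions",
        "policy_rule": {
            "if": {"not": {"field": "location", "in": ["uksouth", "ukwest"]}},
            "then": {"effect": "deny"},
        },
    },
)

# Assign the definition at subscription scope so violations are blocked.
client.policy_assignments.create(
    scope=f"/subscriptions/{SUBSCRIPTION_ID}",
    policy_assignment_name="deny-non-uk-locations",
    parameters={"policy_definition_id": definition.id},
)
print("Policy assigned:", definition.id)
```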

With its end-to-end sensitive data visibility, automated insights, and policy enforcement, Microsoft Purview is indispensable for harnessing the power of Copilot ethically and securely. It also complements the wider Microsoft security stack, adding data-level context to the signals already coming from Defender, Sentinel, Intune, and Entra. In short, Purview's capabilities enable developers to assess AI risks early, design in appropriate controls, and maintain responsible oversight after deployment.

Fundamentally, Purview helps ensure Copilot adheres to organisational values, regional regulations, and ethical AI best practices – building trust with customers, regulators and the public. For any enterprise AI program, it delivers the data estate transparency and governance needed for ethical and compliant innovation.

If you’re considering adopting Copilot but have concerns about AI security, speak to the FluidOne team and find out whether Purview is the solution for you.