The Error Up There: Security Needed for Copilots

  • Published on Feb 5, 2025
  • Copilots aren’t just for aviation anymore; they are embedded into nearly every business and personal productivity tool available today, from Microsoft 365 to Power Platform. Microsoft Copilots take efficiency to the next level. The problem is that what is being built, designed, and sent with them is often insecure, and strong air traffic control is needed to govern how these Copilots are used and to prevent data leakage. Commonly ignored concerns include prompt injection, circumvention of data classification, inherent uncertainty about what applications in production actually do, and over-sharing.
    In this on-demand webinar, you will learn:
    How the concept of Copilots does not necessarily solve the problem of AI alignment: even though AI interactions are tied to user intentions, AI can still do things the user neither expects nor asks for
    If you’re building extensions to your Microsoft Copilots or creating your own Copilots with Copilot Studio, there are concrete risks you must avoid and configurations you should watch out for
    Data leakage that can stem from the use of Copilots in M365
    Examples of apps, automations, and Copilots built by business users that interact with sensitive data
    Implementing secure guardrails for the Copilots that are used across the Microsoft suite
    Why existing tools don’t go deep enough into ‘why’ apps are built and ‘why’ they need to do certain things
