MEG Achieves SOC 2 Type II Attestation!

At MEG, protecting sensitive healthcare data is a core part of who we are. That’s why we’re thrilled to announce that MEG has achieved the prestigious SOC 2 Type II attestation, a globally respected benchmark that reflects our commitment to privacy, security, and operational integrity.

We recently spoke with Guvanch Meredov, MEG’s Head of Compliance and Data Protection Officer, to learn more about what this milestone means for MEG, our customers, and the wider healthtech ecosystem.

In this blog, you'll discover:

  • What is SOC 2 Type II and why does it matter?

  • What does SOC 2 Type II evaluate?

  • How MEG Achieved SOC 2 Type II

  • What does this mean for our customers and partners?

  • What’s next in MEG’s compliance journey

  • Final Reflection

What is SOC 2 Type II and why does it matter?

SOC 2 Type II is one of the highest security standards in SaaS. Developed by the American Institute of Certified Public Accountants (AICPA), it goes beyond a one-time review: it evaluates how effectively an organisation operates its data protection and security controls over a 12-month period.

While SOC 2 Type I assesses design at a single point in time, Type II proves that those controls are consistently implemented over months of real-world operation.

SOC 2 Type II shows that our controls don’t just exist on paper; they’re consistently applied in real operations.
— Guvanch Meredov, Head of Compliance/DPO at MEG

For healthcare providers and regulated organisations working with us, this is a meaningful assurance: MEG can securely manage their sensitive data at scale with the highest standards of protection.

What does SOC 2 Type II evaluate?

The audit evaluates MEG’s controls across five key trust service principles:

  • Security — Protecting data against unauthorised access

  • Availability — Ensuring systems are reliable and operational

  • Confidentiality — Keeping sensitive information private

  • Processing Integrity — Ensuring systems operate correctly and without error

  • Privacy — Safeguarding personal data in line with regulations

The scope included our cloud infrastructure, encryption protocols, access controls, incident response, and more, providing a thorough evaluation of both technical and procedural safeguards.

How MEG Achieved SOC 2 Type II

Our attestation covers June 2024 through May 2025, and was the result of a sustained, company-wide effort. The journey included:

  • Scoping and defining systems under audit

  • Implementing and refining controls aligned with trust service criteria

  • Rigorous internal readiness checks

  • Extensive evidence gathering to demonstrate compliance in practice

  • Third-party validation and testing

This builds on MEG’s existing ISO 27001 certification and GDPR adherence, enabling us to maintain a high standard of trust and transparency.

What does this mean for our customers and partners?

Whether you're an existing customer or evaluating MEG, this attestation brings key advantages:

  • Independent validation of our ability to manage sensitive data securely

  • Alignment with major compliance frameworks — including GDPR, ISO 27001, Cyber Essentials, and the NHS DSPT

  • Faster procurement and onboarding, thanks to verifiable third-party assurance

  • Increased credibility with public sector buyers, supported by our UK G-Cloud 14 listing

Clients can also request executive summaries, audit reports, or attestations to support their own compliance requirements.

For our customers, it provides independent assurance that MEG can safely manage and process sensitive healthcare data at scale.

What’s next in MEG’s compliance journey

SOC 2 Type II attestation is a major milestone but not the finish line. MEG is committed to ongoing compliance through:

  • Annual ISO 27001 surveillance audits and annual SOC 2 Type II audits

  • Biannual penetration tests and vulnerability scans

  • Continuous staff training and policy reviews

  • Automated real-time monitoring of security controls

  • Regular GDPR Data Protection Impact Assessments (DPIAs) and related processes

With growing interest in US and international markets, MEG is aligned with HIPAA requirements and is scheduled for an external audit to validate compliance in Q4 2025.

Final Reflection

SOC 2 Type II is more than a logo or a line on a slide. It reflects the reality that when organisations trust MEG, they’re trusting us with something sacred—the safety, privacy, and dignity of people’s health data.
We take that responsibility seriously. And now, we have the audit to prove it.

Want to review our SOC 2 report? Contact us at dataprotection@megit.com.


If you are interested in discovering how MEG can meet your data protection, operational, and regulatory needs, our team is here to help.

What We’ve Learned About Aligning Audits to CQC’s Quality Statements

MEG - CQC Dashboard

Introduction: More Than Audit Coverage

When the CQC launched its Single Assessment Framework, many governance leads paused - not because they weren’t ready, but because they knew this would ask something different of their audit programmes.

Not just more coverage.
More meaning.

We’ve had the privilege of working alongside NHS governance teams adapting to this change, not with panic, but with intention.

This post reflects what those teams have taught us: how they’re realigning audits to the new CQC Quality Statements, what’s working, and where the real opportunities are.

What's Changed and Why It Matters

The move from KLOEs to 34 Quality Statements was more than a structural update. It reframed what the CQC values in audits:

  • Less about checking compliance

  • More about demonstrating outcomes

  • Less about volume

  • More about triangulation: audit + incident + feedback + assurance

For governance leads, this shift presents a question:

“Are our audits generating the kind of evidence CQC is actually looking for?”

What Teams Are Learning in Practice

1. Audits Are Being Seen as Evidence Generators

Rather than audit as a standalone task, teams are starting to use it as a way to surface assurance that matters to execs, to staff, and to inspectors.

In one Trust, we saw teams stop referring to audits as “compliance checks” and start calling them “assurance insights.” That language shift unlocked a mindset shift.

2. Templates Are Evolving (But Staying Pragmatic)

Some partners have co-designed new audit templates directly mapped to Quality Statements. Others are tagging existing templates and using MEG to report on them by domain.

There’s no one-size-fits-all, but the most effective teams do less reworking than expected and more reframing.

Example: a “Medication Safety” audit was simply retagged under the statement: “We learn when things go wrong.” The audit questions didn’t change, but the reporting narrative did.

3. Services Want Clarity, Not More Burden

Frontline teams often say they’re happy to participate in audits, as long as it’s clear what they’re for. Tying audits visibly to CQC domains and themes has improved engagement at several sites using MEG.

Rather than adding audits, some Trusts are consolidating:

  • Combining multiple overlapping audits into one aligned format

  • Using Quality Statements as thematic anchors

  • Creating visual dashboards to show what’s covered and what’s not

What MEG Helps Surface

Governance leads using MEG have shared that the most valuable shift has been visibility. Some of the key benefits we’ve seen:

✔️ Domain-Tagged Audit Templates
Audits mapped to CQC domains and Statements using in-platform tagging

✔️ Dashboard Filters by Domain or Theme
Easily see what’s been audited under Safe, Well-Led, Responsive, etc.

✔️ Gaps and Overlaps Made Visible
Trusts can spot under-audited Statements, or consolidate where duplication exists

✔️ Action and Outcome Linking

MEG lets teams connect an audit to its follow-up actions, risks, and training, making it easier to tell the story of impact

CQC Domain Compliance - RAG Dashboard by Ward

Suggestions from What We’ve Seen

From teams who’ve made the transition feel meaningful, not just compliant, we’ve observed some emerging patterns:

🧩 Start with what you already do
Rather than build from scratch, most teams begin by reviewing existing audits and tagging them to the new domains.

🔗 Link audits to other evidence
Teams get stronger assurance when audits are triangulated with:

  • Incident themes

  • Patient feedback

  • Policy reviews

  • Staff learning outcomes

📊 Let dashboards tell the story
Instead of lengthy audit logs, teams are surfacing domain-based visuals that show:

  • What’s covered

  • What’s overdue

  • Where improvements have followed


Conclusion: From Checklist to Confidence

Realigning audits to the CQC’s Quality Statements doesn’t have to mean overhauling your entire system.

Often, the biggest change is in how audits are framed, linked, and presented.

The teams we’ve learned the most from didn’t chase volume; they focused on value. And they built audit cultures that serve not just inspection readiness, but meaningful internal assurance.

Curious how your audit programme maps to the new CQC Statements?
Book a call with the MEG team and we’ll walk you through a domain-based snapshot of what’s possible.

Making Dashboards Part of Governance Culture

Example of a live MEG dashboard showing CQC domains with RAG ratings

Introduction: Dashboards Are Only Useful If They're Used

Dashboards are everywhere in healthcare. But in governance?
They’re only as valuable as the conversations they support.

We’ve seen NHS teams build beautiful, detailed dashboards, only to realise they’re not actively shaping board reporting, clinical decision-making, or team priorities. That’s not a tech problem. It’s a cultural one.

In this post, we explore what happens when dashboards move from back-room reporting to frontline governance tools. It’s based on what we’ve learned from NHS Trusts using MEG to embed dashboards into meetings, workflows, and assurance frameworks - not just to see performance, but to act on it.

Table of Contents

  1. The Dashboard Dilemma

  2. What We’ve Seen from NHS Governance Leaders

  3. The Three Jobs a Dashboard Should Do

  4. How Teams Are Making Dashboards Part of the Conversation

  5. What MEG Dashboards Help Surface

  6. Conclusion: Culture First, Then Tech

The Dashboard Dilemma 

Most governance leads want real-time visibility.
But visibility only helps when it:

  • Reaches the right people

  • Supports the right discussions

  • Surfaces what matters, not just what’s measurable

In several organisations, we’ve seen dashboards launched with energy, only to fade from view after initial rollout. Why? Because they weren’t integrated into how governance teams think, meet, or make decisions.

What We’ve Seen from NHS Governance Leaders

The Trusts we’ve learned the most from have something in common:

They didn’t just launch dashboards.
They built habits around them.

Some used domain-specific dashboards (e.g. Well-Led, Safe). Others developed role-specific views for divisions, ward managers, or executive committees.

What mattered most?
The dashboards became part of the governance rhythm, not a side project.

The Three Jobs a Dashboard Should Do 

Based on what we’ve seen across partner Trusts, dashboards work best when they serve these three functions:

1. Surface signals, not noise

A good dashboard highlights what’s slipping, what’s overdue, or what’s out of pattern.
Clarity over complexity.

2. Prompt action

Every data point should have a clear implication: Who owns it? What’s the follow-up?
“Inform” isn’t enough. “Activate” is better.

3. Support assurance, not just reporting

Boards and committees need more than figures; they need confidence that risks are being seen, understood, and managed.

Dashboards must speak to governance visibility, not just operational tracking.

How Teams Are Making Dashboards Part of the Conversation 

Here are some patterns we’ve observed from Trusts embedding dashboards into governance culture:

✅ Dashboards Are Standing Agenda Items
In monthly Clinical Governance or Divisional meetings, live dashboards are reviewed alongside SIs and audit updates, not after the meeting as a slide.

🗂️ Governance Packs Pull from Dashboards, Not Spreadsheets
Several Trusts now use MEG dashboard exports as the base for their committee reports, Board Assurance Frameworks, or executive updates.

🧩 Local Teams Are Given Their Own Views
Ward or site-level dashboards help clinical teams see how their activity links to domain performance or inspection readiness.

🔄 Performance Reviews Reference Domain Dashboards
Safe, Well-Led, and Responsive data are presented not by exception, but as standard inputs to team reflection and performance cycles.

What MEG Dashboards Help Surface 

MEG dashboards were shaped by governance leaders who wanted clarity, not clutter.

🔹 CQC Domain Views
See audit coverage, incidents, risks and actions mapped to domain and Quality Statements

🔹 Live RAG Indicators
Highlight overdue actions, unverified learning, or gaps in assurance

🔹 Action Ownership
Track which service, team or individual is responsible and what’s been completed

🔹 Cross-System Integration
Link incidents to audits to policies, making learning and oversight easier to follow


Conclusion: Culture First, Then Tech 

The most effective governance dashboards aren’t the most advanced.
They’re the most used.

Embedding dashboards into governance culture doesn’t start with features. It starts with habits:

  • Reviewing dashboards together

  • Taking action from them

  • Reporting from them

  • Trusting them

When that happens, dashboards stop being reporting tools and become assurance systems.


Want to explore what dashboards could look like for your governance structure?

Book a MEG Dashboard Walkthrough and we’ll show you how NHS teams are using real-time views to support real-world decisions.

How to Embed Closed-Loop Learning in NHS Clinical Governance

Closed Loop Learning infographic

Introduction: Real Improvement Means Following Through

Across the NHS, governance leaders are united in one belief: learning matters most when it leads to change.

And under the Care Quality Commission’s updated Single Assessment Framework, that belief is now a clear expectation. The CQC wants to see not just that we review incidents, but that we close the loop by turning insights into action, and actions into measurable improvement.

From our work with NHS Trusts, care providers, and governance teams, we’ve seen what this looks like in practice and where the challenges are.

This blog shares what we’ve learned from those organisations. It offers a practical roadmap to help you:

  • Make learning loops visible and trackable

  • Align assurance with CQC’s new expectations

  • Build a governance culture where improvement is consistently evidenced

Coming up:

  1. What Closed-Loop Learning Looks Like

  2. Why Even Strong Governance Teams Sometimes Struggle

  3. The 6-Stage Learning Loop in Practice

  4. How MEG Supports Teams in Closing the Loop

  5. Embedding the Loop: Tips from NHS Partners

  6. Conclusion + Next Steps

What Closed-Loop Learning Looks Like

At its simplest, a learning loop is the process of turning a safety or quality issue into a verified improvement in practice.

The loop can include:

🚨 Incident or feedback → 📋 Action → 🎓 Training → 🔍 Audit → 📈 Outcome → ✅ Evidence of change

What progressive providers have shown us is that the loop isn’t about creating more paperwork. It’s about designing systems that make it easy to:

  • See where change is needed

  • Assign ownership

  • Track impact

And critically, show that improvement efforts are actually working.

Why Even Strong Governance Teams Sometimes Struggle

We’ve worked with teams who are deeply committed to improvement, but feel frustrated by the barriers in their way. Common themes include:

🔹 Data spread across systems
Incidents in Datix, audits in Excel, training on paper: there is no single view.

🔹 Action plans that drift
Well-written action logs, but no way to track whether they were followed through.

🔹 Good intentions, missing evidence
Training is delivered, but no audit confirms whether practice changed.

These aren’t failures; they’re symptoms of governance systems that haven’t caught up with governance ambition.

The 6-Stage Learning Loop in Practice

Here’s the structure many of our NHS partners are using to close the loop more effectively:

1. 🚨 Trigger
Incident, complaint, audit failure, or staff concern

2. 📊 Analysis
PSIRF or thematic review to understand root causes

3. 📘 Action & Policy Review
Clear next steps, SOP updates, and named owners

4. 🧾 Credentialing / Training
Staff receive support and development, not just tasks

5. 🔍 Audit for Assurance
Check that changes are now part of everyday practice

6. 📈 Outcome Review & Loop Closure
Track the effect over time. Did things improve?

What we’ve learned: it’s not about complexity; it’s about clarity. When teams share a common improvement-loop model, everyone knows what to do next.
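A shared loop model can be as simple as a small state machine. The sketch below is purely illustrative (the stage names are ours, taken from the six steps above, not from MEG's actual product): each issue carries a stage, and the loop is closed only after the outcome review.

```python
from enum import Enum

class LoopStage(Enum):
    """The six stages of the closed learning loop, in order."""
    TRIGGER = 1
    ANALYSIS = 2
    ACTION_POLICY_REVIEW = 3
    TRAINING = 4
    AUDIT = 5
    OUTCOME_REVIEW = 6

def next_stage(stage: LoopStage):
    """Advance one step; returns None once the loop is closed."""
    if stage is LoopStage.OUTCOME_REVIEW:
        return None
    return LoopStage(stage.value + 1)

# Walk one issue through the whole loop
stage = LoopStage.TRIGGER
while stage is not None:
    print(stage.name)
    stage = next_stage(stage)
```

Modelling the loop explicitly is what makes "overdue steps" and "loop status" reportable at all: each issue is always at exactly one named stage.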

How MEG Supports Teams in Closing the Loop 

MEG’s tools were shaped by feedback from quality and governance teams who wanted to simplify and strengthen the way they work.

Here’s how providers are using MEG to support the loop:

🔗 End-to-End Integration
Connects incidents, actions, training, policies, and audits

📊 Loop Dashboards
See live data on loop status, overdue steps, and domain performance

📋 Action Ownership
Assign tasks, set deadlines, and track progress visibly

🧠 Evidence Capture
Auto-generate reports showing how learning led to measurable change

One Trust reduced its average loop-closure time by over 40% using MEG’s Action Planning tool.

Another created domain dashboards that now support Board-level assurance.

These aren’t just software features. They’re workflows that work because teams helped design them.

MEG's Action Planning tool

MEG’s Action Planning Tool - Showing Open, In-Progress, and Closed Tasks

Embedding the Loop: Tips from NHS Partners

The teams we’ve learned the most from have a few habits in common:

1. They standardise, but stay flexible
They adopt a core loop structure but let services adapt language or steps to their context.

2. They track loops, not just logs
It’s not just about counting incidents; it’s about showing improvement journeys.

3. They bring loop data into committees
Dashboards are shared in governance meetings, so learning becomes part of everyday assurance.


🎯 Conclusion + Next Steps

Embedding closed-loop learning doesn’t mean doing more.
It means creating clarity, so your governance efforts lead to meaningful, measurable change.

Across the providers we work with, we’ve seen that once the loop is visible, it becomes doable; once it’s tracked, it becomes culture.

🔄 Curious how your current learning loops stack up?
Book a call with the MEG team to see how loops could support even stronger assurance.

Aligning Audits with CQC Quality Statements

Introduction: Audits That Do More Than Measure

When the Care Quality Commission introduced its new Single Assessment Framework, governance teams across the NHS quickly recognised a shift, not in what audits were for, but in how they needed to work.

No longer was it about ticking the right boxes or ensuring every policy had been reviewed. Instead, audits were being reframed as evidence of lived experience, organisational learning, and impactful care.

This blog post reflects what we’ve seen—and learned—alongside NHS Trusts using MEG to make this shift. We’ll explore how teams are:

  • Reframing audits as part of a broader assurance story

  • Aligning templates to Quality Statements with minimal disruption

  • Using domain-based reporting to surface insight, not just activity

Coming up in this article:

  1. The Shift from KLOEs to Quality Statements

  2. What We’re Hearing from NHS Partners

  3. Three Ways Teams Are Adapting Their Audit Approach

  4. How MEG Supports Domain-Aligned Auditing

  5. Conclusion: From Coverage to Confidence

The Shift from KLOEs to Quality Statements

The transition from Key Lines of Enquiry (KLOEs) to 34 Quality Statements is more than cosmetic. It signals a shift in CQC’s expectations:

Each Quality Statement is supported by six evidence categories, including:

  • People’s experience

  • Policies and processes

  • Observation and feedback

  • Culture, leadership, and outcomes

Meaning: audits now need to fit into a bigger story, one that connects what’s checked with what’s changed.

What We’re Hearing from NHS Partners 

Governance leads and quality managers we've worked with have shared some consistent themes:

🔸 “We don’t need more audits, we need better alignment.”
Many teams have strong audit coverage, but lack clarity on which Quality Statements are being evidenced (and where gaps exist).

🔸 “Our audits still reflect old structures.”
Some audit templates were designed around KLOEs or historical policies. They’re still useful, but they don’t always reflect the new framework’s language or intent.

🔸 “We want to audit what matters, not just what’s measurable.”
There’s a growing appetite to include cultural and experience-based domains (like Responsive and Caring) in audit programmes, not just procedural areas.

Three Ways Teams Are Adapting Their Audit Approach

1. Tagging, Not Rebuilding

Rather than redesign every audit from scratch, many teams are tagging existing audits to their relevant Quality Statements.

Example: An audit titled “Ward-Based Medicines Safety” is now tagged under “Safe: We learn when things go wrong.”

In MEG, these tags become filters for dashboards and reports.
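Conceptually, tagging is just attaching Quality Statement labels to existing audits and filtering on them. A minimal sketch of that idea (the `Audit` class and the second statement name are hypothetical, not MEG's data model; the "We learn when things go wrong" tag is from the example above):

```python
from dataclasses import dataclass, field

@dataclass
class Audit:
    title: str
    # Quality Statements this audit is tagged with
    tags: set = field(default_factory=set)

audits = [
    Audit("Ward-Based Medicines Safety",
          {"Safe: We learn when things go wrong"}),
    Audit("Hand Hygiene Audit",
          {"Safe: We prevent infection"}),  # illustrative statement name
]

def by_statement(audits, statement):
    """Filter audits to those evidencing a given Quality Statement."""
    return [a for a in audits if statement in a.tags]

print([a.title for a in
       by_statement(audits, "Safe: We learn when things go wrong")])
```

The point of the sketch: the audit content is untouched; only the metadata changes, which is why retagging is so much cheaper than rebuilding.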

2. Using Quality Statements to Identify Gaps

Some Trusts have used the CQC Statement list as a mapping tool, cross-referencing against audit coverage to see:

  • Where duplication exists

  • Where gaps are hidden

  • Which domains lack current audit data

This allows them to rationalise, not expand, their audit programme.
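The mapping exercise itself is a simple set comparison: statements with no tagged audits are gaps, and statements evidenced by several audits are consolidation candidates. A hypothetical sketch (all statement and audit names here are invented for illustration):

```python
# The full Quality Statement list (trimmed to three for illustration)
quality_statements = {
    "Safe: We learn when things go wrong",
    "Safe: We prevent infection",
    "Well-Led: We have clear governance",
}

# Which statements each existing audit is tagged to
audit_tags = {
    "Ward-Based Medicines Safety": {"Safe: We learn when things go wrong"},
    "Incident Review Audit":       {"Safe: We learn when things go wrong"},
}

# Statements covered by at least one audit
covered = set().union(*audit_tags.values())

# Gaps: statements with no current audit data
gaps = quality_statements - covered

# Duplication: statements evidenced by more than one audit
duplicated = {s for s in covered
              if sum(s in tags for tags in audit_tags.values()) > 1}

print(sorted(gaps))
print(sorted(duplicated))
```

Here the two "Safe" incident audits overlap while two statements have no coverage at all, which is exactly the rationalise-don't-expand picture described above.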

3. Making Audits Part of Domain Dashboards

By integrating audit results into MEG’s domain dashboards, teams can:

  • Track audit coverage across Safe, Effective, Well-Led, etc.

  • Monitor audit performance by service, location, or team

  • Include audits in regular governance reporting, not just inspection prep

How MEG Supports Domain-Aligned Auditing

MEG was designed to support audit frameworks that evolve with regulatory needs.

Key features include:

📂 Domain-Based Audit Templates
Easily tag audits to one or more Quality Statements

📊 Filterable Dashboards
Track coverage and performance by domain, service, or audit type

🔗 Link to Incidents and Actions
Surface learning and improvement journeys, from incident to audit follow-up

🧾 Auto-Generated Reports
Create board-ready views showing audit alignment across the organisation


Conclusion: From Coverage to Confidence 

The most effective audit programmes we’ve seen aren’t bigger.
They’re better aligned.

They reflect Quality Statements not just in name, but in outcome-focused evidence. And they make it easier for governance leads to demonstrate assurance, not just activity.

As more NHS teams embed this alignment into their tools and workflows, audits are becoming more than checks.

They’re becoming stories of progress and trust.

Curious how your audit programme aligns with CQC’s Quality Statements?
Book a call with the MEG team and get a domain-based snapshot in under 30 minutes.