
What happens when audit stops guarding the past and starts guiding the future? Risk was never meant to be static. But for too long, audit approached it as something to control, not something to navigate.

Innovation is moving fast, but governance often isn’t. While CIOs drive digital change and CISOs fight off evolving threats, audit teams can get stuck in the past, tied to legacy controls and outdated checklists. I’ve seen it happen. But I’ve also seen what it takes to break that cycle. 

I saw this firsthand as I wrestled with aging Sarbanes–Oxley (SOX) processes, emerging AI risks and the challenges of leading cross-border teams. This story isn’t just about fixing controls. It’s about reimagining audit as a strategic partner that helps guide transformation. We weren’t just auditing controls. We were auditing the culture of trust. That mindset shift helped us cut through noise, reduce audit fatigue and build trust across teams. 

If you’re a CIO, CISO or audit leader looking to align trust with innovation, this is your story too. 

Not long ago, I was deep in a maze of 450 IT SOX controls: many repetitive, some obsolete and most no longer reflecting how our systems truly operated. At the same time, AI was creeping into every corner of the business. Cloud platforms were racing ahead of legacy controls. Our global teams were managing risk through vastly different lenses. There was no playbook — just a growing tension between innovation and oversight. 

So, I asked myself: What if audit didn’t just keep up, but stepped up? 

What came next wasn’t some big shiny overhaul. It was a shift in how we thought, talked and showed up. We stopped chasing checklists and started asking what actually matters. What helps the business? Where can we lean in and make it better? 

This is a peek inside that journey — not a list of best practices, but what really played out in the middle of the mess. What follows are field-tested insights, difficult trade-offs and the kind of lessons that only emerge when theory collides with real-world complexity.

If you’ve ever felt like governance can’t keep pace, or wished audit could be a strategic force instead of just a gatekeeper, this is for you.

Early on, our interactions with IT and InfoSec were formal at best, frustrating at worst. Audit came in with checklists. Engineering countered with speed. We were speaking different languages, both aiming for security but defining it differently.

That changed when we paused to understand their reality: sprint cycles, architecture decisions and the fire drills they were constantly navigating. We realized our audits were landing like blockers, not enablers. So we flipped the script.

Instead of issuing findings, we offered context. We aligned timing with change windows. We previewed control intent during design, not after deployment. We brought security architects into planning sessions, not just walkthroughs.

The result? Friction turned into collaboration. One of our biggest wins was merging our quarterly access reviews with engineering’s identity audits, reducing duplicate effort while increasing control clarity. Audit stopped being a checkpoint and became a collaborator.

When I first laid out all 450 IT SOX controls on a whiteboard, it was clear we had drifted. Many were legacy artifacts — born from audits past, inherited through acquisitions or duplicated across systems that no longer existed. We were spending more time proving compliance than improving risk posture.

We didn’t start with a giant rationalization effort. We started by asking: Where’s the real risk, and who owns it now?

That single question helped realign our controls with the reality of our environment. Identity became the backbone. If a tool like Azure AD, Okta or CyberArk was already governing access, did we really need three layers of manual reviews on top? If Snowflake had built-in role visibility, why were we still exporting CSVs for point-in-time checks? 
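To make that shift concrete, here is a minimal sketch of what "using built-in role visibility" can look like in practice: pulling active role grants straight from Snowflake's ACCOUNT_USAGE views as point-in-time evidence instead of hand-exported CSVs. The connection details, the evidence cut-off date and the pull_active_role_grants helper are illustrative assumptions, not a description of our production tooling.

```python
# Illustrative sketch: query Snowflake's built-in grant visibility instead of
# exporting CSVs by hand. Assumes the snowflake-connector-python package and
# read access to the SNOWFLAKE.ACCOUNT_USAGE share; credentials come from
# environment variables. Names here are hypothetical, not our actual tooling.
import os
import snowflake.connector


def pull_active_role_grants(as_of: str) -> list[dict]:
    """Return role-to-user grants that were active on the given date (YYYY-MM-DD)."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SECURITYADMIN",   # assumption: a role permitted to read ACCOUNT_USAGE
        warehouse="AUDIT_WH",   # assumption: a small warehouse reserved for audit pulls
    )
    try:
        cur = conn.cursor(snowflake.connector.DictCursor)
        cur.execute(
            """
            SELECT role, grantee_name, granted_by, created_on
            FROM snowflake.account_usage.grants_to_users
            WHERE created_on <= %s
              AND (deleted_on IS NULL OR deleted_on > %s)
            ORDER BY role, grantee_name
            """,
            (as_of, as_of),
        )
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    # Point-in-time evidence for a quarterly access review.
    for grant in pull_active_role_grants("2024-03-31"):
        print(grant["ROLE"], "->", grant["GRANTEE_NAME"])
```

The point of a query like this is not automation for its own sake; it is that the evidence comes from the system of record itself, at a stated point in time, rather than from a spreadsheet someone remembered to export.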

We also questioned the intent behind controls. Was this for actual mitigation or just documentation? That forced a shift in mindset, from maintaining control quantity to elevating control quality. With buy-in from process owners, application teams, compliance and our external auditors, we redefined what ‘sufficient’ looked like by replacing historical habits with risk-based rationale. 

By partnering with InfoSec, Engineering and our SOX testers, we created a shared control map that merged overlapping reviews, removed tool-level redundancies and replaced checklists with evidence that actually mattered. What looked like a control reduction on paper was actually a clarity gain in practice. 

It wasn’t just about cutting. It was about reframing ownership. We pushed decision-making closer to the systems, empowered control owners with visibility and built dashboards that replaced stale trackers. Even our quarterly access reviews went from PDF fatigue to tool-driven accountability. 

The result: 

  • 70% fewer controls 
  • 50% less testing time 
  • And not a single critical finding 

More importantly, it gave us back the bandwidth to focus forward — on AI, cloud, cybersecurity and the shifting risk terrain.

Trimming SOX controls wasn’t about reducing oversight. It was about redirecting energy. By shrinking from 450 to 132 controls, we slashed testing time in half and gave both auditors and control owners something priceless: breathing room. 

We didn’t waste it. 

With that time, we explored higher-risk areas like AI. We reviewed how GenAI tools were being used in vendor selection and customer communications. We piloted an audit of algorithmic transparency before most teams even had AI policies. 

We also reinvested in people. The team had bandwidth to learn about cybersecurity tools, build data-driven dashboards and co-develop risk registers with IT. Rationalization didn’t just clean house — it made us future-ready.

When AI first started surfacing in our environment, it didn’t come with red flags or policy alerts. It came quietly: through third-party tools, SaaS features and one-off business experiments. There wasn’t a formal inventory. No centralized governance. Just a slow creep of machine learning and GenAI into decisions, predictions and workflows we hadn’t audited before. 

At first, we tried to apply old frameworks to new problems. But AI isn’t just another system — it’s a moving target. The risks aren’t just technical; they’re ethical, rooted in explainability and often invisible until something breaks. 

We realized that trying to retrofit traditional controls was like putting guardrails on shifting sand. What we needed instead were flexible principles like trust, transparency and traceability that could evolve with technology. 

So, we did what audit rarely gets to do: we helped build the playbook. 

We collaborated with data science, engineering, legal and compliance teams to map where AI was being used or planned. We co-developed intake processes to evaluate GenAI risks before deployment. We flagged AI features embedded in SaaS tools and worked with procurement to tie AI disclosures into vendor due diligence. 

More importantly, we created a living register: a source of truth that tracked known AI assets, their functions, model types, owners and risk exposure. It wasn’t perfect, but it was actionable. 
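For readers who want a feel for what such a register can hold, here is a minimal sketch of one entry, assuming a simple Python dataclass kept alongside the audit workpapers. The field names, enum values and example record are illustrative, not a standard or our exact schema.

```python
# Illustrative sketch of a "living register" entry for AI assets. The fields
# mirror what the article describes tracking (function, model type, owner,
# risk exposure); the dataclass and the sample record are hypothetical.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class RiskTier(Enum):
    LOW = "low"
    MODERATE = "moderate"
    HIGH = "high"


@dataclass
class AIAssetRecord:
    name: str                      # the tool or feature as the business knows it
    business_function: str         # where it is used (vendor selection, customer comms, ...)
    model_type: str                # "SaaS-embedded GenAI", "in-house ML", "vendor model", ...
    owner: str                     # accountable person or team
    risk_tier: RiskTier
    data_categories: list[str] = field(default_factory=list)  # data the asset touches
    last_reviewed: date | None = None


# Hypothetical entry, shaped like the SaaS-embedded GenAI features described above:
register = [
    AIAssetRecord(
        name="Contract summarizer (SaaS add-on)",
        business_function="Vendor selection",
        model_type="SaaS-embedded GenAI",
        owner="Procurement",
        risk_tier=RiskTier.MODERATE,
        data_categories=["supplier contracts"],
        last_reviewed=date(2024, 1, 15),
    ),
]
```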

Instead of waiting for AI to be “auditable,” we asked: What makes it trustworthy? 

We tested AI model outputs against bias, transparency and business logic. We challenged assumptions around explainability and validation. And we made sure the Audit Committee wasn’t hearing about AI from headlines, but from us. 
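As one illustration of what testing outputs against bias can mean in practice, here is a minimal sketch of a disparate-impact style check: comparing the rate of favorable decisions across groups and flagging large gaps. The 0.8 threshold, field names and sample records are assumptions for the example, not a prescribed standard or the specific tests we ran.

```python
# Illustrative bias check on model outputs: compare the share of favorable
# decisions across groups and flag large gaps. The 0.8 threshold echoes the
# common "four-fifths" rule of thumb; the sample data is hypothetical.
from collections import defaultdict


def selection_rates(decisions: list[dict], group_key: str) -> dict[str, float]:
    """Share of favorable outcomes per group."""
    favorable = defaultdict(int)
    total = defaultdict(int)
    for d in decisions:
        total[d[group_key]] += 1
        favorable[d[group_key]] += 1 if d["approved"] else 0
    return {g: favorable[g] / total[g] for g in total}


def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Lowest group rate divided by highest group rate (1.0 means parity)."""
    return min(rates.values()) / max(rates.values())


if __name__ == "__main__":
    sample = [
        {"segment": "A", "approved": True},
        {"segment": "A", "approved": True},
        {"segment": "A", "approved": False},
        {"segment": "B", "approved": True},
        {"segment": "B", "approved": False},
        {"segment": "B", "approved": False},
    ]
    rates = selection_rates(sample, "segment")
    ratio = disparate_impact_ratio(rates)
    print(rates, f"ratio={ratio:.2f}")
    if ratio < 0.8:  # assumption: escalate when below the four-fifths rule of thumb
        print("Flag for review: outcome rates diverge across segments")
```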

The shift wasn’t just operational. It was cultural. We moved from reacting to AI risk to shaping how the business thinks about AI accountability. That’s where audit can lead—not just by assessing control gaps, but by influencing how innovation rolls out in the first place. 

In hindsight, it wasn’t about controlling AI. It was about designing confidence into the system, before risk made the headlines.

When I first moved into a global leadership role, it wasn’t the time zones that tested me. It was the trust zones. 

As an immigrant cybersecurity leader managing teams across the U.S. and India, I had to earn credibility in rooms where I was the only one with my accent, my background or my lens on risk. Technical skill got me invited. But empathy, clarity and consistency kept the door open. 

The challenge wasn’t just language. It was context. A policy update that felt urgent in New York might land as abstract in Noida. A control deficiency flagged in Stamford could feel irrelevant in Pune—unless we could connect it to business impact, not just compliance checkboxes. 

So, I stopped leading with controls. I started leading with curiosity. 

I listened more. I asked how systems were actually used, not just how they were supposed to be. I brought in local leads to co-present findings. I turned audit into a conversation — where trust wasn’t assumed, it was built. 

Over time, my role shifted. I wasn’t just the “auditor in charge.” I became a connector between cultures, expectations and risk perspectives. We didn’t always agree, but we aligned. And that alignment made our findings sharper, our remediation faster and our relationships stronger. 

Cross-border leadership isn’t about being everywhere. It’s about being understood everywhere. 

And that starts with making people feel seen, not just assessed.

Audit’s job is not just to assess trust. It’s to help design it. 

When we reduced SOX controls, we weren’t just cutting. We were creating clarity and giving time back to people. That opened the door to smarter audits, stronger partnerships and risk conversations that actually moved the business forward. 

Auditing AI wasn’t about catching up. It was about shaping how the organization thinks about risk, before headlines or regulations forced our hand. 

Leading across continents taught me that influence doesn’t begin with authority. It begins with curiosity, by asking the right questions and by making others feel seen, not just assessed. 

If audit wants a seat at the strategy table, we need to bring more than checklists. We need to bring courage, context and a mindset that accepts change as part of the job, not the enemy of it.

This article is published as part of the Foundry Expert Contributor Network.