
Miller on Designing an Effective Cyber Defense Architecture
Cybersecurity is moving into a more demanding era, one where protecting critical infrastructure requires more than just advanced tools. It calls for a shift in how organizations think about system design, resilience, and scalability. As cyber threats grow more automated, harder to detect, and influenced by global tensions, traditional security methods are falling short. The focus is now on flexible platforms, real-time insights, and security approaches that can perform reliably even in complex, high-risk environments.
To dive deeper into this shift, we spoke with Jan Miller, Global CTO at OPSWAT. Drawing from his extensive experience, Miller explains how organizations can scale innovation without losing rigor, why security strategies must align with real-world operational needs, and how detection systems are evolving beyond static methods to more intelligent, layered approaches.
More than just outlining emerging risks, Miller’s insights provide a practical roadmap for building a strong, future-ready cyber defense architecture – helping technology leaders strengthen resilience while continuing to innovate with confidence.
Leadership journey and role
You’ve spent over a decade building malware analysis and threat detection technologies used by security teams worldwide. How has your journey shaped your approach to designing a modern cyber defense architecture? And how does it influence the way you now lead global technology strategy in your current role?
Miller: Building and scaling security startups teaches you something that no amount of corporate experience can replicate. It teaches the discipline of shipping technology that works under real-world pressure, with limited resources, against adversaries who do not wait.
When you have built detection engines that intelligence agencies depend on, you learn to think in terms of consequences – not features. That mindset carries directly into the Global CTO role.
I approach architecture and strategy decisions through the lens of operational impact: what happens when this technology is the last line of defense between a threat actor and a power grid, a water treatment facility, or a defense network? The difference is scale. At a startup, you optimize for speed and product-market fit.
At OPSWAT, with 70+ engineers across four continents and 15 core technologies, you optimize for institutional resilience – the ability to sustain innovation across teams while maintaining the engineering rigor that critical infrastructure demands.
You have built companies that were later acquired by CrowdStrike and OPSWAT. What lessons still shape your approach to security innovation and cyber defense architecture today? How do those lessons shape the way you drive innovation within a larger global organization?
Miller: Three lessons stand out.
First, the “second opinion” model. So instead of trying to displace what customers already have, position your technology as an indispensable addition. That philosophy is built into OPSWAT’s multi-scanning architecture. A place where 30+ engines work simultaneously because no single vendor covers the full threat landscape.
Second, community-driven hardening. When we made our sandbox analysis platform publicly accessible in 2013, researchers and adversaries alike tested it relentlessly. That pressure produced a far more resilient product than any internal QA program could. We apply the same principle today – every file processed by MetaDefender Aether strengthens the detection pipeline.
Third, keep the deployment threshold low. At my first company, we deliberately priced below procurement thresholds so security teams could deploy without months of approval cycles.
Inside OPSWAT, that translates to SDK-first design – composable libraries that fit into architectures customers are already building, rather than forcing them to rearchitect around us.
Cybersecurity landscape and cyber defense architecture
As Global CTO, how do you approach long-term platform architecture? How do you balance that with the need to respond quickly to rapidly evolving cyber threats?
Miller: You architect for composability and respond with velocity. Our platform is designed around modular detection technologies – multi-scanning, CDR, sandboxing, threat intelligence, DLP, vulnerability assessment – each of which can evolve independently without destabilizing the others.
That modularity is what allows us to respond quickly when a new threat class emerges, because we are extending our capability rather than redesigning a system. On the strategic side, I protect dedicated research capacity that is not tied to quarterly roadmaps. Small teams build MVPs rapidly and advance or discard them without organizational consequences. That is where technologies such as our adaptive emulation engine and predictive AI detection originated – as focused research efforts that proved their value before being integrated into the platform.
The balance comes down to a simple principle: the platform absorbs innovation; it should never constrain it.
Many industrial and infrastructure environments operate with legacy systems and air-gapped networks. From an architectural perspective, what approaches are proving most effective for securing these environments without disrupting operations?
Miller: Air-gapped and OT environments require a fundamentally different security posture. You cannot assume cloud connectivity, continuous signature updates, or that endpoint agents are even deployable. What works is perimeter-focused file sanitization, meaning every file entering the environment is treated as untrusted and reconstructed using Deep Content Disarm and Reconstruction (Deep CDR™) technology before it touches operational systems. That eliminates entire classes of threats, including zero-days embedded in document formats, without requiring signatures at all.
For removable media, which is still one of the primary vectors into air-gapped networks, we deploy MetaDefender Kiosk™ at physical entry points, scanning at over 17,000 files per minute with multiple engines. For cross-domain data transfers, our optical diode provides hardware-enforced unidirectional flow certified to Common Criteria EAL4+ and NATO standards.
And critically, our newest sandbox technology uses CPU-level emulation rather than virtual machines, meaning it can operate fully offline on a single RHEL server. No cloud dependency, no VM infrastructure, no fingerprinting vectors for evasion-aware malware. The guiding principle is that your cyber defense architecture must adapt to the operational constraints, not the other way around.
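The CDR idea described above – treat every inbound file as untrusted, then rebuild it keeping only known-safe structure – can be illustrated with a toy sketch. This is not OPSWAT's Deep CDR implementation; it is a minimal whitelist-based reconstruction for a simple XML-based format, where a real engine would carry deep format knowledge for hundreds of file types. The tag and attribute whitelists here are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical whitelist for a toy XML format; a production CDR engine
# maintains detailed structural models per file type.
ALLOWED_TAGS = {"svg", "rect", "circle", "text", "g"}
ALLOWED_ATTRS = {"x", "y", "width", "height", "r", "cx", "cy", "fill"}

def rebuild(elem):
    """Reconstruct an element, keeping only whitelisted structure.

    Anything not explicitly allowed (script tags, event-handler
    attributes, unknown elements) is dropped rather than detected.
    """
    tag = elem.tag.split("}")[-1]   # strip any XML namespace prefix
    if tag not in ALLOWED_TAGS:
        return None                 # unknown/active content is removed
    clean = ET.Element(tag, {k: v for k, v in elem.attrib.items()
                             if k in ALLOWED_ATTRS})
    clean.text = elem.text
    for child in elem:
        rebuilt = rebuild(child)
        if rebuilt is not None:
            clean.append(rebuilt)
    return clean

def sanitize(payload: str) -> str:
    """Parse untrusted input and re-serialize only the safe subset."""
    clean = rebuild(ET.fromstring(payload))
    return ET.tostring(clean, encoding="unicode") if clean is not None else ""
```

The key property is that nothing passes through unexamined: the output is built fresh from parsed structure, so payloads hidden in unrecognized elements or attributes never survive, signatures or no signatures.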
Which cyber threats concern you the most when it comes to critical infrastructure?
Miller: Three categories keep me focused.
First, supply chain compromise with attackers embedding malicious code in legitimate software updates or firmware that bypasses traditional perimeter controls entirely. We address this with file-based vulnerability assessment and software bill of materials analysis that can detect known vulnerabilities in binaries before installation, drawing on over one billion vulnerable software data points.
Second, the industrialization of initial access. Credential phishing attacks spiked 703% in late 2024 alone, and we are seeing a 127% increase in multi-stage malware complexity. Attackers are automating the reconnaissance and delivery phases at a scale that overwhelms traditional SOC workflows. Our data shows that 73% of SOC analysts cite false positives as their top challenge, with 10,000+ alerts per day consuming 1,600 analyst hours monthly on triage alone.
Third, and perhaps most consequential, is the convergence of IT and OT attack surfaces. As operational technology networks become more connected for efficiency and monitoring, they inherit IT-class threats without IT-class defenses. The consequences in these environments are physical, such as disrupted power, contaminated water, and halted manufacturing.
That is fundamentally different from data loss, and it demands technologies purpose-built for those constraints. These figures come from the 2025 OPSWAT Threat Landscape Report.
Malware detection, AI, and analysis
You’ve worked extensively on sandboxing and behavioral malware analysis. How have these technologies evolved as attackers become better at evading detection?
Miller: The evolution has been an arms race, and each generation solved real problems while creating new constraints.
Early sandboxes relied on user-mode API hooking, which was effective initially, but trivially detectable by malware that checked for instrumentation.
The next generation moved observation into the OS kernel, making monitoring invisible to user-level code.
Then came full virtual machine sandboxes, which offered deeper isolation but introduced significant infrastructure overhead and a critical weakness: malware learned to detect VM environments through timing checks, hardware fingerprinting, and environmental cues like MAC address prefixes or screen resolution.
The current frontier, and what we have built with MetaDefender Aether, is CPU-level instruction emulation. Rather than executing malware inside a virtual machine, we emulate the processor itself. There is no VM to fingerprint, no hypervisor artifacts to detect.
Anti-analysis techniques like extended sleep timers, geolocation checks, and user interaction gates are defeated structurally, not through countermeasures that need continuous updating. The result is 20 times faster processing than traditional sandboxes with 100 times greater resource efficiency, which means that a single server can process over 25,000 files per day. That changes the economics of sandbox deployment from forensic tool to inline detection at scale.
Attackers are increasingly using automation and AI. How should malware detection technologies evolve to keep up?
Miller: Detection must shift from recognizing what a file looks like to understanding what it does and what it intends. Signature-based detection is already insufficient. Our research shows that 1 in 14 files initially deemed safe by public threat feeds is later confirmed malicious. When attackers use AI to generate polymorphic variants at scale, appearance-based detection becomes a losing game. The response must be multi-layered.
At OPSWAT, we combine pre-execution AI prediction – our Alin engine analyzes file structure and predicts malicious intent in under 50 milliseconds, before any code runs – with behavioral emulation that observes actual execution at the instruction level, followed by threat intelligence correlation against 50 billion+ indicators.
Each layer addresses a different phase of the detection problem. Critically, these layers feed back into each other: zero-days confirmed through emulation automatically train the predictive models, and extracted indicators enrich the threat intelligence corpus. The system learns continuously. That feedback loop is what closes the gap between encountering a novel threat and understanding it, reducing the time it takes from days to minutes.
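The layered pipeline with a feedback loop can be sketched in miniature. This is an illustrative toy, not the Alin engine or MetaDefender Aether: static "prediction" is reduced to byte n-gram matching, the emulation verdict is passed in as a flag, and threat intel is a hash set. The point it demonstrates is the feedback path – a verdict confirmed by the slow layer trains the fast layers, so the next variant is caught earlier.

```python
import hashlib

class LayeredDetector:
    """Toy three-layer pipeline: intel lookup, static prediction,
    behavioral verdict, with confirmed detections fed back upstream."""

    def __init__(self):
        self.known_bad_hashes = set()   # simplified threat-intel corpus
        self.bad_ngrams = set()         # "trained" static features

    def static_score(self, data: bytes) -> float:
        """Fraction-style score from overlap with learned byte 4-grams."""
        grams = {data[i:i + 4] for i in range(len(data) - 3)}
        return min(1.0, len(grams & self.bad_ngrams) / 4)

    def analyze(self, data: bytes, behaved_maliciously: bool) -> bool:
        digest = hashlib.sha256(data).hexdigest()
        if digest in self.known_bad_hashes:    # layer: intel correlation
            return True
        if self.static_score(data) >= 0.5:     # layer: pre-execution model
            return True
        if behaved_maliciously:                # layer: emulation verdict
            self._learn(data, digest)          # feedback loop closes here
            return True
        return False

    def _learn(self, data: bytes, digest: str):
        """Confirmed detections enrich intel and retrain static features."""
        self.known_bad_hashes.add(digest)
        self.bad_ngrams.update(data[i:i + 4] for i in range(len(data) - 3))
```

After one sample is confirmed behaviorally, near-identical variants trip the static layer and exact re-submissions trip the intel layer – the gap between first encounter and fast detection closes without any manual signature work.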
Security platforms generate huge amounts of data. How do you ensure the insights remain useful and actionable for security teams?
Miller: This is one of the most important and underappreciated challenges in security. Generating data is easy; generating decisions is hard.
Our approach with MetaDefender Aether’s threat scoring layer is to contextualize every detection: not just to say “this file is malicious,” but to indicate where it sits on the Pyramid of Pain, what TTPs it maps to in MITRE ATT&CK, what similar samples exist in your environment, and what specific indicators to hunt for. That context transforms an alert into an action. We also address alert fatigue architecturally. Deep CDR technology removes threat vectors from files before they reach analysts, eliminating an entire category of alerts. Multiscanning with 30+ engines produces a confidence-weighted verdict rather than 30 individual alerts.
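Collapsing many engine results into one confidence-weighted verdict can be sketched simply. This is a generic illustration, not OPSWAT's scoring algorithm: engine names, weights, and the escalation threshold below are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class EngineVerdict:
    engine: str       # engine name (hypothetical)
    malicious: bool   # this engine's verdict
    weight: float     # trust weight assigned to this engine

def combined_score(verdicts: list) -> float:
    """Weight-adjusted fraction of engines flagging the file."""
    total = sum(v.weight for v in verdicts)
    flagged = sum(v.weight for v in verdicts if v.malicious)
    return flagged / total if total else 0.0

def escalate(verdicts: list, threshold: float = 0.2) -> bool:
    """One alert, not N: the analyst sees a single weighted verdict.

    A low threshold reflects the 'second opinion' idea – a credible
    minority of engines is enough to escalate for deeper analysis.
    """
    return combined_score(verdicts) >= threshold
```

The analyst-facing output is one scored decision instead of thirty raw alerts, which is where the triage-hour savings come from.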
And our threat hunting capability uses ML-driven similarity analysis to cluster related samples automatically, so analysts work on campaigns rather than individual files. The goal is to compress the decision cycle from “something happened” to “here is what it means and what to do” with minimal manual triage.
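Similarity-based clustering of samples into campaigns can be illustrated with a minimal sketch. This is not the production ML pipeline – real systems use richer feature extraction and more robust clustering – but a greedy single-link grouping over Jaccard similarity of feature sets, with all sample names and features below invented for the example.

```python
def jaccard(a: set, b: set) -> float:
    """Similarity of two feature sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_samples(samples: dict, threshold: float = 0.5) -> list:
    """Greedy single-link clustering of samples into campaign groups.

    A sample joins the first cluster containing a sufficiently
    similar member; otherwise it starts a new cluster.
    """
    clusters = []
    for name, feats in samples.items():
        for cluster in clusters:
            if any(jaccard(feats, samples[m]) >= threshold for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters
```

The payoff is workflow-shaped: an analyst triages one cluster representing a campaign instead of every file in it individually.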
Engineering for industry resilience
You’ve spent years researching advanced malware techniques. How do you turn deep research into practical security products?
Miller: The bridge between research and product is disciplined translation. Research explores what is possible; product engineering constrains it to what is deployable, maintainable, and useful at customer scale. I run dedicated research programs where small teams can prototype rapidly without roadmap pressure. But the transition to product has strict gates: does it solve a problem customers are paying to solve today? Can it operate at the throughput and latency our platform requires? Can it be maintained by the engineering organization without depending on the researcher who built it? A good example is our emulation-based sandbox. The core research insight – that instruction-level emulation could replace full VM execution – was proven in a focused research effort.
But making it production-ready required years of work on file format coverage, evasion resilience, indicator extraction, and integration with our broader detection pipeline. The research created the technical possibility; disciplined engineering made it a product that processes 25,000+ files per day in production environments.
Many organizations struggle with too many security tools. Do you see the industry moving toward fewer platforms or more specialized solutions?
Miller: The industry is consolidating around platforms, and I think that is the right trajectory, but only if the platform preserves depth of specialization within a unified architecture. Tool sprawl is a real problem. Security teams managing 40+ point products spend more time integrating and correlating than detecting and responding.
The answer is not a single tool that does everything adequately; it is a platform where each component is best-in-class, but they share context natively. That is how we have built MetaDefender: multiscanning, Deep CDR technology, sandbox emulation, threat intelligence, DLP, vulnerability assessment, and SBOM analysis all operate within a unified pipeline.
A file entering through our managed file transfer gateway is automatically scanned by 30+ engines, sanitized through Deep CDR technology, and, if suspicious, detonated in our emulation sandbox – all without human intervention and all sharing context. The alternative is buying each of these as separate products and stitching them together, which introduces latency, blind spots at integration boundaries, and operational overhead that undermines the security posture it was meant to improve.
As cyber threats become increasingly geopolitical, what role should technology leaders play in improving cyber resilience?
Miller: Technology leaders have a responsibility that extends beyond their own organizations. We need to contribute to standards and frameworks that raise the industry baseline.
At OPSWAT, we have been active in developing the AMTSO Sandbox Evaluation Framework – an industry-wide effort to establish objective, reproducible criteria for evaluating detection technologies, because inconsistent evaluation leads to misplaced confidence.
We also invest heavily in education through OPSWAT Academy, which has trained over 456,000 students and certified 268,000+ professionals in critical infrastructure protection.
On the geopolitical dimension specifically, technology leaders need to advocate for security architectures that maintain sovereignty over sensitive data and operations. That means building products that work fully offline, that do not require data to leave national boundaries, and that can operate autonomously in air-gapped environments. When 98% of U.S. nuclear power facilities trust OPSWAT for their critical infrastructure protection, the responsibility to maintain that trust is not just commercial, it is civic.
Advice for leaders on cyber defense architecture
What advice would you give CTOs responsible for protecting critical systems while still supporting business innovation?
Miller: Treat security as an enabler, not a gate. The organizations that struggle most are the ones where security is positioned as the department that says no. Instead, build security into the architecture so deeply that it becomes invisible to the business processes it protects.
If file sanitization happens in line at wire speed, if vulnerability assessment runs automatically on every software deployment, if removable media is scanned before it physically enters the facility, then security is not slowing innovation.
Instead, it is the foundation that makes innovation safe. Practically, I would also advise CTOs to resist the temptation of complexity. The most effective security architecture is the one that requires the least explanation. Deploy fewer tools with deeper integration rather than more tools with shallow coverage.
And invest in your team’s ability to understand threats, not just operate consoles, as the technology is only as effective as the judgment directing it.
For engineers entering cybersecurity today, which technical skills will matter most in the coming years?
Miller: Understanding systems at a fundamental level will always be the most durable skill: how operating systems manage memory, how network protocols negotiate trust, how file formats encode structure. These are the layers where vulnerabilities exist and where detection happens. Tooling changes; fundamentals do not. Beyond that, I would highlight three areas.
First, applied machine learning for security; not as a buzzword, but understanding how to train models on structured threat data, how to evaluate detection accuracy against adversarial inputs, and how to build feedback loops between automated detection and human analysis.
Second, reverse engineering and binary analysis. As AI-assisted development makes it easier to produce code, the ability to understand what code actually does at the binary level becomes more scarce and more valuable.
Third, operational technology and embedded systems. The attack surface is shifting toward critical infrastructure, and engineers who understand both IT security and OT/ICS constraints will be in extraordinary demand. Building an effective cyber defense architecture doesn’t need more generalists who can configure dashboards; it needs engineers who can reason about systems from first principles.
In essence:
The key takeaway is clear: cybersecurity must be built into the foundation of technology, not added as an afterthought. As Jan Miller emphasizes, the most effective approach is to simplify architecture, focus on integrated platforms, and design a cyber defense architecture that works seamlessly within real-world environments.
For technology or cybersecurity leaders, the path forward lies in reducing complexity, prioritizing actionable intelligence, and treating security as a critical enabler of innovation and resilience – not a barrier to it.