Interviews
The So-Called Castle and Moat Security Model is Dead

Vitaliy Trifonov, the Creative Technical Director at Group-IB, says the aim of creating and implementing ZTNA is one that has entered the mainstream
How has the Zero Trust Network Architecture evolved since it was first coined in 2010?
On paper, zero-trust network architecture is the natural successor to the perimeter-based security model. The preceding model, which gives unfettered access to any user upon completion of initial verification, is no longer fit for purpose, especially given the vast numbers of individuals now working in remote or hybrid settings following the COVID-19 pandemic.
Zero Trust Network Architecture (ZTNA) as a concept has gained significant traction within the cybersecurity space, and more companies are beginning to implement this form of security infrastructure, although many so-called ZTNA solutions still only scratch the surface. Indeed, the drive towards ZTNA has played a major role in the wide-scale adoption of multi-factor authentication (MFA) as an industry standard, and the increasing utilization of technologies such as identity-aware proxies and software-defined perimeters is removing some of the user experience (UX) barriers that implementing ZTNA could create.
For zero trust to be effective, companies require a universal access control system that works seamlessly with all operating systems and software and can be connected and integrated anywhere. Companies must also ensure they protect against any hidden backdoors and potential supply chain attacks by regularly verifying and auditing their processes and procedures.
Do you believe that technologies that support zero trust are moving into the mainstream?
The aim of creating and implementing ZTNA is one that has entered the mainstream. However, we are still a long way from a one-size-fits-all solution that will allow businesses and organizations to establish ZTNA on their networks. Technologies that embrace ZT principles are becoming more widely available on the market, and a large number of cloud service providers have crafted products containing security features that adhere to the tenets of ZT. That being said, if companies are truly committed to implementing ZTNA, piecemeal solutions won’t work. Companies and organizations must take an all-in approach for ZT to become an industry benchmark.
Do you believe that enterprise IT departments today require a new way of thinking because the castle itself no longer exists in isolation as it once did?
The so-called castle and moat security model is dead, and it’s not coming back. Castles (companies) are now so big, their digital infrastructure so vast, and their attack perimeter so large, that it is impossible to build a moat (security perimeter) around them. According to a recent FlexJobs survey, 87% of workers are looking for jobs that will allow them to work in a remote or hybrid environment, creating endpoint security risks as individuals use personal devices for work purposes. Furthermore, password policies, firewalls, and VPNs are becoming less reliable, given that they are often based on implicit trust.
Cybercriminals, who have shown time and again that they are highly adaptive and opportunistic, are incredibly skilled at exploiting the implicit trust contained in traditional defensive measures. With ZTNA, the new perimeter starts with each endpoint. Instead of relying on IP addresses in isolation, networks with ZTNA authenticate and authorize access to each resource individually. Microsegmentation, a central concept of establishing ZTNA, is also vital to reducing attack surfaces and hindering attackers from moving laterally across networks. In short, ZT can make companies more resilient and responsive to new attacks.
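To make the microsegmentation point concrete, here is a minimal sketch of the default-deny, per-segment allow-listing it describes. The segment and service names are hypothetical, and this is only an illustration of the idea, not any vendor's implementation.

```python
# Minimal microsegmentation sketch: traffic is allowed only between explicitly
# paired segments and services, so a foothold in one segment cannot be used
# to pivot into unrelated parts of the network. Names are hypothetical.
ALLOWED_FLOWS = {
    ("pos-terminals", "payments-api"),
    ("web-frontend", "orders-db"),
    ("build-agents", "artifact-store"),
}

def flow_permitted(src_segment: str, dst_service: str) -> bool:
    # Default deny: anything not explicitly listed is blocked.
    return (src_segment, dst_service) in ALLOWED_FLOWS

# A compromised POS terminal can still reach the payments API it needs,
# but cannot move laterally to the orders database.
print(flow_permitted("pos-terminals", "payments-api"))  # True
print(flow_permitted("pos-terminals", "orders-db"))     # False
```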
How can companies get started with zero trust?
Firstly, companies must start from the concept that ZT is a system where every person, device, file, and application is considered to be a threat until properly verified. Additionally, to establish a ZT framework, companies must adhere to three core principles: that authorization may be granted only after explicit verification, that companies must enforce a least-privileged model and limit access to a need-to-know basis, and that all traffic must be continuously inspected and logged to verify user behaviour.
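As a rough illustration of those three principles working together, the sketch below shows an access decision that requires explicit verification, checks a least-privilege allow-list, and logs every request for inspection. Role names and permissions are hypothetical; this is a conceptual sketch, not a reference implementation.

```python
# A minimal sketch of the three ZT principles: explicit verification,
# least privilege, and continuous logging of all access decisions.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("zt-audit")

# Least privilege: each role maps to the smallest set of actions it needs.
ROLE_PERMISSIONS = {
    "finance-analyst": {"read:ledger"},
    "payroll-admin": {"read:ledger", "write:payroll"},
}

def authorise(user: str, role: str, action: str, identity_verified: bool) -> bool:
    # 1. Explicit verification: nothing is granted on network location alone.
    # 2. Least privilege: the action must appear in the role's allow-list.
    allowed = identity_verified and action in ROLE_PERMISSIONS.get(role, set())
    # 3. Continuous inspection: every request is logged, allowed or not.
    log.info("user=%s role=%s action=%s allowed=%s", user, role, action, allowed)
    return allowed

authorise("bob", "finance-analyst", "write:payroll", identity_verified=True)  # denied and logged
authorise("bob", "finance-analyst", "read:ledger", identity_verified=True)    # allowed and logged
```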
ZT policy, like any cybersecurity plan, must be tailored to a business or organization’s interests and needs. For example, the introduction of multiple new solutions to meet ZT goals could in fact create new security gaps that threat actors could exploit. At Group-IB, our audit and consulting team can provide companies with all they need to evaluate their infrastructures and processes, and give them the required information to understand what their current security risks are, and how to mitigate them. A thorough audit can be an invaluable tool for companies looking to implement ZT, as it can provide a much-needed reality check along with an implementation action plan.
Industry experts have warned that cyber-attacks will be focused on techniques that zero trust controls can’t mitigate. What according to you can be done to address this?
Zero Trust may be the gold standard for cybersecurity, but it is by no means a silver bullet. Additional measures and solutions will always be required to complement any Zero Trust architecture. This includes services such as Managed Extended Detection and Response and data loss prevention solutions.
Organizations should ensure that they are up-to-date with the latest Threat Intelligence research produced by vendors, and they should conduct regular security checks, including audits, compromise assessments, and penetration testing exercises to ensure that their security perimeter can stand strong against the threats of today and tomorrow.
What according to you are the limitations of zero trust?
One of the major limitations of Zero Trust in its current form is its complexity. Introducing ZT is often associated with a complete overhaul of the established infrastructure, making the creation of ZTNA a costly and time-consuming process that, in and of itself, carries no guarantee of success. The human factor has played, and will continue to play, a crucial role in successful cyber-attacks. Cybercriminals will continue to turn to social engineering ploys to gain initial access, and human error will still lead to security breaches, even if ZTNA is established.
For companies, one of the major challenges for establishing ZTNA will be ensuring that the tradeoffs in user experience are mitigated. Increased security measures can lead to increased frustrations for consumers, who may choose to turn to other services. Security should be paramount, but companies will have to ensure that their applications and services remain usable.
Interviews
COP28: AI Can Be Leveraged to Deliver Actionable Insights

Paul Park, the Regional Director of MENAT at Milestone Systems, says climate change is complex and demands collaborative, cross-border solutions, often constrained by geopolitical tensions.
Interviews
COP28: Fortinet is Committed to Innovating for a Safer Internet

Alain Penel, the VP for Middle East, Turkey, and CIS, at Fortinet, says sustainability is central to his company’s vision
Please tell us about your efforts that ensure a sustainable and equitable digital future.
Sustainability is central to our company vision at Fortinet: making possible a digital world you can always trust, which is a fundamental element to achieving just and sustainable societies. Our corporate social responsibility mission is to deliver on that vision by innovating sustainable security technologies, diversifying cybersecurity talent, respecting the environment, and promoting responsible business across our value chain.
We are actively implementing our sustainability strategy across most material areas, and we continue to prioritize the security and privacy of individuals and organizations to enable digital progress and establish sound governance. We also remain committed to the vital issues of climate change and resource scarcity that impact us and our stakeholders.
What is your commitment to combat climate change?
Our commitment to the environment and our efforts to curtail climate change are reflected in our product innovation and manufacturing standards, the eco-footprint of our facilities, and our support of environmental policies and regulations. Fortinet has a strong commitment to product energy efficiency and has also sought to reduce its environmental impact by redesigning its packaging, shipping over 500,000 boxes with 100% eco-friendly, biodegradable packaging in 2022. We have also taken tangible measures to mitigate our environmental impact and harmful emissions by signing onto the Science-Based Targets initiative (SBTi) to achieve net-zero greenhouse gas emissions across our value chain by no later than 2050.
How are you aligning your sustainability initiatives with the themes of COP28?
In line with the COP28 theme of education and skills, we have a mission to grow an inclusive cybersecurity workforce. Fortinet has already trained 219,465 people in cybersecurity as part of our goal to reach 1 million individuals trained in cybersecurity by 2026. We have also seen a 39% year-on-year increase in women hired.
When it comes to promoting responsible business and accountability, Fortinet delivers training on the impacts of human rights throughout the product life cycle to key business units. 100% of our key contract manufacturers and over 90% of our distributors globally have completed Fortinet’s training on compliance and business ethics.
Finally, in line with the COP28 theme of innovation, Fortinet is committed to innovating for a safer internet. Over 200,000 pieces of malicious cyberinfrastructure were disrupted as part of INTERPOL’s anti-cybercrime operation in Africa; 5 new product families and services were designed to support security teams in the arms race against cybercrime; and 13 new information security certifications and assessments were completed, including SOC 2, HIPAA, and TISAX.
Cyber Security
Databases Are the Black Boxes for Most Organisations

Nik Koutsoukos, the Vice President of SolarWinds, says databases represent the most difficult ecosystems to observe, tune, manage, and scale
Tell us about the SolarWinds database observability platform.
Nearly everything a modern business does from a digital perspective requires data. Thus, databases are among the enterprise’s most valuable IT assets. This makes it critical for organisations to ensure their databases are optimised for performance and cost.
That said, databases represent the most difficult ecosystems to observe, tune, manage, and scale. Not only are there different types of databases that serve different purposes, but they are also populated by different types of data, adding to their complexity. The implications of not having visibility into your databases can range from a costly annoyance to a significant issue that causes business service disruption. For example, between 70% and 88% of application performance issues are rooted in the database.
For this reason, databases have largely been seen as a black box for most organisations. You know what goes into it. And you know what comes out and how long that took. However, the complexities that occur within the black box of the database are harder to discern.
This is where SolarWinds Database Observability comes in. This offering is built for the needs of the modern enterprise environment and helps ensure optimal performance by providing full, unified visibility and query-level workload monitoring across centralised, distributed, cloud-based, and on-premises databases. Organisations armed with SolarWinds Database Observability enhance their ability to understand database implications as new code is deployed, utilise real-time troubleshooting of database performance issues, and isolate unusual behaviour and potential issues within the database.
How does database observability help IT teams track and manage infrastructure, applications, and possible threats?
Database observability collects data about the performance, stability, and overall health of an organisation’s monitored databases to address and prevent issues, and provides deep database performance monitoring to drive speed, efficiency, and savings. With SolarWinds Observability — which supports MongoDB, MySQL, PostgreSQL, and SQL Server database instances — database performance, responsiveness, and error rate are conveniently displayed in dashboards.
Moreover, alerts can be configured to notify admins by email or other methods when user-defined thresholds are crossed. This allows them to identify and remedy issues before they can develop. By gaining insight into the activities taking place inside their database instances, teams can understand user experience as well as ensure systems can scale to meet demand.
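The threshold-based alerting pattern described here can be sketched in a few lines. The metric names, thresholds, and instance names below are purely illustrative assumptions and are not the SolarWinds product API; the sketch only shows the general idea of comparing collected metrics against user-defined limits.

```python
# Generic sketch of threshold alerting on database health metrics.
# All names and values are hypothetical, not a vendor API.
from dataclasses import dataclass

@dataclass
class DbMetrics:
    instance: str
    avg_query_latency_ms: float
    error_rate_pct: float

# User-defined thresholds per metric.
THRESHOLDS = {"avg_query_latency_ms": 250.0, "error_rate_pct": 1.0}

def check_thresholds(metrics: DbMetrics) -> list:
    """Return an alert message for every metric that crosses its threshold."""
    alerts = []
    for name, limit in THRESHOLDS.items():
        value = getattr(metrics, name)
        if value > limit:
            alerts.append(f"{metrics.instance}: {name}={value} exceeds {limit}")
    return alerts

for alert in check_thresholds(DbMetrics("orders-db", 310.0, 0.4)):
    print(alert)  # in practice this would be routed to email or another channel
```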
What sort of enhancements has your observability platform received recently?
Just this November, we announced major enhancements in the Database Observability capability within our cloud-based SolarWinds Observability platform. SolarWinds Database Observability provides full visibility into open-source, cloud-enabled, and NoSQL databases to identify and address costly and critical threats to their systems and business. It is now possible to navigate across all of the samples collected globally, giving IT teams an empirical distribution of random samples that closely resembles the main workload.
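To illustrate why a random sample of queries can stand in for the main workload, the sketch below generates a synthetic latency distribution, draws a random sample, and compares percentiles of the sample against the full stream. The data and numbers are fabricated for illustration only and do not reflect the SolarWinds implementation.

```python
# Illustrative sketch: percentiles from a random sample of query latencies
# track those of the full workload, which is the basis of sample-driven
# workload analysis. Synthetic data, assumed parameters.
import random

random.seed(42)

# Stand-in for the full workload: per-query latencies in milliseconds.
full_workload = [random.lognormvariate(3.0, 0.8) for _ in range(100_000)]

# A random sample, as an observability agent might collect.
sample = random.sample(full_workload, 1_000)

def percentile(values, pct):
    ordered = sorted(values)
    return ordered[int(pct / 100 * (len(ordered) - 1))]

for p in (50, 95, 99):
    print(f"p{p}: full={percentile(full_workload, p):.1f} ms, "
          f"sample={percentile(sample, p):.1f} ms")
```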
What factors according to you will drive the adoption of observability tools in the MEA region?
The Middle East, Türkiye, and Africa (META) are riding a wave of rampant digital transformation as organisations seek to remain competitive. According to IDC, digital transformation spending in the Middle East will accelerate at a compound annual growth rate (CAGR) of 16% over a five-year period, topping US$74 billion in 2026 and accounting for 43.2% of all ICT investments made that year. As organisations continue to shift workloads to multi- and hybrid-cloud environments, the complexity of their IT environments continues to increase. This raises the potential for visibility and monitoring gaps, which ultimately translate into underwhelming or outright frustrating experiences for end users.
Tell us about the top three trends you foresee for 2024.
There are clear signs of the continued adoption of cloud technologies to allow enterprises to become more agile, giving engineering teams the ability to focus on their core competencies and expand and contract on demand.
The adoption of Kubernetes is also increasing, as the refocusing introduced by the cloud enables the move to microservices-based architectures, which require sophisticated orchestration management.
Finally, we are starting to see an uptick in vector databases, as applications demand better handling of relationships between data points.
What is going to be your top priority in terms of strategies for 2024?
We will continue to deliver on our vision of making observability easy. OpenTelemetry is driving observability, but data collection is nothing if it can’t provide insights. So, we aim to ensure the data is both collected and curated such that users find it easy to consume and extract valuable insight.
Regionally, through 2024, we will continue to focus on our key markets of the UAE and Saudi Arabia, the ongoing enhancement of our product portfolio, and the strengthening of our channel ecosystem to create more markets for our business and for our partners.