Cloud

Addressing Blind Spots in the Hybrid Cloud

Written by Mark Jow, EMEA Technical Evangelist, Gigamon

With the rapid growth of the hybrid cloud market, businesses are experiencing numerous benefits. According to a study by Amazon Web Services, cloud computing is projected to add almost $181 billion to the UAE’s economy by 2033. Further reports reveal that in 2021, cloud adoption contributed 2.26% of the UAE’s GDP, an economic value of $9.5 billion.

However, security has emerged as a significant challenge. In a recent Gigamon survey, we found that 90 per cent of IT and Security leaders across EMEA, APAC and the US have experienced a data breach in the last 18 months. We also uncovered that over 70 per cent of IT security leaders admit they allow encrypted data to flow freely across their IT infrastructure. This points to an industry-wide lack of awareness of blind spots, and of the complexity and risk involved in maintaining security in hybrid cloud environments.

How to identify blind spots
Going back to basics, blind spots are areas within a hybrid cloud infrastructure that traditional security and monitoring tools cannot adequately reach. These areas remain hidden from view, hindering effective data collection and analysis and thereby compromising security.

The good news is that IT and Security professionals are increasingly aware of the importance of eliminating blind spots: our research found that the exploitation of unexpected blind spots is a major concern for CISOs. To address this, CISOs and their teams are embracing deep observability to gain complete visibility across their entire infrastructure. This means harnessing immutable, precise and actionable network-derived intelligence to amplify the power of existing tools, eliminating blind spots both on-premises and in the cloud, and providing a clearer view of an organisation’s security posture and potential threats.

Encrypted traffic and limited visibility
Yet there is still work to be done. Blind spots, and what they consist of, are hugely underestimated: only 30 per cent of organisations have visibility into encrypted traffic. Moreover, 35 per cent of respondents reported limited visibility into containers, and less than half (48 per cent) had visibility of East-West traffic, that is, the lateral movement of data within the hybrid cloud infrastructure. These limitations further contribute to unobserved segments of the hybrid cloud.

The impact of unrecognised blind spots
As a result, nearly one-third of breaches go undetected by IT and Security professionals and their tools, according to the latest survey of 1,000 IT professionals across EMEA, the US, Australia and Singapore. The failure to recognise blind spots significantly hampers the ability to protect sensitive data and respond to security incidents. While surface-level confidence appears high, with 94 per cent of global respondents believing their security tools provide complete visibility, it’s clear this perception is not the reality of hybrid cloud security.

The hybrid cloud is inherently complex, and traditional security and monitoring tools are often insufficient in addressing blind spots in this area. To effectively eliminate blind spots and narrow the perception vs. reality gap in hybrid cloud security, CISOs and their teams must actively prioritise deep observability. By leveraging actionable network-derived intelligence, businesses can amplify the power of existing security and observability tools and gain comprehensive visibility of their complete hybrid cloud estate.

Implementing deep observability will significantly accelerate progress in improving visibility into containers, East-West traffic and encrypted data, bolstering security and eradicating the blind spots that keep today’s CISOs up at night.

Cloud

Check Point Software Announces 2024 Cloud Security Report

Check Point Software Technologies Ltd. has today unveiled its 2024 Cloud Security Report. The report exposes a critical surge in cloud security incidents, up from 24% of organizations affected in 2023 to 61% in 2024, a 154% increase, highlighting the escalating complexity and frequency of cloud threats. The survey also reveals a concerning trend: while most organizations continue to prioritize threat detection and monitoring, focusing on known vulnerabilities and patterns of malicious behaviour, only 21% emphasize prevention. This is particularly alarming as companies struggle to keep pace with rapid technological change, including the speed of DevOps and the deployment of new code and applications in the cloud.

The survey underscores a daunting reality: although cloud attacks are on the rise, only 4% of organizations say they can mitigate risks easily and quickly, while an overwhelming 96% are concerned about their ability to handle such risks. In addition, 91% of respondents are alarmed by the surge in more sophisticated cyber threats, including unknown risks and zero-day attacks that conventional security tools cannot detect.

“The data speaks volumes about the urgent need for organizations to shift their focus towards implementing AI-powered threat prevention measures,” states Itai Greenberg, Chief Strategy Officer at Check Point Software Technologies. “By adopting a consolidated security architecture and enhancing collaborative security operations, businesses can preemptively tackle emerging threats, ensuring a more secure and resilient cloud environment.”

Other insights from the 2024 Cloud Security Report:

  1. Escalation of Cloud Incidents: There has been a 154% increase in cloud security incidents compared to last year, with 61% of organizations reporting significant disruptions.
  2. Deep Concerns Over Risk Management: An overwhelming 96% of respondents reported concerns about their ability to effectively manage cloud risks, reflecting a considerable escalation from previous years.
  3. Rapid Adoption of AI Technologies: With 91% of organizations now prioritizing AI to enhance their security posture, the focus has shifted towards leveraging AI for proactive threat prevention.
  4. CNAPP for Enhanced Prevention: Despite the growing threat landscape, only 25% of organizations have fully implemented Cloud Native Application Protection Platforms (CNAPP). This underscores the urgent need for comprehensive solutions that go beyond traditional tooling.
  5. Complexity in Cloud Security Integration: Despite the potential for streamlined solutions, 54% of respondents face challenges in maintaining consistent regulatory standards across multi-cloud environments. Additionally, 49% struggle with integrating cloud services into legacy systems, often complicated by limited IT resources.

The report advises organizations to embrace a more comprehensive, collaborative, and AI-driven cybersecurity framework.

Cloud

Google Clarifies the Cause of Missing Google Drive Files

Many Google Drive users recently experienced the unsettling disappearance of their files, prompting concerns. Google has now identified the root cause, attributing the issue specifically to the Google Drive for Desktop app. While assuring that only a limited subset of users is affected, the tech giant is actively investigating the matter and promises timely updates.

To prevent inadvertent file deletion, Google provides the following recommendations:

  1. Avoid clicking “Disconnect account” within Drive for desktop.
  2. Refrain from deleting or moving the app data folder, located at:
    • Windows: %USERPROFILE%\AppData\Local\Google\DriveFS
    • macOS: ~/Library/Application Support/Google/DriveFS
  3. Optionally, create a copy of the app data folder if there is sufficient space on your hard drive.

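For users who want that optional backup, the copy can be scripted. Here is a minimal Python sketch under the assumption that the DriveFS paths match the list above; the function name and destination layout are illustrative, not part of Google's tooling:

```python
import shutil
from pathlib import Path

def backup_drivefs(app_data: Path, backup_root: Path) -> Path:
    """Copy the Drive for desktop app data folder to a backup location.

    On Windows, app_data would be the DriveFS folder under
    %USERPROFILE%/AppData/Local/Google; on macOS it lives under
    ~/Library/Application Support/Google.
    """
    dest = backup_root / (app_data.name + "-backup")
    # copytree raises FileExistsError if dest already exists, so an
    # earlier backup is never silently overwritten.
    shutil.copytree(app_data, dest)
    return dest
```

Because `copytree` refuses to overwrite an existing destination, repeated runs preserve the first snapshot rather than clobbering it, which is the safer default while the deletion bug is under investigation.
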
Before Google officially addressed the issue, distressed users took to the company’s support forum to report deleted files. One user from South Korea highlighted a particularly severe case where their account reverted to May 2023, resulting in the loss of anything uploaded or created after that date. Additionally, the user emphasised that they had not synced or shared their files or drive with anyone else.

As Google delves deeper into resolving this matter, affected users are advised to heed the provided precautions. The company’s commitment to ongoing updates reflects its dedication to swiftly addressing and rectifying the situation. The incident serves as a reminder of the importance of proactive measures to safeguard digital data, especially as users navigate cloud-based platforms such as Google Drive.

Continue Reading

Cloud

Five Ways to Maximise the Security, Performance and Reliability of Your Online Business

Written by Bashar Bashaireh, Managing Director, Middle East & Turkey, Cloudflare

With a shift to digital transformation, enterprises face new challenges and opportunities for growth — from anticipating and meeting customers’ digital needs to mounting a strong defence against web-based attacks, overcoming latency issues, preventing site outages, and maintaining network connectivity and performance. When optimizing the online customer experience, enterprises need to adopt a strategy that integrates robust site security, performance, and reliability. Although this strategy involves many components, here are five key considerations that can help businesses meet customer needs and provide a secure and seamless user experience:

Leverage DNS and DNSSEC support to maximize availability and uptime
Frequently referred to as the ‘phone book of the Internet,’ DNS (domain name system) translates domain names into numeric IP addresses and enables browsers to load Internet resources. As DNS attacks become more prevalent, businesses are starting to realize that a lack of resilient DNS creates a weak link in their overall security strategy.

There are multiple approaches that companies can take to deploy a resilient DNS strategy. They can get a managed DNS provider that hosts all DNS records, offers query resolution at multiple nodes globally, and provides integrated DNSSEC support. DNSSEC adds a layer of security to the domain name system by adding cryptographic signatures to existing DNS records.

Companies can also build additional redundancy by deploying a multi-DNS strategy — even if the primary DNS goes down, secondary DNS helps keep the applications online. Large enterprises that prefer to maintain their own DNS infrastructure can implement a DNS firewall in conjunction with a secondary DNS. This setup adds a security layer to the on-prem DNS infrastructure and helps ensure overall DNS redundancy.
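The failover behaviour that a multi-DNS strategy buys can be illustrated in a few lines. A minimal Python sketch, using stand-in resolver functions rather than real DNS servers; the hostname and address are invented for illustration:

```python
def resolve(hostname: str, resolvers) -> str:
    """Return the first answer any resolver in the list gives for hostname."""
    last_error = None
    for lookup in resolvers:
        try:
            return lookup(hostname)
        except KeyError as exc:  # stand-in for a timeout or SERVFAIL
            last_error = exc
    raise LookupError(f"all resolvers failed for {hostname}") from last_error

def primary(hostname: str) -> str:
    raise KeyError(hostname)  # simulate the primary DNS being down

def secondary(hostname: str) -> str:
    records = {"app.example.com": "203.0.113.10"}  # illustrative zone data
    return records[hostname]

# Even with the primary down, the application stays reachable.
answer = resolve("app.example.com", [primary, secondary])
```

Real secondary DNS works the same way at the protocol level: clients simply fall through to the next configured nameserver when one fails to answer.
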

Accelerate content delivery by routing traffic across the least-congested routes
Today, the majority of web traffic is served through Content Delivery Networks (CDNs), including traffic from major sites like Amazon and Facebook. A CDN is a geographically distributed group of servers that help provide fast delivery of Internet content to globally dispersed users and can also reduce bandwidth costs.

With servers in multiple locations around the globe, a CDN is able to distribute content closer to website visitors, and in doing so, reduce any inherent network latency and improve page load times. CDNs also serve static assets from cache across their network, reducing the number of requests being made to hosted web servers and resulting in lower bandwidth and hosting costs.
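
The caching behaviour described above can be modelled in miniature. A toy Python sketch, assuming a hypothetical origin-fetch callable rather than any real CDN API:

```python
class EdgeCache:
    """Toy model of a CDN edge node: serve static assets from cache,
    fetching from the origin server only on a miss."""

    def __init__(self, fetch_from_origin):
        self.fetch_from_origin = fetch_from_origin  # hypothetical origin call
        self.cache = {}
        self.origin_requests = 0

    def get(self, path):
        if path not in self.cache:       # cache miss: go to origin once
            self.origin_requests += 1
            self.cache[path] = self.fetch_from_origin(path)
        return self.cache[path]          # later hits never touch the origin

edge = EdgeCache(lambda path: f"<contents of {path}>")
edge.get("/logo.png")
edge.get("/logo.png")  # second request is served from cache
```

However many visitors request the same asset, the origin is contacted once, which is where the bandwidth and hosting savings come from.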

Minimize the risk of site outages by globally load-balancing traffic
Maximizing server resources and efficiency can be a delicate balancing act. Cloud-based load balancers distribute requests across multiple servers in order to handle spikes in traffic. The load balancing decision takes place at the network edge, closer to the users — allowing businesses to boost response time and effectively optimize their infrastructure while minimizing the risk of server failure.
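
One common distribution policy is round-robin with health checks, which can be sketched as follows. The server names are invented, and real edge load balancers also weigh latency and geography:

```python
import itertools

class RoundRobinBalancer:
    """Distribute requests across servers in rotation, skipping any
    server currently marked unhealthy."""

    def __init__(self, servers):
        self.servers = servers
        self.healthy = set(servers)
        self._cycle = itertools.cycle(servers)

    def route(self, request):
        # Try each server at most once per request.
        for _ in range(len(self.servers)):
            server = next(self._cycle)
            if server in self.healthy:
                return server
        raise RuntimeError("no healthy servers available")

lb = RoundRobinBalancer(["edge-1", "edge-2", "edge-3"])
lb.healthy.discard("edge-2")  # simulate a failed server being taken out
```

Because unhealthy servers are skipped transparently, a single server failure degrades capacity rather than availability, which is the risk-minimisation point made above.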

Protect web applications from malicious attacks
When securing web applications and other business-critical properties, a layered security strategy can help defend against many different kinds of threats.

  • Web application firewall protection – A web application firewall, or WAF, protects web applications by filtering and monitoring HTTP traffic. Cloud-based WAFs are typically the most flexible and cost-effective solution to implement, as they can be consistently updated to protect against new threats without significant additional work or cost on the user’s end.
  • DDoS attack protection – A DDoS attack is a malicious attempt to overburden servers, devices, networks, or surrounding infrastructure with a flood of illegitimate Internet traffic. By consuming all available bandwidth between targeted devices and the Internet, these attacks not only cause significant service disruptions but also have a tangible, negative impact on business, as customers are unable to access a business’s resources.
  • Malicious bot mitigation – Sites may become compromised when targeted by malicious bot activity, which can overwhelm web servers, skew analytics, prevent users from accessing webpages, steal user data, and compromise critical business functions. By implementing a bot management solution, businesses can distinguish between useful and harmful bot activity and prevent malicious behaviour from impacting user experience.

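The filtering a WAF performs can be thought of as a rule-matching pass over each incoming request. A deliberately tiny Python sketch with three illustrative signatures; production rulesets are far larger and continuously maintained by the provider:

```python
import re

# Illustrative signatures only: real WAF rules are far more nuanced.
BLOCK_PATTERNS = [
    re.compile(r"(?i)union\s+select"),  # SQL-injection probe
    re.compile(r"(?i)<script\b"),       # reflected-XSS probe
    re.compile(r"\.\./"),               # path traversal
]

def inspect_request(path: str, query: str) -> str:
    """Return 'block' if any rule matches the request, else 'allow'."""
    target = f"{path}?{query}"
    for pattern in BLOCK_PATTERNS:
        if pattern.search(target):
            return "block"
    return "allow"
```

A cloud-hosted WAF applies exactly this kind of check in front of the web server, with the rule list updated centrally, which is why it needs no extra work on the user's end as new threats emerge.
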
Keep your network up and running

  • Protect your network infrastructure – It’s not enough to just protect web servers. Enterprises often have on-premise network infrastructure hosted in public or private data centres that needs protection from DDoS attacks, too. Many DDoS mitigation providers rely on one of two methods for stopping an attack: scrubbing centres or on-premise scanning and filtering via hardware boxes. The problem with both approaches is that they impose a latency penalty that can adversely affect a business. A better way to detect and mitigate DDoS attacks is to do so close to the source — at the network edge. By scanning traffic at the closest data centre in a global, distributed network, high service availability is assured, even during substantial DDoS attacks. This approach reduces the latency penalties that come from routing suspicious traffic to geographically distant scrubbing centres. It also leads to faster attack response times.
  • Protect TCP/UDP applications – At the transport layer, attackers may target a business’s server resources by overwhelming all available ports on a server. These DDoS attacks can cause the server to respond slowly to legitimate requests — or not at all. Preventing attacks at the transport layer requires a security solution that can automatically detect attack patterns and block attack traffic.

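Automatic detection and blocking at the transport layer is often built on per-source rate limiting. A minimal token-bucket sketch in Python, with invented capacity and refill numbers:

```python
class TokenBucket:
    """Per-source token bucket: each packet costs one token; a source
    that exceeds its refill rate runs dry and its traffic is dropped."""

    def __init__(self, capacity: float, refill_rate: float):
        self.capacity = capacity
        self.refill_rate = refill_rate  # tokens added per second
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Top the bucket up for the time elapsed since the last packet.
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # bucket empty: drop as likely flood traffic

bucket = TokenBucket(capacity=3, refill_rate=1.0)
# A burst of five packets in the same instant: only the first three pass.
decisions = [bucket.allow(now=0.0) for _ in range(5)]
```

Legitimate clients, whose request rate sits below the refill rate, are never throttled, while a flood from one source exhausts its bucket almost immediately.
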
In conclusion, creating a superior online experience requires the right security and performance strategy — one that not only enables enterprises to accelerate content delivery, but ensures network reliability and protects their web properties from site outages, data theft, and other critical attacks.


Copyright © 2021 Security Review Magazine. Rysha Media LLC. All Rights Reserved.