Preparing a Secure Cloud Environment in the Digital New Norm

Written by Daniel Jiang, General Manager of the Middle East and Africa, Alibaba Cloud Intelligence

As hybrid and remote working become the ‘new norm’ for millions of workers globally, cyberattacks continue unabated. Building a secure and reliable IT environment has therefore become an increasingly important priority for businesses exploring opportunities in the global digital economy. While moving to the cloud and using cloud-based security features is a good way to counter cyber risks, it is important to delve deeper into how best to construct a secure and reliable cloud environment that can fend off even the most determined attacker.

In today’s digital environment, discussions about cybersecurity best practices have never been more important. The UAE in particular established the Cybersecurity Council to develop a cybersecurity strategy and build a secure cyber infrastructure through related regulations. Following this move, the nation ranked 5th on the International Telecommunication Union’s Global Cybersecurity Index 2020, jumping 33 places, and it continues to prioritize cyber security and awareness. Creating a secure cloud environment – from building the architecture to adopting cutting-edge security technologies and putting in place sound security management practices – will inspire more thorough conversations on this subject.

A resilient and robust security architecture is essential for creating a cloud environment capable of assuring an organisation of the availability, confidentiality and integrity of its systems and data. From the bottom up, the architecture should include security modules at different layers, so that companies can build trustworthy data security solutions on the cloud layer by layer – from infrastructure security and data security through to application security and business security.

In addition to the security modules at each layer, there are a variety of automated data protection tools that enable companies to perform data encryption, visualisation, leakage prevention, operation log management and access control in a secure computing environment. Enterprises can also leverage cloud-based IT governance solutions to custom-design cloud security systems that meet compliance requirements, from network security and data security to operation auditing and configuration auditing. This ensures full-lifecycle data security on the cloud, with controllable and compliant data security solutions in place.

Another consideration is to build a multi-tenant environment, abiding by the principle of least privilege and adopting consistent management and control standards to protect user data from unauthorised access. In addition, establishing strict rules for data ownership and operations on data, such as data access, retention and deletion, is also pivotal in creating a safe environment.
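The principle of least privilege described above can be sketched in a few lines: access is denied by default, and every permission is scoped to a tenant and a role, so no cross-tenant grant exists unless explicitly made. The tenant, role and action names here are illustrative, not any specific provider's model.

```python
# Minimal sketch of least-privilege, tenant-scoped access control.
# Only explicitly granted (tenant, role) -> action combinations pass.

GRANTS = {
    ("tenant-a", "analyst"): {"data:read"},
    ("tenant-a", "admin"):   {"data:read", "data:delete"},
}

def is_allowed(tenant: str, role: str, action: str) -> bool:
    """Deny by default: an absent grant means no access."""
    return action in GRANTS.get((tenant, role), set())

print(is_allowed("tenant-a", "analyst", "data:read"))    # True
print(is_allowed("tenant-a", "analyst", "data:delete"))  # False: not granted
print(is_allowed("tenant-b", "admin", "data:read"))      # False: no cross-tenant grant
```

The key design choice is that the lookup fails closed: a missing entry yields an empty permission set rather than an error or a permissive default.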

Moreover, enterprises can embrace the zero-trust security architecture and build a zero-trust practice by design to protect the most sensitive systems. The architecture requires everything (including users, devices and nodes) requesting access to internal systems to be authenticated and authorised using identity access protocols. As such, the zero-trust security architecture cuts down on automatic trust, or trust without continuous verification, addressing modern challenges in securing remote working environments, hybrid cloud settings and increasingly aggressive cyber threats.
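The "never trust, always verify" idea behind zero trust can be illustrated with a toy token check: every request must present a verifiable credential, regardless of where it originates. The HMAC-signed token below is a simplified stand-in for real identity protocols such as OIDC or mTLS, and the secret is hard-coded only for illustration.

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # illustrative only; real systems use managed, rotated keys

def issue_token(user: str) -> str:
    """Sign the user identity so it can be verified on every request."""
    sig = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_request(token: str) -> bool:
    """Zero trust: each request is authenticated, with no implicit trust
    based on network location or prior sessions."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = issue_token("alice")
print(verify_request(token))            # True: valid signature
print(verify_request("alice:forged"))   # False: tampered credential rejected
```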

Cutting-edge security technologies, such as comprehensive data encryption, confidential computing and many more emerging tech solutions, can be leveraged to stay on top of trends in cybersecurity. Comprehensive data encryption provides advanced encryption capabilities on transmission links (data in motion), compute nodes (data in use) and storage nodes (data at rest). Key Management Service and Data Encryption Service help users securely manage their keys and use a variety of encryption algorithms to perform encryption operations.
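The way a key-management service typically works with data-at-rest encryption is the envelope pattern: each object is encrypted with a fresh data key, and only a wrapped (encrypted) copy of that data key is stored alongside the ciphertext. The sketch below uses a deliberately toy XOR keystream as a stand-in for a real cipher such as AES — it is NOT real encryption — purely to make the key-wrapping flow concrete.

```python
import hashlib
import os

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """XOR keystream derived from the key. A stand-in for AES, NOT real
    cryptography -- it only illustrates the envelope pattern, and is its
    own inverse (applying it twice recovers the input)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

MASTER_KEY = b"kms-master-key"  # in practice held inside the KMS, never by the app

def encrypt(plaintext: bytes):
    data_key = os.urandom(32)                        # fresh key per object
    ciphertext = toy_cipher(data_key, plaintext)
    wrapped_key = toy_cipher(MASTER_KEY, data_key)   # only the wrapped key is stored
    return wrapped_key, ciphertext

def decrypt(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    data_key = toy_cipher(MASTER_KEY, wrapped_key)   # unwrap via the master key
    return toy_cipher(data_key, ciphertext)

wrapped, ct = encrypt(b"customer record")
print(decrypt(wrapped, ct))  # b'customer record'
```

The benefit of the pattern is that rotating or revoking the master key inside the KMS controls access to every data key without re-encrypting the data itself.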

Another emerging technology to safeguard the cloud environment is confidential computing, which is dedicated to securing data in use, while it is being processed, protecting users’ most sensitive workloads. Confidential computing, based on trusted execution environments (TEEs), ensures data security, integrity and confidentiality while simplifying the development and delivery of trusted or confidential applications at lower cost.

It is equally important to adopt proper security management practices and mechanisms to maximise the protection of critical systems and important data. One essential mechanism is a comprehensive disaster recovery system, which enables businesses to configure emergency plans for data centres based on factors such as power, temperature and natural disasters, and to establish redundant systems for basic services such as cloud computing, networking and storage. It helps companies deploy their business across regions and zones and build disaster recovery systems that support multiple recovery models.

Setting up an effective review and response mechanism for cloud security issues is imperative. First, vulnerability scanning and testing should be in place to assess the security status of systems; second, it is vital to use cloud-native monitoring tools to detect anomalous behaviour and insider threats; finally, establishing proper procedures and responsibility models to quickly and accurately assess where vulnerabilities exist, and how severe they are, helps ensure that remedial action can be taken promptly when security problems emerge.

In the future, developing the security architecture, technologies, management practices and response mechanisms will no longer be perceived as a cost-centre burden for companies, but rather as critical capabilities that safeguard the performance and security of daily business operations. Crafting a comprehensive cloud security plan, adopting industry best practices, and choosing a professional cloud service provider with strong security credentials should be imperative items on every CXO’s agenda.

Google Clarifies the Cause of Missing Google Drive Files

Many Google Drive users recently experienced the unsettling disappearance of their files, prompting concerns. Google has now identified the root cause, attributing the issue specifically to the Google Drive for Desktop app. While assuring that only a limited subset of users is affected, the tech giant is actively investigating the matter and promises timely updates.

To prevent inadvertent file deletion, Google provides the following recommendations:

  1. Avoid clicking “Disconnect account” within Drive for desktop.
  2. Refrain from deleting or moving the app data folder, located at:
    • Windows: %USERPROFILE%\AppData\Local\Google\DriveFS
    • macOS: ~/Library/Application Support/Google/DriveFS
  3. Optionally, create a copy of the app data folder if there is sufficient space on your hard drive.
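Recommendation 3 can be scripted rather than done by hand. The sketch below copies the app data folder aside with a timestamp; the folder locations are the ones listed in Google's guidance above, but the helper names and backup layout are our own illustration.

```python
import os
import platform
import shutil
import time

def drivefs_path() -> str:
    """App-data folder locations as listed in Google's guidance."""
    if platform.system() == "Windows":
        return os.path.expandvars(r"%USERPROFILE%\AppData\Local\Google\DriveFS")
    return os.path.expanduser("~/Library/Application Support/Google/DriveFS")

def backup(src: str, dest_root: str) -> str:
    """Copy the folder to a timestamped sibling; returns the backup path.
    Check free disk space first -- the folder can be large."""
    dest = os.path.join(dest_root,
                        "DriveFS-backup-" + time.strftime("%Y%m%d-%H%M%S"))
    shutil.copytree(src, dest)
    return dest
```

For example, `backup(drivefs_path(), os.path.expanduser("~"))` would place a dated copy in the home directory, which can be deleted once Google confirms the issue is resolved.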

Before Google officially addressed the issue, distressed users took to the company’s support forum to report deleted files. One user from South Korea highlighted a particularly severe case where their account reverted to May 2023, resulting in the loss of anything uploaded or created after that date. Additionally, the user emphasised that they had not synced or shared their files or drive with anyone else.

As Google delves deeper into resolving this matter, affected users are advised to heed the provided precautions. The company’s commitment to ongoing updates reflects its dedication to swiftly addressing and rectifying the situation. The incident serves as a reminder of the importance of proactive measures to safeguard digital data, especially as users navigate cloud-based platforms such as Google Drive.

Addressing Blind Spots in the Hybrid Cloud

Written by Mark Jow, EMEA Technical Evangelist, Gigamon

With the rapid growth of the hybrid cloud market, businesses are experiencing numerous benefits. According to a study by Amazon Web Services, cloud computing is projected to add almost $181 billion to the UAE’s economy by 2033. Further reports reveal that in 2021, the adoption of cloud computing in the UAE contributed an astounding 2.26% to the country’s GDP, an economic value of $9.5 billion.

However, security has emerged as a significant challenge. In a recent survey conducted by Gigamon, we found that 90 per cent of IT and Security leaders across EMEA, APAC and the US have experienced a data breach in the last 18 months. We also uncovered that over 70 per cent of IT security leaders admit they allow encrypted data to flow freely across their IT infrastructure. There is, therefore, an industry-wide lack of awareness about blind spots, and about the complexity and risk of maintaining security in hybrid cloud environments.

How to identify blind spots
Going back to the basics, blind spots are areas within a hybrid cloud infrastructure that are not adequately reached by traditional security and monitoring tools. These areas remain hidden from view, hindering effective data collection and analysis and therefore compromising security.

The good news is that IT and Security professionals are increasingly aware of the importance of avoiding blind spots: our research found that the exploitation of unexpected blind spots is a major concern for CISOs. To address this concern, CISOs and their teams are embracing deep observability to provide complete visibility across their entire infrastructure. This is achieved by harnessing immutable, precise and actionable network-derived intelligence to amplify the power of existing tools, eliminating blind spots both on-premises and in the cloud, and providing greater visibility into, and understanding of, an organisation’s security posture and potential threats.

Encrypted traffic and limited visibility
Yet there is still work to be done. There is a huge underestimation of blind spots and what they consist of, considering only 30 per cent of organisations have visibility into encrypted traffic. Moreover, 35 per cent of respondents reported limited visibility into containers, and less than half (48 per cent) had visibility of east-west traffic, the lateral movement of data within the hybrid cloud infrastructure. These limitations further contribute to the existence of unobserved segments in the hybrid cloud.

The impact of unrecognised blind spots
As a result, nearly one-third of breaches go undetected by IT and Security professionals and their tools, as identified in the latest survey that included 1000 IT professionals across EMEA, the US, Australia and Singapore. The failure to recognise blind spots significantly hampers the ability to effectively protect sensitive data and respond to security incidents. While surface-level confidence appears high, with 94 per cent of global respondents to our survey believing their security tools provide complete visibility, it’s clear this perception is simply not the reality of hybrid cloud security.

The hybrid cloud is inherently complex, and traditional security and monitoring tools are often insufficient in addressing blind spots in this area. To effectively eliminate blind spots and narrow the perception vs. reality gap in hybrid cloud security, CISOs and their teams must actively prioritise deep observability. By leveraging actionable network-derived intelligence, businesses can amplify the power of existing security and observability tools and gain comprehensive visibility of their complete hybrid cloud estate.

Implementing deep observability will significantly accelerate progress in improving visibility into containers, east-west traffic and encrypted data to bolster security and totally eradicate the blind spots that are keeping today’s CISOs up at night.

Five Ways to Maximise the Security, Performance and Reliability of Your Online Business

Written by Bashar Bashaireh, Managing Director, Middle East & Turkey, Cloudflare

With a shift to digital transformation, enterprises face new challenges and opportunities for growth — from anticipating and meeting customers’ digital needs to mounting a strong defence against web-based attacks, overcoming latency issues, preventing site outages, and maintaining network connectivity and performance. When optimizing the online customer experience, enterprises need to adopt a strategy that integrates robust site security, performance, and reliability. Although this strategy involves many components, here are five key considerations that can help businesses meet customer needs and provide a secure and seamless user experience:

Leverage DNS and DNSSEC support to maximize availability and uptime
Frequently referred to as the ‘phone book of the Internet,’ DNS (domain name system) translates domain names into numeric IP addresses and enables browsers to load Internet resources. As DNS attacks become more prevalent, businesses are starting to realize that a lack of resilient DNS creates a weak link in their overall security strategy.

There are multiple approaches that companies can take to deploy a resilient DNS strategy. They can get a managed DNS provider that hosts all DNS records, offers query resolution at multiple nodes globally, and provides integrated DNSSEC support. DNSSEC adds a layer of security to the domain name system by adding cryptographic signatures to existing DNS records.

Companies can also build additional redundancy by deploying a multi-DNS strategy — even if the primary DNS goes down, secondary DNS helps keep the applications online. Large enterprises that prefer to maintain their own DNS infrastructure can implement a DNS firewall in conjunction with a secondary DNS. This setup adds a security layer to the on-prem DNS infrastructure and helps ensure overall DNS redundancy.
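The multi-DNS redundancy described above boils down to trying resolvers in order and falling back when one fails. In this sketch the resolvers are injected as plain callables (stubs standing in for real primary and secondary DNS services), so the failover logic can be shown without live network lookups; the hostname and the 203.0.113.x address are illustrative documentation values.

```python
# Sketch of multi-DNS redundancy: query resolvers in order, falling back
# to the next one when a lookup fails.

def resolve(hostname, resolvers):
    """Try each resolver (primary first, then secondaries) until one answers."""
    last_error = None
    for resolver in resolvers:
        try:
            return resolver(hostname)
        except OSError as exc:          # resolver down or query failed
            last_error = exc
    raise last_error or OSError("no resolvers configured")

def primary(host):
    raise OSError("primary DNS unreachable")   # simulate a primary outage

def secondary(host):
    return "203.0.113.10"                      # stubbed answer (documentation range)

print(resolve("app.example.com", [primary, secondary]))  # 203.0.113.10
```

Even with the primary stubbed as down, the query succeeds — which is exactly the property a secondary DNS buys in production.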

Accelerate content delivery by routing traffic across the least-congested routes
Today, the majority of web traffic is served through Content Delivery Networks (CDNs), including traffic from major sites like Amazon and Facebook. A CDN is a geographically distributed group of servers that help provide fast delivery of Internet content to globally dispersed users and can also reduce bandwidth costs.

With servers in multiple locations around the globe, a CDN is able to distribute content closer to website visitors, and in doing so, reduce any inherent network latency and improve page load times. CDNs also serve static assets from cache across their network, reducing the number of requests being made to hosted web servers and resulting in lower bandwidth and hosting costs.
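The caching behaviour that drives those bandwidth savings can be sketched as a single edge node: serve a static asset from local cache while its TTL is valid, and only go back to the origin on a miss. The class and its API are our own illustration, not any particular CDN's interface.

```python
import time

class EdgeCache:
    """Toy CDN edge node: serve static assets from cache, fall back to origin."""

    def __init__(self, origin_fetch, ttl=60):
        self.origin_fetch = origin_fetch   # callable standing in for the origin server
        self.ttl = ttl                     # seconds an asset stays fresh
        self.store = {}                    # path -> (expiry_time, body)
        self.origin_hits = 0               # how often we had to bother the origin

    def get(self, path):
        entry = self.store.get(path)
        if entry and entry[0] > time.time():
            return entry[1]                          # cache hit: no origin round trip
        body = self.origin_fetch(path)               # cache miss: fetch and store
        self.origin_hits += 1
        self.store[path] = (time.time() + self.ttl, body)
        return body

cdn = EdgeCache(lambda p: f"<contents of {p}>")
cdn.get("/logo.png"); cdn.get("/logo.png"); cdn.get("/logo.png")
print(cdn.origin_hits)  # 1 -- two of the three requests were served from cache
```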

Minimize the risk of site outages by globally load-balancing traffic
Maximizing server resources and efficiency can be a delicate balancing act. Cloud-based load balancers distribute requests across multiple servers in order to handle spikes in traffic. The load balancing decision takes place at the network edge, closer to the users — allowing businesses to boost response time and effectively optimize their infrastructure while minimizing the risk of server failure.
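The distribution step itself can be as simple as cycling through a pool of servers. The round-robin sketch below (server names are illustrative) shows the core decision a load balancer makes per request; production balancers layer health checks and latency-aware policies on top of this.

```python
import itertools

class RoundRobinBalancer:
    """Distribute incoming requests across a server pool in turn."""

    def __init__(self, servers):
        self.servers = list(servers)
        self._cycle = itertools.cycle(self.servers)  # endless rotation over the pool

    def pick(self):
        """Choose the server that handles the next request."""
        return next(self._cycle)

lb = RoundRobinBalancer(["edge-1", "edge-2", "edge-3"])
print([lb.pick() for _ in range(5)])
# ['edge-1', 'edge-2', 'edge-3', 'edge-1', 'edge-2']
```

Because no single server absorbs consecutive spikes, a burst of traffic is spread evenly, which is the property that reduces the risk of any one server failing under load.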

Protect web applications from malicious attacks
When securing web applications and other business-critical properties, a layered security strategy can help defend against many different kinds of threats.

  • Web application firewall protection – A web application firewall, or WAF, protects web applications by filtering and monitoring HTTP traffic. Cloud-based WAFs are typically the most flexible and cost-effective solution to implement, as they can be consistently updated to protect against new threats without significant additional work or cost on the user’s end.
  • DDoS attack protection – A DDoS attack is a malicious attempt to overburden servers, devices, networks, or surrounding infrastructure with a flood of illegitimate Internet traffic. By consuming all available bandwidth between targeted devices and the Internet, these attacks not only cause significant service disruptions but also have a tangible, negative impact on business, as customers are unable to access a business’s resources.
  • Malicious bot mitigation – Sites may become compromised when targeted by malicious bot activity, which can overwhelm web servers, skew analytics, prevent users from accessing webpages, steal user data, and compromise critical business functions. By implementing a bot management solution, businesses can distinguish between useful and harmful bot activity and prevent malicious behaviour from impacting user experience.
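One common building block behind both DDoS and bot mitigation is per-client rate limiting. The token-bucket sketch below (a standard technique, not any specific vendor's implementation) lets legitimate request rates through while rejecting the excess of a flood; the rate and burst numbers are illustrative.

```python
import time

class TokenBucket:
    """Per-client rate limit: tokens refill at a steady rate, each request
    spends one, and requests arriving with an empty bucket are rejected."""

    def __init__(self, rate, burst):
        self.rate = rate          # tokens added per second
        self.burst = burst        # maximum bucket size (allowed burst)
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=10, burst=5)         # 10 req/s sustained, burst of 5
results = [bucket.allow() for _ in range(8)]   # 8 near-simultaneous requests
print(results.count(True))  # 5 -- the burst passes; the flood's excess is dropped
```

A real edge deployment keeps one bucket per client IP (or per fingerprint, for bots), so a flood from one source exhausts only its own bucket while other users are unaffected.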

Keep your network up and running

  • Protect your network infrastructure – It’s not enough to just protect web servers. Enterprises often have on-premise network infrastructure hosted in public or private data centres that needs protection from DDoS attacks, too. Many DDoS mitigation providers rely on one of two methods for stopping an attack: scrubbing centres or on-premise scanning and filtering via hardware boxes. The problem with both approaches is that they impose a latency penalty that can adversely affect a business. A better way to detect and mitigate DDoS attacks is to do so close to the source — at the network edge. By scanning traffic at the closest data centre in a global, distributed network, high service availability is assured, even during substantial DDoS attacks. This approach reduces the latency penalties that come from routing suspicious traffic to geographically distant scrubbing centres. It also leads to faster attack response times.
  • Protect TCP/UDP applications – At the transport layer, attackers may target a business’s server resources by overwhelming all available ports on a server. These DDoS attacks can cause the server to respond slowly to legitimate requests — or not at all. Preventing attacks at the transport layer requires a security solution that can automatically detect attack patterns and block attack traffic.

In conclusion, creating a superior online experience requires the right security and performance strategy — one that not only enables enterprises to accelerate content delivery, but ensures network reliability and protects their web properties from site outages, data theft, and other critical attacks.

Copyright © 2021 Security Review Magazine. Rysha Media LLC. All Rights Reserved.