Cloud Security
Cloud security refers to an array of policies, technological procedures, services, and solutions
designed to support safe functionality when building, deploying, and managing cloud-based
applications and associated data, whether operating in public, private, or hybrid cloud
environments.
Cloud security creates and maintains preventative strategies and actions to combat any threat to
networked systems and applications.
Key Areas of Cloud Security
Security Policy Framework
• When creating a secure cloud solution, organizations must adopt strong security policies and
governance to mitigate risk and meet accepted standards for security and compliance.
• As organizations increasingly adopt cloud environments, they establish cloud-specific security
policies that are often an extension of their corporate security policy.
• To ensure a successful cloud adoption, both cloud service consumers and cloud service providers
need to establish and follow their respective cloud security policies.
• These security policies are often aligned to the cloud consumption and delivery models:
Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a
Service (SaaS).
Basic Concepts (CIA) - Confidentiality, Integrity, and Availability
Cloud security is a mix of technologies and controls to safeguard the data, and policies to protect the
data, services, and infrastructure.
This combination is a target of possible attacks. Although servers are more powerful and reliable
than client machines, the cloud faces many threats, not only from outsiders but also from
insiders who can exploit cloud vulnerabilities to do harm.
These threats may jeopardize data confidentiality, data integrity, and data availability. A model
designed to guide policies for information security within an organization, the CIA triad, is a
starting point for security matters.
The elements of the triad are considered the three most crucial components of security.
Confidentiality is a set of rules that limits access to information.
Integrity is the assurance that the information is trustworthy and accurate.
Availability is a guarantee of reliable access to the information by authorized people.
Confidentiality
Confidentiality refers to preventing unauthorized access to data, ensuring that only users with
the appropriate permissions can access it.
Because a greater number of parties, devices, and applications are involved, the threat of
compromise of data is high; each additional party increases the number of points of access.
Since confidentiality plays a major role in protecting organizational or individual data,
information security protocols should be implemented at various layers of cloud applications.
Confidentiality breach in the cloud
• There is always a possibility that the data stored in the cloud may mingle with other users' data.
• Data can also be compromised unintentionally due to data remanence.
• Data remanence is the residual representation of data that remains even after efforts are
made to erase it.
• Confidentiality can also be compromised by untrustworthy cloud service providers (CSPs).
Ensuring Confidentiality in the cloud
• Confidentiality can be ensured through strong encryption techniques. Basically, there are two
different approaches to achieving confidentiality: physical isolation and cryptography.
• Biometric encryption (BE) - Confidentiality of biometric data can be achieved through biometric
encryption. Biometric identification includes iris, voice, fingerprint, face recognition, etc.
Biometric identification has shortcomings, however; unlike a password, a compromised biometric
cannot simply be replaced.
• Secret sharing scheme - A secret sharing scheme is any method for distributing a secret
among a group of participants, each of whom is allocated a share of the secret. The secret can only
be reconstructed when the shares are combined; individual shares are of no use on their own
(see the sketch after this list).
• Confidentiality preserving through encryption and obfuscation.
• HPI_SECURE - HPISecure allows the user to store an encrypted version of their data in the cloud
without breaking the functionality of the application. HPISecure intercepts HTTP
request/response objects, encrypting data before it is transmitted to the cloud and decrypting
data received back from the server (a generic client-side encryption sketch follows this list).
• Ensure confidentiality using encryption and trust-based solution techniques.
• K-NN classifier for data confidentiality - A K-NN data classification technique can be applied in
the cloud virtual environment. The aim of using K-NN is to classify data based on its security
needs (a toy sketch follows this list).
• Encryption needs to be integrated with some other techniques to provide better and stronger
security.
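As an illustration of the secret sharing idea above, here is a minimal n-of-n scheme in Python
using XOR: every share is required to reconstruct the secret, and any subset of shares looks like
random noise. Real deployments more often use threshold schemes such as Shamir's secret
sharing; this sketch only demonstrates the basic principle.

```python
import secrets

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split a secret into n XOR shares; all n shares are needed to rebuild it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = bytearray(secret)
    for share in shares:                 # fold each random share into the last one
        for i, b in enumerate(share):
            last[i] ^= b
    return shares + [bytes(last)]

def reconstruct(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the secret."""
    result = bytearray(len(shares[0]))
    for share in shares:
        for i, b in enumerate(share):
            result[i] ^= b
    return bytes(result)

# Individual shares reveal nothing on their own; all three are required.
shares = split_secret(b"database-master-key", 3)
assert reconstruct(shares) == b"database-master-key"
```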
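The HPISecure description above follows the general pattern of client-side encryption: data is
encrypted before it leaves the client and decrypted when it comes back, so the cloud stores only
ciphertext. The sketch below shows that pattern, not HPISecure's actual implementation; it uses
the third-party cryptography package, and the upload/download functions are hypothetical
stand-ins for a real cloud API.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # kept on the client, never sent to the cloud
cipher = Fernet(key)

def upload(record: bytes) -> bytes:
    """Encrypt before transmitting; this is all the cloud ever stores."""
    return cipher.encrypt(record)

def download(stored: bytes) -> bytes:
    """Decrypt data received back from the server."""
    return cipher.decrypt(stored)

ciphertext = upload(b"customer-record-42")
assert download(ciphertext) == b"customer-record-42"
```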
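The K-NN bullet above can be made concrete with a toy classifier that assigns each data item a
security class, so that, for example, only "high" items incur the cost of the strongest
encryption. The features and labels below are invented for illustration and are not taken from
the cited technique.

```python
from collections import Counter

# Feature vector: (contains_pii, regulatory_data, shared_externally) as 0/1 flags.
training = [
    ((1, 1, 0), "high"),
    ((1, 0, 0), "high"),
    ((0, 1, 1), "medium"),
    ((0, 0, 1), "low"),
    ((0, 0, 0), "low"),
]

def classify(item, k=3):
    """Return the majority label among the k nearest training points."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(training, key=lambda t: dist(t[0], item))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

print(classify((1, 1, 1)))  # -> "high": encrypt with the strongest scheme
```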
Integrity
Data integrity refers to maintaining and assuring the accuracy and consistency of data over its
entire lifecycle.
Data that is stored in the cloud can be damaged in transit to or from cloud data
storage.
Since the data and computation are outsourced to a remote server, the data integrity should be
maintained and checked constantly in order to prove that data and computation are intact.
Data integrity means data should be kept from unauthorized modification. Any modification to
the data should be detected.
Computation integrity means that program execution should be as expected and be kept from
malware, an insider, or a malicious user that could change the program execution and render an
incorrect result. Any deviation from normal computation should be detected.
Integrity should be checked at both the data level and the computation level. Data integrity
checks can help recover lost data or raise an alert when data has been manipulated.
Data Integrity violation in the Cloud
The data integrity could be violated through Data Loss or Manipulation and Untrusted Remote
Server Performing Computation.
Data Loss or Manipulation:
Users have huge numbers of files, so cloud providers offer Storage as a Service. Those files may
be accessed every day or only rarely, so there is a strong need to keep them correct. This need
arises from the nature of cloud computing: the data is outsourced to a remote cloud that may be
unsecured and unreliable. Because the cloud cannot be fully trusted, the data might be lost or
modified by unauthorized users; in many cases, data could be altered intentionally or
accidentally. There are also administrative errors that can cause loss of data, such as taking or
restoring incorrect backups. An attacker could exploit users' outsourced data, since the owners
have lost direct control over it.
Untrusted Remote Server Performing Computation:
Cloud computing is not just about storage: some intensive computations need cloud processing
power, so users outsource their computations as well. Since the cloud provider sits outside the
task owner's security boundary and is not transparent to the owner, no one can prove whether
the computation's integrity is intact. The cloud provider may even behave in such a way that no
one discovers a deviation from normal execution. Because resources cost the cloud provider
money, the provider might not execute the task properly. Even if the cloud provider is
considered trustworthy, many issues remain, such as those arising from the provider's
underlying systems, vulnerable code, or misconfiguration.
Protecting Data Integrity in the Cloud
Tenants of cloud systems commonly assume that if their data is encrypted before outsourcing it
to the cloud, it is secure enough. Although encryption provides solid confidentiality against
attacks from a cloud provider, it does not protect the data from corruption caused by
configuration errors and software bugs.
There are two traditional ways of proving the integrity of data outsourced to a remote server:
the integrity of the data can be checked by the client or by a third party. Checking the data
integrity on the client side involves a message authentication code (MAC) algorithm.
1. The first step is downloading the file and then checking the hash value.
2. The second step is to compute that hash value in the cloud by using a hash tree.
Sometimes, when the provided service is just storage without computation, the user either
downloads the file, as in the first case, or sends it to a third party, which consumes more
bandwidth. Therefore, there is a need for a way to check data integrity while saving
bandwidth and computation power.
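A minimal sketch of the client-side check described above: before uploading, the client computes
a MAC over the file with a key only it knows; after downloading, it recomputes the MAC and
compares. This is exactly the approach whose drawback is noted above, since verification
requires downloading the whole file.

```python
import hmac
import hashlib

def compute_mac(key: bytes, data: bytes) -> bytes:
    """HMAC-SHA256 over the file contents."""
    return hmac.new(key, data, hashlib.sha256).digest()

key = b"client-held secret key"           # never shared with the cloud
original = b"file contents sent to the cloud"
stored_mac = compute_mac(key, original)   # kept locally before upload

downloaded = original                     # what the cloud later returns
if hmac.compare_digest(stored_mac, compute_mac(key, downloaded)):
    print("integrity verified")
else:
    print("data was lost or manipulated")
```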
These are a few different ways in which data integrity can be achieved in the cloud:
1. Third-Party Auditor
2. Provable Data Possession
3. Proof of Retrievability
4. Proof of Ownership
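As a deliberately simplified illustration of the challenge-response idea behind schemes such as
Provable Data Possession: before outsourcing, the client precomputes answers to a few random
challenges; later, the server can answer an unused challenge correctly only if it still holds the
intact file. Real PDP and Proof of Retrievability protocols avoid hashing the whole file per
challenge by using homomorphic tags and random block sampling, which this sketch does not show.

```python
import hashlib
import secrets

def response(nonce: bytes, data: bytes) -> bytes:
    """The proof the server must produce for a given challenge."""
    return hashlib.sha256(nonce + data).digest()

# Before outsourcing: precompute answers to a few random challenges,
# then the local copy of the file can be deleted.
file_data = b"outsourced file contents"
challenges = [secrets.token_bytes(16) for _ in range(5)]
expected = {c: response(c, file_data) for c in challenges}

# Later: issue one unused challenge; the cloud computes the proof.
nonce = challenges.pop()
proof = response(nonce, file_data)        # computed on the server
assert secrets.compare_digest(expected[nonce], proof)
```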
Availability
Availability and business continuity management ensures your infrastructure, runtime
components, and management components are highly available.
Deploying apps to multiple geographic regions enables continuous availability that protects
against unplanned, simultaneous loss of multiple hardware or software components, or the loss
of an entire data center.
Separating those components that track the state of interactions (stateful) from those that do
not (stateless) allows the cloud to move apps flexibly as needed to achieve scalability and
resiliency.
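A minimal sketch of how a stateless client might take advantage of multi-region deployment: it
tries each regional endpoint in turn and falls back when one is unreachable. The endpoint URLs
are hypothetical placeholders, not a real provider's API.

```python
import urllib.request
import urllib.error

# Hypothetical regional endpoints for the same stateless service.
REGION_ENDPOINTS = [
    "https://app.us-east.example.com/health",
    "https://app.eu-west.example.com/health",
    "https://app.ap-south.example.com/health",
]

def fetch_from_first_healthy_region(timeout: float = 2.0) -> bytes:
    """Return the response from the first region that answers."""
    last_error = None
    for url in REGION_ENDPOINTS:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()        # first healthy region wins
        except (urllib.error.URLError, OSError) as exc:
            last_error = exc              # region unreachable: try the next
    raise RuntimeError(f"all regions unavailable: {last_error}")
```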
The availability of the cloud means the set of resources is accessible at any time by
authorized entities. Availability is considered one of the essential security requirements in
cloud computing, as it ensures that cloud users can use resources at any time and from any
place.
In some cases, resources are spread across diverse locations, yet they must be provided to
cloud users without any interruption during cloud service access.
Data centers provide a huge number of services hosted on multiple servers. Supporting these
services, and the large data transfers they involve, requires proper network links with high
bandwidth.
There are several types of security attacks that affect the availability of a cloud-based service,
such as DoS, DDoS, flooding attacks, DNS reflection, and amplification attacks.
Denial of Service (DoS) attacks are classified into two categories: direct and indirect
attacks.
• In a direct attack, the server is overloaded either by a single malicious request that exploits a
vulnerability or by being forced to process numerous requests.
• In an indirect attack, a flood of packets saturates the network links or intermediate
routers with bogus requests, cutting off honest connections as the bandwidth capacity is
reached.
Overcoming the impact of DoS attacks requires setting up a High Availability (HA)
environment that spreads across multiple data centers, along with a proper Disaster
Recovery (DR) plan.
Availability is best ensured by rigorously maintaining all hardware, performing hardware repairs
immediately when needed, and maintaining a correctly functioning operating system environment
that is free of software conflicts.
Continuously maintain system health by upgrading often to eliminate risks. Redundancy,
failover, and High Availability clusters can mitigate the serious consequences of hardware
failures.
Fast and adaptive disaster recovery plans must be in place to handle the loss of service due
to natural disasters or malicious attacks (such as DDoS).