The Rise of Cloud Computing: Data Protection, Privacy, and Open Research Challenges—A Systematic Literature Review (SLR)
Junaid Hassan
1 Department of Computer Science, National University of Computer and Emerging Sciences, Islamabad, Chiniot-Faisalabad Campus, Chiniot 35400, Pakistan
Danish Shehzad
2 Department of Computer Science, Superior University, Lahore 54000, Pakistan
Usman Habib
3 Faculty of Computer Sciences and Engineering, GIK Institute of Engineering Sciences and Technology, Topi, Swabi 23640, Khyber Pakhtunkhwa, Pakistan
Muhammad Umar Aftab
Muhammad Ahmad
Ramil Kuleev
4 Institute of Software Development and Engineering, Innopolis University, Innopolis 420500, Russia
Manuel Mazzara
Associated Data
The data used to support the findings of this study are provided in this article.
Abstract

Cloud computing is a long-standing dream of computing as a utility, where users can store their data remotely in the cloud and enjoy on-demand services and high-quality applications from a shared pool of configurable computing resources. Thus, the privacy and security of data are of utmost importance to all of its users, regardless of the nature of the data being stored. In cloud computing environments, this is especially critical because data are stored in various locations, even around the world, and users do not have any physical access to their sensitive data. Therefore, certain data protection techniques are needed to protect the sensitive data that are outsourced to the cloud. In this paper, we conduct a systematic literature review (SLR) to illustrate the data protection techniques that protect sensitive data outsourced to cloud storage. The main objective of this research is to synthesize, classify, and identify important studies in the field. Accordingly, an evidence-based approach is used in this study. Preliminary results are based on answers to four research questions. Out of 493 research articles, 52 studies were selected. These 52 studies use different data protection techniques, which can be divided into two main categories, namely noncryptographic techniques and cryptographic techniques. Noncryptographic techniques consist of data splitting, data anonymization, and steganographic techniques, whereas cryptographic techniques consist of encryption, searchable encryption, homomorphic encryption, and signcryption. In this work, we compare all of these techniques in terms of data protection accuracy, overhead, and the operations they allow on masked data. Finally, we discuss the future research challenges facing the implementation of these techniques.
1. Introduction
Recent advances have given rise to the popularity and success of cloud computing. It is a new computing and business model that provides on-demand storage and computing resources. The main objective of cloud computing is to gain financial benefits as cloud computing offers an effective way to reduce operational and capital costs. Cloud storage is a basic service of cloud computing architecture that allows users to store and share data over the internet. Some of the advantages of cloud storage are offsite backup, efficient and secure file access, unlimited data storage space, and low cost of use. Generally, cloud storage is divided into five categories: (1) private cloud storage, (2) personal cloud storage, (3) public cloud storage, (4) community cloud storage, and (5) hybrid cloud storage.
However, when we outsource data and business applications to a third party, security and privacy issues become a major concern [ 1 ]. Before outsourcing private data to the cloud, there is a need to protect private data by applying different data protection techniques, which we will discuss later in this SLR. After outsourcing the private data to the cloud, sometimes the user wants to perform certain operations on their data, such as secure search. Therefore, while performing such operations on private data, the data needs to be protected from intruders so that intruders cannot hack or steal their sensitive information.
Cloud computing owes many of its advantages to its underlying technical resources. For example, it has made it possible to store large amounts of data, perform computation on data, and provide many other services. In addition, the cloud computing platform reduces the cost of services and also solves the problem of limited resources by sharing important resources among different users. Performance and resource reliability require that the platform be able to tackle security threats [ 2 ]. In recent years, cloud computing has become one of the most important topics in security research, which includes software security, network security, and data storage security.
The National Institute of Standards and Technology (NIST) defines cloud computing as [ 3 ] “a model for easy access, ubiquitous, resource integration, and on-demand access that can be easily delivered through various types of service providers.” The Pay as You Go (PAYG) mechanism is followed by cloud computing, in which users pay only for the services they use. The PAYG model gives users the ability to develop platforms and storage and to customize the software according to the needs of the end-user or client. These advantages are the reason that the research community has put so much effort into this modern concept [ 4 ].
Security is achieved by ensuring confidentiality, integrity, and data availability. Cloud users want assurance that their data will remain safe while using cloud services. Various types of attacks can be launched against a user's private data, such as intrusion attacks, hacking, theft of the user's private data, and denial-of-service attacks; 57% of companies report security breaches when using cloud services [ 5 ]. Data privacy is even more important than data security because cloud service providers (CSPs) have full access to all cloud users' data and can monitor their activities, compromising user privacy. For example, suppose a user is diabetic and the CSP analyzes their activities, such as what they search for most and what kind of medicine they use most often. With this access, the CSP can gather all the sensitive information about an individual user and can also share this information with a medicine company or an insurance company [ 6 ]. Another problem is that the user cannot fully trust the CSP. Because of this mistrust, there are many legal issues, and users cannot store their sensitive data on unreliable cloud services. As a result, many users cannot use cloud services to store their personal or sensitive data in the cloud. One way to mitigate this problem is for the user to install a proxy on their side; this proxy takes the user's data, protects and saves it using some data protection technique, and then sends it to the untrusted CSP [ 7 ].
Google's recent privacy policy states that any user can use any Google service free of cost; however, Google monitors their activity by analyzing their data to improve its services [ 8 ]. In this paper, we compare different types of data protection techniques that provide privacy and security for data stored on the cloud. Many papers discuss outsourcing data storage to the cloud [ 9 , 10 ]; we also discuss how the outsourced data can be secured on the cloud. Most papers describe data security on the cloud against external intruder attacks [ 11 , 12 ]. This paper discusses not only security attacks from outside intruders and the corresponding protection mechanisms but also insider attacks from the CSP itself. Many surveys cover data privacy through cryptographic techniques [ 13 , 14 ]. These cryptographic techniques are very powerful for the protection of data and provide very significant results. However, they require key management, and some cloud functionalities do not work on cryptographically protected data. In this paper, we also discuss some steganographic techniques. To the best of our knowledge, no study discusses all the conventional and nonconventional security techniques; therefore, this SLR brings all the data protection techniques together in one paper.
The rest of this paper is organized as follows: Section 2 reviews related work. Section 3 describes the research methodology, which consists of the inclusion and exclusion criteria, quality assessment criteria, study selection process, research questions, and data extraction process; we also discuss assumptions and requirements for data protection in the cloud. Section 4 presents all the cryptographic and noncryptographic techniques that are used for data protection over the cloud, discusses the demographic characteristics of the relevant studies by considering four aspects, namely (i) publication trend, (ii) publication venues (proceedings and journals), (iii) number of citations, and (iv) author information, and compares all these data protection techniques. Lastly, in Section 5 , we discuss the results and present the conclusion and future work.
2. Related Work
The provable data possession (PDP) model, which provides access control and data integrity, was first proposed in [ 15 ]; it offers two applications based on the RSA algorithm. Similar to PDP, the authors of [ 16 ] proposed a proof of retrievability (PoR) scheme that is used to ensure the integrity of remote data. PoR efficiency is improved using a shorter authentication tag integrated with the PoR system [ 17 ]. A more flexible PDP scheme is proposed by the authors of [ 18 ], which uses symmetric key encryption techniques to support dynamic operations. A PDP protocol with more flexible functionality is developed, in which blocks can be added at run time [ 19 ]. A new PDP system with a different data structure is introduced, which improves flexibility and performance [ 20 ]. Similarly, another PDP model with a different data structure is designed to handle its data functionality [ 21 ]. To improve the accuracy of the data, the authors of [ 22 ] designed a multireplica data verification scheme that fully supports dynamic data updates.
A data integrity protocol [ 23 ] for multicloud servers is developed. The authors of [ 24 ] also consider the complex setting where multiple copies are stored with multiple CSPs and build a solid system to ensure the integrity of all copies at once. A proxy PDP scheme [ 25 ] is proposed, which supports the delegation of data checking. In addition, the restrictions on the verifier are removed, which strengthens the scheme, and a separate PDP certification system is proposed [ 26 ]. To maintain the security of information, a concept for information security is proposed and a PDP protocol for public auditing is developed [ 27 ]. To resolve the certificate management issue, a PDP system with data protection is introduced [ 28 ].
Identity-based cryptography is developed, in which a user's unique identity is used as input to generate a secret key [ 29 ]. Another PDP protocol is recommended to ensure confidentiality [ 30 ]. The author of the paper [ 31 ] proposed a scheme, in which tags are generated through the ring signature technique for group-based data sharing that supports public auditing and maintains user privacy. A new PDP system is introduced for data sharing over the cloud while maintaining user privacy [ 32 ]. Additionally, it supports the dynamic group system and allows users to exit or join the group at any time. Another PDP system [ 33 ] that is based on broadcast encryption and supports dynamic groups [ 34 ] is introduced. The issue of user revocation has been raised [ 35 ], and to address this issue, a PDP scheme has been proposed, which removes the user from the CSP using the proxy signature method. A PDP-based group data protocol was developed to track user privacy and identity [ 36 ]. A PDP system [ 37 ] is proposed for data sharing between multiple senders. The author of the paper [ 38 ] provides SEPDP systems while maintaining data protection. However, the author of the paper [ 39 ] proved that the scheme proposed in [ 38 ] is vulnerable to malicious counterfeiting by the CSP. A collision-resistant user revocable public auditing (CRUPA) system [ 40 ] is introduced for managing the data that is shared in groups. Another scheme [ 41 ] is introduced as a way to ensure the integrity of mobile data terminals in cloud computing.
To address the PKI issue, identity-based encryption [ 42 ] is designed to enhance the PDP protocol and maintain user privacy in a dynamic community. Before sharing user-sensitive data with third parties or researchers, data owners must ensure that the privacy of that data is protected; this can be done using data anonymization techniques [ 43 ]. In recent years, the research community has focused on the privacy-preserving data publishing (PPDP) research area and developed several approaches for tabular data and social networks (SNs) [ 44 – 49 ]. There are two popular settings in PPDP: one is interactive, and the other is noninteractive [ 50 ]. The k-anonymity model [ 51 ] and its variants are most commonly used in the noninteractive setting of PPDP [ 52 – 56 ]. Differential privacy (DP) [ 57 ] is used extensively in the interactive setting of PPDP [ 58 – 60 ]. Meanwhile, several studies have reported DP-based approaches for the noninteractive setting [ 61 ]. Researchers have also extended the concepts used to anonymize tabular data to protect the privacy of SN users [ 62 – 64 ].
Most images on the internet are in a compressed form. Hence, various studies design techniques for AMBTC (absolute moment block truncation coding)-compressed images. Data concealment has become an active research area: we can hide data by adding confidential information to a cover image, and as a result, we get a stego image. There are two types of data hiding schemes: one is irreversible [ 65 – 68 ], and the other is reversible [ 69 – 71 ]. A ciphertext intended for one recipient can be re-encrypted for another by a semitrusted proxy without decryption [ 72 ]. The first concrete construction of a collusion-resistant unidirectional identity-based proxy re-encryption scheme, for both selective and adaptive identity, is proposed in [ 73 ]. One of the most widely used data hiding schemes is histogram shifting [ 74 – 76 ]. A histogram-shifting data hiding scheme [ 77 ] that uses pixel histograms of the cover image is introduced. When big and diverse data are distributed everywhere, vicious attacks cannot be fully controlled; therefore, cryptosystems are needed to protect the data [ 78 – 80 ].
Some identity-based signature (IBS) schemes [ 81 – 84 ] based on bilinear pairing are introduced. Authentication schemes based on bilinear pairing over elliptic curves are more efficient and safer than traditional public key infrastructure [ 85 , 86 ]. The paper [ 87 ] proposes a privacy-preserving proxy re-encryption scheme for public cloud access control. A differential attack is performed on one-to-many order-preserving encryption (OPE) by exploiting the differences of the ordered ciphertexts in [ 88 ]. Another scheme is proposed, which consists of a cancelable biometric template protection scheme based on format-preserving encryption and Bloom filters [ 89 ]. Some researchers also use pairing-free identity-based signature schemes [ 90 – 93 ]. A lightweight proxy re-encryption scheme with certificate-based and incremental cryptography for fog-enabled e-healthcare is proposed in [ 94 ].
3. Research Methodology
The objective of this SLR is to evaluate, investigate, and identify the existing research in the context of data storage security in cloud computing and to evaluate all the existing techniques. An SLR is a fair and unbiased way of evaluating existing work, providing a complete, evidence-based search on a specific topic. To date, no SLR on data storage security techniques covers both the cryptographic and noncryptographic techniques; this SLR fills that gap. It follows the systematic method and guidelines for an SLR provided by Kitchenham [ 95 ]. Furthermore, to increase the strength of our evidence, we follow another study provided by [ 96 ]. Our SLR consists of three phases, namely planning, conducting, and reporting. By following these three phases, we conduct our SLR, as shown in Figure 1 .

Figure 1. Review procedure.
3.1. Research Questions
The primary research question of this systematic literature review is “What types of data protection techniques have been proposed in cloud computing?” This primary research question is further divided into four RQs. All these four questions are enlisted below.
- RQ1: what types of data protection techniques have been proposed in cloud computing?
- RQ2: what are the demographic characteristics of the relevant studies?
- RQ3: which data protection technique provides more data protection among all the techniques?
- RQ4: what are the primary findings, research challenges, and directions for future research in the field of data privacy in cloud computing?
3.2. Electronic Databases
Six electronic databases were selected to collect primary search articles. All these six electronic databases are well-reputed in the domain of cloud computing. Most of the relevant articles are taken from two electronic databases, namely IEEE and Elsevier. All the electronic databases that we use in this research process are given in Table 1 .
Table 1. Database sources.
3.3. Research Terms
First of all, a title-based search is performed on the different electronic databases given in Table 1 , and the most related studies/articles are selected. The search uses a string of the form (p1 OR p2 OR … OR pn) AND (t1 OR t2 OR … OR tn). This string/query is constructed using the population, intervention, comparison, and outcomes (PICO) structure; here, only the population, intervention, and outcome elements are used. The database search queries are given in Table 2 .
- Population : “cloud computing”
- Intervention : “data security,” “data privacy,” “data integrity”
- Using the PICO structure, we construct a generic query for the electronic databases: ((“Document Title”: cloud∗) AND (“Document Title”: data AND (privacy OR protect∗ OR secure∗ OR integrity∗))).
Table 2. Database search queries.
3.4. Procedure of Study Selection
The procedure of study selection is described in Figure 2 . This procedure has three phases: the first is exclusion based on the title, in which articles with irrelevant titles are excluded; the second is exclusion based on the abstract, in which articles are excluded by reading their abstracts and only the most relevant ones are retained; and the last is exclusion based on the full text, which also applies the quality assessment criteria.

Figure 2. Study selection procedure.
3.5. Eligibility Control
In this phase, all the selected papers are read in full, and the relevant papers are selected for further processing in our SLR. Table 3 shows the papers finally selected from each database based on the inclusion and exclusion criteria, which are given in Table 4 .
Table 3. Results from electronic databases.
Table 4. Inclusion and exclusion criteria.
3.6. Inclusion and Exclusion Criteria
We use the inclusion and exclusion criteria to define eligibility for basic study selection. We apply them to the studies that remain after reading the abstracts of the papers. The criteria are set out in Table 4 , which outlines the conditions that we applied to the articles. After applying the inclusion and exclusion criteria, we obtain the relevant articles that are finally added to our SLR. The search period is from 2010 to 2021, and most of the papers included in our SLR are from 2015 onward.
We apply the inclusion and exclusion criteria in the third phase of the study selection process and get 139 results. After that, we also apply the quality criteria, and finally, we get the 52 articles that are included in this SLR. Most of the articles are taken from the Elsevier and IEEE electronic databases; IEEE is the largest venue for data storage security in cloud computing. The ratio of selected articles from the different electronic databases is shown in Figure 3 .

Figure 3. Percentage of selected studies.
3.7. Quality Assessment Criteria
Quality assessment is done in the third phase of the study selection process. A binary scale (0 or 1) is used for the quality assessment (QA) of the articles.
Poor-quality articles get 0 points, and good-quality articles get 1 point; only articles with 1 point are included in this SLR. Hence, by applying the quality assessment criteria to all the articles, we finally get 52 articles. All the selected papers have validity and novelty for different data protection techniques, and we also check the relevance of the articles during quality assessment, which ensures that all the articles are related to the topic of this SLR (data storage protection and privacy in cloud computing). The quality checking (QC) criteria are given in Table 5 .
Table 5. Quality checking criteria.
3.8. Taxonomy of the Data Protection Techniques
In this section, all the data protection techniques are depicted in Figure 4 . All the data protection techniques are arranged and classified in their related categories. The purpose of the taxonomy is to give a presentational view of all the data protection techniques. The data protection techniques are mainly divided into two categories, namely (1) noncryptographic techniques and (2) cryptographic techniques.

Figure 4. Taxonomy of the data protection techniques.
4. Results and Discussions
Data protection on the cloud is achieved by developing a third-party proxy that is trusted by the user. The trusted proxy is not a physical entity; it is a logical entity that can be deployed on the user end (e.g., on the user's personal computer) or at a location that the user trusts. Mostly, local proxies are implemented as an additional service or module (like a browser plugin). To fulfill the objective of data protection by proxies, certain requirements must be met. The requirements are given below:
- User privilege. There are several objectives of user privilege or user empowerment, however, the main objective is to increase the trust of the users in data protection proxies used by the cloud.
- Transparency. Another important objective is that when users outsource their sensitive data to trusted proxies, their data should remain the same and should not be altered.
- Low overhead. Cloud computing provides large computing power and cost-saving resources. However, one concern is that increasing data security should not increase the computation overhead; we want to minimize the computation overhead on the proxies.
- Cloud functionalities preservation. Cloud functionalities preservation is the most important objective. The users encrypt their sensitive data on their personal computers by applying different encryption techniques to increase the protection of their data, however, by applying these different encryption techniques, they are not able to avail some of the cloud functionalities because of compatibility issues [ 97 ]. Hence, it is the main issue.
Figure 5 provides a data workflow for protecting sensitive data on the cloud using a local proxy. There are different types of the assumption that are made for data protection, and some of them are discussed below.
- Curious CSPs. This is the most commonly used threat model in the cloud computing literature [ 98 ]. The cloud service provider honestly fulfills its responsibilities, i.e., it does not interfere in user activities and only follows the standard protocols. However, the CSP is sometimes curious to analyze the users' queries and their sensitive data, which violates the protocol and compromises user privacy. We can prevent this by applying data protection techniques on the user end to protect the users' sensitive data from the CSP.
- In some cases, CSPs may collaborate with data protection proxies that are present on the users' sides to increase the level of trust between the users and CSPs because better trust can motivate more users to move to the cloud. This collaboration can be done if CSPs provide some services to the users with a stable interface for storing, searching, and computing their data.
- A multicloud approach to cloud computing infrastructure has also been proposed to improve performance. In this approach, multiple cloud computing services are provided in the same heterogeneous architecture [ 19 ]. A multicloud gives the user multiple places to store their data at desired locations. There are several benefits to using a multicloud, e.g., it reduces reliance on a single CSP, which increases flexibility.

Figure 5. Data workflow on the cloud using a local proxy.
4.1. RQ1: What Types of Data Protection Techniques Have Been Proposed in Cloud Computing?
In this section, we discuss all the techniques for data storage security over the cloud. These techniques are divided into two main categories, namely (i) cryptographic techniques and (ii) noncryptographic techniques. The local proxy uses different techniques to protect data stored on the cloud, and because of this protection, not all the advantages of cloud services can be retained. Therefore, we analyze and compare all these techniques based on the following criteria: (i) the data accuracy each technique retains, (ii) the data protection level it provides, (iii) the functionalities each scheme allows on masked data, and (iv) the overhead of encrypting and decrypting data over the cloud.
4.1.1. Noncryptographic Techniques
There are some noncryptographic techniques, and we discuss them in this paper as follows:
(1) Data Anonymization . Data anonymization is a data privacy technique used to protect a user's personal information. It hides personal information by hiding the identifiers or attributes that could reveal a person's identity. Data anonymization can be achieved through various mechanisms, for example, by removing or hiding identifiers or attributes, or by encrypting the user's personal information. The main purpose of data anonymization is to hide the identity of the person by some means. Data anonymity means that the user's personal data are altered in such a way that the person cannot be identified directly or indirectly, and the CSP cannot retrieve any personal information. Data anonymization techniques were developed in the field of statistical disclosure control. They are most often used when sensitive data are outsourced for testing purposes. Data anonymization is graphically represented in Figure 6 .

Figure 6. Data anonymization flow diagram.
Data anonymization techniques are most often used when we want to outsource sensitive data for testing purposes. For example, if some doctors want to diagnose certain diseases, some details of these diseases are required for this purpose. This information is obtained from the patients who suffer from these diseases, but it is illegal to share or disclose anyone's personal information. For this purpose, we use data anonymization techniques to hide or conceal the person's personal information before outsourcing the data. In some cases, however, the CSP wants to analyze the user's masked data. In data anonymization techniques, attributes are the most important element. Attributes can include name, age, gender, address, salary, etc. Table 6 shows the identifiers classification.
Table 6. Identifiers classification.
Data anonymization can be performed horizontally or vertically on this table and also on the record or group of records. The attributes are further classified into the following categories.
- Sensitive Attributes: sensitive attributes possess sensitive information of the person, such as salary, disease information, phone number, etc. These attributes are strongly protected by applying some protection techniques.
- Nonsensitive Attributes: these types of attributes do not belong to any type of category. Hence, they do not disclose the identity of a person.
- Identifiers: identifiers directly reveal the identity of a person, such as ID card number, name, social security number, etc. Because of the presence of these identifiers, the relationships between different attributes can be detected. Hence, these identifiers must be removed or anonymized.
- Quasi-Identifiers: quasi-identifiers are a group of attributes that are available publicly, such as zip code, designation, gender, etc. Individually, these identifiers cannot reveal a person's identity; however, combined, they may reveal it. Hence, we want to separate these quasi-identifiers to avoid disclosure.
There are two main categories of data masking: (1) perturbative masking and (2) nonperturbative masking.
- (1) Perturbative Masking
- In perturbative masking, data is altered or masked with dummy datasets. The original data is replaced with dummy data that looks like the original data with some added noise. The masked data retains the statistical properties of the original data, whereas nonperturbatively masked data does not, because perturbative masking replaces values with physically similar but dummy data.
- Data swapping
- In data swapping, values are randomly exchanged between different records [ 99 ]. If numerical values are present in the dataset, they can only be changed within certain limits; otherwise, the meaning of the data changes, and the masked data no longer resembles the original data. For attributes that can be ranked, an attribute value is replaced with a nearby ranked value, as a very large difference between ranks is not suitable [ 100 ]. In data swapping, higher-level attributes are swapped [ 101 ], and individual values are not changed.
- Noise Addition
- In this mechanism, some noise is added to the original dataset to alter the original data. Noise can only be added to continuous (numerical) data, not to categorical data [ 102 ]. The noise is added to all the attributes present in the original dataset, i.e., the sensitive attributes as well as the quasi-attributes. (Noise addition and microaggregation are illustrated in the sketch after this list.)
- Microaggregation
- In this technique, similar records are clustered into different groups, and each group releases an average value for its records [ 103 ]. The more similar the records within a group, the higher the utility of the masked data. We can cluster the data in many ways, e.g., in categorical versions [ 104 ]. Microaggregation is done on quasi-attributes to protect them from reidentification, and protecting the quasi-attributes in turn protects all the other attributes from reidentification. We can also minimize reidentification by data clustering [ 105 ].
- Pseudonymization
- In this method, the original data is replaced with artificial datasets [ 106 ]. Each identifying attribute in the original data is replaced with a pseudonym, making the data less identifiable.
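As a rough illustration of perturbative masking, the sketch below shows noise addition and a simple univariate microaggregation. It assumes numpy and pandas are available; the dataset and column names are hypothetical, and a production scheme would calibrate the noise and keep every group at size k or larger.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Hypothetical microdata: 'age' and 'salary' are numerical attributes.
df = pd.DataFrame({
    "age": rng.integers(20, 65, size=9).astype(float),
    "salary": rng.integers(30_000, 120_000, size=9).astype(float),
})

def add_noise(values: pd.Series, scale: float = 0.1) -> pd.Series:
    # Noise addition: zero-mean Gaussian noise scaled to a fraction
    # of the attribute's standard deviation.
    noise = rng.normal(0.0, scale * values.std(), size=len(values))
    return values + noise

def microaggregate(values: pd.Series, k: int = 3) -> pd.Series:
    # Simplified microaggregation: sort, cut into groups of k, and
    # replace every value in a group with the group mean.
    order = values.sort_values().index
    out = values.copy()
    for start in range(0, len(order), k):
        group = order[start:start + k]
        out.loc[group] = values.loc[group].mean()
    return out

df["age_masked"] = add_noise(df["age"])
df["salary_masked"] = microaggregate(df["salary"], k=3)
print(df)
```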
- (2) Nonperturbative Masking
- Nonperturbative masking does not change or alter the original data values, although it does change the statistical properties of the original data; the masked data is created by reduction or suppression of the original data [ 107 ].
- Bucketization
- In this method, original data is stored in different buckets, and these buckets are protected through encryption [ 108 ]. We can protect the sensitive attributes through bucketization.
- Slicing
- Data slicing is a method in which a larger group of data is divided into smaller slices or segments [ 109 ]. The sensitive attributes and the quasi-attributes are divided into different slices, so the identity of the person cannot be disclosed from any individual slice.
- Sampling
- Sampling is a technique based on the population-and-sample concept: the entire dataset is the population, and the masked data is a sample of it. In this technique, we make different samples of the original data; a smaller sample provides more protection [ 110 ].
- Generalization
- It is a technique in which some additional attributes are added to the record. If a quasi-attribute value is too rare, some dummy attributes that look like quasi-attributes are added to the record, making reidentification more difficult [ 111 ]. By applying generalization to data, we can protect the identity of a person because it hides the relationships between the quasi-attributes. (A small generalization sketch with a k-anonymity check follows this list.)
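To make generalization concrete, here is a minimal sketch (pandas assumed; the table and column names are hypothetical) that coarsens two quasi-identifiers and then checks k-anonymity, i.e., that every released quasi-identifier combination is shared by at least k records.

```python
import pandas as pd

# Hypothetical microdata with quasi-identifiers 'zip' and 'age'.
df = pd.DataFrame({
    "zip": ["35401", "35402", "35409", "54001", "54002", "54003"],
    "age": [23, 27, 29, 41, 45, 48],
    "disease": ["flu", "asthma", "flu", "diabetes", "flu", "asthma"],
})

# Generalization: truncate zip codes and bin ages into decades.
df["zip_gen"] = df["zip"].str[:3] + "**"
df["age_gen"] = (df["age"] // 10 * 10).astype(str) + "s"

def is_k_anonymous(table: pd.DataFrame, quasi: list, k: int) -> bool:
    # Every combination of quasi-identifier values must occur >= k times.
    return bool(table.groupby(quasi).size().min() >= k)

print(df[["zip_gen", "age_gen", "disease"]])
print(is_k_anonymous(df, ["zip_gen", "age_gen"], k=2))  # True for this toy table
```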
The summary of data anonymization techniques is given in Table 7 .
Table 7. Summary of data anonymization techniques.
(2) Data Splitting . Data splitting is a technique in which sensitive data is divided into different fragments [ 112 ] to protect it from unauthorized access. In this technique, we first split the data into different fragments, then these fragments are randomly stored on different clouds. Even if the intruder gains access to a single fragment in any way, still the intruder will not be able to identify the person. For example, if an intruder gets a fragment from the cloud that contains the salary information of an organization, it is useless until he knows which salary belongs to which person. Hence, data splitting is a very useful technique for protecting data stored on the cloud.
A local proxy can outsource data to the cloud without splitting it, or it can split the data first and then outsource the fragments to the same cloud using different accounts of the same CSP. It can also store the fragments on different cloud platforms that are run by different CSPs but provide some of the same services. Data is split before being stored in different locations so that even if some part or piece of the data becomes known to an intruder, no individual can be identified.
Firstly, the local proxy retrieves sensitive data from the user and then calculates the risk factor for disclosure. In this method, the user can define the privacy level, and this privacy level provides information about all the sensitive attributes that can reveal someone's identity. These sensitive attributes are called quasi-attributes or quasi-identifiers. Next, the local proxy decides the number of pieces into which the sensitive data will be split and the number of locations that will be needed to store those pieces. Therefore, no one can reveal a person's identity, and all this information about the data splitting mechanism is stored at the local proxy. However, the system must be able to function properly and respond to the queries on time. After that, the local proxy stores these different data fragments in different cloud databases, and now, they are free from disclosure. The data-splitting mechanism supports almost all the functions of the cloud. Hence, we can use almost all the services provided by CSP using the data-splitting mechanism for storing data in the cloud.
When users want to retrieve the original data, they send a query to the local proxy. The query is processed, and the data storage locations are retrieved from the local database. After that, the query is replicated as many times as the data was split into fragments, and these queries are forwarded to the relevant CSPs. Each CSP then returns a set of results that represents a partial view of the complete result. Finally, the proxy combines the partial results according to the criteria used to split the data and delivers the complete result to the user. Mostly, the fragments are stored on different cloud databases in their original structure, so computation on individual fragments can be performed easily. However, no algorithm currently exists for computations that span separate fragments, because such computations require communication between different CSPs; algorithms for this setting are still needed. Redundancy of the proxy metadata and appropriate backup policies are essential to ensure the robustness of the mechanism. Data splitting is graphically represented in Figure 7 .

Figure 7. Data-splitting flow diagram.
The summary of the data-splitting techniques is given in Table 8 . Different data-splitting techniques are used for the protection of data stored on the cloud. Some of them are given below.
- Byte level splitting
- In this type, all the sensitive data is converted into bytes [ 113 ]. These bytes are randomly shuffled with each other and then recombined into fixed-length fragments, which are stored on different clouds. (A minimal sketch of this idea follows this list.)
- Privacy level splitting
- In this mechanism, the user chooses a privacy level for each file [ 114 ] that is to be stored in a cloud database; the privacy level is attached to the file. Using this privacy level, the user can decide that files with higher privacy levels should be stored on more trusted clouds.
- Byte level splitting with replication
- Byte-level data splitting is combined with data replication to improve both performance and security. The authors of [ 115 ] proposed an algorithm that stores the data fragments on different clouds so that they remain at a certain distance from one another; by doing this, we can avoid confabulation attacks, in which an intruder aggregates the split fragments.
- Byte level splitting with encryption
- Byte-level data splitting with encryption is proposed in [ 116 , 117 ]. In this scheme, every fragment of data is encrypted to enhance the security of the sensitive data: the data is split into bytes, the bytes are randomly shuffled, and they are finally recombined. This type of data splitting is suitable for binary or multimedia files that are not processed through the cloud.
- Another problem is choosing the fragment length at which the data can no longer be reidentified, i.e., the identity of a person cannot be revealed. If the length is too short, the probability of disclosure increases; if it is too long, the fragments are difficult to handle. Hence, the fragments should have a length that still protects the identity of a person.
- There is another type of data splitting in which the data is split by attributes. Attribute-level splitting is performed in two ways: horizontal splitting and vertical splitting. These types of splitting are mostly applied to structured databases, and they provide strong privacy.
- Vertical splitting
- In vertical data splitting [ 118 , 119 ], quasi-identifiers or quasi-attributes are divided in such a way that all the risky attributes end up in different fragments, preventing reidentification. Some of the sensitive fragments require encryption; we can encrypt these fragments by applying encryption algorithms or other privacy methods to increase the security level.
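The sketch below illustrates byte-level splitting (standard library only; the seeded shuffle and the fragment count are illustrative assumptions, not a published algorithm): the proxy shuffles the bytes under a secret seed, cuts them into fragments for different clouds, and reverses both steps to reconstruct the data.

```python
import random
import secrets

def split_bytes(data: bytes, n_fragments: int, seed: int) -> list:
    # Shuffle byte positions with a seeded PRNG so the proxy can undo it.
    perm = list(range(len(data)))
    random.Random(seed).shuffle(perm)
    shuffled = bytes(data[i] for i in perm)
    size = -(-len(shuffled) // n_fragments)  # ceiling division
    return [shuffled[i:i + size] for i in range(0, len(shuffled), size)]

def reassemble(fragments: list, seed: int) -> bytes:
    shuffled = b"".join(fragments)
    perm = list(range(len(shuffled)))
    random.Random(seed).shuffle(perm)
    out = bytearray(len(shuffled))
    # split_bytes placed data[perm[j]] at position j, so invert that here.
    for j, target in enumerate(perm):
        out[target] = shuffled[j]
    return bytes(out)

seed = secrets.randbits(64)  # secret, kept in the local proxy's metadata
record = b"salary records: alice 90k, bob 75k"
fragments = split_bytes(record, n_fragments=3, seed=seed)  # one per cloud
assert reassemble(fragments, seed) == record
```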
Table 8. Summary of the data-splitting techniques.
A solution for sensitive data splitting without performing encryption on fragments is proposed [ 120 ]. This mechanism is suitable for data on which we want to perform some computation, because on encrypted data, we cannot perform computation directly. Another technique has been proposed [ 121 ], which demonstrates the redaction and sanitization of a document that identifies all sensitive attributes and protects the data in most documents.
Schemes that use vertical splitting to protect data are faster than other splitting techniques because the data fragments consist of a single attribute or a few attributes, and no data masking or encryption is involved; hence, computation is easy. There is also a type of encryption in which we do not need to encrypt and decrypt every time we perform a computation: homomorphic encryption. In this case, all data modifications are performed on the encrypted data, the actual data is not changed, and yet the final result is preserved [ 122 ].
(3) Steganography . Steganography is the practice of concealing a message within another message or a physical object. In computing contexts, a video, audio clip, image, message, or computer file is concealed within another image, message, or file. The steganography flow diagram is depicted in Figure 8 . There are two main types of steganography, namely (1) linguistic steganography and (2) technical steganography. These are given as follows:
- (1) Linguistic Steganography
- Semagrams
- Semagrams use images and symbols alone to conceal the data. There are two types of semagrams [ 123 ]. The first is the visual semagram, in which the message is embedded in the appearance of a visible object. The second is the text semagram, in which we change the font, color, or symbols of the text message.
- Open codes
- In this case, we hide the real message from the intruder by embedding the original message in a legitimate carrier [ 124 ]. The open code technique is further divided into two types: one is the jargon code, and the second is covered ciphers.
- (2) Technical Steganography
- Text steganography
- In this type, we change some textual characteristics of text, such as the font, color, or symbols of the text message [ 127 ]. Three coding techniques are used to change these textual features, which are as follows: (1) line-shift coding, (2) word-shift coding, and (3) feature coding.
- Image steganography
- It is the most popular type of steganography. Image steganography refers to the process of hiding sensitive data inside an image file [ 128 ]. The transformed image is expected to look very similar to the original image because the visible features of the stego image remain the same. Image steganography is divided into three parts, namely (1) least significant bit coding, (2) masking and filtering, and (3) transformations.
- Audio steganography
- Audio steganography is a technique used to transmit secret data by modifying a digitized audio signal in an imperceptible manner [ 129 ]. The following types of audio steganography exist: (1) least significant bit coding, (2) phase coding, (3) spread spectrum, and (4) echo hiding.
- Video steganography
- In video steganography, both image and audio steganography are used [ 130 ]. A video consists of many frames. Hence, video steganography hides a large amount of data in carrier images. In this type of steganography, we select the specific frame in which we want to hide the sensitive data.
- Methods
- Frequency Domain
- A frequency-domain steganography technique is used for hiding a large amount of data with no loss of the secret message, good invisibility, and high security [ 131 ]. In the frequency domain, we change the magnitudes of the DCT coefficients of the cover image. There are two types of frequency-domain transformation: (1) discrete cosine transformation and (2) discrete wavelet transformation.
- Spatial Domain
- The spatial domain is based on the physical locations of pixels in an image [ 132 ]. Spatial-domain techniques operate directly on pixel values, which minimizes the changes between the stego image and the cover image. Some methods of the spatial domain are as follows: (1) least significant bit, (2) pixel value differencing, (3) pixel indicator, (4) gray level modification, and (5) quantized indexed modulation. (The least significant bit method is illustrated in the sketch below.)
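As a concrete example of the least significant bit method, the following sketch (numpy assumed; a real system would also encrypt the payload and embed a length header) hides a short message in the LSBs of an 8-bit grayscale cover image and recovers it.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    # Clear each carrier pixel's LSB, then write one message bit into it.
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # toy cover image
secret = b"meet at noon"
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, len(secret)) == secret
# Each pixel changes by at most 1, so the stego image looks unchanged.
```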

Figure 8. Steganography flow diagram.
The summary of the steganographic techniques is given in Table 9 .
Table 9. Summary of the steganographic techniques.
4.1.2. Cryptographic Techniques
Cryptography is the most important and most widely used technique for security purposes. In cryptography, plaintext is converted into ciphertext using a key and an encryption algorithm. Cryptographic techniques are the most secure among all the security techniques; hence, they are widely used for data storage security over the cloud, and modern cryptographic techniques are increasingly practical. We can achieve different objectives by applying cryptographic techniques, for example, data confidentiality and data integrity. Because of the increase in data breaches in the last few years, some cloud service providers are shifting toward cryptographic techniques to achieve more security; the most commonly used technique is AES [ 133 ]. Key management is an important issue in cryptographic techniques because if the key is obtained by an intruder, all the data can be stolen; hence, key protection and key management are critical, and it is mostly the responsibility of the CSP to manage and protect the keys. Cryptographic techniques also protect the user from an untrusted CSP, because sometimes a CSP outsources sensitive data without the user's permission, which is illegal; to avoid this and protect sensitive data from untrusted CSPs, cryptographic techniques are the best option for users. However, there are difficulties: for example, if a user wants to update a small amount of data, the user first needs to decrypt the data and then perform this minor update, which is very costly. In general, cryptographic techniques give us a higher level of security at the cost of performance or speed; the balance depends on whether the user wants higher performance or a higher level of security. In this paper, we focus on the four main functionalities that are required in cloud computing when using cryptographic techniques. Figure 9 shows the flow diagram of encryption.

Figure 9. Encryption flow diagram.
Some of the main functionalities of cryptographic functions are given below.
- Search on encrypted data
- If users want to retrieve their data stored in a cloud database, they generate a query, run it on the local proxy server, and search for the data they want. Searching over encrypted data is a very important part of cryptography because every user who stores sensitive data in a cloud database wants to retrieve it, which is done by querying the data; doing this efficiently over ciphertext is difficult.
- Storage control
- Sometimes the user wants to store data in a desired location or trusted database. Hence, the user must have full control over the storage of data.
- Access control
- It is a very important control and is referred to as data access restriction. Sometimes, the user does not want to share a private file publicly. Hence, access control is an important functionality.
- Computation on data
- Data computation is the main functionality of cloud computing. Sometimes, the user wants to perform some computation on data stored in a cloud database. If the data is encrypted, the naive approach is for the user to first decrypt the entire dataset, perform the computation, and finally re-encrypt it and store it back in the cloud database. This process is very expensive in terms of computation; homomorphic encryption, discussed below, avoids it.
Some of the cryptographic techniques are as follows:
(1) Homomorphic Encryption . Homomorphic encryption is a form of encryption that permits users to perform computations on encrypted data without decrypting it. These resulting computations are left in an encrypted form, which, when decrypted, result in an identical output to that produced had the operations been performed on the unencrypted data. There are some types of homomorphic encryption that are described below.
- Partial Homomorphic Encryption
- In partial homomorphic encryption, only one arithmetic operation, either addition or multiplication, can be performed on ciphertexts. If the resulting ciphertext corresponds to the addition of the plaintexts, the scheme is called additively homomorphic, and if it corresponds to the multiplication of the plaintexts, it is called multiplicatively homomorphic. Two multiplicative homomorphic schemes are given in [ 134 , 135 ], and a well-known additive homomorphic scheme is Paillier [ 136 ]. (A Paillier-based sketch follows this list.)
- Somewhat Homomorphic Encryption
- This technique allows the user to perform both addition and multiplication, but only a limited number of such operations, because a large number of operations produces noise that changes the structure of the original data. A somewhat homomorphic encryption scheme is presented by the authors of [ 137 , 138 ]. In this scheme, the encryption and decryption time increases as the number of multiplication operations increases; to avoid this increase in time, only a limited number of mathematical operations is allowed.
- Fully Homomorphic Encryption
- This technique allows an unlimited number of arithmetic operations, namely addition and multiplication, which are performed in the form of XOR and AND gates [ 139 ]. Fully homomorphic encryption requires a high computation time to encrypt and decrypt data; therefore, it is not yet practical for most real-life applications. A bootstrapping algorithm is used to refresh ciphertexts when a large number of multiplication operations is performed on the data, and it is also involved in decryption. Homomorphic encryption thus represents a trade-off between supported operations and performance: a limited number of arithmetic operations is allowed if someone wants low computation cost, whereas supporting a large number of arithmetic operations costs more. It depends on the needs of the user.
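To illustrate additive (partially) homomorphic encryption, the sketch below uses the third-party python-paillier package (phe), assuming it is installed (pip install phe); the salary figures are made up. The cloud can aggregate the ciphertexts without ever seeing the plaintext values.

```python
from phe import paillier  # third-party package: pip install phe

# Data owner generates a keypair; only the public key goes to the cloud.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

salaries = [45_000, 52_000, 61_000]  # hypothetical sensitive values
encrypted = [public_key.encrypt(s) for s in salaries]

# Cloud side: add ciphertexts and scale by a plaintext constant,
# all without decrypting (Paillier is additively homomorphic).
encrypted_total = encrypted[0] + encrypted[1] + encrypted[2]
encrypted_double = encrypted_total * 2  # ciphertext times plaintext scalar

# Owner side: decrypt the aggregated results with the private key.
assert private_key.decrypt(encrypted_total) == sum(salaries)
assert private_key.decrypt(encrypted_double) == 2 * sum(salaries)
```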
(2) Searchable Encryption . A searchable encryption technique is proposed by the author of the paper [ 140 ]. In this technique, before storing data on a cloud database, encryption is performed, and after that, it is stored on the cloud. The advantage of this technique is that when we search for some data over the cloud database, this technique provides a secure search over the cloud database.
- Searchable Asymmetric Encryption
- Over the past two decades, searchable encryption has received much attention. Much of the work on searchable asymmetric encryption addresses the multiwriter, single-reader case. Searchable asymmetric encryption is also called public-key encryption with keyword search (PEKS) [ 141 ].
- Searchable Symmetric Encryption
- Symmetric-key algorithms use the same key for message encryption and ciphertext decryption. The keys can be identical, or there can be a simple transformation between the two keys. Verifiable searchable symmetric encryption, as a key cloud security technique, allows users to retrieve encrypted data from the cloud with keywords and to verify the accuracy of the returned results. Another scheme is proposed for keyword search over dynamic encrypted cloud data with a symmetric-key-based verification scheme [ 142 ]. (A minimal index-based sketch follows this list.)
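The sketch below conveys the searchable symmetric encryption idea in miniature (it is not a published scheme and, like many practical SSE constructions, it reveals which stored documents match a token): documents are encrypted with Fernet from the cryptography package, and an index maps HMAC-derived keyword tokens to document identifiers so the server can match queries without learning the keywords.

```python
import hmac
import hashlib
from cryptography.fernet import Fernet  # third-party: pip install cryptography

index_key = b"client-secret-keyword-token-key"  # kept by the client
fernet = Fernet(Fernet.generate_key())          # data key, kept by the client

def keyword_token(word: str) -> bytes:
    # Deterministic keyword trapdoor: HMAC-SHA256 under the client's key.
    return hmac.new(index_key, word.lower().encode(), hashlib.sha256).digest()

documents = {"doc1": "cloud storage audit", "doc2": "diabetes record"}

# Client builds the encrypted store and the token -> doc-id index.
encrypted_store = {d: fernet.encrypt(t.encode()) for d, t in documents.items()}
search_index = {}
for doc_id, text in documents.items():
    for word in text.split():
        search_index.setdefault(keyword_token(word), set()).add(doc_id)

def server_search(token: bytes) -> list:
    # Server matches the opaque token and returns ciphertexts only.
    return [encrypted_store[d] for d in search_index.get(token, ())]

hits = server_search(keyword_token("audit"))
print([fernet.decrypt(c).decode() for c in hits])  # ['cloud storage audit']
```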
(3) Encryption . In cryptography, encryption is the process of encoding information. This process converts the original representation of the information, known as plaintext, into an alternative form known as ciphertext. Ideally, only authorized parties can decipher a ciphertext back to plaintext and access the original information.
- Symmetric Key Encryption
- Only one key is used in symmetric encryption to encrypt and decrypt the message. Two parties that communicate through symmetric encryption should exchange the key so that it can be used in the decryption process. This method of encryption differs from asymmetric encryption, where a pair of keys is used to encrypt and decrypt messages. A secure transmission method of network communication data based on symmetric key encryption algorithm is proposed in [ 143 ].
- Public Key Encryption
- The public-key encryption scheme is proposed by the authors of [ 144 ]. In this scheme, the receiver creates a key pair consisting of two keys: a public key, which is known to everyone, and a private key, which is kept secret. The sender encrypts the data using the receiver's public key and then sends the encrypted data to the receiver, who decrypts it using the private key. In this way, two parties can communicate securely. (A hybrid sketch combining symmetric and public-key encryption follows this list.)
- Identity-Based Encryption
- Identity-based encryption is proposed by the authors of [ 145 ]. In this technique, a set of users is registered in a database, and a unique identity is assigned to each registered user by the admin that controls the scheme. A user's identity can be represented by their name or e-mail address. As in public-key encryption, there is a key pair consisting of a public key, which is the identity of the user, and a private key, which is kept secret. Unlike in public-key encryption, users cannot generate their own key material: a central authority generates and manages the users' private keys. Identity-based encryption is improved by the authors of [ 146 ]. The main advantage of identity-based encryption is that anyone can compute the public key of a given identity, while the corresponding private key is issued by the central authority.
- Attribute-Based Encryption
- The authors of the papers [ 147 , 148 ] propose a technique called attribute-based encryption. Similar to identity-based encryption, attribute-based encryption also depends on a central authority, which generates private keys and distributes them to all the registered users. A user can decrypt a message only if the user's attributes satisfy the policy attached to the ciphertext. Attribute-based encryption is useful when the number of registered users is very large. It comes in two flavors: key-policy and ciphertext-policy attribute-based encryption.
- Functional Encryption
- Functional encryption [ 149 , 150 ] subsumes identity-based encryption, attribute-based encryption, and public-key encryption; all the functionalities of these three techniques together make up functional encryption. In this technique, all private keys are generated by the central authority, and each key is associated with a specific function. Functional encryption is a very powerful encryption technique that holds all the functionalities of the three techniques above and is used in many applications.
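To make the symmetric versus public-key distinction concrete, this sketch (using the cryptography package; key sizes and the sample payload are illustrative) shows the common hybrid pattern: the bulk data is encrypted under a fast symmetric AES-GCM key, and only that small key is wrapped with the receiver's RSA public key.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's key pair: the public key is shared, the private key is secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encrypt the bulk data symmetrically, wrap the key asymmetrically.
data = b"hypothetical sensitive payload"
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # AES-GCM needs a unique nonce per encryption
ciphertext = AESGCM(aes_key).encrypt(nonce, data, None)
wrapped_key = public_key.encrypt(aes_key, oaep)

# Receiver: unwrap the AES key with the private key, then decrypt the data.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == data
```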
(4) Signcryption . Signcryption is a public-key primitive that performs the functions of a digital signature and encryption simultaneously. Digital signatures and encryption are two basic cryptographic tools that ensure confidentiality, integrity, and nonrepudiation. In [ 151 ], a new signcryption scheme based on efficiently verifiable credentials is proposed; such a system not only performs signing and encryption together but can also provide an encryption-only or signature-only mode when needed [ 152 ]. The paper [ 152 ] proposes a lightweight certificate-based signcryption scheme (CBSS) with proxy re-encryption for smart devices connected to an IoT network to reduce computation and communication costs; to ensure the security and efficiency of the proposed CBSS scheme, a cryptosystem with 80-bit security parameters is used. Reference [ 153 ] proposes an access control scheme for the IoT environment using a signcryption scheme that matches the required efficiency and robustness. The proposed scheme shows that, besides security services such as resistance to attacks, confidentiality, integrity, and nonrepudiation, the computation and communication costs are low compared to existing schemes. The paper [ 154 ] gives informal and formal security proofs of its proposed scheme; the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool is used for the formal security analysis, which confirms that the proposed CB-PS scheme can potentially be implemented for resource-constrained, low-computing electronic devices in e-prescription systems. The proposed scheme of [ 155 ] introduces a new concept that does not require a reliable channel: the key generation center sends a part of the private key to users over a public channel. The summary of the cryptographic schemes is given in Table 10 .
The summary of the cryptographic techniques.
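As an illustration of what signcryption achieves, the following Python sketch (again using the third-party `cryptography` package; the key names and the out-of-band shared key are our assumptions) composes the two primitives that signcryption merges: the sender signs with Ed25519 and then encrypts the message plus signature with AES-GCM. A real signcryption scheme performs both in one logical step at lower cost; this two-step sign-then-encrypt version only shows the combined security goals of confidentiality and sender authentication.

```python
# Sketch: sign-then-encrypt, the composition signcryption performs in one step.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Sender's long-term signing key and a shared symmetric key (assumed to have
# been established out of band, e.g., via a key exchange).
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()
shared_key = AESGCM.generate_key(bit_length=256)

message = b"transfer 100 credits to account 7"

# Step 1: sign, so the receiver can authenticate the sender.
signature = signing_key.sign(message)

# Step 2: encrypt message and signature together for confidentiality.
nonce = os.urandom(12)
ciphertext = AESGCM(shared_key).encrypt(nonce, signature + message, None)

# Receiver: decrypt, then verify the signature (raises on tampering).
plaintext = AESGCM(shared_key).decrypt(nonce, ciphertext, None)
sig, msg = plaintext[:64], plaintext[64:]   # Ed25519 signatures are 64 bytes
verify_key.verify(sig, msg)
print(msg.decode())
```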
All data storage protection techniques for cloud computing are discussed in Section 3. There are many data protection techniques, however, all of them fall into three main categories, namely (i) data splitting, (ii) data anonymization, and (iii) cryptography. We discuss all these techniques from different points of view, e.g., the overhead on the local proxy, the computation cost, the ability to search over encrypted data, the data accuracy each technique retains, the level of data protection each technique provides, and the functionalities each masking technique supports. By considering these views, we can analyze all the data protection techniques. Cryptography provides high-level security but limited cloud functionalities and a high cost of performing computation on cloud data. Data splitting provides low computation cost but a low level of security; a simple field-level sketch of it is given below. Data anonymization is of two types: perturbative masking and nonperturbative masking. In perturbative masking, data is altered with dummy data, so security is high, however, some functionalities can no longer be performed.
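The following minimal Python sketch illustrates field-level data splitting as described above: a sensitive record is split into fragments stored with different (here simulated) cloud providers, and only the local proxy keeps the mapping needed to reassemble it. The fragment layout and provider names are our own illustrative assumptions, not a scheme from the surveyed papers.

```python
# Sketch: field-level data splitting across multiple cloud providers.
# Each fragment alone reveals only part of the record; the local proxy
# keeps the (low, constant) overhead of remembering where fragments live.

record = {"name": "Alice Khan", "diagnosis": "diabetes", "zip": "35400"}

# Split so that no single provider holds identity and sensitive value together.
fragments = {
    "csp_a": {"name": record["name"]},
    "csp_b": {"diagnosis": record["diagnosis"], "zip": record["zip"]},
}

# Simulated cloud stores: provider name -> stored fragment.
cloud_storage = {}
for provider, fragment in fragments.items():
    cloud_storage[provider] = fragment  # upload to that provider

# The local proxy's only bookkeeping: which providers hold the fragments.
fragment_locations = list(fragments)

# Reassembly by the proxy when the owner reads the record back.
restored = {}
for provider in fragment_locations:
    restored.update(cloud_storage[provider])
assert restored == record
```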
4.2. RQ2: What are the Demographic Characteristics of the Relevant Studies?
We answer this question by considering the four following aspects: (i) publication trend, (ii) publication venues (proceeding and journals), (iii) number of citations, and (iv) author information.
4.2.1. Publication Trend
From 2010 to 2021, we found 52 papers that were published in top-ranked journals and conferences. From 2010 to 2017, work on cloud computing data security grew at a roughly linear rate, however, after 2017, a great deal of work was done in this area: from 2018 to 2021, 37 papers were published, and the trend toward data security in cloud computing increased sharply. Most of the work, including the highest-ranked studies, was published in 2021. Figure 10 shows the trend of all the publications since 2010. Most of the articles are published in journal venues, and the highest number of papers, 6, have been published in the IEEE Access journal.

Number of publications per year.
4.2.2. Publication Venues
There are different types of publication venues, and some of them are book articles, conference proceedings, journals, workshop proceedings, and symposium proceedings. Hence, in our SLR, the number of publications in a different venue is given in Figure 11 . We have a total of 52 papers after applying the inclusion and exclusion criteria in Section 2 .

Publication venues.
Out of 52 papers, 43 are published in journals, 8 in conference proceedings, 1 in workshop proceedings, and none in book chapters or symposium proceedings. The most active journals in cloud data security are listed in Table 11 .
Top 5 most active journals.
The most active journal is IEEE Access, in which 6 papers are published. The Journal of Cryptology is the second most active journal in the field of data storage, security, and privacy in cloud computing, with 3 papers. The third journal, Information Fusion, also has 3 papers. The fourth journal is Information Sciences, with 2 papers, and the fifth is IEEE Transactions on Knowledge and Data Engineering, also with 2 papers. The most active conferences are given in Table 12 .
Top 5 most active conferences.
4.2.3. Number of Citations
The number of citations of a paper is also an indicator of its quality: the more citations a paper receives, the higher its perceived quality, and the fewer the citations, the lower it ranks. Table 13 shows the most influential authors, and Figure 12 shows the number of citations of all the papers that we have used in this SLR. A few papers, namely [ 105 , 118 , 124 , 139 ], have more than 100 citations each, which reflects their very high quality and influence.

Number of citations of the papers.
Top 10 most influential authors in data protection in cloud computing.
4.2.4. Author Information
Some authors are especially active in this area. To identify them, we list the names of the top 10 authors in the field of data protection and privacy in cloud computing, together with their numbers of publications, in Table 13 .
4.3. RQ3: Which Data Protection Technique Provides More Data Protection among all the Techniques?
We answer this question by comparing all the data protection techniques discussed in this SLR with respect to five aspects: (i) local proxy overhead, (ii) data accuracy retained, (iii) level of data protection, (iv) transparency, and (v) operations supported.
4.3.1. Comparison of Data Protection Techniques
In this section, we compare all the data protection techniques that are discussed in this SLR and determine which technique provides the most protection among them. We compare these techniques based on five functionalities, which are (i) local proxy overhead, (ii) data accuracy retained, (iii) level of data protection, (iv) transparency, and (v) operations supported, and thereby answer RQ3. Table 14 provides a comparison of all the data protection techniques discussed in this SLR. We now discuss these five functionalities one by one in more detail.
- Encryption
- The overhead on the local proxy for encryption is very high because the data is stored encrypted. If the user wants to update the data, the user must first decrypt the data, then update it, and then encrypt it again. This cycle requires a lot of time, and all of this work is performed by the local proxy, which is why the overhead on the local proxy is very high for encryption. A minimal sketch of this decrypt-update-reencrypt cycle is given after the comparison table below.
- Data Splitting
- The overhead on a local proxy for data splitting is very low. The local proxy overhead remains constant while splitting data into fragments.
- Anonymization
- The overhead on a local proxy for anonymization is average because most of the anonymization methods require quasilinear computation in the number of records to generate the anonymized data set. Whenever the anonymized data is generated and stored in the cloud database, then there is no overhead on the local proxy.
- Homomorphic Encryption
- The overhead on local proxies for homomorphic encryption is very high because homomorphic encryption involves a large number of mathematical operations. Therefore, there is a lot of overhead on local proxies for homomorphic encryption.
- Steganography
- The overhead on the local proxy for steganography is not too much as the data is concealed inside the cover for secure communication. However, based on the complexity of the operation in the transformed domain technique, the local proxy overhead is more than the spatial domain technique.
- Signcryption
- The overhead on the local proxy for signcryption is higher than for simple encryption, because in signcryption, signing and encryption are performed in a single logical step; this extra signing operation makes the overhead on the local proxy higher than that of simple encryption.
- The data accuracy level for encryption is very high because encryption is lossless: the sensitive data is encrypted by the sender, and the receiver decrypts it with the key to recover exactly the original data. This data cannot be read by anyone who does not have the secret key. Therefore, data accuracy is very high for encryption.
- The data accuracy level for data splitting is average because, in data splitting, the data is present in the form of plaintext fragments, so the CSP can easily access each fragment it stores. Both encryption and data splitting are reversible methods; hence, the original data can be retrieved easily.
- The data accuracy level for data anonymization is very low because anonymization is irreversible: data is replaced with dummy data, and the original values cannot be retrieved. Therefore, anonymization has a very low level of data accuracy.
- The data accuracy level for homomorphic encryption is very high because data is encrypted by applying some algorithms.
- The data accuracy level for steganography is very low as compared to the other cryptographic techniques because the data is embedded inside the cover of another medium, and any change to the cover during transmission changes the concealed data. Therefore, it is hard to ensure a high accuracy level in steganography. The stego image contains the secret data that is transmitted over the communication channel, and the receiver extracts the concealed data from the cover; the accuracy of the recovered data thus depends entirely on the cover arriving unmodified.
- The data accuracy level for signcryption is also very high, because in signcryption, confidentiality and authentication are achieved. Therefore, we can also verify the identity of the sender.
- The level of data protection is very high for encryption techniques, because in encryption, data is changed into ciphertext, which cannot be understood. Identification of the data is therefore practically impossible without decryption using the secret key: encryption is easy to execute in the forward direction but computationally infeasible to invert without the key.
- The level of data protection for data splitting is lower than that of cryptographic techniques because the data is split into different fragments, and these fragments contain data in its original form. Hence, if an intruder hacks or steals these fragments, the data inside them can be easily read. Therefore, the data protection level is not as high as that of encryption-based methods.
- The level of data protection for data anonymization is lower than that of cryptographic techniques, because anonymization techniques depend on protecting the quasi-identifiers; if the quasi-identifiers are not protected strongly, there is a chance of reidentification of a person's sensitive data.
- The level of data protection is very high for homomorphic encryption techniques because the data is changed into ciphertext, which cannot be understood.
- The data protection level for steganography is medium because the data is embedded inside the cover of another medium. The stego image contains the secret data that is transmitted over the communication channel, and the data concealed by the sender is extracted from the cover by the receiver. Therefore, the concealment of data results in secure data transmission.
- The data protection level for signcryption is also very high, because in signcryption, both confidentiality and authentication are achieved. Therefore, we can also verify the identity of the sender.
- There is no transparency for encrypted data, because encryption requires key management: the local proxy needs to keep records of all the keys and manage them.
- There is no transparency for the data-splitting mechanism, because data is split into different fragments that the local proxy stores in different locations; hence, the proxy must keep a record of the locations of all the fragments.
- Anonymization is fully transparent, because in anonymization, there is no need to keep the record of data storage by the local proxy. In anonymization, data is statistically similar to the original data. Hence, CSP also performs computation and some analysis on the anonymized data.
- There is no transparency for the homomorphically encrypted data, because in encryption, there is a need for key management. Hence, the local proxy needs to keep the records of all the keys.
- In steganography, as compared to other data protection techniques, the main aim is to transmit data without letting the attacker know about the data transmission as it is concealed inside the cover of another medium. The data transmission in steganography is fully transparent. No key management is required, and there is no need to keep track of data storage.
- There is no transparency for the signcrypted data, because in signcryption, there is a need for key management. Hence, the local proxy needs to keep the records of all the keys and also manage all these keys.
- Only the data storage operation is supported on the encrypted data, because if the user wants to update some encrypted data that are stored on a cloud database, firstly, the user needs to decrypt this data, and then the user performs an update on this data. We cannot perform any modification operation on encrypted data.
- All the operations could be performed on split data, because in data splitting, the data is present in its original form. Hence, we can perform data storage, search, data update, and also data computation.
- In anonymization, there are two types of masking: perturbative masking and nonperturbative masking. If the data is nonperturbatively masked, then we can perform data storage and search on it; otherwise, we can only perform data storage.
- In the case of homomorphic encryption, data storage and computation operations are supported, because computations can be performed directly on the homomorphically encrypted data without decrypting it first; the results, once decrypted, match the results of the same computations on the plaintext.
- A stego image only supports the data storage operation, because if the user wants to update the data hidden in a stego image, the user must first extract that data from the stego image before performing any modification on it.
- Only the data storage operation is supported on signcrypted data, because if the user wants to update signcrypted data stored in the cloud database, the user must first unsigncrypt this data and only then perform any update on it.
Comparison of data protection techniques.
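As a concrete illustration of the update cycle referenced in the encryption item above, the following Python sketch shows why the local proxy bears the full decrypt-update-reencrypt cost for every modification of encrypted cloud data. AES-GCM (from the third-party `cryptography` package) stands in for whatever cipher a deployment actually uses, and the stored blob is simulated by a local variable.

```python
# Sketch: the decrypt-update-reencrypt cycle a local proxy performs for
# every modification of encrypted data held in cloud storage.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # managed by the local proxy

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(12)                 # fresh nonce per encryption
    return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

# Initial upload: the ciphertext is what the cloud actually stores.
cloud_blob = encrypt(b"balance=100")

# To update, the proxy must download and decrypt the whole object...
nonce, ciphertext = cloud_blob
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)

# ...apply the change in plaintext locally...
plaintext = plaintext.replace(b"100", b"250")

# ...and re-encrypt before uploading again. The cloud never computes on it.
cloud_blob = encrypt(plaintext)
print("update complete; all cryptographic work done by the proxy")
```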
5. Conclusion and Future Work
5.1. RQ4: What Are the Primary Findings, Research Challenges, and Directions for Future Work in the Field of Data Privacy in Cloud Computing?
5.1.1. Conclusion and Research Challenges
In this SLR, we have systematically presented all the data privacy techniques related to data storage in cloud computing, and we have also presented a comparison among all the protection techniques concerning five functionalities, which are (i) local proxy overhead, (ii) data accuracy retained, (iii) level of data protection, (iv) transparency, and (v) operations supported. We found research gaps in all of these techniques: data splitting, anonymization, steganography, encryption, homomorphic encryption, and signcryption.
- There is a very strong need to develop ad hoc protocols for the communication of data-splitting fragments that are stored on different CSPs, and also for the communication between the CSPs themselves. Noncryptographic techniques are faster but do not provide enough security; hence, security can be improved by developing such methods for data-splitting techniques.
- Anonymization techniques work very effectively on a small amount of data but not on big data; hence, there is a research gap in developing anonymization techniques that achieve more efficient performance. Moreover, schemes that provide stronger protection for the quasi-identifiers need to be developed, as current anonymization techniques are still very immature.
- One of the limitations of steganography is that it can only defend against a third party who does not know steganography; if the third party knows steganography, it can extract the data in the same way that the recipient extracts it. Therefore, encryption is typically used together with steganography, and there is a need to develop steganography techniques that can protect sensitive data from third parties on their own.
- There is a need to develop cryptographic techniques that take less time than the existing ones to perform search and computation operations on encrypted data. Cryptographic techniques provide high security but low computational utility; therefore, it is a research gap to develop techniques that provide both high security and more efficiency.
- The complexity of homomorphic encryption and decryption is far greater than that of normal encryption and decryption, which makes it inapplicable to many scenarios, such as healthcare and other time-sensitive applications. Therefore, there is an urgent need to develop homomorphic encryption schemes with low complexity and computation cost. A small sketch of computing on encrypted data with an additively homomorphic scheme is given after this list.
- Signcryption is used to verify and authenticate users. We can obtain confidentiality and authentication using signcryption, however, the main limitation of signcryption is that the calculation costs of the encryption algorithms used in it are very high. Therefore, there is a need to develop signcryption schemes that use encryption algorithms with low computation cost.
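To make the overhead trade-off above concrete, the following sketch uses the additively homomorphic Paillier cryptosystem via the third-party `phe` (python-paillier) library to add values without ever decrypting them; the library choice and the toy values are our own illustrative assumptions. Even such a simple addition is far slower than plaintext arithmetic, which is exactly the cost that the research gaps above target.

```python
# Sketch: computing on encrypted data with the additively homomorphic
# Paillier cryptosystem (third-party `phe` library, pip install phe).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# The client encrypts its sensitive values before uploading them.
enc_a = public_key.encrypt(120)
enc_b = public_key.encrypt(35)

# The cloud adds the ciphertexts without ever seeing 120 or 35.
enc_sum = enc_a + enc_b            # homomorphic addition
enc_scaled = enc_a * 3             # multiplication by a plaintext constant

# Only the key holder can decrypt the results.
assert private_key.decrypt(enc_sum) == 155
assert private_key.decrypt(enc_scaled) == 360
```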
Acknowledgments
This research was financially supported by The Analytical Center for the Government of the Russian Federation (Agreement nos. 70-2021-00143 dd. 01.11.2021, IGK 000000D730321P5Q0002).
Data Availability
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
Journal of Cloud Computing
Advances, Systems and Applications

Special Issues - Guidelines for Guest Editors
For more information for Guest Editors, please see our Guidelines
Special Issues - Call for Papers
We welcome submissions for the upcoming special issues of the Journal of Cloud Computing
Computational Intelligence Techniques for Pattern Recognition in Multimedia Data Guest Editors: Mughair Aslam Bhatti, Sibghat Ullah Bazai, Konstantinos E. Psannis Submission deadline: 3 February 2024
Advanced Blockchain and Federated Learning Techniques Towards Secure Cloud Computing Guest Editors: Yuan Liu, Jie Zhang, Athirai A. Irissappane, Zhu Sun Submission deadline: 3 February 2024
Mobile Edge Computing Meets AI Guest Editors: Lianyong Qi, Maqbool Khan, Qiang He, Shui Yu, Wajid Rafique Submission deadline: 3 February 2024
Edge/Cloud-based Secure, trustable, and privacy-conscious digital twins Guest Editors: Aftab Ali, Farrukh Aslam Khan, Mohand Tahar Kechadi, Liming Chen Submission deadline: 31 January 2024 Blockchain-enabled Decentralized Cloud/Edge Computing Guest Editors: Qingqi Pei, Kaoru Ota, Martin Gilje Jaatun, Jie Feng, Shen Su Submission deadline: 31 st March 2023
- Most accessed
Design and analysis of wireless data center network topology HCDCN based on VLC
Authors: Qingfang Zhang, Xiaoyu Du, Jie Li and Zhijie Han
Volatile Kernel Rootkit hidden process detection in cloud computing
Authors: Suresh Kumar S and Sudalai Muthu T
CIA-CVD: cloud based image analysis for COVID-19 vaccination distribution
Authors: Vivek Kumar Prasad, Debabrata Dansana, S Gopal Krishna Patro, Ayodeji Olalekan Salau, Divyang Yadav and Madhuri Bhavsar
Algorithm selection using edge ML and case-based reasoning
Authors: Rahman Ali, Muhammad Sadiq Hassan Zada, Asad Masood Khatak and Jamil Hussain
Exploring cross-cultural and gender differences in facial expressions: a skin tone analysis using RGB Values
Authors: Sajid Ali, Muhammad Sharoze Khan, Asad Khan, Muhammad Abdullah Sarwar, MS Syam, Muhammad Aamir, Yazeed Yasin Ghadi, Hend Khalid Alkahtani and Samih M. Mostafa
Most recent articles RSS
View all articles
A quantitative analysis of current security concerns and solutions for cloud computing
Authors: Nelson Gonzalez, Charles Miers, Fernando Redígolo, Marcos Simplício, Tereza Carvalho, Mats Näslund and Makan Pourzandi
Critical analysis of vendor lock-in and its impact on cloud computing migration: a business perspective
Authors: Justice Opara-Martins, Reza Sahandi and Feng Tian
Intrusion detection systems for IoT-based smart environments: a survey
Authors: Mohamed Faisal Elrawy, Ali Ismail Awad and Hesham F. A. Hamed
Load balancing in cloud computing – A hierarchical taxonomical classification
Authors: Shahbaz Afzal and G. Kavitha
Future of industry 5.0 in society: human-centric solutions, challenges and prospective research areas
Authors: Amr Adel
Most accessed articles RSS
Aims and scope
The Journal of Cloud Computing: Advances, Systems and Applications (JoCCASA) will publish research articles on all aspects of Cloud Computing. Principally, articles will address topics that are core to Cloud Computing, focusing on the Cloud applications, the Cloud systems, and the advances that will lead to the Clouds of the future. Comprehensive review and survey articles that offer up new insights, and lay the foundations for further exploratory and experimental work, are also relevant.
Published articles will impart advanced theoretical grounding and practical application of Clouds and related systems as are offered up by the numerous possible combinations of internet-based software, development stacks and database availability, and virtualized hardware for storing, processing, analysing and visualizing data. Where relevant, Clouds should be scrutinized alongside other paradigms such Peer to Peer (P2P) computing, Cluster computing, Grid computing, and so on. Thorough examination of Clouds with respect to issues of management, governance, trust and privacy, and interoperability, are also in scope. The Journal of Cloud Computing is indexed by the Science Citation Index Expanded/SCIE. SCI has subsequently merged into SCIE.
Cloud Computing is now a topic of significant impact and, while it may represent an evolution in technology terms, it is revolutionising the ways in which both academia and industry are thinking and acting. The Journal of Cloud Computing, Advances, Systems and Applications (JoCCASA) has been launched to offer a high quality journal geared entirely towards the research that will offer up future generations of Clouds. The journal publishes research that addresses the entire Cloud stack, and as relates Clouds to wider paradigms and topics.
Chunming Rong, Editor-in-Chief University of Stavanger, Norway
- Editorial Board
- Sign up for article alerts and news from this journal
Annual Journal Metrics
2022 Citation Impact 4.0 - 2-year Impact Factor 4.4 - 5-year Impact Factor 1.711 - SNIP (Source Normalized Impact per Paper) 0.976 - SJR (SCImago Journal Rank)
2022 Speed 12 days submission to first editorial decision for all manuscripts (Median) 86 days submission to accept (Median)
2022 Usage 458,186 downloads 124 Altmetric mentions
- More about our metrics
- ISSN: 2192-113X (electronic)
Benefit from our free funding service
We offer a free open access support service to make it easier for you to discover and apply for article-processing charge (APC) funding.
Learn more here
Green Cloud Computing-To Build A Sustainable Tomorrow
Ieee account.
- Change Username/Password
- Update Address
Purchase Details
- Payment Options
- Order History
- View Purchased Documents
Profile Information
- Communications Preferences
- Profession and Education
- Technical Interests
- US & Canada: +1 800 678 4333
- Worldwide: +1 732 981 0060
- Contact & Support
- About IEEE Xplore
- Accessibility
- Terms of Use
- Nondiscrimination Policy
- Privacy & Opting Out of Cookies
A not-for-profit organization, IEEE is the world's largest technical professional organization dedicated to advancing technology for the benefit of humanity. © Copyright 2023 IEEE - All rights reserved. Use of this web site signifies your agreement to the terms and conditions.

- Conference proceedings
- © 2022
Cloud Computing – CLOUD 2021
14th International Conference, Held as Part of the Services Conference Federation, SCF 2021, Virtual Event, December 10–14, 2021, Proceedings
- Kejiang Ye 0 ,
- Liang-Jie Zhang ORCID: https://orcid.org/0000-0002-6219-0853 1
Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
You can also search for this editor in PubMed Google Scholar
Kingdee International Software Group Co., Ltd., Shenzhen, China
Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12989)
Part of the book sub series: Information Systems and Applications, incl. Internet/Web, and HCI (LNISA)
Conference series link(s): CLOUD: International Conference on Cloud Computing
4453 Accesses
1 Citations
Conference proceedings info: CLOUD 2021.
- Table of contents
- Other volumes
About this book
Editors and affiliations, bibliographic information, buying options.
- Available as EPUB and PDF
- Read on any device
- Instant download
- Own it forever
- Compact, lightweight edition
- Dispatched in 3 to 5 business days
- Free shipping worldwide - see info
Tax calculation will be finalised at checkout
Other ways to access
This is a preview of subscription content, access via your institution .
Table of contents (7 papers)
Front matter, a brokering model for the cloud market.
- Georgios Chatzithanasis, Evangelia Filiopoulou, Christos Michalakelis, Mara Nikolaidou
An Experimental Analysis of Function Performance with Resource Allocation on Serverless Platform
- Yonghe Zhang, Kejiang Ye, Cheng-Zhong Xu
Electronic Card Localization Algorithm Based on Visible Light Screen Communication
- Kao Wen, Junjian Huang, Chan Zhou, Kejiang Ye
BBServerless: A Bursty Traffic Benchmark for Serverless
- Yanying Lin, Kejiang Ye, Yongkang Li, Peng Lin, Yingfei Tang, Chengzhong Xu
Performance Evaluation of Various RISC Processor Systems: A Case Study on ARM, MIPS and RISC-V
- Yu Liu, Kejiang Ye, Cheng-Zhong Xu
Comparative Analysis of Cloud Storage Options for Diverse Application Requirements
- Antara Debnath Antu, Anup Kumar, Robert Kelley, Bin Xie
COS2: Detecting Large-Scale COVID-19 Misinformation in Social Networks
- Hailu Xu, Macro Curci, Sophanna Ek, Pinchao Liu, Zhengxiong Li, Shuai Xu
Back Matter
Other volumes.
The 6 full papers and 1 short paper presented were carefully reviewed and selected from 25 submissions. They deal with the latest fundamental advances in the state of the art and practice of cloud computing, identify emerging research topics, and define the future of cloud computing.
- cloud computing
- Cloud Computing
- communication systems
- computer networks
- Distributed Architecture
- distributed computer systems
- High Availability
- Network performance analysis
- Network performance modeling
- network protocols
- parallel processing systems
- Reliability
- signal processing
- software architecture
- software design
- software engineering
- telecommunication networks
Liang-Jie Zhang
Book Title : Cloud Computing – CLOUD 2021
Book Subtitle : 14th International Conference, Held as Part of the Services Conference Federation, SCF 2021, Virtual Event, December 10–14, 2021, Proceedings
Editors : Kejiang Ye, Liang-Jie Zhang
Series Title : Lecture Notes in Computer Science
DOI : https://doi.org/10.1007/978-3-030-96326-2
Publisher : Springer Cham
eBook Packages : Computer Science , Computer Science (R0)
Copyright Information : Springer Nature Switzerland AG 2022
Softcover ISBN : 978-3-030-96325-5 Published: 26 February 2022
eBook ISBN : 978-3-030-96326-2 Published: 25 February 2022
Series ISSN : 0302-9743
Series E-ISSN : 1611-3349
Edition Number : 1
Number of Pages : XIII, 105
Number of Illustrations : 13 b/w illustrations, 35 illustrations in colour
Topics : Computer Communication Networks , Security , Database Management , Information Systems Applications (incl. Internet) , Software Engineering/Programming and Operating Systems
- Find a journal
- Publish with us
green cloud computing Recently Published Documents
Total documents.
- Latest Documents
- Most Cited Documents
- Contributed Authors
- Related Sources
- Related Keywords
Unconstrained Power Management Algorithm for Green Cloud Computing
Green cloud computing-oriented algorithmic approach-based virtual machine consolidation, green-cloud computing (g-cc) data center and its architecture toward efficient usage of energy, a new metaheuristic‐based method for solving the virtual machines migration problem in the green cloud computing, mapping and consolidation of vms using locust-inspired algorithms for green cloud computing, enabling rank-based distribution of microservices among containers for green cloud computing environment, an optimized vm placement approach to reduce energy consumption in green cloud computing, investigation study on secured data communication with blockchain and iot in green cloud computing, energy aware load balancing algorithm for upgraded effectiveness in green cloud computing, study on green cloud computing—a review, export citation format, share document.
All cloud computing research papers
Free ieee paper and projects, ieee projects 2022, seminar reports, free ieee projects ieee papers.

An official website of the United States government
Here’s how you know
The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.
The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.
Take action
- Report an antitrust violation
- File adjudicative documents
- Find banned debt collectors
- View competition guidance
- Competition Matters Blog
Contract Terms That Impede Competition Investigations
View all Competition Matters Blog posts
We work to advance government policies that protect consumers and promote competition.
View Policy
Search or browse the Legal Library
Find legal resources and guidance to understand your business responsibilities and comply with the law.
Browse legal resources
- Find policy statements
- Submit a public comment

Vision and Priorities
Memo from Chair Lina M. Khan to commission staff and commissioners regarding the vision and priorities for the FTC.
Technology Blog
Cloud computing rfi: what we heard and learned.
View all Technology Blog posts
Advice and Guidance
Learn more about your rights as a consumer and how to spot and avoid scams. Find the resources you need to understand how consumer protection law impacts your business.
- Report fraud
- Report identity theft
- Register for Do Not Call
- Sign up for consumer alerts
Get Business Blog updates
- Get your free credit report
- Find refund cases
- Order bulk publications
- Consumer Advice
- Shopping and Donating
- Credit, Loans, and Debt
- Jobs and Making Money
- Unwanted Calls, Emails, and Texts
- Identity Theft and Online Security
- Business Guidance
- Advertising and Marketing
- Credit and Finance
- Privacy and Security
- By Industry
- For Small Businesses
- Browse Business Guidance Resources
- Business Blog
Servicemembers: Your tool for financial readiness
Visit militaryconsumer.gov
Get consumer protection basics, plain and simple
Visit consumer.gov
Learn how the FTC protects free enterprise and consumers
Visit Competition Counts
Looking for competition guidance?
- Competition Guidance
News and Events
Latest news, ftc approves modifications to horseracing integrity and safety authority’s anti-doping and medication control rule.
View News and Events
Upcoming Event
Chair khan discussion at 2023 new york times dealbook summit.
View more Events
Sign up for the latest news

Follow us on social media
--> --> --> --> -->

Open Commission Meetings
Track enforcement and policy developments from the Commission’s open meetings.
Latest Data Visualization

FTC Refunds to Consumers
Explore refund statistics including where refunds were sent and the dollar amounts refunded with this visualization.
About the FTC
Our mission is protecting consumers and competition by preventing anticompetitive, deceptive, and unfair business practices through law enforcement, advocacy, and education without unduly burdening legitimate business activity.
Learn more about the FTC

Meet the Chair
Lina M. Khan was sworn in as Chair of the Federal Trade Commission on June 15, 2021.
Chair Lina M. Khan
Looking for legal documents or records? Search the Legal Library instead.
- Cases and Proceedings
- Premerger Notification Program
- Merger Review
- Anticompetitive Practices
- Competition and Consumer Protection Guidance Documents
- Warning Letters
- Consumer Sentinel Network
- Criminal Liaison Unit
- FTC Refund Programs
- Notices of Penalty Offenses
- Advocacy and Research
- Advisory Opinions
- Cooperation Agreements
- Federal Register Notices
- Public Comments
- Policy Statements
- International
- Military Consumer
- Consumer.gov
- Bulk Publications
- Data and Visualizations
- Stay Connected
- Commissioners and Staff
- Bureaus and Offices
- Budget and Strategy
- Office of Inspector General
- Careers at the FTC

Cloud computing has emerged and grown significantly over the recent decades – from its infancy in 2004, to a $576 billion industry in 2023. [1] FTC’s Office of Technology, Bureau of Competition, and Bureau of Consumer Protection worked together to examine four specific areas of cloud computing through a Request for Information (RFI) and a public panel of cloud computing experts. These areas included competition, single points of failure, security, and AI.
The FTC received 102 public comments from a range of stakeholder groups, including industry participants, academia, and civil society groups. All comments are posted on the RFI’s public docket . Below we highlight key themes that emerged in both the RFI responses and panel – and pose a few forward-looking questions.
Competition
The majority of comments addressed issues related to competition, with the following themes most frequently mentioned.
Software Licensing Practices. Many commenters pointed to certain software licensing practices as a source of concern. For instance, some submissions argued that certain cloud providers have practices related to licensing software that limit the ability to use certain software in other cloud infrastructure providers’ environments, particularly when a company is moving existing on-premises workloads to the cloud. [2]
Egress Fees. A number of submissions raised concerns about the fees paid by cloud customers to transfer their data out of (and within) certain cloud environments. According to some commenters, these egress fees could have the effect of discouraging customers from using multiple cloud providers or switching from one provider to another. [3]
Minimum Spend Contracts. Some RFI submissions argued that certain provisions in cloud computing contracts incentivize customers to consolidate their use of cloud services to just one cloud provider. For instance, minimum spend contracts typically involve a cloud provider discounting its services in return for an agreed upon committed spend. Some commenters argued that these types of provisions act as a lock-in mechanism, and that customers are pushed to use just one cloud provider for all the services the customer needs - even if other providers offer certain superior services. [4]
Single Points of Failure
A number of submissions raised concerns about the widespread reliance on a small number of cloud providers – arguing that outages, or other issues that degrade the service of a cloud provider, could have a cascading impact on the economy or specific sectors. [5] Such degradations could be the result of an issue inadvertently introduced by a cloud provider, or the result of a targeted attack. [6] Another submission addressed how the resiliency of cloud systems depends on the implementation details of a cloud provider’s offerings, and a cloud customer’s use of those offerings. [7]
Panelists and RFI submissions noted that cloud services can provide a higher baseline level of security than on-premises options, and that cloud services offer a range of sophisticated security options and tooling. [8] This is especially important for small businesses and startups that can access more robust security options by using cloud services. [9] However, a number of commenters argued there is a great deal of room for improvement in cloud security; [10] that default security configurations could be better; [11] and that the “shared responsibility” model for cloud security often lacks clarity, which can lead to situations where neither the cloud provider nor the cloud customer implements necessary safeguards. [12]
Generative AI and Cloud
The relationship between generative AI and cloud computing was another area of focus among RFI submissions and participants in the cloud computing panel. Generative AI products are heavily reliant on cloud providers. [13] Some argue that cloud credits as a form of investment — in which investors provide money that can only be spent on their cloud services — could lead to vendor lock-in. [14] The FTC is paying close attention to the development of generative AI markets, and recently wrote on the topic of generative AI and potential competition concerns.
Looking forward
The Cloud RFI and panel shined a light on the business practices of cloud computing providers. Looking ahead, here are areas of on-going interest and inquiry.
Are there signs that cloud markets are functioning less than fully competitively, and that certain business practices are inhibiting competition?
Are cloud providers incentivized enough by competition to create systems that are sufficiently secure?
Will competition alone create resilient systems, or is government intervention needed to avoid single points of failure? What policy options are available to improve resiliency and avoid single points of failure?
How will cloud providers respond to a limited supply of specialized AI chips? How will markets for these chips develop given their importance to rapidly developing AI markets, and the growing demand for specialized AI chips?
Thank you to staff from across the Office of Technology, the Bureau of Competition, and the Bureau of Consumer Protection who collaborated on this effort (in alphabetical order): Krisha Cerilli, Mark Eichorn, Patricia Galvan, Alex Gaynor, Hillary Greene, Elisa Jillson, Nick Jones, Kevin Moriarty, Stephanie T. Nguyen, Dan Principato, and Kelly Signs.
More from the Technology Blog
Preventing the harms of ai-enabled voice cloning, consumers are voicing concerns about ai, generative ai raises competition concerns, an inquiry into cloud computing business practices: the federal trade commission is seeking public comments.
Academia.edu no longer supports Internet Explorer.
To browse Academia.edu and the wider internet faster and more securely, please take a few seconds to upgrade your browser .
Enter the email address you signed up with and we'll email you a reset link.
- We're Hiring!
- Help Center

Cloud computing research paper
by kavitha chinnadurai
Free Related PDFs
Venkataravana Nayak
Cloud security is one of most important issues that has attracted a lot of research and development effort in past few years. Particularly, attackers can explore vulnerabilities of a cloud system and compromise virtual machines to deploy further largescale Distributed Denial-of-Service (DDoS).DDoS attacks usually involve early stage actions such as multi-step exploitation, low frequency vulnerability scanning, and compromising identified vulnerable virtual machines as zombies, and finally DDoS attacks through the compromised zombies. Within the cloud system, especially the Infrastructure-as-a-Service (IaaS) clouds, the detection of zombie exploration attacks is extremely difficult. This is because cloud users may install vulnerable applications on their virtual machines. To prevent vulnerable virtual machines from being compromised in the cloud, we propose a multi-phase distributed vulnerability detection, measurement, and countermeasure selection mechanism called NICE, which is bui...

IJERT Journal
2014, International Journal of Engineering Research and Technology (IJERT)
https://www.ijert.org/zombie-avoidance-using-attack-analyzer-in-cloud-environment https://www.ijert.org/research/zombie-avoidance-using-attack-analyzer-in-cloud-environment-IJERTV3IS041961.pdf Cloud Computing refers to On-demand network access to a shared pool of manageable computing resources which are applicable to both software and hardware services. Cloud Computing makes use of available resources on "on-need" basis. Security problem plays the most Significant role in cloud computing. Due to this nature, attackers launch Distributed Denial of Service to make resources unavailable to potential users. Usually, Denial of Service easily attacks the virtual machines as Zombies but it is extremely hard to detect Zombies. In this paper, we have proposed a viable approach to prevent the vulnerable virtual machine from Zombies through multi-phase distributed susceptibility detection, measurement and countermeasure selection mechanism called NICE. This experimental model shows a more secure and reliable network access to reconfigure the virtual network.

https://www.ijert.org/an-efficient-mechanism-for-intrusion-detection-and-prevention-of-zombie-attacks-in-cloud https://www.ijert.org/research/an-efficient-mechanism-for-intrusion-detection-and-prevention-of-zombie-attacks-in-cloud-IJERTV3IS042095.pdf Cloud Computing is an emerging technology in recent trends as it provides several services to user. One of the main objectives of cloud is to provide storage capability of the users. Hence security is the main concern with respect to cloud computing as the users are given access to install various applications and store data in the cloud. Among the various attacks on the cloud, Zombie attack and in particular Distributed Denial of Service (DDoS) is considered loosely in the literature survey. DDoS involves the attacker launching an attack at early stage with multistep exploitation, low-vulnerability scanning and compromising the identified vulnerable virtual machine as zombies and later launching a DDoS through the zombie machines. Detecting such an attack is difficult as the cloud user may install vulnerable application in the virtual machine created for the user's purpose. This survey paper aims at compiling the various mechanism developed so far to provide solutions for the various attacks and we conclude by providing an efficient intrusion detection system to identify zombie machine in particular and avoid the attacker from launching an DDoS attack.

IERJ Journal
This paper focuses on DDoS problem and trying to give solution using auto correlation and alert generation methods. Cloud trace back model has efficient and it's dealing with DDoS attacks using back propagation neural network method and finds that the model is useful in tackling Distributed Denial of Service attacks. Distributed denial of service attacks has become more sophisticated as to exploit application-layer vulnerabilities. NICE (Network Intrusion Detection and Countermeasure Selection) is used to propose multiphase distributed vulnerability detection for attack measurement, and the countermeasure selection mechanism which is built on attack graph-based analytical models and reconfigurable virtual network-based countermeasures. The systems and security evaluations demonstrate the efficiency and effectiveness of the solution.

Editor IJCS

GJESR Journal
Cloud computing is getting popular now a days. Use of cloud is increase daily. As in the cloud environment resources such as OS virtual machines, software are shared by billions of users of the cloud. The virtual machines resides on the cloud are more vulnerable to the denial of service attack. If this machines are connected to more achiness then it becomes more dangerous as it harms all cloud network. In the cloud especially infrastructure as a service the detection of denial of service attack is more challenging task. This is due to cloud users can install vulnerable software on the virtual machines. In this paper we have proposed multiphase vulnerability detection in the cloud environment. We have proposed an open flow network program NICE to detect and mitigate the attacks on the virtual machines. NICE is built on the scenario attack graph based model. We have proposed a novel approach to mitigate the attacks in the cloud environment by selecting different countermeasure depending upon the percentages of vulnerability of the virtual machines. Now a days IDS is used to detect the attack in the network by many organizations. In the proposed system we focuses on the distributed denial of service attack in the cloud. Keywords: Cloud computing, Ccenario attack graph, Correlation, Network analyzer, Intrusion, zombies.

Intrusion Detection and Prevention Systems (IDPS) are used: to identify possible attacks, collecting information about them and the trying to stop their occurrence and at last reporting them to the system administrator. These systems are used by some organizations to detect the weaknesses in their security policies, documenting existing attacks and threats and preventing an individual from violating security policies. Because of their advantages these systems became an important part of the security infrastructure in nearly every organization. In a Cloud computing environment, attackers can determine the vulnerabilities in the cloud systems and compromise the virtual machines to set out large scale Distributed Denial-of-Service (DDOS) attack. To avert these virtual machines from concession, we propose a multi-phase solution NICE (Network Intrusion Detection and Countermeasure selection in Virtual Network Systems).

Hosam El-Sofany
2020, International Journal of Intelligent Engineering and Systems

Anh Trần Việt
—Innovation is necessary to ride the inevitable tide of change. The buzzword of 2009 seems to be "cloud computing" which is a futuristic platform to provides dynamic resource pools, virtualization, and high availability and enables the sharing, selection and aggregation of geographically distributed heterogeneous resources for solving large-scale problems in science and engineering. But with this ever developing cloud concept, problems are arising from this ―golden solution‖ in the enterprise arena. Preventing intruders from attacking the cloud infrastructure is the only realistic thing the staff, management and planners can foresee. Regardless of company size or volume and magnitude of the cloud, this paper explains how maneuver IT virtualization strategy could be used in responding to a denial of service attack. After picking up a grossly abnormal spike in inbound traffic, targeted applications could be immediately transferred to virtual machines hosted in another datacenter. We're not reinventing the wheel. We have lots of technology and standardized solutions we can already use to engineer into the stack. We are just introducing them in the way least expected.

IRJET Journal
2020, IRJET
Clouding Computing is very popular nowadays. It is the on demand availability of computer system resources like data storage and computing program, information or multimedia content. All these data or information is available to different user with the help of internet. The cloud computing architecture in which third party users, virtual machine and cloud service provider are involved for data uploading and downloading with the help of internet. Security of this architecture is the main issue, today's organizations, but there exist many security vulnerabilities. Among all type of security attacks zombie attack is the most dangerous type of attack. This attack decreases the network performance in terms of delay, information and bandwidth consumption. In the zombie attack, some unauthorized user may join the network which spoof data of the authorized user and zombie nodes start communicate with virtual machine on the behalf of authorized user. In this proposed work, the technique based on the strong authentication which has been detecting malicious and unauthorized user from the network on cloud and isolates them from the cloud architecture. There are various techniques, methods and algorithms described in this paper to isolate a zombie attack and other security vulnerabilities at cloud architecture. We discuss the methods for detection and stopping this kind of zombie attack from cloud.

FREE RELATED PAPERS
Jamal Bentahar
2017, IEEE Transactions on Services Computing

SALIFU ABDUL-MUMIN
Asian Journal of Research in Computer Science
The cloud computing architecture is a berth in which third party, virtual machine and cloud service providers are involved in data uploading and downloading. A major challenge in this architecture, however, is the security of the data as there exist various forms of attacks from malicious peopleand devices. Among these security attacks, the zombie attack is the most advance type of attack. The zombie attack reduces network performance in terms of delay and bandwidth consumption. With zombie attack, some malicious users may join the network which, in turn takes off the data of legitimate users and at the same time enable zombie nodes to communicate with a virtual machine on behalf of the legitimate user. In this paper, a technique based on strong authentication which, is able to detect malicious users from a network and isolates them from the cloud architecture is proposed.

A/Prof Abeer Alsadoon
Cloud computing is a distributive and scalable computing architecture. It provides sharing of data and other resources which are accessible from any part of the world for a very low cost. However, Security is one major concern for such computing environment. Distributed Denial of Service (DDoS) is an attack that consumes all the cloud resources may have making it unavailable to other general users. This paper identifies characteristics of DDoS attack and provides an Intrusion Detection System (IDS) tool based on Snort to detect DDoS. The proposed tool will alert the network administrator regarding any attack for any possible resources and the nature of the attack. Also, it suspends the attacker for some time to allow the network admin to implement a fall back plan. As Snort is an open source system, modifying different parameters of the system showed a significant aid in not only detection of DDoS, but also reduction the time for the down time of the network. The proposed tool helps minimize the effect of DDoS by detecting the attack at very early stage and by altering with various parameters which facilitates easy diagnose of the problem.

International Journal IJRITCC
With the recent rise of cloud computing, virtualization has become an increasingly important technology. Virtual machines (VM) are rapidly replacing physical machine infrastructures for their abilities to emulate hardware environments and share resources. Virtual machines are vulnerable to theft and denial of service attacks. Cloud users can install vulnerable software on their VMs, which essentially leads to violation in cloud security. Attackers can explore vulnerabilities of a cloud system and compromise virtual machines to deploy further large-scale Distributed Denial-of-Service (DDoS). This paper proposes a framework to detect and mitigate attacks in the cloud environment. Host-based IDS solutions are incorporated in Network Intrusion Detection and Countermeasure Selection in virtual Network Systems (NICE) framework to cover the whole spectrum of IDS in the cloud system. The proposed solution improves the detection accuracy.

Sabah Alzahrani
2018, Journal of Information Security

In the area of research and development effort for cloud computing, Cloud security is considered as one of challenging issues. Most commonly faced attacks are Distributed Denial-of-Service (DDoS) attacks. DDoS attacks are variation of DOS attack at distributed and large-scale level. Firstly attacker tries to discover the vulnerabilities or we can say loopholes of a cloud system and takes control over the virtual machines. And then gets success in deploying DDoS at large scale. Such attacks includes certain actions at initial stage such as exploitation in multiple steps, scanning for uncommon or less occurring vulnerabilities, identified vulnerabilities are utilized against virtual machines to use them as zombies and finally DDOS is achieved through these compromised zombies. To avoid vulnerable virtual machines from being compromised in the cloud system, proposed approach uses multiphase vulnerability detection at distributed level, measurement, countermeasure selection mechanism ca...

Cloud computing is the subject of the era and is the current keen domain of interest of organizations due to its promising opportunities and catastrophic impacts on availability, confidentiality and integrity. On the other hand, moving to cloud computing paradigm, new security mechanisms and defense frameworks are being developed against all threats and malicious network attacks that threaten the service availability of cloud computing for continuity of public and private services. Considering the increasing usage of cloud services by government bodies poses an emerging threat to e-government and e-governance structures and continuity of public services of national and local government bodies. IoT, industry 4.0, smart cities and novel artificial intelligence (AI) applications that require devices to be connected in ever present cloud platforms, provide an increasing wide range of potential zombie armies to be used in Distributed Denial of Service (DDoS) attacks which are amongst the most critical attacks under cloud computing environment. In this survey, we discuss in detail the classification of DDoS attacks threatening the cloud computing components and make analysis and assessments on the emerging usage of cloud infrastructures that poses both advantages and risks. We assert that considering various kinds of DDoS attack tools, proactive capabilities, virtual connecting infrastructures and innovative methods which are being developed by attackers very rapidly for compromising and halting cloud systems, it is of crucial importance for cyber security strategies of both national, central and local government bodies to consider pertinent pre-emptive countermeasures periodically and revise their cyber strategies and action plans dynamically.

tibA shokatpour

Dr.G. Suseendran

sarah Naiem
2022, Journal of Theoretical and Applied Information Technology

Mqhele Dlodlo
2016, Journal of Network and Computer Applications

vikram karimella
Nowadays every industry and even some parts of the public sector are using cloud computing , either as a provider or as a consumer. But there are many security issues present in cloud computing environment. There are many possible attacks in cloud computing environment, One such attack is the DoS or its version DDoS attack. Generally, attackers can explore vulnerabilities of a cloud system and compromise virtual machines to deploy further large-scale Distributed Denial-of-Service (DDoS). DDoS attacks usually involve early stage actions such as low frequency vulnerability scanning, multi-step exploitation and compromising identified vulnerable virtual machines as zombies and finally DDoS attacks using the compromised zombies. Inside the cloud system, especially the Infrastructure-as-a-Service clouds, the detection of zombie exploration attacks is very difficult. To prevent vulnerable virtual machines from being compromised in the cloud, we propose a multi-phase distributed vulnerability detection, measurement, and countermeasure selection mechanism called NICE, which is built on attack graph based systematic models and reconfigurable virtual network-based countermeasures. This paper provides a short Reveiw on the techniques to network intrusion detection and countermeasure selection in virtual network system.

Ziyad R . Alashhab
Cloud computing (CC) plays a significant role in revolutionizing the information and communication technology (ICT) industry, allowing flexible delivery of new services and computing resources at a fraction of the costs for end-users than traditional computing. Unfortunately, many potential cyber threats impact CC-deployed services due to the exploitation of CC’s characteristics, such as resource sharing, elasticity, and multi-tenancy. This survey provides a comprehensive discussion on security issues and challenges facing CC for cloud service providers and their users. Furthermore, this survey proposes a new taxonomy for classifying CC attacks, distributed denial of service (DDoS) attacks, and DDoS attack detection approaches on CC. It also provides a qualitative comparison with the existing surveys. Finally, this survey aims to serve as a guide and reference for other researchers working on new DDoS attack detection approaches within the CC environment.


Rohini Pise
Nowadays, cloud computing is one of the fastest-emerging technologies, and many users work in cloud environments; however, security in the cloud environment is the key challenge for such systems. Most of the time, attackers attack a machine in a network and try to compromise it as a zombie, originating a DDoS (Distributed Denial of Service) attack from it. There is a need to detect such compromised machines involved in suspicious activities, named zombies, and to prevent them from spreading DDoS attacks in the network. Detection of such zombies in a cloud system, particularly in an Infrastructure-as-a-Service (IaaS) cloud, is a very difficult job.

Anagha Rajeev
2020, International Journal of Cloud Applications and Computing
Distributed denial of service (DDoS) attacks are some of the biggest threats to network performance and security today. With the advent of cloud computing, these attacks can be performed remotely on rented virtual machines (VMs), potentially increasing their capabilities and making them harder to trace and mitigate, and negatively affecting the cloud service provider as well. By analyzing packet transmission statistics, attacks can be detected on a virtual machine monitor (VMM) that controls the behavior of the VMs. This article proposes a solution to stop such detected attacks from the source, and analyses solutions proposed for a few different types of such attacks. The authors propose a model called selective cloud egress filter (SCEF) which implements specific modules to deal with detected attacks. If an attack is detected, the SCEF relays information to the VMM about which VMs are participating in the attack, allowing for specific corrective action.
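A skeletal version of the egress-filtering idea could look like the following Python sketch: a VMM-side monitor counts outbound packets per VM each second and quarantines any VM that exceeds a limit, stopping the attack at its source. The class name, threshold, and callback structure are assumptions for illustration, not the SCEF design itself.

```python
# Toy egress filter in the spirit of SCEF. Threshold is invented.
from collections import defaultdict

PKTS_PER_SEC_LIMIT = 5000   # hypothetical per-VM egress threshold

class EgressFilter:
    def __init__(self):
        self.counters = defaultdict(int)   # vm_id -> packets this second
        self.blocked = set()

    def on_packet(self, vm_id):
        """Called by the VMM for every outbound packet from a VM."""
        if vm_id in self.blocked:
            return False                   # drop: VM is quarantined
        self.counters[vm_id] += 1
        return True                        # forward the packet

    def tick(self):
        """Called once per second: detect and quarantine attackers."""
        for vm_id, pkts in self.counters.items():
            if pkts > PKTS_PER_SEC_LIMIT:
                self.blocked.add(vm_id)    # corrective action at source
        self.counters.clear()
```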

Journal of Computer Science IJCSIS
This intrusion detection mechanism examines traffic in a real-time environment to determine whether someone is sending malicious traffic or attacking the cloud infrastructure. In this research work, a novel attack detection system called CMIDS (Composite Metric Intrusion Detection System) has been developed that works in a cloud environment for the detection of DDoS attacks. In this process, different virtual machines (VMs) are monitored individually and their related records are maintained as profiles. Various system, network, application, and technology-based characteristics (HTTP, CPU, bandwidth usage, RAM, etc.) of all these virtual machines are analyzed to confirm attack incidences. A composite metric is based on these characteristics and is probed over successive time intervals for normal and malignant behavior, with a golden ratio search method used to identify outliers, if any. The outcomes of a series of experiments show that this IDS has a high detection rate under many conditions when verified against actual attacks. Simulated flood-based attacks on the cloud are examined, and a novel threshold-based algorithm is constructed for detecting these attacks on a cloud-based network after an exhaustive survey of the characteristics critical to the stability of cloud operations.
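As a rough illustration of the composite-metric idea, the sketch below combines several per-VM readings into one weighted score and raises an alarm when the score exceeds a mean-plus-k-sigma threshold learned from normal profiles. The weights, the normalization, and the threshold rule are assumptions; the CMIDS paper's exact metric and its golden ratio search are not reproduced here.

```python
# Illustrative composite-metric detector (weights are hypothetical).
import statistics

WEIGHTS = {"http_rps": 0.4, "cpu": 0.2, "bandwidth": 0.3, "ram": 0.1}

def composite(sample):
    """Weighted sum of normalized characteristics for one VM."""
    return sum(WEIGHTS[k] * sample[k] for k in WEIGHTS)

def threshold(history, k=3.0):
    """Mean + k standard deviations over a window of normal profiles."""
    scores = [composite(s) for s in history]
    return statistics.mean(scores) + k * statistics.pstdev(scores)

normal = [{"http_rps": 0.20, "cpu": 0.30, "bandwidth": 0.25, "ram": 0.40},
          {"http_rps": 0.25, "cpu": 0.28, "bandwidth": 0.30, "ram": 0.35}]
suspect = {"http_rps": 0.95, "cpu": 0.90, "bandwidth": 0.97, "ram": 0.60}
print("attack!" if composite(suspect) > threshold(normal) else "normal")
```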

Thakwan Jawad
Cihan University-Erbil Scientific Journal
This paper aimed to identify the various kinds of distributed denial of service (DDoS) attacks, their destructive capabilities, and, most of all, how these issues can best be counteracted and resolved for the benefit of all stakeholders along the cloud continuum, preferably through permanent solutions. A compilation of the various types of DDoS attacks and their strike capabilities is provided. The key challenges facing an effective DDoS defense mechanism are also explored.

Talal Halabi and M. Bellaiche, Journal of Computer Science IJCSIS
Security is one of the obstacles preventing the full migration towards Cloud Computing. Denial of Service attacks are serious threats to Cloud security, and current research is directing enormous efforts toward finding convenient defense solutions against these attacks in terms of robustness and performance. The evaluation of these solutions is a must before implementation on real Cloud infrastructures. In this paper, we conduct a survey on the evaluation strategies and critical metrics adopted by recent research to evaluate defense solutions against DoS and DDoS attacks in Cloud Computing. We also present an evaluation taxonomy that helps future researchers to efficiently evaluate the defense solutions they develop, and that allows Cloud Service Providers (CSPs) to understand the process of evaluating these solutions and carefully select the defense system suitable for their Cloud infrastructures.

Mohd Nazri Ismail and Udendran Mudaliyar
As a result of the integration of many techniques, such as grid computing, clustering, utility computing, and resource sharing, cloud computing has emerged as a technology composed of multiple elements. It offers several computing services, such as IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service), on a pay-as-you-use basis. Nevertheless, because cloud computing end users share computing resources (co-tenancy), with infrastructure shared by a number of users, some security challenges arise. One of the most serious security threats is the flooding attack, which prevents other users from using cloud infrastructure services; this kind of attack can be carried out by a legitimate or illegitimate cloud computing user. Various approaches based on artificial intelligence and statistical methods have been proposed to overcome this problem, but most of them concentrate on one side of the problem and neglect the other aspects. Our proposed approach focuses on overcoming the problem in all its aspects: in the attack detection stage, a covariance matrix statistical method is applied; to determine the attack source, a TTL (Time-to-Live) value counting method is used; and attack prevention is based on a honeypot method. An initial simulation of this approach using UML class and sequence diagrams shows that the proposed framework can operate in a cloud environment.
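The covariance-matrix detection stage could be prototyped along the following lines, assuming NumPy is available: compute a covariance matrix over traffic features for a clean training window, then flag a window whose covariance deviates strongly from that baseline. The features, the distance measure, and the threshold are illustrative stand-ins for the paper's method, not a reproduction of it.

```python
# Sketch of covariance-based flood detection with invented numbers.
import numpy as np

def cov_signature(window):
    """window: n_samples x n_features array of per-second counts."""
    return np.cov(window, rowvar=False)

def deviation(train_cov, test_cov):
    """Frobenius-norm distance between two covariance matrices."""
    return np.linalg.norm(train_cov - test_cov)

rng = np.random.default_rng(0)
normal = rng.normal(100, 5, size=(60, 3))       # SYN, ACK, UDP rates
flood = normal.copy()
flood[:, 0] += rng.normal(5000, 800, size=60)   # add a SYN-flood component

base = cov_signature(normal)
# Threshold of 100 is arbitrary for this toy example.
print("flood detected" if deviation(base, cov_signature(flood)) > 100 else "ok")
```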

Prabadevi Sudhagar
2022, CERN European Organization for Nuclear Research - Zenodo

Cloud computing offers users high-end and scalable infrastructure at an affordable cost, and virtualisation is the key to unlocking it. Although virtualisation has great benefits for users, the complexity of its structure introduces unseen and forceful threats to the security of data and of the system infrastructure. This work investigates the exploitation of compromised virtual machines to execute large-scale Distributed Denial-of-Service (DDoS) attacks, and a critical review of the most recent intrusion detection and prevention systems to mitigate potential DDoS attacks is presented.


IJICTR Journal
2017, International Journal of Information and Communication Technology Research
Cloud computing is a dynamic environment that offers a variety of on-demand services at low cost. However, customers face new security risks due to the shared infrastructure in the cloud. Co-residency of virtual machines on the same physical machine leads to several threats for cloud tenants. Cloud administrators often encounter an even more challenging problem, since they have to work within a fixed budget for cloud hardening. The problem is how to select a subset of countermeasures that stays within the budget and yet minimizes the residual damage to the cloud caused by malicious VMs. We address this problem by introducing a novel multi-objective attack response system. We consider response cost, co-residency threat, and virtual machine interactions to select the optimal response in the face of an attack. Optimal response selection, as a multi-objective optimization problem, calculates alternative responses with minimum threat and cost. Our method estimates the threat level based on the collaboration graph and suggests proper countermeasures based on the threat type at minimum cost. Experimental results show that our system can suggest optimal responses based on the current state of the cloud.
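A minimal sketch of multi-objective response selection appears below: each candidate response carries a residual-threat score and a cost, dominated candidates are discarded, and the remaining Pareto-optimal response closest to the ideal point is chosen. The candidate set and all numbers are invented; the paper's actual optimization over collaboration graphs is richer than this.

```python
# Toy multi-objective response selection (all values hypothetical).
candidates = {                # response: (residual_threat, cost)
    "migrate victim VM":     (0.2, 8.0),
    "snapshot + isolate VM": (0.3, 4.0),
    "block attacker subnet": (0.5, 1.0),
    "do nothing":            (0.9, 0.0),
}

def dominated(a, b):
    """True if b is at least as good as a on both objectives, better on one."""
    return all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))

pareto = {r: v for r, v in candidates.items()
          if not any(dominated(v, w) for w in candidates.values() if w != v)}

# Scalarize: pick the Pareto response closest to the ideal point
# (zero threat, zero cost), normalizing cost into [0, 1] first.
max_cost = max(c for _, c in pareto.values()) or 1.0
best = min(pareto, key=lambda r: (pareto[r][0] ** 2
                                  + (pareto[r][1] / max_cost) ** 2) ** 0.5)
print("selected response:", best)
```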


IJSRD Journal
Cloud computing is getting popular nowadays, and use of the cloud increases daily. In the cloud environment, resources such as operating systems, virtual machines, and software are shared by billions of users of the cloud. The virtual machines residing in the cloud are highly vulnerable to denial of service attacks, and if these machines are connected to more machines, the attack becomes more dangerous, as it harms the entire cloud network. In the cloud, especially in Infrastructure as a Service, the detection of a denial of service attack is a more challenging task because cloud users can install vulnerable software on their virtual machines. In this paper we propose multiphase vulnerability detection in the cloud environment: an OpenFlow network program, NICE, to detect and mitigate attacks on the virtual machines. NICE is built on a scenario-attack-graph-based model. We propose a novel approach to mitigate attacks in the cloud environment by selecting different countermeasures depending on the percentage of vulnerability of the virtual machines. Nowadays an IDS is used by many organizations to detect attacks in the network; in the proposed system we focus on the distributed denial of service attack in the cloud.

Sam Goundar
2018, International Journal of Information Technology and Web Engineering
This article describes how cloud computing has become a significant IT infrastructure in business, government, education, research, and service industry domains. Security of cloud-based applications, especially those with constant inbound and outbound user traffic, is important: it is of the utmost importance to secure the data flowing between the cloud application and user systems against cyber criminals who launch Denial of Service (DoS) attacks. Existing research related to cloud security focuses on securing the flow of information on servers or between networks, but there is a lack of research on mitigating Distributed Denial of Service attacks on cloud environments, as presented by Buyya et al. and Fachkha et al. In this article, the authors propose an algorithm and a hybrid cloud-based secure architecture to mitigate DDoS attacks. By proposing a three-tier cloud infrastructure with a two-tier defense system for separate network and application layers, the authors show that DDoS attacks can be detected and blocked before reaching the infrastructure hosting the cloud applications.


Phalguna Krishna Sathyanarayana Ediga
2014, Global Journal of Computer Science and Technology
Security is considered the most crucial aspect of cloud computing, and it has attracted a lot of research in recent years. On the other hand, attackers are exploring and exploiting the vulnerabilities in the cloud. The heart of cloud computing lies in virtualization technology: attackers take advantage of vulnerabilities in virtual machines and are able to compromise them, thereby launching DDoS attacks. Services such as SaaS and IaaS, which are meant to support end users, may be affected, and attackers may launch attacks either directly or by using zombies. Generally, data centres have their own security policies for dealing with security issues; in the case of DDoS attacks, only the policies that deal with them can be applied. However, in data centres, all the security policies are commonly applied to applications irrespective of their category or the security threats they face. The existing approach consumes a lot of time and wastes resources. In this ...

Baldev Singh
Distributed denial of service (DDoS) attacks constitute one of the prominent cyber threats and are among the hardest security problems in the modern cyber world. This research work focuses on reviewing DDoS detection techniques and developing a numerically stable theoretical framework for detecting various DDoS attacks in the cloud. The main sections of the paper are devoted to a review and analysis of algorithms used for the detection of DDoS attacks. The framework theorized here deals with a variability calculation method used in conjunction with sampling and searching methods to find the current state of a particular parameter under observation for detecting DDoS attacks. In this way, a solution is built that measures performance and conducts monitoring to capture adversity related to DDoS attacks. The described algorithm captures the current context value of the parameters that determine the reliability of the detection algorithm, and an online one-pass algorithm maintains the variability of the collected values, preserving numerical stability by performing robust statistical operations at traffic endpoints in the cloud-based network.
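The abstract's online one-pass algorithm for tracking variability is not spelled out, but Welford's algorithm is the standard numerically stable way to maintain a running mean and variance in a single pass; whether the paper uses exactly this recurrence is an assumption here. The sketch applies it to a per-second packet counter, with the spike standing in for a possible flood.

```python
# Welford's algorithm: numerically stable one-pass mean and variance.
class OnlineVariance:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)   # uses the updated mean

    @property
    def variance(self):
        return self.m2 / self.n if self.n > 1 else 0.0

ov = OnlineVariance()
for pkts_per_sec in [120, 118, 125, 9800, 122]:   # spike = possible flood
    ov.update(pkts_per_sec)
    print(f"n={ov.n} mean={ov.mean:.1f} var={ov.variance:.1f}")
```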

Udendran Mudaliyar
Cloud computing is the new buzzword in computer-oriented services and in the IT industry. But even now, cloud computing is in its infancy, and there are numerous challenges concerning performance, security, availability, etc. The infrastructure of cloud computing is potentially shared by millions of users, and Denial of Service (DoS) attacks have become a major threat to cloud computing, as they prevent users from using cloud infrastructure services; this kind of attack can be made by legitimate or illegitimate cloud computing users. Several approaches have been proposed to overcome DoS attacks, but most of them give attention to only one part of the problem and ignore the other parts. In this paper, I propose a framework which concentrates on overcoming all aspects of the problem: for the attack detection stage, the flow of SYN packets is monitored using a correlation engine or a flow traffic tool such as Snort or Wireshark; to find the attack source, time-to-live (TTL) and packet marking techniques are used; and for prevention, a honeypot method is used. I have also proposed a test bed for experimenting with and deploying this framework.
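For the detection stage described above, a flow monitor could reduce SYN-flood detection to two numbers per window: how fast SYNs arrive and how few handshakes complete. The following sketch is a hypothetical illustration of that test; the thresholds are invented, and a real deployment would take these counts from a tool such as Snort or Wireshark rather than hard-coded values.

```python
# Minimal SYN-flood indicator over a fixed observation window.
def syn_flood_suspected(syn_count, handshakes_completed, window_secs,
                        rate_limit=1000, ratio_limit=3.0):
    """True if SYNs arrive too fast and too few handshakes complete."""
    syn_rate = syn_count / window_secs
    ratio = syn_count / max(handshakes_completed, 1)  # avoid divide by zero
    return syn_rate > rate_limit and ratio > ratio_limit

# Example: 90,000 SYNs in 10 s, only 1,200 handshakes completed.
print(syn_flood_suspected(90_000, 1_200, 10))   # -> True
```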


https://www.ijert.org/non-intrusive-decentralized-attack-analyzer-network-controller-to-monitoring-and-prevent-vulnerability-in-vm-network-system
Cloud security is one of the most important issues that have attracted a lot of research and development effort in the past few years. Vulnerable virtual machines are an easy target, and the existence of such weak nodes in a network undermines its entire security structure. The resource-sharing nature of the cloud favors the attacker, in that compromised machines can be used to launch further devastating attacks. We propose a hybrid intrusion detection framework to detect vulnerabilities, attacks, and their carriers, i.e., malicious processes, in the virtual network and virtual machines. This framework is built on attack-graph-based decentralized analytical models, VMM-based malicious process detection, and reconfigurable virtual network-based countermeasures. The proposed framework leverages software-defined networking to build a monitor and control plane over distributed programmable virtual switches in order to significantly improve attack detection and mitigate attack consequences.


Cloud Computing Research Paper Example
Big Data Challenges in IoT and Cloud
The Internet of Things (IoT) is described as a collection of physical devices, or anything else, that aids in the collective transmission of data across a network without the need for human-to-human or human-to-computer contact. For example, human individuals are unable to capture their data anywhere and at any time, which casts doubt on its accuracy and availability.
Cloud Architecture Implementation at Metasoft Solutions
MetaSoft board members may reap the advantages of cloud computing, since there may be cost savings associated with cloud architecture implementation at MetaSoft Solutions. One cloud computing advantage is that IT costs may be reduced, and the organisation may be able to generate more money. With the support of this software, the MetaSoft board of directors can concentrate on keeping capital and operating costs to a minimum. It is also recognised that cloud computing may bring reliability to MetaSoft board operations, which may be wonderful and consistent............
Amazon’s New Store Utility Computing Case Study Solution
Cloud computing, often known as utility computing, is a service provided by Amazon. Amazon sells processing power to other firms the way utilities such as natural gas, electricity, and water are sold: these firms only pay for what they use. Amazon, like other companies, used only a tiny part of its computational power at any one moment. Many individuals and corporations consider the organization's architecture to be one of the most globally resilient.......
Advantages of Cloud Hosting for Business
People may use cloud hosting to get access, through the internet, to work linked to the job they are performing at the moment. By providing users with convenient communication facilities, the Internet has reduced differences and brought people closer. Cloud hosting enhances efficiency, improves corporate systems such as cash flow management, and provides various advantages to consumers. Converting to cloud hosting improves your company's standing and progress. Many small companies, when motivated, migrate to cloud hosting for a variety of reasons, and these services are ideal for today's expanding corporate needs. If your business's requirements and working demands grow, you may quickly scale up your computing resources, and you may adjust them again if your needs and expectations decrease. Cloud hosting offers this service at a far lower cost than prior, more expensive means of operating.
Literature Review on Virtual Machines and Cloud Computing
Different ways of placing virtual machines in clouds were addressed by Nicolo Maria Calcavecchia, Ofer Biran, Erez Hadad, and Yosef Moatti [1]. Although much study has been done on the subject of deploying virtual machines on cloud infrastructure, the dynamic character of the incoming stream of virtual machine deployment requests has been overlooked. This article discusses a realistic model of a cloud management system under a stream of requests. In addition, a novel approach called Backward Speculative Placement (BSP) is presented, and the BSP approach is explored in two algorithms......
Cloud Computing in Healthcare
In recent years, the expense of healthcare has been steadily increasing, and it is becoming more difficult to locate healthcare personnel as rates climb. As a result, healthcare organisations have been pushed to embrace information systems that enable them to automate the majority of their activities and, as a result, deliver more efficient services. One such technology that health institutions are embracing is cloud computing. The writers outline and discuss some of the advantages and disadvantages of cloud computing deployment, and the paper also discusses the limits of existing systems. High installation and maintenance costs, ineffective sharing of patient data, weak legislation governing the use of medical data, and a lack of a common design framework are all issues with present systems, and cloud computing addresses some of them. Institutions may improve patient care, lower operational costs, better manage precious resources, and provide higher-quality services to their customers by utilising cloud computing. Furthermore, cloud computing assists in research, national security, strategic planning, and financial operations.
The Benefits of Cloud Computing
Many individuals have predicted that cloud computing and the digital revolution would usher in a new age of virtual commerce. The fast expansion of digital resources has made technology available to an ever-increasing user base throughout the globe. New technologies emerge on a regular basis, promising to cut expenses, enhance job effectiveness, and boost efficiency while becoming smarter, quicker, and smaller (Marston et al. 2011)........
Drawbacks and Advantages of Cloud Computing
Currently, a lot of businesses are looking to move their operations to the cloud because of the many advantages that cloud computing offers. The following parts make up this document. The first part explains what cloud computing is and why it's important. The merits and downsides of cloud computing are discussed in the second part. In the final segment, two Houston-based organisations that have embraced the cloud are profiled, along with their cloud-related experiences.....
Cloud Computing Reflection Paper
A reflection notebook contains a summary of the experiences that you gathered from learning whatever your teachers assigned to you. Learning new things stimulates the mind's creativity and helps you to comprehend and investigate new aspects of the subject you are studying. For students, learning is a crucial phase that helps them become more efficient and confident in their ability to study and discover new things in order to grow.......
The Effect of Cloud Computing on Network Management
In today's society, technology plays a critical role in almost every industry, including business. Technologies, especially Information Technology (IT), have grown throughout time and now serve organisations via a variety of uses. Cloud Computing, one of the numerous rising informational technology concepts with optimum applications, has several applications for the effective operation of organisations.........
The Future of Cloud Computing
Cloud computing has emerged as a revolutionary technology capable of hosting and delivering services through the Internet. Various enterprises have adopted the technology because it reduces the need for users to prepare ahead for operations like provisioning. Enterprises can only add resources when there is a demand for the service, thanks to technology. Even while cloud computing has several advantages, it is still in its early phases..........
Key Management in Cloud Computing
Key management covers the mechanisms for binding an identity to a key, as well as the construction, assessment, and invalidation of such keys. The goal of transferring keys is to enable the server to communicate discreetly with the cloud and other servers by utilising the same string of bits. In addition to updating data, key management handles everything related to a key, including key formation and deformation, key communication, and key storage. Most cloud service providers offer standard key-formatting strategies for data storage or leave it up to the customer. Both key formation and administration are critical to helping secure data and applications in the cloud. Especially these days, there is a need for cloud providers to test new key management approaches for their services. However, there are still certain issues that cloud computing faces.
12 Cloud Computing Articles
There's Cloud/Client Computing, for starters. It is concerned with the use of cloud and mobile computing in order to facilitate the production of centrally structured programmes that can then be deployed to any computer. Gartner's Top 10 Strategic Technology Trends for 2015 is available at https://www.gartner.com/smarterwithgartner/gartners-top-10-strategic-technology-trends-for-2015/. Cloud computing has lately gained a lot of traction; however, it has a lot of issues that servers didn't have. The most typical errors include ordering too much processing power..........
Cloud Computing in the IT Industry
In the IT business, cloud computing is the new buzzword. Cloud computing has been defined in a variety of ways by different specialists, and it is expected to dominate the software business in the next years. Cloud is another synonym for internet, and cloud computing refers to internet-based computer services. Individuals and businesses may use the internet to run programmes and store large quantities of data..........
Types of Cloud Computing
Cloud computing is described as the usage of a collection of diverse dispersed services, different applications, information, and infrastructure that consists of pools of computers, multiple networks, information, and storage resources, and is quickly becoming one of the next industry buzzwords. Grid computing, utility computing, virtualization, and clustering are all examples of this technology. The phrase "cloud computing" stems from the notion of the Internet or a big networked environment being referred to as a "cloud."
Cloud Computing Security Issues and Challenges
Cloud computing is a new and developing trend in the IT industry that is poised to transform how people think about and use computers. New technologies that have merged to generate cloud computing services have grown at an exponential rate. Cloud computing services are being provided as long-term solutions for a variety of educational......
Example of Cloud Computing Proposal
The introduction of new tools and technology has revolutionised the way businesses operate. Many benefits, opportunities, and future market and corporate function automation were supplied by such ground-breaking technology aspects. Though the concept of remote working, storing, processing, and communication is not new, it is becoming more popular. Throughout history, many attempts have been made. These advancements are made in a variety of fields, but in our day and age, we can clearly see this concept in action........
Importance of Cloud Computing to Public and Private Firms
Cloud computing is one of the most recent developments in the information era. It entails using the services of another organisation to transport, capture, process, and exchange data. Although there is no clear definition for cloud computing, it refers to the many applications offered by a cloud computing provider in the computer industry. The customer purchases the hardware and software of another company........
Cloud Computing Advantages, Challenges and Security
Cloud computing is a value-driven innovation because it saves money and provides worldwide access to superior business process virtualization. Organizations see it as a cost-effective solution since it eliminates the need to maintain a complicated IT infrastructure and resources. Risk considerations are also taken into account, since the risk indicated in the underlying contract is now controlled by a third party, namely cloud computing companies. “People frequently think of virtualization as adding to security difficulties, but it is basically the solution to a lot of those issues,”
Cloud Computing Issues in Retail Fashion Business
The company selected for examination is Magneta, a hypothetical company that works in the fashion retail industry and is situated in the United Kingdom. The company's primary demographic is young women between the ages of 18 and 34, and it mostly provides casual but fashionable female clothing.
Smartphones and Personal Cloud Computing Security Awareness
The digital world is slowly but steadily heading toward personal cloud computing, which will allow dynamic resource sharing in support of high-speed on-demand computing while preserving huge digital resources. The advantages of personal cloud computing, such as network resource sharing, data storage, apps, and servers, are well known and widely recognised. In contrast, maintaining a reliable computer environment requires information security, which includes access control and authorisation.
Implementation of Cloud Computing in City of Pittsburgh
Cloud computing is a relatively new concept that allows for quick and on-demand access to a shared pool of computer resources across a network. Software, servers, and networks are examples of services that may be maintained with little control and minimum communication between service providers (Furht & Escalante, 2010, p. 3-5). Cloud computing offers small-to-medium file storage as well as email services to the general public and working professionals.........
Use of Cloud Computing For Business Export Control in USA
As the CIO of a global shoe manufacturing firm situated in the United States, I am prepared to implement a geographically dispersed cloud-based computing platform. Security, regulation, and redundancy are all key problems when adopting this alternative. This extends my mission to address these challenges and maintain high redundancy rates of 99.999 percent availability in order to provide excellent service to foreign stockholders such as online customers, manufacturers, and merchants. When determining which choice to choose, keep in mind the Export Control Act's requirements on foreign policy and national security. As a result, my evaluation will be centred on reading Addressing Export Control in the Age of Cloud Computing and recommending the company's best course of action.
Cloud Computing Security Policy Example For an Organization
The policy describes the SNPO-MC organization's security principles and policies for utilising cloud services in everyday operations, data processing and storage, and application usage. Managers, executives, and employees will utilise the policy as a guidance when negotiating terms with cloud providers. Employees of the company are those who work for the company..........
Pros and Cons of Cloud Computing
Cloud computing is a new IT paradigm that enables IT infrastructure to be offered on a pay-as-you-go basis. The application is likewise new to the industry, since it is not yet well understood. This document attempts to educate the business community on the value of cloud computing by defining it, highlighting its benefits and drawbacks, and discussing its application. The article also goes into how to manage and protect cloud devices, cloud economics......
Cloud Computing Research Paper Sample
Cloud Computing
Abstract

This analytic research paper discusses several issues pertaining to cloud computing. A brief introduction giving an overview of cloud computing is presented at the beginning of the paper. After the introduction, key issues surrounding cloud computing are discussed in detail. The paper then gives workable solutions to these issues, including solutions that have been used in the past. Finally, a new solution is proposed, with an explanation of how it can be implemented.
Introduction
The term ‘cloud computing’ is still new to most people; it only gained popularity in October 2007 following the announcement of the Google and IBM collaboration in the sector (Atkins, 2003). Cloud computing is the result of research that has been done for decades on virtualization and distributed computing. It entails the concepts of utility computing, web networking, software services, and grid computing. Cloud computing has been highly relevant in the field of information technology, as it has made significant economic and valuable contributions to cyberinfrastructure. It has enabled the information technology overhead for end users to be reduced considerably, ensures a service-oriented architecture and increased flexibility, reduces the overall cost of ownership, and has increased demand for the service (Atkins, 2003). Despite these contributions to cyberinfrastructure, there are key issues that still face cloud computing. This paper discusses the concept of cloud computing, the issues that cloud computing faces, and the workable solutions to those issues; it also proposes an alternative solution and its justification. Cloud computing, though it has several advantages, has key problems that have not yet been resolved.
Overview of Cloud Computing
Cloud computing entails the sharing of resources on a large scale in a way that is location-independent and cost-effective. Resources in the cloud can be used by the client and deployed by the vendor. Several companies have made good use of cloud technology, including Amazon, IBM, Google, Zoho, Rackspace, Microsoft, and Salesforce.com. Cloud computing means that customers do not necessarily have to purchase resources from a third-party vendor; they can simply use resources and pay for them as services, which saves customers money and time (Erl, 2013). Though mostly used by multinational corporations, cloud computing can also be used by medium and small enterprises.

The cloud computing architecture involves several multi-cloud components that interact with each other over the specific data they hold, and this interaction makes it possible for the user to get the needed data at a much faster rate. The cloud gives most of its focus to the front end (the person in need of data) and the back end (the various devices for storing data and the servers that constitute the cloud). Categorization in cloud computing may also involve other distinctions, such as undifferentiated versus differentiated hardware, specialized versus general-purpose software, virtual images versus environments, and collections of services versus workflow-based environments, among others (Yang, 2013).

One key concept of cloud computing is virtualization, which makes possible the isolation and abstraction of lower-level functionality and the underlying hardware. By doing this, it enables higher-level functions to be portable, and it also enables the aggregation and/or sharing of physical resources. IBM has separated the cloud into three kinds based on usage: hybrid cloud, public cloud, and private cloud (Yang, 2013). Public clouds can be shared on a large scale, whereas private clouds are owned by a single organization; the advantage of private clouds is that they provide more flexibility and better control. The hybrid cloud combines the private cloud and the public cloud and is currently used by most companies.
Issues Facing Cloud Computing
Cloud computing is still an emergent field, and research is still being conducted on how the system can be used more efficiently. The implementation of the cloud computing approach remains very challenging for cloud developers and management. Developers have been researching ways to construct complex environments of resources, as well as the complex control images used on those same resources, which include workflow-oriented images. There has been a challenge with the spatial and temporal feedback of large-scale workflows, underpinned by specific amounts of metadata: some of this metadata is permanently attached to an image, some is attached dynamically, and some is kept in a cloud management database.

Another issue concerns provenance data and metadata management in general (Atkins, 2003). Current cloud applications categorize such data into cloud process provenance, cloud data provenance, cloud workflow provenance, and environment or system provenance. The main challenges regarding provenance include developing a way in which provenance information can be collected in a standardized, seamless manner with minimal overhead; storing this information in a permanent system so that it can be accessed at any time; and presenting this information to users in a logical way. Other image- and service-related issues for cloud computing include optimization of environment loading times, image portability, and the effects of image formats.

Accountability of cloud computing providers is also a challenge. Given that very few people have sufficient knowledge of cloud computing, most customers are at risk of being overcharged by vendors. Data integrity is another key issue facing cloud computing: data in the cloud can be accessed by just about any person from any location, and currently the cloud is not able to differentiate between common data and sensitive data. This means that sensitive data is left open and accessible to anyone, which is the key contributing factor to the lack of data integrity in cloud computing.

Data stealing has also been a challenge in cloud computing. Most cloud vendors lease servers from other service providers instead of acquiring their own, because leasing is more flexible for operation and more cost-effective (Naone, 2007). Meanwhile, cloud computing customers may not know about these arrangements, which makes it possible for malicious users to steal data from the external server. It is the responsibility of the vendor to ensure that the customer's data, especially personal information, is well secured; failure to protect the customer's personal information is likely to result in a lack of privacy for the customer. Since most of the servers are external, it is important for the vendor to stay up to date on the people accessing the data and those maintaining the server (Yang, 2013). Only when the vendor manages this can the customer be assured of the privacy of their personal information. Infected applications have also been a major challenge for cloud users.
This mostly occurs when the vendor lacks complete access to the server for maintenance and monitoring. Failure of the vendor to have absolute access to the server gives room for malicious users to upload infectious applications to the cloud (Naone, 2007), and these viruses may cause severe harm to cloud customers. Data loss is another major issue in cloud computing: if, for instance, the vendor faces legal issues that force it to close, then all of the customers' data may be lost; the same can happen when the vendor closes due to financial problems. Loss of data can be very frustrating to customers, especially if important information was available only in the cloud.

Use of cloud computing can also be frustrating for customers because of the location of their data. In cloud computing there is usually no transparency as far as data location is concerned: the location of the customers' data is not directly available, even to the customer, and vendors rarely reveal where all data is stored. It is possible for a customer's data to be located far away from his or her country of residence. Another issue is data security at the vendor level (Atkins, 2003). To ensure security at the vendor level, the vendor must make sure that all servers are well secured from any external threat; a good cloud computing system should provide strong security from the vendor to the customers. Last but not least, cloud computing faces the challenge of security at the customer level. The vendor may make all the necessary arrangements to give the customer a good security posture, but the customer can also expose cloud data to insecurity. It should be the responsibility of the customer to ensure that no information is tampered with due to their negligence, and the customer should at all times be considerate of other customers using the same cloud.
Solutions to the issues mentioned
Much research has been done to harmonize the economics of image construction with economies of scale, with the aim of improving the cloud computing approach. Though much research has been done, developers have yet to construct the sufficiently complex environments of resources and complex images that they desire. With regard to the provenance issues mentioned above, some solutions have already been established and are currently being applied by cloud computing developers. The Visual Computing Laboratories (VCL) are currently using standardized image snapshots (Erl, 2013); these standardized image snapshots are operating-system, hypervisor, and platform specific, enabling image exchange. This system, however, needs much more complex mapping and some additional storage.

The issue of accountability still remains a big challenge for cloud computing. Clients lack the technical know-how to determine whether the charges being made are equivalent to the resources they have used in the cloud. However, several measures are being put in place to prevent exploitation of clients, and some of these accountability functions are performed by the customer support center.

Since most of the issues affecting cloud computing have to do with security, some measures have been put in place to ensure that these effects are completely curbed, or at least reduced. One approach that has been used to curb these challenges is data protection: some vendors have made it possible for customers to access their data at any time, which addresses the difficulty customers face in tracing the location of their data. Vendors have also worked to deliver high performance for customers, which should ensure the efficiency of data processing and that customers' data is not infected by viruses. Some vendors have also set up detection systems on the cloud. These systems detect malicious activities occurring around the cloud, such as the uploading or downloading of data by unauthorized users; they can also point to customers who try to access information belonging to other customers or in locations where their data is not stored. The detection system also ensures that vendors who lease servers only use them for the agreed purpose and within the signed limits.

The support system for customers has also been very relevant in reducing the challenges faced by cloud computing. The customer support system ensures that customers can report any threat to their data. For example, if customers suspect that their data has been stolen, the customer support centre should be able to trace whether an unauthorized person pirated and stole the customer's data, and legal action against such persons can then be undertaken. The customer support centre is also very important when customers lose their data: if, for example, a vendor faced legal issues that led to the closure of the cloud account, the customers of this vendor should by no means suffer due to the ignorance and negligence of the service provider, and the customer support team steps in to ensure that all customers get their information back. Where the stealing of data is involved, it is always important to provide encryption, and different encryption strategies should be combined so that malicious users are completely unable to acquire the data.

Most vendors now convert data into ciphertext forms that are not easily understood by unauthorized users. This approach has greatly reduced theft in cloud computing systems, and ciphertext has also ensured that cloud customers enjoy privacy of personal information (Erl, 2013). It is clear that even though there is a lot of research on making cloud computing more effective by solving the key issues, much has still not been achieved: many security issues are yet to be addressed, cloud computing lacks a good accountability system, and it remains subject to customer exploitation.
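To illustrate the ciphertext conversion described above, the sketch below encrypts data on the client side before upload using the Fernet recipe from the widely used Python `cryptography` package, so the cloud only ever stores ciphertext and the key never leaves the customer. This is one standard way to realize the idea, not the specific scheme any particular vendor uses.

```python
# Client-side encryption before upload (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # customer-held secret, never uploaded
f = Fernet(key)

plaintext = b"patient record: sensitive personal information"
ciphertext = f.encrypt(plaintext)    # what actually goes to the cloud

# Only a holder of the key can recover the data.
assert f.decrypt(ciphertext) == plaintext
```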
Proposed solution for accountability and security challenge
In order to form a more secure cloud computing system that is also accountable, several things need to be put in place. This research proposes a design that combines the use of relevant policies with a network storage service. An act that guides and governs all the ethical and legal responsibilities should be drafted and made available to all cloud computing users and potential users. This will ensure that no person can claim ignorance of the system, and customers can therefore understand how their money is utilized. A list of all resources and expenditures incurred by managers and developers of cloud computing should be put in place, and the act should also prescribe the punishment for every person who maliciously uses cloud computing.

A network storage service will be useful when used in conjunction with these policies. This network storage will make it possible for customers to access shared objects maintained by the server while at the same time ensuring strong accountability. Every request made will contain a digital signature that serves two purposes: first, it ensures that the sender can easily be identified, and second, it secures the message. This collection of digital signatures is referred to as the action history, and the digital signatures can then be used during an audit request; this way, no customer or vendor can later deny having made a given request.

For server accountability, every server will have to generate a different version of a data object for each write request on that object issued by a customer. The server will then organize all the data objects in the system as a Merkle hash tree. Customers can then keep in touch with one another in order to observe the requests made by any of them; if they notice that the server does not show the history of a given customer, they can immediately report it.
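A compact sketch of the two mechanisms proposed here, signed requests forming an action history and a Merkle hash tree over object versions, is given below. HMAC stands in for a true public-key digital signature to keep the example standard-library-only, and everything beyond the two ideas named in the proposal (key names, message layout) is an illustrative assumption.

```python
# Sketch of the proposed accountability scheme. The signing key and
# message format are hypothetical; HMAC substitutes for a real
# digital signature.
import hashlib
import hmac
import json

CUSTOMER_KEY = b"customer-secret-key"       # hypothetical signing key

def sign_request(action, obj, payload=b""):
    """Build a request whose signature identifies and binds the sender."""
    msg = json.dumps({"action": action, "object": obj}).encode() + payload
    sig = hmac.new(CUSTOMER_KEY, msg, hashlib.sha256).hexdigest()
    return {"action": action, "object": obj, "sig": sig}

def merkle_root(leaf_hashes):
    """Fold leaf hashes pairwise up to a single root that commits
    the server to the whole version history."""
    level = list(leaf_hashes) or [hashlib.sha256(b"").hexdigest()]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])          # duplicate the odd leaf
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]

history = [sign_request("write", "obj1", b"v1"),
           sign_request("write", "obj1", b"v2")]
leaves = [hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
          for r in history]
print("action-history root:", merkle_root(leaves))
```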
Justification for the proposed solution
This solution will go a long way toward combating the security issues and ensuring accountability both for charges and for data. If laws are put in place and well implemented, as in a state setting, people are bound to avoid getting themselves on the wrong side of the law; this translates to cloud computing and will function very well there. Providing all customers with basic knowledge of the resources available in cloud computing is very relevant, as it increases customers' bargaining power: they will no longer be overcharged by vendors who do not uphold integrity. The idea of digital signatures unique to every individual, just as fingerprints are, is very important for ensuring data accountability. Every thief will be caught, because the digital signatures are unique and cannot lie, and since people fear facing legal charges, this system will scare away malicious users. It is also important that the server provide a history of all requests that have been made, indicating the biographic data of customers, the unique digital signature, and the request made. With this, any person with malicious intentions can be traced even before they access the data they wrongfully requested.
Conclusion

When initially invented, the idea of cloud computing was very appealing to information technology corporations, because it offered many advantages, such as large data storage facilities, cost-effectiveness, and increased flexibility, among others (Erl, 2013). But every good invention has its pitfalls, and cloud computing has had its share of disadvantages. This paper discussed some of the key issues facing cloud computing and the present solutions to these challenges. The solutions currently available for reducing these issues are considerably insufficient, and at the end this paper offered a proposed solution that can be implemented to prevent some of the major issues.
References

Atkins, D. E. (2003). Revolutionizing science and engineering through cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure. http://www.nsf.gov/od/oci/reports/atkins.pdf

Erl, T., Mahmood, Z., & Puttini, R. (2013). Cloud computing: Concepts, technology & architecture. Upper Saddle River, NJ: ServiceTech Press.

Naone, E. (2007). Computer in the cloud. Technology Review, MIT. http://www.technologyreview.com/printerfriendlyarticle.aspx?id=19397

Yang, X., & Liu, L. (2013). Principles, methodologies, and service-oriented approaches for cloud computing.

Amazon Launches New Generative AI Training Courses for Free
The 'AI Ready' initiative offers online classes for developers and other technical professionals as well as for high school and university students.
On Nov. 20, Amazon revealed the “AI Ready” commitment, a set of courses, a scholarship and a collaboration with Code.org to promote generative artificial intelligence skills. Amazon wants to provide 2 million people across the globe with the skills needed for lucrative generative AI-focused careers by 2025. In addition, Amazon is offering cloud computing skills training.
“Amazon is launching AI Ready to help those with a desire to learn about AI and benefit from the tremendous opportunity ahead,” Swami Sivasubramanian, vice president of data and AI at Amazon Web Services, wrote in Amazon’s announcement.
Free generative AI training courses for professionals and beginners
The following generative AI training courses are available for free from Amazon via AWS Skill Builder for developer and technical audiences:
- Foundations of Prompt Engineering.
- Low-Code Machine Learning on AWS.
- Building Language Models on AWS.
- Amazon Transcribe — Getting Started.
- Building Generative AI Applications Using Amazon Bedrock.
The following generative AI training courses are available for free from Amazon for beginners and students:
- Introduction to Generative Artificial Intelligence via AWS Educate .
- Generative AI Learning Plan for Decision Makers via AWS Skill Builder .
- Introduction to Amazon CodeWhisperer via AWS Educate .
Employers seek AI skills
Many employers (73%) are interested in hiring people with AI-related skills, a November survey from Amazon and Access Partnership found. However, three out of four of the same employers have trouble finding people to meet their AI talent needs.
“If we are going to unlock the full potential of AI to tackle the world’s most challenging problems, we need to make AI education accessible to anyone with a desire to learn,” Sivasubramanian wrote in the announcement post.
The AWS generative AI scholarship assists high school and university students

Amazon will offer a total of $12 million across 50,000 Udacity scholarships for high school and university students from underserved and underrepresented communities around the world. Scholarship recipients will gain access to free courses, hands-on projects, on-demand technical tutors, coaching from industry mentors, career development resources and guidance in creating a professional portfolio.
Interested students can apply on the AWS AI & ML Scholarship program site.
Amazon and Code.org team up for Hour of Code for students

In collaboration with Code.org, Amazon will host an Hour of Code during Computer Science Education Week, Dec. 4–10, for students and teachers involved in kindergarten through 12th grade. The hour-long introduction to coding and AI will invite students to create their own dance choreography using generative AI.
Code.org runs on AWS, and Amazon has provided free AWS Cloud computing credits worth up to $8 million for the Hour of Code.
SEE: Compare Google Cloud and AWS Cloud’s capabilities as cloud hosting services.
AI Ready courses add to existing library of AI and cloud resources

These courses, scholarships and events come in addition to Amazon’s existing free cloud computing classes. Amazon has a goal of giving 29 million people the right skills for cloud computing careers by 2025.
Amazon also offers over 80 free and low cost training courses through AWS’s educational artificial intelligence and machine learning content library . Taking some of these courses alongside generative AI training could broaden one’s understanding of how different AWS and Amazon capabilities work together, as well as contextualizing their places in the larger world of AI and ML technologies.
More Games, More Wins: PC Game Pass Included With Six-Month GeForce NOW Memberships
The fastest way to give the gift of cloud gaming starts this GFN Thursday: For a limited time, every six-month GeForce NOW Ultimate membership includes three months of PC Game Pass.
Also, the newest GeForce NOW app update is rolling out to members, including Xbox Game Syncing and more improvements.
Plus, take advantage of a heroic, new members-only Guild Wars 2 reward. It’s all topped off by support for 18 more games in the GeForce NOW library this week.
Give the Gift of Gaming

Unwrap the gift of gaming: For a limited time, gamers who sign up for the six-month GeForce NOW Ultimate membership will also receive three free months of PC Game Pass — a $30 value.
With it, Ultimate members can play a collection of high-quality Xbox PC titles with the power of a GeForce RTX 4080 rig in the cloud. Jump into the action in iconic franchises like Age of Empires, DOOM, Forza and more, with support for more titles added every GFN Thursday.
Members can seamlessly launch supported favorites on nearly any device at up to 4K and 120 frames per second, or at up to 240 fps with NVIDIA Reflex technology in supported titles, for the lowest-latency streaming.
This special offer is only here for a limited time, so upgrade today.

With so many games ready to stream, it might be hard to decide what to play next. The latest GeForce NOW app update, currently rolling out to members, is here to help.
Members can now connect their Xbox accounts to GeForce NOW to sync the games they own to their streaming library. Game syncing lets members link their digital game store accounts so that all of their supported games become part of their GeForce NOW library. Syncing an Xbox account will also add any supported titles a member has access to via PC Game Pass, perfect for members taking advantage of the latest Ultimate bundle.
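Conceptually, syncing amounts to intersecting the titles a linked store account owns with the catalog of supported games, then merging the result into the streaming library. Here is a minimal sketch of that idea; it is not NVIDIA's code or API, and the function name, sample catalog and game titles are all hypothetical:

```python
# Minimal sketch of cross-store game syncing, modeled with plain sets.
# Hypothetical catalog of titles supported for streaming (not a real list).
GFN_SUPPORTED = {"Forza Horizon 5", "DOOM Eternal", "Age of Empires IV"}

def sync_library(streaming_library: set, store_games: set) -> set:
    """Merge into the streaming library every owned title that is supported."""
    return streaming_library | (store_games & GFN_SUPPORTED)

# A linked account that owns two games, only one of which is supported.
xbox_games = {"Forza Horizon 5", "Halo Infinite"}
library = sync_library(set(), xbox_games)
print(library)  # {'Forza Horizon 5'}
```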
The new update also adds benefits for Ubisoft+ subscribers. With a linked Ubisoft+ account, members can now launch supported Ubisoft+ games they already own from the GeForce NOW app, and the game will be automatically added to “My Library.” Get more details on Ubisoft account linking.
Version 2.0.58 also includes an expansion of the new game session diagnostic tools to help members ensure they’re streaming at optimal quality. It adds codec information to the in-stream statistics overlay and includes other miscellaneous bug fixes. The update should be available for all members soon.
A Heroic Offering

This week, members can receive Guild Wars 2 “Heroic Edition,” which includes a treasure trove of goodies, such as the base game, Legacy Armor, an 18-slot inventory expansion and four heroic Boosters. It’s the perfect way to jump into ArenaNet’s critically acclaimed, free-to-play, massively multiplayer online role-playing game.
It’s easy to get membership rewards for streaming games on the cloud. Visit the GeForce NOW Rewards portal and update the settings to receive special offers and in-game goodies.
Members can also sign up for the GeForce NOW newsletter, which includes reward notifications, by logging into their NVIDIA account and selecting “Preferences” from the header. Check the “Gaming & Entertainment” box and “GeForce NOW” under topic preferences.
Ready, Set, Go

The first downloadable content for Gearbox’s Remnant 2 arrives in the cloud. The Awakened King brings a new storyline, area, archetype and more to the dark fantasy co-op shooter — stream it today to experience the awakening of the One True King as he seeks revenge against all who oppose him.
Catch even more action with the 18 newly supported games in the cloud:
- Spirittea (New release on Steam, Nov. 13)
- KarmaZoo (New release on Steam, Nov. 14)
- Naheulbeuk's Dungeon Master (New release on Steam, Nov. 15)
- Warhammer Age of Sigmar: Realms of Ruin (New release on Steam, Nov. 17)
- Arcana of Paradise —The Tower (Steam)
- Blazing Sails: Pirate Battle Royale (Epic Games Store)
- Disney Dreamlight Valley (Xbox, available on PC Game Pass)
- Hello Neighbor 2 (Xbox, available on PC Game Pass)
- Overcooked! 2 (Xbox, available on PC Game Pass)
- RoboCop: Rogue City (New release on Epic Games Store)
- Roboquest (Xbox, available on PC Game Pass)
- Rune Factory 4 Special (Xbox, available on PC Game Pass)
- Settlement Survival (Steam)
- SOULVARS (Steam)
- State of Decay: Year-One Survival Edition (Steam)
- The Wonderful One: After School Hero (Steam)
- Wolfenstein: The New Order (Xbox, available on PC Game Pass)
- Wolfenstein: The Old Blood (Steam, Epic Games Store, Xbox, available on PC Game Pass)
What are you looking forward to streaming? Let us know on Twitter or in the comments below.
~~here's~~ soon's the deal… stay tuned 👀🎄 — 🌩️ NVIDIA GeForce NOW (@NVIDIAGFN) November 15, 2023
Alibaba's Hong Kong shares drop 10% after it shelves cloud spinoff, citing U.S. chip restrictions

- Alibaba's Hong Kong stock sinks nearly 10% after the Chinese e-commerce giant announced it was scrapping plans to spin out its cloud computing division.
- The company said U.S. chip export restrictions have made it harder for Chinese firms to get critical chip supplies from U.S. companies.
- "We believe that a full spin-off of Cloud Intelligence Group may not achieve the intended effect of shareholder value enhancement," Alibaba said.
Shares of Alibaba tumbled close to 10% in early Hong Kong trading on Friday, a day after the Chinese e-commerce giant said it would not proceed with the full spinoff of its cloud group due to U.S. chip export restrictions.
U.S.-listed shares of Alibaba closed over 9% lower on Thursday, after having fallen over 10% since the start of this year.
Alibaba's Hong Kong-listed shares are down close to 15% year-to-date, underperforming the broader Hang Seng index's 11.2% decline in the same period.
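For readers who want to reproduce the comparison, those percentages come from ordinary year-to-date return arithmetic. The sketch below uses made-up prices chosen only to match the reported figures, not actual quotes:

```python
# Illustrative YTD return arithmetic; prices are hypothetical, chosen only
# to reproduce the percentages reported above.
def ytd_return_pct(start_price: float, current_price: float) -> float:
    return (current_price - start_price) / start_price * 100

alibaba_hk_ytd = ytd_return_pct(100.0, 85.0)   # -15.0%, as reported
hang_seng_ytd = -11.2                          # index decline, as reported

gap = alibaba_hk_ytd - hang_seng_ytd
print(f"Alibaba HK underperformed the index by {abs(gap):.1f} points")  # 3.8
```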
In its earnings release Thursday, Alibaba said that it would no longer proceed with a spinoff of its Cloud Intelligence Group — the cloud computing arm of Alibaba that competes with Amazon Web Services and Microsoft Azure. Alibaba had planned to list the division publicly.
Alibaba said U.S. chip export restrictions have made it harder for Chinese firms to get critical chip supplies from U.S. companies. The U.S. barred sales of Nvidia's advanced artificial intelligence-focused H800 and A800 chips in October.
On Thursday, Alibaba said the restrictions have "created uncertainties for the prospects of Cloud Intelligence Group."
"We believe that a full spin-off of Cloud Intelligence Group may not achieve the intended effect of shareholder value enhancement," the company said, adding it would instead focus on developing a sustainable growth model for the unit "under the fluid circumstances."

Ahead of the earnings announcement Thursday, Alibaba announced in a regulatory filing that the family trust of founder Jack Ma was planning to sell down its stake in the business, selling 10 million shares for $870.7 million in cash.
The decision to walk back its cloud unit spinout marks a hitch in Alibaba's plan to reorganize into six individual business units — one of the most radical shake-ups in the company's history.
Alibaba had earlier announced it would put on hold plans to list its Freshippo retail chain for groceries "as we evaluate market conditions and other factors."
The company still intends to list its Cainiao smart logistics division in Hong Kong.
The Thursday results mark the first set of Alibaba earnings since veteran executive Eddie Wu succeeded former boss Daniel Zhang as CEO. As part of a broader management reshuffle, the company's co-founder, Joe Tsai, also took over as chairman, Alibaba said in June.
Alibaba reported net income attributable to shareholders of 27.7 billion yuan ($3.8 billion) for the September quarter, below the 29.7 billion yuan expected by analysts.
Revenue met expectations, however, coming in at 224.79 billion yuan, up 9% year over year.
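Two quick back-of-the-envelope checks on those numbers, sketched below with the figures as reported (the derived values are approximations): the net income miss works out to about 6.7%, and 9% year-over-year growth implies roughly 206 billion yuan of revenue in the same quarter a year earlier.

```python
# Back-of-the-envelope checks on the reported September-quarter figures
# (billions of yuan; derived values are approximations).
net_income, expected = 27.7, 29.7
revenue, yoy_growth = 224.79, 0.09

miss_pct = (net_income - expected) / expected * 100
prior_year_revenue = revenue / (1 + yoy_growth)

print(f"Net income vs. estimates: {miss_pct:.1f}%")            # -6.7%
print(f"Implied year-ago revenue: {prior_year_revenue:.1f}B")  # 206.2B
```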
Tsai, the company's chairman, sought to assuage investor concerns about the roadblock to Alibaba's reorganization on the earnings call Thursday, saying the company had more than enough cash on its balance sheet to support its operating business.
"We ended the quarter with $63 billion in net cash, and we generated $27 billion in free cash flow in the last 12 months," Tsai said. "Alibaba has never been in a better financial position to invest for the growth of our businesses."
He added Alibaba was looking to prove to investors it can grow its cloud business as part of the Alibaba Group rather than focus on "financial engineering."
"In the AI-driven world, to develop a fully grown business based on a very networked and highly scaled infrastructure, it requires investment," Tsai said. "We would rather show investors through our operations of the cloud business rather than spinning it off."
Wu, Alibaba's CEO, said the firm would embark on a strategic review of its existing businesses, distinguishing between "core" and "noncore" businesses.
The company will give different businesses different levels of priority "based on their market size, business model, and product competitiveness."
Core businesses are where Alibaba will keep a long-term focus, pursue research and development, and evolve its products and services. Noncore businesses are ones where Alibaba wants to realize value by making them profitable, "or through other means of capitalization," Wu said.
First-ever dividend payout
The company also announced it will issue its first-ever annual cash dividend in 2023. Companies use dividends to share a portion of their profit with shareholders.
In the release, Alibaba said that its board of directors had approved an annual $0.125 per ordinary share or $1 per American depositary share cash dividend for the fiscal year.
The aggregate amount of the dividend will be roughly $2.5 billion. Alibaba will pay the sum to holders of ordinary shares and American depositary shares of record as of the close of business on Dec. 21, 2023, Hong Kong time and New York time, respectively.
"Going forward, we will continue to review and determine the dividend amount based on factors such as business fundamentals, capital requirements, among others, on an annual basis," Alibaba said in its earnings release.
On the earnings call Thursday, Wu said that Cainiao, one of the remaining divisions still pursuing an IPO, saw "relatively rapid growth this quarter," and that the business was continuing to focus on building out its global smart logistics network.
He outlined a three-year plan for the unit, including scaling up investment in technology, seeking growth in cross-border e-commerce and growing its international business.
Chinese economy
Alibaba's results are often viewed as an indication of the health of the Chinese consumer.

Economists were expecting a boom in China's economy after its emergence from Covid-19 lockdowns last year, but the rebound has proven more tepid, with a property crisis and other structural challenges posing risks to the country's recovery.
On China, Tsai said that, despite volatility in global markets, "we are entering a phase of a more stable operating environment in China."
Still, Alibaba said it recorded healthy year-over-year growth in users of its Taobao and Tmall domestic online shopping sites. The two sites saw positive year-over-year order growth during the annual 11:11 Chinese shopping holiday, the company added.
Returning to the future direction of Alibaba's strategy, the Chinese tech giant also said Thursday that it plans to invest in and incubate a number of strategic-level innovative businesses.
They include 1688, Alibaba's online procurement service for Chinese manufacturers, Xianyu, its second-hand goods site, DingTalk, a workplace messaging app, and Quark, a search product for young people.
Alibaba said that AI would be at the heart of its strategic direction going forward, with plans to invest in more tailored product experiences for its users across these platforms.
The company is competing with huge peers in that field, both in China, with companies like Tencent and Baidu, and in the U.S., with technology giants like Meta, Microsoft, Google, and OpenAI.
Correction: This story has been updated to accurately reflect that Alibaba's U.S.-listed stock has fallen 10% year-to-date. An earlier version misstated that figure.
