Results 1 - 10 of 955
A View of Cloud Computing
2010
"... Clearing the clouds away from the true potential and obstacles posed by this computing capability. ..."
Abstract
-
Cited by 628 (1 self)
- Add to MetaCart
Clearing the clouds away from the true potential and obstacles posed by this computing capability.
CloudCmp: Comparing Public Cloud Providers
"... While many public cloud providers offer pay-as-you-go computing, their varying approaches to infrastructure, virtualization, and software services lead to a problem of plenty. To help customers pick a cloud that fits their needs, we develop CloudCmp, a systematic comparator of the performance and co ..."
Abstract
-
Cited by 175 (5 self)
- Add to MetaCart
(Show Context)
While many public cloud providers offer pay-as-you-go computing, their varying approaches to infrastructure, virtualization, and software services lead to a problem of plenty. To help customers pick a cloud that fits their needs, we develop CloudCmp, a systematic comparator of the performance and cost of cloud providers. CloudCmp measures the elastic computing, persistent storage, and networking services offered by a cloud along metrics that directly reflect their impact on the performance of customer applications. CloudCmp strives to ensure fairness, representativeness, and compliance of these measurements while limiting measurement cost. Applying CloudCmp to four cloud providers that together account for most of the cloud customers today, we find that their offered services vary widely in performance and costs, underscoring the need for thoughtful provider selection. From case studies on three representative cloud applications, we show that CloudCmp can guide customers in selecting the best-performing provider for their applications.
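
The comparison idea in this abstract can be made concrete with a small sketch. The snippet below is not CloudCmp itself: the provider names, metrics, and numbers are hypothetical, and it only illustrates one simple way to combine performance and cost measurements, by normalizing each metric against the best value observed and ranking providers by the average.

# Minimal sketch (not CloudCmp) of ranking providers by measured metrics;
# all names and numbers below are hypothetical.
from statistics import mean

# hypothetical measurements: benchmark finish time (s), cost per run (cents),
# storage GET latency (ms), intra-cloud TCP throughput (Mbps)
measurements = {
    "ProviderA": {"finish_time_s": 120, "cost_cents": 8.0, "get_ms": 45, "tput_mbps": 600},
    "ProviderB": {"finish_time_s": 95,  "cost_cents": 11.5, "get_ms": 60, "tput_mbps": 750},
}

# lower is better for these metrics, higher is better for throughput
LOWER_IS_BETTER = {"finish_time_s", "cost_cents", "get_ms"}

def normalized_score(provider: str) -> float:
    """Average of per-metric scores normalized against the best provider."""
    scores = []
    for metric, value in measurements[provider].items():
        best = (min if metric in LOWER_IS_BETTER else max)(
            m[metric] for m in measurements.values())
        scores.append(best / value if metric in LOWER_IS_BETTER else value / best)
    return mean(scores)

for p in sorted(measurements, key=normalized_score, reverse=True):
    print(f"{p}: {normalized_score(p):.2f}")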
Privacy-Preserving Public Auditing for Data Storage Security in Cloud Computing
In INFOCOM, 2010
"... Abstract. Cloud Computing is the long dreamed vision of computing as a utility, where users can remotely store their data into the cloud so as to enjoy the on-demand high quality applications and services from a shared pool of configurable computing resources. By data outsourcing, users can be relie ..."
Abstract
-
Cited by 135 (1 self)
- Add to MetaCart
(Show Context)
Cloud computing is the long-dreamed vision of computing as a utility, where users can remotely store their data in the cloud and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources. By outsourcing data, users are relieved of the burden of local data storage and maintenance. However, the fact that users no longer have physical possession of their possibly large outsourced data makes data integrity protection in cloud computing a very challenging and potentially formidable task, especially for users with constrained computing resources and capabilities. Thus, enabling public auditability for cloud data storage security is of critical importance, so that users can resort to an external audit party to check the integrity of outsourced data when needed. To securely introduce an effective third-party auditor (TPA), two fundamental requirements have to be met: 1) the TPA should be able to efficiently audit the cloud data storage without demanding a local copy of the data, and should introduce no additional online burden to the cloud user; 2) the third-party auditing process should bring no new vulnerabilities towards user data privacy. In this paper, we utilize the public-key-based homomorphic authenticator and uniquely integrate it with a random masking technique to achieve a privacy-preserving public auditing system for cloud data storage security while meeting both requirements. To support efficient handling of multiple auditing tasks, we further explore the technique of bilinear aggregate signatures to extend our main result to a multi-user setting, where the TPA can perform multiple auditing tasks simultaneously. Extensive security and performance analysis shows that the proposed schemes are provably secure and highly efficient.
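
To illustrate the aggregation idea behind homomorphic authenticators, here is a toy, privately verifiable sketch: the server answers a random challenge with one aggregated block and one aggregated tag, and the verifier checks the pair without holding the data. This is not the paper's scheme; it uses a simple linear MAC over a prime field instead of public-key, pairing-based authenticators, and it omits the random masking that protects privacy against the TPA.

# Toy sketch of challenge/response aggregation with linearly homomorphic tags.
# NOT the paper's public, pairing-based scheme with random masking.
import hmac, hashlib, secrets

P = (1 << 127) - 1          # Mersenne prime used as the working modulus

def prf(key: bytes, i: int) -> int:
    return int.from_bytes(hmac.new(key, str(i).encode(), hashlib.sha256).digest(), "big") % P

def tag_blocks(blocks, alpha, prf_key):
    # sigma_i = alpha * m_i + f_k(i)  (mod P)
    return [(alpha * m + prf(prf_key, i)) % P for i, m in enumerate(blocks)]

def prove(blocks, tags, challenge):
    # server aggregates only the challenged blocks and tags; no raw data is returned
    mu = sum(nu * blocks[i] for i, nu in challenge) % P
    sigma = sum(nu * tags[i] for i, nu in challenge) % P
    return mu, sigma

def verify(mu, sigma, challenge, alpha, prf_key):
    expected = (alpha * mu + sum(nu * prf(prf_key, i) for i, nu in challenge)) % P
    return sigma == expected

blocks = [int.from_bytes(secrets.token_bytes(8), "big") for _ in range(16)]
alpha, prf_key = secrets.randbelow(P), secrets.token_bytes(32)
tags = tag_blocks(blocks, alpha, prf_key)
challenge = [(i, secrets.randbelow(P)) for i in secrets.SystemRandom().sample(range(16), 4)]
print(verify(*prove(blocks, tags, challenge), challenge, alpha, prf_key))  # True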
An early performance analysis of cloud computing services for scientific computing
TU Delft, Tech. Rep., Dec 2008
"... Abstract—Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike.Throughtheuseofvirtualizationandresourcetime-sharing, clouds serve with a single set of physical resources a ..."
Abstract
-
Cited by 134 (8 self)
- Add to MetaCart
(Show Context)
Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time-sharing, clouds serve with a single set of physical resources a large user base with different needs. Thus, clouds have the potential to provide to their owners the benefits of an economy of scale and, at the same time, become an alternative for scientists to clusters, grids, and parallel production environments. However, the current commercial clouds have been built to support web and small database workloads, which are very different from typical scientific computing workloads. Moreover, the use of virtualization and resource time-sharing may introduce significant performance penalties for the demanding scientific computing workloads. In this work we analyze the performance of cloud computing services for scientific computing workloads. We quantify the presence in real scientific computing workloads of Many-Task Computing (MTC) users, that is, of users who employ loosely coupled applications comprising many tasks to achieve their scientific goals. Then, we perform an empirical evaluation of the performance of four commercial cloud computing services, including Amazon EC2, which is currently the largest commercial cloud. Last, we compare through trace-based simulation the performance characteristics and cost models of clouds and other scientific computing platforms, for general and MTC-based scientific computing workloads. Our results indicate that the current clouds need an order of magnitude in performance improvement to be useful to the scientific community, and show which improvements should be considered first to address this discrepancy between offer and demand.
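
As a minimal illustration of trace-based cost modeling under pay-as-you-go billing, the sketch below charges each task's runtime in whole instance-hours. The trace, the one-VM-per-task assumption, and the hourly price are made up and do not come from the paper.

# Minimal sketch of a trace-based cost estimate under hourly, pay-as-you-go
# billing; the trace, hourly price, and per-VM serialization are assumptions.
import math

trace_runtimes_s = [300, 4200, 900, 7300, 60]   # hypothetical task runtimes
PRICE_PER_VM_HOUR = 0.10                        # hypothetical on-demand price (USD)

# one VM per task, each billed in whole hours (classic hourly rounding)
billed_hours = sum(math.ceil(t / 3600) for t in trace_runtimes_s)
print(f"instance-hours: {billed_hours}, cost: ${billed_hours * PRICE_PER_VM_HOUR:.2f}")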
Achieving secure, scalable, and fine-grained data access control in cloud computing
In Proc. of IEEE INFOCOM, 2010
"... Abstract—Cloud computing is an emerging computing paradigm in which resources of the computing infrastructure are provided as services over the Internet. As promising as it is, this paradigm also brings forth many new challenges for data security and access control when users outsource sensitive dat ..."
Abstract
-
Cited by 133 (8 self)
- Add to MetaCart
(Show Context)
Cloud computing is an emerging computing paradigm in which resources of the computing infrastructure are provided as services over the Internet. As promising as it is, this paradigm also brings forth many new challenges for data security and access control when users outsource sensitive data for sharing on cloud servers, which are not within the same trusted domain as the data owners. To keep sensitive user data confidential against untrusted servers, existing solutions usually apply cryptographic methods by disclosing data decryption keys only to authorized users. However, in doing so, these solutions inevitably introduce a heavy computation overhead on the data owner for key distribution and data management when fine-grained data access control is desired, and thus do not scale well. The problem of simultaneously achieving fine-grainedness, scalability, and data confidentiality of access control remains unresolved. This paper addresses this challenging open issue by, on one hand, defining and enforcing access policies based on data attributes, and, on the other hand, allowing the data owner to delegate most of the computation tasks involved in fine-grained data access control to untrusted cloud servers without disclosing the underlying data contents. We achieve this goal by exploiting and uniquely combining techniques of attribute-based encryption (ABE), proxy re-encryption, and lazy re-encryption. Our proposed scheme also has salient properties of user access privilege confidentiality and user secret key accountability. Extensive analysis shows that our proposed scheme is highly efficient and provably secure under existing security models.
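
The access structures mentioned in this abstract can be pictured with a small sketch that only checks whether a user's attribute set satisfies a policy of AND/OR gates over attributes. The cryptographic machinery (ABE, proxy re-encryption, lazy re-encryption) that enforces this without a trusted server is omitted, and the policy format and attribute names are assumptions.

# Sketch of the attribute/policy matching that ABE enforces cryptographically;
# encryption itself is omitted and the policy format here is an assumption.
def satisfies(policy, attrs: set) -> bool:
    """policy is a nested tuple: ('AND', p1, p2, ...), ('OR', ...), or an attribute string."""
    if isinstance(policy, str):
        return policy in attrs
    op, *subs = policy
    combine = all if op == "AND" else any
    return combine(satisfies(p, attrs) for p in subs)

policy = ("AND", "role:doctor", ("OR", "dept:cardiology", "dept:oncology"))
print(satisfies(policy, {"role:doctor", "dept:cardiology"}))   # True
print(satisfies(policy, {"role:nurse", "dept:cardiology"}))    # False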
Controlling Data in the Cloud: Outsourcing Computation without Outsourcing Control
In ACM Cloud Computing Security Workshop (CCSW), 2009
"... ABSTRACT Cloud computing is clearly one of today's most enticing technology areas due, at least in part, to its cost-efficiency and flexibility. However, despite the surge in activity and interest, there are significant, persistent concerns about cloud computing that are impeding momentum and ..."
Abstract
-
Cited by 125 (4 self)
- Add to MetaCart
(Show Context)
Cloud computing is clearly one of today's most enticing technology areas due, at least in part, to its cost-efficiency and flexibility. However, despite the surge in activity and interest, there are significant, persistent concerns about cloud computing that are impeding momentum and will eventually compromise the vision of cloud computing as a new IT procurement model. In this paper, we characterize the problems and their impact on adoption. In addition, and equally importantly, we describe how the combination of existing research thrusts has the potential to alleviate many of the concerns impeding adoption. In particular, we argue that with continued research advances in trusted computing and computation-supporting encryption, life in the cloud can be advantageous from a business intelligence standpoint over the isolated alternative that is more common today.
Minimizing electricity cost: Optimization of distributed internet data centers in a multi-electricity-market environment
In Proc. of INFOCOM, 2010
"... Abstract—The study of Cyber-Physical System (CPS) has been an active area of research. Internet Data Center (IDC) is an important emerging Cyber-Physical System. As the demand on Internet services drastically increases in recent years, the power used by IDCs has been skyrocketing. While most existin ..."
Abstract
-
Cited by 98 (9 self)
- Add to MetaCart
(Show Context)
The study of Cyber-Physical Systems (CPS) has been an active area of research, and the Internet Data Center (IDC) is an important emerging cyber-physical system. As the demand for Internet services has drastically increased in recent years, the power used by IDCs has been skyrocketing. While most existing research focuses on reducing the power consumption of IDCs, the power management problem of minimizing the total electricity cost has been overlooked. This is an important problem faced by service providers, especially in the current multi-electricity-market environment, where the price of electricity may exhibit time and location diversity. Further, for these service providers, guaranteeing quality of service (i.e., service level objectives, or SLOs), such as service delay guarantees to end users, is of paramount importance. This paper studies the problem of minimizing the total electricity cost across multiple electricity markets while guaranteeing quality of service, exploiting the location and time diversity of electricity prices. We model the problem as a constrained mixed-integer program and propose an efficient solution method. Extensive evaluations based on real-life electricity price data for multiple IDC locations illustrate the efficiency and efficacy of our approach.
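
A simplified formulation in the spirit of this abstract (not the paper's exact model) might read as follows, where m_{i,t} is the number of active servers at IDC i in time slot t, lambda_{i,t} the request rate routed there, p_{i,t} the local electricity price, P_srv the per-server power draw, tau the slot length, mu the per-server service rate, Lambda_t the total arrival rate, D the delay bound, and M_i the servers available at IDC i; the M/M/1-style delay constraint is an assumption.

% Simplified sketch of the kind of constrained mixed-integer program described;
% symbols and the M/M/1-style delay constraint are assumptions.
\begin{align*}
\min_{m_{i,t}\in\mathbb{Z}_{\ge 0},\ \lambda_{i,t}\ge 0}\quad
  & \sum_{t=1}^{T}\sum_{i=1}^{N} p_{i,t}\, P_{\mathrm{srv}}\, m_{i,t}\,\tau
  && \text{(electricity cost over all IDCs and slots)}\\
\text{s.t.}\quad
  & \sum_{i=1}^{N}\lambda_{i,t} = \Lambda_t
  && \text{(all arriving requests are served)}\\
  & \frac{1}{m_{i,t}\mu - \lambda_{i,t}} \le D,\qquad \lambda_{i,t} < m_{i,t}\mu
  && \text{(per-IDC delay SLO)}\\
  & m_{i,t} \le M_i
  && \text{(servers available at IDC } i\text{)}
\end{align*}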
Secure Ranked Keyword Search over Encrypted Cloud Data
"... As Cloud Computing Becomes Prevalent information are being increasingly centralized into the cloud. For the protection of data privacy, sensitive data has to be encrypted before outsourcing, which makes effective data utilization a very challenging task. Although traditional searchable encryption sc ..."
Abstract
-
Cited by 80 (7 self)
- Add to MetaCart
As cloud computing becomes prevalent, information is being increasingly centralized into the cloud. To protect data privacy, sensitive data has to be encrypted before outsourcing, which makes effective data utilization a very challenging task. Although traditional searchable encryption schemes allow users to securely search over encrypted data through keywords, these techniques support only Boolean search, without capturing any relevance of the data files. This approach suffers from two main drawbacks when directly applied in the context of cloud computing. On the one hand, users, who do not necessarily have pre-knowledge of the encrypted cloud data, have to post-process every retrieved file in order to find the ones best matching their interest; on the other hand, invariably retrieving all files containing the queried keyword incurs unnecessary network traffic, which is undesirable in today's pay-as-you-use cloud paradigm. In this ...
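
The ranked-retrieval motivation can be illustrated with a plain-text sketch: score matching files with a TF x IDF relevance measure and return only the top-k, rather than every file containing the keyword. The corpus and scoring below are hypothetical, and the scheme in the abstract performs this ranking over encrypted indexes and scores, which this sketch does not attempt.

# Sketch of ranked (top-k) keyword retrieval over an inverted index; the
# paper performs this over *encrypted* scores, which is omitted here.
import math
from collections import defaultdict

docs = {  # hypothetical outsourced files
    "f1": "cloud storage audit cloud",
    "f2": "cloud search ranked keyword",
    "f3": "keyword search search search",
}

index = defaultdict(dict)                 # keyword -> {file_id: term frequency}
for fid, text in docs.items():
    for w in text.split():
        index[w][fid] = index[w].get(fid, 0) + 1

def top_k(keyword: str, k: int = 2):
    postings = index.get(keyword, {})
    idf = math.log(1 + len(docs) / (1 + len(postings)))
    scored = {fid: tf * idf for fid, tf in postings.items()}   # TF x IDF relevance
    return sorted(scored, key=scored.get, reverse=True)[:k]    # only top-k returned

print(top_k("search"))   # ['f3', 'f2'] -- avoids shipping every matching file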
The little engine(s) that could: scaling online social networks
In ACM SIGCOMM Conference, 2010
"... The difficulty of scaling Online Social Networks (OSNs) has introduced new system design challenges that has often caused costly re-architecting for services like Twitter and Facebook. The complexity of interconnection of users in social networks has introduced new scalability challenges. Convention ..."
Abstract
-
Cited by 76 (5 self)
- Add to MetaCart
(Show Context)
The difficulty of scaling Online Social Networks (OSNs) has introduced new system design challenges that have often caused costly re-architecting for services like Twitter and Facebook. The complexity of interconnection among users in social networks has introduced new scalability challenges. Conventional vertical scaling by resorting to full replication can be a costly proposition. Horizontal scaling by partitioning and distributing data among multiple servers, e.g., using DHTs, can lead to costly inter-server communication. We design, implement, and evaluate SPAR, a social partitioning and replication middleware that transparently leverages the social graph structure to achieve data locality while minimizing replication. SPAR guarantees that for all users in an OSN, their direct neighbors' data is co-located on the same server. The gains from this approach are multi-fold: application developers can assume local semantics, i.e., develop as they would for a single server; scalability is achieved by adding commodity servers with low memory and network I/O requirements; and redundancy is achieved at a fraction of the cost. We detail our system design and an evaluation based on datasets from Twitter, Orkut, and Facebook, with a working implementation. We show that SPAR incurs minimal overhead, can help a well-known open-source Twitter clone reach Twitter's scale without changing a line of its application logic, and achieves higher throughput than Cassandra, Facebook's DHT-based key-value store database.
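
SPAR's co-location guarantee can be sketched as follows: given some master placement of users on servers, every edge that crosses servers forces a replica of each endpoint on the other side so that reads stay local. The toy graph, the random master assignment, and the two servers below are illustrative only; SPAR itself optimizes the placement to minimize such replicas.

# Sketch of SPAR's co-location invariant: every user's neighbours must have
# at least a replica on that user's master server. Graph and placement are toy data.
import random
from collections import defaultdict

edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "e")]
users = {u for e in edges for u in e}
SERVERS = ["s1", "s2"]

random.seed(0)
master = {u: random.choice(SERVERS) for u in users}   # naive placement (SPAR optimizes this)

# for each edge whose endpoints live on different servers, each side needs
# a replica of the other so reads stay local
replicas = defaultdict(set)
for u, v in edges:
    if master[u] != master[v]:
        replicas[master[u]].add(v)
        replicas[master[v]].add(u)

for s in SERVERS:
    print(s, "masters:", sorted(u for u in users if master[u] == s),
          "replicas:", sorted(replicas[s]))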