Results 1 - 8 of 8
Data-provenance verification for secure hosts - IEEE Transactions on Dependable and Secure Computing, 2012
"... Abstract—Malicious software typically resides stealthily on a user’s computer and interacts with the user’s computing resources. Our goal in this work is to improve the trustworthiness of a host and its system data. Specifically, we provide a new mechanism that ensures the correct origin or provenan ..."
Abstract
-
Cited by 10 (4 self)
- Add to MetaCart
(Show Context)
Abstract—Malicious software typically resides stealthily on a user's computer and interacts with the user's computing resources. Our goal in this work is to improve the trustworthiness of a host and its system data. Specifically, we provide a new mechanism that ensures the correct origin or provenance of critical system information and prevents adversaries from utilizing host resources. We define data-provenance integrity as the security property stating that the source where a piece of data is generated cannot be spoofed or tampered with. We describe a cryptographic provenance verification approach for ensuring system properties and system-data integrity at the kernel level, and demonstrate it in two concrete applications: keystroke integrity verification and malicious traffic detection. Specifically, we first design and implement an efficient cryptographic protocol that enforces keystroke integrity by utilizing the on-chip Trusted Platform Module (TPM). The protocol prevents the forgery of fake key events by malware under reasonable assumptions. Then, we demonstrate our provenance verification approach by realizing a lightweight framework for restricting outbound malware traffic. This traffic-monitoring framework helps identify network activities of stealthy malware, and lends itself to a powerful personal firewall for examining all outbound traffic of a host that cannot be bypassed. Index Terms—Authentication, malware, cryptography, provenance, networking.
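The keystroke-provenance idea lends itself to a small illustration: a trusted producer (in the paper, a TPM-backed driver) attaches an unforgeable tag to every key event, and a kernel-level verifier discards events that arrive without a valid tag. The sketch below is only a software approximation under an assumed shared key; the function names and event format are hypothetical, and the real protocol roots its key in the TPM rather than in process memory.

```python
import hmac, hashlib, os, time

# Hypothetical illustration of provenance tagging for key events.
# A real deployment would root PROVENANCE_KEY in TPM-protected storage;
# here it is just an in-memory secret for the sketch.
PROVENANCE_KEY = os.urandom(32)

def sign_key_event(scancode: int, timestamp: float) -> dict:
    """Trusted producer (e.g., the keyboard driver) tags an event with a MAC."""
    msg = f"{scancode}:{timestamp}".encode()
    tag = hmac.new(PROVENANCE_KEY, msg, hashlib.sha256).hexdigest()
    return {"scancode": scancode, "timestamp": timestamp, "tag": tag}

def verify_key_event(event: dict) -> bool:
    """Consumer (e.g., the kernel input queue) accepts only correctly tagged events."""
    msg = f"{event['scancode']}:{event['timestamp']}".encode()
    expected = hmac.new(PROVENANCE_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, event["tag"])

genuine = sign_key_event(30, time.time())                              # produced by the trusted path
forged = {"scancode": 30, "timestamp": time.time(), "tag": "00" * 32}  # injected by malware
assert verify_key_event(genuine) and not verify_key_event(forged)
```

Because malware cannot produce a valid tag without the protected key, events it synthesizes fail verification and never reach the application, which is the property the paper's TPM-based protocol enforces.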
A Survey on Automated Dynamic Malware Analysis Techniques and Tools
"... Anti-virus vendors are confronted with a multitude of potential malicious samples today. Receiving thousands of new samples every single day is nothing uncommon. As the signatures that should detect the confirmed malicious threats are still mainly created manually, it is important to discriminate be ..."
Abstract
-
Cited by 3 (1 self)
- Add to MetaCart
(Show Context)
Anti-virus vendors are confronted with a multitude of potentially malicious samples today; receiving thousands of new samples every single day is not uncommon. As the signatures that detect confirmed malicious threats are still mainly created manually, it is important to discriminate between samples that pose a new unknown threat and those that are mere variants of known malware. This survey article provides an overview of dynamic-analysis techniques used to analyze potentially malicious samples. It also covers analysis programs that employ these techniques to assist a human analyst in assessing, in a timely and appropriate manner, whether a given sample deserves closer manual inspection due to its unknown malicious behavior.
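The triage problem the survey motivates (variant of a known family versus genuinely new behavior) can be pictured with a toy comparison of dynamic behavior profiles. The profile representation, the example API-call sets, and the similarity threshold below are invented for illustration and are not taken from the survey.

```python
# Illustrative triage of dynamic-analysis reports: a sample whose observed
# behavior (here, a set of API calls) closely matches a known family profile
# is treated as a variant; otherwise it is flagged for manual inspection.

def jaccard(a: set, b: set) -> float:
    """Similarity of two behavior profiles as set overlap."""
    return len(a & b) / len(a | b) if a | b else 1.0

known_families = {
    "family_x": {"CreateRemoteThread", "WriteProcessMemory", "connect", "send"},
}

def triage(sample_behavior: set, threshold: float = 0.8) -> str:
    for family, profile in known_families.items():
        if jaccard(sample_behavior, profile) >= threshold:
            return f"likely variant of {family}"
    return "unknown behavior: escalate to manual analysis"

print(triage({"CreateRemoteThread", "WriteProcessMemory", "connect", "send", "Sleep"}))
print(triage({"RegSetValueEx", "CreateService", "StartService"}))
```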
CloudFence: Data Flow Tracking as a Cloud Service
"... Abstract. The risk of unauthorized private data access is among the primary concerns for users of cloud-based services. For the common setting in which the infrastructure provider and the service provider are different, users have to trust their data to both parties, although they interact solely wi ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
(Show Context)
Abstract. The risk of unauthorized private data access is among the primary concerns for users of cloud-based services. For the common setting in which the infrastructure provider and the service provider are different, users have to trust their data to both parties, although they interact solely with the latter. In this paper we propose CloudFence, a framework for cloud hosting environments that provides transparent, fine-grained data tracking capabilities to both service providers and their users. CloudFence allows users to independently audit the treatment of their data by third-party services, through the intervention of the infrastructure provider that hosts these services. CloudFence also enables service providers to confine the use of sensitive data to well-defined domains, offering additional protection against inadvertent information leakage and unauthorized access. The results of our evaluation demonstrate the ease of incorporating CloudFence into existing real-world applications, its effectiveness in preventing a wide range of security breaches, and its modest performance overhead in real settings.
CloudFence: Enabling Users to Audit the Use of their Cloud-Resident Data, 2012
"... One of the primary concerns of users of cloud-based ser-vices and applications is the risk of unauthorized access to their private information. For the common setting in which the infrastructure provider and the online service provider are different, end users have to trust their data to both partie ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
(Show Context)
One of the primary concerns of users of cloud-based services and applications is the risk of unauthorized access to their private information. For the common setting in which the infrastructure provider and the online service provider are different, end users have to trust their data to both parties, although they interact solely with the service provider. This paper presents CloudFence, a framework that allows users to independently audit the treatment of their private data by third-party online services, through the intervention of the cloud provider that hosts these services. CloudFence is based on a fine-grained data flow tracking platform exposed by the cloud provider to both developers of cloud-based applications and their users. Besides data auditing for end users, CloudFence allows service providers to confine the use of sensitive data to well-defined domains using data tracking at arbitrary granularity, offering additional protection against inadvertent leaks and unauthorized access. The results of our experimental evaluation with real-world applications, including an e-store platform and a cloud-based backup service, demonstrate that CloudFence requires just a few changes to existing application code, while it can detect and prevent a wide range of security breaches, ranging from data leakage attacks using SQL injection to personal data disclosure due to missing or erroneously implemented access control checks.
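The core mechanism (tagging sensitive data at its entry point, propagating tags through derived values, and checking tags at output channels against per-user data-use domains) can be sketched in a few lines. The Tagged wrapper, the domain table, and the sink names below are hypothetical simplifications; CloudFence itself performs the tracking at the provider's platform level rather than in application code.

```python
# Minimal sketch of tag-based data flow tracking with domain confinement:
# data is tagged at entry, tags follow derived values, and a sink check
# blocks flows outside the owner's allowed data-use domains.

class Tagged(str):
    """A string carrying a set of data-owner tags."""
    def __new__(cls, value, tags=frozenset()):
        obj = super().__new__(cls, value)
        obj.tags = frozenset(tags)
        return obj

def concat(a, b):
    """Propagate tags through a derived value."""
    tags = getattr(a, "tags", frozenset()) | getattr(b, "tags", frozenset())
    return Tagged(str(a) + str(b), tags)

ALLOWED_SINKS = {"user42": {"billing-db"}}   # per-user data-use domains (hypothetical)

def send_to_sink(value, sink):
    for tag in getattr(value, "tags", frozenset()):
        if sink not in ALLOWED_SINKS.get(tag, set()):
            raise PermissionError(f"data tagged {tag!r} may not flow to {sink!r}")
    print(f"{sink} <- {value}")

card = Tagged("4111-1111-1111-1111", tags={"user42"})
record = concat("order#77:", card)
send_to_sink(record, "billing-db")       # inside the user's allowed domain
# send_to_sink(record, "analytics")      # would raise: outside the allowed domain
```

An audit view for the end user would simply replay which sinks received values carrying that user's tag, which is the auditing capability the abstract describes.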
SHRIFT: System-wide HybRid Information Flow Tracking
"... Abstract. Using data flow tracking technology, one can observe how data flows from inputs (sources) to outputs (sinks) of a software system. It has been proposed [1] to do runtime data flow tracking at various layers simultaneously (operating system, application, data base, window man-ager, etc.), a ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract. Using data flow tracking technology, one can observe how data flows from inputs (sources) to outputs (sinks) of a software system. It has been proposed [1] to do runtime data flow tracking at various layers simultaneously (operating system, application, database, window manager, etc.), and to connect the monitors' observations, exploiting semantic information about the layers to make analyses more precise. This has implications on performance (multiple monitors running in parallel) and on methodology (there needs to be one dedicated monitor per layer). We address both aspects of the problem. We replace a runtime monitor at a layer L by its statically computed input-output dependencies. At runtime, these relations are used by monitors at other layers to model flows of data through L, thus allowing cross-layer, system-wide tracking. We achieve this in three steps: (1) static analysis of the application at layer L, (2) instrumentation of the application's source and sink instructions, and (3) runtime execution of the instrumented application in combination with monitors at other layers. The result allows for system-wide tracking of data dissemination, across and through multiple applications. We implement our solution at the Java bytecode level and connect it to a runtime OS-level monitor. In terms of precision and performance, we outperform binary-level approaches and can exploit high-level semantics.
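The replacement of a per-layer runtime monitor by a static summary can be illustrated with a toy cross-layer propagation step: an OS-level monitor that only sees the application's sources and sinks consults a precomputed sink-to-source dependency map to decide which taint labels reach each sink. The dependency map, source/sink names, and taint store below are invented for this sketch and do not reflect SHRIFT's actual Java bytecode instrumentation.

```python
# Sketch of the hybrid idea: instead of monitoring the application at runtime,
# use its statically computed source->sink dependencies so an OS-level monitor
# can propagate taint "through" the application layer.

# Static analysis result for the (unmonitored) application layer:
# for each sink instruction, the source instructions it may depend on.
STATIC_DEPS = {
    "write_socket": {"read_config", "read_user_file"},
    "write_log":    {"read_config"},
}

# OS-level monitor's view of which sources produced tainted data.
os_taint = {"read_user_file": {"secret"}}

def taint_at_sink(sink: str) -> set:
    """Cross-layer step: union the taint of all sources the sink may depend on."""
    labels = set()
    for source in STATIC_DEPS.get(sink, set()):
        labels |= os_taint.get(source, set())
    return labels

print(taint_at_sink("write_socket"))  # {'secret'}: data flows through the app to the network
print(taint_at_sink("write_log"))     # set(): no tainted source reaches the log
```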
Dataflow Tomography: Information Flow Tracking For Understanding and Visualizing Full Systems
"... It is not uncommon for modern systems to be composed of a variety of interacting services, running across multiple machines in such a way that most developers do not really understand the whole system. As abstraction is layered atop abstraction, developers gain the ability to compose systems of extr ..."
Abstract
- Add to MetaCart
It is not uncommon for modern systems to be composed of a variety of interacting services, running across multiple machines in such a way that most developers do not really understand the whole system. As abstraction is layered atop abstraction, developers gain the ability to compose systems of extraordinary complexity with relative ease. However, many software properties, especially those that cut across abstraction layers, become very difficult to understand in such compositions. The communication patterns involved, the privacy of critical data, and the provenance of information can be difficult to find and understand, even with access to all of the source code. The goal of Dataflow Tomography is to use the inherent information flow of such systems to help visualize the interactions between complex and interwoven components across multiple layers of abstraction. In the same way that the injection of short-lived radioactive isotopes helps doctors trace problems in the cardiovascular system, the use of "data tagging" can help developers slice through the extraneous layers of software and pinpoint those portions of the system interacting with the data of interest. To demonstrate the feasibility of this approach we have developed a prototype system in which tags are tracked both through the machine and between machines over the network, and from which novel visualizations of the whole system can be derived. We describe the system-level challenges in creating a working system tomography tool and we qualitatively evaluate our system by examining several ...
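A toy version of the data-tagging idea is shown below: a tag attached to the data of interest travels with every message derived from it, so the path the data takes through a multi-machine composition can be reconstructed afterwards. The JSON message format and component names are made up for illustration; the actual prototype tracks tags through the machine and across the network at the system level rather than via application-level message fields.

```python
# Toy illustration of "data tagging": tags ride along with every message
# derived from the tagged data, so a later observer can reconstruct which
# components touched it and in what order.

import json

def make_message(payload: str, tags, hop: str, trace):
    """Wrap a payload with its tags and the list of components seen so far."""
    return json.dumps({"payload": payload, "tags": list(tags), "trace": list(trace) + [hop]})

def forward(message: str, hop: str) -> str:
    """A component forwards the message, preserving tags and extending the trace."""
    m = json.loads(message)
    return make_message(m["payload"], m["tags"], hop, m["trace"])

msg = make_message("SELECT * FROM users", {"customer-record"}, "web-frontend", [])
msg = forward(msg, "app-server")
msg = forward(msg, "database")

print(json.loads(msg)["trace"])   # ['web-frontend', 'app-server', 'database']
```

A visualization step would then render these per-tag traces as a graph of components and flows, which is what the tomography metaphor refers to.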
Data-Provenance Verification For Secure Hosts
"... Abstract—Malicious software typically resides stealthily on a user’s computer and interacts with the user’s com-puting resources. Our goal in this work is to improve the trustworthiness of a host and its system data. Specifically, we provide a new mechanism that ensures the correct origin or provena ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract—Malicious software typically resides stealthily on a user's computer and interacts with the user's computing resources. Our goal in this work is to improve the trustworthiness of a host and its system data. Specifically, we provide a new mechanism that ensures the correct origin or provenance of critical system information and prevents adversaries from utilizing host resources. We define data-provenance integrity as the security property stating that the source where a piece of data is generated cannot be spoofed or tampered with. We describe a cryptographic provenance verification approach for ensuring system properties and system-data integrity at the kernel level. We demonstrate two concrete applications: keystroke integrity verification and malicious traffic detection. Specifically, we first design and implement an efficient cryptographic protocol that enforces keystroke integrity by utilizing the on-chip Trusted Platform Module (TPM). The protocol prevents the forgery of fake key events by malware under reasonable assumptions. Then, we demonstrate our provenance verification approach by realizing a lightweight framework for restricting outbound malware traffic. This traffic-monitoring framework helps identify network activities of stealthy malware, and lends itself to a powerful personal firewall for examining all outbound traffic of a host, which cannot be bypassed.
A Survey on Automated Dynamic Malware-Analysis Techniques and Tools, 2012
"... Anti-virus vendors are confronted with a multitude of potentially malicious samples today. Receiving thou-sands of new samples every day is not uncommon. The signatures that detect confirmedmalicious threats are mainly still created manually, so it is important to discriminate between samples that p ..."
Abstract
- Add to MetaCart
Anti-virus vendors are confronted with a multitude of potentially malicious samples today. Receiving thousands of new samples every day is not uncommon. The signatures that detect confirmed malicious threats are mainly still created manually, so it is important to discriminate between samples that pose a new unknown threat and those that are mere variants of known malware. This survey article provides an overview of techniques based on dynamic analysis that are used to analyze potentially malicious samples. It also covers analysis programs that employ these techniques to assist human analysts in assessing, in a timely and appropriate manner, whether a given sample deserves closer manual inspection due to its unknown malicious behavior.