
Further More Investigations on Evolution of Approaches for Cloud Security

R.S.Venkatesh1, P.K.Reejeesh2, Prof.S.Balamurugan3, S.Charanyaa4
  1,2,3 Department of IT, Kalaignar Karunanidhi Institute of Technology, Coimbatore, TamilNadu, India
  4 Senior Software Engineer (Mainframe Technologies), formerly of Larsen & Toubro (L&T) Infotech, Chennai, TamilNadu, India

Abstract

This paper reviews methods developed for anonymizing data from 2007 to 2012. Publishing microdata such as census or patient data for extensive research and other purposes is an important problem area being focused on by government agencies and other social associations. The literature survey reveals that the traditional approach of eliminating uniquely identifying fields such as social security numbers from microdata still results in disclosure of sensitive data. The k-anonymization optimization algorithm seems promising and powerful in certain cases, but carries the restriction that optimized k-anonymity is NP-hard, leading to severe computational challenges. k-anonymity also faces the problems of the homogeneity attack and the background knowledge attack. The notion of l-diversity, proposed in the literature to address these issues, poses a number of constraints of its own: it proves inefficient at preventing attribute disclosure (the skewness attack and the similarity attack), is difficult to achieve, and may not provide sufficient privacy protection for the sensitive attribute across an equivalence class, although it can substantially improve privacy compared with information disclosure limitation techniques such as sampling, cell suppression, rounding, data swapping, and perturbation. This paper aims to discuss an efficient anonymization approach that partitions the microdata into equivalence classes, minimizes closeness by kernel smoothing, and determines earth mover distances by controlling the distribution pattern of the sensitive attribute in the microdata, while also maintaining diversity.

Keywords

Data Anonymization, Microdata, k-anonymity, Identity Disclosure, Attribute Disclosure, Diversity

I. INTRODUCTION

The need for publishing sensitive data to the public has grown tremendously in recent years. Although such publishing is in demand, there is a restriction: published social network data should not disclose the private information of individuals. Hence protecting the privacy of individuals while ensuring the utility of social network data becomes a challenging and interesting research topic. Considering a graphical model [35] in which each vertex carries a sensitive label, algorithms can be developed to publish non-tabular data without compromising the privacy of individuals. However, even after KDLD sequence generation [35], data represented in this graphical model remains susceptible to several attacks, such as the homogeneity attack, the background knowledge attack, the similarity attack, and many more. In this paper we investigate these attacks, the solutions proposed in the literature, and their efficiency.
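To make the homogeneity attack concrete: if every record in an equivalence class carries the same sensitive value, an adversary who locates a victim's class learns that value outright. A minimal sketch of the standard distinct l-diversity check (the toy records and helper name here are hypothetical, not taken from the surveyed work):

```python
# A minimal distinct l-diversity check over anonymized microdata.
# Records are (quasi-identifier tuple, sensitive value); records sharing
# a quasi-identifier tuple form an equivalence class. Values are invented.
records = [
    (("476**", "2*"), "Heart Disease"),
    (("476**", "2*"), "Heart Disease"),
    (("476**", "2*"), "Heart Disease"),   # homogeneous class: open to the attack
    (("4790*", ">40"), "Cancer"),
    (("4790*", ">40"), "Flu"),
]

def is_l_diverse(records, l):
    """True if every equivalence class holds at least l distinct sensitive values."""
    classes = {}
    for qid, sensitive in records:
        classes.setdefault(qid, set()).add(sensitive)
    return all(len(values) >= l for values in classes.values())

print(is_l_diverse(records, 2))  # False: the first class is homogeneous
```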

II. SECURITY SERVER IN THE CLOUD

Security issues arise as a result of the transfer of information between individual users. They are caused by computer viruses, worms, Trojan horses, and the like. Users can install anti-virus programs to counter these threats; such programs comprise activity monitoring programs, virus scanning programs, and integrity checking programs.
A user machine would need anti-malware software that is continuously updated, but this is difficult to sustain. The idea revealed in this paper provides secure browsing over the network for multiple users: security servers filter network traffic and remove untrusted code on the user's behalf.
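Of the three program types just listed, the integrity checker is the easiest to illustrate. A minimal sketch, assuming a hash-based baseline of watched files (the file layout and baseline format are assumptions made here):

```python
import hashlib
import json
import os

def file_digest(path):
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths, out="baseline.json"):
    """Record known-good digests for the watched files."""
    with open(out, "w") as f:
        json.dump({p: file_digest(p) for p in paths}, f)

def check_integrity(baseline="baseline.json"):
    """Return the watched files that are missing or whose digest changed."""
    with open(baseline) as f:
        known = json.load(f)
    return [p for p, digest in known.items()
            if not os.path.exists(p) or file_digest(p) != digest]
```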
There are three steps associated with the security server cluster.
i. A proxy request for the third-party content is sent to the third-party destination.
ii. The requested third-party content is retrieved from the third-party destination.
iii. The requested content is checked for untrusted code.
It is important to note that the security server cluster and the remote computer cluster communicate only over publicly accessible networks. Determining the derived content comprises:
i. Presenting value-added content.
ii. Adding the value-added content to the retrieved content.
A tunneling communications protocol can also be used to establish communication between the security server cluster and the remote computer cluster. Other protocols, such as the Point-to-Point Protocol (PPP), the Point-to-Point Tunneling Protocol (PPTP), and the Layer 2 Tunneling Protocol (L2TP), can be used as well.
Initially, the user connects to the Internet service provider and must then satisfy the user-server authentication process. The user sends a request to the remote security server for content belonging to a third-party destination. The remote security server forwards the content request through the IP authentication service, receives the requested content from the third-party destination, and checks it for untrusted code. The remote security server then requests and associates value-added content from the value-added content server. Finally, the remote security server sends the requested content to the user computer cluster.
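As a rough sketch of this flow (the scanner, its toy signature list, and the value-added payload are all assumptions made here, not details from the paper), the steps can be expressed as follows:

```python
import re
import urllib.request

# Toy signature list standing in for a real untrusted-code scanner.
UNTRUSTED_PATTERNS = [re.compile(rb"<script[^>]*>", re.IGNORECASE)]

def contains_untrusted_code(content: bytes) -> bool:
    return any(p.search(content) for p in UNTRUSTED_PATTERNS)

def strip_untrusted_code(content: bytes) -> bytes:
    for p in UNTRUSTED_PATTERNS:
        content = p.sub(b"", content)
    return content

def handle_user_request(url: str,
                        value_added: bytes = b"<!-- filtered by security server -->") -> bytes:
    # Steps i and ii: proxy the request and retrieve the third-party content.
    with urllib.request.urlopen(url) as resp:
        content = resp.read()
    # Step iii: check the retrieved content for untrusted code and filter it.
    if contains_untrusted_code(content):
        content = strip_untrusted_code(content)
    # Associate value-added content before delivery to the user computer cluster.
    return content + value_added
```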

III. CLOUD COMPUTING – ISSUES, RESEARCH AND IMPLEMENTATIONS

Cloud computing is built on the basis of “virtualized resources”. It includes cyber infrastructure and suggests a service-oriented architecture with improved flexibility and reduced cost. Cyber infrastructure makes a range of applications possible within a fixed budget and conditions, and also improves the efficiency, quality, and reliability of shared services.
Technology should be designed so that it provides good end-user productivity and decreased technology-driven overhead. One such architecture is the Service-Oriented Architecture (SOA) of computing; remote procedure calls and grid computing are examples of network-based SOA. The component-based method possesses the following features:
1. Reusability
2. Sustainability
3. Extensibility
4. Scalability
5. Customizability
6. Compose-ability
7. Reliability
8. Availability
9. Security
A workflow is a graph representing the connections between loosely and tightly coupled components; it reveals an integrated representation of service-based activities. Virtualization is now combined with wireless, highly distributed, and pervasive computing because it provides more recent and complex IT resources.
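As a small illustration of a workflow as a dependency graph over components (the component names, and the choice of Python's graphlib, are assumptions made here rather than details from the paper):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# A hypothetical service workflow: each component maps to the
# components it depends on, forming a directed acyclic graph.
workflow = {
    "ingest":    [],
    "transform": ["ingest"],
    "analyze":   ["transform"],
    "report":    ["analyze", "transform"],  # coupled to two upstream services
}

# Run the components in an order that respects the dependency graph.
for component in TopologicalSorter(workflow).static_order():
    print(f"running service component: {component}")
```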
A solution should satisfy end-user requirements across categories of users. User categories include the service user, the service author, and the cyber infrastructure (CI) developer, as well as domain-specific groups, policy makers, and stakeholders. A cyber infrastructure developer maintains and develops the cloud structure, combining system hardware, interfaces, networks, storage, administration, communications, management software, service authoring tools, scheduling algorithms, workflow generation, and resource access algorithms. Service authors are responsible for producing separate base-line images and services. Every user initially starts from a base-line image as soon as he receives the right to produce an image; an author can also produce an image if it requires resource sharing with another user. To produce images, the cloud provides supporting structures such as image creation tools, image and service management tools, service brokers, service registration and discovery tools, security tools, provenance collection tools, resource mapping tools, license management tools, fault tolerance, etc.
End-users need on-time delivery of services, easy-to-use interfaces, collaborative support, information about services, and so on. The complexity of the task, the required schedules, and the resource conditions decide the service distribution. The Virtual Computing Laboratory (VCL) is an open-source system used for the secure implementation of clouds, and cloud computing implementations using VCL have led to much research. VCL is used for implementing complex control images of resources in complex environments. Some of the major drawbacks seen in the cloud are:
1. Management of metadata
2. Collection of Provenance information
3. Permanent Storage
4. Logical representation of information to the user
5. Optimization of images
6. Image portability
7. Implication of image format
Considerable solutions exist for the above problems, but they are still under research to produce a secure way of sharing resources.
Clouds are most commonly used in virtualization, distributed computing, utility computing, and networking, and research is ongoing to include additional features in VCL technology.

IV. 2010

Through this paper, the authors discuss security issues for cloud computing and present a model for secure clouds focused on two layers, namely the storage layer and the data layer. The main contribution is a scheme used for securing third-party document publication on the cloud. The paper also focuses on secure federated query processing and Hadoop, and the final work implements XACML for Hadoop, a major aspect of secure cloud computing. The security issues can be classified into middleware, storage, data, network, and application issues. The main challenge discussed is to have secure transactions even if some part of the cloud fails.

V. 2011(I)

According to the authors, cloud computing is an architecture of a distributed nature which serves and provides resources on demand. The paper focuses on the various models of cloud deployment and the security issues present in the cloud computing industry. The main security issue people face is the scaling of data. Cloud providers believe that encryption is the only means of security, but it is not; trust plays an important role and will always remain an important factor. In addition, there are external security standards, such as ISO 27001, against which a company can be audited to ensure compliance once it conforms to the standard. This is an added advantage.

VI. 2012(A)

Cloud computing has become an important concept and technology in many organizations. It provides many utilities at low cost. Security in cloud computing is an important factor, since users store very sensitive information. The issue matters for a single cloud, but for multi-clouds the security constraints have even more impact. Through this paper, the author surveys and proposes new research related to both single and multi-clouds, with possible solutions. The paper focuses on the ability of multi-clouds to reduce the security risk that affects the user: the growth of cloud computing has been rapid, but security is still a major issue. The proposed work surveys recent research on single and multi-clouds and their security issues, and aims to ensure the security of multi-clouds, which have an inherent ability to reduce risk.

VII. 2012(I)

Through this paper, the author proposes a concise yet very detailed explanation of the data security and privacy preservation issues related to cloud computing at all stages. Given cloud security concerns, data security and privacy protection are the important factors to be considered before adopting the technology. The paper also discusses current solutions and future work regarding this issue. Although cloud computing has many advantages, there are also drawbacks that need to be solved; security of data and privacy protection remain the chief ones.

VIII. CONCLUSION AND FUTURE WORK

Various methods developed for anonymizing data from 2007 to 2012 have been discussed. Publishing microdata such as census or patient data for extensive research and other purposes is an important problem area being focused on by government agencies and other social associations. The literature survey reveals that the traditional approach of eliminating uniquely identifying fields such as social security numbers from microdata still results in disclosure of sensitive data. The k-anonymization optimization algorithm seems promising and powerful in certain cases, but optimized k-anonymity is NP-hard, leading to severe computational challenges, and k-anonymity faces the homogeneity attack and the background knowledge attack. The notion of l-diversity, proposed in the literature to address these issues, poses constraints of its own: it is inefficient at preventing attribute disclosure (the skewness attack and the similarity attack), is difficult to achieve, and may not provide sufficient privacy protection for the sensitive attribute across an equivalence class, although it can substantially improve privacy compared with information disclosure limitation techniques such as sampling, cell suppression, rounding, data swapping, and perturbation. The evolution of data anonymization techniques and data disclosure prevention techniques has been discussed in detail, and the application of data anonymization techniques to several spectra of data, such as trajectory data, has been depicted. This survey should promote many research directions in the area of database anonymization.
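As a rough illustration of the closeness measure mentioned above, the following sketch computes the earth mover's distance between an equivalence class's sensitive-attribute distribution and the overall table distribution, assuming an ordered attribute with equally spaced values as in the standard t-closeness formulation (the example distributions are invented):

```python
from itertools import accumulate

def emd_ordered(p, q):
    """Earth mover's distance between two distributions over the same
    ordered sensitive attribute, with adjacent values one unit apart
    and the result normalized by (m - 1), as in t-closeness."""
    assert len(p) == len(q)
    diffs = [pi - qi for pi, qi in zip(p, q)]
    return sum(abs(c) for c in accumulate(diffs)) / (len(p) - 1)

# Overall table distribution of an ordered attribute (e.g. salary bands)...
q = [0.2, 0.2, 0.2, 0.2, 0.2]
# ...versus a skewed equivalence class concentrated on low values.
p = [0.5, 0.5, 0.0, 0.0, 0.0]
print(emd_ordered(p, q))  # 0.375: a large distance flags a skewness risk
```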
 

References