Tuesday, 3 January 2017

Analyse the Encryption and Decryption Conversion Time of Various Algorithms on Different Settings of Dataset

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:Analyse the Encryption and Decryption Conversion Time of Various Algorithms on Different Settings of Dataset
Author Name:A. Usha and A. Subramani 
Synopsis:
Security is one of the most challenging aspects of network communication systems, where data exchange commonly takes place between networked computers, wireless phones, and other internet-based electronic devices. Cryptography is one of the primary means of computer security: it converts information from its ordinary form into an unreadable form by using encryption and decryption techniques, and the encryption/decryption process plays a vital role in securing wireless networks. Unsecured data is exposed to many kinds of risk, whereas cryptographic techniques ensure that only an authorized person is able to open and read the original message. Various cryptographic techniques have been developed to achieve security in network communication systems. There are essentially two types of cryptographic techniques, viz. symmetric key and asymmetric key. In symmetric-key techniques, the same key is used for both encryption and decryption. Asymmetric-key techniques are used to solve the key-distribution problem: two distinct keys are used, a public key for encryption and a private key for decryption. This paper compares cryptographic algorithms (DES, AES, Blowfish, RSA, RC6, RC4, and TripleDES) on different settings of data packets. A comparative analysis of these algorithms is carried out with respect to different data block sizes, different key sizes, and encryption/decryption speed. Various symmetric and asymmetric algorithms have been proposed for this purpose, each with its own advantages and disadvantages. This paper is a review of improved security-based algorithms in network security.
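The kind of encryption/decryption timing comparison the paper describes can be sketched as a simple benchmark harness. The sketch below is illustrative only: since the Python standard library ships none of the listed ciphers, a toy XOR stream transform stands in for DES, AES, Blowfish, etc.; a real study would substitute calls to a cryptographic library.

```python
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream transform, used only as a stand-in for a real cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def benchmark(cipher, data: bytes, key: bytes) -> dict:
    """Time one encryption pass and one decryption pass over `data`."""
    t0 = time.perf_counter()
    ciphertext = cipher(data, key)
    t_enc = time.perf_counter() - t0
    t0 = time.perf_counter()
    plaintext = cipher(ciphertext, key)  # XOR is its own inverse
    t_dec = time.perf_counter() - t0
    assert plaintext == data
    return {"encrypt_s": t_enc, "decrypt_s": t_dec}

# Repeat the measurement over data blocks of increasing size, as the
# paper does for its algorithm comparison.
results = {size: benchmark(xor_cipher, b"x" * size, b"secret-key")
           for size in (1_000, 10_000, 100_000)}
```

Timing each algorithm over the same set of block sizes and key sizes is what makes the per-algorithm conversion times directly comparable.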

Linear Regression Analysis and Validation Studies of Insulin-Like Growth Factor (IGF-1) Receptor Inhibitors

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:Linear Regression Analysis and Validation Studies of Insulin-Like Growth Factor (IGF-1) Receptor Inhibitors
Author Name:R. Rambabu and P. Srinivasa Rao
Synopsis:
To delineate the dependency of physico-chemical properties on the activity of inhibitors, a Python program was written to carry out linear regression analysis on a set of Insulin-like Growth Factor-1 receptor (IGF-1R) inhibitors. About 87 IGF-1 receptor inhibitors were selected from published literature, along with independent variables such as molecular weight, hydrogen-bond donors and acceptors, logP, number of freely rotatable bonds, 5- and 6-membered aromatic rings, Randic and Balaban indices, LUMO, HOMO, dipole, lipole, and molar refractivity. The relationship between the dependent variable (log 1/IC50) and the independent variables was established by linear regression analysis using Python code. The linear regression analysis resulted in an F-test of 8.812, an r value of 0.794, and an r^2 value of 0.631. Inter-correlation between the variables of the proposed regression model was checked to verify their independence. It was observed that compounds 13, 53, and 63 have standardized residuals of -2.011, 2.309, and 2.227, respectively, and can safely be excluded from the data set. Leverages and standardized residuals gave similar outcomes, but leverage analysis was able to find three more outlying data points; finally, 6 compounds were omitted from the data set of 87 compounds, and the remaining 81 were divided into a 76-molecule training set and a 5-molecule validation set. When Equation (2) was applied to the test set (actual vs. predicted values), the regression coefficient (r^2) obtained was 0.9686, and the regression coefficient through the origin (r^2_0), when plotted, was 0.9326, within the limits. The regression plot between actual vs. predicted values of the compounds, and vice versa, for the test data set showed r^2 = 0.9686 and r^2_0 = 0.9465, which suggests the predictive ability of the regression equation.
From the analysis of the test set, the regression equation is said to have predictive ability, with R^2_cv,ext = 0.99, R^2 = 0.97, (R^2 - R^2_0)/R^2 = 0.03, (R^2 - R'^2_0)/R^2 = 0.02, k = 1.01, and k' = 0.99, respectively.
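The core regression machinery the paper describes can be sketched in plain Python. This is a minimal one-predictor sketch (the paper fits many descriptors at once); it shows how the slope, intercept, r^2, and the standardized residuals used for outlier screening are obtained.

```python
def linear_regression(xs, ys):
    """Ordinary least squares for one predictor; returns slope, intercept, r^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

def standardized_residuals(xs, ys, slope, intercept):
    """Residuals divided by their standard deviation; |value| > 2 flags outliers."""
    res = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    mean = sum(res) / len(res)
    sd = (sum((r - mean) ** 2 for r in res) / (len(res) - 1)) ** 0.5
    return [r / sd for r in res]
```

Compounds whose standardized residual exceeds 2 in magnitude (such as 13, 53, and 63 above) would be candidates for exclusion before refitting.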

Achieving High Accuracy in an Attack-Path Reconstruction in Marking on Demand Scheme

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:Achieving High Accuracy in an Attack-Path Reconstruction in Marking on Demand Scheme
Author Name:P. Banu Prakash and E.S.Phalguna Krishna
Synopsis:
Distributed Denial-of-Service (DDoS) attacks are one of the major threats to the Internet today. The sources of DDoS attacks can be recognized from the traffic they generate using the IP traceback technique [5]. In general, only a limited number of routers and computers are involved in an attack session. Therefore, only the involved nodes need to be marked for traceback purposes, rather than every node of the Internet, as the existing schemes do. Based on this finding, a novel Marking on Demand (MOD) scheme based on the DDPM mechanism is available to dynamically distribute marking IDs in both the temporal and space dimensions. The available MOD scheme can trace back to all probable sources of DDoS attacks, which is not feasible for the existing DDPM schemes. However, the existing MOD framework needs to be extended, since it suffers from both false positives and false negatives. This paper aims to extend the existing MOD scheme by using a 32-bit marking field in order to reduce these shortcomings and to avoid the problem of packet fragmentation caused by the increased marking length.
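A 32-bit marking field has to be split between a router marking ID and ancillary data; the exact layout is not given in the synopsis, so the sketch below assumes a hypothetical split of 26 bits for the marking ID and 6 bits for a hop distance, purely to illustrate the packing arithmetic.

```python
import struct

ID_BITS, DIST_BITS = 26, 6  # hypothetical layout, not the paper's actual one

def pack_mark(marking_id: int, distance: int) -> bytes:
    """Pack a marking ID and hop distance into a 32-bit network-order field."""
    assert marking_id < (1 << ID_BITS) and distance < (1 << DIST_BITS)
    return struct.pack("!I", (marking_id << DIST_BITS) | distance)

def unpack_mark(field: bytes):
    """Recover (marking_id, distance) from the packed 32-bit field."""
    value, = struct.unpack("!I", field)
    return value >> DIST_BITS, value & ((1 << DIST_BITS) - 1)
```

Fitting the whole mark into one 32-bit field is what avoids spilling marking data into headers that would force fragmentation.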

Effective Bug Triage with Software Data Reduction Techniques Using Clustering Mechanism

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:Effective Bug Triage with Software Data Reduction Techniques Using Clustering Mechanism
Author Name:R.Nishanth Prabhakar and K.S. Ranjith
Synopsis:
Bug triage is the most important step in handling the bugs which occur during a software process. In the manual bug triaging process, each received bug is assigned to a tester or a developer by a triager. When bugs are received in huge numbers, it is difficult to carry out the manual triaging process, and it consumes many resources, both in man-hours and in cost; hence, there is a need to reduce this expenditure of resources. A mechanism is therefore proposed which facilitates a better and more efficient triaging process by reducing the size of the bug data sets. The mechanism involves clustering techniques and selection techniques. This approach proved more efficient than the manual bug triaging process when evaluated on bug data sets retrieved from the open-source bug repository Bugzilla.

A Prevention Approach to Counter DDoS Attacks in Application Layer

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:A Prevention Approach to Counter DDoS Attacks in Application Layer
Author Name:A. Siva Kumar and M.Ganesh Karthik
Synopsis:
A Distributed Denial of Service (DDoS) attack involves sending an enormous volume of requests to a server or site, which becomes unable to handle them, so that the server remains offline for some time depending on the attack. A flooding attack is one type of DDoS attack. For the detection of DDoS flooding attacks, one approach uses a multi-dimensional sketch design integrated with Hellinger Distance (HD) analysis. HD analysis can be used to monitor the network flows. The sketch technique can be used to detect the attack and has the capability of selectively discarding only the offending messages, though not blocking the attack source. This approach applies mainly to VoIP networks, but not to all types of attacks. To prevent DDoS attacks, one of the metrics used is the New Cracking Algorithm. In this algorithm, whenever a client accesses a particular web resource more times than a given threshold, the client's IP address is added to a blocked list as an attacker, and security is applied by using a Message Authentication Code (MAC) over the client's IP address. This algorithm can thus protect the web resource even against large volumes of DDoS attack traffic.
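The Hellinger Distance at the heart of the detection approach is a standard measure between two discrete probability distributions; a minimal sketch follows. Flow monitoring would compare the traffic-attribute distribution of a training window against the current window and raise an alert when the distance exceeds a threshold (the threshold choice here is illustrative).

```python
from math import sqrt

def hellinger(p, q):
    """Hellinger distance between two discrete distributions (0 = identical, 1 = disjoint)."""
    return sqrt(sum((sqrt(pi) - sqrt(qi)) ** 2 for pi, qi in zip(p, q))) / sqrt(2)

# Example: distribution of request types in a training window vs. the
# current window; a flood skews the distribution and raises the distance.
training = [0.70, 0.20, 0.10]
current  = [0.20, 0.05, 0.75]
alert = hellinger(training, current) > 0.3  # illustrative threshold
```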

Design and Analysis of Secure and Efficient Image with Embedded Sensitive Information Transferring Technique using Blowfish Algorithm

Vol. 5  Issue 3
Year:2016
Issue:Jun-Aug
Title:Design and Analysis of Secure and Efficient Image with Embedded Sensitive Information Transferring Technique using Blowfish Algorithm
Author Name:Kaki Leela Prasad, P. Anusha , G. Jyothi and K. Dileepkumar
Synopsis:
The internet is now used in all aspects of life, such as railway reservations, air ticketing, etc., yet there is a drastic lack of security in real-time operations. Hackers usually crack passwords using attacks such as shoulder-surfing attacks, phishing attacks, online guessing attacks, and brute-force attacks. To counter these attacks, image-based passwords such as captcha technologies, PassPoints, and cued recall already exist and provide solutions up to a certain range, but they reduce efficiency by loading huge images while generating the passwords. To overcome this, the authors propose a new secure and efficient image with an Embedded Sensitive Information Transferring Technique using the Blowfish Algorithm for key generation.

A Review of Leading Databases: Relational & Non-Relational Database

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:A Review of Leading Databases: Relational & Non-Relational Database
Author Name:Navneet Kumar Kashyap, B.K. Pandey, H.L. Mandoria and Ashok Kumar
Synopsis:
In this paper, the authors compare the leading database systems currently used in industry as well as in academics. Relational and non-relational databases are the two leading kinds of databases used in both academic and professional settings. A database stores data, which is generated rapidly these days, but a database is not only about storing information: it is also concerned with managing a huge mass of data in a consistent and stable manner, so that the data is quickly recoverable and accessible when needed. The prominent features of both kinds of databases, with all their specifications and comparisons, are analyzed here. Relational databases have been around for many years and are the choice of most technologies, but the current growth of data and of the internet market, together with newly emerging web technologies, leads toward new trends such as Web 3.0. These new technologies bring new challenges and management concepts. The NoSQL database has become very popular because it offers an alternative to the relational database, especially for dealing with massive data. As is well known, the main problem of database management is maintaining high availability and scalability for distributed systems, which need fast access with no downtime during failures. This paper presents the concept, motivations, and movement behind non-relational databases and the needs driving them, reviews the different types of non-RDBMS, discusses the issues related to both kinds of databases, including application and security issues, and compares them with relational databases.

Security Issues and Challenges in Multimedia Systems

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:Security Issues and Challenges in Multimedia Systems
Author Name:E. Sabarinathan and E. Manoj 
Synopsis:
Due to rapid development in the network and communication field, it has become necessary to protect confidential images from unauthorized duplication. In today's internet era, protection of digital content during communication is a pressing need. The development of multimedia applications brings the convenience of easy processing of digital media, but at the same time it enables illegal attackers to attack such works. For the protection of data, there has been growing interest in developing effective techniques to discourage the unauthorized duplication of digital data, among them cryptography, watermarking, and steganography. This paper is a comprehensive review of diverse image processing methods and the enormous number of interrelated applications in various disciplines, including various cryptography, steganography, and watermarking techniques. Different existing techniques are discussed along with their drawbacks and future scope.

Biometric Sender Authentication in Public Key Cryptosystem to Overcome Man-In-The-Middle Attack and to Provide High Security

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:Biometric Sender Authentication in Public Key Cryptosystem to Overcome Man-In-The-Middle Attack and to Provide High Security
Author Name:Jitendra Singh Laser and Vinaykumar Jain 
Synopsis:
The objective of this work is to develop biometric-based sender authentication in public key cryptosystems to overcome man-in-the-middle attacks and hacking issues and to provide a high level of security. Biometric technologies are automated methods for verifying or recognizing the identity of a person based on physiological or behavioural characteristics. In public key cryptosystems, authentication is provided either by a hash function, a MAC (Message Authentication Code) function, or a digital signature. In this work, the authors use biometric-based sender authentication with speech features in a public key cryptosystem. Here, the public key cryptosystem uses the Diffie-Hellman (DH) algorithm. DH generates a shared secret key between two parties over an insecure channel, but DH does not provide encryption/decryption or authentication. A man-in-the-middle attack, or hacking, is possible in DH because it does not authenticate the participants. In this work, the authors overcome the authentication and man-in-the-middle attack problems by using biometric-based sender authentication. Speech encryption and decryption are also performed along with the message for high security.
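The Diffie-Hellman exchange underlying the scheme can be sketched with modular exponentiation. The parameters below are far too small for real use (real deployments use 2048-bit-plus safe-prime groups) and are chosen only to make the two-party agreement visible; note the sketch deliberately shows what DH gives you, a shared secret, without any of the sender authentication the paper adds.

```python
import secrets

# Illustrative parameters only: a 32-bit prime and a small base.
P = 0xFFFFFFFB  # 2**32 - 5, prime
G = 5

def dh_keypair():
    """Generate a private exponent and the matching public value G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = dh_keypair()   # party A
b_priv, b_pub = dh_keypair()   # party B
shared_a = pow(b_pub, a_priv, P)  # A computes the shared secret
shared_b = pow(a_pub, b_priv, P)  # B computes the same value
```

Because neither public value is authenticated, an attacker who substitutes their own public values ends up sharing a key with each party separately; that is exactly the man-in-the-middle gap the biometric sender authentication is meant to close.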

Additional Authentication Technique: An Efficient Approach to Prevent Cross-Site Request Forgery Attack

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:Additional Authentication Technique: An Efficient Approach to Prevent Cross-Site Request Forgery Attack
Author Name:Bharti Nagpal, Naresh Chauhan and Nanhay Singh 
Synopsis:
The Cross Site Request Forgery (CSRF) attack is a one-click attack which is very common and widely known. The CSRF attack exploits session cookies while the victim has an active session with their account on a website, allowing the attacker to perform unauthorized activities without the user's knowledge. The attack is a forged HTTP request which exploits the user's current session in the browser. Because the attack makes the browser act on the forged HTTP request without the knowledge of the user, the most natural prevention is a browser-based solution. However, a browser-based solution cannot always work, because browsers allow third-party websites to send requests to trusted websites. The CSRF attack thus exploits the trust that a website has in the user's browser. In this paper, the authors propose an additional authentication technique to prevent the CSRF attack.
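One common additional-authentication defence (a synchronizer token, sketched here as an illustration of the general idea rather than the paper's exact technique) is to bind each form to a random per-session token that a forged cross-site request cannot know:

```python
import hmac
import secrets

def issue_csrf_token(session: dict) -> str:
    """Generate a random token, store it in the session, and return it
    for embedding as a hidden form field."""
    token = secrets.token_hex(32)
    session["csrf_token"] = token
    return token

def verify_csrf_token(session: dict, submitted: str) -> bool:
    """Constant-time comparison of the submitted token with the session copy."""
    expected = session.get("csrf_token", "")
    return hmac.compare_digest(expected, submitted)
```

The server rejects any state-changing request whose submitted token does not match; since the attacker's page cannot read the victim's session or the hidden field, the forged request fails.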

Changes in Virtual Team Collaboration With Modern Collaboration Tools

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:Changes in Virtual Team Collaboration With Modern Collaboration Tools
Author Name:Viju Raghupathi
Synopsis:
The evolution of information technology that transcends time and space has enabled organizations to leverage virtual team collaboration. New collaborative technology, such as Google Apps/Google Docs, has disrupted traditional assumptions about collaboration. This empirical study on virtual teams examines how social and technical influences shape collaboration. It analyzes the performance and process of collaboration to detect changes, if any, brought about by modern collaboration tools, as well as the implications for organizations that deploy virtual teams. The findings show that new tools promote a change in the process of collaboration from a sequential to an iterative cycle of discussion and production. Practitioners can benefit from the insights of this study by evaluating the nature of their teams and the support provided for their collaboration. In addition, collaborative technology has differential effects based on the social dynamics that occur within virtual teams.

An Introduction to Data Lake

Vol. 5  Issue 2
Year:2016
Issue:Mar-May
Title:An Introduction to Data Lake
Author Name:K.V.N. Rajesh and K.V.N. Ramesh 
Synopsis:
Nowadays, companies are relying on ever more data to make informed decisions. Companies that are able to use data effectively are world leaders in terms of wealth, development, and growth; even to survive, operate, and compete in this age, organizations need to be able to use their data effectively. A huge amount of investment is made in storing and processing large amounts of data to make better decisions. A data lake is a massive, easily accessible data store/repository that allows for collecting large volumes of structured and unstructured data in its native format from disparate data sources. This paper describes the data lake, Schema-on-Write, Schema-on-Read, and the characteristics and implementation of a data lake.

A Survey on Operating System Virtualization Methods and Challenges

Vol. 5  Issue 1
Year:2016
Issue:Dec-Feb
Title:A Survey on Operating System Virtualization Methods and Challenges
Author Name:Abhilash C.B and D.V. Ashoka
Synopsis:
The computational world is becoming large and complex. Cloud computing has risen as a well-known computing model to support the processing of large volumes of data using clusters of commodity PCs. Operating system (OS) virtualization can provide a number of important benefits, including transparent migration of applications, server consolidation, online OS maintenance, and enhanced system security. However, building such a system presents a host of challenges, even for the most careful engineer, which if neglected may result in a weak, incomplete virtualization. The authors present a discussion of key implementation issues in providing OS virtualization in a commodity OS, including system call interposition, virtualization state management, and race conditions. They discuss their experiences in implementing such functionality across two major versions of Linux, entirely in a loadable kernel module with no kernel modification, and present experimental results on both uniprocessor and multiprocessor systems that demonstrate the ability of their approach to provide virtualization with low overhead. In this paper, the authors first develop a comprehensive taxonomy for describing operating system architecture; they then use this taxonomy to survey several existing operating system virtualization services and their challenges.

Functioning of Intelligence Intrusion Multi Detection Prevention Systems (IIMDPS)

Vol. 5  Issue 1
Year:2016
Issue:Dec-Feb
Title:Functioning of Intelligence Intrusion Multi Detection Prevention Systems (IIMDPS)
Author Name:S. Murugan and K. Kuppusamy 
Synopsis:
This paper focuses on the functioning of Intelligence Intrusion Multi Detection Prevention Systems (IIMDPS). It describes the prevention of unknown malware with the help of a mathematical scheme and a few models with a newly designed algorithm. The system is designed to provide a deeper understanding of existing intrusion detection principles combined with intelligence strategies responsible for identifying unknown malware, comparing the false positive rate and the false negative rate. These results are validated by conducting different experiments with WEKA simulation.

A Substitution Based Encoding Scheme to Mitigate Cross Site Script Vulnerabilities

Vol. 5  Issue 1
Year:2016
Issue:Dec-Feb
Title:A Substitution Based Encoding Scheme to Mitigate Cross Site Script Vulnerabilities
Author Name:Bharti Nagpal, Naresh Chauhan and Nanhay Singh 
Synopsis:
Most attacks made on the web target the vulnerabilities of web applications. These vulnerabilities are researched and analyzed by OWASP [1], the Open Web Application Security Project, which tracks the most common failures. Cross Site Scripting (XSS) is one of the worst vulnerabilities, allowing malicious attacks such as cookie theft and web page defacement. Testing an implementation against XSS vulnerabilities can avoid these consequences, and obtaining an adequate test data set is essential for such testing. These inputs are interpreted by browsers while rendering web pages. When an attacker gets a user's browser to execute his or her code, the code runs within the security context (or zone) of the hosting website. With this level of privilege, the code has the ability to read, modify, and transmit any sensitive data accessible by the browser. Cross-site scripting attacks essentially compromise the trust relationship between a user and the website. XSS occurs when a web page displays user input, typically via JavaScript, that is not properly validated. This paper uses an encoding scheme that scans for the starting tag of an HTML tag and encodes it such that a script written inside the starting and closing tags will not work as an HTML element, thus rendering the attack useless.
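The substitution idea can be sketched with standard HTML entity encoding: replacing the angle brackets that open and close a tag means an embedded script is rendered as literal text instead of being parsed as an HTML element. This sketch uses Python's stdlib `html.escape` as a stand-in for the paper's own substitution scheme.

```python
import html

def neutralize_input(user_input: str) -> str:
    """Entity-encode tag delimiters so user input cannot form HTML elements.

    '<' becomes '&lt;', '>' becomes '&gt;', and quotes are encoded too,
    so a <script> tag in the input is displayed rather than executed.
    """
    return html.escape(user_input, quote=True)

encoded = neutralize_input('<script>alert("xss")</script>')
```

Output encoding of this kind complements, rather than replaces, input validation: it is applied at the point where user data is written into a page.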

Ethical Hacking and Security Against Cyber Crime

Vol. 5  Issue 1
Year:2016
Issue:Dec-Feb
Title:Ethical Hacking and Security Against Cyber Crime
Author Name:Neeraj Rathore
Synopsis:
This paper explores the fast-growing cyber world and its components on the internet. The fast-growing internet has benefited modern society in the form of e-commerce, e-mail, online banking, advertising, vast stores of reference material, etc. But there is also a dark side, where the internet becomes a common and easy tool for criminal activity, exploiting weak links and vulnerabilities of the internet. In this paper, the author concentrates on several hacking activities that come under cyber crime. The paper also highlights the role of the ethical hacker in defending against culprits and cyber crime, and illustrates a proactive approach to minimize the threat of hacking and cyber crime.

Cyber Security Challenges on Academic Institutions and Need For Security Framework Towards Institutional Sustainability Growth And Development

Vol. 5  Issue 1
Year:2016
Issue:Dec-Feb
Title:Cyber Security Challenges on Academic Institutions and Need For Security Framework Towards Institutional Sustainability Growth And Development
Author Name:Wali Mohammad Dar
Synopsis:
The growing dependence on computer networks and internet-based applications in all areas of human involvement (health, education, transportation, and energy) makes it a big challenge to treat cyber security as a separate dimension. For the sustainable development and existence of academic institutions, a secure and comprehensive framework is the need of the hour to ensure their sustainability and existence in the digital world. Cyber security concerns 'cyber space' and comprises a collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance, and technologies that can be used to protect the cyber environment and the organization's and users' assets. Cyber security endeavours to ensure the attainment and maintenance of the security properties of the organization's and users' assets. Therefore, strong initiatives such as the implementation of security policies and a strategic framework of procedures and plans to secure the future of institutions are needed. In the present era, digital information is at the core of almost all of a university's activities, and the safety and security of this information is vital for growth and development. This paper discusses a security framework as a means to protect information and technology resources throughout the university.

A Python Based Regression Script to Evaluate Dependency of H-Index of Journals on Various Citation Parameters

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:A Python Based Regression Script to Evaluate Dependency of H-Index of Journals on Various Citation Parameters
Author Name:P. Varaprasada Rao and A. Govardhan
Synopsis:
The scientific output of researchers has become an important measure of achievement in the scientific community. In recent years, several bibliographic indices have been proposed to assess the quality of academic research publications, the h-index being the most prominent among them. Considering the h-index value of a journal as the dependent variable and citation parameters as independent variables, a Python-based regression algorithmic approach is reported in this study to delineate the dependency of the h-index on citation parameters such as Total Docs., Total Cites, Citable Docs., Cites/Doc., and Ref./Doc., respectively. From the regression analysis, it is observed that high values of TC3, CD3, and CD2 contribute positively to enhancing the h-index factor of journals, whereas TD3 and RD contribute negatively to the h-index.
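The dependent variable itself is easy to make concrete: the h-index is the largest h such that at least h papers each have at least h citations. A short Python sketch:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank       # the rank-th paper still clears the bar
        else:
            break          # ranks are descending in citations, so stop
    return h
```

For example, citation counts [10, 8, 5, 4, 3] give an h-index of 4: four papers have at least four citations each, but not five papers with at least five.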

Secure Photo Sharing in Online Social Network

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:Secure Photo Sharing in Online Social Network
Author Name:V.S. Krithikaa Venket and R. Chithra Devi 
Synopsis:
A popular feature of many Online Social Networks (OSNs) is photo tagging and photo sharing, which allows users to annotate the people who are present in uploaded images. To protect users' privacy, a Facial Recognition (FR) system has been designed to operate effectively during the sharing, posting, and liking of photos. An increasing number of personal photographs are uploaded to Online Social Networks, and these photos do not exist in isolation. The FR system is superior to some possible approaches in terms of increased recognition ratio and efficiency. To achieve this, the OSN specifies a privacy policy and an exposure policy; through these policies, individuals appearing in a photo are enabled to grant permissions before a co-photo is posted [11]. Exploring computational techniques that take advantage of these trends, while preserving the confidentiality of training sets, seems a worthwhile endeavor. To share photos safely, an effective FR system is needed which can recognize everyone in the photo. The authors also use users' private photos to design an adaptive face recognition system specifically used to share a photo with the subjects' permission. Finally, the system protects users' privacy in photo sharing over Online Social Networks.

Preparing Data Sets by Using Horizontal Aggregations in SQL for Data Mining Analysis

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:Preparing Data Sets by Using Horizontal Aggregations in SQL for Data Mining Analysis
Author Name:K.Sentamilselvan, S.Vinoth Kumar and A.Jeevanantham
Synopsis:
Data mining is one of the emerging fields in research, and preparing a data set is one of its important tasks. To analyze data efficiently, data mining systems widely use data sets with columns in a horizontal tabular layout, and building such a data set for analysis is normally a very time-consuming task. Existing SQL aggregations are limited in building data sets because they return one column per aggregated group using group functions. A method is developed to generate SQL code that returns aggregated columns in a horizontal tabular layout, returning a set of numbers instead of one number per row. This new class of functions is called horizontal aggregations, and the method is termed BY-LOGIC. An SQL code generator automatically generates the SQL code for producing a horizontal aggregation. A fundamental method to evaluate horizontal aggregations, called CASE (exploiting the CASE programming construct), is used. Basically, three parameters are available for creating a horizontal aggregation: the grouping, sub-grouping, and aggregating fields. Query evaluation shows that the CASE method responds faster than the BY-LOGIC method.
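The CASE method can be illustrated end-to-end with SQLite from Python: one output column per sub-group value, produced by a conditional SUM. The table and column names here are invented for the example, not taken from the paper.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, quarter TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("A", "Q1", 10), ("A", "Q2", 20), ("B", "Q1", 5), ("B", "Q2", 7),
])

# Horizontal aggregation via CASE: grouping field = store,
# sub-grouping field = quarter, aggregating field = amount.
rows = conn.execute("""
    SELECT store,
           SUM(CASE WHEN quarter = 'Q1' THEN amount ELSE 0 END) AS q1,
           SUM(CASE WHEN quarter = 'Q2' THEN amount ELSE 0 END) AS q2
    FROM sales
    GROUP BY store
    ORDER BY store
""").fetchall()
```

A vertical `GROUP BY store, quarter` would return four rows of one aggregate each; the CASE pivot returns one row per store with the quarters spread across columns, which is the horizontal layout mining tools expect.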

Students’ Absence Registration and Reporting

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:Students’ Absence Registration and Reporting
Author Name:Rana D. Alanni, Atheer M. Al-saraaf and Mohammed M. Noori 
Synopsis:
The whole world, and the administrators of educational institutions in particular, are concerned about the regularity of students' attendance, since students' overall academic performance is affected by their presence at their institute. Because of that, keeping track of students' attendance is an important job in an academic environment. This paper presents the design and implementation of software for taking students' attendance and extracting different kinds of absence reports. The software is based on inserting, deleting, updating, and querying a database management system. In this system, the teachers engaging different classes are required to regularly submit the attendance of the students present in their classes. The administrator monitoring this information can extract different reports about a student's weekly absences, or whatever details he or she wants, using this system.
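The insert-and-query core of such a system can be sketched against SQLite; the schema and function names below are hypothetical, chosen only to show the shape of the record-absence and weekly-report operations.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE absence (student TEXT, course TEXT, day TEXT)")

def record_absence(student: str, course: str, day: str) -> None:
    """Insert one absence record (day as ISO date, e.g. '2015-10-05')."""
    db.execute("INSERT INTO absence VALUES (?, ?, ?)", (student, course, day))

def weekly_report(student: str, start: str, end: str):
    """Return (day, course) rows for one student within a date range."""
    cur = db.execute(
        "SELECT day, course FROM absence "
        "WHERE student = ? AND day BETWEEN ? AND ? ORDER BY day",
        (student, start, end))
    return cur.fetchall()
```

ISO-formatted date strings sort lexicographically in date order, which is what lets `BETWEEN` select a week without any date arithmetic.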

Audio Steganography and Security by using Cryptography

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:Audio Steganography and Security by using Cryptography
Author Name:Hamsa A Abdullah, Aalaa A. Abdulameer and Israa F. Hussein 
Synopsis:
Audio steganography is a security technique by which secret data to be transmitted over a public, insecure channel is embedded into an audio cover object in such a way that a third party cannot detect the presence of the message. In the proposed system, the DES cryptographic encryption algorithm is used to encrypt the data before hiding it in the cover object, for additional security. For encoding the data into the audio samples, the LSB coding algorithm is used. The overall system is relatively secure: it hides the secret message, a feature provided by steganography, and also transforms the structure of the message, a feature provided by cryptography, which together prepare the message to travel safely through public channels.
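The LSB coding step can be sketched directly: each payload bit replaces the least significant bit of one audio sample, a change far below audibility. This sketch operates on a plain list of integer samples and omits the DES encryption stage, which would simply run on the payload bytes beforehand.

```python
def embed_bits(samples, payload: bytes):
    """Hide payload bits, MSB first, in the LSB of successive audio samples."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(samples), "cover audio too short for payload"
    stego = list(samples)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear the LSB, then set it to the bit
    return stego

def extract_bits(samples, n_bytes: int):
    """Recover n_bytes of payload from the sample LSBs."""
    bits = [s & 1 for s in samples[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8]))
        for i in range(0, len(bits), 8)
    )
```

Each sample changes by at most 1 out of a 16-bit range, which is why LSB coding is inaudible yet trivially reversible by anyone who knows the scheme; the DES layer is what keeps an extracted payload unreadable.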

Privacy Issues Surrounding Wearable Technology

Vol. 4  Issue 4
Year:2015
Issue:Sep-Nov 
Title:Privacy Issues Surrounding Wearable Technology
Author Name:Thomas Page
Synopsis:
Wearable devices give people the ability to track almost every facet of their lives through a multitude of embedded sensors. The collection of data through these sensors is called 'personal metrics': the quantification of everyday activity in order to change, improve, or understand human behaviour. In order to deliver meaningful insight to the user, these personal metrics need to be sent to companies for analysis. This collection of data by companies ultimately raises complex concerns for consumers' privacy, most notably among young consumers, who are widely reported as having an increased acceptance of the sharing of their data. The study was conducted by first gaining an understanding of the research area through reviewing the literature, and then conducting primary research through an online survey. Overall, it was found that education about app privacy regulation and companies' use of data alone did not have much effect on young adult consumers' behaviours. Furthermore, it was concluded that young adult consumers appear to accept the loss of their privacy, although some behaviours appear to show a level of concern. Nonetheless, due to limitations in the methodology of the research undertaken, it was concluded that further studies would be required in order to ensure the validity of the data.

Analysis of TCP Variants Over Variable Traffic

Vol. 4  Issue 3
Year:2015
Issue:Jun-Aug
Title:Analysis of TCP Variants Over Variable Traffic
Author Name:Varun Chauhan and Rajesh Kumar
Synopsis:
In today’s scenario, most work is accomplished through the internet. The internet plays a vital role in day-to-day activities and has become the backbone of our society. On the internet, data is transferred from one place to another with the help of the most commonly used protocol, the Transmission Control Protocol (TCP). TCP has various variants, such as Tahoe, Reno and New-Reno, each following different criteria, and each variant behaves differently in different networks depending on various key parameters. In this paper we simulate various TCP variants with respect to different network parameters such as packet size, bandwidth and buffer size, and analyse the throughput, packet drop, delay and jitter of the TCP variants evaluated over these parameters.
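The behavioural difference between variants such as Tahoe and Reno can be illustrated with a toy model of the congestion window. This is an assumption-laden sketch, not a reproduction of the paper's simulations (which use a full network simulator): on loss, Tahoe restarts slow start from a window of 1, while Reno halves the window (fast recovery); both set the slow-start threshold to half the old window.

```python
# Toy model of congestion-window (cwnd) evolution per round trip for two
# TCP variants. loss_rounds is the set of rounds in which a loss occurs.

def simulate(variant, loss_rounds, rounds=20):
    cwnd, ssthresh, trace = 1, 16, []
    for r in range(rounds):
        trace.append(cwnd)
        if r in loss_rounds:
            ssthresh = max(cwnd // 2, 1)
            cwnd = 1 if variant == "tahoe" else ssthresh  # Tahoe restarts; Reno halves
        elif cwnd < ssthresh:
            cwnd *= 2            # slow start: exponential growth
        else:
            cwnd += 1            # congestion avoidance: linear growth
    return trace
```

Plotting the two traces for the same loss pattern shows the characteristic sawtooth: identical until the first loss, after which Reno recovers much faster than Tahoe, which is one reason throughput differs between variants under the same traffic.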

Defending Against Remote File Inclusion Attacks on Web Applications

Vol. 4  Issue 3
Year:2015
Issue:Jun-Aug
Title:Defending Against Remote File Inclusion Attacks on Web Applications
Author Name:Bharti Nagpal, Naresh Chauhan and Nanhay Singh 
Synopsis:
Web applications are fundamental pillars of today's world; society depends on them for business and day-to-day tasks. Because of their extensive use, web applications are under constant attack by hackers who exploit their vulnerabilities to disrupt business and access confidential information. Remote File Inclusion (RFI) is a type of vulnerability frequently found on websites. It allows an attacker to include a remote file, usually through a script on the web server, typically because user-supplied input is used without proper validation. The consequences can range from something as minimal as outputting the contents of a file to far more serious events. RFI attacks are extremely dangerous because they allow a client to force a vulnerable application, most often a PHP application, to run malicious code by including a reference to code at a URL on a remote server; when the application executes this code, it may lead to a backdoor exploit or to the retrieval of technical information. The attackers operate independently of one another, seeking exploitable vulnerabilities across the web. From observations, it is apparent that such attacks can be detected and blocked by creating a blacklist of attack sources and a blacklist of URLs of remotely included malicious scripts. Although the scripts are hosted at many locations, many of them are duplicates of each other, so the number of distinct scripts actually used in attacks is very small.
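The blacklist-based defence the synopsis describes can be sketched as a simple classifier for an include parameter. The names and blacklist entries below are purely illustrative assumptions; a real deployment would maintain the lists from observed attack traffic, as the paper suggests.

```python
# Hypothetical RFI filter: classify the value of an include parameter as
# a safe local path, a known-bad (blacklisted) remote script or source,
# or an unrecognised remote URL (which should still never be included).

from urllib.parse import urlparse

BLACKLISTED_HOSTS = {"evil.example.com"}            # known attack sources
BLACKLISTED_URLS = {"http://mirror.example.net/r57.txt"}  # known malicious scripts

def classify_include(value):
    """Return 'local', 'blacklisted', or 'remote' for an include parameter."""
    parsed = urlparse(value)
    if parsed.scheme in ("http", "https", "ftp"):
        if parsed.hostname in BLACKLISTED_HOSTS or value in BLACKLISTED_URLS:
            return "blacklisted"
        return "remote"   # not yet on a list, but remote inclusion is still unsafe
    return "local"
```

In practice the application would serve only values classified as 'local' (ideally validated against a whitelist of known files), log 'blacklisted' hits as attack attempts, and reject 'remote' values outright.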

Secure Data Hiding by Optimal Placement of Queen Along Closed Knight Tour

Vol. 4  Issue 3
Year:2015
Issue:Jun-Aug
Title:Secure Data Hiding by Optimal Placement of Queen Along Closed Knight Tour
Author Name:Abhishek Bansal, Sunil Kumar Muttoo and Vinay Kumar
Synopsis:
A knight tour starts from any square of the chessboard; a tour in which the knight visits every square on the board exactly once and ends a knight's move away from its starting square is called a closed knight tour. A queen can move along a column, row or diagonal, in both forward and backward directions. In the proposed method, the authors divide the cover image into non-overlapping 8x8 pixel blocks. For each block, they place a queen on a square and find the number of prime attacking positions of the queen along the knight tour, then remove these attacking positions from the knight-tour sequence of the 8x8 block. They compute the number of mismatches between the bits in the LSB positions of the 8x8 block and the corresponding bits of the encrypted message to be hidden, and choose the queen position that results in minimum distortion. The process is applied to each block of the cover image. The queen positions determined during the embedding phase are recorded as the key, and the same key is used to extract the hidden message from the stego cover. Experimental results on different images reveal that this method maintains a high degree of imperceptibility, and the randomization achieved through the knight tour and queen moves provides another level of security against detection.
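One building block of the method, pruning a queen's attacking positions out of a knight-tour sequence over an 8x8 block, can be sketched as follows. This is a simplified illustration: it computes all attacking positions of the queen, whereas the paper restricts attention to "prime" attacking positions, a refinement not detailed in the synopsis and not modelled here.

```python
# Sketch of one step of the embedding scheme: squares attacked by a queen
# at (qr, qc) on an n x n board, removed from a knight-tour sequence.
# Squares are (row, col) pairs, 0-indexed.

def queen_attacks(qr, qc, n=8):
    """Set of squares a queen at (qr, qc) attacks (same row, column, or diagonal)."""
    attacked = set()
    for r in range(n):
        for c in range(n):
            if (r, c) == (qr, qc):
                continue
            if r == qr or c == qc or abs(r - qr) == abs(c - qc):
                attacked.add((r, c))
    return attacked

def prune_tour(tour, qr, qc):
    """Drop the queen's attacking positions from a knight-tour sequence."""
    attacked = queen_attacks(qr, qc)
    return [sq for sq in tour if sq not in attacked]
```

The embedder would then try each candidate queen position, count LSB mismatches against the message bits over the pruned tour, and keep the position giving the fewest changes, which is what yields the minimum-distortion property claimed in the synopsis.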

Web Based Archiving System

Vol. 4  Issue 3
Year:2015
Issue:Jun-Aug
Title:Web Based Archiving System
Author Name:Khalid W. Hameed , Samer A. Ahmed and Amir A.Mohammed
Synopsis:
Today, people are more interested in efficiency, speed and, above all, cost in every walk of life. One such area is the management of work in industries, trade and similar settings, commonly called "office work". Office work may include saving information on paper in an ordered manner so that it can be retrieved in the future. To overcome the difficulty of managing a paper inventory and of searching for a particular document, the authors designed and implemented an archiving system to relieve the burden of hard copies. The requirements were first identified by analysing the nature of the work; based on these requirements, covering both the manager and user sides, the system was designed by creating a facility for each requirement, which together comprise the system. The resulting system is simple, easy to use and web based, and was implemented primarily with the XAMPP application stack and the PHP language.
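The core facility such an archiving system provides, storing document records and searching them by keyword, can be sketched in a few lines. This is a hypothetical, language-neutral illustration (the paper's actual implementation is in PHP with XAMPP); all table and function names are assumptions.

```python
# Minimal sketch of an archive backend: document metadata in a database
# table, with keyword search over title and body so a record can be found
# without leafing through physical files.

import sqlite3

def make_archive():
    db = sqlite3.connect(":memory:")   # a real system would use a file or server DB
    db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
    return db

def add_doc(db, title, body):
    db.execute("INSERT INTO docs (title, body) VALUES (?, ?)", (title, body))

def search(db, keyword):
    """Titles of all documents whose title or body contains the keyword."""
    like = f"%{keyword}%"
    rows = db.execute(
        "SELECT title FROM docs WHERE title LIKE ? OR body LIKE ?", (like, like))
    return [r[0] for r in rows]
```

A web front end (the manager and user facilities the synopsis mentions) would then be a thin layer over operations like these.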