Face recognition from images or video is a popular topic in biometrics research. Many public places have surveillance cameras for video capture, and these cameras have significant value for security purposes. It is widely acknowledged that face recognition plays an important role in surveillance systems, as it does not require the subject's cooperation. The actual advantages of face-based identification over other biometrics are uniqueness and acceptance. Since the human face is a dynamic object with a high degree of variability in its appearance, face detection is a difficult problem in computer vision. In this field, accuracy and speed of identification are the main issues. The goal of this project is to evaluate various face detection and recognition methods and to provide a complete solution for image-based face detection and recognition with higher accuracy and a better response rate as an initial step for video surveillance. The solution is proposed based on performance tests on various face-rich databases covering different emotions.
A self-governing ATM host has the right to access any bank. No security layer is implemented in the ATM card except the PIN, and it is very costly for a bank to include fingerprint and iris scanners. In this paper, we monitor the location of ATM usage, the time taken by the user to access the ATM machine, the sequence of events processed by the user, and the expected withdrawal amount. All four of these factors are verified, along with the password, to authenticate the user. If any of these parameters differs, a One Time Password is generated and sent to the user's mobile number for a further, more secure authentication step. In the modification phase, an automatic user recognition model is designed to enhance user comfort and to detect the time span the user spends at the ATM machine. If, due to mobile signal problems, the One Time Password cannot be received, a secret process is used to authenticate the ATM user.
6LoWPAN (IPv6 over Low-power Wireless Personal Area Network) is a wireless personal area network that contains devices compatible with IEEE 802.15.4. Routing is the major issue in 6LoWPANs, as the nodes are characterised by scarce memory, limited power, low battery life, limited resources and low cost. Routing protocols for such networks should be designed to make efficient use of the available resources and should exhibit high performance. This paper presents a detailed survey of 6LoWPAN routing protocols and compares them on several metrics such as memory usage, power consumption, scalability, routing type, location information and many other factors.
Optical wireless communication offers a viable alternative to radio frequency communication for rural and urban areas where high-performance links are needed. This paper presents a review of the most significant issues related to optical wireless communication technology, which will enable the realization of future high-performance and cost-effective optical wireless systems. Several possible configurations for optical wireless systems, along with modulation and multi-access techniques, are presented, and their advantages and limitations are discussed.
Music Recommendation System Using Association Rule Mining and Clustering Technique To Address Cold Start Problem
V.Manvitha, M.Sunitha Reddy
Recommender systems based on data mining are very useful, more accurate and provide worldwide services to the user, and they have become very popular in recent years. More and more people rely on online sites for purchasing songs, apparel, books, rented movies, etc. The competition between online sites forced website owners to provide personalized services to their customers, and so recommender systems came into existence. Recommender systems are active information filtering systems that attempt to present to the user the information items the user is interested in. Recommender systems also suffer from issues like cold start and sparsity. The cold start problem is that the recommender cannot draw inferences for users or items for which it does not have sufficient information. This paper attempts to propose a solution to the cold start problem by combining association rules with a clustering technique.
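As a rough illustration of the combined idea, the hedged Python sketch below clusters existing users by their profile vectors and then recommends the consequents of association rules mined within the new user's cluster. The `cluster_rules` mapping, the profile features and all names are hypothetical placeholders, not the paper's actual design.

```python
import numpy as np
from sklearn.cluster import KMeans

def cold_start_recommend(new_profile, profiles, cluster_rules, k=5):
    """Cluster known users by profile vectors, then recommend the
    consequents of association rules mined inside the new user's cluster.
    cluster_rules: dict cluster_id -> [(antecedent_song, consequent_song)]."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(profiles)
    cluster = int(km.predict(np.asarray(new_profile).reshape(1, -1))[0])
    # A brand-new user has no listening history, so fall back on the
    # strongest rules of the demographically closest cluster.
    return [song for _, song in cluster_rules.get(cluster, [])][:10]
```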
Survey on Recent trends in Computational & Experimental Technique to Evaluate Performance of Air Flow in Defrost/Demist System for Automobile
Saiprasad N. Patil, Prof. P. D. Sonawane
The safety and comfort aspects of a passenger car are significant sales arguments and have become a topic of rising importance during the development process of a new car. Maintaining adequate visibility through a vehicle windshield at all times is important. Vehicle defrost systems are required to defrost the windshield and side windows within a reasonable time after the blower has been switched on. The ability of the windshield defrosting and demisting system to quickly and completely melt ice on the outer windshield surface and mist formed on the inner surface is therefore important.
The current study presents the methodology for computational techniques used in industry to evaluate the actual performance of an automobile defrost/demist system at the manufacturing stage. Some experimental and computational efforts from previous studies are elaborated in this paper so as to review the current scenario in windshield defroster systems. This paper summarizes the state of knowledge for improving evaluation efforts. The objective of this study is to enhance computational techniques to obtain improved outcomes.
To ensure reliability and high accuracy, testing of energy meters is imperative. The conventional testing of digital energy meters poses some drawbacks, viz. a slow process, power loss, and the requirement of resistor and capacitor banks. This paper proposes a modified single-phase energy meter to serve as the standard for the testing of digital energy meters. For this purpose, 72 holes are drilled in the disc of an electromechanical energy meter, which is then recalibrated before being used as the "standard". The holes in the disc are detected by a sensor connected to a microcontroller that counts the pulses obtained. The numbers of pulses obtained from the electromechanical meter and the digital energy meter are compared to gauge the error in the digital meter.
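To make the comparison concrete, here is a hedged Python sketch of the error computation implied by the abstract: pulses from the recalibrated "standard" disc (72 per revolution) give the true energy, which is compared against the reading of the digital meter under test. The meter constant below is an assumed illustrative value, not a figure from the paper.

```python
PULSES_PER_REV = 72        # holes drilled in the disc
REV_PER_KWH = 1200         # assumed meter constant of the standard meter

def percentage_error(standard_pulses, dut_kwh):
    """Error of the digital meter under test relative to the standard."""
    true_kwh = (standard_pulses / PULSES_PER_REV) / REV_PER_KWH
    return (dut_kwh - true_kwh) / true_kwh * 100.0

# 8640 pulses = 120 revolutions = 0.1 kWh; the DUT registered 0.101 kWh.
print(percentage_error(8640, 0.101))   # -> about +1.0 % (fast meter)
```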
A Review of Cloud Based Schedulers on Cloud Computing Environment
Renu Bala, Gagandeep Singh, Sahil Vashist
Cloud computing is Internet-based computing that shares software and hardware resources on demand among computers and other devices. Consumers request the available services according to their desired Quality of Service, and they are charged on a pay-per-use basis. One of the most challenging problems in cloud computing is task scheduling that satisfies the Quality of Service while minimizing task execution time. The scheduling strategy is a key technology in cloud computing. This paper surveys different types of scheduling algorithms and compares their various parameters. Traditional scheduling algorithms are not able to provide adequate scheduling in cloud environments; therefore, new scheduling strategies are needed to overcome the problems posed by network properties between users and resources.
A Modified K-Medoid Method to Cluster Uncertain Data Based on Probability Distribution Similarity
Aliya Edathadathil, Syed Farook, Balachandran KP
Clustering on uncertain data is one of the essential tasks in data mining. Traditional algorithms such as K-Means clustering, UK-Means clustering and density-based clustering, when applied to uncertain data, are limited to using geometric-distance-based similarity measures and cannot capture the difference between uncertain objects with different distributions. Such methods cannot handle uncertain objects that are geometrically indistinguishable, such as products with the same mean but very different variances in customer ratings [6]. K-medoid clustering of uncertain data on the basis of KL divergence similarity groups the data based on their probability distribution similarity. Several methods have been proposed for the clustering of uncertain data, and some of these methods are reviewed. Compared to the traditional clustering methods, the K-Medoid clustering algorithm based on KL divergence similarity is more efficient. This paper proposes a new method for making the algorithm more effective by considering the initial selection of medoids.
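A minimal sketch of the general technique, assuming each uncertain object is represented as a discrete probability histogram over shared bins: a K-medoid loop that uses symmetrized KL divergence in place of geometric distance. This illustrates the idea only; it is not the paper's exact algorithm, and the proposed medoid-initialization improvement is not reproduced.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

def sym_kl(p, q, eps=1e-12):
    """Symmetrized KL divergence; epsilon avoids division by zero."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    return entropy(p, q) + entropy(q, p)

def kl_k_medoids(dists, k, n_iter=20, seed=0):
    """dists: list of probability histograms over the same bins."""
    rng = np.random.default_rng(seed)
    medoids = list(rng.choice(len(dists), size=k, replace=False))
    labels = [0] * len(dists)
    for _ in range(n_iter):
        # Assignment step: nearest medoid under KL-based similarity.
        labels = [min(range(k), key=lambda j: sym_kl(d, dists[medoids[j]]))
                  for d in dists]
        # Update step: each cluster's medoid minimizes total divergence.
        for j in range(k):
            members = [i for i, lab in enumerate(labels) if lab == j]
            if members:
                medoids[j] = min(members, key=lambda i: sum(
                    sym_kl(dists[i], dists[m]) for m in members))
    return medoids, labels
```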
Mobile Application Protection System: A Secured Architecture
B Meenakumari, Sandeep Karnam
The mobile application market is gaining a lot of attention with the increase in speed, accessibility and capacity to download a huge number of software applications from the Internet to mobile devices. Mobile application protection is a critical issue for mobile network operators, content providers and all players involved in the mobile software business chain. This paper proposes an architecture in which a trusted mobile software environment is designed to control the execution of mobile applications. A dynamically updated software ID and a shared key are created and used to authorize mobile application execution requests. This solution can prevent many mobile application cracking issues, including copying of a mobile application to unauthorized mobile devices and modification of the mobile application.
Requirements elicitation is the process of seeking, uncovering, acquiring, and elaborating requirements for computer-based systems. It is generally understood that requirements are elicited rather than just captured or collected. This implies there are discovery, emergence, and development elements to the elicitation process. Requirements elicitation is a complex process involving many activities with a variety of available techniques, approaches, and tools for performing them. The relative strengths and weaknesses of these determine when each is appropriate, depending on the context and situation. One of the most important aspects of developing a large Web-based project is getting the correct requirements from the client. Time and money can be lost if the requirements are incomplete or inaccurate. Traditional Web design sources tend to gloss over this important activity. Requirement elicitation is a critical activity in the requirement development process, and it explores the requirements of stakeholders. The success or failure of this process is based on identifying the relevant stakeholders and discovering their needs, as well as the quality of requirements. The quality of the requirements is greatly influenced by the methods applied during the requirements elicitation process. Only complete and structured requirements make these projects more reliable. The common challenges that analysts face during the elicitation process are to ensure effective communication between stakeholders as well as the acquisition of tacit knowledge. Most errors in systems are due to poor communication between user and analyst, and these errors require more resources (time and money) to correct. The understandability problems during the elicitation process of large web projects can lead to requirements that are ambiguous, inconsistent, incorrect and unusable. Different methods (conversational, observational, analytical and synthetic) are available to deal with the problems arising during the requirement elicitation process. The challenge for analysts is to select an appropriate method or set of methods and apply them for clear, consistent and correct requirement gathering. This study is based on the results of interviews conducted with professionals who have industrial experience in the development of web systems. The elicitation problems identified in the literature and interviews, along with the applicability of elicitation methods for requirement gathering in large web project development, are documented in this report. Software engineering is a mature field that can help in the quest for more complete and accurate requirement gathering. The objectives of this chapter are to present a comprehensive survey of important aspects of the techniques, approaches, and tools for requirements elicitation, and to examine the current issues, trends, and challenges faced by researchers and practitioners in this field.
A light weight PLGP based method for mitigating vampire attacks in Wireless Sensor Networks
Farzana T, Mrs.Aswathy Babu
Deployment of sensor networks in hostile environments makes them particularly vulnerable to battery drainage attacks, because it is impossible to recharge or replace the battery power of sensor nodes. A large portion of research effort has been motivated by maximizing network lifetime, where the lifetime of the network is measured from the instant of deployment to the point when one of the nodes exhausts its limited power source and becomes inoperational, commonly referred to as first node failure. But there is a class of resource consumption attacks called vampire attacks, which permanently disable the whole network by quickly draining the nodes' batteries. In this novel approach, both the forwarding and the discovery phases of the protocol are considered to avoid the attack. Here the algorithm overhead is reduced, and the discovery phase is considered to avoid vampire attacks.
Performance Evaluation Of Fuzzy Based DWT Approach To Enhance CT Image Using Image Fusion
Sarabdeep Singh, Shikha Khera
Image fusion combines the features of two or more partially occluded or damaged images to generate a new, more effective image. The proposed algorithm improves the quality of CT images using the concept of image fusion. In this work, an effective image analysis model is presented based on multiple parameters: contrast analysis, entropy analysis and frequency analysis. The proposed algorithm begins with the decomposition of the images using the DWT. The decomposed image parts are then analyzed using the multiple parameters defined above, and fuzzy logic is applied to perform effective pixel area selection. In this work, a two-layered analysis is performed for image fusion: the first-level fusion is done using the DWT, and the second-level fusion is done using a parametric fuzzy logic approach. As the presented work is defined on the basis of multiple parameters, more effective results are expected from the defined approach. The work is implemented in the MATLAB environment. The obtained results show the effective generation of the fused image, and the analysis under different parameters shows that the work has improved the visibility of the medical image.
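For concreteness, a hedged Python sketch of the first (DWT) fusion level using PyWavelets is shown below; it averages the approximation subbands and keeps the larger-magnitude detail coefficients, a common heuristic. The paper's second, parametric fuzzy-logic level is not reproduced here, and the wavelet choice is an assumption.

```python
import numpy as np
import pywt  # PyWavelets

def dwt_fuse(img_a, img_b):
    """Fuse two registered grayscale images at the first DWT level:
    average the approximation subbands, keep the larger-magnitude
    detail coefficients."""
    ca, (ha, va, da) = pywt.dwt2(img_a.astype(float), 'db2')
    cb, (hb, vb, db_) = pywt.dwt2(img_b.astype(float), 'db2')
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    fused = ((ca + cb) / 2.0, (pick(ha, hb), pick(va, vb), pick(da, db_)))
    return pywt.idwt2(fused, 'db2')
```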
Detection of Crash Transient Failure during Job Scheduling using Replication Technique
Inderpreet Kaur, Sarpreet Singh
Grid computing, or the computational grid, remains a vast research field in academia. The computational grid provides resource sharing through multi-institutional virtual organizations for dynamic problem solving. Various heterogeneous resources of different administrative domains are virtually distributed through different networks in computational grids. Thus any type of failure can occur at any point of time, and a node running in the grid environment might fail. Hence fault tolerance is an important and challenging issue in grid computing, as the dependability of individual grid resources may not be guaranteed. In order to make computational grids more effective and reliable, a fault-tolerant system is necessary. The objective of this paper is to test crash and omission transient failures in resource scheduling. This paper presents an overview of fault tolerance and its techniques, task replication and a most-fitting resource allocation algorithm.
Optimal Implementation of Graph Kernel Matching in Matrimonial Database Using Graph Mining Technique
S. A. Amala Nirmal Doss, Dr.S.P.Victor
Graph mining provides a systematic way to implement real-time data with different levels of implications. Our conventional setup initially focuses on the dataset and its entities. This paper performs a detailed study of graph kernel matching over variant clusters in the field of graph mining, which can be carried out with request-to-response matching strategies. We implement our integrated graph mining techniques with a real-time implementation on matrimonial database domains. We also present algorithmic procedures for the successful implementation of our proposed research technique in several sampling domains with a maximum level of improvement. In the near future we will apply the cluster mining techniques to predict graph substructure behaviours.
Design of Intelligent Mobile Vehicle Checking System Based On ARM 7
Mr. Rajendra B. Mohite, Prof. Miss Rupali R. Jagtap
With the ARM7 as its core, the new Intelligent Mobile Vehicle Checking System integrates many hardware and software modules such as image capturing, number plate recognition, GSM and GPS. The system software was designed on the Keil embedded software development platform and meets the traffic auditing department's needs for mobile vehicle checking.
A New Technique Called Annotation Used To Categorize The Content Of Web Pages
U. Madhavi, S. Hrushikesava Raju , P. Nirupama
Nowadays databases have become web-accessible, and the data units in these databases are encoded. To separate data units, meaningful labels are assigned. For this there is an automatic annotation approach that assigns labels to the data units within the search result records returned from web databases. It provides an annotation wrapper for the search site that is automatically constructed to annotate new result pages from the same web database. There are six basic annotators, and each basic annotator produces a label for the data units within its group. A probability model is used to determine the most appropriate label for each group. The annotation wrapper generates an annotation rule that describes how to extract the data units from a result page. Once the annotation wrapper annotates the data, it is unnecessary to perform the alignment and annotation phases again. We construct an annotation wrapper by using data types, data contents, presentation styles and adjacency information. In this paper the data annotation problem is studied, and a multi-annotator approach is proposed to automatically construct an annotation wrapper for annotating the search result records retrieved from any given web database.
Numerical Solutions of Ordinary Differential Equations Using Mathematica
Adoh, A. C., Ojobor, S. A.
This paper investigates the numerical solution of linear ordinary differential equations using Mathematica. The computational software (Mathematica) automates tedious numerical computations, making it easier to generate accurate numerical solutions. Several programming paradigms can be used to implement these numerical algorithms (methods) in Mathematica, but this paper briefly features two of them, the recursive and functional paradigms. The software is used to generate the necessary solution to a given ordinary differential equation, plot its graph and compare the different numerical methods for accuracy using the plotted graphs. We compare the NDSolve approach in Mathematica with the Euler and Runge-Kutta methods, and observe that NDSolve and Runge-Kutta produce similar results.
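Since the paper compares Euler, Runge-Kutta and NDSolve, here is an analogous sketch in Python (using SciPy's solve_ivp in place of NDSolve) on the hypothetical test problem y' = -2y, y(0) = 1, integrated to t = 1; the test equation and step size are illustrative choices, not the paper's examples.

```python
import numpy as np
from scipy.integrate import solve_ivp

f = lambda t, y: -2 * y              # test problem y' = -2y, y(0) = 1

def euler(f, t, y, h, n):
    """Explicit Euler: first-order accurate."""
    for _ in range(n):
        y, t = y + h * f(t, y), t + h
    return y

def rk4(f, t, y, h, n):
    """Classical Runge-Kutta: fourth-order accurate."""
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y, t = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4), t + h
    return y

print(euler(f, 0.0, 1.0, 0.1, 10))          # crude first-order result
print(rk4(f, 0.0, 1.0, 0.1, 10))            # close to the exact value
print(solve_ivp(f, (0, 1), [1.0]).y[0, -1]) # library solver (cf. NDSolve)
print(np.exp(-2.0))                         # exact solution at t = 1
```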
An Efficient Job Scheduling Algorithm using Min-Min and Ant Colony Concept for Grid Computing
Davinder Kaur, Sarpreet Singh
Grid computing has emerged as an important field from distributed and parallel computing, where the resources of various computers in the network are used to solve a particular problem, driven by the high demand for computational power and the need for high performance. To achieve the promising potential of grid computing, effective and efficient job scheduling is required. Job scheduling is used to assign user jobs to appropriate resources in the grid environment. Min-Min is the simplest and best-known scheduling algorithm in grid computing and is used to minimize the makespan. The drawbacks of the Min-Min algorithm are that the schedule it produces is not optimal with respect to load balancing, it is not effective for resource utilization, one resource can execute only one job at a time, and the number of resources must be known in advance. In this paper, a Min-Min Ant Colony (MMAC) algorithm is proposed that reduces the makespan and maximizes resource utilization using the features of both the Min-Min algorithm and ant colony optimization. It is a two-phase algorithm.
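For reference, a minimal Python sketch of the classic Min-Min heuristic that MMAC builds on; `etc` is an expected-time-to-compute matrix (jobs by machines), and the ant colony phase of the proposed algorithm is not shown.

```python
def min_min(etc):
    """etc[j][m]: expected time to compute job j on machine m.
    Returns (assignment, makespan) for the classic Min-Min heuristic."""
    n_jobs, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines          # machine ready times
    unassigned = set(range(n_jobs))
    assignment = {}
    while unassigned:
        # For each job, the machine giving its minimum completion time...
        best = {j: min(range(n_machines), key=lambda m: ready[m] + etc[j][m])
                for j in unassigned}
        # ...then schedule the job whose minimum completion time is smallest.
        j = min(unassigned, key=lambda j: ready[best[j]] + etc[j][best[j]])
        m = best[j]
        ready[m] += etc[j][m]
        assignment[j] = m
        unassigned.remove(j)
    return assignment, max(ready)

print(min_min([[4, 6], [3, 8], [5, 2]]))   # -> ({2: 1, 1: 0, 0: 0}, 7.0)
```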
Multiple Quality Parameters for Image Quality assessment
Esha Meshram, Suyash Agrawal
Attention has turned towards extending IQA methods that follow a two-stage structure: in the first stage, the reference and compressed images are taken as input for local quality measurement, and in the second stage the result is advanced to a pooling stage that provides a quality score. Initially, methods such as MSE, PSNR, correlation and entropy were most used for quality evaluation, but these do not capture any structural information of images. To overcome this limitation and incorporate structural information, this paper aims to improve pictographic quality assessment by merging information content weighting with structural similarity measurement, so as to achieve the best overall performance.
A combined approach of using DWT-DCT watermarking and AES encryption to improve the security of satellite images
Naida.H.Nazmudeen, Jismi.K
With the large-scale research in space sciences and technologies, there is a great demand for satellite image security systems providing secure storage and transmission of satellite images. As the demand to protect sensitive and valuable satellite data has increased, a new method for satellite image security is proposed that combines DWT-DCT watermarking and AES encryption. Watermarking techniques developed for normal multimedia data cannot be directly applied to satellite images, because here the analytic integrity of the data, rather than perceptual quality, is of primary importance. To improve performance, the discrete wavelet transform (DWT) is combined with another equally powerful transform, the discrete cosine transform (DCT). The combined DWT-DCT watermarking algorithm's imperceptibility is better than that of the DWT approach alone. A modified decision-based unsymmetrical trimmed median filter (MDBUTMF) algorithm is proposed for the restoration of satellite images that are highly corrupted by salt-and-pepper noise. Satellite images require not only watermarking for copyright protection but also encryption during storage and transmission to prevent information leakage. Hence this paper investigates the security and performance of joint DWT-DCT watermarking and the Advanced Encryption Standard (AES) for satellite imagery. Theoretical analysis is done by calculating PSNR and MSE. The experimental results demonstrate the efficiency of the proposed scheme, which fulfils the strict requirements concerning alterations of satellite images.
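A hedged sketch of the joint DWT-DCT embedding idea, under one common arrangement: watermark bits are added to mid-band DCT coefficients of the DWT LL subband. The subband choice, band positions and strength `alpha` are illustrative assumptions rather than the paper's parameters, and the AES stage would encrypt the marked image afterwards.

```python
import numpy as np
import pywt
from scipy.fftpack import dct, idct

def embed(cover, wm_bits, alpha=10.0):
    """Additive DWT-DCT watermark embedding sketch (grayscale cover).
    Assumes the LL subband is at least ~10x10 and len(wm_bits) <= 64."""
    # One-level DWT: the watermark goes into the low-frequency LL subband.
    ll, details = pywt.dwt2(cover.astype(float), 'haar')
    # 2-D DCT of the LL subband.
    c = dct(dct(ll, axis=0, norm='ortho'), axis=1, norm='ortho')
    # Embed one bit per mid-band coefficient.
    for i, bit in enumerate(wm_bits):
        r, s = 2 + i // 8, 2 + i % 8
        c[r, s] += alpha if bit else -alpha
    ll_marked = idct(idct(c, axis=1, norm='ortho'), axis=0, norm='ortho')
    return pywt.idwt2((ll_marked, details), 'haar')
```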
A Review on Dealing Uncertainty, imprecision and Vagueness in Association Rule Mining Using Extended and Generalized Fuzzy
Vivek Badhe, Dr. R.S. Thakur, Dr. G.S. Thakur
Common approaches to association rule mining focus on generating rules by using correlations among data and finding frequently occurring patterns. The main technique uses support and confidence measures for generating rules. A few other approaches have already been introduced to improve association rule mining. The purpose of this paper is to review the existing approaches based on fuzzy theory and its variants and extensions, namely Rough Set, Vague Set and Soft Set, and their combinations, which have been used to increase the effectiveness of association rule mining techniques in dealing with uncertainty, approximation, vagueness and imprecision.
A Survey of various Routing Algorithms based on Shortest Path
Neha Rathod, Ms. Rupali Bhartiya
Conventional optimization methods need huge computational effort, which grows exponentially as the problem dimension increases in networks of independent nodes. Researchers are continuously working on challenges such as communication link failures, low memory, computing constraints and high energy cost. Many problems may be formulated and approached as multidimensional optimization problems, in which most nodes are not neighbours of one another but can be reached from every other node in a small number of hops. Optimization problems under uncertainty are complex and difficult, and classical algorithmic approaches based on mathematical and dynamic programming are often able to solve only very small problem instances. For this reason, in recent years meta-heuristic algorithms such as Ant Colony Optimization and Evolutionary Computation have emerged as successful alternatives to classical approaches.
Improved Gradients & Global Mean Based Switching Median Filter
Lovepreet Kaur, Dr. Arun Khosla
Noise in images has become one of the significant concerns in digital image processing, as many digital image processing techniques produce inaccurate results when noise is present in the images. Many researchers have therefore proposed new and modified techniques to reduce or remove noise from images, and different kinds of enhancements to the filters have been proposed so far. However, most filters introduce artefacts while doing their work, many filters fail when the noise density in the image is very high, and some filters result in an over-smoothed image, i.e. poor edge preservation. This paper proposes a new improved global-mean-based switching median filter that has the capability to reduce high-density noise in images and also outperforms others when the input image is noise-free. The proposed method also preserves edges by using gradient-based smoothing. The proposed technique has been designed and implemented in MATLAB using the Image Processing Toolbox. Different kinds of digital images have been taken for experimental purposes. Comparative analysis has shown that the proposed algorithm is quite effective compared with the available techniques.
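A minimal Python sketch of the switching principle, assuming salt-and-pepper noise at the intensity extremes: only suspected pixels are filtered, the median is taken over noise-free neighbours, and the global mean serves as a fallback when a whole window is noisy. The paper's gradient-based smoothing step is not reproduced here.

```python
import numpy as np

def switching_median(img, lo=0, hi=255):
    """Switching median filter sketch for 8-bit grayscale images."""
    out = img.astype(float).copy()
    gmean = img.mean()                         # global-mean fallback
    pad = np.pad(img, 1, mode='edge')
    noisy = (img == lo) | (img == hi)          # salt-and-pepper candidates
    for r, c in zip(*np.nonzero(noisy)):
        win = pad[r:r + 3, c:c + 3].ravel()    # 3x3 neighbourhood
        clean = win[(win != lo) & (win != hi)]
        out[r, c] = np.median(clean) if clean.size else gmean
    return out.astype(img.dtype)
```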
Optimizing the Ad-hoc applications in Vehicular Network using Max-Min Ant system
Sumeet Sekhon, Dinesh Kumar
VANET, the Vehicular Ad Hoc Network, is the most popular mobile ad-hoc network, and researchers have worked extensively on its various parameters. From the literature review, VANETs operate as real-time systems in which the vehicles act as mobile nodes and travel at very high speed on roads in urban areas. There are a number of security issues such as authentication, intelligent system approaches, collision detection, jamming avoidance and communication system approaches. We deal primarily with the delay issue in case of an accident on the route, and present an intelligent route identification approach for V2V communication when an accident occurs. If an accident takes place on the network, neighbour-node information flow is used to perform the route analysis, and an optimal path is selected from the existing ones using the min and max variants of ACO. In this work a bio-inspired V2V communication approach is suggested to identify the safe path over the network. We have used the Max-Min Ant System (MMAS) to achieve our goal of finding the optimal path for vehicles, thus reducing the delay caused by collisions.
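For concreteness, a small sketch of the Max-Min Ant System pheromone rule the approach relies on: trails evaporate everywhere, only the best path is reinforced, and all values are clamped to [tau_min, tau_max] so no edge ever dominates or starves. The numeric bounds and evaporation rate are illustrative.

```python
import numpy as np

def mmas_update(tau, best_path, best_cost, rho=0.1,
                tau_min=0.01, tau_max=5.0):
    """One MMAS pheromone update on an edge matrix tau.
    best_path: node sequence of the best route found so far."""
    tau = tau * (1.0 - rho)                   # evaporation on every edge
    for u, v in zip(best_path, best_path[1:]):
        tau[u, v] += 1.0 / best_cost          # best-ant reinforcement
    return np.clip(tau, tau_min, tau_max)     # MMAS trail limits
```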
A Reactive Hierarchical Trust Management Scheme for Wireless Sensor Networks
Reshmi V, Sajitha M
A wireless sensor network consists of spatially distributed autonomous sensors that monitor physical or environmental conditions such as temperature, sound and pressure, and cooperatively pass their data through the network to a main location. However, individual sensor nodes are vulnerable to several types of attacks because they are usually deployed in open and unprotected environments. Trust management, which models trust in the behaviour of the elements of the network, can be especially useful in a sensor network environment to enhance security. Various methods have been proposed for trust management in wireless sensor networks. A reactive hierarchical trust management scheme is proposed here that reduces the energy consumption rate of sensor nodes by calculating trust values on demand.
Security Assessment Automation Framework: Web Applications
Gopal R. Choudhari, Prof Madhav V. Vaidya
The importance of Web application security has increased rapidly over the last years. At the same time, the quantity and impact of vulnerabilities in Web applications have grown as well. Since manual code reviews are time-consuming, costly and error-prone, the need for automated solutions has become evident.
In this paper, we propose an automated security assessment framework for Web applications. The purpose of this system is to improve the security standard of software products and applications. The end-to-end framework gathers information from potential clients and helps determine the scope of assessments, the tools to use and the methodology for conducting assessments.
It also generates a report showing graphs and provides actionable intelligence about identified vulnerabilities. It can also be integrated with build systems used for developing and deploying applications so that security issues are caught in early phases of the SDLC. The system works as a single point of control for running security tools and scripts and managing information about security projects. By providing a single point of control, the system automates delivery of security solutions.
The explosive development and broad availability of the Web have prompted a surge of research activity in the area of information retrieval on the World Wide Web. Ranking has always been an important part of any information retrieval system, and in the case of web search its importance becomes critical. Because of the size of the web, it is essential to have ranking functions that capture the user's needs. To this end the Web offers a rich context of information, which is expressed through its hyperlinks. This paper presents a page-ranking approach using a new formula, "Enhanced PageRank using a Time Component", which has negligible overhead compared with the conventional PageRank algorithm.
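A hedged sketch of what a time-aware PageRank can look like: standard power iteration with a teleport vector biased toward recently updated pages. The exponential-decay freshness model below is an assumption for illustration, not the paper's exact formula.

```python
import numpy as np

def time_weighted_pagerank(adj, age_days, d=0.85, half_life=90.0, iters=50):
    """adj: 0/1 NumPy adjacency matrix; age_days: days since each page
    was last updated. Recent pages get a larger teleport share."""
    n = len(adj)
    freshness = 0.5 ** (np.asarray(age_days, dtype=float) / half_life)
    t = freshness / freshness.sum()       # recency-biased teleport vector
    M = np.zeros((n, n))
    out = adj.sum(axis=1)
    nz = out > 0
    M[nz] = adj[nz] / out[nz, None]       # row-stochastic link matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):                # power iteration
        r = (1 - d) * t + d * (M.T @ r)
    return r
```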
Energy-Efficient Attribute-Aware Data Aggregation Using Potential-Based Dynamic Routing in Wireless Sensor Networks
Shini K, Shijukumar P S
Sensor nodes gather data in Wireless Sensor Networks (WSNs), and the data are pieced together at the sink. Most of the energy of sensor nodes is consumed in the transmission and reception of data packets. Reducing the number of transmitted data packets greatly improves the lifetime of WSNs, as it saves energy not only at the transmitting nodes but also at the receiving nodes. Since the network consists of low-cost nodes with limited battery power, power-efficient methods must be employed for data gathering and aggregation in order to achieve long network lifetimes. Attribute-aware data aggregation uses the concept of packet attributes: it makes packets with the same attribute spatially convergent as much as possible and improves the efficiency of data aggregation. To improve network efficiency, the proposed mechanism focuses on efficient data aggregation in a tree-based network by constructing energy-efficient, duty-cycled data aggregation trees. In this method the number of transmissions is reduced, and the energy consumption is also reduced.
A Multiple Queue Management Scheme to Optimize the Job Sequencing in Cloud Computing
Navdeep Singh, Ghanshyam Verma
A cloud system is considered one of the most sought-after shared distributed systems, in which multiple clients are connected to multiple servers in an integrated environment. An intermediate layer is formed between clients and servers to arrange the order of their execution. The server side is defined by specific cloud servers and integrated virtual machines along with memory and IO capabilities. Multiple client requests are generated under request-time parameters. As a user enters the system, a particular VM is allocated to it for process execution; if the VM is not capable of executing the process under the defined constraints, the request is migrated. In a cloud system, one of the critical issues is to handle these multiple requests effectively so that the overall wait time is reduced. In the present work, a multiple-queue system is suggested to reduce the wait time and the chances of request migration. The presented work is divided into three main stages. In the first stage, the virtual machines are ordered by their capacity parameters. Once the machines are ordered, in the next stage the user requests are placed in a specific queue based on deadline criticality; two middle-layer queues are composed under deadline criticality analysis, called the hard deadline queue and the soft deadline queue. Once the processes are allocated to these queues, the next step is to allocate the processes to virtual machines, which is performed under capacity and load parameters. After this allocation, the execution of the processes on a particular virtual machine is carried out under process-time and deadline parameters. Processes that have not been executed before their deadline are considered migrated processes. The results obtained from the system show the effective execution of processes on the cloud system along with a reduced probability of deadline misses and process migration.
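A minimal Python sketch of the two-queue stage, assuming a slack threshold separates hard from soft deadlines; the threshold value, the job fields and the least-loaded placement rule are illustrative assumptions, not the paper's exact parameters.

```python
import heapq

HARD_SLACK = 50  # assumed slack threshold separating hard and soft deadlines

def enqueue(hard_q, soft_q, job):
    """Route a request to the hard or soft deadline queue by its slack;
    heaps keep each queue ordered earliest-deadline-first."""
    slack = job['deadline'] - job['arrival']
    q = hard_q if slack <= HARD_SLACK else soft_q
    heapq.heappush(q, (job['deadline'], job['id']))

def dispatch(hard_q, soft_q, vm_load):
    """Serve the hard queue first; place the job on the least-loaded VM.
    vm_load: dict vm_id -> current load (VMs pre-ordered by capacity)."""
    q = hard_q if hard_q else soft_q
    if not q or not vm_load:
        return None
    _, job_id = heapq.heappop(q)
    vm = min(vm_load, key=vm_load.get)
    vm_load[vm] += 1
    return job_id, vm
```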
Comparison of Various Queuing Techniques to Select Best Solution for Eradicating Congestion in Networks
Sahil Kochher, Gurnam Singh
Congestion is an aggregation of a huge amount of data in the network that results in delays in packet delivery and even huge loss of data. Queue management is a technique to minimise the congestion rate so that data is successfully transferred from the source to the destination. There are two types of queue management techniques: (a) in active queue management, the queue notifies the senders to slow down their packet rate when the queue buffer is about to fill; (b) in passive queuing, the drop rate is higher than in active queuing, because the senders have no information about the queue buffer size, so they cannot lower their delivery rate, and the queue buffer drops received packets once the buffer is full.
Facial Detection System (Using MATLAB)
Anjali Vats, R.K Bhatnagar, Shikha Vats, Antara Sarkar and Ankit
The face is our primary focus of attention in social life, playing an important role in conveying identity and emotions. We can recognize a number of faces learned throughout our lifespan and identify faces at a glance even after years of separation. This skill is quite robust despite large variations in the visual stimulus due to changing conditions, aging and distractions such as a beard, glasses or changes in hairstyle.
Face detection is used in many places nowadays, especially on websites hosting images such as Picasa, Photobucket and Facebook. The automatic tagging feature adds a new dimension to sharing pictures among the people in the picture and also gives other people an idea of who the person in the image is. In our project, we have studied and implemented a fairly simple but very effective face detection algorithm that takes human skin colour into account. Our aim, which we believe we have reached, was to develop a method of face detection that is fast, robust, reasonably simple and accurate, using relatively simple and easy-to-understand algorithms and techniques. The examples provided in this thesis are real-time and taken from our own surroundings.
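A hedged sketch of rule-based skin-colour segmentation in OpenCV, the kind of step such an algorithm starts from; the YCrCb threshold box below is a commonly cited range, not necessarily the values used in this project.

```python
import numpy as np
import cv2  # OpenCV

def skin_mask(bgr_image):
    """Binary skin mask via a fixed YCrCb colour box, cleaned up with
    a morphological opening to remove small false-positive blobs."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)    # illustrative range
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```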
A Generosity Model to Study Data Center Performance and QoS in IaaS Cloud Computing Systems
D.S.Selvi, A.Balasubramani, P.Nirupama
Cloud computing is a general term for system architectures that involve delivering hosted services over the Internet, made possible by significant innovations in virtualization and distributed computing, as well as improved access to high-speed Internet. A cloud service differs from traditional hosting in three principal aspects. First, it is provided on demand, typically by the minute or the hour; second, it is elastic, since the user can have as much or as little of a service as they want at any given time; and third, the service is fully managed by the provider – the user needs little more than a computer and Internet access. Typically a contract is negotiated and agreed between a customer and a service provider; the service provider is required to execute service requests from the customer within the negotiated quality of service (QoS) requirements for a given price.
Due to the dynamic nature of cloud environments, the diversity of users' requests, resource virtualization, and the time dependency of load, providing the expected quality of service while avoiding over-provisioning is not a simple task. To this end, cloud providers must have efficient and accurate techniques for performance evaluation of cloud computing centers. The development of such techniques is the focus of this thesis, which has two parts. In the first part, monolithic performance models are developed for cloud computing performance analysis, assuming Poisson task arrivals, generally distributed service times, and a large number of physical servers. The model is later extended to include finite buffer capacity, batch task arrivals, and virtualized servers with a large number of virtual machines in each physical machine. However, a monolithic model may suffer from intractability and poor scalability due to the large number of parameters. Therefore, in the second part of the thesis we develop and evaluate tractable functional performance sub-models for the different servicing steps in a complex cloud center, and the overall solution is obtained by iterating over the individual sub-model solutions. We also extend the proposed interacting analytical sub-models to capture other important aspects of modern cloud centers, including pool management, power consumption, the resource assignment process and virtual machine deployment. Finally, a performance model suitable for cloud computing centers with heterogeneous requests and resources, using interacting stochastic models, is proposed and evaluated.
Design and Development of Micro Vertical Axis Wind Turbine for Rural Application
Nilesh N Sorte, S M Shiekh
This paper describes the micro vertical axis wind turbine and its importance in energy production. It briefly describes the design considerations for a vertical axis wind turbine with the aim of starting and working at very low wind speeds. The model has been developed and fabricated in this project. The aim is to make a micro vertical axis wind turbine that can be installed in rural areas where electricity shortage is the main problem. An effort has been made to design a VAWT that can work at a wind speed as low as 2 m/s.
In this paper we characterize different ways to keep track of all the activity done on a computer, and how these tracked records are saved and made accessible to the administrator. The Computer Surveillance System gives parents the opportunity to keep an eye on their kids and check what they are doing in their absence. It also gives managers the opportunity to check whether their employees are doing their work properly or are busy chatting, etc.
Robotics is the fastest growing technology in the scientific world. Robotics today is being applied in various industries for different applications, replacing human effort where people are required to do a monotonous job repeatedly; in such places robots are found to be ideal replacements. In this paper we discuss one such robot localization algorithm, the SIFT algorithm. Scale Invariant Feature Transform (SIFT) is a computer vision algorithm that detects local features of an image. Identification of a target in image processing using the SIFT algorithm is proposed in this paper. The SIFT feature descriptor is invariant to uniform scaling and orientation, and robust to illumination changes.
An improvement to SIFT is also proposed in this paper, using the fusion of the Discrete Wavelet Transform (DWT), Singular Value Decomposition (SVD) and the SIFT algorithm. This combination of algorithms preserves the features of the images. With the help of the proposed technique, improved efficiency is obtained in terms of resolution and contrast.
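For illustration, a minimal OpenCV sketch of plain SIFT detection and matching with Lowe's ratio test, the baseline the proposed DWT+SVD fusion improves upon; the fusion itself is not reproduced here.

```python
import cv2  # OpenCV >= 4.4 ships SIFT in the main module

def sift_match(target, scene, ratio=0.75):
    """Locate a stored target in a camera frame: SIFT keypoints plus
    Lowe's ratio test to keep only distinctive matches."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(target, None)
    k2, d2 = sift.detectAndCompute(scene, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(d1, d2, k=2)
            if m.distance < ratio * n.distance]
    return k1, k2, good
```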
A New Color Image Compression Based on Fractal and Discrete Cosine Transform
K. Sharmila, K. Kuppusamy
Fractal image compression is a compression method that exploits the self-similar nature of an image; it is a way to encode images that requires less storage space. In this paper, an implementation based on fractals with a quadtree and the Discrete Cosine Transform is proposed to compress colour images. Initially the image is segmented and the DCT is applied to each block of the image. The block values are then scanned in a zigzag manner to group the zero coefficients together. The resultant image is partitioned into fractals by the quadtree technique. Finally the image is compressed using the run-length encoding technique. Experimental results show that the proposed technique compresses the image effectively with a high PSNR value and SSIM index.
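The zigzag step can be made concrete with a short sketch: scanning a DCT block from low to high frequency clusters the zero coefficients at the end, which is what makes the subsequent run-length encoding effective. The block size here is illustrative.

```python
import numpy as np

def zigzag(block):
    """Scan an N x N block along anti-diagonals, alternating direction
    (JPEG-style), so coefficients run from low to high frequency."""
    n = block.shape[0]
    order = sorted(((i, j) for i in range(n) for j in range(n)),
                   key=lambda p: (p[0] + p[1],
                                  p[0] if (p[0] + p[1]) % 2 else p[1]))
    return np.array([block[i, j] for i, j in order])

print(zigzag(np.arange(16).reshape(4, 4)))
# -> [ 0  1  4  8  5  2  3  6  9 12 13 10  7 11 14 15]
```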
Healthcare Measurement Analysis Using Data mining Techniques
Dr.A.Shaik Abdul Khadir , S.Menaka
Data mining, as one of many constituents of health care, has been used intensively and extensively in many organizations around the globe as an efficient technique for finding correlations or patterns among dozens of fields in large relational databases, yielding more useful health information. In healthcare, data mining is becoming increasingly popular and essential, and its applications can greatly benefit all parties involved in the health care industry. The large volumes of data generated by healthcare transactions are too complex and voluminous to be processed and analyzed by traditional methods. The paper provides a brief review of the existing performance measurement frameworks. On the basis of this review, performance measurement system criteria are identified and, accordingly, a framework is proposed for measuring performance in healthcare processes. Data mining provides the methodology and technology to transform huge amounts of data into useful information for decision making. This paper explores data mining applications in healthcare and discusses its applications in major areas such as the evaluation of treatment effectiveness, the management of healthcare itself and the lowering of medical costs.
Image processing is a method to convert an image into digital form and perform some operations on it, in order to get an enhanced image or to extract some useful information from it. Image processing usually refers to digital image processing, but optical and analog image processing are also possible; this article is about general techniques that apply to all of them. The acquisition of images (producing the input image in the first place) is referred to as imaging. Image segmentation is a process of partitioning an image into sets of segments to change the representation of the image into something that is more meaningful and easier to analyze.
Implementation of Database Synchronization Technique between Client and Server
Naveen Malhotra, Anjali Chaudhary
The objective of this paper is to provide an algorithm to solve the problem that arises when all clients rely on a single server: if the database becomes unavailable due to planned server downtime or server failure, all remote workers are disconnected from their data. Data is therefore stored on the user's own system, and when the user connects to the Internet, the data is automatically synchronized from the client system to the server in serial order. The algorithm also handles files: when the system is disconnected from the network, all files (images) uploaded by the user are saved in a folder on the client machine, and when the connection to the server is restored, the files (images) are automatically transferred from client to server.
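A minimal sketch of the offline-first pattern described above, assuming a local SQLite `pending` table and a hypothetical `upload` client function; the names and schema are illustrative, not the paper's implementation.

```python
import sqlite3

def sync_pending(local_db, upload):
    """Push locally queued changes to the server in insertion order.
    `upload` is a hypothetical client function returning True on success."""
    con = sqlite3.connect(local_db)
    con.execute("""CREATE TABLE IF NOT EXISTS pending
                   (id INTEGER PRIMARY KEY, payload TEXT)""")
    rows = con.execute("SELECT id, payload FROM pending ORDER BY id").fetchall()
    for row_id, payload in rows:
        if upload(payload):                       # server acknowledged
            con.execute("DELETE FROM pending WHERE id = ?", (row_id,))
            con.commit()
        else:
            break                                 # offline again; retry later
    con.close()
```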
Minimizing lighting energy consumption has become an important concern today. Here the luminaires' illuminance is controlled without the help of sensors. The approach uses feed-forward neural networks to model the relationships between the controller, the luminaires and their illuminance. The scheme is flexible enough to be implemented on microcontrollers, unlike conventional approaches. This implementation achieves great modeling accuracy, approximately 95%, and the energy saving with its optimal nonlinear multiple-input multiple-output control is improved by more than 28%, resulting in an easy-to-install and cost-effective alternative for reducing lighting energy consumption.
Digital image watermarking is the process of embedding information into digital media, without compromising the media's value, in such a way that the information is difficult to remove. It is basically used to hide information from attackers.
Content Based Video Retrieval by Genre Recognition Using Tree Pruning Technique
Amit Fegade, Vipul Dalal
Recent advances in technology have made a tremendous amount of multimedia content available. As the amount of video content increases, systems that improve access to video are needed. The efficiency of a retrieval system depends upon the search method used, and an inappropriate search method may degrade the performance of the retrieval system. Over the past years multimedia storage has grown and the cost of storing digital data has fallen, so there is a huge number of videos available in multimedia databases. It is a very difficult task to retrieve the relevant videos from a large database according to the user's needs; hence an effective video retrieval system is required, and visual perception must be taken into account in its design. In this paper, we propose an algorithm for video retrieval based on the genre of the video. The algorithm extracts key frames based on motion detection; regions of interest containing objects are detected using the bounding box method, annotated over the video, and compared with similar objects from a knowledge base prepared for various genre videos in order to recognize the objects. It also uses a tree-based classifier to identify the genre of the query video and to retrieve videos of the same genre from the video database.
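A hedged OpenCV sketch of motion-based key-frame extraction: a frame is kept whenever its mean absolute difference from the last key frame exceeds a threshold. The threshold value is illustrative, and the bounding-box annotation and tree-based classification stages are not shown.

```python
import cv2
import numpy as np

def key_frames(video_path, thresh=30.0):
    """Keep a frame whenever its mean absolute difference from the
    previous key frame exceeds `thresh` (illustrative value)."""
    cap, keys, last = cv2.VideoCapture(video_path), [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if last is None or np.mean(cv2.absdiff(gray, last)) > thresh:
            keys.append(frame)      # significant motion: new key frame
            last = gray
    cap.release()
    return keys
```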
Authentication is the process of verifying the credentials of a user and guaranteeing that the user is who it claims to be. Many ways of authentication have been proposed, from textual passwords to graphical passwords, biometrics, etc., but all have certain limitations or drawbacks. Hence I propose a new way of authentication that has the capabilities of all existing authentication systems and is also suited to client-server architectures. This technique is based on a real-world simulation, hence called a Virtual World. The user is given the freedom to interact with virtual things and select his authentication mode, i.e. textual, graphical, biometrics, OTP, voice, etc. This scheme provides more options and freedom to the user, along with a larger password space, and is therefore more difficult to break and thus more secure.
Design and Implementation of a Backtracking Wormhole Switch using MARX units for Mesh Topology
Sharon A., Mrs.Yogeshwary B.H.
Network-on-Chip (NoC) is expected to be adopted by designers as a substitute for buses and dedicated wire interconnection schemes in future complex SoCs. A compact and power-efficient design of the basic routing element (router or switch) is an important part of NoC design, and several switch implementations for different NoC topologies have been proposed. This project presents the design of a backtracking wormhole switch. The switching mechanism used is the circuit-switching approach. The proposed circuit-switched router, based on a backtracking probing path setup, operates with new MARX units and allows an efficient, low-latency wormhole switch implementation. The switch can support a deadlock- and livelock-free dynamic path-setup scheme. The backtracked-routing Network-on-Chip provides guaranteed and energy-efficient data transfer. The backtracking wormhole switch architecture is considered for use in a mesh/torus topology because of its good path diversity.
Feature Subset Selection Algorithm for Elevated Dimensional Data By using Fast Cluster
B.Swarna Kumari, M.Doorvasulu Naidu
Feature selection involves identifying a subset of the most helpful features that produces results compatible with the original set of features. A feature selection algorithm can be evaluated from both efficiency and effectiveness points of view. The FAST algorithm is proposed and then experimentally evaluated in this paper. The FAST algorithm works in two steps. In the first step, features are divided into clusters by means of graph-theoretic clustering methods. In the second step, the most representative feature that is strongly related to the target classes is chosen from each cluster to form the subset of features. Features in different clusters are relatively independent, so the clustering-based approach of FAST has a high probability of producing a subset of useful features. To guarantee the efficiency of FAST, we adopt the efficient minimum-spanning-tree clustering technique. Extensive experiments are carried out to compare FAST with some representative feature selection algorithms, namely FCBF, ReliefF, CFS, Consist and FOCUS-SF, with respect to four types of well-known classifiers, namely the probability-based Naive Bayes, the tree-based C4.5, the instance-based IB1 and the rule-based RIPPER, before and after feature selection.
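A hedged sketch of the graph-theoretic step in the spirit of FAST, using |correlation| as a simple stand-in for the symmetric-uncertainty measure: build a complete feature graph, take its minimum spanning tree, cut the long edges, and read off the connected components as feature clusters. The cut value is illustrative.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_feature_clusters(X, cut=0.5):
    """X: samples x features matrix. Returns one cluster label per feature."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    dist = 1.0 - corr + 1e-9       # epsilon keeps zero-weight edges visible
    np.fill_diagonal(dist, 0.0)
    mst = minimum_spanning_tree(dist).toarray()
    mst[mst > cut] = 0.0           # cutting edges turns the tree into a forest
    _, labels = connected_components(mst, directed=False)
    return labels
```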
Analysis of Reactive Attacks on Mobile Communication Protocol Network
Praveen Kumar Maurya, Akhilesh Panday
MANETs (Mobile Ad-Hoc Networks) are decentralized wireless networks built from self-configuring mobile ad hoc nodes. These networks are vulnerable to security threats caused by the absence of a trusted central authority and the openness of the network topology. The black hole attack is among the route disruption attacks that cause great harm to the network: a malicious node falsely claims that it has the shortest path and traps packets, thereby degrading network performance. MANETs pose a great challenge for routing protocols, and most routing protocols for MANETs are susceptible to a variety of attacks. Ad hoc On-demand Distance Vector (AODV) routing is a highly popular routing algorithm; however, it is exposed to the well-known black hole attack, where a malicious node falsely advertises good paths to a destination node during the route discovery procedure. This attack becomes even more severe when a group of malicious nodes cooperate with each other. AODV is an efficient routing protocol but has security issues, which is why it is used here for route establishment. A protection method is presented against a coordinated attack by multiple black hole nodes in a MANET. The simulations performed on the proposed scheme have produced results that demonstrate the effectiveness of the method in detecting the attack while maintaining a reasonable level of throughput in the network.
BMM Filtering Approach for Image Enhancement of Indian High Security Registration Number Plate
Reena Gupta, R. K. Somani, Manju Mandot
The population of India is growing expeditiously, with a national average growth rate of 1.41 per cent per annum (Census of India, 2011). With the increase in population and economic activities, travel demand has increased manyfold. Inadequate public transport and the easy availability of financing facilities for private vehicles have resulted in increased vehicle ownership levels and usage.
Crumbling road infrastructure coupled with the increase in vehicle population has pushed city traffic problems to ungovernable levels. There are two possible viewpoints for solving this problem. The first is to build new infrastructure: wider roads, expressways, flyovers and bypasses. But in developing countries like India, Malaysia and Sri Lanka, money and space are major concerns. The second viewpoint is to manage the existing traffic load on the available infrastructure with the use of technology. This calls for Intelligent Transportation Systems (ITS), which help manage the billions of vehicles running on the roads.
This paper focuses mainly on the image enhancement and segmentation phases of a high-security registration number plate recognition system. We present a hybrid approach combining the Boat operator filter and the Montane filter, which is a very useful and successful filtering technique for image enhancement and edge detection.
Universal Data Management through Web Base Configurable System
Vrishali Patil, Biradar M. S
This paper presents universal data management through a web-based configurable system. Universal data stands for most analog signals that are useful to mankind and industrial systems. A 24-bit ADC provides high-resolution data. We have tried data acquisition for multiple applications. The data frequency and channels can be customized through the web-based configurable system. Data can be logged at a set frequency with a real-time stamp, and this data is available on the web. Remote access to the data makes the system a very useful and low-cost solution, and data can be collected from remote locations. HTML/.NET script is used to develop the web pages and their GUI application. The embedded hardware contains an Ethernet-based (10/100 Base-T, RJ45 connector) microprocessor, the RCM3700, with 512 KB of memory for storing the web pages. By using a low-cost network communication module (RCM3700) as a web server, one can achieve better network security, lower power consumption, compact size and easier use at remote places. Results show a remote monitoring and control system (RMACS) with data logging. A low-cost web server collecting high-resolution data over multiple channels provides an excellent solution for many applications.
HPBDMS: High Performance Big Data Management System on Massive & Complex Data
Manoj Kumar Singh, Dr. Parveen Kumar
Big Data refers to the large amounts of data collected over time that are difficult to analyze and handle using common database management tools. The data are analyzed for business trends in commerce as well as in the fields of manufacturing, medicine and science. The categories of data cover business transactions, e-mail messages, photos, surveillance videos, activity logs and unstructured text from blogs and social media, plus the large amounts of data that can be collected from sensors of all varieties. In this research paper, we analyze large-scale database systems with high performance and examine how these systems address the requirements of consistency, high availability and efficiency. We try to observe how the architectures of database systems change as the goals of their applications vary. Our research framework is based on consistency, availability and partition tolerance, and we support our analysis with a few best practices for designing high-performance database applications.
The main aim of this project is to use wireless technology for automobiles using a GSM modem; its main scope is to stop the automobile engine with the help of the GSM modem when a person tries to steal the vehicle. When an unauthorised person tries to unlock the door of the car, a programmable 8051 microcontroller receives an interrupt and orders the GSM modem to send an SMS. The GSM modem, which stores the owner's number after a missed call the first time, sends an alert SMS to that authorized number. If the owner replies "stop the engine", the control instruction is given to the microcontroller through the interface, whose output activates a relay driver to trip the relay that disconnects the ignition of the automobile, thereby stopping the vehicle.
Dimension and Complexity Study for Alzheimer’s disease Feature Extraction
Mohamed M. Dessouky, Mohamed A. Elrashidy, Taha E. Taha, Hatem M. Abdelkader
This paper discusses the problem of dimensionality and the computational complexity analysis of proposed feature extraction algorithms for Alzheimer's disease. Effective features are very useful for discrimination and assist physicians in the detection of abnormalities. This paper concerns two main issues that must be confronted: the first is the study of how classification accuracy depends on the dimensionality (i.e. the number of features), and the second is the computational complexity of designing the classifier. As the number of features increases, the classification error decreases, which consequently improves the accuracy of the classifier.
How virtualisation helps smart phones in cloud computing: A survey
Joshi Rakhi, Prof. V.B. Maral
The limited capability and energy constraints of smart phones lead to the usage of cloud computing. Features provided by the cloud aim to augment smartphone capabilities by providing a vast pool of computation power and unlimited storage space. However, smartphones are still constrained by limited computing, memory and battery capacity. These limitations can be minimized with the help of computation offloading: sending heavy computation to resourceful servers and receiving the outcome from these servers. Offloading is the latest trend in mobile computing. In this survey paper we present how offloading can help to increase the performance of smart phones by providing an overview of techniques, systems, and research areas for offloading computation. We also describe the scope for future research.
We propose an approach to produce high-quality, visually cognitive QR codes, which we call QArt - a new generation of QR codes that are machine-readable and visually perceivable by humans simultaneously. First, a pattern readability function is constructed wherein a probability distribution is learned that identifies the replaceability of modules, the collections of data inside a QR code. Then, given a text tag, an input image is taken and, based on the grayscale labels of that image, a high-quality, visually appealing QR code is generated. A user just needs a phone with picture-taking capability to extract the encoded data from these QR codes, which may include URLs, music, images, or a plethora of other digital content. Some advertisers use QR codes to promote their web presence since smartphones can read and immediately access the URL encoded in them. As an instance of application of the technique proposed in this paper, such advertisers may generate a QR code on their company logo which contains the encoded URL, for advertising or promotional purposes.
Structural properties of NSC and HSC beams bonded by GFRP wraps
Vivek Singh, Nandini, Sanjith J
The use of fiber reinforced polymer (FRP) is becoming a widely accepted solution for repairing and strengthening in the field of civil engineering around the world. The shear strength of a reinforced concrete beam can be extensively increased by the application of carbon (CFRP), glass (GFRP) and aramid (AFRP) FRP sheets adhesively bonded to the shear zone of the beam. This paper deals with theoretical and experimental investigation for enhancing the shear capacity of RC beams using glass fiber reinforced polymer (GFRP) wraps.
In this modern world, smartphones and tablets are a need for every group, from children, teens, parents, business executives and professionals to old-age people. With these techy gadgets, location-based applications are widely used as they make life easier and faster and keep track of our near and dear ones as well as our business associates. The SMART Travel Alarm is designed to provide high-tech services to users based on their current geographic locations. Some of these services include a "reached destination" alarm, emailing someone on reaching the destination, etc. In this paper we illustrate the development of the SMART Travel Alarm - an Android application based on the Sencha 2.2 framework using Google Web Services to offer multiple services to users on the move. The developed app works fine on major smartphones and tablets running various versions of the Android OS.
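To make the core of such a "reached destination" alarm concrete, the following Python sketch checks whether the device has come within a radius of the destination (illustrative only; the haversine formula, the function names and the 0.5 km trigger radius are our assumptions, not details given by the authors):

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two (lat, lon) points in kilometres.
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def should_ring(current, destination, radius_km=0.5):
        # Fire the alarm once the device is within radius_km of the destination.
        return haversine_km(*current, *destination) <= radius_km

A location-based app would call should_ring on each GPS update and raise the alarm (or send the e-mail) the first time it returns True.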
An Advanced Approach for Ranking in Query Recommendation
Rinki Khanna, Asha Mishra
Search engines are programs that search documents for specified keywords and return a list of the documents where the keywords were found. Since they return long lists of ranked pages, finding relevant information on a particular topic is becoming increasingly critical, and therefore search result optimization techniques come into play. In this work an algorithm has been applied to recommend queries related to the query submitted by a user. Query logs are important information repositories that keep track of user activities through the search results; they contain attributes like query name, clicked URL, rank and time. The similarity based on keywords and clicked URLs is then calculated, and clusters are obtained by combining the two similarities to perform query clustering. The most favored queries are discovered within every query cluster. The proposed result optimization system presents a query recommendation scheme towards better information retrieval, enhancing search engine effectiveness on a large scale.
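A minimal Python sketch of the combined similarity (assuming Jaccard overlap and equal weights, which the abstract leaves open):

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 0.0

    def query_similarity(terms1, urls1, terms2, urls2, alpha=0.5, beta=0.5):
        # Combine keyword overlap and clicked-URL overlap into one score.
        return alpha * jaccard(terms1, terms2) + beta * jaccard(urls1, urls2)

Queries whose combined similarity exceeds a threshold would then fall into the same cluster, from which the most favored queries are recommended.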
Enumeration Sort on OTIS k-Ary n-Cube Architecture
Abdul Hannan Akhtar, Keny T. Lucas
Many researchers have been motivated to propose parallel algorithms on the Optical Transpose Interconnection System (OTIS) because of its hybrid nature: OTIS exploits both electronic links and free-space optical links for connecting the processing nodes of an interconnection network. In this paper, we propose a parallel algorithm for sparse enumeration sort on an OTIS k-ary n-cube parallel computer with a network size of k^(2n). In this sorting, the number of keys to be sorted is p^α for some constant α ≤ ½, and we assume α = ½ for our proposed algorithm. The algorithm has two variants based on data population techniques. The time complexity of the algorithm has been observed to be 4n(k-1) electronic moves + 3 OTIS moves for the first case. The second case requires the same number of electronic moves but only two OTIS moves.
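The sequential idea underlying enumeration sort is rank counting, sketched below in Python (the OTIS version distributes these pairwise comparisons across the k^(2n) processors; this sketch only illustrates the counting principle):

    def enumeration_sort(keys):
        # Each key's final position is the number of keys that precede it;
        # ties are broken by original index so the sort is stable.
        out = [None] * len(keys)
        for i, k in enumerate(keys):
            rank = sum(1 for j, other in enumerate(keys)
                       if other < k or (other == k and j < i))
            out[rank] = k
        return out

    print(enumeration_sort([5, 2, 9, 2]))  # [2, 2, 5, 9]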
Efficient Load Balancing in Clusters in Hierarchical Structure
Viney Rana, Sunil Kumar Nandal
Load balancing is the process of transferring load from one node to another to increase the performance of the whole system, so that each node in the structure is equally loaded. The aim of this paper is to implement load balancing in a hierarchical structure by taking clusters of different sizes at different levels and to compare the results with a simple load balancing approach. The results show that our proposed hierarchical algorithm is more efficient than simple load balancing.
In today's world there is huge competition between browsers to meet people's requirements, as there are many things a user requires as per their needs. In this paper we take two browsers into consideration and show the use, fulfillment, differentiation and various techniques that work with two different types of browsers, "text-based" and "graphical-based". The conclusion is made by considering the pros and cons of both techniques with respect to the basic features provided by both types of browsers. Our research basically considers two browsers, i.e. "IE" for graphical-based and "Lynx" for text-based.
In this paper, our objective is the study and analysis of sensitive information in relational databases. In recent days we have seen that various information or data is hacked by unauthenticated persons, so our main approach is to provide secrecy and privacy. For that purpose we characterize sensitive data as the extensions of secrecy views. The database, before returning the answers to a query fired by a restricted user, is updated to make the secrecy views empty or a single tuple showing only null values; a query about any of those views then returns no meaningful information. The database is not physically changed: the updates are only virtual, and also minimal. Minimality makes sure that query answers, while being privacy preserving, are also maximally informative. The virtual updates are based on null values as used in the SQL standard. We provide the semantics of secrecy views, virtual updates, and secret answers to queries.
Context Aware Driver’s Behavior Detection System: A Survey
Priyanka B. Shinde, Vikram A. Mane
Dedicated Short Range Communication (DSRC) allows vehicles in close proximity to communicate with each other, or to communicate with roadside equipment. Applying wireless access technology in vehicular environments has led to improved road safety and a reduction in the number of fatalities caused by road accidents, through the development of road safety applications and the facilitation of information sharing between moving vehicles regarding the road. This paper focuses on developing a novel and non-intrusive driver behavior detection system using a context-aware wireless system to detect abnormal behaviors exhibited by drivers, and to warn other vehicles on the road so as to prevent accidents from happening. Real-time inference of four types of driving behavior (normal, drunk, reckless and fatigued) by combining contextual information about the driver, the vehicle and the environment is presented. The evaluation of behavior detection using synthetic data proves the validity of our model and the importance of including contextual information about the driver, the vehicle and the environment.
In many real-time applications like sensor monitoring systems, location-based systems and data integration, data are inexact in nature. It is difficult to extract frequent itemsets from such applications because of this uncertainty. We study the problem of mining frequent itemsets from uncertain data under the Possible World Semantics (PWS), where an uncertain database encodes an exponentially large number of possible worlds. We consider transactions whose items are associated with existential probabilities and give a formal definition of frequent patterns under this uncertain data model.
By observing that the mining process can be modelled as a Poisson binomial distribution, an algorithm was developed which can efficiently and accurately discover frequent itemsets in a large uncertain database. We adopt a mining algorithm which identifies Probabilistic Frequent Itemsets (PFI) from an evolving database; traditional algorithms for mining frequent itemsets are inapplicable or computationally inefficient for uncertain databases. We implement an incremental algorithm which can efficiently and accurately discover frequent itemsets in a large uncertain database.
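The Poisson binomial view admits a simple dynamic program for the probability that an itemset is frequent, sketched below in Python (illustrative only; probs[t] is the assumed existential probability that transaction t contains the itemset):

    def frequentness_probability(probs, minsup):
        # dist[k] = P(support == k) over the transactions processed so far.
        dist = [1.0]
        for p in probs:
            new = [0.0] * (len(dist) + 1)
            for k, q in enumerate(dist):
                new[k] += q * (1 - p)   # itemset absent from this transaction
                new[k + 1] += q * p     # itemset present
            dist = new
        return sum(dist[minsup:])       # P(support >= minsup)

    print(frequentness_probability([0.9, 0.5, 0.4], 2))  # 0.65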
In this paper, a general approach for human action recognition is applied to classify human movements into action classes. The proposed method uses a Kinect for capturing the depth stream. The system performs preprocessing on the depth information to reduce noisy pixels and get the depth information into an appropriate format. A background subtraction method is used for extracting the region of interest, i.e. the human, and the system then extracts the person's contours. Hu moments are extracted from the contours for training the action classifier, and a Support Vector Machine (SVM) is used for classifying human activities.
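A minimal sketch of this feature-extraction and classification pipeline (assuming OpenCV 4 and scikit-learn, which the abstract does not name; the training masks and labels are assumed to exist):

    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def hu_features(mask):
        # 7 log-scaled Hu moments of the largest contour in a binary mask.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cnt = max(contours, key=cv2.contourArea)
        hu = cv2.HuMoments(cv2.moments(cnt)).flatten()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # compress range

    # X = [hu_features(m) for m in training_masks]; y = action labels
    # clf = SVC(kernel='rbf').fit(X, y)
    # predicted_action = clf.predict([hu_features(new_mask)])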
Complexity & Performance Analysis of Parallel Algorithms of Numerical Quadrature Formulas on Multi-Core Systems Using OpenMP
D.S. Ruhela, R.N.Jat
Analysis of an algorithm determines the amount of resources, such as time and storage, necessary to execute it. The efficiency or complexity of an algorithm is expressed as a function relating the input length to the number of operations needed to execute the algorithm. In this paper, the computational complexities and execution times of sequential and parallel algorithms for numerical quadrature formulas on a multi-core system using OpenMP are analyzed. The integral values of various functions are found using the Trapezoidal rule, Simpson's 1/3 rule, Simpson's 3/8 rule and Boole's rule. We calculate the estimated execution time taken by the programs implementing the sequential and parallel algorithms and also compute the speedup. The accuracy of the quadrature formulas has been found to be in the order: Simpson's three-eighth rule > Simpson's one-third rule > Boole's rule > Trapezoidal rule.
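For reference, the two simplest composite rules studied here fit in a few lines of Python (sequential sketches; the OpenMP programs parallelize the summation loop):

    import math

    def trapezoid(f, a, b, n):
        h = (b - a) / n
        return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

    def simpson13(f, a, b, n):  # n must be even
        h = (b - a) / n
        s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
        return s * h / 3

    # Integral of sin(x) on [0, pi] is exactly 2.
    print(trapezoid(math.sin, 0, math.pi, 64))   # ~1.9996
    print(simpson13(math.sin, 0, math.pi, 64))   # ~2.0000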
Sequential Clustering Algorithms for Anonymizing Social Networks
P.MohanaLakshmi, P.Balaji, P.Nirupama
Privacy preservation in social networks is a major problem nowadays. In a distributed setting, the complex data are divided between several data holders. The goal is to arrive at an anonymized view of the unified network without revealing to any of the data holders information about links between nodes that are held by other data holders. To that end, two variants of an anonymization algorithm based on sequential clustering (Sq) are offered for the centralized setting. The proposed algorithms substantially outperform the SaNGreeA algorithm due to Campan and Truta, which is the leading algorithm for achieving anonymity in networks by means of clustering; secure distributed versions of the algorithms are then devised. To the best of our knowledge, this is the first study of privacy preservation in distributed social networks. We conclude by outlining potential research proposals in this direction.
A Review on Evaluation Measures for Data Mining Tasks
B. Kiranmai, Dr. A. Damodaram
Data mining is at the cutting edge of technology. We apply data mining techniques in various fields and the results are enormous. Assessment is important for checking quality, and metrics or measures play a vital role in guiding the work. This paper summarizes various evaluation criteria for the assessment of data mining tasks. In particular, assessment metrics for the main data mining functionalities (classification, clustering and association rule mining) are discussed.
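As a concrete instance of such measures, the standard metrics derived from a binary confusion matrix can be computed as follows (a minimal Python sketch):

    def classification_metrics(tp, fp, fn, tn):
        # Standard assessment measures from a binary confusion matrix.
        accuracy  = (tp + tn) / (tp + fp + fn + tn)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall    = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return {"accuracy": accuracy, "precision": precision,
                "recall": recall, "f1": f1}

Analogous measures exist for clustering (e.g. purity, silhouette) and for association rules (support, confidence, lift).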
PCA Based Classification of Relational and Identical Features of Remote Sensing Images
Balaji. T, Sumathi. M
Principal Component Analysis (PCA) is useful in reducing the dimensionality of a data set in order to obtain a simpler dataset in which the characteristics of the original dataset that contribute most to its variance are retained. The method transforms the original data set into a new dataset which may better capture the essential information. Remote sensing images from orbiting satellites are gaining ground in recent years in the inventory, mapping and monitoring of earth resources. These images are acquired in different wavelengths of the electromagnetic spectrum, and therefore correlation exists between the bands. The developed algorithm can not only reduce the dimensionality of a remote sensing image but also extract information helpful for differentiating the target feature from other vegetation types more effectively. In this paper the usefulness and innovativeness of PCA in the processing of multispectral remote sensing images are highlighted. It has been observed that PCA effectively summarizes the dominant modes of spatial, spectral and temporal variation in the data in terms of linear combinations of image frames. It provides maximum visual separability of image features, improving the quality of ground truth collection and in turn the image classification accuracy. Here, we propose a fast alternative to iterative PCA that makes it suitable for remote sensing applications while ensuring its theoretical convergence, illustrated on the challenging problem of urban monitoring.
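The band-space transform at the core of PCA can be sketched as follows (a minimal NumPy illustration for an (H, W, B) multispectral cube; this is standard PCA, not the paper's fast iterative alternative):

    import numpy as np

    def pca_bands(image, n_components=3):
        # PCA over the spectral bands of an (H, W, B) multispectral image.
        h, w, b = image.shape
        X = image.reshape(-1, b).astype(float)
        X -= X.mean(axis=0)                  # centre each band
        cov = np.cov(X, rowvar=False)        # B x B band covariance
        vals, vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
        order = np.argsort(vals)[::-1][:n_components]
        return (X @ vecs[:, order]).reshape(h, w, n_components)

The first component captures the dominant shared variation across bands; later components isolate progressively finer spectral differences.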
A Study of Quantum Cryptographic Architectures and Their Efficient Implementations
B.Sujatha, S.Nagaprasad, G.Srinivasa Rao
Quantum cryptography is a promising approach in view of the eventual existence of quantum computers. It is a unique cryptographic technique that provides a secret protocol that cannot be broken by anybody, because it depends on the physical laws of quantum mechanics, unlike today's most impressive mathematical puzzles used in PGP (Pretty Good Privacy). Quantum key distribution works in the following manner (this view is the "classical" model developed by Bennett and Brassard in 1984; other models do exist): assume that two individuals, traditionally named Alice and Bob, wish to exchange a message securely. Say Alice transmits photon number 349 as an UPRIGHT/LEFTDOWN polarization to Bob; an eavesdropper Eve using a rectilinear polarizer can only measure UP/DOWN or LEFT/RIGHT photons accurately, so her interception disturbs the channel and can be detected. A commitment and a quantum channel do not directly imply secure multi-party computation, but in the quantum setting they are significantly useful: Crépeau and Kilian showed that from a commitment and a quantum channel one can construct an unconditionally secure protocol for so-called oblivious transfer [1], and Kilian showed that oblivious transfer permits implementation of almost any distributed computation in a secure manner (so-called secure multi-party computation) [2]. In the classical setting, similar results can be achieved when assuming a bound on the amount of classical (non-quantum) data that the adversary can store [9]. It was proved, however, that in this model the honest parties also have to use a large amount of memory (namely the square root of the adversary's memory bound) [10], which makes these protocols impractical for realistic memory bounds, since with today's technology such as hard disks an adversary can cheaply store large amounts of classical data. Finally, the goal of position-based quantum cryptography is to use the geographical location of a player as its credential.
Reducing Blocking Probability of 3G Mobile Communication System Using Time Multiplexing NOVSF Codes
Dr. Mahesh Kumar, Mr. Pradeep Kumar
Orthogonal variable spreading factor (OVSF) codes have been proposed for data channelization in the W-CDMA access technology of IMT-2000. OVSF codes have the advantage of supporting variable bit rate services, which is important for emerging multimedia with different bandwidth requirements. OVSF codes are employed as channelization codes in W-CDMA. Any two OVSF codes are orthogonal if and only if neither is a parent code of the other. Therefore, when an OVSF code is assigned, it blocks all of its ancestor and descendant codes from assignment because they are not orthogonal to it. This code-blocking problem of OVSF codes can cause a substantial loss of spectrum efficiency, whereas efficient channelization code management results in high code utilization and increased system capacity.
The common purpose of all of these schemes is to minimize the blocking probability and the code reallocation cost so that more newly arriving call requests can be supported. The probability of code blocking due to inappropriate resource allocation is thus minimized using time-multiplexed NOVSF codes of spreading factor 32 [1].
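The recursive structure that causes code blocking is easy to see in a sketch of the OVSF code tree (Python; illustrative only):

    def ovsf_children(code):
        # Each OVSF code c of length L spawns two orthogonal children
        # of length 2L: (c, c) and (c, -c).
        return [code + code, code + [-x for x in code]]

    def ovsf_layer(sf):
        # All OVSF codes with spreading factor sf (a power of two).
        layer = [[1]]
        while len(layer[0]) < sf:
            layer = [child for c in layer for child in ovsf_children(c)]
        return layer

    print(ovsf_layer(4))
    # Assigning e.g. (1, 1, -1, -1) blocks its parent (1, 1) and every
    # longer code that begins with (1, 1, -1, -1).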
Role of Data Mining in the Manufacturing and Service Application
Rajesh Kumar
In today's scenario, the manufacturing and service industries are focusing on highly effective tools which are mainly concentrated on data mining. Several organisations and researchers have devoted their efforts to reviewing data mining tools and surveying data miners. In this paper, the notion of machine learning and data mining in various manufacturing and service applications is introduced. As a result, the idea of building decision-making systems becomes clearer, as machine learning algorithms help in extracting knowledge from databases of diverse forms. For example, from operational engineering data we can determine optimal control parameters and detect faults in equipment. So a framework is designed which uses constructs such as decision maps and decision tables for a decision-making system that helps in applying and organizing knowledge in manufacturing and service applications.
Wireless communication is communication between two or more points without the use of any interconnecting wires; it permits long-range communication. The best example of wireless communication is the radio system. In the early days most systems transmitted analog signals, but nowadays radio systems transmit digital data composed of binary bit streams obtained directly from signals or by digitizing analog signals. The architecture used in this type of communication is a multimode, multichannel RF transmitter and receiver using a delta-sigma modulator. This paper deals with the single-channel and multichannel transmitter sections; these two architectures are implemented using the VHDL programming language.
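For intuition, the first-order delta-sigma loop can be sketched behaviorally in a few lines of Python (the actual implementation here is in VHDL; this sketch only illustrates the principle):

    def delta_sigma(samples):
        # First-order delta-sigma modulator: converts samples in [-1, 1]
        # into a +/-1 bit stream whose running average tracks the input.
        integrator, y, bits = 0.0, 0.0, []
        for x in samples:
            integrator += x - y              # accumulate quantization error
            y = 1.0 if integrator >= 0 else -1.0
            bits.append(int(y))
        return bits

    print(delta_sigma([0.5] * 12))  # a constant 0.5 yields ~75% ones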
Mobile video streaming and sharing in social network using cloud by the utilization of wireless link capacity
Pradeep M, Anil Kumar
As mobile users increase day by day, the demand for video traffic over mobile networks keeps rising, and the wireless link capacity cannot satisfy the demands of mobile users due to variations in link status and frequent disruptions, causing long buffering times. To overcome this, a framework is proposed for the efficient utilization of wireless link capacity by combining Adaptive Mobile Video streaming (AMoV) and Efficient Social Video sharing (ESoV) techniques using the cloud. AMoV combines adaptability and scalability features, providing efficient utilization of bandwidth; likewise, ESoV provides efficient social video sharing using the same combination of features, so that the user experiences a continuous flow of video streaming without buffering.
Assessment of Sustainability Indicators for Off Grid Energy System
Rahul Lanjewar, Subroto Dutt
Because reliable and sustainable electricity supply is regarded as essential for economic development, renewable energy resources are a favorable alternative for energy supply. Renewable energy systems commonly use solar, wind and hydro energy sources; in order to handle their fluctuating nature, these sources can be integrated together. Such systems use different energy generators in combination, thereby maintaining a stable energy supply in times of shortage of one of the energy resources. For instance, winds are usually relatively strong in winter, while solar radiation is higher in summer. A balanced system provides stable output from sources such as these and minimizes the dependence of the output upon seasonal changes. The main hope attributed to these systems is their good potential for economic development.
In cryptography, MD5, known as Message Digest 5, was developed to overcome the cryptanalytic attacks on the MD4 algorithm. Most hash functions based on the Merkle-Damgard construction, such as MD2, MD4, MD5 and SHA-1, are often utilized with digital signature algorithms, keyed-hash message authentication codes, key derivation functions and random number generators. MD5 takes as input a message of arbitrary length and produces as output a 128-bit message digest. Though MD5 came as a modification of MD4 to overcome its weaknesses, in 2004 some serious flaws were discovered, and advances were made in breaking MD5 in 2005, 2006 and 2007 [1][2]. This paper presents a new algorithm which is a modification of the existing MD5 algorithm to overcome the various attacks possible on MD5.
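For reference, the input/output behavior is easy to observe with a standard library (Python's hashlib implements the original MD5, not the modified algorithm proposed here):

    import hashlib

    digest = hashlib.md5(b"The quick brown fox jumps over the lazy dog").hexdigest()
    print(digest)            # 9e107d9d372bb6826bd81d3542a419d6
    print(len(digest) * 4)   # 128 bits, rendered as 32 hex characters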
STUDY OF REACTIVE POWER COMPENSATION USING STATIC VAR COMPENSATOR
Prof. Ameenudin Ahmad, Mukesh Kumar Tanwer
This paper presents an application of a Static Var Compensator (SVC). An SVC is based on power electronics and belongs to the family of static devices known as FACTS (Flexible AC Transmission Systems) controllers, which can be used to increase the capacity and flexibility of a transmission network. The effect of wind generators on power quality is an important issue: non-uniform power production causes variations in system voltage and frequency, so wind farms require high reactive power compensation. Advances in high-power semiconductor devices have led to the development of FACTS. A FACTS device such as an SVC injects reactive power into the system, which helps in maintaining a better voltage profile.
A Review on Multiple Single Hop Clustering Based Data Transmission in Wireless Sensor Network
Pratistha Sharma, Mr. Abhishek Gupta
The most important objectives in a wireless sensor network are to enhance the lifetime of the network and to use its energy effectively. Many traditional approaches have been proposed in wireless sensor networks (WSNs) to achieve these objectives, but they are not efficient and reliable enough in terms of energy utilization on the network nodes. Moreover, nodes are typically considered homogeneous in the research that has evolved in this field, while in the real world homogeneous sensor networks are rarely deployed. Thus, we require a clustering technique which works in heterogeneous environments that relate more closely to real-life deployments. This paper reviews the energy protocols used in WSNs and defines the key factors required for efficient data transmission or communication. Along with this, the paper includes an improvement proposal for energy conservation in sensor networks, which consumes less energy in long-distance communication than previous single-hop protocols and thereby helps enhance network lifetime.
Content-based video retrieval has a wide spectrum of promising applications, motivating the interest of researchers worldwide. This paper presents an overview of the general strategies used in visual content-based video retrieval. It focuses on the different methods for video structure analysis, including shot segmentation, key frame extraction, scene segmentation, feature extraction, video annotation, and video retrieval methods. This work helps upcoming researchers in the field of video retrieval to get an idea of the different techniques and methods available.
Localization of Wireless Equipment Using an Unmanned Mini-Helicopter-Based Airborne Sniffer
N.Anusha, P. Ramesh Babu, P.Nirupama
HAWK is a fully functional and highly portable mini Unmanned Aerial Vehicle (UAV) system for conducting aerial localization. HAWK is a programmable mini helicopter - a Draganflyer X6 - armed with a wireless sniffer, a Nokia N900. We developed custom PI control laws to implement a robust waypoint algorithm for the mini helicopter to fly a planned route. A Moore space-filling curve is designed as a flight route for HAWK to survey a specific area, and a set of theorems is derived to calculate the minimum Moore curve level for sensing all targets in the area with minimum flight distance. With such a flight strategy, we can confine the location of a target of interest to a small hot area, and we can recursively apply the Moore-curve-based flight route to the hot area for fine-grained localization of the target. We have conducted extensive experiments to validate the feasibility of HAWK.
WiMAX stands for Worldwide Interoperability for Microwave Access. WiMAX refers to broadband wireless networks that are based on the IEEE 802.16 standard, which ensures compatibility and interoperability between broadband wireless access equipment. WiMAX, which will have a range of up to 31 miles, is primarily aimed at making broadband network access widely available without the expense of stringing wires (as in cable-access broadband) or the distance limitations of the Digital Subscriber Line.
Chroma-Key Effect by Optimizing Coarse and Fine Filter
Pallavi P. Kotgire, Dr. S. K. Shah
Keying has become a common image processing operation in TV and film production for separating elements from a background. Chroma keying is a robust and important technique for image and video processing which is widely used for magazine covers, cinema films, the video game industry and television programs such as live talk shows and weather forecasts. A real-time chroma-key method using a coarse and fine filter is proposed in this paper: based on the K-means clustering algorithm, an improved method, namely the coarse and fine filter, is developed. A hardware architecture for the proposed method is implemented on a Virtex-5 FPGA board, and the coarse and fine filter is then optimized to reduce resource utilization. Experimental results show that the proposed design gives better quality composite images and requires less buffer size within very little time.
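A drastically simplified coarse stage can be sketched as a color-distance threshold (illustrative Python/NumPy only; the hardware coarse and fine filter and its K-means refinement are more elaborate):

    import numpy as np

    def coarse_key_mask(frame, key_rgb=(0, 255, 0), threshold=100.0):
        # Mark pixels whose Euclidean RGB distance to the key colour is
        # below the threshold as background (True).
        diff = frame.astype(float) - np.asarray(key_rgb, dtype=float)
        return np.linalg.norm(diff, axis=-1) < threshold

    def composite(foreground, background, mask):
        out = foreground.copy()
        out[mask] = background[mask]     # replace keyed pixels
        return out

A fine stage would then refine only the pixels near the mask boundary, which is what keeps buffer requirements low.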
A Mobile Approach Applied To Public Safety In Cities
S.P.Maniraj, R.Thamizhamuthu
As the population increases day by day, natural and man-made disasters have become an important factor for public safety. The ultimate goal of defining crime and safety indexes is to provide users with safety advisory information. People are, however, not equally exposed and vulnerable to all crime types: age, gender and an array of personal features, preferences and choices play a central role in the perception of an individual's safety. In this paper we design, implement and deploy an application that retrieves and conveys to the user relevant information about the user's surroundings. We propose to achieve this vision by introducing a framework for defining public safety. Since this information may not be readily accessible, we use the localization capabilities of a user's mobile device to periodically record and locally store trajectory traces from which a future crime index may be predicted. Time series analysis, one of the forecasting techniques, is used to predict future safety values. The combination of space- and time-indexed crime datasets with mobile technologies is investigated to provide personalized and context-aware safety recommendations for mobile network users. The trajectory trace of the user is used to estimate the chance of a crime occurring around the user, and this approach is generalized to compute the chance of a crime occurring around groups of users.
Three-Port Full-Bridge Converters with Varied Voltage Input for Solar Power Systems
M.Priya, K. Babu
A systematic method for deriving three-port converters (TPCs) from the full-bridge converter (FBC) is proposed in this paper. The proposed technique splits the two switching legs of the FBC into two switching cells with different sources and allows a dc bias current in the transformer. Using this systematic technique, a novel full-bridge TPC (FB-TPC) is developed for renewable power system applications; it features a simple topology and control, a reduced number of devices, and single-stage power conversion between any two of the three ports. The proposed FB-TPC consists of two bidirectional ports and an isolated output port. The primary-side circuit of the converter functions as a buck-boost converter and provides a power flow path between the ports on the primary side. The FB-TPC can adapt to a wide source voltage range, and tight control over two of the three ports can be achieved while the third port provides the power balance in the system. Also, the energy stored in the leakage inductance of the transformer is utilized to achieve zero-voltage switching for all the primary-side switches. The FB-TPC is analyzed in detail with operating principles, design considerations, and a pulse-width modulation (PWM) scheme which aims to decrease the dc bias of the transformer.
Digital image watermarking is the process of embedding information into digital media, without compromising the media's value, in such a way that the information is difficult to remove. It is basically used to hide information from attackers.
A Novel Approach for Data Hiding by integrating Steganography & Extended Visual Cryptography
Megha Goel, Mr. M. Chaudhari
This paper proposes a novel approach for transmitting secret data securely by integrating a steganographic method with visual cryptography. Steganography is the art and science of hiding data in images. In older times, our ancestors also used steganography; for example, they used invisible ink, tattooing, etc. for hiding secret information. In the digital age, secret data is hidden within images, files, audio, video, text, etc. Cryptography is the art and science of converting plain text or data into an unreadable form called cipher text. The advantage of steganography over cryptography is that in cryptography everybody knows about the existence of the message, whereas in steganography only the sender and the receiver know about its existence, so it does not attract unwanted attention. Although there has been extensive research in the past on both cryptography and steganography, neither alone provides enough security, so the proposed system combines steganography and visual cryptography for hiding data.
Single Image Haze Removal Algorithm Using Edge Detection
Ms. P. V. Ghorpade, Dr. S. K. Shah
Images of outdoor scenes are usually degraded under bad weather conditions and bad environments, which results in hazy images. Most single-image haze removal methods have ignored the effects of sensor blur and noise. Therefore, in this paper we propose a simple but effective image prior, the dark channel prior, to remove haze from a single input image. We employ an extremum approximation method to extract the atmospheric light and propose a contour-preserving estimation to obtain the transmission by using edge-preserving and mean filters alternately. The dark channel prior is a kind of statistic of outdoor haze-free images. Using this prior with the haze imaging model, we can directly estimate the thickness of the haze and recover a high-quality dehazed image similar to a natural image. Our method efficiently avoids the halo artifacts generated in the recovered image. A software reference model for the haze removal method has been modeled in MATLAB/Simulink.
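The prior itself is a two-step minimum, sketched below (a minimal Python/NumPy/SciPy illustration of the standard dark channel computation, not the full pipeline with the edge-preserving and mean filters):

    import numpy as np
    from scipy.ndimage import minimum_filter

    def dark_channel(image, patch=15):
        # Per-pixel minimum over the RGB channels, then a local spatial
        # minimum over a patch x patch window.
        return minimum_filter(image.min(axis=2), size=patch)

    # Haze imaging model: I = J*t + A*(1 - t). With atmospheric light A,
    # transmission is commonly estimated as t = 1 - omega * dark_channel(I / A).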
Multilabel Associative Text Classification Using Summarization
Prof. P.A.Bailke, Mrs. Sonal Raghvendra Kulkarni, Dr.(Prof.)S.T.Patil
This paper deals with the curse of dimensionality in text classification using text summarization. Classification combined with association rule mining can produce more efficient and accurate classifiers than established techniques [1]. However, associative classification still suffers from the vast set of mined rules. Thus, this work brings in the advantages of automatic text summarization, which is based on identifying the set of sentences most important for the overall understanding of the document(s). These techniques use the dataset to mine rules and then filter and/or rank the discovered rules to help the user identify useful ones. Finally, for experimentation, the Reuters-21578 dataset is used, and the obtained results show that the performance of the approach is effectively improved with regard to classification accuracy, number of derived rules and training time.
A Review of Voice over IP Mobile Telephony Using WIFI
Mr. Vishal Kasat , Mr. Rakesh Pandit , Mr. Sachin Patel
Voice telephony over mobile is currently supported at a cost by service providers such as GSM, or at a cheaper cost by IP service providers. The purpose of this research is to design and implement a telephony program that uses WiFi in P2P (peer-to-peer) mode or over a WLAN (Wireless Local Area Network) as a means of communication between mobile phones at no cost. The system allows users to search for other individuals within WiFi range and establish free P2P voice connections, or to establish virtual connections through access points (APs), and also gives the user the option of using GSM when no WiFi connectivity is available. The system uses a novel algorithm to convert a mobile number into an IP address and uses it as a means for contacting other mobiles over P2P or AP WiFi connections; a correlation with the address book already available in the phone converts phone numbers into IP addresses. The system allows the user to hold voice conversations and send SMS (Short Message Service) messages as well as MMS. Inbox and outbox services, message delivery reports and message drafts are provided for SMS and MMS management. The current system only allows one call per connection, with no call waiting or conference calls.
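As a purely illustrative example of a deterministic number-to-address mapping (this is not the paper's algorithm, which is not reproduced here; the subnet and port are arbitrary choices), one could hash the phone number into a private subnet:

    import hashlib

    def number_to_ip(msisdn, subnet="192.168.", port=5060):
        # Hash the phone number into a /16 private subnet (collisions are
        # possible; a real scheme must detect and resolve them).
        h = hashlib.sha1(msisdn.encode()).digest()
        return "%s%d.%d" % (subnet, (h[0] % 254) + 1, (h[1] % 254) + 1), port

    print(number_to_ip("+15551234567"))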
Comparative approach to load balancing algorithms by partitioning of a public cloud
Anisaara Nadaph, Prof. Vikas Maral
The load balancing concept in cloud computing has a significant effect on performance; if load balancing is not done, numerous drawbacks occur. As there is a tremendous increase in everyday use of the internet, uneven distribution of workload can occur, leaving some servers overloaded and others underloaded, which can cause server crashes. The efficiency of cloud computing can be increased by the technique provided here. This paper introduces an improved load balancing model based on cloud partitioning. A comparative study of the Ant Colony and Honey Bee algorithms is done to determine the best load balancing technique. The system's main controller chooses the suitable balancer for an incoming job, and the balancer further selects the server based on one of the two algorithms (Ant Colony or Honey Bee). Hence, this system helps to select an optimal balancer and subsystem (server) to which jobs (data) are allocated.
Heuristic-based optimization of users' requests in cloud computing
Loveneesh Singla, Sahil Vashist
Cloud computing is one of the best complements to information technology. Traditionally, given the size of an instance or the CPU time, we often have to be satisfied with computing approximate solutions, but in cloud computing we move one step ahead in the form of virtual machines. In cloud computing, numerous problems arise while scheduling requests from users; in practice the issue needs to be solved efficiently even if not optimally. In this paper we obtain a solution based on existing heuristic approaches. Simulation results show the effectiveness of the approach.
Steganography is becoming an important area of research in recent years. It is the art and science of embedding information into a cover medium, viz. text, image, video, audio or other multimedia content, for military communication, authentication and many other purposes. It deals with ways of hiding the communicated message and its very existence from the unintended user. In image steganography, secret communication is achieved by embedding a message into an image as the cover file, generating a stego-image that carries the hidden information. Several image steganography techniques are in use, each with its pros and cons. This paper discusses various image steganography techniques such as least significant bit (LSB) substitution, discrete wavelet transformation, pixel value differencing, discrete cosine transformation, and masking and filtering.
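The simplest of these, LSB substitution, can be sketched as follows (a minimal NumPy illustration assuming an 8-bit grayscale cover image):

    import numpy as np

    def embed_lsb(cover, bits):
        # Hide a bit sequence in the least significant bits of a uint8 image.
        flat = cover.flatten()
        assert len(bits) <= flat.size, "payload too large for this cover"
        for i, bit in enumerate(bits):
            flat[i] = (flat[i] & 0xFE) | bit   # overwrite the LSB
        return flat.reshape(cover.shape)

    def extract_lsb(stego, n_bits):
        return [int(p) & 1 for p in stego.flatten()[:n_bits]]

Because only the lowest bit of selected pixels changes, the stego-image is visually indistinguishable from the cover, which is the pro of LSB; its con is fragility to compression and filtering.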
A Study On Diffie-Hellman Algorithm in Network Security
Vinothini, Saranya, Vasumathi
Communication is the important part of any type of network, making it possible to transfer data from one node to another. Communication needs quality and security for better performance and for acceptance by users and client companies. Quality depends on the size and other properties of the network, but security is a critical parameter, independent of network size and complexity. Security has become more important to personal computer users, organizations, and the military. Security became a major concern with the advent of the internet, and the history of security allows a better understanding of the emergence of security technology. The internet structure itself allows for many security threats to occur; a modified internet architecture can reduce the possible attacks that can be sent across the network, and knowing the attack methods allows the appropriate security to emerge. Data integrity is a key issue in security, and to maintain integrity we aim to provide better encryption processes. In our proposed work we provide harder encryption with an enhanced public key encryption protocol that can be implemented in any network to provide better security. We have enhanced the hardness of the security by improving the Diffie-Hellman encryption algorithm, adding some more security codes to the current algorithm.
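For context, the classic Diffie-Hellman exchange that the proposed work builds on fits in a few lines of Python (toy parameters; real deployments use primes of 2048 bits or more):

    import secrets

    p, g = 23, 5                        # toy public prime and generator
    a = secrets.randbelow(p - 2) + 1    # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1    # Bob's private exponent
    A, B = pow(g, a, p), pow(g, b, p)   # public values exchanged in the clear
    assert pow(B, a, p) == pow(A, b, p) # both sides derive the same shared key

The enhancement proposed in this paper adds further security codes on top of this basic exchange.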
A Comparative Study of Classification of Image Fusion Techniques
Sukhpreet Singh, Rachna Rajput
Image fusion is a process by which complementary details from multiple input images are integrated into a single image, where the fused output provides more information and is more suitable for human visual perception. Several situations in image processing require high spatial and high spectral resolution in a single image; image fusion is one solution to achieve this. Several image fusion techniques exist that can improve the quality of an image and provide more extended depth detail. This paper presents a classification of image fusion techniques, covering spatial domain fusion (the averaging method, the Brovey method, Principal Component Analysis, IHS) and transform domain fusion (multiresolution, Laplacian pyramid, curvelet transform based, discrete wavelet transform). Comparison of all the techniques identifies the better approaches for future research.
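The averaging method, the simplest spatial-domain rule listed above, reduces to one line per pixel (a minimal NumPy sketch):

    import numpy as np

    def average_fusion(img_a, img_b):
        # Pixel-wise mean of two registered source images.
        return ((img_a.astype(float) + img_b.astype(float)) / 2).astype(np.uint8)

Transform-domain methods instead fuse coefficients (e.g. keeping the maximum-magnitude wavelet coefficient) before inverting the transform.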
Comparative Study of Automated Testing Tools: Selenium and Quick Test Professional
S.Rajeevan, B.Sathiyan
Software testing is one of the most important phases of software development. Automated testing is an effective testing process that reduces the effort of manual testing, but it is important to select the best-suited tool for testing. The main objective of this paper is to conduct a comparative study of automated tools such as Selenium and Quick Test Professional (QTP), evaluating and comparing them to determine their usability, maintenance cost and effectiveness. There is a wide variety of software testing tools in the market, with features covering web testing, Windows applications, etc.
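For a flavor of what Selenium automation looks like, a minimal Python test might read (assuming Selenium 4 and a local ChromeDriver; the URL and assertion are illustrative):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com")
        heading = driver.find_element(By.TAG_NAME, "h1")
        assert "Example" in heading.text    # a simple functional check
    finally:
        driver.quit()

QTP (now UFT) expresses the equivalent steps in VBScript through its object repository, which is one of the usability differences such a comparison examines.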
Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. However, despite the fact that cloud computing offers huge opportunities to the IT industry, the development of cloud computing technology is currently in its infancy, and many issues remain to be addressed.
Large-scale distributed systems such as cloud computing applications are becoming very common these days. These applications come with increasing challenges concerning how to transfer data and where to store and compute it. The most prevalent distributed file system for dealing with these challenges is the Hadoop Distributed File System (HDFS), a variant of the Google File System (GFS). However, HDFS has two potential problems. The first is that it depends on a single name node to manage almost all operations on every data block in the file system; as a result, the name node can be a bottleneck resource and a single point of failure. The second potential problem is that HDFS depends on TCP to transfer data; as has been cited in many studies, TCP takes many rounds before it can send at the full capacity of the links in the cloud, resulting in low link utilization and longer download times. To overcome these problems of HDFS, a new distributed file system is presented in this thesis, whose scheme uses a lightweight front-end server to connect all requests with many name nodes, i.e. Triple Security.
A Comprehensive Comparison between WiMAX and Wi-Fi
Shally Sharma, Navdeep Singh Randhawa
WiMAX and Wi-Fi are rapidly progressing technologies, and nowadays it is common to find them not just in laptops and personal digital assistants but in equipment such as mobile phones, parking meters, security cameras and home entertainment equipment. A combined Wi-Fi and WiMAX deployment offers a more cost-effective solution than a sole WiMAX or Wi-Fi implementation and solves many of the difficulties in last-mile implementations. WiMAX extends the benefits of Wi-Fi networks to deliver the next-generation mobile Internet. This paper describes the two wireless communication technologies, Wi-Fi (Wireless Fidelity) and WiMAX (Worldwide Interoperability for Microwave Access), along with their comparison.
Distribution of cloud resources dynamically by using virtualization
D.Sireesha, M.Chiranjeevi, P.Nirupama
In the cloud, data are stored on servers at a remote location, and cloud computing provides resources to business customers based on their needs. In cloud computing, resource allocation (RA) is the process of assigning available resources to the cloud applications that need them over the internet. Cloud providers deliver applications via the Internet, accessed through web browsers. Resource allocation starves services if it is not managed precisely. A resource allocation strategy (RAS) is all about integrating cloud provider activities for utilizing and allocating scarce resources within the limits of the cloud environment so as to meet the needs of the cloud applications. Resource multiplexing is done in the cloud environment using virtualization technology, and we present a system that uses virtualization to allocate data center resources dynamically. By minimizing the number of servers used, we can also support green computing. To measure the unevenness in the multi-dimensional resource utilization of a server, we introduce the concept of skewness; by minimizing skewness, we can improve the overall utilization of server resources. To prevent overload in the system effectively while saving energy, we develop a set of heuristics.
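The skewness measure can be sketched as follows (assuming, as in the common formulation, the root of squared deviations of each resource's utilization from the server's average utilization):

    import math

    def skewness(utilizations):
        # Unevenness of a server's multi-dimensional resource usage:
        # zero when CPU, memory, network, ... are equally utilized.
        mean = sum(utilizations) / len(utilizations)
        return math.sqrt(sum((u / mean - 1) ** 2 for u in utilizations))

    print(skewness([0.8, 0.8, 0.8]))   # 0.0   - perfectly even
    print(skewness([0.9, 0.2, 0.4]))   # ~1.02 - skewed usage

A placement heuristic might then prefer the candidate server whose skewness grows least when a new VM is added.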
Defense against Shoulder Surfing Attack for Recognition Based Graphical Password
Swati Kumari, Ruhi Kaur Oberoi
A recognition-based graphical password is secure against shoulder surfing, guessing and capture attacks. In this paper we propose a technique for defense against shoulder surfing attacks. The technique is based on a grid of images: the user first gives his username and then identifies his password images, eliminating the rows and columns that do not contain a password image; this process is applied twice, and the password is then mapped into another blank grid.
In recent years, the Optical Transpose Interconnection System (OTIS) has attracted researchers for solving computation- and communication-intensive problems. OTIS is a hybrid interconnection network that exploits the advantages of electronic links as well as optical links for connecting processors in the network. Many variants of the OTIS model and several parallel algorithms for different problems have been proposed on those models. In this paper, we propose shortest path routing and sparse enumeration sort on the OTIS hyper hexa-cell (OTIS-HHC). In the sparse enumeration sort, the number of keys to be sorted is p^x for some constant x ≤ ½, and for our proposed algorithm x = ½. The proposed shortest path routing on OTIS-HHC is optimal, as the number of steps required is equal to its diameter (2dh+3). The time complexity of the algorithm proposed for parallel sparse enumeration sort is 4(dh+1) electronic moves + 4 OTIS moves for case I and 4(dh+1) electronic moves + 3 OTIS moves for case II.
Discovering and Verifying Authentication of Nearest Nodes in Mobile Ad hoc Networks
N.R.Anitha, E. Murali
A growing number of ad hoc networking protocols and location-based services require that mobile nodes learn the positions of their nearest neighbors. However, such a process can easily be misused or disrupted by adversarial nodes. In the absence of fixed nodes, the discovery and verification of neighbor positions has hardly been investigated in existing systems. In this thesis we introduce a fully distributed solution that is robust and private against adversarial nodes and can be compromised only by a huge number of colluding neighbors. Results show that the protocol can counter more than 99 percent of the threats, and under the best possible conditions the genuine nodes are correctly identified.