Personalized Web Search with Custom Privacy Preservation
Swapna Desale, Jyothi Rao
One of the most common tasks users perform on the Internet is searching. Information on the World Wide Web is growing rapidly, so users face difficulties in getting effective and efficient search results for their search terms. Personalization techniques are used to provide more effective and relevant search results to the user. We study personalized Web search, which returns results based on the user's preferences. However, users are reluctant to share their personal information during the search process, which has become a major hurdle to the wide adoption of personalized Web search. The proposed system generalizes a profile containing all interest areas related to the query from the complete user profile while satisfying the privacy requirements defined by the user. The system aims to maximize profile utility while minimizing privacy risk. The click-log is used for updating the user profile and providing page recommendations.
Secured ICMP-Based IP Traceback Scheme to Trace the Spoofed IP Locations
V. Mariyammal, Dr. K. Thamodaran
The Internet Protocol (IP) is the basic protocol for sending data over the Internet and many other computer networks. IP spoofing is the creation of Internet Protocol (IP) packets with a forged source IP address, with the purpose of concealing the identity of the sender or making it appear that the packet was sent by a different machine. The objective of this paper is to devise a passive IP traceback (PIT) scheme using the Internet Control Message Protocol (ICMP) to avoid the operational obstacles of existing IP traceback schemes. PIT investigates ICMP error messages (path backscatter) triggered by spoofing traffic and tracks the spoofers based on publicly available information (e.g., topology). The scheme discovers the spoofers without any operational requirement; it presents the causes, collection and statistical results on path backscatter, demonstrates the processes and effectiveness of PIT, and shows the locations of spoofers captured by applying PIT to the path backscatter data set. These results are capable of exposing the spoofed IP locations.
A Novel approach to prevent launching of attacks in MANETs using Bait tracing setup
Sushma K R, Mrs. Renuka Malge
In mobile ad hoc networks (MANETs), an essential requirement for establishing communication among nodes is that nodes should cooperate with each other. In the presence of malicious nodes, this requirement may lead to serious security concerns; for instance, such nodes may disrupt the routing process. In this context, preventing or detecting malicious nodes launching grayhole or collaborative blackhole attacks is a challenge. This paper attempts to resolve this issue by designing a dynamic source routing (DSR)-based routing mechanism, referred to as the cooperative bait detection scheme (CBDS), which integrates the advantages of both proactive and reactive defence architectures. Our CBDS method implements a reverse tracing technique to help achieve the stated goal. Simulation results are provided, showing that in the presence of malicious node attacks, CBDS outperforms the DSR, 2ACK and best-effort fault-tolerant routing (BFTR) protocols (chosen as benchmarks) in terms of packet delivery ratio and routing overhead (chosen as performance metrics).
Comparative Analysis of Soft Computing Based Load Balancing Techniques in Cloud Environment: A Review
Sapna Pooja Nagpal
Cloud computing is a computing paradigm for managing and delivering services over the Internet. The fundamental building blocks of the cloud computing model are computing, storage and software delivered as a service. But the size of computation and the demand for higher computation are growing very rapidly, causing an uneven and heavy workload on cloud resources. One of the core problems that cloud resource scheduling needs to solve is load balancing. In the cloud resource scheduling process, if the load changes suddenly, resource scheduling may become skewed. In this paper, we present a review of the existing soft-computing-based techniques for load balancing in the cloud computing environment. A comparison of these existing techniques is also made.
Intrusion Detection System Using Machine Learning Approach
P. Akshaya
In this paper, we present an intrusion detection model based on a genetic algorithm and a neural network. The key idea is to take advantage of the classification abilities of the genetic algorithm and the neural network for an intrusion detection system. The new model has the ability to recognize an attack, to differentiate one attack from another (i.e., to classify attacks), and, most importantly, to detect new attacks with a high detection rate and a low false-negative rate. The approach uses evolutionary principles on the traffic data in order to filter it and thus reduce complexity. To implement and measure the performance of this system, we used the KDD99 benchmark dataset and obtained a reasonable detection rate.
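A minimal sketch of the kind of pipeline the abstract describes, combining genetic-algorithm feature selection with a neural-network classifier. The KDD99 loading, feature layout and all parameter values are placeholders, not the authors' implementation.

```python
# GA-style feature selection wrapped around an MLP classifier (illustrative only).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def fitness(mask, X, y):
    """Detection accuracy of an MLP trained on the features selected by the mask."""
    if mask.sum() == 0:
        return 0.0
    Xtr, Xte, ytr, yte = train_test_split(X[:, mask == 1], y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=200, random_state=0)
    clf.fit(Xtr, ytr)
    return clf.score(Xte, yte)

def ga_select(X, y, pop_size=10, generations=5, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))          # binary feature masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]  # keep the fittest half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
            child[rng.integers(n)] ^= 1                      # single-bit mutation
            children.append(child)
        pop = np.vstack([parents, children])
    return pop[np.argmax([fitness(ind, X, y) for ind in pop])]
```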
Optimized Secure Image Transmission Approach with Lossless Secret Image Recovery Based on Secret Fragmented Mosaic Images
Ankoji Pavan, K. Ramanjaneyulu
The prominence of the digital image processing is growing in the field of security, as now s days the transfer of images from one entity to another entity using internet as source of medium is increased. The digital image transmission has advanced applications in the field of security such as security related the confidentiality of medical databases, preventing leakages from the military databases, privacy protection the enterprise related documents and online document storage systems. Although lot of research has been done on secure image transmission but still it too has some unresolved issues such as leakages while transmissions and loss of the confidentiality is another drawback. An innovative approach is implemented in this paper and the main idea of the proposed work is create meaningful secret fragment mosaic image system and the important implementation in the proposed work is the secret fragment mosaic image size is same as that of preselected target image which gives scope to provide more payload capacity. The generation of the mosaic image is the resultant of the block fragments of the selected secret image and the mosaic image in the process is looks alike of the target image which is used as the source to hide the secret image by successfully transforming the color characteristics of the secret image similar to the blocks of the target image. Finally the simulation results show the performance in terms of presenting the meaningful secure image transmission technique for the lossless recovery and the necessary data is embedded into mosaic image for the recovery of the securely transmitted secret image.
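A common way to realize the colour transformation the abstract describes is to match the mean and standard deviation of each secret-image block to those of its target block; the sketch below illustrates that idea only and is not necessarily the authors' exact transformation or block-matching strategy.

```python
# Per-block colour transformation by mean/standard-deviation matching (illustrative).
import numpy as np

def transform_block(secret_block, target_block, eps=1e-6):
    out = np.empty_like(secret_block, dtype=np.float64)
    for c in range(secret_block.shape[2]):                  # per colour channel
        s = secret_block[..., c].astype(np.float64)
        t = target_block[..., c].astype(np.float64)
        out[..., c] = (s - s.mean()) * (t.std() / (s.std() + eps)) + t.mean()
    return np.clip(out, 0, 255).astype(np.uint8)
```

The parameters of this mapping (per-block means and standard deviations) are exactly the side information that would be embedded in the mosaic image to allow lossless recovery of the secret blocks.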
An Efficient Approach for Secure Data Hiding Using Cryptography
Reetu, Ms. Mamta, Mr. Satish Dhull
Cryptography is a technique used to avoid unauthorized access to data. It has two main components: a) an encryption algorithm, and b) a key. Sometimes, multiple keys can also be used for encryption. A number of cryptographic algorithms are available, such as DES, AES, TDES and RSA. The strength of these encryption algorithms depends upon their key strength. Strong encryption algorithms and optimized key management techniques always help in achieving confidentiality, authentication and integrity of data and reduce the overheads of the system. A long key length takes more computing time to crack and makes it difficult for a hacker to break the cryptographic model. In this paper we suggest an innovation in the age-old conventional cryptographic technique of the Hill cipher using the concept of a self-repetitive matrix. A numerical method has been stated, mathematically proved and later implemented for generating a random matrix of given periodicity. The method of the self-repetitive matrix has then been used to simulate a communication channel with proper decompression techniques to facilitate bit saving.
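For orientation, a minimal sketch of Hill-cipher encryption and of the self-repetitive-matrix idea: a key matrix A with A^k = I (mod 26) can be inverted simply as A^(k-1). The key matrix and message below are illustrative; the paper's channel simulation and decompression steps are not shown.

```python
# Hill cipher with a self-repetitive key matrix (illustrative sketch).
import numpy as np

M = 26
A = np.array([[3, 3], [2, 5]])              # example key matrix, invertible mod 26

def periodicity(A, mod=M, limit=10000):
    """Smallest k with A^k = I (mod mod), if found within the limit."""
    P = np.eye(len(A), dtype=int)
    for k in range(1, limit + 1):
        P = (P @ A) % mod
        if np.array_equal(P, np.eye(len(A), dtype=int)):
            return k
    return None

def encrypt(text, A, mod=M):
    nums = [ord(c) - ord('A') for c in text.upper() if c.isalpha()]
    while len(nums) % len(A):
        nums.append(ord('X') - ord('A'))     # pad the last block
    out = []
    for i in range(0, len(nums), len(A)):
        block = np.array(nums[i:i + len(A)])
        out.extend((A @ block) % mod)
    return ''.join(chr(int(v) + ord('A')) for v in out)

k = periodicity(A)
cipher = encrypt("HELLO", A)
# Decryption reuses encrypt() with the matrix A^(k-1) mod 26 as the key.
```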
MATLAB/Simulink of a Solar Photovoltaic System through Cuk Converter
Prinkesh Soni, Ankush Yadav, Rakesh Singh Lodhi, Amit Solanki
In parallel with developing technology, the demand for more energy makes us seek new energy sources. The most important application field of this research is renewable energy resources. Solar energy is becoming popular owing to its abundance, ease of availability and convertibility to electric energy. This paper covers a detailed analysis of a solar PV system with a Cuk converter such that it gives a constant, stepped-up DC voltage to the load. A Cuk converter is used to examine the performance of the solar PV system. The Cuk converter in the PV system is modelled in Simulink and the output voltage is analyzed through MATLAB.
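As background for the stepped-up DC voltage claim, the ideal continuous-conduction-mode voltage relation of the Cuk converter (a standard textbook result, not taken from the paper) is

\[
\frac{V_{o}}{V_{in}} = -\frac{D}{1-D}, \qquad 0 < D < 1,
\]

so the output magnitude exceeds the input when the duty cycle \(D > 0.5\) and is reduced when \(D < 0.5\); the negative sign reflects the polarity inversion inherent to the Cuk topology.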
An Efficient and Scalable Technique for Texture Analysis in an IR Model
Akanxa Mishra, Namrata Sharma
Nowadays the popularity of multimedia content search, covering images, videos and audio, is increasing rapidly. Among these, images are the most popular for search and data retrieval. In the literature a number of techniques exist for efficient and precise image retrieval, but the literature concludes that content-based techniques are the most promising approach. In content-based image retrieval systems, the image's internal descriptors or features are used to represent the image contents. Among these descriptors, shape, colour and texture are the primary image content descriptors. A number of works are available for different combinations of these features, but comparatively little work exists on texture analysis alone. In recognizing feature-based similar objects, the texture feature plays an essential role.
Reinforcement of Event Streams by Obfuscation over Multitudinous Correlations
N. Pavani, P. Pothuraju
Event processing is an approach to capturing and processing data about events. In complex event processing systems, the data may come from multiple origins and be transmitted through multiple security authorities. Present event processing systems fail to preserve the privacy constraints of incoming event streams across a sequence of successively applied stream operations. This problem emerges in large-scale distributed applications such as a logistics chain, where event processing operators may be spread over numerous security domains. This paper presents fine-grained access management in multi-hop event processing networks. In particular, it offers a solution to maintain privacy constraints even when the events are correlated into complex events. The obfuscation value calculated using a Bayesian network is used to decide whether inheritance of the access requirement is needed. The implementation offers methods to enhance the obfuscation calculation and to increase the Bayesian network size to measure obfuscation over numerous correlations, thereby reinforcing the event streams.
An Enhanced Secure Image Cryptography based on RC6 and RSA to Minimize Entropy and Improve Correlation
Pooja Chaturvedi, Chetan Gupta
A secure environment would not be possible without the existence of encryption technology. Images, which make up a large percentage of multimedia data, require strong protection in the current scenario. Focusing on this security need, an efficient image cryptography system based on RC6 and RSA has been proposed in this paper. The speciality of this approach is the number of keys: four different keys are needed for the decryption process, along with extra shifting of pixels by XOR. The key size and the number of variable rounds make this framework more secure. The key size is variable up to 2040 bits. The results are reported in terms of entropy and correlation coefficients. Our approach achieves less variation in entropy, which shows its efficiency.
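The two evaluation metrics named in the abstract are standard and easy to compute; a minimal sketch follows, assuming an 8-bit grey cipher image supplied by the caller (image loading and the RC6/RSA cipher itself are not shown).

```python
# Shannon entropy and adjacent-pixel correlation of a cipher image (illustrative).
import numpy as np

def entropy(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))           # an ideal cipher image approaches 8 bits

def horizontal_correlation(img):
    x = img[:, :-1].ravel().astype(np.float64)
    y = img[:, 1:].ravel().astype(np.float64)
    return np.corrcoef(x, y)[0, 1]           # close to 0 for a well-encrypted image
```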
Recognition of emotions from speech is one of the most important subdomains in the field of affective computing. Sometimes a person speaks a sentence while in a particular emotional state, and the tone of speech changes the meaning of the sentence completely. A speech signal carries not only the words and their meaning but also the emotions. The emotion expressed in speech is one of the major factors behind the low recognition accuracy achieved during the development of speech-based systems. In human speech, emotions affect the tone and the speaking style of the person. Research in this area is needed to overcome these problems of emotion recognition from speech. The problem usually deals with the following basic emotion categories: happy, sad, angry, afraid, surprise and neutral. From the literature survey for the proposed study, it is observed that there is no proper emotional speech corpus in any of the Indian languages for carrying out research on emotional speech processing. No standard emotional speech database, nor a real-life emotional speech database, is available for the Indian languages. It is also observed from the literature that excitation source information has not been thoroughly investigated for the emotion recognition task. Most researchers have used frame-wise spectral features extracted from the entire utterance for speech emotion classification, and most existing emotion recognition systems are developed using only gross prosodic features extracted from the entire utterance. This paper should help researchers find a brief overview of speech emotion recognition systems developed in different languages around the world, together with the purpose and approach of the research.
Publish/Subscribe Systems Security through Distinct Identity Encryption
R. Kiranmayi, P. Pothuraju
The publish/subscribe communication paradigm has gained high popularity. Publishers inject information into the pub/sub system and subscribers specify the events of interest by means of subscriptions. Traditional content-based pub/sub provides expressiveness and an asynchronous nature but gives little attention to security. Existing approaches rely on a traditional broker network that addresses security under restricted expressiveness, or on a network of semi-trusted brokers. This paper presents a new approach to provide authentication and confidentiality in a broker-less pub/sub system by allowing subscribers to maintain credentials. Additionally, we present extensions of the cryptographic methods and multi-credential routing, which strengthen weak subscription confidentiality.
The number of people who rely on the Internet for information seeking is doubling every second. As the amount of information produced on the Internet is huge, it has been a problem to provide users with relevant information, so developers created algorithms that identify a user's nature from clickstream data and thereby identify what type of information the user would like to receive as an answer to a search. This process was later named Web personalization. As a result of Web personalization, users are placed in a loop; Eli Pariser, an Internet activist, named this loop the "filter bubble". This preliminary work explores the filter bubble, its pros and cons, and techniques to burst the filter bubble.
Simulation of Steerable Gaussian Smoothers using VHDL
Sharanabasava, Syed Gilani Pasha
Smoothing filters have been widely used in image and video analysis. Directional smoothers are also useful in motion analysis, edge detection, line parameter estimation and texture analysis. Such applications require directional filters oriented at several different angles. For real-time applications, hardware devices capable of parallel processing can be used. Steerability is the property whereby the outputs of several filtering operations can be linearly combined to produce the output of a directional filter at an arbitrary orientation, and the smoothing filters considered here possess this property. Several efficient FPGA implementations of the convolution operation for separable and non-separable kernels have been presented in the literature, but research on steerable filter implementations on FPGAs is limited. In this paper, we present implementations of steerable Gaussian smoothers in VHDL; simulation is carried out using ModelSim software.
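To make the steerability property concrete, the textbook first-derivative-of-Gaussian case (Freeman and Adelson) is sketched below: the response at an arbitrary angle is a linear combination of two basis responses at 0 and 90 degrees. This only illustrates the property the abstract describes; the paper's VHDL design uses its own basis filters.

```python
# Steering a derivative-of-Gaussian filter by interpolating two basis outputs.
import numpy as np
from scipy.ndimage import gaussian_filter

def steered_derivative(image, theta, sigma=2.0):
    gx = gaussian_filter(image, sigma, order=(0, 1))   # basis response: d/dx of Gaussian
    gy = gaussian_filter(image, sigma, order=(1, 0))   # basis response: d/dy of Gaussian
    return np.cos(theta) * gx + np.sin(theta) * gy     # response at orientation theta
```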
Study of Unified Power Quality Conditioner for Power Quality Improvement
Anupam Ojha, Amit Solanki, Rakesh Singh Lodhi
In a power system network there are many problems related to power quality, so to improve the power quality of a system we use devices such as active power filters. Active power filters are classified into two types, the Shunt Active Power Filter (APF) and the Series Active Power Filter (APF); the combination of both is known as the UPQC (Unified Power Quality Conditioner). Here we have simulated the Shunt Active Power Filter, the Series Active Power Filter and the Unified Power Quality Conditioner. The Shunt APF is used to mitigate the problems due to current harmonics caused by non-linear loads and to make the source current sinusoidal and distortion free. The control scheme used is a hysteresis current controller based on the "p-q theory". The Series APF is used to mitigate problems caused by voltage distortion and unbalance present in the source voltage and to make the load voltage perfectly balanced and regulated. The control scheme used is a hysteresis voltage controller using a-b-c to d-q transformations. The Shunt APF and Series APF are then combined to design the UPQC, and in this way current harmonics in the load current and voltage unbalances in the source voltage are both removed, the source current becomes sinusoidal and the load voltage becomes perfectly balanced.
Lossless Data Embedding in the Motion Predicted Vectors for Video Sequences
K. Sridhar, Dr. Syed Abdul Sattar, Dr. M. Chandra Mohan
A lossless data hiding scheme in the predicted frames during motion estimation is proposed in this paper. The main objective of this work is to provide an efficient, low-complexity and fast data hiding scheme for video sequences in the compressed domain. The work utilizes the concept of motion estimation, from which the predicted frames are used to hide the data. The method also provides high embedding capacity while preserving the quality of the frames. Experimental results show that the proposed approach provides good results in terms of extracted message bits and the quality of the embedded frames.
Privacy of data is one of the major issues in every network. The network may be wired or wireless, and it may be an ad-hoc, peer-to-peer or distributed network; privacy of data is an important factor in each case. In a distributed or shared network, everyone connected to the network works with the protocol, but there is no guarantee against security threats or against data being lost or stolen, so precautions are needed in each case to maintain the network and to protect private data. An ad-hoc network is a Local Area Network (LAN) which is built spontaneously as devices connect. Instead of depending upon a base station to manage the data flow, each node in the network forwards data packets to the others. As almost all nodes are responsible for data flow, the probability of data or information being lost or stolen cannot be neglected. In order to overcome these problems in data mining, there should be proper protocols as well as special precautions. Many experiments and developments have already been carried out in this area to secure data as well as the network. This paper provides a review of various security measures that have already been taken for the protection of privileged information in peer-to-peer, shared and distributed networks. We have also proposed a technique for privacy preservation of privileged information in an ad-hoc network, which may provide better security for the privileged information in such a network.
Cone metric spaces and fixed point theorems of contractive mappings
C. Vijender
In this paper we introduce cone metric spaces and prove some fixed point theorems for contractive mappings on cone metric spaces.
In this paper, we replace the real numbers by an ordered Banach space and define cone metric spaces (X, d). We discuss some properties of convergence of sequences and prove some fixed point theorems for contractive mappings. Our results generalize some fixed point theorems in metric spaces.
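For reference, a minimal statement of the definition the abstract builds on (in the style usual for cone metric spaces); the notation below is standard and not reproduced verbatim from the paper. Let \(E\) be a real Banach space ordered by a cone \(P \subset E\), with induced partial order \(\preceq\) and zero element \(\theta\). A cone metric on a set \(X\) is a map

\[
d : X \times X \to E
\]

such that for all \(x, y, z \in X\):

\[
\text{(d1)}\;\; \theta \preceq d(x,y), \ \text{and}\ d(x,y)=\theta \iff x=y; \qquad
\text{(d2)}\;\; d(x,y)=d(y,x); \qquad
\text{(d3)}\;\; d(x,y) \preceq d(x,z)+d(z,y).
\]

Taking \(E = \mathbb{R}\) and \(P = [0,\infty)\) recovers the usual metric space, which is why the fixed point theorems here generalize their metric-space counterparts.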
Characterization of Advanced Encryption Standard (AES) for Textual and Image data
S. Rehman*, S. Q. Hussain, W. Gul and Israr
Safekeeping of multimedia data is an imperative concern because of the fast progression of digital data exchange (DDE) over unsafe networks. Multimedia data safety is accomplished by methods of cryptography, which is concerned with the encryption of data. Standard symmetric encryption algorithms are responsible for enhancing the security of multimedia data, but the problem of computational overhead is always present. To overcome that problem, we examine the Advanced Encryption Standard (AES) and amend it to reduce the computation of the algorithm and improve the encryption performance. In this paper, an improved AES algorithm has been investigated. Instead of using MixColumns, we use a modified step that takes a fixed modified-step key Mk and performs an expansion routine using Rijndael's key schedule to generate a modified-step key schedule. When the Mk length is 128 bits, the expansion generates a total of 11 sub-Mk arrays of 128 bits, which are XORed with the state array in every round except the final round. Theoretical analysis and experimental results provide evidence that this technique offers high speed as well as fewer overheads on data. The modified AES algorithm is a fast, lightweight encryption algorithm for the security of multimedia data. All of the above advantages make the algorithm highly appropriate for image as well as plain-text transfer.
Automated Diagnosis of Lung Tumor Using Segmentation Techniques
S. Piramu Kailasam, Dr. M. Mohammed Sathik
The objective is to detect cancerous lung nodules from 3D CT chest images and to classify the lung disease and its severity. Although much research has been done in this area, the problem still remains challenging. FCM segmentation is used to extract the lung region. We use several feature extraction techniques, such as bag of visual words based on histograms of oriented gradients, wavelet-transform-based features, the local binary pattern, SIFT and Zernike moments. The particle swarm optimization algorithm is used to select the best features.
Mobile Printer with Bluetooth Compatibility Using Raspberry PI
G. Jhansi, S. Saraswathi
This paper proposes the design of a device which directly prints data stored inside a mobile phone without the assistance of a desktop computer. The common data printing procedure uses a desktop computer as a relay medium, first receiving the data from the mobile and then sending it to the printer using an appropriate printing method, which is rather cumbersome. The design comprises a Raspberry Pi running a Debian-based operating system. The Common Unix Printing System (CUPS) installed on the Raspberry Pi performs the print job.
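A minimal sketch of the print step, assuming CUPS is already configured on the Raspberry Pi: a file received over Bluetooth can be handed to the print queue with the standard lp command. The printer name and file path below are placeholders, not values from the paper.

```python
# Submit a received file to a CUPS queue via the standard `lp` command (illustrative).
import subprocess

def print_file(path, printer="office_printer"):
    # `lp -d <printer> <file>` submits a print job to the named CUPS queue
    subprocess.run(["lp", "-d", printer, path], check=True)

print_file("/home/pi/received/document.pdf")
```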
Enhanced Way of Association Rule Mining With Ontology
T. Bharathi, Dr. A. Nithya
For a data mining process, association rule mining is one of the key components. In the realm of data mining, association rules play a key role in condensing interesting correlations, frequent patterns, associations or causal structures from the set of items present in databases. The current paper focuses on the efficient mining of association rules from larger databases. The problem of unearthing large item sets can be addressed by the creation of an ontology tree. In the domain of instructional design and in the evolution of course content, ontology plays a very vital part. The understanding of the content can be depicted with the help of ontology trees, which in turn aid instructors in developing the content and learners in accessing the content in an apt way. Even though ontologies exist for many domains, their fitness for other subjects is still vague; for many domains, ontologies do not even exist. Many have attempted to devise methods to enhance various dimensions of ontology, namely representation languages and inference mechanisms, but unfortunately very little effort has been made to improve the practical results of applying development methods. In this paper, a discussion of the technique of association rule mining with ontology (ARMO) is presented, which is employed to find the most precise association rules in the context of ontology, ontology analysis, ontology trees and frequent item sets. The spotlight is more on the relationship type.
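For readers new to the topic, the support/confidence computation at the heart of association rule mining is sketched below on a toy transaction set; the ontology tree described in the abstract would constrain which itemsets are generated, which is not shown here.

```python
# Minimal support/confidence rule mining over a toy transaction set (illustrative).
from itertools import combinations

transactions = [{"milk", "bread"}, {"milk", "butter"},
                {"milk", "bread", "butter"}, {"bread"}]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.5, min_confidence=0.6):
    items = set().union(*transactions)
    for a, b in combinations(items, 2):
        s = support({a, b})
        if s >= min_support and support({a}) > 0:
            conf = s / support({a})                 # confidence of rule a -> b
            if conf >= min_confidence:
                yield (a, b, s, conf)

for rule in rules():
    print(rule)
```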
Scheduling Virtual Machines across Data Centres in accordance to availability of Renewable Sources of Energy
G Krishna Vasudeva Rao, K Premchand
This paper presents a new concept and approach for green, environment-friendly computing. The cloud computing model has enabled convenient and ubiquitous network access to a shared pool of configurable computing resources that can be quickly provisioned and released with minimal interaction with the service providers. In this paper we have developed an algorithm which migrates virtual machines across data centres in accordance with the availability of non-conventional and renewable sources of energy, i.e., wind energy, hydro power, solar power, geothermal power, etc. The proposed migration algorithm is general in nature but goes beyond the conventional best-fit heuristic approach. Experimental verification shows the ability of the VMs to migrate in accordance with the availability of different sources of renewable energy.
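A minimal sketch of the migration idea described in the abstract, assuming VMs are periodically moved towards the data centre currently reporting the highest renewable-energy availability, subject to capacity. The data-centre names, capacities and greedy policy below are illustrative assumptions, not the paper's algorithm.

```python
# Greedy VM-to-datacentre migration plan driven by renewable availability (illustrative).
def plan_migrations(vms, datacentres):
    """vms: dict vm -> current DC; datacentres: dict DC -> {'renewable': MW, 'free_slots': n}."""
    ranked = sorted(datacentres, key=lambda d: datacentres[d]["renewable"], reverse=True)
    plan = {}
    for vm, current in vms.items():
        for dc in ranked:
            if dc == current:
                break                                  # already on the greenest feasible site
            if datacentres[dc]["free_slots"] > 0:
                datacentres[dc]["free_slots"] -= 1
                plan[vm] = dc
                break
    return plan

dcs = {"wind_dc": {"renewable": 40, "free_slots": 2},
       "solar_dc": {"renewable": 25, "free_slots": 1},
       "coal_dc": {"renewable": 0, "free_slots": 5}}
print(plan_migrations({"vm1": "coal_dc", "vm2": "solar_dc"}, dcs))
```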
Vein Pattern Recognition: A secured way of Authentication
Navjot Kaur, Paritosh Chopra
In today's world technology is growing and improving day by day, and hence the threat to personal information as well as national data is also increasing. There are many methods, developed both internally and externally, to secure information, but those methods have not produced satisfactory results, so we need a technology that secures our information more efficiently against unauthorized access. For this purpose, palm vein authentication, the newest biometric technique with a high level of accuracy, has come into existence and is becoming popular these days. In this technique the vascular patterns of an individual's palm are used as personal identification data. The technique can be used in various fields such as ATMs, hospitals, attendance records, driver identification, construction sites, and banking and financial institutions. Business growth can also be achieved by reducing the size of the palm vein sensor and shortening the authentication time. This paper presents a review of palm vein technology and its development, working principle, applications, and the advantages of using this technology.
This paper reports the design and modelling of a triangular-shaped microspring which is proposed to be used for munition purposes. Simulation results are obtained for the Safe and Arm mechanism at different 'g' values for a MEMS-based Rocket SAM. The design of the MEMS-based Rocket SAM chip is structured in such a way that it has both movable and non-movable parts, utilizing mechanical, structural and material properties that may or may not be electronically controlled. The design proposed for the Rocket SAM in this research aims to achieve reliability, safety and cost effectiveness for mass production. Our motive is to design and simulate a metal-alloy MEMS-based Rocket SAM and derive its results using SOLIDWORKS Multiphysics software, which provides an excellent environment in which to solve this problem. Thus, this research presents the design analysis, modelling and nonlinear 3D dynamic simulation of a MEMS-based metal-alloy Rocket SAM using the SOLIDWORKS Multiphysics GUI tool.
This paper presents the basic idea of speech recognition and its progress to date. Speech recognition is essential for communication between human and machine. The ultimate goal of the technology is to produce a system that can recognize, with 90-100% accuracy, all words spoken by any user in different languages. This paper helps in selecting techniques along with their relative advantages and disadvantages. Using the e-Speaking product for speech recognition, we create a small and efficient program that takes a human voice as input and converts it into keyboard, mouse and program events, and can even speak to you to let you know what it has performed.
Visual Crypto-Steganography in Images
Pranav Swaminathan, Tejas Dani, Ronak Bhatia, Shubhankur Jain, Paulami Shah
The advent of modern technology has provided us with a platform to exchange information on a massive scale. There is always a threat of interception of this confidential information, which ultimately calls for some sort of security measure to allow uninterrupted flow of information from the sender to the receiver. Already devised methods like cryptography and steganography have been referred to as the flag bearers in this field. Cryptography is a process in which a person changes the meaning of the original data by converting it into a cipher text. Steganography, on the other hand, helps in hiding the cipher text or plain text in a medium, thereby obscuring its existence. Both are tested methods of providing security. In today's information age, information sharing and transfer have increased exponentially. The concern around making secret information completely threat-free has been a daunting problem for experts. Cryptography combined with steganography can certainly be used to overcome this threat. The two methods, when combined, help in achieving significant levels of data security, thereby proving to be better than either process used alone for transmitting data over an insecure channel which is susceptible to intrusion. One of the most secure forms of steganography is visual steganography, which is commonly implemented in image files. There might be some changes in the colour frequency of the image when the data text is embedded, which would become very obvious to the person seeing it. In order to overcome this conspicuous behaviour of the image, we propose layers of data protection where the data text will first be converted to an unreadable cipher, followed by embedding the cipher into an image file in an encrypted format, which will then be divided into shares to achieve another layer of security. Hence, the concepts of cryptography and steganography are both used to provide two layers of security, followed by a visual cryptography scheme to divide the image into shares for transmission over a network channel.
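A minimal sketch of the steganographic layer described above: the cipher text produced by the cryptographic layer is written bit by bit into the least significant bits of the cover image. This illustrates the embedding idea only; the encryption step and the visual-cryptography share generation are not shown, and the cover is assumed to be an 8-bit image array.

```python
# LSB embedding and extraction of a byte payload into a cover image (illustrative).
import numpy as np

def embed(cover, payload: bytes):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.ravel().copy()
    if bits.size > flat.size:
        raise ValueError("payload too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite least significant bits
    return flat.reshape(cover.shape)

def extract(stego, n_bytes):
    bits = stego.ravel()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()
```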
Enhancing the Security of Data Using DES Algorithm along with Substitution Technique
Yashwant Kumar, Rajat Joshi, Tameshwar Mandavi, Simran Bharti, Miss Roshni Rathour
Security is an important issue in the field of networking. Security is the initial stage of authentication, and authentication is highly influenced by modern cryptography. The Data Encryption Standard (DES) algorithm is a symmetric-key algorithm used for the encryption of electronic data and the securing of information. The security that the DES algorithm provides is limited against brute-force attacks. To improve the security of the DES algorithm, a substitution technique is added before the DES algorithm performs its process. A substitution cipher is a method of encoding by which units of plaintext are replaced with ciphertext. If the substitution technique is used before the original DES algorithm, then an intruder is first required to break the original DES algorithm and then the substitution technique.
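A minimal sketch of the two-stage idea: a simple monoalphabetic substitution is applied first, and the result is then encrypted with DES (here via the PyCryptodome library). The substitution alphabet, key and mode are illustrative assumptions, not the authors' parameters.

```python
# Substitution pre-step followed by DES encryption (illustrative sketch).
from Crypto.Cipher import DES
from Crypto.Util.Padding import pad
import string

PLAIN = string.ascii_uppercase
SUBST = "QWERTYUIOPASDFGHJKLZXCVBNM"        # example substitution alphabet

def substitute(text):
    return text.upper().translate(str.maketrans(PLAIN, SUBST))

def encrypt(text, key=b"8bytekey"):
    cipher = DES.new(key, DES.MODE_ECB)
    return cipher.encrypt(pad(substitute(text).encode(), DES.block_size))

ct = encrypt("ATTACK AT DAWN")
```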
An Efficient Method for Finding Closed Subspace Clusters for High Dimensional Data
S. Anuradha, K. B. Madhuri, B. Jaya Lakshmi
Subspace clustering tries to find groups of similar objects in a given dataset such that the objects are projected onto only a subset of the feature space. It finds meaningful clusters in all possible subspaces. However, when it comes to the quality of the resultant subspace clusters, most of them are redundant. These redundant subspace clusters do not provide new information. Hence there is a need to eliminate such redundant subspace clusters and output only those that are non-redundant, each contributing some unique information to the data miner. The set of non-redundant subspace clusters is helpful for easy analysis. In order to accomplish this, the concept of closedness has been applied to the subspace clusters. An algorithm known as Finding Closed Subspace Clusters (FCSC) is presented which efficiently outputs the closed subspace clusters from a given set of subspace clusters produced by any subspace clustering algorithm. Based on the experimental study conducted, the number of clusters generated by FCSC is greatly reduced compared to the existing SUBCLU algorithm, and the average purity of the clusters is marginally improved without loss of coverage.
Effect of Nano Fluids in Solar Flat Plate Collector Systems
S. Arockiaraj, P. Jidhesh
Nanofluids are combinations of a base fluid and nano-sized solid particles and are used in a range of applications. In general, nanofluids have been used for various applications such as heat transfer and solar energy applications. Nowadays the usage of fossil fuels is high, and their quantity in the earth is declining, so we turn to alternative energy sources such as solar, wind and tidal power. A heat transfer system contains many losses; in order to overcome these losses, we use an effective heat transfer substance. So far, reviews have been carried out on nanofluid preparation, thermal conductivity and performance. The main objective of this paper is to study the enhanced heat transfer in solar collector systems obtained with various nanofluids.
Finite Element Analysis Of Linear Elastic Torsion For Regular Polygons
H. T. Rathod*, K. Sugantha Devi, C. S. Nagabhushana, H. M. Chudamani
This paper presents an explicit finite element integration scheme to compute the stiffness matrices for linear convex quadrilaterals. Finite element formulations express stiffness matrices as double integrals of the products of global derivatives. These integrals can be shown to depend on triple products of the geometric properties matrix and the matrix of integrals containing rational functions with polynomial numerators and a linear denominator in two variables as integrands over a 2-square. These integrals are computed explicitly by using the symbolic mathematics capabilities of MATLAB. The proposed explicit finite element integration scheme can be applied to solve boundary value problems in continuum mechanics over convex polygonal domains. We have also developed an automatic all-quadrilateral mesh generation technique for convex polygonal domains which provides the nodal coordinates and element connectivity. We have demonstrated the proposed explicit integration scheme by solving the Poisson boundary value problem for the linear elastic torsion of a non-circular bar with cross sections having profiles of an equilateral triangle, a square and regular polygons (pentagon (5-gon) to icosagon (20-gon)) inscribed in a circle of unit radius. Monotonic convergence from below is observed with respect to the known analytical solutions for the Prandtl stress function and the torsional constant. We present tables which list both the FEM and exact solutions. The graphical solutions of contour level curves and the corresponding finite element meshes are also displayed.
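For reference, the boundary value problem named in the abstract is the standard Prandtl stress-function formulation of Saint-Venant torsion; the statement below is the textbook form, not reproduced from the paper:

\[
\nabla^{2}\phi = \frac{\partial^{2}\phi}{\partial x^{2}} + \frac{\partial^{2}\phi}{\partial y^{2}} = -2G\theta \quad \text{in } \Omega, \qquad \phi = 0 \ \text{on } \partial\Omega,
\]

\[
T = 2\int_{\Omega}\phi \, dA, \qquad J = \frac{T}{G\theta},
\]

where \(G\) is the shear modulus, \(\theta\) the angle of twist per unit length, \(T\) the torque, and \(J\) the torsional constant against which the finite element results are compared.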
Cloud Key Bank: Privacy and Owner Authorization Enforced Key Management Scheme
Aarti Bhoi, Ashwini Madan, Tejal Khandave
In particular, there is a large range from schemes offering minor security benefits over legacy passwords to those offering significant security benefits in exchange for being more costly to deploy or more difficult to use. Current approaches to enforcing fine-grained access control on confidential data hosted in the cloud are based on fine-grained encryption of the data. Current solutions are in general inadequate to simultaneously meet the three security requirements for key management. To implement CloudKeyBank efficiently, we propose a new cryptographic primitive named SC-PRE which combines the techniques of HVE and PRE seamlessly, and propose a concrete SC-PRE scheme based on existing HVE and PRE schemes. The system assures the confidentiality of the data and preserves the privacy of users from the cloud while delegating most of the access control enforcement to the cloud.
Pulsed Latches Methodology to Attain Reduced Power and Area Based On Shift Register
Cheekati Sirisha, K. Prakash
Power consumption and area reduction are the key challenges in Very Large Scale Integration (VLSI) circuit design, and they play a major role in sequential circuit design. The shift register is a main building block in VLSI circuits; it is composed of a clock interconnection network and timing elements such as flip-flops and latches. Shift registers are usually designed using edge-triggered flip-flops, but the use of latches for shift register design also optimizes the area. This project introduces a low-power and area-efficient shift register using pulsed latches and a pulse generation circuit. If the flip-flops are replaced with pulsed latches, the area and power consumption of the shift register can be reduced to 50%. For this design, non-overlapping clock pulses are used. This solves the timing problem between pulsed latches through the use of multiple non-overlapping delayed pulsed clock signals instead of the conventional single pulsed clock signal. To minimize power consumption, a design with various non-overlapping delayed pulsed clock signals is proposed for data synchronization in a multi-bit shift register. The proposed system is designed using a popular schematic and layout capture tool with 90 nm technology.
Optimized Test compression bandwidth management for Ultra-large-Scale System-on-Chip Architectures performing Scan Test Bandwidth Management
Vengala Abhilash, J. Pushparani
In today's increasingly complex and interconnected world, system-on-a-chip (SoC) performance requirements are influenced by existing as well as evolving and emerging applications. With Moore's law supplying billions of transistors, and uni-processor architectures delivering diminishing performance, multicore chips are emerging as the prevailing architecture in both general-purpose and application-specific markets. As the core count increases, the need for a scalable on-chip communication fabric that can deliver high bandwidth is gaining in importance, leading to recent multicore chips interconnected with sophisticated on-chip networks. This paper introduces several test logic architectures that facilitate preemptive test scheduling for SoC circuits with embedded deterministic-test-based test data compression. The same solutions allow efficient handling of physical constraints in realistic applications. A detailed experimental analysis of different provisions, architectures and test-related factors is carried out to demonstrate the efficiency of the proposed method over state-of-the-art methods.
A Review on Feature Extraction Techniques for Speech Processing
Amandeep Singh Gill
Speech and language are considered uniquely human abilities. Speech is a complex signal characterized by varying distributions of energy in time as well as in frequency, depending on the specific sound being produced. The aim of digital speech processing is to take advantage of digital computing techniques to process the speech signal for increased understanding, improved communication and increased efficiency. The definition of various types of speech classes, feature extraction techniques, speech classifiers and performance evaluation are issues that require attention in the design of a speech processing system.
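As a concrete illustration of one widely used feature extraction technique covered by such reviews, Mel-frequency cepstral coefficients (MFCCs) can be computed with the librosa library; the audio file path and parameter values below are placeholders.

```python
# MFCC feature extraction from an utterance (illustrative).
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # 13 coefficients per frame
print(mfcc.shape)                                    # (13, number_of_frames)
```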
A Study on Performance Modelling and Analysis of Network on Chip under M-Port N-Tree Bursty Traffic
A. Malathi*, M. Theinmozhi
The physical constraints of integrated circuits (commonly called chips), with regard to size and the finite number of wires, have made the design of a System-on-Chip (SoC) an interesting subject of study in terms of finding better solutions for the complexity of chip interconnections. An SoC has hundreds of Processing Elements (PEs), and a single shared bus is no longer acceptable due to its poor scalability with system size. Networks on Chip (NoC) have been proposed as a solution to mitigate complex on-chip communication problems in complex SoCs. They consist of computational resources in the form of PE cores and switching nodes which allow the PEs to communicate with each other. Performance modelling and analysis have great theoretical and practical importance. This research is devoted to developing efficient and cost-effective analytical tools for the performance analysis and enhancement of NoCs with an m-port n-tree topology under bursty traffic. Therefore, a new analytical model is developed to investigate the performance of NoCs with the m-port n-tree topology under bursty traffic. Even though it is broadly proven in practice that the fat-tree topology and its variants result in lower latency, higher throughput and higher bandwidth, most studies on NoCs still adopt Mesh, Torus and Spidergon topologies. The analytical results and those obtained from extensive simulation experiments show a good degree of accuracy in predicting the network performance under different design alternatives and various traffic conditions.
With the emergence of cloud computing and the continuously decreasing cost of storage, it has become very easy to store unstructured data generated from social media posts, multimedia, etc. To store unstructured data, a new mechanism called NoSQL has evolved, which can store the data naturally and logically with loose restrictions on the database schema. This paper uses the NoSQL database system MongoDB for read-intensive use cases, describes the implementation scheme, and finally compares the average run time of different cases with increasing numbers of records and operations using YCSB.
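A minimal sketch of a read-intensive MongoDB workload in the spirit of the abstract; the collection name, document layout and the 95/5 read/update split echo a YCSB-style read-heavy mix and are assumptions, not the paper's exact setup.

```python
# Read-intensive MongoDB workload driven from pymongo (illustrative).
import random
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
col = client["ycsb"]["usertable"]

col.drop()                                                   # start from a clean collection
col.insert_many([{"_id": i, "field0": f"value{i}"} for i in range(1000)])

for _ in range(10000):
    key = random.randrange(1000)
    if random.random() < 0.95:                               # 95% reads
        col.find_one({"_id": key})
    else:                                                    # 5% updates
        col.update_one({"_id": key}, {"$set": {"field0": "updated"}})
```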
Stochastic Modeling for Analyzing Scalability Impact of Lottery Scheduling using Proportion Reformation
Manish Vyas, Dr. Saurabh Jain
For effective processor scheduling, algorithms are required not only for fair scheduling but also for efficient implementation of resource management with rapid adjustment of and control over relative execution rates. A proportional-share scheduler assures that each job obtains a certain percentage of processor time. Lottery scheduling is based on a randomized approach to proportional-share resource management in which resources are allocated to clients in proportion to their respective weights. In this paper the conventional lottery scheduling scheme is modelled and extended with some conditions to obtain new scheduling schemes. Stochastic modelling is applied for their study and analysis.
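A minimal sketch of classical lottery scheduling, which the paper takes as its starting point: each client holds tickets proportional to its weight, a winning ticket is drawn uniformly at random, and the holder is scheduled. The client names and ticket counts are illustrative.

```python
# Classical lottery scheduling draw (illustrative).
import random

clients = {"A": 50, "B": 30, "C": 20}        # ticket counts proportional to weights

def pick_winner(clients):
    total = sum(clients.values())
    winner = random.randrange(total)         # draw a ticket number uniformly
    running = 0
    for name, tickets in clients.items():
        running += tickets
        if winner < running:
            return name

counts = {c: 0 for c in clients}
for _ in range(10000):
    counts[pick_winner(clients)] += 1
print(counts)                                # roughly proportional to 50:30:20
```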
Technology and advancement are two sides of the same coin; with one, the other comes into play. We are living in an advanced world that is developing and progressing at a fast pace, and with this advancement we have invited problems for ourselves. The enhancement in lifestyle and living standards leads to population explosion, which consequently leads to an increase in the number of transportation vehicles. The increase in transportation vehicles is a sign of development, but on the other hand this rapid increase also leads to problems like pollution, traffic and exploitation of resources. These vehicles include cars, buses, ships, bikes, trains, etc., all of which derive their power from engines. With the transformation of technologies the engines have also been updated, but till now there is no engine which has 100% efficiency. According to thermodynamics, any system should give an output corresponding to the given input, but due to the lack of efficiency the output is "power with exhaust". The exhaust is the prominent problem which contributes to pollution. The engines used in cars are internal combustion (IC) engines. The main pollutants of IC engines are CO, NOx (oxides of nitrogen), partially burned hydrocarbons (HC), particulate matter, etc. These pollutants affect our bodies both directly and indirectly. Problems like the greenhouse effect and global warming are also in play because of the effects of exhaust pollution.
Implementation of Automatic Retina Exudates Segmentation Algorithm for Early Detection with Low Computational Time
B. Sumanjali, K. Nagaiah, B. Anitha
Diabetic retinopathy is a leading cause of vision loss worldwide. Early symptoms of this disease are exudates, so early diagnosis and treatment at the right time are very important to prevent blindness. For a long time, automatic diagnosis of diabetic retinopathy from digital fundus images has been an active research topic in the medical image processing community. In this work, two new methods for the detection of exudates are presented which do not use a supervised learning step; therefore, they do not require labelled lesion training sets, which are time-consuming to create, difficult to obtain and prone to human error. We introduce a new dataset of fundus images from various ethnic groups and levels of DME which we have made publicly available. Experimental results show that the proposed methods yield better results than state-of-the-art methods.
A Concise Study about Data Mining Methods, Tools and Its Trendy Applications
C. Thangamalar, R. Ramya, R. Lavanya
Data mining has been an emerging topic in computer science research in recent years, and it has extensive applications in various fields such as bioinformatics, criminal investigation, research analysis, corporate surveillance, manufacturing engineering, and the Web and Semantic Web. It uses machine learning, statistical and visualization techniques to discover and present knowledge in a form which is easily understandable by humans. In today's ferociously competitive business surroundings, corporations need to speedily turn these zettabytes of information into vital insights into their customers and markets to guide their marketing, investment and management strategies. We survey a number of data mining software tools, because current information contains mostly structured and unstructured data, spatial data, etc., which makes it almost impossible to manually analyze for valuable decision-making information. Automated discovery tools have the ability to investigate the information and present the extracted high-level information to the analyst or decision-maker. This research paper focuses on data mining techniques, software tools, and the most popular applications.
Ranking fraud in the mobile App industry refers to fraudulent or deceptive activities whose purpose is to bump Apps up in the popularity list. Indeed, it has become more and more frequent for App developers to resort to shady means, for instance inflating their Apps' sales or posting phony App ratings, to commit ranking fraud. While the importance of preventing ranking fraud has been widely recognized, there is limited understanding and research in this area. This paper provides a holistic view of ranking fraud and proposes a ranking fraud detection framework for mobile Apps. In particular, we propose to accurately locate ranking fraud by mining the active periods, namely leading sessions, of mobile Apps. Such leading sessions can be used for detecting the local anomaly instead of a global anomaly in App rankings. In addition, three types of evidence are investigated, i.e., ranking-based evidence, rating-based evidence and review-based evidence, by modelling Apps' ranking, rating and review behaviours through statistical hypothesis tests. Finally, this paper evaluates the effectiveness of the proposed system and shows the scalability of the detection algorithm as well as some regularity of ranking fraud activities.
An Enhanced Light-weight Proactive Source Routing Protocol using DIFT-BFHS for MANET
Dr. N. Rama, Justin Sophia. I
A mobile ad hoc network (MANET) is an infrastructure-less wireless communication network formed by a collection of mobile nodes; these nodes need not lie within the direct transmission range of each other but depend on intermediate nodes for data transfer. In multi-hop wireless networking, research related to opportunistic data forwarding has drawn much attention. The main reason opportunistic data forwarding has not been widely utilized in mobile ad hoc networks (MANETs) is the lack of an effective lightweight proactive routing scheme with strong source routing capability. In this research, the working of PSR is modified by depth-first iterative deepening combined with best-first heuristic search (DIFT-BFHS) to maintain the information of the entire network topology. A DIFT-BFHS spanning tree is constructed to maintain the network topology information. Instead of repeatedly updating this information, the update is made only when a modification occurs in the network topology through the nodes. This makes the neighbour discovery process simple and reduces the routing overhead; therefore, energy is saved as much as possible.
Offline Handwriting Character Recognition (for use of medical purpose) Using Neural Network
Sanjay Kumar, Narendra Sahu, Aakash Deep, Khushboo Gavel, Miss Rumi Ghosh
Handwriting recognition means transcribing written words from paper. The written words are captured from the paper by means of OCR (Optical Character Recognition), a technique for obtaining printed or written text from paper by using scanning hardware, such as a camera or a scanner, which captures the printed or written words and converts them to digital form. Offline handwriting recognition uses OCR techniques and is able to detect and analyze handwritten words. Here we recognize the handwriting of a doctor and try to find a pattern which is used to guess the most probable prescribed medicine. We can only guess the prescribed medicine's name and cannot assure the exact name, because it is difficult for non-medical personnel to read doctors' handwriting.
A Review on Prediction of User Action Interpretation for Online Content Optimization
Mr. Pratik V. Pande, Dr. M. A. Pund
Nowadays the use of the Internet is increasing tremendously, and it has become an important medium for delivering digital content to web users. Most vendors may take the help of e-commerce sites to sell their items, but the problem is how to improve the business, how to attract more web users' attention, and how to retain them on their portal sites on an ongoing basis. To attract more users it is necessary to build a recommender system that can help improve the business by understanding users' interests. A recommender system provides the best items to consumers by keeping track of their search patterns, and online merchants can get a better understanding of their behaviours and intentions. For that purpose, we propose deeper user action interpretation. We make use of business intelligence, which is the technology for gathering, storing, analyzing and providing access to data to help enterprise users make better business decisions. We use data mining to understand user interests, and according to that, items will be delivered to them. We employ a decision support system to generate mining reports that will help the merchant improve the performance of the system.
Fusion Layer Topological Space Query Indexing For Uncertain Data Mining
M. Kalavathi, Dr. P. Suresh
Data uncertainty is an intrinsic property in different applications such as sensor network monitoring, object recognition, location-based services (LBS) and moving object tracking. When data mining methods are applied to the above-mentioned applications, their uncertainty has to be handled to achieve accurate query results. Several probabilistic algorithms estimate the location and control for each object but are not effective in handling query processing in distributed environments. Probabilistic inverted indexing computes the lower and upper bounds for a threshold keyword but fails to extend the technique to tackle correlation. In order to overcome these issues in uncertain data mining, the Fusion Layer Topological Space Query Indexing (FLTS) technique is introduced. Initially, queries are articulated over any random subset of attributes in the uncertain data. The FLTS index technique answers top-k queries efficiently. The FLTS index correctly captures the dominance relationships and significantly reduces the number of tuples accessed during query processing by pruning redundant tuples based on two criteria, the layer point sort method and the record point sort method. First, the layer point sort method is used in the FLTS index to sort tuples based on the combination of all their attribute values. Subsequently, individual attribute values are used to rate the tuples using the record point sort method; therefore, the correlation is removed significantly. Through an analysis of the interaction of the two sorting methods, we derive a fixed bound that reduces the number of tuples retrieved during query processing while obtaining correct query results in distributed environments. Experimental results show that the FLTS indexing technique improves query retrieval efficiency, response time, memory consumption and scalability.
Blood group Detection and RBC, WBC Counting: An Image Processing Approach
Akshaya P. Sahastrabuddhe, Dr. Sayyad D. Ajij
Human blood is a health indicator; it delivers necessary substances such as oxygen and nutrients to the body. Hence, segmentation of blood cells and identification of the blood type are very important. Human blood consists of RBCs, WBCs, platelets and plasma. Presently, lab technicians test blood groups manually, and they use a device called a haemocytometer together with a microscope to count blood cells. This method is extremely time-consuming and monotonous and leads to inaccurate results due to human error. To overcome the problems regarding time, accuracy and cost, a method is proposed based on the processing of images acquired in the laboratory. Image processing techniques such as segmentation, morphological operations and the Circular Hough Transform are used. The accuracy of the system is high, with very low execution time.
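A minimal sketch of the counting step named in the abstract: after pre-processing, the Circular Hough Transform locates round cells, whose detections are then counted. The image path, radius limits and parameters are illustrative and would need tuning to the microscope magnification; this is not the authors' pipeline.

```python
# Counting roughly circular cells with the Circular Hough Transform (illustrative).
import cv2

img = cv2.imread("blood_smear.png", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)                      # suppress noise before circle detection

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
                           param1=60, param2=30, minRadius=8, maxRadius=20)
count = 0 if circles is None else circles.shape[1]
print("detected cells:", count)
```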
Black Hole Attack Detection Using HLA with Optimized Link State Routing Protocol in WANET
J. Jeno Mactaline Pears, Dr. D. C. Joy Winnie Wise
A wireless ad-hoc network, also known as an autonomous Basic Service Set, is a computer network in which the communication links are wireless. In an ad-hoc network each node can forward data for other nodes, so the determination of which nodes forward data is made dynamically based on the network connectivity. There are two sources of packet loss in a multi-hop wireless ad hoc network: link failure and the dropping of packets by adversarial activity. While observing continuous packet losses in the network, the task is to determine whether the packet drops are caused by link errors or by the combined effect of link errors and malicious dropping. To improve the detection accuracy against an adversary, an enhanced Optimized Link State Routing Protocol (OLSR) is used to correctly broadcast the routing protocol control traffic on behalf of other nodes. A Homomorphic Linear Authenticator (HLA) based public auditing scheme allows the detector to verify the truthfulness of the packet-loss information reported by nodes. This algorithm is privacy-preserving and finds the optimal path during transmission.
Emotions are an important part of human life. Their effect is direct when people's gestures are visible to the audience, but it is unclear when only text is used rather than text in combination with gestures and facial expressions. With the widespread use of the Internet and mobile devices, text has become the main medium of communication on several social networking sites. One way of making such text-based interaction more interactive is to build a system that can recognize the sentiment behind the text. Nowadays several research efforts aim to make computer and human interaction effective enough to detect the sentimental state of a person. This paper focuses on studying different work done in the area of sentiment analysis through text mining and provides a review that covers the best methods and techniques used, based on previously reported experimental results.
The term BIG DATA has gained importance in various industries over the last couple of years because industries generate huge amounts of data per day. Big Data techniques are applied to very large data sets because such data cannot be stored and processed in traditional databases; traditional databases do not have the capacity to process such large data sets in a reasonable time. Big Data platforms have the potential to store and process these huge datasets in several ways, analysing large datasets in the required time. One of the main reasons for using R is that it is freely available and comes with a lot of free packages and powerful tools through which large datasets can be analysed efficiently. Text analysis is still somewhat in its infancy, but is very promising, because in most companies 80% of the data is in unstructured form, while most types of analysis only work with structured data. In this paper, we use R packages to analyze unstructured text.
Efficient Cloud Server Job Scheduling Using NN and ABC in Cloud Computing
Amandeep Kaur, Pooja Nagpal
High workload not only translates to high operational cost, which reduces the marginal profit of cloud providers, but also leads to low system throughput. Hence, efficient job execution solutions are required to minimize the impact of cloud computing on the environment. To generate such solutions, a deep analysis of the cloud with regard to its job efficiency is required. Otherwise, cloud computing, with progressively more front-end client devices interacting with back-end data centers, will cause an enormous load of jobs. To address this problem, data centre servers need to be managed efficiently. In the proposed work, a hybrid algorithm based on a neural network combined with artificial bee colony optimization and scheduling concepts is introduced. On this basis, task execution is enhanced and load is reduced. The results are evaluated on the basis of total jobs completed in the MATLAB 2010a environment.
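A much-simplified artificial bee colony sketch for the job-to-server assignment problem is shown below; it merges the employed and onlooker phases and omits the neural-network component of the proposed hybrid, so it only illustrates the general search idea, with hypothetical job lengths and parameters.

```python
# Simplified ABC-style search for a job-to-server assignment that minimizes makespan.
import random

def makespan(assign, job_len, n_servers):
    """Largest total load over all servers for a given job-to-server assignment."""
    load = [0.0] * n_servers
    for job, server in enumerate(assign):
        load[server] += job_len[job]
    return max(load)

def abc_schedule(job_len, n_servers, colony=10, cycles=200, limit=20):
    n = len(job_len)
    foods = [[random.randrange(n_servers) for _ in range(n)] for _ in range(colony)]
    stale = [0] * colony

    def neighbour(sol):
        new = sol[:]                          # move one randomly chosen job
        new[random.randrange(n)] = random.randrange(n_servers)
        return new

    for _ in range(cycles):
        for i in range(colony):               # employed/onlooker phases, merged for brevity
            cand = neighbour(foods[i])
            if makespan(cand, job_len, n_servers) < makespan(foods[i], job_len, n_servers):
                foods[i], stale[i] = cand, 0
            else:
                stale[i] += 1
            if stale[i] > limit:              # scout phase: abandon a stale food source
                foods[i] = [random.randrange(n_servers) for _ in range(n)]
                stale[i] = 0
    return min(foods, key=lambda s: makespan(s, job_len, n_servers))

jobs = [4, 7, 2, 9, 5, 3]                     # hypothetical job lengths
best = abc_schedule(jobs, n_servers=3)
print(best, makespan(best, jobs, 3))
```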
Preventing Social Sites from Publishing Malicious Content
Deepak Ranjan, Dr. Tripti Arjariya
The World Wide Web has become an inseparable part of the lives of millions of people who use online services, e.g. online banking, online shopping, social networking and e-commerce, and who store and manage sensitive information online. In fact, it is a popular tool for any class of user on the Internet, and rich Web-based applications are available over the World Wide Web to provide such services. At the same time, the Web has become an important means for people to interact with each other and do business. This is the positive side of the technology, but the Web can also become a very dangerous place. Its popularity has attracted intruders and attackers who aim to harm network services; these intruders abuse users and the Internet by performing illegitimate activity for financial profit. Web pages that contain such malicious code are called malicious Web pages.
Fruit Recognition and its Calorie Measurement: An Image Processing Approach
Manpreetkour Basantsingh Sardar, Dr. Sayyad D. Ajij
Fruits contribute an essential part of our diet because they are a major source of energy, vitamins, fiber, plant chemicals and nutrients. Fruits are naturally low in fat, sodium and calories, and rich in potassium, fiber and vitamin C. A diet high in fruit can help protect against cancer, diabetes, heart diseases etc. A system that quickly estimates how many calories are present in a fruit intake can be very useful for maintaining health without expert dietitian advice. The use of image processing techniques is increasing day by day in all fields, including agriculture and food science. Shape, color and texture are the image features which help in classification and calorie estimation of fruits. This paper proposes an algorithm for fruit recognition and calorie measurement based on shape, color and texture, using histogram of oriented gradients and GLCM with local binary pattern algorithms as the texture segmentation scheme for recognizing the fruits; area, major axis and minor axis are calculated from the shape features to obtain a more accurate calorie value. These features are fed to a multi-class SVM classifier for accurate classification, and a nutritional look-up table is used to obtain the calorie values. Evaluation is performed in MATLAB using two databases, namely a real-time database and a fake plastic-fruit database. The results obtained are very close to the real calories of the fruits.
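The following sketch illustrates the kind of feature pipeline the abstract describes, combining GLCM, LBP and HOG descriptors and feeding them to an SVM; the image paths, labels, calorie table and parameter values are placeholders rather than the authors' settings (the paper itself works in MATLAB).

```python
# Illustrative GLCM + LBP + HOG feature extraction with an SVM classifier (scikit-image >= 0.19).
import numpy as np
from skimage.io import imread
from skimage.color import rgb2gray
from skimage.transform import resize
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern, hog
from sklearn.svm import SVC

def features(path):
    gray = resize(rgb2gray(imread(path)), (128, 128))       # normalise image size
    gray8 = (gray * 255).astype(np.uint8)
    glcm = graycomatrix(gray8, distances=[1], angles=[0], levels=256, symmetric=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
    lbp = local_binary_pattern(gray8, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    hog_vec = hog(gray, pixels_per_cell=(32, 32), cells_per_block=(2, 2))
    return np.concatenate([[contrast, homogeneity], lbp_hist, hog_vec])

# Hypothetical training images and labels.
train = [("apple_1.jpg", "apple"), ("banana_1.jpg", "banana"), ("apple_2.jpg", "apple")]
X = np.array([features(p) for p, _ in train])
y = [label for _, label in train]
clf = SVC(kernel="rbf").fit(X, y)

calorie_table = {"apple": 52, "banana": 89}        # kcal per 100 g, toy look-up table
fruit = clf.predict([features("test_fruit.jpg")])[0]
print(fruit, calorie_table[fruit], "kcal/100g")
```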
Software testing is the process of detecting errors while executing a program so that we get zero-defect software. The aim of this paper is to evaluate and establish a comprehensive view of the field of software testing. The main objective is to bring out the relevant issues of mobile application testing and the tools that address them. There are many tools available in the market at the moment, each having its own features for testing software. For this paper, we have discussed a set of tools which are constructed to tackle the relevant problems. Software testing is a crucial area of research and a lot of development has been made. We do not mean to give a complete overview of the mobile application testing field, its challenges and methodologies; rather, we intend to give an overview of the tools which are meant to overcome the challenges of software testing.
Various Audio Steganography Techniques for Audio Signals
Rubby Garg, Dr. Vijay Laxmi
The rapid development of multimedia and the Internet allows wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information, and digital documents are also easy to copy and distribute, so they face many threats. This is a major security and privacy issue, and it becomes necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Steganography and cryptography are considered techniques used to protect important information, but both techniques have their pros and cons. In this paper we propose a novel approach to hide data in audio signals based on the LSB technique. Performance of the proposed system is evaluated on various parameters and is compared with existing systems.
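A minimal LSB embedding and extraction sketch for 16-bit PCM audio is given below to illustrate the basic technique the paper builds on; the proposed scheme's full design and parameters are not reproduced, and the file names are placeholders.

```python
# Basic LSB audio steganography on 16-bit PCM WAV samples (illustrative sketch).
import wave
import numpy as np

def embed(cover_wav, stego_wav, message: bytes):
    with wave.open(cover_wav, "rb") as w:
        params = w.getparams()
        assert params.sampwidth == 2, "sketch assumes 16-bit PCM"
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16).copy()
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    if bits.size > samples.size:
        raise ValueError("message too long for this cover file")
    samples[:bits.size] = (samples[:bits.size] & ~1) | bits   # overwrite least significant bits
    with wave.open(stego_wav, "wb") as w:
        w.setparams(params)
        w.writeframes(samples.tobytes())

def extract(stego_wav, n_bytes):
    with wave.open(stego_wav, "rb") as w:
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    bits = (samples[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

embed("cover.wav", "stego.wav", b"secret")
print(extract("stego.wav", 6))
```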
Impact of Security Risk on Cloud Computing Adoption
Shashank Mishra, Manju Pandey
Cloud computing provides applications and services over the Web. The services are provided all over the world by data centers, which are collectively referred to as the "cloud." This mechanism provides a solution for the many network connections and computer systems involved in online services, and it shows the broad reach of the Internet while simplifying its complexity. Any user with an Internet connection can use the cloud and the services provided by it; when these services are connected, users can share information with each other and with the Web. But before entering the cloud we have to know that the rapid growth of cloud computing also raises severe security concerns. Security is a big issue for open systems and cloud computing, with concerns such as data privacy, security auditing and data monitoring. Existing models are still far from covering the full complexity of the cloud computing model. Some papers have previously proposed security mechanisms for cloud computing, but the implementations they propose are not enough for the full complexity of the cloud. In this paper we implement a new mathematical algorithm for lowering the level of risk in a cloud computing environment. The purpose of this attempt is to focus on (and provide a mechanism for) some points where the security level is found to be very low.
A New Hybrid Prediction Approach for Enhancing Prediction Accuracy of Complex Data
Akash Sharma, Nikita Jain
Over the past few decades a lot of approaches have been introduced for data prediction, and many researchers continuously introduce new ideas for enhancing the power of prediction systems in different areas, but each approach has its own limitations. One of the main limitations of existing prediction approaches is that they are designed around the strength of a single classification technique and are not able to handle different types of data efficiently; thus the cost of the system is high and the field still lacks a high rate of prediction accuracy. This dilemma of existing prediction systems is considered in this paper, which proposes an effective and efficient data prediction system that enhances prediction accuracy in a significant way.
Critical Evaluation of Prince2 and Agile Project Management Methodologies for a Complex Project
Priyanka
Most major IT projects fail due to their complexity, and project complexity is determined by the intensity of business requirements and their fluctuating nature. Prince2 is a solution for a sophisticated project because of its vast range of stage-driven processes, which focus on project benefits in each stage. The main objective of Prince2 is breaking complexity into small parts/stages which can be achieved easily. The seven processes of Prince2 cover the entire project lifecycle, including requirement gathering, the as-is process, development, and product delivery. The seven themes cover major concerns of any project, including the business case, risk management and quality of deliverables. The seven principles are the seven pillars of Prince2 which sustain its standard. It is understood that the Agile project management methodology is suitable for medium and small scale projects. The five process groups of Agile project management intensely focus on development activities and make sure the required product is delivered without any error. It is recommended that Prince2 is certainly suitable for complex projects as a dominant project management methodology; however, the Scrum technique can be used for in-depth IT development as a secondary project management methodology within Prince2.
Users normally tend to reuse the same personalized identification number (PIN) for multiple applications. Direct PIN entries are highly susceptible to shoulder-surfing attacks, as attackers can effectively capture a user's PIN entry with the help of concealed cameras. Indirect PIN entry methods proposed as countermeasures are rarely deployed because they demand a heavier cognitive workload from users. To achieve both strong security and usability, a practical indirect PIN entry method called SteganoPIN is proposed. The human-machine interface of SteganoPIN comprises two numerical keypads, one shielded or hidden and the other exposed, designed specifically to physically thwart shoulder-surfing attacks. After locating the long-term PIN on the covered, permuted keypad, a user generates a one-time PIN that can safely be entered in plain view of attackers. This enables the user to establish a secure transaction by means of a mobile app to the server, implementing the SteganoPIN method with a multi-touch concept based on an independent variable PIN entry system (standard PIN, SteganoPIN).
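The permuted-keypad idea can be sketched as follows; the exact interface and app protocol of SteganoPIN are not reproduced, only the mapping from a long-term PIN to a one-time code via a session-specific permutation.

```python
# Illustrative permuted-keypad mapping: the hidden pad shows a shuffled layout,
# and the user presses, on the exposed standard pad, the position where each PIN
# digit appears on the hidden pad, producing a one-time code.
import secrets

STANDARD_LAYOUT = "0123456789"          # exposed pad: position i shows digit i

def new_session():
    digits = list(STANDARD_LAYOUT)
    secrets.SystemRandom().shuffle(digits)
    return "".join(digits)               # permuted layout shown on the hidden pad

def one_time_code(pin: str, permuted: str) -> str:
    # Locate each PIN digit on the hidden pad, then press the key at the same
    # position on the exposed standard pad.
    return "".join(STANDARD_LAYOUT[permuted.index(d)] for d in pin)

def verify(entered: str, pin: str, permuted: str) -> bool:
    return entered == one_time_code(pin, permuted)

permuted = new_session()
otp = one_time_code("2580", permuted)    # what the user types in plain view
print(permuted, otp, verify(otp, "2580", permuted))
```

Because the permutation changes every session, an observer who records the typed one-time code learns nothing useful about the underlying long-term PIN.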
Multilevel and Biometric-Graphical Secure Authentication System Using Pattern Matching and Gene Based Data Extraction
Soriful Hoque
In computer science, considering large-scale networks and the increasing number of applications, the protection and security of systems is essential. Information technology systems are improving day by day, and this growth creates identity-theft problems, so protection and computer system security play a vital role. The conventional alphanumeric password protection system is very vulnerable: such passwords can be traced and are difficult for users to remember. This leads to a new era of protection and security systems for authentication in computer science. In this paper, a very user-friendly, secure, multilevel biometric-graphical authentication system is proposed. This kind of authentication system is easy for an authorized user but very difficult for an unauthorized user to trace. The multilevel and biometric-graphical secure authentication algorithm described here uses more than one level of authentication, such as face detection, DNA fingerprinting, image pattern matching and an alphanumeric password with an image pattern. The algorithm is based on entities that are actually bound to the individual at a much more secure level. As a result, it is more reliable, since biometric information cannot be lost, forgotten, or guessed easily.
Trusted and Energy Efficient Routing Protocol for Heterogeneous Wireless Sensor Networks
R. Sai Sandhya, R. Padmini
In this paper, we propose STEER, a secure, trust-based and energy-efficient routing protocol for heterogeneous wireless sensor networks. STEER combines incentive and reputation based mechanisms to improve trustworthiness in heterogeneous wireless sensor networks. The incentive system credits nodes for relaying packets. The trust system evaluates a node's ability and reliability in transmitting packets in terms of multi-dimensional trust values. We implement two routing protocols to direct traffic through highly trusted nodes having sufficient energy, in order to minimize the probability of breaking the route. Although there are some existing studies of incentive schemes, they are not proficient in either energy efficiency or trust value; we propose a secure protocol that guarantees the shortest route path while maximizing the network lifetime and the trust value. In this way, STEER not only encourages nodes to relay packets, but also maintains route stability and reports correct energy capability, because any loss of trust will result in loss of future profit. In addition, for the efficient use of the trust framework, the trust values are computed by processing the payment receipts. Analytical results demonstrate that STEER secures the payment and trust calculation without false accusations. Simulation results show that our routing protocols can improve the packet delivery ratio and route reliability.
Providing Privacy via Transaction Splitting Using PFP Growth Algorithm
M. Jhansi Rani, K. Venkatagurunath Naidu
Frequent itemset mining is one of the most important problems in data mining. The challenge is to design a differentially private frequent itemset mining algorithm which can not only achieve high data utility and a high level of privacy, but also offer high time efficiency. To this end, we offer a differentially private frequent itemset mining algorithm based on the Frequent Pattern-growth algorithm, referred to as Private Frequent Pattern-growth (PFP-growth). The PFP-growth algorithm consists of a preprocessing phase and a mining phase. To improve the utility-privacy tradeoff, an innovative smart splitting method is proposed to transform the database in the preprocessing phase; it needs to be performed only once for a given database. To compensate for the information loss caused by transaction splitting, a run-time estimation method is utilized in the mining phase to estimate the actual support of itemsets in the original database. In addition, a dynamic reduction method is developed to dynamically reduce the amount of noise added to guarantee privacy during the mining process, by leveraging the downward closure property. Through formal privacy analysis, the PFP-growth algorithm is shown to be ε-differentially private. Extensive experiments on real datasets show that our PFP-growth algorithm considerably outperforms the state-of-the-art techniques.
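As a small illustration of the differential-privacy building block involved, the sketch below perturbs itemset support counts with Laplace noise calibrated to a privacy budget ε; the transaction-splitting, run-time estimation and dynamic noise-reduction steps of PFP-growth itself are not reproduced, and the supports and threshold are made up.

```python
# Laplace mechanism applied to itemset support counts (illustrative sketch only).
import numpy as np

def noisy_support(true_support, epsilon, sensitivity=1.0):
    """Return a differentially private support count via the Laplace mechanism."""
    return true_support + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical supports of candidate itemsets in a transaction database.
supports = {("bread", "milk"): 420, ("bread", "butter"): 310}
epsilon = 0.5
private = {k: noisy_support(v, epsilon) for k, v in supports.items()}

threshold = 350
frequent = [k for k, v in private.items() if v >= threshold]
print(private, frequent)
```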
Time Sequence Scheme for Ideal Transmission in Remote Systems
K. Surekha, N. Padmaja
Broadcasting is a fundamental operation in wireless ad hoc networks, particularly in reconfigurable ones; for example, some form of broadcasting is used by all on-demand mobile network routing protocols, in case of any uncertainty about the location of the destination node, or for service discovery. In this work, we present a new approach to efficient broadcast in networks with dynamic topologies and propose a new online local broadcasting algorithm, called the time sequence scheme (TSS), for such networking environments. TSS ranks candidate broadcasting nodes by priority, in a sequence, so that the total number of re-broadcasts in the network is reduced. We evaluate TSS, showing that its performance comes remarkably close to the corresponding theoretical performance bounds, even in the presence of packet loss due, for example, to MAC-layer collisions. In addition, we compare our algorithm with recently developed schemes, considering performance in various realistic network mobility scenarios. We demonstrate that TSS performance is robust in the presence of mobility-induced topology reconfigurations, including temporary network partitions, during propagation of the broadcast message.
Performance of Multi Server Validation and Key Association with User Protection in Network Security
R. Padmini
With the use of smart cards, remote user authentication and key agreement can be made simple, flexible, and efficient for creating a secure distributed computing environment. In addition to user authentication and key distribution, it is very useful to provide identity privacy for users. In this paper, we propose novel multi-server authentication and key agreement schemes with user protection in network security. We first propose a single-server scheme and then apply this scheme to a multi-server environment. The main merits include: (1) the privacy of users can be ensured; (2) a user can freely choose his own password; (3) the computation and communication cost is very low; (4) servers and users can authenticate each other; (5) it generates a session key agreed upon by the server and the user; (6) our proposed schemes are nonce-based schemes which do not have a serious time synchronization problem.
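A generic nonce-based challenge-response exchange, shown below, illustrates why nonce-based schemes avoid clock synchronization: freshness comes from random challenges rather than timestamps. This is not the paper's exact protocol; the shared key, identities and session-key derivation are assumptions made for the sketch.

```python
# Generic nonce-based mutual authentication with HMAC (illustrative sketch).
import hmac
import hashlib
import secrets

SHARED_KEY = secrets.token_bytes(32)        # assumed to be pre-shared between user and server

def respond(key: bytes, nonce: bytes, identity: bytes) -> bytes:
    """Prove knowledge of the key over a fresh challenge without revealing it."""
    return hmac.new(key, nonce + identity, hashlib.sha256).digest()

# Server challenges the user with a fresh nonce; the user answers with an HMAC proof.
server_nonce = secrets.token_bytes(16)
user_proof = respond(SHARED_KEY, server_nonce, b"alice")
assert hmac.compare_digest(user_proof, respond(SHARED_KEY, server_nonce, b"alice"))

# Symmetrically, the user challenges the server so both sides are authenticated.
user_nonce = secrets.token_bytes(16)
server_proof = respond(SHARED_KEY, user_nonce, b"server")
assert hmac.compare_digest(server_proof, respond(SHARED_KEY, user_nonce, b"server"))

# Both sides can then derive a session key from the two nonces.
session_key = hmac.new(SHARED_KEY, server_nonce + user_nonce, hashlib.sha256).digest()
print(session_key.hex())
```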
Texture Classification with Motif Shape Patterns on Local Directional Pattern
Dr. P Chandra Sekhar Reddy
The present paper evaluates motif shape patterns on the Local Directional Pattern for texture classification. In this method the Local Directional Pattern is computed on the image and then motif shape parameters are evaluated. These shape patterns are used as features for classification. The present method is extensively tested on a dataset consisting of various brick, granite, marble and mosaic stone textures collected from the Vistex and Mayang databases and also from natural images captured with a digital camera. The experimental results demonstrate that it is simple and much more effective than other methods for texture classification.
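For reference, a compact sketch of the underlying Local Directional Pattern code is given below: each pixel is convolved with the eight Kirsch masks and a bit is set for the k strongest directional responses (k = 3 here). The motif-shape step applied on top of LDP, and the paper's datasets, are not reproduced; the input here is a random stand-in patch.

```python
# Local Directional Pattern codes from the eight Kirsch directional masks (sketch).
import numpy as np
from scipy.ndimage import convolve

KIRSCH = [np.array(m, dtype=float) for m in (
    [[-3, -3, 5], [-3, 0, 5], [-3, -3, 5]],    # east
    [[-3, 5, 5], [-3, 0, 5], [-3, -3, -3]],    # north-east
    [[5, 5, 5], [-3, 0, -3], [-3, -3, -3]],    # north
    [[5, 5, -3], [5, 0, -3], [-3, -3, -3]],    # north-west
    [[5, -3, -3], [5, 0, -3], [5, -3, -3]],    # west
    [[-3, -3, -3], [5, 0, -3], [5, 5, -3]],    # south-west
    [[-3, -3, -3], [-3, 0, -3], [5, 5, 5]],    # south
    [[-3, -3, -3], [-3, 0, 5], [-3, 5, 5]],    # south-east
)]

def ldp_code(gray, k=3):
    responses = np.stack([np.abs(convolve(gray.astype(float), m)) for m in KIRSCH])
    order = np.argsort(responses, axis=0)          # directions ranked per pixel, ascending
    code = np.zeros(gray.shape, dtype=np.uint8)
    for r in range(1, k + 1):                      # set a bit for each of the k strongest
        code |= (1 << order[-r]).astype(np.uint8)
    return code

gray = (np.random.rand(64, 64) * 255).astype(np.uint8)   # stand-in texture patch
codes = ldp_code(gray)
hist, _ = np.histogram(codes, bins=256, range=(0, 256))  # histogram used as a texture feature
print(hist[:10])
```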