Impacts of Television Serials on Tamil Rural Women: A Data Mining Case Study of Theni District
K. Sujitha, R. Shantha Mary Joshitta, B. Nivetha,
Television, a medium invented for entertainment, has become a burden in today's life. Instead of improving people's knowledge, the serial dramas broadcast on Indian television channels have become a nuisance. The rural people of Tamil Nadu are greatly affected by these television dramas. The researchers conducted a case study in Theni district of Tamil Nadu to analyze the impact of television serials on rural adults and women using data mining clustering techniques. Data were collected from the public through a questionnaire and processed using R, a statistical tool widely used for data mining analysis. Classification based on gender, age, and educational level was performed to analyze the impact of television serials. It was found that rural women, irrespective of age and educational qualification, are greatly affected by these serial dramas and waste their precious time on them. This research raises an alarm that these serial dramas have become a threat to the socio-cultural values of Tamil Nadu.
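The study performed its clustering in R; as an illustration of the kind of grouping involved, the sketch below runs a minimal 1-D k-means on hypothetical daily serial-viewing hours (the data values and initial centers are invented for the example, not taken from the survey).

```python
# Minimal 1-D k-means sketch (the paper used R; this is an
# illustrative Python equivalent on hypothetical viewing hours).
def kmeans_1d(values, centers, iters=20):
    """Cluster 1-D values around the given initial centers."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            # assign each value to its nearest center
            i = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[i].append(v)
        # recompute each center as the mean of its group
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# hypothetical hours/day spent on serials by surveyed respondents
hours = [0.5, 1.0, 0.8, 3.5, 4.0, 3.8, 0.6, 4.2]
centers, groups = kmeans_1d(hours, centers=[0.0, 5.0])
print(centers)   # two cluster centers: light vs heavy viewers
```

A survey analysis would run the same loop per demographic slice (gender, age, education) to compare the resulting cluster centers.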
A Novel BER Analysis Approach For Different Modulation Schemes For Conventional And Wavelet Based OFDM System
Rakesh Pathak (PG Scholar), K. Narender M.Tech, Dr. B. R. Vikram Ph.D,
Orthogonal Frequency Division Multiplexing (OFDM) is an advanced communication model with a wide range of applications, such as 3G, 4G, and Wi-Fi. Compared with Frequency Division Multiplexing (FDM), OFDM's multiple carriers provide a higher level of spectral efficiency. To overcome inter-carrier interference (ICI) and inter-symbol interference (ISI), a cyclic prefix is used, which consumes about 20% of the available bandwidth to preserve orthogonality between the sub-carriers. Wavelet-based OFDM provides better orthogonality and improves the bit error rate (BER); because the wavelet-based system does not require a cyclic prefix, its spectral efficiency increases. For the upcoming 4th-generation LTE, the use of wavelets in place of Discrete Fourier Transform (DFT) based OFDM has been proposed. We compare the BER performance of wavelet-based and DFT-based OFDM systems.
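The core of any such study is estimating BER by Monte-Carlo simulation and checking it against theory. The sketch below does this for plain BPSK over an AWGN channel (a generic illustration of BER analysis, not the paper's wavelet-OFDM simulator; the Eb/N0 point and sample count are arbitrary choices).

```python
import math, random

# Monte-Carlo BER for BPSK over AWGN, compared against the
# theoretical value 0.5 * erfc(sqrt(Eb/N0)).
def bpsk_ber(ebno_db, nbits=100_000, seed=1):
    rng = random.Random(seed)
    ebno = 10 ** (ebno_db / 10)
    sigma = math.sqrt(1 / (2 * ebno))      # noise std dev for Eb = 1
    errors = 0
    for _ in range(nbits):
        bit = rng.randint(0, 1)
        tx = 1.0 if bit else -1.0           # BPSK mapping
        rx = tx + rng.gauss(0, sigma)       # AWGN channel
        if (rx > 0) != bool(bit):           # hard decision
            errors += 1
    return errors / nbits

def bpsk_ber_theory(ebno_db):
    ebno = 10 ** (ebno_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebno))

print(bpsk_ber(4), bpsk_ber_theory(4))  # simulated vs theoretical
```

A BER-vs-Eb/N0 curve is produced by sweeping `ebno_db` and plotting both values; comparing modulation schemes or OFDM variants swaps out the mapping and channel stages.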
Increasing Coverage in Wireless Mesh Networks: Mobile mesh networks
Mr. Basavantraya, Mr. Kiranbabu T S,
Mobile ad hoc networks (MANETs) are a natural fit for situations where a fixed infrastructure is unavailable or infeasible. Today's MANETs, however, may suffer from network partitioning. This limitation makes MANETs unsuitable for applications such as crisis or catastrophe management and extreme battlefield communications, in which teams must operate in groups scattered across the application terrain. In such applications, intercommunication within a group is vital to the team effort. To address this flaw, this paper introduces a new class of ad hoc network called the mobile mesh network. In contrast to typical mobile mesh networks, the nodes of a mobile mesh network can follow the mesh clients in the application terrain and organize themselves into a network topology that ensures good connectivity for both intra- and inter-group communication. We present a distributed client tracking solution to deal with the dynamic nature of client mobility, and describe mechanisms for dynamic topology adaptation in accordance with the clients' mobility patterns. Our simulation results indicate that the mobile mesh network is robust against network partitioning and covers more clients than a standard wireless network by tracking clients that would otherwise disappear.
Cloud computing and virtualization are two booming technologies in this era. This paper is an attempt to correlate cloud computing with virtualization: it presents the cloud as 'automated virtualization', and its main concern is how virtualization is a necessity for implementing the cloud. The cloud arises from the business need in IT organizations to manage cost and make required infrastructure available faster; it does so by implementing defined processes and services to meet growing business demand and reduce provisioning time. Cloud computing is inclusive of virtualization and is one way to implement it. While it is not uncommon to discuss the two interchangeably, they are different approaches to the problem of maximizing the use of available resources. They differ in many ways, which leads to important considerations when selecting both, or choosing between the two. While studying these technologies separately, we found that they cannot really be separated from each other: when a cloud is implemented, virtualization is the technology that works as its right hand and is difficult to avoid. So it would not be wrong to say that these two technologies complement each other.
To Automate the Services Life Cycle of Cloud Using Semantic Technologies
N.Sunanda, M.Naresh Babu,
Cloud computing is an emerging paradigm in which the resources of the computing infrastructure are provided as services over the Internet. As promising as it is, this paradigm also brings many new challenges for data security and access control when users outsource sensitive data for sharing on cloud servers that are not in the same trusted domain as the data owners. To keep sensitive user data confidential against untrusted servers, existing solutions usually apply cryptographic methods, disclosing data decryption keys only to authorized users. In the proposed prototype, the service attributes include service cost, backup rules, storage size, and service availability, together with specification details such as data quality and the security levels accepted for the service software. To show how this life cycle changes the usage of cloud services, we describe a cloud storage prototype that we have developed. We divide the IT service life cycle into five phases: requirements, discovery, negotiation, composition, and consumption.
Web Based Spatial Query System for Decision Making
Alfisha Khan, Vinod M Bothale , K.Subba Reddy,
GIS is becoming essential for understanding what is happening, and what will happen, in geographic space. A web-based geographic information system (GIS) lets us visualize, question, analyze, and interpret data online to understand relationships, patterns, and trends. Most web-based geo-portals primarily provide display of geo-data and satellite images, but online analysis of geographical data to solve real-world problems is very limited. The proposed work envisages the development of a prototype for carrying out multilayer GIS queries online using geo-processing techniques and WPS, with the aim of supporting online decision making. Through this model, users can query information about a specific geographical area of interest from data spanning multiple layers. Query results are retrieved using geo-processing techniques and Open Geospatial Consortium (OGC) compliant services such as the Web Processing Service (WPS) and Web Map Service (WMS). Spatial queries can also be performed on the output of earlier spatial queries, forming a chain of web processing services.
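A WPS chain is ultimately built from spatial primitives; one of the most common is the point-in-polygon test that answers "is this location inside my area of interest?". The ray-casting sketch below illustrates that primitive (real deployments would delegate it to an OGC WPS; the polygon here is an invented stand-in for a district boundary layer).

```python
# Ray-casting point-in-polygon test: cast a ray to the right of
# (x, y) and count how many polygon edges it crosses; an odd
# count means the point is inside.
def point_in_polygon(x, y, polygon):
    """polygon: list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):           # edge spans the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

district = [(0, 0), (4, 0), (4, 3), (0, 3)]      # hypothetical layer
print(point_in_polygon(2, 1, district))          # inside
print(point_in_polygon(5, 1, district))          # outside
```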
Event Detection in Twitter Using Mentioning Practice of the User
Chaitra M,
Twitter is one of the most popular Online Social Networks (OSNs) today and a valuable source of timely information. Detecting events on Twitter is a difficult task because the information is overwhelmed by a huge volume of uninteresting tweets. Prior work may not be appropriate because it focuses only on textual content, while Twitter users also insert non-textual content of interest. Event Detection in Twitter using the mentioning practice of the user (EDT) is a novel method that analyzes the creation frequency of dynamic links (i.e., mentions) that users insert in tweets to detect important events and to find the impactful words of those events. The experiments we conducted show that the proposed method can efficiently detect accurate (95%) and meaningful events.
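The core signal is the creation frequency of mentions over time. The sketch below counts @-mentions per time window and flags windows whose mention volume jumps well above the average; the tweets and the 2x threshold are invented for illustration and are not EDT's actual scoring rule.

```python
# Count @mentions per time window, then flag windows whose
# mention volume exceeds a multiple of the overall average.
def mention_counts(windows):
    """windows: list of tweet lists; returns mentions per window."""
    counts = []
    for tweets in windows:
        c = 0
        for t in tweets:
            c += sum(1 for w in t.split() if w.startswith("@"))
        counts.append(c)
    return counts

def detect_events(counts, factor=2.0):
    avg = sum(counts) / len(counts)
    return [i for i, c in enumerate(counts) if c > factor * avg]

windows = [["nice day", "@a hello"],
           ["@a @b breaking news", "@c @d @e confirmed", "@f wow"],
           ["quiet again"]]
counts = mention_counts(windows)
print(counts, detect_events(counts))
```

The flagged window's tweets would then be mined for their most frequent terms to recover the "impactful words" of the event.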
Predicting Patterns over Protein Sequences Using Apriori Algorithm
Savitri Dhumale,
Data mining is the domain that helps extract useful information from large stores of data. It has wide scope in the biological sciences, as it can address critical problems in sequence pattern mining over large data sets and helps in the classification of biological sequences. The building blocks of proteins are amino acids, and these amino acids play an important role: some are good for health, while some are harmful when they occur in association with other amino acids. The proposed system focuses on generating frequent amino acid sets and association rules. We considered five diseases in our project. If the association rule corresponding to a particular disease occurs in the list of association rules obtained as output, we can conclude that the corresponding disease exists. The measuring parameters are support count and confidence.
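The frequent-itemset and rule-confidence machinery the abstract describes is classical Apriori. Below is a compact sketch over amino-acid "transactions"; the sequences and the G => A rule are hypothetical examples of the kind of pattern mined, not real biological data.

```python
from itertools import combinations

# Apriori: grow frequent itemsets level by level, pruning any
# candidate whose support falls below min_support.
def frequent_itemsets(transactions, min_support):
    items = sorted({a for t in transactions for a in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t)
                  for s in k_sets}
        survivors = {s: c for s, c in counts.items() if c >= min_support}
        frequent.update(survivors)
        # candidate generation: join surviving k-sets into (k+1)-sets
        keys = list(survivors)
        k_sets = list({a | b for a, b in combinations(keys, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

def confidence(rule_from, rule_to, frequent):
    return frequent[rule_from | rule_to] / frequent[rule_from]

seqs = [frozenset("GAL"), frozenset("GAV"), frozenset("GA"),
        frozenset("GV")]
freq = frequent_itemsets(seqs, min_support=2)
print(confidence(frozenset("G"), frozenset("A"), freq))  # rule G => A
```

A disease check as described would test whether a known rule (say {G} => {A}) appears in the output with confidence above a chosen cut-off.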
Traditional encryption techniques use mathematical and theoretical concepts. With advances in technology, intruders are able to decrypt private data and steal secret information easily, so cryptography using DNA technology can be viewed as a new hope for unbreakable algorithms. In the proposed scheme, text data is encrypted using DNA technology, and a key is generated and encrypted. The encrypted text and key are then hidden in an image by applying the Discrete Cosine Transform (DCT) or Discrete Wavelet Transform (DWT) to generate a DCT- or DWT-watermarked image, which is transmitted to the receiver. Encryption followed by steganography further enhances the security of the transmitted data. The proposed algorithm is evaluated using the correlation coefficient, PSNR, MSE, and embedding capacity.
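A common DNA-computing primitive underlying such schemes maps each 2-bit pair of the plaintext to a nucleotide. The A/C/G/T table below is one conventional choice, not necessarily the paper's; the full scheme would add key generation and DCT/DWT watermarking on top of a step like this.

```python
# Map 2-bit pairs to nucleotides and back.
ENC = {"00": "A", "01": "C", "10": "G", "11": "T"}
DEC = {v: k for k, v in ENC.items()}

def to_dna(text):
    bits = "".join(f"{b:08b}" for b in text.encode())
    return "".join(ENC[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(dna):
    bits = "".join(DEC[n] for n in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode()

strand = to_dna("Hi")
print(strand, from_dna(strand))
```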
Lung cancer is one of the most difficult and dangerous cancers to cure, and the number of deaths due to cancer is increasing day by day. The death rate can be reduced if lung cancer is detected in its early stages. This paper aims at developing a Computer-Aided Diagnosis (CAD) system for the detection of lung cancer by analyzing Computed Tomography (CT) images of the lungs. Images are collected from the LIDC dataset and enhanced with a median filter to increase contrast. After enhancement, morphological segmentation is used to segment the lung region from the CT images. The segmented images are then used to identify and classify cancerous and non-cancerous nodules using a Support Vector Machine (SVM) classifier. The SVM with a polynomial kernel gave 96.6% accuracy, and the SVM with a quadratic kernel gave 92% accuracy.
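The paper's pipeline segments the lung region after enhancement. As a simpler stand-in for its morphological segmentation, the sketch below shows global Otsu thresholding, a common first step for separating tissue from background in a grayscale slice; the ten pixel values are an invented bimodal toy "image".

```python
# Otsu's method: choose the threshold t that maximizes the
# between-class variance of the pixel histogram.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(levels):
        w0 += hist[t]              # background pixel count
        if w0 in (0, total):
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                            # background mean
        m1 = (total_sum - sum0) / (total - w0)    # foreground mean
        var = w0 * (total - w0) * (m0 - m1) ** 2  # between-class var
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# toy bimodal "image": dark background ~30, bright tissue ~200
pixels = [28, 30, 32, 29, 31, 198, 200, 202, 199, 201]
t = otsu_threshold(pixels)
print(t, [int(p > t) for p in pixels])   # threshold and binary mask
```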
Improved Gradient & Multiple Selection Based Sorted Switching Median Filter
Kanchan Bali, Er. Prabhpreet Kaur,
This research work proposes an improved gradient-based sorted switching median filter for highly corrupted medical images. It has the capability to remove high-density noise from medical images, and it also performs better than other filters when the input image is noise-free. The proposed technique additionally preserves edges by using gradient-based smoothing. The approach was developed and executed in MATLAB 2013a using the Image Processing Toolbox, and various kinds of medical images were taken for the experiments. Comparative examination shows that the proposed algorithm is efficient compared with available techniques.
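The "switching" idea is what lets such a filter leave noise-free images alone: a pixel is replaced by its neighbourhood median only when it is judged to be an impulse. The sketch below uses the simplest impulse detector (saturation at 0 or 255) on an invented 4x4 image; the paper's gradient-based detector and sorting scheme are more elaborate.

```python
# Switching median filter sketch: filter only detected impulses,
# pass clean pixels through unchanged (which preserves detail).
def switching_median(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            p = img[y][x]
            if p not in (0, 255):          # not an impulse: keep it
                continue
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]          # median of the 3x3 window
    return out

noisy = [[10, 12, 11, 13],
         [12, 255, 12, 11],               # 255 = salt impulse
         [11, 13, 0, 12],                 # 0 = pepper impulse
         [13, 11, 12, 10]]
print(switching_median(noisy))
```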
Image denoising has evolved into an essential exercise in medical imaging, especially Magnetic Resonance Imaging (MRI). The presence of noise in biomedical images is one of the major challenges in image analysis and processing. Denoising techniques aim at removing distortion or noise from an image while retaining its original quality. Acquired medical images are often corrupted with noise, and MR images are processed to improve their visual appearance to viewers. Here we discuss multiresolution techniques such as the scalar wavelet, the multiwavelet, the Laplacian pyramid, and non-local means (NLM), and compare their statistical parameters.
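Wavelet-domain denoising, the common thread of the multiresolution methods above, can be shown in one dimension: a level of the Haar transform, soft-thresholding of the detail coefficients, then the inverse transform. 2-D MRI denoising applies the same idea per row and column; the signal values and threshold below are illustrative.

```python
import math

# One-level Haar decomposition, soft-threshold the details,
# then reconstruct.  Small details (noise) shrink to zero.
def haar_denoise(signal, thresh):
    approx = [(a + b) / math.sqrt(2)
              for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2)
              for a, b in zip(signal[::2], signal[1::2])]
    detail = [math.copysign(max(abs(d) - thresh, 0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

noisy = [10.0, 10.4, 9.8, 10.2, 30.0, 30.2, 10.1, 9.9]
denoised = haar_denoise(noisy, thresh=0.5)
print([round(v, 2) for v in denoised])
```

Note that the large step up to ~30 survives (it lives in the approximation band) while the small sample-to-sample jitter is smoothed away.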
Cloud computing is one of the latest computer and business industry buzzwords. These days everyone wants to get rid of the on-desk burden: on-desk hardware (memory, hard disk, CPU), application software, system software, business continuity techniques, databases, etc. Cloud computing is a solution that provides all of the above 'as a service'. Cloud computing is a package of services; data is no longer desktop-bound, and you can access it from anywhere in the world. Technological advancements, particularly the introduction of cloud computing, have now made this a reality. This paper presents the cloud as a service provider. Platform as a Service (PaaS) has become a bigger player in cloud computing, although it will continue to remain a smaller domain than Infrastructure as a Service (IaaS) and Software as a Service (SaaS). As the concept spread and offerings expanded, the industry embraced two flavors of cloud storage: public and private. Cloud computing is a way of doing business where the customer buys on demand and the supplier charges for what the customer uses. Due to its environmental benefits, the cloud is booming these days.
Design And Development Of Algorithm To Avoid Wormhole Attacks In Wireless Networks For Proactive Algorithms
Deepika D Pai,
Mobile ad hoc networks (MANETs) allow portable devices to establish communication independent of a central infrastructure. The wireless links in such a network are highly error-prone and can go down frequently due to node mobility. MANETs are vulnerable to many security attacks because of the shared channel, insecure operating environment, lack of central authority, limited resource availability, dynamically changing network topology, and resource constraints. Among the attacks at the different network layers, the wormhole attack is the most malicious. In this work, the wormhole attack is implemented in three different modes, i.e., all-pass, all-drop, and threshold mode, with varying network sizes. Because of the highly dynamic environment, routing in MANETs is a critical task, and efficient routing protocols are needed to make the network reliable. Routing protocols are of three kinds: proactive, reactive, and hybrid. Proactive protocols (DSDV, OLSR, WRP, GSR, TBRPF) are analyzed in the presence of the wormhole attack, considering four parameters: throughput, average end-to-end delay, average jitter, and packet delivery ratio, in both mobility and non-mobility scenarios.
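The three attack modes differ only in what the wormhole endpoint does with tunneled packets. A toy model of their effect on packet delivery ratio (one of the four metrics above) can be written directly; the packet count and threshold are invented, and a real study would measure this inside a full network simulator.

```python
# Toy wormhole endpoint: forward all, drop all, or forward only
# until an attacker-chosen threshold of packets has passed.
def wormhole_pdr(n_packets, mode, threshold=0):
    delivered = 0
    passed = 0
    for _ in range(n_packets):
        if mode == "all_pass":
            delivered += 1
        elif mode == "all_drop":
            pass                           # every packet is dropped
        elif mode == "threshold":
            if passed < threshold:
                passed += 1
                delivered += 1
    return delivered / n_packets           # packet delivery ratio

print(wormhole_pdr(100, "all_pass"))
print(wormhole_pdr(100, "all_drop"))
print(wormhole_pdr(100, "threshold", threshold=40))
```

All-pass wormholes leave PDR intact (their damage is to routing, since they attract traffic through a fake short path), while all-drop and threshold modes degrade PDR directly.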
A Feasible Approach to Design a CMOS Domino Circuit for Low Power VLSI Application Design
Pallavi Sharma, Dr Subodh Wariya,
Dynamic logic is used in high-performance circuit design because of its high speed and lower transistor count compared with static CMOS logic. However, it is not widely accepted for all types of circuit implementation due to its low noise tolerance and charge-sharing problems: a small noise spike at the input of a dynamic gate can change the desired output. Domino logic adds a static CMOS inverter at the output of the dynamic node, which makes it more noise-immune and lets it consume very little power compared with other proposed circuits. In this paper we propose a novel domino logic circuit that has less area and a lower power-delay product (PDP) than previously reported designs. The low PDP is achieved by using a semi-dynamic logic buffer and by reducing the leakage current when the pull-down network (PDN) is not conducting. A comparative analysis was carried out by simulating the circuits in the 90 nm TSMC CMOS process technology using Tanner EDA 14.11.
Design Of Energy-Efficient Clustering Algorithm for Large Scale Wireless Sensor Networks
Sridevi H N, C K Raju,
A wireless sensor network (WSN) is a network system comprised of spatially distributed wireless sensor nodes with limited energy resources. The individual nodes are capable of sensing their environment, processing the data locally, and sending it to one or more collection points. Energy consumption can be reduced by using a clustering algorithm: the system organizes the nodes of the entire network into clusters, and within a cluster, nodes transmit data to a cluster head (CH), which in turn transmits the data to the sink. In this paper, a new clustering algorithm named PDKC is proposed for wireless sensor networks, based on node deployment knowledge. In PDKC, sensor node location is modeled by a Gaussian probability distribution function instead of using GPS or any other location-aware devices. Finally, we present a distributed implementation of the clustering method. Extensive simulations confirm that our method reduces the number of transmissions significantly and show the corresponding time delay.
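The deployment-knowledge idea can be sketched directly: node positions are drawn from a Gaussian around a known deployment point (so no GPS is assumed), and each node joins its nearest cluster head. The coordinates, sigma, and CH positions below are invented for illustration; PDKC's actual CH election differs.

```python
import math, random

# Nodes scatter around a deployment point with Gaussian spread.
def deploy(n, center, sigma, seed=3):
    rng = random.Random(seed)
    return [(rng.gauss(center[0], sigma), rng.gauss(center[1], sigma))
            for _ in range(n)]

# Each node joins the nearest cluster head.
def assign_clusters(nodes, heads):
    def nearest(p):
        return min(range(len(heads)),
                   key=lambda i: math.dist(p, heads[i]))
    return [nearest(p) for p in nodes]

nodes = (deploy(6, center=(0, 0), sigma=1.0) +
         deploy(6, center=(10, 10), sigma=1.0, seed=4))
heads = [(0, 0), (10, 10)]
labels = assign_clusters(nodes, heads)
print(labels)   # nodes near each deployment point join that point's CH
```

Because each node only transmits the short hop to its CH, the number of long-range transmissions to the sink drops, which is the energy saving the abstract targets.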
Design of Wireless Variable Gesture Control For Mobile Devices
Ch. Supriya, R. Prashanthi, M.Tech
Electromyography (EMG) is a technique for evaluating and recording the electrical activity produced by skeletal muscles. EMG is performed using an instrument called an electromyograph, to produce a record called an electromyogram. An electromyograph detects the electrical potential generated by muscle cell when these cells are electrically or neurologically activated. The signals can be analyzed to detect medical abnormalities, activation level, or recruitment order or to analyze the biomechanics of human or animal movement. In the proposed method we are using sensors instead of camera to sense and identify the gestures. Here we are using Accelerometers and surface electromyography (SEMG) sensors. These two sensors provide two potential technologies for gesture sensing. Accelerometers can measure accelerations (ACC) from vibrations and the gravity; therefore they are good at capturing noticeable, large-scale gestures.
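A minimal version of the accelerometer side computes the acceleration magnitude per sample and flags samples that depart from gravity (about 1 g at rest) by more than a threshold, a crude large-scale-gesture detector. The sample triples and the 0.5 g threshold are invented for illustration.

```python
import math

# Flag samples whose acceleration magnitude deviates from 1 g.
def gesture_samples(acc, thresh=0.5):
    flags = []
    for ax, ay, az in acc:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        flags.append(abs(mag - 1.0) > thresh)   # deviation from 1 g
    return flags

samples = [(0.0, 0.0, 1.0),    # at rest
           (0.1, 0.0, 1.05),   # small tremor
           (1.2, 0.8, 1.5),    # vigorous hand movement
           (0.0, 0.05, 0.98)]
print(gesture_samples(samples))
```

SEMG would complement this by resolving fine gestures that barely move the sensor; the paper fuses both streams.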
ARM Based Application Of Data Hiding Process By Using Anti Forensics Technique For Authentication
K. Jaswanthi, J. Amarendra, M.Tech (Ph.D),
In this paper, a system for data hiding in audio-video using an anti-forensics technique for authentication and data security is designed around a 32-bit ARM controller for image and audio steganography. The embedded system helps secure the message within the audio and video file: even if an unauthorized person succeeds in hacking the image, that person will not be able to read the message or acquire the information in the audio file. Secret data such as image and audio is encrypted into the cover data by an application built on the LSB algorithm running on an ARM architecture device.
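The LSB step itself is simple: hide each message bit in the least significant bit of a cover sample, then read the bits back. The cover values below are illustrative stand-ins for image or audio samples; the paper's system layers encryption on top of this embedding.

```python
# Embed message bytes into the LSBs of cover samples.
def lsb_embed(cover, message):
    bits = [(b >> i) & 1 for b in message for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small"
    out = list(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit       # overwrite the LSB only
    return out

def lsb_extract(stego, n_bytes):
    bits = [s & 1 for s in stego[:8 * n_bytes]]
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8))

cover = list(range(100, 120))              # 20 cover samples
stego = lsb_embed(cover, b"Hi")
print(lsb_extract(stego, 2))
```

Each cover sample changes by at most 1, which is why LSB embedding is imperceptible in 8-bit imagery or audio.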
Quality Of Experience In Remote Virtual Desktop Services
P.Lakshmi, S.Himabindu,
The Remote Desktop (RD) paradigm allows end users to remotely access content and applications running on their PCs through a network connection. The remote desktop system described here uses the 32-bit ARM Raspberry Pi board for the development of remote virtual desktop services. The Raspberry Pi is interfaced with a keyboard, mouse, and monitor, connected to the PC over a LAN, and communicates using the RDP protocol. To access the PC, an X server is used on both the PC and the Raspberry Pi. The X server (GUI) acts as the interface between the user and the operating system: it handles all actions generated by the keyboard and mouse and signals the system to perform the corresponding task. For example, when an icon on the desktop is clicked with the mouse, an event is raised for the operating system to perform the associated action. Since the PC and the Raspberry Pi are both on the LAN, the desktop can be accessed using RDP, the remote desktop protocol, which reads the frame buffer from the remote desktop and displays it on the monitor connected to the Raspberry Pi. The remote desktop then appears on that monitor, and the keyboard and mouse can be used to operate it, for example to open a web browser or enter data in a text editor.
Kindling and Perception of QR Images Using Raspberry Pi
P. V. Vinod Kumar, K. Dhanunjaya, M.Tech
QR codes are detected by a webcam connected to the ARM microcontroller through USB, and the captured image is processed using image processing techniques. Image processing is any form of signal processing for which the input is an image, such as a photograph or video frame; the output may be either an image or a set of characteristics or parameters related to the image. In this project we not only recognize the QR code but also generate a color image from it. The main advantage of this project is that it recognizes the QR code from the webcam and generates a color image; the controller displays the results on the display unit, and both the QR code and the color image can be stored on a pen drive.
Text Mining using Ontology based Similarity Measure
Atiya Kazi, D.T.Kurian,
In many text mining applications, side-information contained within the text documents can enhance the overall clustering process. The proposed algorithm clusters data along with this side-information by combining classical partitioning algorithms with probabilistic models to boost the efficacy of the clustering approach. The generated clusters are then used as a training model to solve the classification problem. The proposed work also makes use of a similarity-based ontology algorithm, incorporating two shared word spaces, to improve the clustering. This increases the amount of knowledge gained from text documents by combining ontology with side-information.
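The vector-space core beneath such similarity measures is cosine similarity over a shared word space. The sketch below shows only that core on two invented documents; the paper's ontology layer would additionally map related terms (e.g. "car" and "automobile") onto shared dimensions before comparing.

```python
import math
from collections import Counter

# Cosine similarity between term-frequency vectors.
def cosine(doc_a, doc_b):
    va, vb = Counter(doc_a.split()), Counter(doc_b.split())
    shared = set(va) & set(vb)
    dot = sum(va[w] * vb[w] for w in shared)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb)

print(cosine("data mining of text data", "text data clustering"))
```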
Exploiting a Protocol at the Network Frame to Avoid Packet Failure
Ms.Jayashree Yadannavar, Ms.Chandrakala Patil
Concurrent data communication over the Internet depends on congestion control; without it, packet failure is very difficult to avoid, which directly signifies an incompetent design, insecurity, and reduced performance. Some mechanism is needed to control packet loss and preserve quality of service while rendering media-rich content. Congestion control algorithms are the mechanisms that limit packet loss and so maintain good network performance, and many protocols supplement TCP to help avoid network congestion. The fair service of the open-loop controller built by CSFQ has degraded in quality as P2P flows have come to dominate Internet traffic of late. Token-Based Congestion Control (TBCC), a closed-loop congestion control scheme, is able to limit excessive resource usage and give the best service to end users; it monitors inter-domain traffic for trust relationships. The new mechanism presented here, Stable Token-Limited Congestion Control (STLCC), controls inter-domain congestion and improves network performance. In this paper we apply the STLCC method.
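Token-limited sending can be shown in miniature with a token bucket: a packet is admitted only when a token is available, which caps the rate a flow can push into the network. The rate, burst capacity, and arrival times below are illustrative; STLCC itself applies token accounting at inter-domain links rather than per-flow buckets.

```python
# Token bucket: tokens refill at `rate` per second up to
# `capacity`; each admitted packet spends one token.
class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.now = 0.0

    def allow(self, t):
        """Return True if a packet arriving at time t may be sent."""
        self.tokens = min(self.capacity,
                          self.tokens + (t - self.now) * self.rate)
        self.now = t
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=2.0, capacity=3)     # 2 tokens/s, burst 3
arrivals = [0.0, 0.1, 0.2, 0.3, 0.4, 2.0]
print([bucket.allow(t) for t in arrivals])
```

The initial burst of three packets drains the bucket, the next two arrivals are refused, and by t = 2.0 enough tokens have refilled to admit traffic again.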
Minimizing Time Complexity based on Parallel Processing using RNS for Audio
Divya Jennifer Dsouza, Radhakrishna Dodmane,
In this paper, we propose a secure mechanism to transfer sensitive audio files over an unreliable network. To prevent unauthorized access to data that can be intercepted during transmission, a secure approach, audio cryptography with parallelism, is introduced as a possible solution to this problem. The proposed approach uses a Pseudo Random Number Generator (PRNG), seeded with an Initial Vector (IV), to generate random numbers, and the Residue Number System for encryption. The audio file is encrypted using the keys generated by the PRNG. Share creation with the Residue Number System (RNS), based on the Chinese Remainder Theorem (CRT), is performed in parallel to maximize resource utilization and make the process faster. The audio shares are then sent separately over the network to increase security. Theoretical analysis and experimental results show a significant reduction in time consumption.
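RNS share creation and CRT reconstruction can be shown in miniature: a sample value is split into residues modulo pairwise-coprime moduli (the shares), and the Chinese Remainder Theorem recombines them. The moduli set and sample value below are an illustrative choice, not the paper's parameters.

```python
from math import prod

MODULI = (13, 17, 19)                     # pairwise coprime

def make_shares(value):
    # each residue can be computed (and transmitted) independently,
    # which is what enables the parallelism
    return [value % m for m in MODULI]

def reconstruct(shares):
    M = prod(MODULI)
    total = 0
    for r, m in zip(shares, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)  # modular inverse of Mi mod m
    return total % M

sample = 3021                              # e.g. one 12-bit audio sample
shares = make_shares(sample)
print(shares, reconstruct(shares))
```

No single share reveals the sample; only the combination of all residues reconstructs it, and each residue channel can be processed in parallel.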
Design of An Efficient Aging-Aware Reliable Multiplier Based On Adaptive Hold Logic
T. Shireesha, P.P. Nagaraja Rao,
Digital multipliers are among the most critical arithmetic functional units, and the overall performance of digital systems depends on the throughput of the multiplier. The negative bias temperature instability (NBTI) effect occurs when a pMOS transistor is under negative bias (Vgs = -Vdd), increasing the threshold voltage of the pMOS transistor and reducing the multiplier speed. Similarly, positive bias temperature instability (PBTI) occurs when an nMOS transistor is under positive bias. Both effects degrade transistor speed, and in the long term the system may fail due to timing violations. It is therefore necessary to design reliable high-performance multipliers. In this paper, we implement an aging-aware multiplier design with a novel adaptive hold logic (AHL) circuit. The multiplier provides higher throughput through variable latency, and the AHL circuit can be adjusted to lessen the performance degradation caused by the aging effect. The proposed design can be applied to the column-bypass multiplier.
The objective of this paper is to design and build a portable assistive device for narcolepsy patients, to help them feel comfortable, lead a normal life, and be independent. A narcoleptic person suffers from a brain disorder that causes uncontrollable sleepiness triggered by strong emotions, anger, etc. The device provides a real-time monitoring system, alerts patients when they are in a state of sleep, and also reduces parameters that can trigger narcolepsy (such as excessive vibration and heavy noise). This paper gives a brief description of an assistive system designed to help narcoleptic patients lead a normal life.
Spectral-Spatial Classification for Hyperspectral Images Based on an Effective Extended Random Walker
P.Sreevani, Prof.B. Abdul Rahim, Fahimuddin.Shaik,
Hyperspectral images are classified using spectral-spatial classification based on the extended random walker (ERW), which involves two main steps. First, pixel-wise classification is performed with a support vector machine (SVM), producing classification probability maps for the hyperspectral image, i.e., the probabilities that each pixel belongs to the different classes. Second, the pixel-wise probability maps are optimized with the extended random walker algorithm, which determines the class of each test pixel based on three factors: the pixel-wise statistics from the SVM classifier, the spatial correlation among adjacent pixels modeled by the weights of the graph edges, and the connectedness between the training and test samples modeled by random walkers. Combined with a Gaussian mixture model, the proposed method shows very good classification performance and high accuracy on three widely used real hyperspectral data sets, even when the number of training samples is relatively small.
Spray Forming of Al-10Si-20Pb, Al-10Si-20Sn and Al-20Sn Alloys: A Comparative Study of Their Microstructure and Wear Characteristics
Mallikarjuna Pujar, Dr.G.B.Rudrakshi,
In this work an attempt has been made to evaluate the microstructural and wear characteristics of spray-formed Al-10Si-20Pb, Al-10Si-20Sn, and Al-20Sn alloys. The spray forming method was employed to prepare the samples. The microstructural features of the spray-formed alloys were studied under an optical microscope; the microstructure shows a uniform distribution of lead and tin particles along the grain boundaries of the base alloy. The wear characteristics of the alloys were investigated under dry sliding conditions on a pin-on-disc wear testing machine, and the influence of applied load, sliding speed, and friction coefficient was determined. The worn surfaces of the pins were studied under the optical microscope to determine the wear mechanism. The wear test results for all alloys show an increase in wear rate with applied load and sliding speed. The spray-formed Al-10Si-20Pb alloy exhibits better wear resistance and a lower coefficient of friction than the Al-10Si-20Sn and Al-20Sn alloys.
Comparative analysis of Statistical Parameters of Finger Knuckle in Digital Image Processing
Er. Mahak kukreja, Er. Alankrtia Aggarwal,
Accurate automatic personal identification is becoming more and more important to the operation of our increasingly electronically interconnected information society. Among the several types of biometrics used for security, the finger-knuckle print has a low data collection error rate and high user acceptability, so finger-knuckle print identification has been an important research topic for many years. Approaches that use the estimated orientation field in a finger-knuckle print image for classification, on the other hand, suffer from lower accuracy. This work explores the use of image processing on finger-knuckle prints: it extracts the different parameters of a finger-knuckle print and quantifies to what degree two finger-knuckle prints are similar. The system is used for person verification, and we also analyze PSNR as well as SNR.
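The two statistical parameters named above are standard and easy to state for 8-bit images: MSE and the PSNR derived from it. The two tiny "knuckle images" below are invented to keep the arithmetic visible.

```python
import math

# MSE and PSNR for 8-bit images given as lists of rows.
def mse(a, b):
    n = sum(len(r) for r in a)
    return sum((x - y) ** 2
               for ra, rb in zip(a, b) for x, y in zip(ra, rb)) / n

def psnr(a, b, peak=255):
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)

img1 = [[52, 55], [61, 59]]
img2 = [[54, 55], [60, 59]]
print(mse(img1, img2), round(psnr(img1, img2), 2))
```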
Azmath Akheela, Hina Naz,
With fast-growing consumer demands and rapidly developing mobile technologies, portable mobile devices are becoming a necessity of our daily lives. However, existing mobile devices rely on the wireless infrastructure to access Internet services provided by central application providers. This architecture is inefficient in many situations and does not exploit the abundant inter-device communication opportunities available in many scenarios. This paper proposes the human network (HUNET), a network architecture that enables information sharing between mobile devices through direct inter-device communication. We design B-SUB, an interest-driven information sharing system for HUNETs. In B-SUB, content and user interests are described by tags: human-readable strings that are designated by users. An experiment is performed to demonstrate the effectiveness of this tag-based content description method. To facilitate efficient data dissemination, we invent the Temporal Counting Bloom Filter (TCBF) to encode tags, which also reduces the overhead of content routing. Comprehensive theoretical analyses on the parameter tuning of B-SUB are presented and verify B-SUB's ability to work efficiently under various network conditions. We then extend B-SUB's routing scheme to provide a stronger privacy guarantee. Extensive real-world trace-driven simulations evaluate the performance of B-SUB, and the results demonstrate its efficiency and usefulness.
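A counting Bloom filter with decay captures the spirit of the TCBF: inserting a tag raises k counters, and a periodic decay step lowers them so stale interests fade. The size, hash count, and decay amount below are illustrative, and the real TCBF's temporal rules differ in detail.

```python
import hashlib

# Counting Bloom filter whose counters decay over time.
class CountingBloom:
    def __init__(self, size=64, k=3):
        self.counts = [0] * size
        self.size, self.k = size, k

    def _indexes(self, tag):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{tag}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, tag, weight=2):
        for i in set(self._indexes(tag)):   # each counter raised once
            self.counts[i] += weight

    def contains(self, tag):
        return all(self.counts[i] > 0 for i in self._indexes(tag))

    def decay(self):
        self.counts = [max(0, c - 1) for c in self.counts]

bf = CountingBloom()
bf.add("music")
print(bf.contains("music"), bf.contains("sports"))
bf.decay(); bf.decay()          # two decay rounds: "music" expires
print(bf.contains("music"))
```

Like any Bloom filter this trades a small false-positive rate for compactness, which is why it suits exchanging interest summaries between phones.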
Performance Evaluation of LMS, DLMS and TVLMS Digital Adaptive FIR filters by using MATLAB
Ashwini Ramteke, Prof. N. P. Bodane,
This paper compares adaptive filtering algorithms: least mean squares (LMS), delayed least mean squares (DLMS), and time-varying least mean squares (TVLMS). Implementation aspects of these algorithms and their signal-to-noise ratios are examined, and the adaptive behavior of the algorithms is analyzed. MATLAB/Simulink is used for the design. Noise cancellation is a technique used for reducing an undesired noise signal. The LMS, DLMS, and TVLMS algorithms are analyzed and compared on the basis of various performance indices such as MSE and SNR. The experimental results reveal that the LMS algorithm gives the best outputs.
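All three algorithms share the same update rule, w += mu * e * x, which DLMS delays and TVLMS gives a time-varying step size. The sketch below shows plain LMS identifying an unknown 2-tap system; the channel taps, step size, and signal lengths are illustrative.

```python
import random

# Plain LMS: for each sample, form the filter output, compute the
# error against the desired signal, and nudge the weights.
def lms(x, d, taps=2, mu=0.05):
    w = [0.0] * taps
    for n in range(taps, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]        # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, xn))
        e = d[n] - y                            # error signal
        w = [wi + mu * e * xi for wi, xi in zip(w, xn)]
    return w

rng = random.Random(0)
x = [rng.uniform(-1, 1) for _ in range(2000)]
true_w = [0.7, -0.3]                            # unknown system
d = [0.0, 0.0] + [true_w[0] * x[n] + true_w[1] * x[n - 1]
                  for n in range(2, 2000)]
w = lms(x, d)
print([round(wi, 3) for wi in w])   # approaches the true taps
```

In a noise-cancellation setup, d would be the noisy signal and x the noise reference; the filter output then estimates the noise component to subtract.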
Reactive Power Compensation In Distribution Systems Using SAPF
Mandeep Paposa, Mr. Shailendra Kumar Verma
Traditionally, when one imagines power quality, images of classical waveforms comprising 3rd, 5th, 7th, etc. harmonics come to mind. On this basis, the IEEE, together with industry, utilities and academia, began to attack this "power quality" problem in the 1980s. However, in the last few years the term power quality has grown in meaning beyond simple power system harmonics. Since the growth of industry is inevitable, with ever more complex and sensitive equipment involving semiconductors, power quality has come to cover a whole new dimension of errors that appear in the power system. These issues and anomalies are becoming more unavoidable and are growing strongly, and sensitive equipment will become even more sensitive due to fierce market competition. Now, efficiency and cost are treated as peer parameters of equal importance. In such a situation, active power filters have emerged as the solution, in which the SAPF is employed for the removal of load current harmonics and for reactive power compensation. Active filters for power conditioning offer the following:
Cloud-Computing: A milestone in Information Technology
Kumari Seema Rani ,
In the recent booming era, information technology has developed to offer humans entertainment, convenience, enjoyment, etc. Cloud computing is one of the emerging developments of the IT industry. It is an IT service based on leased resources provided to customers over a network, and is also known as on-demand computing. The services of cloud computing are managed by a third party. It provides scalability, reliability, high performance and low cost compared to dedicated infrastructure, and saves management cost and time. Most sectors, such as banking, public healthcare and education, are moving towards the cloud due to the efficient utilization of cloud services on a pay-per-use basis. Cloud computing is a completely internet-dependent technology where client data is stored and maintained in the data centre of a cloud provider such as Google, Amazon or Microsoft. The aim of this paper is a better understanding of cloud computing and its design; the paper outlines the cloud and its details.
Design of Query Recommendation System using Clustering & Rank Updater
Priyanka Rani, Mrs. Annu Mor
In this paper I propose a method that, given a query submitted to a search engine, suggests a list of related queries. Query recommendation is a method to improve search results on the web. This paper presents a method for mining search engine query logs to obtain fast query recommendation on a large scale. Search engines generally return long lists of ranked pages; finding the important information related to a particular topic is becoming increasingly difficult, and optimized search engines have therefore become one of the most popular solutions available. In this work, an algorithm has been applied to recommend queries related to the query submitted by the user. For this, the technology used to allow query recommendations is the query log, which contains attributes such as query name, clicked URL, rank and time. The similarity based on keywords as well as clicked URLs is then calculated, and clusters are obtained by combining the similarities of both keywords and clicked URLs. The related queries are based on previously issued queries. The method not only discovers the related queries, but also ranks them according to a relevance criterion. In this paper the rank is updated only for the clicked URL, not for all the related URLs of the page.
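The combination of keyword similarity and clicked-URL similarity described above can be sketched with Jaccard coefficients; the equal weighting and the sample queries below are illustrative assumptions, not the paper's exact formula.

```python
def jaccard(a, b):
    """Jaccard similarity of two sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def query_similarity(q1, q2, alpha=0.5):
    """Combine keyword similarity and clicked-URL similarity.
    Each query is a (keywords, clicked_urls) pair; alpha weights
    the keyword component (0.5 = equal weight, an assumption)."""
    kw = jaccard(q1[0], q2[0])
    url = jaccard(q1[1], q2[1])
    return alpha * kw + (1 - alpha) * url

# Two hypothetical log entries for related queries:
q_a = ({"cheap", "flights", "paris"}, {"kayak.com", "expedia.com"})
q_b = ({"flights", "paris"}, {"kayak.com"})
s = query_similarity(q_a, q_b)  # 0.5*(2/3) + 0.5*(1/2) = 7/12
assert 0.0 < s <= 1.0
```

Pairwise similarities of this kind can then feed any standard clustering step, and queries in the clicked cluster can be ranked by the relevance criterion the paper describes.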
Abnormality Detection in Kidney Ultrasound Imaging
Prema T. Akkasalgar, Shruti S. Karakalmani
This paper proposes a technique for the segmentation and identification of kidney stones from ultrasound images of the kidney. Initially, the ultrasound kidney image is pre-processed using two types of filters, namely a median filter and a wavelet filter. The pre-processed image is taken as input for the segmentation process; a seed region growing algorithm is used to segment the renal calculi from the ultrasound image of the kidney, and region parameters are then extracted from the segmented region. Finally, the area of each renal calculus and the time taken for the entire process are calculated.
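The seed region growing step can be sketched generically as follows. This is a 4-connected, intensity-threshold version on a toy grayscale array; the paper's actual homogeneity criterion and pre-processing are not specified here, so the tolerance and test image are illustrative.

```python
from collections import deque

def region_grow(img, seed, tol=10):
    """Grow a region from `seed` over 4-connected pixels whose
    intensity is within `tol` of the seed intensity.
    img: 2-D list of grayscale values; returns the set of (row, col)."""
    h, w = len(img), len(img[0])
    base = img[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(img[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# A bright 2x2 "calculus" embedded in a dark background:
img = [[10, 10, 10, 10],
       [10, 200, 205, 10],
       [10, 198, 202, 10],
       [10, 10, 10, 10]]
stone = region_grow(img, seed=(1, 1), tol=20)
assert len(stone) == 4  # region area in pixels
```

The region's pixel count directly gives the area measurement the abstract mentions (after scaling by the physical pixel size of the ultrasound image).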
Simulation Of High Voltage Gain Boost Converter For Battery Charging With PV System In A Single Stage Conversion
Praneetha Tummala, Amarendra Alluri
The need for renewable energy sources is on the rise because of the acute energy crisis in the world today. Renewable energy is energy which comes from natural resources such as sunlight, wind, rain, tides and geothermal heat. These resources are renewable and can be naturally replenished; therefore, for all practical purposes, they can be considered never-ending, unlike diminishing conventional fossil fuels. In order to support the growing demand of renewable energy applications, this work presents a high-voltage-gain boost converter topology for battery charging, based on coupled inductors and solar panels, which decreases the ripple in the output voltage. The presented converter operates in zero-voltage switching (ZVS) mode for all switches. By using the novel concept of a single-stage approach, the converter is capable of generating a dc bus while simultaneously charging the batteries according to the radiation level. In this paper the output voltage ripple is examined under various MPPT techniques. Simulation is done using MATLAB/SIMULINK.
Acoustic Analysis of voice samples to differentiate Healthy and Asthmatic persons
Khushboo Batra , Swati Bhasin , Amandeep Singh
Voice analysis for disease detection is a very important research topic in biomedical engineering, as the process is non-invasive and reliable. In this paper, speech records of the five vowels /a/, /i/, /e/, /o/ and /u/ have been used to compare asthmatic and healthy persons. The acoustic voice parameters jitter, shimmer, harmonic-to-noise ratio (HNR), noise-to-harmonic ratio (NHR) and mean autocorrelation were extracted using the PRAAT software. These parameters were then used to classify and compare the asthmatic patients and healthy persons. In other words, these features represent the particular voice and may be used to distinguish healthy persons from unhealthy persons. The results suggest the feasibility of detecting asthma by analyzing various acoustic parameters of the voice.
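One of the extracted parameters, local jitter, has a standard definition (mean absolute difference between consecutive pitch periods over the mean period); a minimal sketch follows. The example period sequences are fabricated for illustration and do not come from the study's recordings.

```python
def local_jitter(periods):
    """Local jitter (%): mean absolute difference between consecutive
    pitch periods, divided by the mean period. `periods` in ms."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    mean_abs_diff = sum(diffs) / len(diffs)
    mean_period = sum(periods) / len(periods)
    return 100.0 * mean_abs_diff / mean_period

# Hypothetical pitch periods (ms) of a sustained /a/:
steady = [8.00, 8.02, 7.98, 8.01, 7.99]      # healthy-like regularity
irregular = [8.0, 8.8, 7.3, 8.9, 7.2]        # strong cycle-to-cycle variation
assert local_jitter(steady) < local_jitter(irregular)
```

Shimmer is computed analogously from cycle peak amplitudes instead of periods, which is why the two measures are usually reported together.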
Pitch, Formant Frequencies and Intensities as Speech Cues to Perceived Age
Tarunam Chaudhary, Swati Bhasin, Saloni
The voice of a person gives an idea of the age, gender and health of that person. In this paper we try to estimate the age of the speaker. Our study includes 30 speakers in the 14-30, 31-49 and 50-80 age groups. We considered different voice parameters: the fundamental frequency F0, the formant frequencies F1, F2, F3 and F4, and intensities. Different voiced vowels are used to study the changes between the three age groups.
Running Cost Minimization of an Electric Power Sector
Osita Oputa
The electric power sector, like any other sector, provides services to the public with one of its aims being to make a profit; this ensures that the human and material resources involved in running the business are well taken care of, so as to keep the business going and improve customer satisfaction. This can be achieved by maximizing profit without placing exorbitant tariffs on customers. Hence, reducing the cost of running the system is a means of maximizing profit. This paper considers how the running cost of the system can be reduced at the generation stage, by reducing the total fuel burnt in generating a specific amount of power in the system per hour. Only part of the Nigerian electric power sector is considered as a case study, precisely the Niger-Delta region of Nigeria, and it is assumed that the power generated by the plants in the region is also consumed there. Research shows that for a given total power demand/generation by all generating units in the region, a corresponding total quantity of fuel will be burnt; with the system of running cost minimization developed in this paper, less fuel is burnt in generating the same quantity of power demanded, resulting in a net saving.
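The classical way to minimize total fuel cost across generating units is the equal incremental-cost (economic dispatch) condition; a closed-form sketch for quadratic cost curves follows. The two plants and their cost coefficients are hypothetical, and the paper's actual cost model and constraints (e.g. generator limits) may differ.

```python
def dispatch(units, demand):
    """Equal incremental-cost dispatch for quadratic fuel-cost curves
    C_i(P) = a_i + b_i*P + c_i*P^2, no generator limits (illustrative).
    At the optimum every unit runs at the same marginal cost lambda,
    with P_i = (lambda - b_i) / (2*c_i) and sum(P_i) = demand."""
    s1 = sum(b / (2 * c) for _, b, c in units)
    s2 = sum(1 / (2 * c) for _, b, c in units)
    lam = (demand + s1) / s2
    return [(lam - b) / (2 * c) for _, b, c in units]

def total_cost(units, loads):
    return sum(a + b * p + c * p * p for (a, b, c), p in zip(units, loads))

units = [(100, 2.0, 0.010),   # (a, b, c) for two hypothetical plants
         (120, 1.5, 0.015)]
opt = dispatch(units, demand=500)          # -> [290.0, 210.0] MW
assert abs(sum(opt) - 500) < 1e-6          # demand is met exactly
# The optimal split never costs more than a naive even split:
assert total_cost(units, opt) <= total_cost(units, [250, 250])
```

Since fuel burnt per hour is monotone in fuel cost for each unit, minimizing this cost function is equivalent to minimizing the total fuel burnt for the same delivered power, which is the saving the paper quantifies.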
Implementation of ERP module for the key processes of hull production in shipbuilding industry
Kairav Bhatt
Enterprise Resource Planning (ERP) is a strong tool in the hands of management for taking timely and accurate decisions during the execution of projects. It also enables an organization to compete through value addition, cost reduction and improvement. In the present context, ERP has become a common requirement of any organizational excellence model.
ERP has several modules, such as customer requirements, finance and accounts, materials, projects, manufacturing, quality, welding and services.
Duplicate Detection In Hierarchical XML Multimedia Data Using Improved Multidup Method
Prof. Pramod B. Gosavi, Mrs. M. A. Patel
This paper presents a novel technique to find duplicate records in hierarchical (XML) data which contains multimedia attributes. Nowadays data is stored in more complex, semi-structured or hierarchical structures, and the problem that arises is how to detect duplicates in this XML data. Due to differences between data models, the algorithms designed for a single relation cannot be applied to XML data. The objective of this paper is to detect duplicates in hierarchical data containing textual as well as multimedia data such as images, audio and video; to act on that data according to the user's choice (delete, update, etc.); and to prune the duplicate data using a pruning algorithm included in the proposed system. A Bayesian network is used for duplicate detection, and through experiments on both artificial and real-world datasets the MULTIDUP method is shown to perform duplicate detection with high efficiency and effectiveness. The method compares each level of the XML tree from the root to the leaves: the first step is to go through the structure of the tree, comparing each descendant of both input datasets and finding duplicates despite differences in the data.
Although hidden web crawlers seem to be of great use when searching for hidden data, current hidden web crawlers work on only one domain, which proves to be a major drawback for the compatibility of the searching process. This paper highlights the extended working of hidden crawlers by providing a multiple-domain feature, which not only helps the user get high-quality content but also makes the searching process faster and broader. The multiple-domain system can work on two or more domains, depending on the requirement, and also provides fast linkage between various webpages using multiple links. Thus the project proves to be a useful extension of this extensive tool.
On Improving the Side-Channel Attack Defence Mechanism Using the ECPM Technique in Elliptic Curve Cryptography
Mrs. Sweta Nigam, Mr. K.N. Hande
In recent years, elliptic curve cryptography (ECC) has gained widespread exposure and acceptance, and has already been included in many security standards. Engineering of ECC is a complex, interdisciplinary research field encompassing such fields as mathematics, computer science, and electrical engineering. In this paper, we survey ECC implementation issues as a prominent case study for the relatively new discipline of cryptographic engineering. In particular, we show that the requirements of efficiency and security considered at the implementation stage affect not only mere low-level, technological aspects but also, significantly, higher level choices, ranging from finite field arithmetic up to curve mathematics and protocols.
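The core ECPM operation, scalar point multiplication, can be sketched with textbook double-and-add arithmetic on a toy curve. The curve y² = x³ + 2x + 2 (mod 17) and generator (5, 1) are a standard small classroom example, not a standards-track curve; and note that plain double-and-add is exactly the pattern side-channel attacks exploit, which is why hardened implementations use fixed-pattern ladders instead.

```python
# Toy curve y^2 = x^3 + 2x + 2 over GF(17); None is the point at infinity.
P_MOD, A = 17, 2

def ec_add(P, Q):
    """Affine point addition/doubling on the toy curve."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:                                        # tangent (doubling)
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                             # chord (addition)
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, P):
    """Left-to-right double-and-add scalar multiplication k*P.
    The data-dependent 'add' branch leaks timing/power information;
    side-channel defences replace it with a constant-pattern ladder."""
    R = None
    for bit in bin(k)[2:]:
        R = ec_add(R, R)          # always double
        if bit == "1":
            R = ec_add(R, P)      # conditional add -> the leak
    return R

G = (5, 1)                        # generator of order 19 on this curve
assert ec_mul(2, G) == ec_add(G, G)
assert ec_mul(19, G) is None      # 19*G wraps to the point at infinity
```

Swapping `ec_mul` for a Montgomery ladder keeps the same interface while performing one double and one add per bit regardless of the key, which is the kind of implementation-level choice the survey argues must be made alongside the curve-level mathematics.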
To Study the Nonlinear Effects Related To PAPR and PAS in Wireless System
Rajvansh Sehamby , Dr. Jyoteesh Malhotra
A communication system is considered that includes a transmitter device and a receiver device. The transmitter device transmits input data as a transmitted signal having a known non-linear distortion (NLD) characteristic. The receiver receives a signal that represents a channel-affected version of the transmitted signal and that has the known NLD characteristic; the received signal includes power amplifier distortion (PAD) induced by the transmitter device's power amplifier. The aim is to reduce the NLD by reducing the PAPR of the multicarrier system, and the PAD at the transmitter side, using techniques simulated in MATLAB.
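The quantity being reduced, PAPR, has a direct definition: the ratio of peak to average power of the time-domain OFDM symbol. The sketch below computes it via a direct IDFT, without the oversampling or windowing that practical measurements would add; the 64-carrier all-ones symbol is just the textbook worst case.

```python
import cmath
import math

def papr_db(symbols):
    """Peak-to-average power ratio (dB) of one OFDM symbol:
    IDFT the subcarrier symbols, then compare peak time-domain
    power to mean time-domain power."""
    n = len(symbols)
    time = [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Identical symbols on every subcarrier add coherently at t = 0,
# giving the N-carrier worst case of 10*log10(N) dB (~18 dB for N = 64):
worst = papr_db([1] * 64)
assert abs(worst - 10 * math.log10(64)) < 1e-6
```

Any PAPR-reduction technique (clipping, selective mapping, tone reservation, etc.) can be evaluated by running its candidate symbols through a function like this and comparing the resulting distributions.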
Optimized Distributed Association Mining (ODAM) Algorithm for detecting Web Robots
Akshata Dilip Jagtap, Prof. Vijayalaxmi Kadroli
As a result of the tremendous growth of the World Wide Web, raw web data has become a vast source of information. Nowadays the usage of search engines is considerably high among all types of users. The information required by different groups of users is collected and represented by different types of search engines from many sources. A web robot is one of the strategies used by many search engines, as well as many other websites, for collecting data from the respective websites. Although these web robots are useful for many applications, they are also dangerous for websites, since they can extract the information of a particular site in an unauthorized way.
In this paper, we suggest an association rule mining algorithm named "Optimized Distributed Association Mining" (ODAM) for identifying web robots which try to traverse a website's confidential pages or content in order to extract that content; after identification, ODAM removes the web robot program. In this way ODAM helps protect websites from unauthorized access and other malicious programs.
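As an illustration of the association-mining building block (not the distributed ODAM algorithm itself), a level-wise, Apriori-style frequent-itemset miner over page-visit sessions might look like the following; the session data and support threshold are invented.

```python
def frequent_itemsets(sessions, min_support):
    """Level-wise (Apriori-style) frequent-itemset mining over
    page-visit sessions. Returns {itemset: support}. A single-node
    simplification; ODAM distributes this counting across sites."""
    n = len(sessions)
    freq = {}
    candidates = [frozenset([i]) for i in {i for s in sessions for i in s}]
    size = 1
    while candidates:
        counts = {c: sum(c <= s for s in sessions) for c in candidates}
        level = {c: k / n for c, k in counts.items() if k / n >= min_support}
        freq.update(level)
        size += 1
        # Next level: unions of surviving sets, pruned to the right size.
        candidates = list({a | b for a in level for b in level
                           if len(a | b) == size})
    return freq

# Hypothetical page-visit sessions; a robot fetching every page in
# order would produce patterns no human session-set supports.
sessions = [{"login", "search"}, {"login", "search", "cart"},
            {"search", "cart"}, {"login", "search"}]
freq = frequent_itemsets(sessions, min_support=0.5)
assert freq[frozenset({"login", "search"})] == 0.75
```

Navigation patterns whose support deviates sharply from the mined human norms are then candidates for robot labeling, which is the detection idea the abstract describes.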
Classification of underwater objects using acoustic techniques
Pooja Gavande, Dr. R.S. Kawitkar, M. Selva Balan
Underwater object classification using acoustic remote sensing techniques is an attractive approach due to its high coverage capabilities and limited costs compared to manual methods. In this approach, a multibeam method is used for the detection of underwater objects such as sand, stone and silt. The underwater object classification is performed using a model-based approach, achieved by implementing a mathematical model of a reservoir in MATLAB using the finite volume method. An acoustic signal of a particular frequency from a multibeam echo sounder is transmitted through this reservoir model, and from the received signal strength various features are extracted, such as backscatter strength, energy, power, surface roughness and skewness. It is shown that these features vary with changing seabed type and hence can be used to detect underwater objects using a multibeam echo sounder of a particular frequency. The scope of this project lies in underwater depth measurement using ultrasound to estimate the capacity, area elevation and sedimentation of a reservoir.
A Survey of Military Application of Wireless Sensor Networks For Soldiers
Farkhonde Khakestani , Saeed balochian
A wireless sensor network with a large number of sensor nodes can be used as an effective tool for gathering data in different ways. Wireless sensor networks (WSNs) with small nodes can be used for sensing, computation and wireless communication. WSNs are a relatively new and rapidly developing technology; they have a wide range of applications, including environmental monitoring, agriculture, public health, and the detection, identification and tracking of adjacent hostile targets in military environments. In this paper we propose an application of WSNs in military environments.
Recognition Of Indian Currency Note Using Grid Technique
Sneha Talikoti, Prof. S.S. Panchal
In recent years it has become essential to develop automatic methods for paper currency recognition, as they are likely to be used in many areas such as vending machines, shopping centres, educational sectors, banking systems (in the case of huge transactions) and so on. As technology grows fast, it has become easier to use such systems. Nowadays, using automated machines, anyone can easily find out whether a currency note is genuine or counterfeit, and also the denomination of that currency. Technology has also provided a better way of life for people. There is a need to design a system that helps recognize paper currency notes quickly and in less time. This approach basically provides an easy method for recognizing the denomination of an Indian currency note. The proposed approach works on all the notes: Rs.10, Rs.20, Rs.50, Rs.100, Rs.500 and Rs.1000. Indian currency features, such as the geometric shape and the denomination object, are extracted from a 4x4 grid of the Indian currency image. Feature extraction plays an important role in successfully determining the value/denomination of an Indian note. The approach consists of a number of components, including image acquisition, conversion of the RGB image to HSV, image pre-processing, feature extraction and image comparison. An artificial neural network is used to identify the value/denomination of an Indian currency note.
Sensors in a wireless sensor network (WSN) are prone to failure due to energy depletion, hardware faults, etc. Fault tolerance is one of the critical issues in WSNs. The existing fault tolerance mechanisms either consume significant extra energy to detect and recover from failures or need additional hardware and software resources. In this paper, we propose a novel energy-aware fault tolerance mechanism for WSNs, called Informer Homed Routing (IHR). In IHR, non-cluster-head (NCH) nodes select a limited number of targets in the data transmission and therefore consume less energy. We also propose an agreement-based fault detection and recovery protocol for cluster heads (CHs) in WSNs. The aim of the protocol is to accurately detect CH failure, so as to avoid the unnecessary energy consumption caused by a mistaken detection process. Our experimental results show that the proposed protocol can significantly reduce energy consumption compared to two existing protocols: Low-Energy Adaptive Clustering Hierarchy (LEACH) and Dual Homed Routing (DHR).
Information Security with the help of MLMS and HLSB
Ms. Ekata S. Bele, Prof. Chetan Bawankar
To assure the secrecy and confidentiality of images is a vibrant area of research, with different approaches being followed: the first is encrypting the images through multi-share, multi-level algorithms using keys; the other approach involves hiding information using a higher-LSB data-hiding algorithm to maintain the images' secrecy.
A data content owner encrypts the original image using different shares, and a data hider can embed additional data into the encrypted image using the higher-LSB data-hiding technique even though he does not know the original content. Given an encrypted image containing additional data, a receiver may first decrypt it according to the encryption key, and then extract the embedded data and recover the original image according to the data-hiding key.
An Efficient Denial-of-Service Attack Detection System Based on Multivariate Correlation Analysis
Binil Anto Thampi C., Syed Farook K.
The Internet is increasingly being used in almost every aspect of our lives, and it is becoming a critical resource whose disruption has serious implications. Attacks aimed at blocking the availability of internet services are generally referred to as denial of service (DoS) attacks. They cause financial losses, as in the case of an attack that prevented users from having steady connectivity to major e-commerce web sites, and imply threats to public safety and national security, as in the case of taking down confidential government websites. The consequences of denial of service attacks can be very damaging; therefore, it is crucial to deter, or otherwise minimize, the damage they cause. A DoS attack detection system should be able to detect all variants of DoS attacks at efficient computational cost. Thus far various methods have been introduced; however, most of them fail to detect new variants of DoS attacks and have poor computational cost. In the Multivariate Correlation Analysis based scheme, a detection system based on network features is introduced. Detection is done based on a normal profile, which is generated by applying statistical analysis to the network features. Even though this system has quite good detection rates, some types of DoS attacks are left undetected, and it has high computational costs. The problems are due to the data used in detection, where the basic features in the original data are on different scales and not optimized. So an efficient detection system is designed by applying a statistical normalization technique and particle swarm optimization to the raw data, to provide efficient detection rates and immaculate computational cost.
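The statistical normalization step mentioned above is commonly realized as column-wise z-score scaling, so that features on wildly different scales (e.g. packet counts versus flag ratios) contribute comparably; a minimal sketch, with fabricated feature records:

```python
def z_normalize(records):
    """Column-wise statistical (z-score) normalization: each feature
    column is rescaled to zero mean and unit standard deviation."""
    cols = list(zip(*records))
    out_cols = []
    for col in cols:
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        std = var ** 0.5 or 1.0            # guard against constant columns
        out_cols.append([(v - mean) / std for v in col])
    return [list(row) for row in zip(*out_cols)]

# Hypothetical traffic records: [packet count, SYN-flag ratio]
raw = [[1000, 0.10], [2000, 0.20], [3000, 0.30]]
norm = z_normalize(raw)
# After normalization both columns have zero mean and unit variance,
# so neither feature dominates the correlation analysis by scale alone.
for col in zip(*norm):
    assert abs(sum(col)) < 1e-9
```

Normalized records can then feed the multivariate correlation analysis and the PSO-based optimization without any single raw-scale feature dominating the distance computations.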
This paper proposes and evaluates an approach to the parallelization, deployment and management of applications that integrates several emerging technologies for distributed computing. The proposed approach uses the MapReduce paradigm to parallelize tools and manage their execution, and machine virtualization to encapsulate their execution environments and commonly used data sets into flexibly deployable virtual machines. In a multi-node environment in which one node acts as a gateway, a client machine can access the cluster through the gateway via a REST API. Using this concept we propose a virtual infrastructure gateway that lifts this restriction. Through the gateway, cloud consumers provide deployment hints on the possible mapping of VMs to physical nodes. Such hints include the collocation and anti-collocation of VMs, the existence of potential performance bottlenecks, the presence of underlying hardware features (e.g., high availability), the proximity of certain VMs to data repositories, or any other information that would contribute to a more effective placement of VMs on physical hosting nodes. Oozie allows REST access for submitting and monitoring jobs. Cloud computing allows cloud consumers to scale their resource usage up and down based on demand; using Apache Hadoop and this prototype, we analyze various techniques for scalability in the cloud. It is also demonstrated how power-aware policies may reduce the energy consumption of the physical installation.
Bank Cheque Identification and Classification using ANN
Savita Biradar , S.S.Panchal
Identifying bank cheques is one of the important tasks in cash transactions, as banks receive a huge number of different bank cheques every day from various account holders. For a country like India, this is a big hurdle: identifying bank cheques manually can be done, but it is very time consuming. This paper presents a novel method by which the identification and classification process can be done efficiently. The proposed approach works on specific features of Indian bank cheques and classifies them according to the different banks. The recognition system has three parts. First, the scanned image is processed by reducing its size and extracting its features, considering 18 color features from RGB (red, green, blue) and HSV (hue, saturation, value). Second, the logo is detected using the Speeded Up Robust Features (SURF) point matching algorithm and region properties of the logo are extracted. Third, the different bank cheques are classified using a neural network classifier. The bank cheque identification and classification system achieved 91.43% accuracy.
Survey of Routing Protocols in MANET
Tejeswara Raju V, Vineeth V, Sneha R, Suhas Chadha
A MANET is a set of different types of mobile nodes. MANETs are mobile, so they utilize wireless connections to attach to the network, and they can be deployed at low cost in a variety of applications. Different types of routing protocols have been recommended for MANETs. These protocols can be classified into three main categories, reactive (on-demand), proactive (table-driven) and hybrid routing protocols, represented by AODV, OLSR and ZRP respectively. This paper focuses on a survey of reactive, proactive and hybrid routing protocols.
Design Of Embedded Virtual System By Using Human Machine Interaction (HMI)
Tiyyagura Sneha, M. Kezia Aruna Jyoth
We propose a novel interactive projection system (IPS), which enables bare-finger touch interaction on regular planar surfaces (e.g., walls, tables) with only one standard camera and one projector. The challenge of bare-finger touch detection is recovering the touch information from only the 2-D image captured by the camera. In our method, the graphical user interface (GUI) button is projected on the surface and is distorted by the finger when clicking it, and there is a significant positive correlation between the button's distortion and the finger's height above the surface. Therefore, we propose a novel, fast and robust algorithm which takes advantage of the button's distortion to detect the touch action. Existing keyboards use physical keys for typing on the computer and work on the mechanical push principle, but for small devices like mobile phones and tablets it is impossible to carry a big keyboard. The touchscreen-based keyboards available in such devices are very inconvenient for writing, because people's fingers are big while the keys on the touch screen are small. So typing on small devices is not convenient, and on computers our fingers ache after long typing sessions because of the mechanical movement of the keys.
Optimization Of Cmos Circuits By Using Quaternary Logic LUT
V.Narasimhulu, K.V. Bhanu Prasanth, C.Md.Aslam
Nowadays, interconnections increase the delay in the design of a circuit. A larger number of interconnections increases the area of the circuit, and because of this the power consumption in CMOS digital circuits is also higher. Multiple-valued logic can reduce the average power required for level transitions and reduces the number of interconnections in the design, hence also reducing the effect of interconnections on overall power consumption. In this paper, we propose a quaternary lookup table (LUT) architecture designed to replace or complement binary LUTs in field programmable gate arrays. The circuit is implemented in a CMOS process, with a single voltage supply and voltage-mode structures.
Annotations are useful for effective information retrieval. Annotation can be applied to several fields: images, videos, documents, etc. Annotations help in understanding and retrieving documents very easily. For annotation, the important attributes first have to be identified. Making annotations on documents is hard work, because people have to read the documents fully to decide which sentences should be annotated; it is much easier if people are automatically given the sentences most suitable for annotation. Annotations can also be applied to webpage metadata. In this paper, a survey of annotation is presented and a brief description of the proposed methodology is given.
Detection and Prevention of Black Hole & Gray hole attack in MANET using Digital signature Techniques
Monika , Swati Gupta
A mobile ad-hoc network (MANET) is an infrastructure-less dynamic network consisting of a collection of wireless mobile nodes that communicate with each other without the use of a centralized network. Security in MANETs is the most important concern for the basic functionality of the network. A malicious node falsely advertises the shortest path to the destination node during the route discovery process by forging the sequence number and hop count of the routing message. In this paper, to avoid and remove this effect, we use digital signature techniques.
Assessing Version Consistency by Identifying systematic Code Changes during Runtime
P.Swapna Shankar , P.Niranjan, P.Shireesha
In evolving software there exist various degrees of potential behaviour change. The simplest strategy for updating an application is to alter the behaviour of a method body, i.e., upgrading the method body to a new version without modifying the rest of the application, for instance when a bug inside a single method gets fixed, or a faster algorithm with the same functionality gets deployed. A next step towards complete updates is the ability to alter the method signature, where not just the internals of a method but also the signature itself, i.e., the number and types of parameters, the return type or the method name, get altered. The final step towards arbitrary modifications and fully dynamically updateable systems is support for modifying global fields as well as fields inside objects, for instance, in the case of class-based systems, the fields of objects as specified by their respective classes. In this work we provide a runtime evaluation of version consistency to assess the impact of dynamic software updates.
Offline Signature Identification Using Adaptive Window Position Technique
Chaitra Kulkarni
Offline handwritten signature identification is a technique used to identify the behavioural characteristics, a part of the biometric characteristics, of each person. However, this type of signature identification is subject to malpractice. To prevent this, offline handwritten signature identification using the adaptive window positioning technique is proposed. This technique uses windows of size n x n for segmentation of the signature image: the windows are placed on the signature image and fragment it into small sub-images. Before the feature extraction phase, a pattern adjustment technique is applied, which leads to simple and accurate calculation of the features. There are 10 x 15 = 150 signature images in the dataset, signed by 10 different persons, each of whom signed 15 times; 10 x 10 = 100 signature images are kept in the training dataset and 10 x 5 = 50 signature images in the testing dataset. The comparison between two sub-images takes place based on their similarity properties using a similarity correlation equation; using the adaptive window positioning technique we can easily identify the signature of the genuine user. The accuracy of the proposed method is 94%.
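The "similarity correlation equation" for comparing two sub-image windows is commonly the normalized correlation coefficient; a sketch follows. The paper's exact equation is not given here, and the tiny 2x2 pixel windows are purely illustrative.

```python
def correlation(w1, w2):
    """Similarity of two equal-sized grayscale windows via the
    normalized correlation coefficient: 1.0 means identical up to
    brightness/contrast, values near -1.0 mean opposite patterns."""
    a = [p for row in w1 for p in row]
    b = [p for row in w2 for p in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a)
           * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0

genuine = [[0, 255], [255, 0]]        # tiny hypothetical stroke pattern
same = [[10, 245], [245, 10]]         # same pattern, different contrast
flipped = [[255, 0], [0, 255]]        # opposite pattern
assert correlation(genuine, same) > 0.99
assert correlation(genuine, flipped) < -0.99
```

Because the coefficient subtracts means and divides by spread, it tolerates the ink-darkness and scan-brightness variation between two genuine samples, which is why a correlation measure rather than raw pixel difference suits window-by-window signature matching.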
Cloud Database Storage Model by Using Key-as-a-Service (KaaS)
J.Sivaiah , Dr.K.Venkataramana
In this paper we study issues related to the authenticity, integrity and security of data storage in cloud data centres. Security in the cloud is under scrutiny, due to which its adoption by the IT sector is proceeding at a slow stride. Using cloud storage, users can remotely store their data at less cost per byte and access high-quality services from a shared pool of configurable computing resources, without operational or capital expenditure. In a cloud environment the client's data is stored outside the organization's premises, which makes the client insecure about his data at the cloud storage facility, as it may be accessed and modified by unauthorized users. So in this paper we propose a model, Securing Cloud Data using Key as a Service (SCDKS), which provides authenticity for accessing data in storage by providing keys to access files/data. The CSP authenticates every user through keys generated by Key as a Service (KaaS); KaaS dynamically generates a unique key for every session when a user wants to access data in the cloud. Since a key is generated for each session, it cannot be used by other users, nor can the data be accessed by them, which ensures the security and integrity of the data. A sophisticated threshold key generation algorithm is used in KaaS for generating keys for users after they are authenticated by the cloud service provider.
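One simple way to realize "a unique key for every session" is an HMAC-based derivation from a service-held master secret and a fresh per-session nonce. This is shown purely as an assumption for illustration; the paper's threshold key generation algorithm is not detailed in the abstract and would differ.

```python
import hashlib
import hmac
import os

def session_key(master_secret, user_id, session_nonce):
    """Derive a per-session key with HMAC-SHA256. The master secret
    is held only by the KaaS; the nonce makes every session's key
    distinct, and the same inputs always reproduce the same key."""
    msg = user_id.encode() + b"|" + session_nonce
    return hmac.new(master_secret, msg, hashlib.sha256).digest()

master = os.urandom(32)                   # KaaS master secret
n1, n2 = os.urandom(16), os.urandom(16)   # fresh nonce per session
k1 = session_key(master, "alice", n1)
k2 = session_key(master, "alice", n2)
assert k1 != k2                                  # new key every session
assert session_key(master, "alice", n1) == k1    # reproducible by KaaS
```

Because the derivation is deterministic given the master secret, the KaaS can regenerate (rather than store) any session key, while a stolen session key is useless for any other session or user.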
Home automation is becoming more and more popular day by day due to its numerous advantages. It can be achieved by local networking or by remote control.
In the existing system, the smart home is controlled via Bluetooth and internet connectivity. There is only a fire sensor in the smart home; if a problem is detected by that sensor, there is an indication through an alarm such as a buzzer, and an email alert is sent to the user. Here Bluetooth is used for short-distance communication and the internet is used for long-distance communication.
Review of Evolutionary Optimization Algorithms for Test Case Minimization
Aditya Vikram Sharma
Multi-objective test suite minimization is the problem of selecting a set of test cases from the available test suite while optimizing multiple objectives such as code coverage, cost and fault history [1]. Regression test suite optimization is an effective technique to reduce the time and cost of testing. Many researchers have used computational intelligence techniques to enhance the effectiveness of a test suite, but these approaches optimize the test suite for a single objective. Nature-inspired algorithms like GA, PSO and BFO may be used to optimize a test suite for multi-objective selection criteria. The main focus of our approach is to find a test suite that is optimal for multi-objective regression testing [2].
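A genetic algorithm of the kind the review surveys can be sketched on a toy instance; the coverage sets, costs and fitness weighting below are invented for illustration, and the GA (tournament-free elitist selection, one-point crossover, bit-flip mutation) is a minimal sketch rather than any surveyed algorithm.

```python
import random

# Toy data: which requirements each test covers, and its cost (hypothetical values)
coverage = {0: {1, 2}, 1: {2, 3}, 2: {4}, 3: {1, 4, 5}, 4: {5}}
cost = {0: 3, 1: 2, 2: 1, 3: 4, 4: 2}

def fitness(bits):
    """Reward covered requirements, penalize total cost (weighted-sum objectives)."""
    chosen = [t for t in coverage if bits[t]]
    covered = set().union(*(coverage[t] for t in chosen)) if chosen else set()
    return 10 * len(covered) - sum(cost[t] for t in chosen)

def ga(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in coverage] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # bit-flip mutation
                i = rng.randrange(len(child))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```

On this instance the optimum is the suite {test 1, test 3}, which covers all five requirements at cost 6.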
Investigation of The Effect Of Water-Cement Ratio On The Modulus Of Rupture Of Concrete
Onwuka D. O., Awodiji C. T. G. and Onwuka S. U.
This paper presents the experimental results of an investigation of the effects of the water-cement ratio on the modulus of rupture of concrete. The variation of the flexural strength of concrete mixes with varying water-cement ratio, within 28 days of casting and curing, was experimentally investigated. In all, a total of 15 prototype concrete beams were cast (2 specimens from each mixture) and tested to determine their flexural strength (modulus of rupture). The design strength attained ranged between 4.50 MPa and 6.30 MPa for water-cement ratios ranging from 0.48 to 0.62. The results obtained showed that the modulus of rupture of the concrete increased as the water-cement ratio increased, until an optimum value of 6.30 MPa was attained at a water-cement ratio of 0.58. However, water-cement ratios above 0.58 were observed to have a very significant reduction effect on the flexural strength of the concrete. A setting time test for the cement paste used for the concrete was carried out, alongside a grain size distribution analysis of the aggregates. The initial setting time was 60 min (1 hr) while the final setting time was 535 min (approximately 9 hrs). The grain size distribution of the aggregates shows that the sharp river sand is a medium-fine aggregate while the granite chippings are a medium-coarse aggregate.
An effective Research Paper Recommender System based on Subspace Clustering
K. Naga Neeraja, Dr. B. Prajna
There is an increasing number of research papers getting published day by day. It becomes difficult for researchers to closely examine all the research papers in their field and find the papers related to their own work for guidance. A recommender system helps the researcher by recommending papers based on the ratings given by other researchers in that field. Collaborative filtering is one of the most successful technologies for building recommender systems and is extensively used in many commercial recommender systems. Unfortunately, the computational complexity of these methods grows linearly with the number of users and the number of items, which in a typical research-paper domain can be several millions. To address these scalability issues, we present an effective recommender system based on subspace clustering, which analyzes the researcher-paper matrix to discover the relations between different researchers and uses these relations to compute the list of research papers to recommend.
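The researcher-paper matrix analysis can be sketched with plain collaborative filtering; the subspace clustering step of the proposed system is omitted here, and the 4×4 rating matrix is an invented toy example, so this is only a hedged illustration of the underlying idea.

```python
import numpy as np

# Toy researcher-paper rating matrix (rows: researchers, cols: papers); 0 = unrated
R = np.array([
    [5, 0, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(M):
    """Pairwise cosine similarity between the rows of M."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    U = M / norms
    return U @ U.T

def recommend(R, user, k=2):
    """Score unrated papers by similarity-weighted ratings of other researchers."""
    sim = cosine_sim(R)
    sim[user, user] = 0.0                 # ignore self-similarity
    scores = sim[user] @ R                # weighted vote per paper
    unrated = np.where(R[user] == 0)[0]
    return sorted(unrated, key=lambda p: -scores[p])[:k]

top = recommend(R, user=0, k=1)
```

Researcher 0 is most similar to researcher 1, who rated paper 1 highly, so paper 1 is ranked first among researcher 0's unrated papers.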
Improving Classification Accuracy Using Weighted Multiple Regression
Sonali S. Salunke, Swati Deshpande
The purpose of this system is to find out, when a flight is delayed during departure or arrival, what the reasons for the delay are. We therefore intend to aid the airlines by predicting delays using data patterns from previous information. This system explores what factors influence the occurrence of flight delay. A classification algorithm is applied to classify flights into delay categories: using OneR, a classification algorithm, models are developed to predict delay on both the arrival and the departure side. Discretization was applied using Weka, a data mining tool, to divide the delays on the departure and arrival side into five categories, viz. Negligible, Insignificant, Nominal, Significant and Indefinite. The result obtained from these categories was further analyzed to predict the overall reason for delay, which can be Weather, Security, Carrier, National Aviation System (NAS) or Late Arrival. The models further combine this result with the Meteorological Terminal Aviation Routine Weather Report (METAR) to report the weather conditions at the origin and destination airports. The results of the data analysis suggest that delayed flights follow certain patterns that distinguish them from on-time flights, and that fairly good predictions can be made on the basis of a few attributes. Classification can be used for analyzing future data trends; it is important that the classification is appropriate so that the data prediction is accurate.
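OneR itself is simple enough to sketch: for each attribute, build one rule that predicts the majority class per attribute value, then keep the attribute whose rule makes the fewest training errors. The flight records below are hypothetical, not the paper's dataset.

```python
from collections import Counter, defaultdict

def one_r(rows, target):
    """OneR: pick the single attribute whose one-rule makes the fewest errors."""
    attrs = [a for a in rows[0] if a != target]
    best = None
    for a in attrs:
        # For each value of attribute a, predict the majority class
        buckets = defaultdict(Counter)
        for r in rows:
            buckets[r[a]][r[target]] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in buckets.items()}
        errors = sum(n for v, c in buckets.items()
                     for cls, n in c.items() if cls != rule[v])
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best

# Toy flight records (hypothetical): weather is the perfect predictor here
data = [
    {"weather": "storm", "carrier": "A", "delay": "Significant"},
    {"weather": "storm", "carrier": "B", "delay": "Significant"},
    {"weather": "clear", "carrier": "A", "delay": "Negligible"},
    {"weather": "clear", "carrier": "B", "delay": "Negligible"},
]
attr, rule, errors = one_r(data, "delay")
```

Here the weather attribute classifies the training data with zero errors, so OneR selects it over the carrier attribute.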
Probabilistic Determination of Class for an Unbalanced Dataset
Samadhan B. Patil, Vaishali Ghate Dhanashree Tarodmalle
Poker is one of the world's most popular and widely played card games. In Poker there is a fixed set of winning conditions, and the player with the highest winning condition wins the game. The main part of the game is to bet strategically and in a calculated manner, so that there is less risk and the opponents are not able to guess the cards in the hand. This application is developed to help players understand when and how to bet smartly: the system provides users with knowledge of their probability of winning based on the cards available to them. The system is lightweight and easy to use so that all types of players can use it. The aim of this system is to help gamblers bet better, thereby increasing their winnings and their addiction to Poker gambling, and also to generate greater revenue collections for gaming consortiums. The most important point of this paper is to show how we have used data mining and statistical probabilities to formulate an algorithm which gives correct predictions of the winning hand. We formally define the system and outline the challenges that arose while developing technology to support it. We hope that this paper will encourage more research by gaming consortiums and the gambling community in this exciting area of winning by probability calculation and card counting.
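The flavour of the probability calculation can be shown with a Monte Carlo sketch; the paper's algorithm evaluates full hand rankings, whereas this illustration only estimates the chance that a random 5-card hand contains at least a pair (exact value 1 − C(13,5)·4⁵/C(52,5) ≈ 0.4929).

```python
import random
from collections import Counter

def hand_has_pair(hand):
    """True if the 5-card hand contains at least two cards of one rank."""
    ranks = Counter(card % 13 for card in hand)   # cards encoded 0..51
    return max(ranks.values()) >= 2

def estimate_pair_probability(trials=200_000, seed=7):
    rng = random.Random(seed)
    deck = list(range(52))
    hits = 0
    for _ in range(trials):
        hits += hand_has_pair(rng.sample(deck, 5))
    return hits / trials

p = estimate_pair_probability()
```

The same sampling scheme extends to conditional questions ("given my two hole cards, how often do I finish with the best hand?") by fixing the known cards and dealing out the rest.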
Comparative Analysis of Watermarking in Digital Images Using 3-Level DWT & LBP
Jyoti Joshi, Pawan Kumar Mishra
Digital watermarking has risen as a new research area for preventing illicit copying and duplication. In this paper two methods for watermarking digital images, 3-level DWT and the Local Binary Pattern (LBP) algorithm, are discussed. To compare the robustness of the two algorithms, a noise attack is used.
Social Group-Based Routing in Delay Tolerant Networks using Fuzzy Logic
Shadha K., Sherikh K. K.
Delay tolerant networks (DTNs) may lack continuous network connectivity. Routing is thus challenging, since it must handle network partitioning, long delays, dynamic topology and limited power sources in such networks. Social group-based routing is an optimal routing scheme: the protocol maximizes data delivery while minimizing network overhead and packet delay by efficiently spreading packet copies in the network. The social grouping concept is used to route effectively by selecting the minimum-hop path from all possible paths of tree contacts. The members of a node's own group are known to it, and a message is forwarded only to members of other social groups. Messages to the nearest group are forwarded without waiting for the other group's members, by selecting the most suitable relay node from the same group. Fuzzification of the degree of connectivity, which is strengthened by frequent meetings of nodes, together with the mobility of nodes, helps to fuzzify the membership of nodes in the case of overlapping communities so that a better relay node can be selected. The protocol achieves a higher delivery ratio, less network overhead and less average delay compared to other routing protocols.
Ant Colony Optimization Algorithm for Software Project Scheduling Using Critical Chain Project Management
Haseeba N, Sajeer K.
Software project managers are often faced with challenges when trying to staff and schedule projects effectively. Failure to schedule the execution of tasks correctly leads to a lower-quality product that is delivered late and may run over budget. An adequate model for software project planning has to deal not only with the problem of project task scheduling but also with the problem of human resource allocation. The allocation of scarce resources then becomes a major objective of the problem. To develop a well-organized method for solving these issues, an event-based scheduler and an Ant Colony Optimization algorithm are used here. The event-based scheduler adjusts the allocation of employees at events and keeps the allocation unchanged at non-events. To solve the complicated planning problem, a well-organized Ant Colony Optimization algorithm is further designed. With the existing methods, resources and time are consumed by wasteful behaviours such as the student syndrome, in which a person starts to work on an assignment only at the last possible moment before the deadline; completion of tasks is also delayed by the impact of Parkinson's law. To solve the issues with the existing methods, critical chain project management techniques are used. Critical chain project management recommends that task estimates be cut to half the length of a normal duration. It also uses safety buffers to manage the impact of constraint variation and uncertainty around the project. Incorporating the critical chain into the existing method yields an optimal plan.
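The ACO component can be sketched on a deliberately tiny stand-in problem: ordering five tasks on one machine to minimize total weighted completion time. The durations, weights, pheromone model and parameters below are all invented for illustration; the paper's event-based scheduler and human-resource model are far richer than this.

```python
import random
from itertools import permutations

# Toy project: task durations and priority weights (hypothetical values)
dur = [3, 1, 4, 2, 5]
wgt = [1, 5, 2, 4, 1]

def cost(order):
    """Total weighted completion time of tasks executed in this order."""
    t, total = 0, 0
    for task in order:
        t += dur[task]
        total += wgt[task] * t
    return total

def aco(n_ants=20, iters=50, rho=0.1, seed=3):
    rng = random.Random(seed)
    n = len(dur)
    tau = [[1.0] * n for _ in range(n)]   # pheromone: position -> task desirability
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            left, order = set(range(n)), []
            for pos in range(n):          # ant builds a permutation stochastically
                weights = [tau[pos][t] if t in left else 0.0 for t in range(n)]
                task = rng.choices(range(n), weights=weights)[0]
                left.discard(task)
                order.append(task)
            c = cost(order)
            if c < best_cost:
                best, best_cost = order, c
        # Evaporate, then reinforce the best-so-far schedule
        for pos in range(n):
            for t in range(n):
                tau[pos][t] *= (1 - rho)
            tau[pos][best[pos]] += 1.0
    return best, best_cost

best, best_cost = aco()
optimum = min(cost(list(p)) for p in permutations(range(5)))
```

On five tasks the optimum is known by brute force, so the ACO result can be checked against it; real project scheduling problems are far too large for that, which is exactly why the metaheuristic is used.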
A Fast and Efficient RBTRC-MAC Protocol for Event-Driven Wireless Sensor Networks
Hasna P., Annes Philip
Event-driven wireless sensor networks have been identified as one of the key areas in the field of wireless communication; most of the time such a network is in a light traffic load situation, but when an event is detected, a large number of packets is generated. A MAC protocol designed for this kind of WSN should be able to adapt swiftly to both light and heavy traffic load situations. Here, a receiver-centric MAC protocol called RC-MAC, which integrates duty cycling and receiver-centric scheduling, is implemented. To handle the heavy traffic load triggered by an event, RC-MAC takes advantage of a tree-based topology and a multichannel technique to assist medium access scheduling: different parent-children sets are assigned to different channels. During light traffic load, there is idle listening by the sender in the existing RC-MAC protocol when a short beacon interval is used, which reduces its energy efficiency; packet scheduling is also not used in the existing RC-MAC during heavy traffic load. So a randomization-of-beacon technique enabled RC-MAC is designed to provide energy efficiency in both the short and the large beacon interval cases. Packet scheduling is also introduced along with receiver-centric scheduling, and hence provides better end-to-end delay compared to the existing approach.
The demand for images, video sequences and computer animation has increased drastically over the years. This has made image and video compression an important issue in reducing the cost of data storage and transmission. JPEG is currently the accepted standard for still image compression, but alternative methods are also being explored. Storing images in less memory leads to a direct reduction in storage cost and faster data transfer; hence image compression has proved to be a valuable technique. In this paper we propose a lossless method of image compression and decompression using a simple method called Huffman coding.
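Huffman coding assigns shorter bitstrings to more frequent symbols and is lossless by construction. A minimal sketch of the encode/decode round trip, on a tiny pixel list standing in for an image:

```python
import heapq
from collections import Counter

def build_codes(data):
    """Build a Huffman code (symbol -> bitstring) from symbol frequencies."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)        # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, next_id, merged))
        next_id += 1
    return heap[0][2]

def encode(data, codes):
    return "".join(codes[s] for s in data)

def decode(bits, codes):
    inverse = {code: s for s, code in codes.items()}
    out, cur = [], ""
    for b in bits:                              # Huffman codes are prefix-free
        cur += b
        if cur in inverse:
            out.append(inverse[cur])
            cur = ""
    return out

pixels = [0, 0, 0, 0, 255, 255, 128, 0]   # a tiny "image" as a pixel list
codes = build_codes(pixels)
bits = encode(pixels, codes)
assert decode(bits, codes) == pixels       # lossless round trip
```

The dominant pixel value 0 receives a 1-bit code, so the 8 pixels compress to 11 bits instead of 64; for real images the same idea is applied to pixel or coefficient statistics.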
Enhancing Security of MANETs by Implementing Elliptical Curve based Threshold Cryptography
Ms. Bhavna Sharma, Mrs. Vandana Madaan
Mobile Ad-hoc Networks, or MANETs, are a prominent technique in wireless networking which is receiving great attention from different groups. A mobile ad-hoc network is a group of wireless mobile nodes that forms a network without any fixed infrastructure, with mobile and self-configuring nodes. Security in MANETs is a challenging issue, hence an efficient security scheme is required to protect against malicious attacks and interceptions. Elliptic curve based threshold cryptography provides a more promising solution for enhancing the security of MANETs than other existing popular algorithms such as RSA. In this paper we discuss an implementation of Elliptic Curve Cryptography based threshold cryptography (ECC-TC) using the GNU Multiple Precision library (GMP) and discuss its advantages for countering the security issues of MANETs.
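The threshold part of such a scheme can be illustrated separately from the elliptic-curve part: Shamir secret sharing over a prime field lets any t of n nodes reconstruct a key while fewer learn nothing. This is a generic sketch of the threshold idea, not the paper's ECC-TC construction, and the prime, threshold and secret below are illustrative.

```python
import random

P = 2**127 - 1  # a Mersenne prime field for the polynomial arithmetic

def make_shares(secret, t, n, seed=11):
    """Split `secret` into n shares; any t of them reconstruct it (Shamir)."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):  # degree t-1 polynomial with f(0) = secret
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
```

In a MANET setting the shared secret would be a group key, so no single compromised node can reveal it; ECC-TC combines this idea with elliptic-curve operations for smaller keys than RSA.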
Establishing a Multi-Application ADHOC Network for Smart city Design
M. Annapurna, S. Himabindu
With the development and application of the Internet of Things, mobile broadband networks, next-generation networks, cloud computing and so on, informatization is tending towards a higher, smarter stage. In 2008 IBM Corporation issued the concept of the "Smart Earth". The principle of Smart Earth is that sensors are embedded within railways, bridges, tunnels, roads, buildings, water systems, dams, industrial equipment and medical equipment, so that physical facilities can be perceived and information technology extends into the physical world [2], constructing an "Internet of Things". Moreover, the Internet of Things can be connected with the Internet to integrate human society and the physical system. People, machines, equipment and so on can be managed within the integrated system through computers and cloud computing, so that people's production and life can be managed more precisely and dynamically to become smarter, raising resource usage and productivity and improving the relationship between people and nature. Some experts have researched the smarter city: a "Smart City" aims to make the city smarter, and also to make the people in the city smarter. This project aims at demonstrating a city automation system whereby all the homes within the city can be brought under one multi-tier duplex wireless ad-hoc network. The project uses RF transceiver technology to achieve this.
Efficient Design Of Binary Adders Using Cellular Automata
G.Divya, Mr. S.Venkata Kiran
In this paper, we propose a novel quantum-dot cellular automata (QCA) adder design that decreases the number of QCA cells used in previously existing designs. The proposed one-bit QCA adder is based on a new algorithm that uses three majority gates and two inverters for binary addition. A novel 64-bit adder designed in QCA was implemented; it yields higher speed performance than the existing designs, while its area requirements are comparable with those of the RCA and CFA. The novel adder operates in the RCA fashion, but it can propagate a carry signal through a number of cascaded majority gates significantly lower than conventional RCA adders. The proposed QCA adder architecture can be used in real-time digital counter or clock circuits; here we implemented a stopwatch using this QCA adder architecture.
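A three-majority, two-inverter one-bit adder can be checked in plain logic. The formulation below (carry = MAJ(a, b, cin); sum = MAJ(NOT(carry), cin, MAJ(a, b, NOT(cin)))) is one known way to meet that gate count and is assumed here for illustration; it may differ from the paper's exact algorithm.

```python
def maj(a, b, c):
    """Majority-of-three logic gate (the QCA primitive)."""
    return (a & b) | (b & c) | (a & c)

def qca_full_adder(a, b, cin):
    """One-bit adder from three majority gates and two inverters."""
    cout = maj(a, b, cin)
    s = maj(1 - cout, cin, maj(a, b, 1 - cin))
    return s, cout

def qca_ripple_add(x, y, bits=64):
    """Ripple-carry adder built from the one-bit QCA cell."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = qca_full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# Exhaustive check of the one-bit cell against ordinary binary addition
assert all(qca_full_adder(a, b, c) == ((a + b + c) & 1, (a + b + c) >> 1)
           for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

Cascading 64 such cells gives the ripple-carry structure the abstract describes; the proposed design's speed gain comes from the carry passing through fewer cascaded majority gates than a conventional RCA.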
Comparative Study of Various Scheduling Algorithms in Cloud Computing
Kavyasri M N, Dr. Ramesh B
Cloud computing is an emerging technology and serves as a next-generation platform which allows users to pay as they use. It permits access to remote and geographically distributed resources with the help of an important feature of cloud computing called virtualization. A cloud consists of a number of virtual machines (VMs), and VMs are needed according to the requirements of users and cloud service providers. Scheduling is necessary to manage a large number of VM requests. Scheduling is a key technology in cloud computing: the scheduling of tasks and allocation of resources are challenging in the cloud, so a scheduling algorithm is required, whose primary consideration is to make task and resource scheduling proficient. The main objective of this paper is to give a comparative analysis of existing scheduling algorithms on a cloud platform where resources have varying cost and computational efficiency. In this paper we survey different types of scheduling algorithms and tabulate their various parameters, scheduling factors and so on.
Programmable Direct Memory Access Controller Design Using Advanced High Performance Bus Protocol
Ms. Sarojini Hoogar S, Mrs. Sapna M K
Direct memory access (DMA) is a process which allows certain peripheral devices to access system memory without the intervention of the CPU. When the CPU performs an I/O operation itself, it is occupied for the entire duration of the IOR/IOW operations and hence is unavailable for other work. The DMA controller (DMAC) mainly helps in the efficient utilization of hardware resources.
Implementation of FPGA Architecture for OFDM-SDR with an optimized Direct Digital Frequency Synthesizer
Aluru Koteswara Rao, A.Anasuyamma, M.Tech
A Software Defined Radio (SDR) is defined as a radio in which the receive digitization is performed at some stage downstream from the antenna, typically after wideband filtering, low noise amplification, and down conversion to a lower frequency in subsequent stages - with a reverse process occurring for the transmit digitization. In an SDR, Digital Signal Processing in flexible and reconfigurable functional blocks define the characteristics of the radio. Researchers in the area of Communication and VLSI have a growing tendency to develop state-of-the-art architectures for SDR. The backbone of SDR framework in the digital domain is a direct digital frequency synthesizer (DDFS). Using DDFS, we can generate the high frequency carrier waves in digital domain and modulate the message on it. The optimization of the DDFS includes Lookup Table based designs and Coordinate Rotation Digital Computer (CORDIC) based designs. Lookup Table based designs require huge ROMs for implementation and are declared to be area-hungry. On the other hand, CORDIC based techniques use iterative algorithms for the computation of Sine and Cosine functions and are computationally inefficient. Each sample of Sine and Cosine requires 4 multiplications and 2 additions. However, the design in “The Proposed Architecture for DDFS” is even simpler and faster. It utilizes 2 adders and 2 multipliers to generate a sample of sine and cosine. This design is more area and time efficient than CORDIC and look-up table based approaches.
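A recurrence that matches the stated operation count (two multiplies and two adds per sine/cosine sample pair) is the staggered "magic circle" quadrature oscillator; whether this is the paper's exact DDFS structure is not stated, so the sketch below is a hedged illustration of one known architecture with that cost.

```python
import math

def ddfs_samples(freq_ratio, n):
    """Staggered quadrature oscillator: 2 multiplies + 2 adds per sample pair.
    freq_ratio = f / fs; k = 2*sin(pi*f/fs) makes the oscillation exact."""
    k = 2 * math.sin(math.pi * freq_ratio)   # per-sample rotation coefficient
    c, s = 1.0, 0.0                          # cosine-like and sine-like states
    cos_out, sin_out = [], []
    for _ in range(n):
        cos_out.append(c)
        sin_out.append(s)
        c = c - k * s       # 1 multiply, 1 add
        s = s + k * c       # 1 multiply, 1 add (uses the updated c)
    return cos_out, sin_out

cos_w, sin_w = ddfs_samples(freq_ratio=0.01, n=1000)
```

The staggered update keeps the amplitude bounded without the ROMs of a lookup-table DDFS or the iterations of CORDIC, which is the trade-off the abstract argues for; in hardware the two multiplies become two fixed-coefficient multipliers.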
Design and FPGA-Based Implementation of A High Performance 64-Bit DSP Processor
Jadapalli Pavani, K. Dhanunjaya (Ph.D)
To meet the faster processing demand in consumer electronics, performance efficient DSP processor design is important. This paper presents a novel design and FPGA-based implementation of a 64 bit DSP processor to achieve high performance gain for reduced instruction set DSP processors. The proposed design includes a hazard-optimized pipelined architecture and a dedicated single cycle integer MAC to enhance the processing speed. Performance of the designed processor is evaluated against existing similar reduced instruction set DSP processor (MUN DSP-2000). Synthesis results and performance analysis of each system building component confirmed a significant performance improvement in the proposed DSP processor over the compared one.
An Efficient FPGA Implementation Of Multifunction Residue Architectures
Chembeti Silpa, G. Mukesh, M.Tech
A design methodology for incorporating Residue Number System (RNS) and Polynomial Residue Number System (PRNS) in Montgomery modular multiplication in GF(p) or GF(2n) respectively, as well as a VLSI architecture of a dual-field residue arithmetic Montgomery multiplier, are presented in this paper. An analysis of input/output conversions to/from residue representation, along with the proposed residue Montgomery multiplication algorithm, reveals common multiply-accumulate data paths both between the converters and between the two residue representations. A versatile architecture is derived that supports all operations of Montgomery multiplication in GF(p) and GF(2n), input/output conversions, Mixed Radix Conversion (MRC) for integers and polynomials, dual-field modular exponentiation and inversion in the same hardware. Detailed comparisons with state-of-the-art implementations prove the potential of residue arithmetic exploitation in dual-field modular multiplication.
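The Montgomery step that the residue architecture accelerates can be sketched for plain integers; the RNS/PRNS decomposition and the dual-field GF(2n) path are omitted, so this is only the scalar reduction (REDC) underlying the multiplier, with illustrative small moduli.

```python
def montgomery_setup(n, bits):
    """Precompute R = 2^bits > n and n' with n*n' ≡ -1 (mod R); n must be odd."""
    R = 1 << bits
    n_inv = pow(n, -1, R)         # modular inverse (Python 3.8+)
    return R, (-n_inv) % R

def mont_mul(a, b, n, R, n_prime, bits):
    """Montgomery product a*b*R^(-1) mod n, with no division by n."""
    t = a * b
    m = (t * n_prime) & (R - 1)   # t * n' mod R
    u = (t + m * n) >> bits       # t + m*n is divisible by R by construction
    return u - n if u >= n else u

def modmul(a, b, n, bits=16):
    """a*b mod n via Montgomery: convert in, multiply, convert out."""
    R, n_prime = montgomery_setup(n, bits)
    a_bar = (a * R) % n                                  # Montgomery domain
    prod_bar = mont_mul(a_bar, (b * R) % n, n, R, n_prime, bits)
    return mont_mul(prod_bar, 1, n, R, n_prime, bits)    # back to normal domain
```

Replacing the trial division by shifts and masks is what makes the operation hardware-friendly; the paper's contribution is carrying out these multiply-accumulates in residue channels.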
Optimized Analytical Study to Show the Impact of CFO on the Performance of LTE Uplink
Punugoti Asritha
Although tremendous progress has been made in the wireless communications domain in the past decade, the impact of carrier frequency offset (CFO) on system performance, especially on orthogonal frequency division multiplexing (OFDM), is still an area of concern. As reported in the literature, conventional research shows that carrier frequency offset occurs mainly due to oscillator instability and the Doppler shift introduced in the channel. In this paper, the impact of CFO on system performance is discussed; the CFO drawback is even worse in the uplink scenario, where every user experiences a different offset because of the random nature of users. The use of DFT-based SC-FDMA over IFFT-based OFDM has a good impact on the system uplink, since DFT-based SC-FDMA is free of the PAPR problem that is common in IFFT-based OFDM systems and mobile handsets have limited battery power. Finally, the bit error graphs and constellation diagrams in the results section reveal that the CFO impact on the system uplink is so severe that, in order to preserve user data integrity, suppression techniques need to be employed in an efficient way.
Evaluation of Energy Efficient Low Power Listening MAC Protocols in WSN
Komaldeep Kaur, Dr. Jyoteesh Malhotra
Medium access control protocols for wireless sensor networks are designed to be energy efficient. Energy-efficient MAC protocols are those which reduce idle listening and overhearing; idle listening may be the main source of energy waste. To reduce average power consumption, the transceiver must be shut down most of the time. This allows the nodes to use a lower duty cycle, in many cases at no cost in overhead. Simulation and implementation results show how the fail rate and delay vary in MAC protocols like X-MAC, MX-MAC and SpeckMAC.
Cloud frameworks frequently host long-running applications, for example massive data processing, which gives attackers more opportunity to exploit system vulnerabilities and perform strategic attacks. Sensitive data of an organization is placed in the control of a third party when using cloud computing, which introduces a significant level of risk to the privacy and security of data. The need for special hardware or secure kernel support for integrity attestation is illuminated in this research work. This broad survey paper aims to expand on and analyze the various unresolved issues, viz. auto-correction of corrupted data and malicious attacks undermining the adoption and diffusion of the cloud computing environment, affecting the various stakeholders and end users connected to it.
Key Words: Integrity attestation, Cloud computing, Stakeholders
A Fast and Reliable Tree based Proactive Source Routing in Mobile Adhoc Network
Haseena M. K., Annes Philip
Mobile Ad hoc Network (MANET) is an emerging technology used for various applications such as battlefield communications, emergency operations, search and rescue, and disaster relief operations. It consists of a group of mobile nodes. The most salient research challenges in this area include end-to-end data transfer, link access control and security. To improve the data transfer between these mobile nodes, many routing protocols have been proposed. Opportunistic data forwarding has drawn much attention in the research community of multi-hop wireless networking, and opportunistic routing in MANETs is challenging: efficient routing methods which support opportunistic routing give better throughput with minimum delay and overhead. Here, a tree-based proactive source routing protocol called PSR is described that integrates proactive routing and opportunistic data forwarding. It provides low routing overhead and delay without reducing the overall throughput. To maintain source routing, every node keeps a breadth-first spanning tree (BFST) of the entire network rooted at itself. The tree structure is periodically updated by broadcasting this information to neighbour nodes to keep the routing proactive; routes are updated and unwanted nodes are removed from the tree structure without affecting communication delay. An efficient tree-based PSR is proposed with the following features: routing overhead is reduced by using streamlined differential update methods, delay in path identification is reduced by using a shortest path algorithm, and control packet overhead at higher-level nodes is reduced by using a mobile sink node.
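The BFST each node maintains, and how a source route is read off it, can be sketched as follows; the 6-node adjacency list is an invented snapshot, and the periodic differential updates of PSR are not modelled.

```python
from collections import deque

def bfst(adj, root):
    """Breadth-first spanning tree of the network rooted at `root`:
    returns parent pointers, from which source routes can be read off."""
    parent = {root: None}
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                q.append(v)
    return parent

def route(parent, dest):
    """Source route from the root to `dest`, read from the tree."""
    path = []
    while dest is not None:
        path.append(dest)
        dest = parent[dest]
    return path[::-1]

# Toy 6-node MANET snapshot (hypothetical links)
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 5], 4: [2], 5: [3]}
tree = bfst(adj, root=0)
```

Because BFS explores by hop count, every route read from the tree is a minimum-hop path from the root, which is what lets PSR supply full source routes for opportunistic forwarding.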
A Survey on Techniques of Critical Infrastructure Monitoring
Madhuri Unde, Borkar Bharat
In our day-to-day life, the electricity supply may be broken by afternoon storms, snow storms and other natural troubles. When these events occur, we may think that it is due to some technical fault and that the power will return after some time; if it does not, we call the operator. Then the available artifacts and analytic techniques are applied to find the fault, determine the faulted area, complete repairs of that area and restore service to it. This is done by the control-room staff in coordination with field personnel. The idea of the proposed work is therefore to design a system which is very cost effective. This system can also be used to monitor MV/LV applications.
Dealing With Concept Drifts In Process Mining Using Event Logs
Maddu Uma Lavanya, Mr. Sunil Kumar Talluri
Nowadays, in this e-world, most business is closely tied to process mining trends: process flows are generated and checked in terms of changes in the whole system. Here a few process mining techniques are used to analyze changes in the form of drifts; these drifts vary according to the process and may be sudden, gradual or random changes. The changes can be visualized by using the RapidMiner tool, in which the ProM framework is used to get better results. Besides tracking a change, the approach also finds its localization in the business process. A feature extraction and generate-population method is used to find the relationships among activities, and event logs are used to establish the process flow.
Challenges of Big Data: Current Analysis
Mr. A. Jamaludeen Mr. C. Senthil Kumaran Ms. R. Suriya
In today’s technology, the internet has made new sources of vast amounts of data available to business executives. Big data is certainly one of the biggest buzz phrases in IT today. It comprises datasets too large to be handled by traditional database systems; big data refers to data volumes in the range of exabytes (10^18 bytes) and beyond. Such volumes exceed the capacity of current on-line storage and processing systems. To remain competitive, business executives need to adopt the new technologies and techniques emerging due to big data. We outline some of the challenges of big data in various big sectors. The researcher focuses on how Hadoop provides fully functional end-to-end solutions that address a real-world problem, and this paper also indicates the way big data technology can lead industries to success.
A Study on Weibull Distribution for Estimating the Reliability
Dr. P. K. Suri, Parul Raheja
In this paper we estimate the reliability of a system using the Weibull distribution. Reliability is the property that a product does not fail within its expected lifetime under specified conditions. The Weibull distribution is the distribution most commonly used in reliability applications and is a generalization of the exponential distribution. The two-parameter Weibull distribution can accommodate increasing, decreasing, or constant failure rates because of the great flexibility expressed through its shape parameter; the distribution is recognized today as an industrial standard. This paper presents the maximum likelihood method for estimating the Weibull distribution parameters. The Weibull distribution is widely used in engineering applications, and Weibull analysis with the Weibull data plot can depict failure behaviour even with extremely small samples. The data plot is very helpful for engineers and managers.
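The maximum likelihood estimation the paper presents can be sketched numerically: the MLE condition for the shape parameter k has a single root, found here by bisection, after which the scale follows in closed form. The synthetic lifetime data and the bisection bracket are illustrative assumptions.

```python
import math, random

def weibull_mle(x, iters=200):
    """Maximum-likelihood estimates of the Weibull shape k and scale lam.
    The shape equation is solved by bisection; then lam = (mean(x^k))^(1/k)."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):   # MLE condition for the shape parameter (zero at the estimate)
        xk = [v ** k for v in x]
        return sum(a * b for a, b in zip(xk, logs)) / sum(xk) - 1 / k - mean_log

    lo, hi = 0.01, 50.0          # bracket assumed wide enough for the root
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = (lo + hi) / 2
    lam = (sum(v ** k for v in x) / len(x)) ** (1 / k)
    return k, lam

# Synthetic lifetimes from a known Weibull(k=1.5, lam=2.0) via inverse transform
rng = random.Random(42)
data = [2.0 * (-math.log(1 - rng.random())) ** (1 / 1.5) for _ in range(5000)]
k_hat, lam_hat = weibull_mle(data)
```

With 5000 simulated lifetimes the recovered shape and scale land close to the true values 1.5 and 2.0, illustrating why the MLE is the standard choice for fitting failure data.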
Cost Effective Method for Detecting Clone Nodes in Wireless Sensor Network
Dilna V., Sajitha M.
In most wireless sensor applications, security is one of the prime concerns. Generally, sensor nodes are not equipped with any tamper-resistant hardware and are deployed in hostile environments, so the chance of attacks occurring is greater. In a node clone attack, an adversary captures a few nodes from the network, retrieves their credentials and creates a large number of clones by reprogramming nodes; these clones may have the ability to subvert the whole network. The detection of node clone attacks in a wireless sensor network is therefore a fundamental problem. In the distributed setting, many protocols are available to detect the clone attack. Thus far, various schemes have been proposed to detect replicas; however, most of them require expensive hardware like a global positioning system (GPS) to obtain the location of a sensor node. In general, sensor nodes are equipped with a limited set of resources, so to suit resource-constrained sensor applications it is not practical to employ additional devices like GPS in them for the detection process. The Cost Effective Method for Detecting Clone Nodes in WSN (CEMDCN) protocol introduces a low-priced and energy-efficient solution for detecting clone nodes in a wireless sensor network without using GPS. Extensive simulation shows that the proposed method is efficient in terms of detection probability, memory and communication overhead, and that it is a better clone detection scheme for resource-constrained sensor applications.
EBHEAL: Distributed and Localized Mechanism for Coverage Hole Detection and Healing
Soumya P. V., Shreeja R.
One of the major functions of a wireless sensor network is the monitoring of a particular area. Coverage is considered an important measure of the quality of service provided by a wireless sensor network, and the emergence of holes in the area of interest is unavoidable due to the nature of WSN nodes. If there is a hole in the network, data transmitted across the hole will move along the hole boundary nodes again and again, which depletes the energy of the boundary nodes. The detection and healing of such coverage holes is therefore an important concern. Here a hole detection and healing technique based on the energy of nodes is described. The energy-based EBHEAL has two phases, a hole detection phase and a hole healing phase. The hole detection phase detects the existence of a hole in the network and its characteristics; the hole healing phase makes use of these characteristics to heal the hole. The energy of the nodes is considered to improve the efficiency of the hole healing phase.
A mobile ad hoc network (MANET) is a collection of nodes that connect dynamically to form a network without any fixed infrastructure. Mobile nodes are connected by wireless links in an arbitrary topology. As the network is infrastructure-less, data packets are sent between nodes over radio signals and each node acts as a router. MANETs aim to provide communication capabilities in areas where limited or no predetermined communication infrastructure exists. MANETs are vulnerable to malicious entities that tamper with data or perform traffic analysis by eavesdropping on communications or attacking routing protocols. One solution to this security issue is anonymous routing, in which participants cannot be identified by other nodes, attackers, or observers. Security and privacy in ad hoc networks are major issues in defense and other sensitive communications. Most communication systems secure routing and data content, but anonymous communication must also provide anonymity of the identity, location, and route of the participating nodes. Anonymous communication between MANET nodes is challenging because the nodes are free to move anywhere and there is no centralized node to monitor or control the others, so the chance of attack from malicious nodes is high. Anonymous routing protocols are therefore crucial in MANETs to provide secure communication by hiding node identities and preventing traffic analysis attacks by outside observers.
Contrast Enhancement Detection on Digital Images - A Survey
Geethu N Nadh, Sreelatha S.H
Digital images are used in a variety of applications such as news media, the film industry, and military applications. With the help of image editing software tools, it is easy to alter the content of an image, so image content is no longer inherently believable. Contrast enhancement is an important image enhancement operation. There are various techniques for creating forged images for various intentions, and when an attacker manipulates an image, contrast enhancement is used to hide the traces left by the forgery. Since there are many methods for enhancing the contrast of an image, detecting contrast enhancement is necessary for detecting image forgery. This paper reviews various methods for detecting contrast enhancement in digital images.
Video Compression – from Fundamentals to H.264 and H.265 Standards
Sruthi S, Dr. Shreelekshmi R
In this era of modern technology and the Internet, video compression is extremely important for the efficient storage and transmission of very large videos. Over the past two decades there has been rapid development in video compression technology, and a number of techniques have been developed to achieve the best balance between compression efficiency and perceptual quality. The first part of this survey gives an overview of video compression; the latter part summarizes the different video compression standards.
Model for User’s Trust in Cloud Service Providers in Cloud Environment
Kavita Rathi, Sudesh Kumari
Cloud computing is a current research area that provides elastic and flexible computing resources to best fit today's business needs. Despite the many advantages offered by cloud computing environments, it faces major roadblocks in its path to growth, chiefly the security and secrecy of data and trust in cloud service providers (CSPs). This paper tries to identify the trust parameters that could increase users' trust in service providers and encourage the use of cloud computing among users at large. It proposes a new user trust model that identifies the trust parameters with the strongest impact on users' trust in the cloud environment. The model helps cloud service users (CSUs) establish trust in CSPs before availing themselves of their services, and CSPs can also refer to the model to enhance their services and gain more trust among CSUs.
Effects On Human Cells Due To RF And Electromagnetic Field
V. S. Nimbalkar
There is a general opinion that effects at the cellular level pose a gradual hazard to human health. Studies of low-frequency radio waves reveal that EM waves of various intensities have not been shown to damage DNA directly, but there is concern about evidence of cellular effects of EM fields. Static and very-low-frequency EM fields lead to biological effects associated with the redistribution of ions. The number of effects induced by radiofrequency (RF) electromagnetic fields and microwave (MW) radiation reported in various cellular systems is still increasing, yet no satisfactory mechanism has been proposed to explain the biological effects of these fields. This study investigates the effect of MW radiation on cell proliferation.
Comparative Analysis of Different Technique for Detection of Noise in Restored Image
Puneet Kaushik, Mridul Chawla, Gyanender Kumar
Image restoration is the process of reconstructing or recovering an image that has been corrupted by some degradation phenomenon. Degradation may occur due to motion blur, Gaussian blur, noise, or camera mismatch. In this paper, corrupted images are recovered using the modified Lucy-Richardson algorithm in the presence of Gaussian blur and motion blur. The performance of this algorithm is compared with the Wiener filter, blind deconvolution, and the Lucy-Richardson algorithm. The comparison is based on peak signal-to-noise ratio (PSNR), mean square error (MSE), and the structural similarity index measure (SSIM). The results show that blind deconvolution outperforms the Wiener filter and the Lucy-Richardson algorithm.
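The MSE and PSNR metrics named in this abstract are standard formulas; a minimal sketch (SSIM is omitted, and the tiny 2x2 images are purely illustrative) might look like:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-sized grayscale images (lists of rows)."""
    n = sum(len(row) for row in a)
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb)) / n

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means the restored image is closer."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(peak ** 2 / m)

original = [[100, 120], [130, 140]]
restored = [[102, 118], [131, 139]]
print(round(mse(original, restored), 2))   # average squared pixel difference
print(round(psnr(original, restored), 2))  # restoration quality in dB
```

A restoration algorithm that yields a higher PSNR (lower MSE) against the uncorrupted reference is considered the better performer, which is exactly the comparison the paper makes.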
A Survey over Various Variants of RS and Similar Cryptography Techniques
Pramila Patidar, Namrata Sharma
The rapid growth of networking technology has made the exchange of digital images commonplace, so digital images are more vulnerable to being duplicated and redistributed by hackers. Information therefore has to be protected while it is being sent; sensitive data such as credit card details, banking transactions, and social insurance numbers must be safeguarded. Many encryption and decryption techniques exist to prevent information theft. In the era of the Internet, encryption and decryption play a major role in securing data during online transmission. Different encryption and decryption techniques are used to protect confidential data from unauthorized use. This paper presents a review of encryption-based techniques, covering the working, merits, and demerits of current encryption and decryption schemes.
A Novel Approach for Enhancing Security of MANET Using Trust Based Method
Shabana M. L., Anil K. Jacob
Mobile ad hoc networks are collections of two or more devices equipped with wireless communication and networking capabilities but no centralized infrastructure. The open medium and wide distribution of nodes make MANETs vulnerable to attack from malicious nodes. To reduce the hazards posed by such nodes, the concept of trust is introduced into the MANET: trust mechanisms secure data forwarding by isolating nodes with malicious intentions based on their trust values. Different trust mechanisms have been designed to enhance the security of MANETs. In this paper, the trust model has two components: trust from direct observation and trust from indirect observation. In direct observation, the observer node evaluates the trust value of the observed node based on its own opinion. In indirect observation, instead of taking the arithmetic mean of the trust values reported by all neighbors, the method uses Dempster-Shafer theory, which provides a numerical measurement of the degree of belief in a proposition from multiple sources. Because the observer node must collect opinions from all its neighbors to evaluate the trust of an observed node outside its range, congestion arises in the network and indirect observation incurs delay in the trust value calculation. A selective deviation test and energy consumption filtering are therefore applied to reduce the delay, which in turn increases the packet delivery ratio of the network.
Using Genetic Algorithm For Optimization Of Mobile Agent In Wireless Sensor Network: A Survey
Harveen Kaur, Mandeep Singh Sra
A wireless sensor network (WSN) is a collection of sensor nodes located in unattended areas. Static or dynamic agents can be used to fetch data from the sensor nodes. Here a dynamic agent is used instead of a static agent, guided by a genetic algorithm. The network is subdivided into N clusters, and a cluster head is elected for each cluster to communicate information directly to the sink. The mobile agent (i.e., the dynamic agent) moves in random fashion and fetches data from the cluster heads. With this algorithm, the network can remain alive for a longer period of time because power consumption and packet loss rate decrease. In this paper, a dynamic sink combined with the genetic algorithm is proposed.
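A minimal sketch of how a genetic algorithm could optimise the mobile agent's visiting order over cluster heads follows. The fitness function (path length), swap mutation, population size, and cluster-head coordinates are all illustrative assumptions, not the survey's specific algorithm:

```python
import random

def route_cost(route, pos):
    """Total Euclidean path length the mobile agent travels over cluster heads."""
    return sum(((pos[a][0] - pos[b][0]) ** 2 + (pos[a][1] - pos[b][1]) ** 2) ** 0.5
               for a, b in zip(route, route[1:]))

def genetic_route(pos, pop_size=30, gens=100, seed=1):
    """Tiny GA: elitist survivor selection plus swap mutation over visiting
    orders; returns the cheapest order found."""
    rng = random.Random(seed)
    n = len(pos)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: route_cost(r, pos))
        survivors = pop[:pop_size // 2]          # elitism: the best always survives
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.randrange(n), rng.randrange(n)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda r: route_cost(r, pos))

heads = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]   # hypothetical cluster-head positions
best = genetic_route(heads)
print(sorted(best) == list(range(len(heads))))     # the GA returns a valid visiting order
```

Because the best individual is always kept, the evolved route is never worse than the best member of the initial random population; a shorter route means less agent travel, and hence the lower power consumption the abstract claims.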
Implementation of Systematic Error Correcting Codes for Matching Of Data Encoded
Amala A, Jasmine M
A computing system often compares input data against stored data to locate a matching entry; translation lookaside buffer and cache tag array lookups are examples. In this paper, we propose a new architecture to reduce the complexity and latency of matching data protected with an error-correcting code (ECC). It exploits the fact that the codeword of an ECC is usually generated in systematic form, consisting of the raw data and the parity information. The proposed architecture parallelizes the comparison of the data and of the parity information. To further reduce latency and complexity, we propose a new butterfly-formed weight accumulator (BWA) for efficient computation of the Hamming distance. The architecture then checks whether the incoming data matches the stored data.
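The matching idea above, comparing data and parity independently and accepting any candidate within the code's error-correcting capability, can be sketched in software. The BWA hardware is abstracted away by a simple popcount, and the bit widths and threshold t are illustrative assumptions:

```python
def hamming_distance(x, y):
    """Number of differing bit positions, counted from the ones in the XOR."""
    return bin(x ^ y).count("1")

def match(incoming_data, incoming_parity, stored_data, stored_parity, t=1):
    """Declare a match when the data words and parity words differ in at most
    t bit positions in total, i.e. within the correcting capability of the
    code. Data and parity distances are computed independently, mirroring
    the parallel comparison described above."""
    d = (hamming_distance(incoming_data, stored_data)
         + hamming_distance(incoming_parity, stored_parity))
    return d <= t

print(match(0b1011, 0b01, 0b1011, 0b01))   # exact match -> True
print(match(0b1011, 0b01, 0b1111, 0b01))   # one flipped bit, still within t=1 -> True
print(match(0b1011, 0b01, 0b0000, 0b10))   # too many differences -> False
```

The key point is that a stored codeword corrupted by up to t bit errors still matches, so no separate decode step is needed before the comparison.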
Detecting Prognostic Breast Cancer Using KMSVM Techniques on the Wisconsin Breast Cancer Data Sets
D. Rajakumari, C. Jayanthi
Event extraction is a particularly challenging type of information extraction. Most current event extraction systems rely on local information at the phrase or sentence level, but this local context may be insufficient to resolve ambiguity in identifying particular types of events; information from a wider scope can resolve some of this ambiguity. In this paper, we first investigate how to extract supervised and unsupervised features to improve a supervised baseline system. Then, we present two additional tasks to show the benefit of wider-scope features in semi-supervised learning and active learning. Experiments show that features from a wider scope can not only aid a supervised local event extraction baseline system but also help the semi-supervised and active learning approaches. The resulting efficient nugget pool is used to guide users' exploration. Among the five stages of the NMS framework, we pay our main attention to the technical challenges in nugget combination and refinement.
A Novel Human Opinion Dynamics Based Optimization for Software Cost Estimation
Kriti Changle, Nisha Singh, Sumit Kumar Bola
Software is exploited in most fields today and plays an important role in both economic and social development. Mature software industries rely on early software cost estimation: accurate estimates allow a company to develop an appropriate time plan and set the most feasible budget for a project, and a major cause of failure of many software projects is the lack of accurate early cost estimation. Yet despite its great importance, estimating development time and cost accurately is still a challenge in the software industry. Barry Boehm proposed the Constructive Cost Model (COCOMO), which uses a basic regression formula with parameters derived from historical project data and characteristics of the current project to estimate software cost. This model carries high risk due to low accuracy and lack of reliability, which is where optimization comes in. Approaches such as genetic algorithms have already been applied to tune the parameters of COCOMO to increase its accuracy and reliability. Even though humans are the most intelligent social animals, approaches based on crowd dynamics, opinion dynamics, and language dynamics are seldom used for optimization. Interaction between humans gives rise to different kinds of opinions in a society; the process of opinion formation evolves from collective intelligence, emerging from the integrative forces of social influence combined with the disintegrative effects of individualization. Because opinion dynamics leads to efficient decision making, we propose an approach based on human opinion dynamics for effective and accurate software cost estimation.
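The COCOMO regression formula mentioned above, effort = a * KLOC^b person-months and schedule = c * effort^d months, can be sketched with Boehm's published Basic COCOMO coefficients (the 32 KLOC project size is an illustrative assumption):

```python
def basic_cocomo(kloc, mode="organic"):
    """Basic COCOMO estimate: effort in person-months and schedule in months.
    Coefficients (a, b, c, d) are Boehm's published 1981 values; tuning these
    constants is exactly what the optimization approaches in the paper target."""
    coeff = {
        "organic":      (2.4, 1.05, 2.5, 0.38),
        "semidetached": (3.0, 1.12, 2.5, 0.35),
        "embedded":     (3.6, 1.20, 2.5, 0.32),
    }
    a, b, c, d = coeff[mode]
    effort = a * kloc ** b          # person-months
    time = c * effort ** d          # elapsed months
    return effort, time

effort, months = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {months:.1f} months")
```

An optimizer (genetic algorithm, or the opinion-dynamics approach proposed here) would search over the coefficients (a, b, c, d) to minimise the error between these estimates and actual historical project data.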
A High Equipped Data Hiding Algorithm Based On Secret Fragment Visible Mosaic Images And Pixel Color Transformations For Secure Image Transmission
Pathan Mohd Aziz Khan, Praveen Kumar
Hiding data in digital images has long been an area of interest in the digital image processing domain. Although much work in the literature addresses issues such as increasing data capacity and making the secret image resemble the target image, most approaches fail to meet practical requirements. This paper presents an approach in which a mosaic image is generated by dividing the secret image into fragments and transforming their color characteristics into those of the corresponding blocks of the target image. Pixel color transformations yield a losslessly recoverable image based on the untransformed color space values, and a generated key plays an important role in recovering the data from the secret image without loss. The same approach can also be applied to videos, where it eliminates flickering artifacts and achieves lossless data recovery in motion video. Experimental results show robust behavior against incidental and accidental attacks, and performance significantly better than conventional algorithms.
A High Equipped Power Constrained Algorithm For OLEDs Based On MSR
Dayaker S, Yakub
This paper presents a power-constrained contrast enhancement algorithm for organic light-emitting diode (OLED) displays based on multi-scale retinex (MSR). MSR, the key element of the proposed algorithm, consists of a power-controllable log operation and sub-band-wise gain control. First, we decompose an input image into MSRs of different sub-bands and compute a proper gain for each MSR. Second, we apply a coarse-to-fine power control mechanism that recomputes the MSRs and gains, iterating until the target power saving is accurately achieved. For video sequences, the contrast levels of adjacent frames are determined consistently using temporal coherence to avoid flickering artifacts. Finally, we present several optimization techniques for data processing. Experimental results show that the proposed algorithm provides better visual quality than previous methods and a consistent power-saving ratio without flickering artifacts, even for video sequences.
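The core retinex operation underlying MSR, the log of a pixel minus the log of its local surround, can be illustrated on a 1-D luminance signal. This is a single-scale sketch only: the moving average stands in for the Gaussian surround, and the sub-band gains and power control of the full algorithm are omitted:

```python
import math

def retinex_1d(signal, window=3):
    """Single-scale retinex on a 1-D luminance signal: log of each sample
    minus log of its local surround (a moving average here, standing in for
    the Gaussian surround of the full multi-scale algorithm)."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        surround = sum(signal[lo:hi]) / (hi - lo)
        out.append(math.log(signal[i]) - math.log(surround))
    return out

luma = [50, 52, 200, 54, 51]            # a bright edge in otherwise flat luminance
enhanced = retinex_1d(luma)
print([round(v, 2) for v in enhanced])  # the edge stands out; flat regions stay near 0
```

MSR repeats this at several surround scales and blends the results with per-sub-band gains; the power control loop then rescales those gains until the display's power budget is met.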
Downloading and Performance Analysis of WMV and MPEG Files in Cloud Computing
Manoj Kumar Tiwari
Video streaming has become a major application for many users in today's IT-enabled society. WMV and MPEG are the formats used in real-time streaming in cloud computing. In this paper I analyze the performance of WMV and MPEG video format streaming using a Java tool and the Video Inspector tool.
With the advancement from traditional business to electronic business, and from electronic business to mobile commerce, technology has changed rapidly. Mobile commerce (M-Commerce) came into existence with the advancement of wireless technology, which increased the number of mobile device users and drove the rapid development of e-commerce on these devices. M-Commerce is a special type of e-commerce in which all transactions are carried out by mobile phone. Earlier, mobile devices were used only for phone calls and SMS, but nowadays they are part of daily life for transactions such as mobile banking, mobile ticketing, and mobile money transfer. More and more users connect with smartphones and access mobile services by installing applications.
Experimental design of adaptive algorithms to reduce SER in M-ary QAM System
Preetika Gupta, Ankur Singhal
This paper presents an experimental study of the symbol error rate of adaptive equalizer training algorithms. Without modifying the algorithms themselves, two conventional adaptive equalizer training methods, least mean squares (LMS) and recursive least squares (RLS), are applied to an oversampled received sequence in a quadrature amplitude modulation system, and their performance is compared under different parameters such as symbol size, step size, and rotation. The algorithms were evaluated using the convergence of SER at an SNR of 20 dB over different numbers of iterations, to determine the convergence rate and the constellation diagram.
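Of the two methods compared, the LMS update is simple enough to sketch directly. The QPSK training sequence, the 0.2-coefficient ISI channel, and the step size are illustrative assumptions, not the paper's experimental setup (RLS is omitted):

```python
import random

def lms_equalize(received, desired, taps=4, mu=0.01):
    """Least-mean-squares equalizer training: filter a sliding window of the
    received samples, compare with the known training symbol, and update the
    tap weights along the negative gradient of the squared error."""
    w = [0j] * taps
    errors = []
    for n in range(taps - 1, len(received)):
        window = received[n - taps + 1:n + 1][::-1]   # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, window))
        e = desired[n] - y
        w = [wi + mu * e * xi.conjugate() for wi, xi in zip(w, window)]
        errors.append(abs(e))
    return w, errors

# Hypothetical training run: QPSK symbols through a channel with mild ISI.
random.seed(0)
symbols = [random.choice([1+1j, 1-1j, -1+1j, -1-1j]) for _ in range(500)]
received = [s + 0.2 * p for s, p in zip(symbols, [0j] + symbols[:-1])]
w, errs = lms_equalize(received, symbols)
print(sum(errs[:50]) / 50 > sum(errs[-50:]) / 50)  # True: error shrinks as the taps converge
```

The step size mu controls the same convergence-rate/stability trade-off that the paper's step-size experiments probe; RLS converges faster at the cost of higher per-iteration complexity.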
Today's first technical need is a free platform for usability, and when such a platform can be customized by users themselves it creates a unique community: open source. This community provides many different technologies in operating systems, applications, software, gaming, and more. How does this community take in members and let them open their code to the world for better use and modification? That is what we need to understand: how the biggest free and open source community does its work, develops it, distributes it, uses it, and modifies it without direct monitoring by for-profit industries. Its work interacts with many e-commerce and e-business missions. Today every computer science student, lecturer, professor, and administrator should be aware of the free and open source development phenomenon in order to participate in this easy, better, secure, and modifiable way of working.
Text Watermarking Approaches for Copyright Protection
Gagandeep Kaur, Sukhwinderbir
With the widespread use of the Internet and other communication technologies, it has become extremely easy to copy, share, and distribute digital content. As a result, authentication and copyright protection issues have arisen. Text is the most widely used medium travelling over the Internet besides images, audio, and video: the major part of books, newspapers, web pages, advertisements, research papers, legal documents, letters, novels, poetry, and many other documents is simply plain text. Copyright protection of plain text is a significant and still unresolved issue. Existing solutions for watermarking plain text documents are not robust against random tampering attacks and are inapplicable across different domains. In this paper, we explain text watermarking and its various techniques.
DDR3 Based Lookup Circuit For High Performance Network Processing
C. Jayachandra, Vali Babu
Double data rate (DDR) SDRAMs have been prevalent in the PC memory market in recent years and are widely used in networking systems. These memory devices are developing rapidly, with high density, high memory bandwidth, and low device cost. However, because of their high-speed interface technology and complex instruction-based memory access control, a special-purpose memory controller is necessary to optimize the memory access trade-offs. In this paper, a special-purpose DDR3 controller for high-performance table lookup is proposed, and a corresponding lookup circuit based on the Hash-CAM approach is presented.
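A software analogue of the Hash-CAM idea, a hashed bucket array backed by a small associative store that absorbs collisions, can sketch how such a lookup behaves. The class, sizes, and IP-to-port entries are illustrative assumptions, not the paper's circuit:

```python
class HashCam:
    """Sketch of a Hash-CAM lookup table: a fixed-size hash bucket array
    backed by a small associative (CAM-like) overflow store for colliding keys."""
    def __init__(self, buckets=8, cam_size=4):
        self.buckets = [None] * buckets
        self.cam = []                    # (key, value) pairs, searched associatively
        self.cam_size = cam_size

    def insert(self, key, value):
        i = hash(key) % len(self.buckets)
        if self.buckets[i] is None or self.buckets[i][0] == key:
            self.buckets[i] = (key, value)
        elif len(self.cam) < self.cam_size:
            self.cam.append((key, value))    # collision: spill into the CAM
        else:
            raise MemoryError("table full")

    def lookup(self, key):
        i = hash(key) % len(self.buckets)
        if self.buckets[i] and self.buckets[i][0] == key:
            return self.buckets[i][1]        # single memory access in the common case
        for k, v in self.cam:                # compared in parallel in hardware
            if k == key:
                return v
        return None

table = HashCam()
table.insert("10.0.0.1", "port3")
table.insert("10.0.0.2", "port7")
print(table.lookup("10.0.0.1"))  # port3
print(table.lookup("10.9.9.9"))  # None
```

The appeal for a DDR3-backed design is that the common case costs one hashed memory access, while the small CAM bounds the worst case without a second DRAM round trip.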
MOSFETs have been the most popular devices for the past five to six decades because of their excellent scalability. Over the past few years, however, the semiconductor industry has faced various challenges related to scaling. These challenges concern fundamental and practical aspects of the underlying device physics, such as leakage currents, mobility, reliability, and subthreshold swing, and they degrade device performance. The MOSFET is widely used in inverters, power amplifiers, and electronic switching devices. The impact-ionization MOS (IMOS) is a good alternative for small-scale devices owing to its good electrical performance: its low subthreshold swing is very beneficial for high-speed switching. However, the device faces the critical issue of a high operating voltage in the ON state, and as it is scaled down the chance of band-to-band tunneling grows, increasing leakage current. This paper focuses on reducing the required supply voltage and threshold voltage by optimizing physical parameters such as gate length, intrinsic length, and oxide thickness. Based on the optimization results, we also propose novel device structures called Ultrathin IMOS (UTIMOS) and silicide-based UTIMOS, which perform better than the conventional LIMOS structure. A detailed analysis of the lateral impact-ionization MOS is presented.
A Study on Various Image Segmentation Techniques: Merits and Demerits
Abin Jose Joy, Meenu K Mathew
Image segmentation is the process of partitioning an image into meaningful components or structures based on color, shape, intensity, and similar features. It plays an important role in image analysis and is also used for object recognition, visualization, and many other image processing tasks. Segmentation classifies or clusters an image into several parts according to the features extracted from it; the main goal is to make the image easier to analyze, and the output is a set of segments that together cover the entire image. This paper presents various image segmentation techniques along with their merits and demerits.
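One classic intensity-based technique of the kind such surveys cover is Otsu's thresholding, which splits an image into foreground and background by maximising between-class variance over the grey-level histogram. A sketch (the 8-level histogram is an illustrative assumption):

```python
def otsu_threshold(histogram):
    """Otsu's method: choose the grey level that maximises between-class
    variance, splitting the histogram into background and foreground."""
    total = sum(histogram)
    total_sum = sum(i * h for i, h in enumerate(histogram))
    best_t, best_var = 0, 0.0
    w_bg = sum_bg = 0
    for t, h in enumerate(histogram):
        w_bg += h                        # pixels at or below candidate threshold t
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * h
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal 8-level histogram: dark object pixels near level 1, bright background near 6.
hist = [10, 40, 10, 0, 0, 10, 40, 10]
print(otsu_threshold(hist))   # the best split falls at level 2
```

Thresholding illustrates the general pattern the survey describes: extract a feature (here intensity), then assign every pixel to a segment according to it.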
Efficient Authentication Technique Based On Virtual Password
Sufyan Panginikkadan, Zainul Abid T.P.
As the use of the Internet for online communication has increased, threats against Internet security have increased as well. Here we discuss how to prevent users' passwords from being stolen by adversaries in online environments. The virtual password mechanism protects users' passwords from such theft. We propose a virtual password concept involving a small amount of human computing to secure users' passwords online, with the freedom to choose a virtual password scheme ranging from weak to strong security. There is, however, a trade-off: simplicity and security conflict with each other, and it is difficult to achieve both. We further propose several system-recommended functions that provide security, and analyze how the proposed schemes defend against phishing, keyloggers, shoulder surfing, and combined attacks. For user-specified functions, we adopt secret little functions, whose security is enhanced by keeping the functions themselves hidden.
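The virtual password idea, the user computes a one-time response from a fixed PIN and a fresh challenge, can be sketched as follows. The digit-wise "secret little function" here is a hypothetical example for illustration, not one of the paper's recommended functions:

```python
def virtual_password(fixed_pin, challenge, secret_fn):
    """Virtual password sketch: the server shows a fresh random challenge and
    the user submits secret_fn(fixed_pin, challenge) instead of the PIN itself,
    so a keylogger capturing one response cannot replay it at the next login."""
    return secret_fn(fixed_pin, challenge)

# Hypothetical 'secret little function': add the challenge digit-wise, mod 10.
def digitwise_add(pin, challenge):
    return "".join(str((int(p) + int(c)) % 10) for p, c in zip(pin, challenge))

print(virtual_password("4821", "1234", digitwise_add))  # 5055, and it differs every login
```

Because the challenge changes each session, the transmitted response changes too; the attacker must recover the hidden function itself, which is the trade-off between simplicity and security the abstract discusses.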
The field of information retrieval (IR) deals with the representation, storage, and access of information items. IR is the most common way of accessing information, largely due to the widespread use of the World Wide Web (WWW). Information retrieval mainly deals with the retrieval of unstructured data, especially textual documents, in response to a query or topic statement that may itself be unstructured. In this paper we present an introduction to information retrieval, focusing on its properties, key techniques, and the term weighting factor.
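The term weighting factor mentioned above is classically realised as TF-IDF: term frequency times the log of inverse document frequency. A minimal sketch over a toy corpus (the three documents are illustrative):

```python
import math

def tf_idf(docs):
    """TF-IDF term weighting: weight = term frequency x log(N / document
    frequency). Terms that appear in every document get weight 0, so
    discriminative terms dominate the ranking."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        w = {}
        for term in doc:
            tf = doc.count(term) / len(doc)
            w[term] = tf * math.log(n / df[term])
        weights.append(w)
    return weights

docs = [["cat", "sat", "mat"], ["cat", "cat", "dog"], ["dog", "ran"]]
w = tf_idf(docs)
print(w[1]["cat"] > w[1]["dog"])  # True: "cat" is more frequent in document 1
```

At query time, documents are scored by summing the weights of the query terms they contain, which is the core retrieval technique the paper introduces.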
A Controlled Approach for Multicast Routing Protocol Using Branching Router
Anishma N, Sajitha M
Traditional multicast routing protocols face many issues and challenges. To deal with them, several approaches that differentiate branching routers from non-branching ones have been introduced, but the proposed schemes suffer from problems of multicast management, inefficient tree construction, and excessive lookups during the forwarding of unicast and multicast packets. This paper improves on a novel branching-router-based multicast routing protocol: a router is selected to share the MC's functionality, reducing the overhead on the MC and thereby reducing join latency.
Implementation Of Clock Gating Logic By Matching Factored Forms
A. Kirankumari, P. P. Nagaraja Rao
Clock gating is one of the most widespread circuit techniques for reducing power consumption. It is usually applied at the register transfer level (RTL); automatic synthesis of clock gating at the gate level has been explored less, although it is certainly more convenient for designers. Clock gating consists of two steps: extraction of gating conditions by merging the gating conditions of individual flip-flops, and implementation of those conditions with a minimum number of additional gates. In this paper, we show how to perform factored form matching, in which gating functions in factored form are matched, as far as possible, with the factored forms of the Boolean functions of existing combinational nodes in the circuit; additional gates are then introduced only for the portions of the gating functions that are not matched. Strong matching identifies matches that are explicitly present in the factored forms, while weak matching seeks matches that are implicit in the logic and are therefore harder to find.
Dynamic Soil Structure Interaction Analysis for Piled Raft Foundation
Chaithra T P, Manogna H N
Damage caused by recent earthquakes has shown that the seismic behaviour of a structure is highly influenced not only by the response of the superstructure but also by the response of the foundation and the ground. The effect of soil-structure interaction (SSI) becomes prominent for heavy structures resting on relatively soft soils. The main focus of the present investigation is to evaluate the seismic performance of a fifteen-storey reinforced concrete building with a piled raft foundation using linear time history analysis in the finite-element-based software SAP 2000. Results such as time period, displacements, base shear, and settlements are compared, and the behaviour of the piled raft foundation is evaluated.
Group Authentication Using Threshold Secret Sharing
Parvathy Sudheer, Zainul Abid T. P.
Secret sharing has drawn much attention in the research community. Secret sharing refers to methods for distributing a secret amongst a group of participants, each of whom is allocated a share of the secret. The secret can be reconstructed only when a sufficient number of shares, possibly of different types, are combined; individual shares are of no use on their own. The message is divided into n pieces, called shares or shadows, and any t of them (the threshold value) can be used to reconstruct the message. A threshold is set based on the required number of context conditions for permitting data access: if a threshold number of context parameters are satisfied, the decryption key is generated from the corresponding key shares and the data is decrypted. Secret sharing schemes are ideal for storing information that is highly sensitive and highly important.
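The (t, n) threshold behaviour described above is classically realised by Shamir's scheme: hide the secret in the constant term of a random degree t-1 polynomial over a prime field and hand out n points of it. A minimal sketch (the prime, seed, and secret are illustrative assumptions):

```python
import random

P = 2087  # a prime larger than any secret we will share

def make_shares(secret, t, n, rng=random):
    """Shamir (t, n) scheme: the secret is the constant term of a random
    degree t-1 polynomial over GF(P); each share is one point (x, f(x))."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term, i.e. the
    secret; with fewer than t shares the interpolation is underdetermined."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P   # Fermat inverse
    return secret

rng = random.Random(7)
shares = make_shares(1234, t=3, n=5, rng=rng)
print(reconstruct(shares[:3]))   # 1234: any 3 of the 5 shares recover the secret
print(reconstruct(shares[1:4]))  # 1234
```

In the context-aware access control described above, each satisfied context condition would release one key share; once t shares are present, the interpolation yields the decryption key.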
Study of Software Quality and Risk Estimation and Quality Cost Analysis using empirical study
Dr. P. K. Suri, Pooja
In this paper we calculate the cost of a software product using an empirical study. Early prediction of software cost and quality is important for better software planning and control. In early development phases, design complexity metrics are considered useful indicators of software testing effort and of some quality attributes. Although many studies investigate the relationship between design complexity, cost, and quality, it is unclear what we have learned from them, because no systematic synthesis exists to date. Recalling Tom DeMarco's famous statement, "You cannot control what you cannot measure": quality measurements are, as in any other engineering discipline, a cornerstone in software engineering for improving both the engineering process and software products. Quality measurement not only helps to visualize the abstractions of the software development process and product but also provides an infrastructure for comparison, assessment, and prediction of software development artifacts. A large part of software measurement is, in one way or another, measuring or estimating software complexity and quality, owing to their importance in practice and research. The relationship between cognitive complexity and external software quality depends on the comprehension ability of the software developer, tester, or maintainer; this factor is not deterministic and hence cannot be investigated in any way other than empirically. To improve quality, an organization must take into account the costs associated with achieving it, since the objective of continuous improvement programs is not only to meet customer requirements but also to do so at the lowest cost. This can only happen by reducing the costs needed to achieve quality, and reducing these costs is only possible if they are identified and measured. Therefore, measuring and reporting the cost of quality (CoQ) should be considered an important issue for managers.
Risk management is the identification, assessment, and prioritization of risks, followed by the coordinated and economical application of resources to minimize, monitor, and control the probability and/or impact of unfortunate events, or to maximize the realization of opportunities. The objective of risk management is to ensure that uncertainty does not deviate the endeavour from its business goals.
Efficient Publish/Subscribe System Using Identity Based Encryption
Prathyusha M. Kumar, Mumthas T. K.
Publish/subscribe (pub/sub) is an emerging communication paradigm that offers efficient, decoupled information dissemination in distributed environments; the decoupling increases the privacy of each participant in the system. Publishers generate the flow of information as publications, which are delivered to subscribers according to the interests they express as subscriptions. Perfect decoupling is maintained by using pairing-based cryptographic mechanisms. Providing basic security mechanisms such as authentication, confidentiality, and access control is highly challenging in a content-based publish/subscribe system. This work proposes a novel approach to provide confidentiality, authentication, and access control in a brokerless content-based publish/subscribe system using a hierarchical CP-ABE (ciphertext-policy attribute-based encryption) scheme. The resulting publish/subscribe system is efficient, with low computational time for cryptographic operations, and also provides fine-grained hierarchical access control.
Scalable and Efficient Data acquisition in Service Oriented Vehicular Adhoc Networks
Nishi K. M., Indu P
Vehicular ad hoc networks (VANETs) are a subclass of mobile ad hoc networks (MANETs) that aim to enhance the safety and efficiency of transportation systems. In a VANET, vehicles communicate with each other and with roadside units (RSUs). Service-oriented vehicular networks are special types of VANETs that provide infrastructure-based commercial services, including Internet access, real-time traffic management, video streaming, and content distribution. Many forms of attacks that threaten the security of service-oriented VANETs have emerged, and the success of data acquisition and delivery systems depends on their ability to defend against the security and privacy attacks present in these networks. A service-oriented vehicular security system allows VANET users to exploit RSUs to obtain various types of data. When multiple users connect to an RSU at the same time, the RSU network can become overloaded; a scalable and efficient system has therefore been designed by applying scheduling algorithms, thereby avoiding packet delay and traffic overhead.
Survey on String Transformation - A Query-Based Approach
Shijina J Salim, Diliya M Khan
Data mining is a powerful area that automates the process of searching for valuable information in a large database. Its wide range of applications promises a future in which data grow rapidly. Many problems in natural language processing, data mining, information retrieval, and bioinformatics can be formalized as string transformation. The proposed system implements string transformation in the data mining field with the help of efficient algorithms. As its name implies, string transformation applies a set of operators to transform a given string into its most likely output strings; insertion, deletion, transposition, and substitution are the transformation operators. Transformation rules and predefined rule indexes are used to avoid unnecessary searches and time delays. Users can view how the possible outcomes are formed from the given string. From these candidates, the proposed system finds the most appropriate matches for the given string and returns them as output within seconds. Another important feature is query reformulation. Using efficient methods, the proposed system performs query reformulation with a useful description of the given query. Query reformulation is also a transformation technique, and it deals with the term-mismatch problem; similar query pairs are mined from training data. The proposed system tries to transform a given query into the original query, thereby producing a better match between the query and the document, and it also gives a brief description of the result, like a search engine. The challenge is compounded by the fact that new information from the field is added to the database on a daily basis. For this purpose, the proposed system uses a dictionary method to add details to the database, and the information is retrieved by a text mining approach. Text mining is a newer area of computer science and a sibling of data mining that fosters strong connections with data mining and knowledge management. The proposed system is efficient and needs less time for data retrieval.
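The four operators named in the abstract can be sketched concretely: generate every string one edit (insertion, deletion, substitution, or transposition) away from the input, then keep only the candidates found in a dictionary. This is a generic spell-correction-style sketch under that assumption, not the paper's specific rule-index implementation.

```python
# Sketch of the four string-transformation operators from the abstract:
# generate all strings one edit away, then filter against a dictionary.

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    """All strings reachable from `word` by exactly one operator."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = {L + R[1:] for L, R in splits if R}
    transposes = {L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1}
    substitutes = {L + c + R[1:] for L, R in splits if R for c in ALPHABET}
    inserts = {L + c + R for L, R in splits for c in ALPHABET}
    return deletes | transposes | substitutes | inserts

def transform(word, dictionary):
    """Keep only candidates that appear in the dictionary, sorted alphabetically."""
    return sorted(edits1(word) & dictionary)

vocab = {"cat", "cart", "chat", "coat", "cast"}
print(transform("cat", vocab))  # → ['cart', 'cast', 'cat', 'chat', 'coat']
```

A rule index, as the abstract describes, would replace the brute-force `edits1` enumeration with precomputed transformation rules, avoiding the unnecessary candidate generation that makes this naive version slow on long strings.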