Classification and Analysis of Web Multimedia Data using Principal Component Analysis
Siddu P. Algur, Basavaraj A. Goudannavar, Prashant Bhat
Over the web, the size of multimedia data is increasing rapidly. Web users employ different types of multimedia data for different applications, and these data belong to different categories of domains such as Entertainment, Sports, News and Discussion, Music, etc. Automatic classification/prediction of web multimedia data without knowing its content is a challenging and complex research problem. This paper proposes two approaches to classify web multimedia data, viz., classification of web multimedia data using a dimension reduction technique and classification without reducing the dimensions. To reduce the dimensions (attributes) of the web multimedia metadata, we adopt the Principal Component Analysis (PCA) technique. The proposed PCA technique involves an orthogonal transformation of the multimedia metadata values, construction of the covariance matrix, and computation of eigenvalues to reduce the dimensions. The reduced and non-reduced multimedia data are classified separately using Decision Tree (DT) and K-Nearest Neighbor (KNN) classifiers. The classification results for the reduced and non-reduced dimensions of the multimedia data are analyzed and compared as a task of KDD.
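The dimension-reduction pipeline the abstract names (center the data, build the covariance matrix, compute eigenvalues, project) can be sketched for two attributes; the data points below are illustrative, not the paper's metadata:

```python
# Minimal 2-D -> 1-D PCA sketch (illustrative, not the paper's code).
import math

def pca_1d(points):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # covariance matrix entries for the centered data
    cxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    cyy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # eigenvalues of [[cxx, cxy], [cxy, cyy]] via the quadratic formula
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    lam = (tr + math.sqrt(tr * tr - 4 * det)) / 2   # dominant eigenvalue
    # corresponding eigenvector (fallback covers the axis-aligned case)
    if abs(cxy) > 1e-12:
        v = (cxy, lam - cxx)
    else:
        v = (1.0, 0.0) if cxx >= cyy else (0.0, 1.0)
    norm = math.hypot(v[0], v[1])
    v = (v[0] / norm, v[1] / norm)
    # project each centered point onto the principal axis
    return [(p[0] - mx) * v[0] + (p[1] - my) * v[1] for p in points]

scores = pca_1d([(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0)])
```

Because the projections are taken from centered data, they sum to zero; real metadata with more than two attributes would keep several leading eigenvectors instead of one.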
Weighted Energy Efficient Cluster Based Algorithm in MANET
Sanehi Sirohi, Manoj Yadav
A mobile ad hoc network is a network whose nodes are dynamic in nature and have limited bandwidth and minimal battery power. To provide scalable routing, the nodes are divided into clusters; each cluster has a cluster head that holds all the information about its member nodes, whereas in flat routing every node performs the same role and the network lifetime is therefore shorter. The different clustering schemes are based on different criteria: a cluster head is selected according to a specific metric or combination of metrics such as identity, degree, energy, weight, or mobility. Here, a new algorithm is proposed based on the existing algorithm "A novel weight based clustering algorithm for routing in MANET", with the energy factor included in the selection of the cluster head. If two nodes in a cluster are candidates for cluster head, the energy of each node is calculated to select the appropriate one. This algorithm is named the "weighted energy efficient cluster based algorithm". Its results are compared with those of the existing algorithm on four parameters: end-to-end delay, packet delivery ratio, energy, and throughput. The throughput of the newly proposed algorithm is better than that of the existing WBA algorithm; the comparison of the other factors shows a higher rate for the WBA algorithm.
Training Feed Forward Neural Network with Backpropagation Algorithm
Rashmi Amardeep, Dr. K. ThippeSwamy
Neural Networks (NN) are an important data mining tool used for classification and clustering. An NN learns by example: when supplied with enough examples, it performs classification and can even discover new trends or patterns in data. An NN is composed of three layers: input, output, and hidden. Each layer can have a number of nodes; nodes of the input layer are connected to the nodes of the hidden layer, and nodes of the hidden layer are connected to the nodes of the output layer. Those connections carry weights between nodes. This paper describes the popular Back Propagation (BP) algorithm for feed forward NNs. The BP training algorithm is a renowned representative of the iterative gradient descent algorithms used for supervised learning in neural networks, and the aim here is to show the logic behind it. In the BP algorithm, the output of the NN is evaluated against the desired output; if the results are not satisfactory, the weights between layers are modified and the process is repeated until the error is minimized. A BP example is demonstrated in this paper with a 2-2-1 NN architecture and momentum. It is shown that the most commonly used back-propagation learning algorithms are special cases of the developed general algorithm. The sigmoid activation function is used to analyze the convergence of the weights, under the algorithm, toward minima of the error function.
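A 2-2-1 BP training loop with sigmoid units and a momentum term can be sketched as follows; the learning rate, momentum value, and the logical-AND training set are illustrative choices, not the paper's setup:

```python
# Sketch of backpropagation for a 2-2-1 feedforward network with sigmoid
# units and momentum on the output-layer weights (illustrative values).
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input->hidden
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden->output
b2 = 0.0
lr, mom = 0.3, 0.8
vel = [0.0, 0.0]                       # momentum memory for hidden->output weights

def forward(x):
    h = [sigmoid(b1[j] + sum(w1[j][i] * x[i] for i in range(2))) for j in range(2)]
    return h, sigmoid(b2 + sum(w2[j] * h[j] for j in range(2)))

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]         # logical AND
init_err = sum((t - forward(x)[1]) ** 2 for x, t in data)

for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        dy = (t - y) * y * (1 - y)                   # output delta: error * sigmoid'
        dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            vel[j] = mom * vel[j] + lr * dy * h[j]   # momentum update
            w2[j] += vel[j]
            for i in range(2):
                w1[j][i] += lr * dh[j] * x[i]
            b1[j] += lr * dh[j]
        b2 += lr * dy

final_err = sum((t - forward(x)[1]) ** 2 for x, t in data)
preds = [round(forward(x)[1]) for x, _ in data]
```

After training, the squared error has dropped and the rounded outputs reproduce the AND truth table, which is the "repeat until the error is minimized" loop the abstract describes.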
Comparative Study of RSA and Probabilistic Encryption/Decryption Algorithms
N. Priya, Dr. M. Kannan
Networks are virtual windows that allow people remote access to geographically distant resources without having to be physically present, achieved by sending data back and forth across the network. But networks are vulnerable precisely because of this inherent characteristic of facilitating remote access. Hence network security plays a vital role in real-time environments, as it protects against loss and misuse of data in the network. Many applications and algorithms run behind the scenes to provide security while transmitting data. One of the most important components of network security is cryptography, which consists of many algorithms that protect users from adversaries such as trapdoor attackers, eavesdroppers, and hackers. There are many well-known algorithms that serve this purpose, each with its own advantages and disadvantages. The aim of this paper is to outline the key concepts involved in two algorithms, namely Goldwasser-Micali encryption and the most widely used RSA. The metrics used for comparison are encryption time, decryption time, and ciphertext size with varying plaintext sizes, which are the key considerations when choosing an encryption algorithm. The comparison of encryption and decryption times with varying ciphertext sizes for both algorithms, using MATLAB code, and the preliminary observations are presented in this paper.
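As a minimal illustration of the RSA side of the comparison (Goldwasser-Micali, being probabilistic, encrypts bit by bit using quadratic residues and is omitted here), a textbook toy example with deliberately tiny primes:

```python
# Toy RSA with tiny textbook primes (illustrative only; real RSA uses
# 2048-bit moduli and padding schemes).
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # Euler's totient: 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

m = 65                       # plaintext block (must be < n)
c = pow(m, e, n)             # encryption: c = m^e mod n
m2 = pow(c, d, n)            # decryption: m = c^d mod n
```

Decryption recovers the original plaintext; timing loops around the two `pow` calls for growing plaintext sizes would reproduce the kind of measurement the paper reports.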
Identification of Human using Palm-Vein Images: A new trend in biometrics
Miss Swapnali N. Dere, Dr. A. A. Gurjar
In the ubiquitous network society, people face the risk that others can easily access their information anytime and anywhere. Because of this risk, personal identification technology, which can distinguish between registered legitimate users and imposters, is now generating interest. Traditional personal verification methods rely heavily on the use of passwords, personal identification numbers (PINs), magnetic cards, keys, smart cards, etc. No matter which method is employed, each of these offers only limited security. To solve these problems, biometric authentication technology, which identifies people by their unique biological information, is attracting attention.
An Efficient Fair Resource Allocation in Multicast Networks
M. Sangeetha
A decentralized algorithm is presented that enables the different rate-adaptive receivers in different multicast sessions to adjust their rates to satisfy a fairness criterion, addressing the problem of congestion control in networks that support both multirate multicast sessions and unicast sessions. A one-bit Explicit Congestion Notification (ECN) marking strategy to be used at the nodes is also proposed. The congestion control mechanism does not require any per-flow state information for unicast flows at the nodes; at the junction nodes of each multicast tree, some state information about the rates along the branches may be required. This paper calculates the throughput and rates of unicast and multicast packets for the receivers in a given network topology. The congestion control mechanism takes into account diverse user requirements when different receivers within a multicast session have different utility functions, but it does not require the network to have any knowledge of those utility functions. The paper compares the performance and fairness of unicast and multicast sessions.
Prediction of Breast Cancer using Random Forest, Support Vector Machines and Naïve Bayes
Madeeh Nayer Elgedawy
Machine learning techniques can be used to judge important predictor variables in medical datasets. This paper applies three machine learning techniques, Naïve Bayes, SVM, and Random Forest, to the Wisconsin Breast Cancer Database. The three developed models predict whether a patient's tumor is benign or malignant. The paper compares the performance of the three algorithms in terms of accuracy, precision, recall, and F-measure. Results show that Random Forest yields the best accuracy of 99.42%, slightly better than both SVM and Naïve Bayes, which have accuracies of 98.8% and 98.24% respectively. These results are very competitive and can be used for diagnosis, prognosis, and treatment.
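The four reported measures all follow directly from a confusion matrix; the counts below are invented for illustration, not the paper's Wisconsin results:

```python
# Accuracy, precision, recall, and F-measure from confusion-matrix counts
# (made-up counts for a benign/malignant classifier).
tp, fp, fn, tn = 60, 1, 1, 108        # true/false positives and negatives

accuracy  = (tp + tn) / (tp + fp + fn + tn)   # fraction of correct predictions
precision = tp / (tp + fp)                    # how many flagged cases were real
recall    = tp / (tp + fn)                    # how many real cases were caught
f_measure = 2 * precision * recall / (precision + recall)  # harmonic mean
```

With these counts, precision equals recall, so the F-measure coincides with both; in general the F-measure sits between them, penalizing imbalance.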
In this paper an attempt has been made to propose five query formation methods, from a professional-usage point of view, for clustering dowry-crime-related legal precedents. The k-means clustering method in the Tanagra open-source tool is used for all the methods. The query formation methods generate five types of queries, and the cosine similarity measure is computed between the query Term Frequency Matrix (TFM) and the repository TFM. The repository TFM consists of 500 judgments related to dowry crimes, as used in [27]. After the clusters are formed, the performance metrics are computed and the results are analyzed in two ways: i) cluster analysis within each query formation method, and ii) comparison of the clustering results of the different query formation methods. Finally, the conclusions and the future scope of the research are presented at the end of the paper.
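The cosine-similarity step between a query term-frequency vector and the repository vectors can be sketched as follows (toy term frequencies, not the 500-judgment repository):

```python
# Cosine similarity between a query TF vector and document TF vectors
# (toy data standing in for the query TFM vs. repository TFM comparison).
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

query = [1, 1, 0, 2]                               # term frequencies of the query
docs = [[2, 1, 0, 3], [0, 0, 5, 1], [1, 1, 0, 2]]  # per-judgment TF rows
sims = [cosine(query, d) for d in docs]
best = max(range(len(docs)), key=lambda i: sims[i])
```

The third document has the same term profile as the query, so its similarity is 1.0 and it is retrieved as the best match; documents sharing few terms score near 0.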
Analytical Solutions For Security Issues In Cloud Environment
Smaranika Mohapatra, Dr. Kusum Lata Jain
Cloud computing has become an important infrastructure in today's computing; globally, computation is moving towards the cloud architecture. It is necessary to utilize this methodology to get better results in the computing environment and to make it applicable across various branches. The main thing to take into account is security: several security issues in cloud computing derive from the ways such services are created, including how the methods are scheduled and what types of data can be adopted in the cloud. If security is not strong and uniform, the flexibility and merits that the cloud infrastructure has to offer will carry little dependability. In this paper we try to identify and establish the important security matters, provide some solutions within this computing methodology, and give a view of the ongoing trends for security in this technological outbreak.
Green chemistry aims to provide environmentally benign products from sustainable resources, using processes that do not harm people or the environment. Most chemical reactions have been carried out in molecular solvents. Recently, however, a new class of solvents has emerged: ionic liquids. Ionic liquids (ILs) are organic salts that exist as liquids at low temperature (<100 °C). An important feature of ILs is their immeasurably low vapor pressure; for this reason they are called green solvents, in contrast to traditional Volatile Organic Solvents (VOCs). ILs have many attractive properties such as chemical and thermal stability, non-flammability, high ionic conductivity, and a wide electrochemical potential window. Therefore they have been extensively investigated as solvents or co-catalysts in various reactions, including organic catalysis, inorganic synthesis, biocatalysis, and polymerization. Ionic liquids find application in alkylations, allylations, hydroformylations, epoxidations, synthesis of ethers, Friedel-Crafts reactions, Diels-Alder reactions, Knoevenagel condensations, and Wittig reactions. ILs have been claimed to be "green solvents" and a possible alternative to volatile organic solvents. This has been justified in some applications where ILs, because of their negligible vapour pressure and non-flammability, are favourably used instead of chlorinated solvents. An example is the development of an optimised process for degreasing and/or scouring metal, ceramic, glass, plastic, composite material, or semiconductor surfaces by treating the surfaces in a solution comprising an IL. Some ILs have been the subject of toxicity and ecotoxicity studies, and data are now available on a larger variety of organisms (bacteria, fungi, fish, algae). Most studies have been carried out on imidazolium- and pyridinium-based ILs with alkyl or alkoxy side chains.
The variety of anions studied is limited mainly to bromide, chloride, hexafluorophosphate, and tetrafluoroborate. Much less research has been devoted to determining the biodegradability of ILs, but the design of biodegradable ILs has been covered in recent papers. A high-throughput screen based on the agar diffusion method was recently applied as a first rapid approach to test the toxicity of ILs towards microorganisms and to distinguish toxic from biocompatible ILs. My case study is about producing an effective yield of product from the reactants by choosing selective ionic liquids through a green mechanism. The modern era of ILs stems from the work on "alkylpyridinium" and "dialkylimidazolium" salts, which are used as green solvents.
Development of a Students' Results Monitoring System Model
Sulis Sandiwarno
Information technology has a huge role in today's life; its development is used to help people with their work. The application of information technology in education has a good aim: it can support all educational activities, and the learning process can also generate reports very quickly. This study focuses on the development of models for parents' monitoring of student results. The research method used is OOAD (Object Oriented Analysis and Design). The result of the research is to help make the monitoring of the learning process effective and efficient.
Constant amplitude fatigue crack growth life prediction of 8090 Al-alloy by ANFIS
J. R. Mohanty
The aim of the current work is to predict the fatigue crack growth life of 8090 T651 Al-alloy under the influence of load ratio by applying the adaptive neuro-fuzzy inference system (ANFIS) technique. The model has been trained by the principle of minimizing the root mean square error (RMSE). It has been observed that the proposed model predicts the fatigue crack growth life of the alloy well, with 0.801% deviation from the experimental results.
Hybrid Technique Based Blind Digital Image Watermarking
Shubhangi Pande, Santosh Varshney
Nowadays the Internet is the first-choice channel for the sales and distribution of digital assets, but this worldwide sharing of digital information makes copyright compliance and content management a challenge: digital information can now be used anywhere, very easily, with or without consent. Digital image watermarking is a solution that can add an extra layer of security to digital images. In a digital image watermarking technique, an image or an object is embedded with an information-carrying watermark; the object may also be audio or video. The embedding techniques here are based on a DWT twin-encoding technique and derive their benefits from random sequences generated by Arnold and chaos transformations. Before sending the multimedia data, the proposed system applies a composite partition algorithm and watermark optimization techniques, and the resulting file is encrypted using the RSA algorithm with a public key and sent to the client side. Simulation results show the performance of the image watermarking against different attacks. In the end, the performance of the proposed technique is measured on the basis of PSNR and NCC.
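The two evaluation measures named at the end, PSNR and NCC, can be computed as follows; the pixel values are toy data, and flattened vectors stand in for real 2-D images:

```python
# PSNR and normalized cross-correlation between an original image and its
# watermarked version (toy 8-bit pixel values, flattened).
import math

orig   = [120, 130, 125, 140, 135, 128, 122, 138]
marked = [121, 129, 126, 139, 136, 127, 123, 137]

mse = sum((a - b) ** 2 for a, b in zip(orig, marked)) / len(orig)
psnr = 10 * math.log10(255 ** 2 / mse)    # in dB; higher = less distortion
ncc = (sum(a * b for a, b in zip(orig, marked))
       / math.sqrt(sum(a * a for a in orig) * sum(b * b for b in marked)))
```

Here every pixel differs by one gray level, giving an MSE of 1 and a PSNR around 48 dB; NCC stays close to 1, which is the pattern a robust watermark should preserve after attacks.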
A Review on Dynamic Clustering: Using Density Metrics
Megha S. Mane, Prof. N. R. Wankhade
Clustering of high-dimensional dynamic data is a challenging problem. Within the frame of big data analysis, the computational effort needed to perform the clustering task may become prohibitive, which has motivated the construction of several algorithms and the adaptation of existing ones, such as the well-known k-means algorithm. One critical problem is that k-means, k-medoid, and other such clustering algorithms require a pre-assigned number of clusters k and cannot detect non-spherical clusters. The existing RLClu algorithm needs users to pre-assign two minimum thresholds, on the local density and on the minimum density-based distance. Clustering is the process of data classification when no prior knowledge is required for classification. To overcome these problems, the STClu clustering algorithm is proposed. In this algorithm a new metric is defined to evaluate the local density of each object, which shows better performance in distinguishing different objects. Furthermore, an outward statistical test method is used to identify the clustering centers automatically via a centrality metric constructed from the new local density and the new minimum density-based distance. Dynamic clustering is an approach to obtain and extract clusters in real-time environments; it has many applications, such as data warehousing and sensor networks. There is therefore a need for a technique that handles a data set whose size increases over time as more and more data are added.
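In the spirit of the density metrics discussed (a generic sketch, not the STClu algorithm itself), a local density and a minimum density-based distance can be computed on toy 1-D points:

```python
# Density-metric sketch: local density = number of neighbors within a cutoff
# d_c; density-based distance = distance to the nearest object of higher
# density (toy 1-D points; cluster centers show high density AND high distance).
points = [1.0, 1.1, 1.2, 5.0, 5.1, 9.0]
d_c = 0.5

def density(i):
    return sum(1 for j, q in enumerate(points)
               if j != i and abs(points[i] - q) < d_c)

rho = [density(i) for i in range(len(points))]

def delta(i):
    # distance to the nearest object of strictly higher density;
    # the globally densest objects get the maximum distance by convention
    higher = [abs(points[i] - points[j]) for j in range(len(points))
              if rho[j] > rho[i]]
    return min(higher) if higher else max(abs(points[i] - q) for q in points)

deltas = [delta(i) for i in range(len(points))]
```

The isolated point at 9.0 has zero density but a large density-based distance, which is exactly the combination such methods use to flag outliers versus cluster centers.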
A mobile ad hoc network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected without any wires. In a MANET each node can move in any direction independently. The main challenge in building a MANET is equipping each device to continuously maintain the information needed to properly route traffic. These networks may operate by themselves or may be connected to the larger Internet, and the nodes may contain one or multiple, different transceivers, resulting in a highly dynamic, autonomous topology. Vehicular Ad-hoc Networks (VANETs) use communication between cars, and between cars and roadside equipment. MANETs can be used to facilitate the collection of sensor data for data mining in a variety of applications such as air pollution monitoring, and different types of architectures can be used for such applications.
Implementation of OFDM wireless communication model for achieving the improved BER using DWT-OFDM
M. Praveen, E. Adinarayana, Dr. V. S. R. Kumari
Orthogonal Frequency Division Multiplexing (OFDM) is one of the strong prospects for future wireless communication systems. OFDM is one of the key technologies enabling non-line-of-sight wireless services, making it possible to extend wireless access over wide areas. Multicarrier modulation is a scheme that transmits information by splitting a serial high-rate data stream into a number of low-rate parallel data streams. OFDM is a form of multicarrier modulation that divides the available spectrum into a number of parallel subcarriers, each modulated by a low-rate data stream at a different carrier frequency. Compared to conventional approaches, the 4th-generation Long Term Evolution (LTE) application, formed by the collaboration of OFDM and MIMO, has better spectral efficiency in terms of accuracy and high data rate. Although OFDM has many advantages over FDM, it suffers from inter-carrier interference and inter-symbol interference when multiple carriers are used; these interferences cause loss of orthogonality, and to overcome them the use of a cyclic prefix has become mandatory. But the cyclic prefix has a large negative impact on bandwidth efficiency, as it consumes nearly 20% of the bandwidth, and BER performance is affected as well. In this paper a novel wavelet-based OFDM model is presented which is mainly intended to provide good orthogonality and better spectral efficiency using various modulation techniques. The unique feature of wavelet-based OFDM is that it does not need a cyclic prefix, and the absence of the cyclic prefix increases bandwidth efficiency; as bandwidth increases, spectral efficiency increases simultaneously. Finally, the use of wavelet-based OFDM shows improved BER over the conventional FDM communication model.
The simulation results indicate the advantage of using wavelet-based OFDM in place of DFT-based OFDM in LTE, and finally a comparison between wavelet-based OFDM and DFT-based OFDM is presented.
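A one-level Haar DWT, the kind of transform that replaces the DFT/IDFT pair in wavelet-based OFDM, can be sketched as follows; the signal is a toy example, and only the perfect-reconstruction property is demonstrated:

```python
# One level of a Haar DWT and its inverse (toy signal). The analysis step
# splits the signal into approximation (sums) and detail (differences)
# coefficients; the synthesis step reconstructs it exactly.
import math

def haar_forward(x):
    s = math.sqrt(2)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    s = math.sqrt(2)
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) / s, (a - d) / s]
    return x

sig = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
a, d = haar_forward(sig)
rec = haar_inverse(a, d)
```

Because the transform is orthogonal, the reconstruction matches the input to machine precision; an actual DWT-OFDM chain would map modulated symbols through the inverse transform at the transmitter and the forward transform at the receiver.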
Review of protocols used in Multicasting Communication
Ashish, Jyotsana Pandey
In the age of multimedia and high-speed networks, multicast is one of the mechanisms by which the power of the Internet can be further harnessed in an efficient manner. Multicast services have been increasingly used by various continuous-media applications. Multicasting is the ability of a communication network to accept a single message from an application and deliver copies of the message to multiple recipients at different locations. This paper reviews the role of protocols in multicast communication.
Specific Trait Identification in Margins Using Handwritten Cursive
Syeda Asra, Dr. Shubhangi DC
Margins in handwriting reflect an individual's personality. Margins can be either narrow or wide, with each showing certain personality traits of the writer. A narrow left margin indicates the writer is more attached to his past than to his future; a wide left margin indicates the writer leaves his past behind and keeps moving. A wide right margin indicates the writer may be afraid to take a future step, while a narrow right margin indicates the writer may be willing to take a forward step and may not be experiencing uncertainties at that period of his life. A balanced margin size indicates the writer is a balanced person when it comes to risk taking. The handwriting of 100 adults was submitted for graphological analysis. The graphologist's answers to questions on each subject's personality, her description of his character, and her assessment of his inclination towards past, present, and future were checked against the person's own answers to the questionnaire, the personality descriptions in the case sheets, and the results of the Progressive Matrices Test. In the work carried out, the performance was as high as 95%. The feature extraction was done using Zernike moments.
Sentiment Analysis of Reviews for E-Shopping Websites
Dr. U Ravi Babu
Sentiment analysis is one of the popular research areas in the field of text mining. The Internet has become a very popular resource for information gathering: people can share their opinions about any product, service, or event online. Websites like Amazon, Snapdeal, and Homeshop18 are popular sites where millions of users exchange their opinions, making them valuable platforms for tracking and analyzing opinions and sentiments. "What other people think" is an important piece of information whenever we want to make a decision, and sentiment analysis provides it, yielding important information for decision making in various domains. Various sentiment detection methods are available, and they affect the quality of the results. In this paper we find the sentiments of people regarding the services of e-shopping websites. The main goal is to compare the services of different e-shopping websites and analyze which one is the best. For this we use five large datasets from five different e-shopping websites, containing reviews related to their services. The SentiWordNet dictionary is used to find the score of each word, and sentiments are then classified as negative, positive, or neutral. It has been observed that pre-processing of the data greatly affects the quality of the detected sentiments. Finally, the analysis is carried out based on the classification.
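The word-score-then-classify step can be sketched with a lexicon lookup; the tiny lexicon and reviews below are invented, whereas the paper uses the full SentiWordNet dictionary:

```python
# Lexicon-based sentiment sketch: sum per-word scores and classify the review
# as positive, negative, or neutral (toy lexicon, not SentiWordNet).
lexicon = {"good": 0.6, "fast": 0.4, "excellent": 0.8,
           "slow": -0.5, "bad": -0.7, "late": -0.4}

def classify(review):
    score = sum(lexicon.get(w, 0.0) for w in review.lower().split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

labels = [classify("good fast delivery"),
          classify("late and slow service"),
          classify("package arrived")]
```

Words absent from the lexicon score zero, which is why the pre-processing the abstract mentions (spelling normalization, stemming, stop-word removal) has such a strong effect on the detected sentiment.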
Ashirwad: An extremely cost efficient Swabbing Robot
Rupinder Kaur
This swabbing robot is highly beneficial for cleaning, especially in homes, offices, and industries where cleanliness is a major concern. Many research organizations are busy finding the best outcomes through artificial intelligence, a branch of technology that makes computers think like the human brain. This device sweeps and mops the floor area with a brush and other wiping components, and it also collects dust and other small particles. Mapping is used to direct the movement of this mini device. The device is easy to use, very cost effective, and cleans every corner of the area. Being autonomous, it can work in your absence.
Biology and computer science are two different sciences, but they are sister sciences. In this research I intend to show how biology is used in computation and how, in the future, biology may be used to enhance computer science. The use of biology in computer science is known as biological computation, a subfield of computer science and computer engineering that uses bioengineering and biology to build computers. One such approach is DNA computing, which aims at harnessing individual molecules at the nanoscopic level for computational purposes. Computation with DNA molecules is of interest to researchers in both computing and biology. Due to its vast parallelism and high-density storage capacity, DNA computing approaches are used to solve many problems. DNA has also been explored as an excellent material and a fundamental building block for building large-scale nanostructures, constructing individual nanomechanical devices, and performing computations.
Approaches for Camera Source Identification: A Review
Mr. Amol P. Khapare, Mrs. D. A. Phalk
Identifying the characteristics and the origin of any digital device has become more important in today's digital world through digital forensics. This survey paper studies recent developments in the field of image source identification. The methods proposed in the literature are divided into five broad areas, based on source identification using metadata (Exif), image features, CFA and demosaicing artifacts, lens distortions, and wavelet transforms. The methods and algorithms of the proposed approaches in each category are described in detail, to enable choosing an accurate technique for source camera identification.
Analysis of handwriting patterns of Dyslexic Children vis-à-vis the nondyslexics using Hamming Distance
Dr. Jyotsna Dongardive, Malhar Margaj
Learning disabilities can affect a person's ability to speak, listen, read, write, spell, reason, recall, organize information, and do mathematics. The researcher has tried to study what percentage of children have writing difficulties among all types of learning disability. In this paper, the researcher has focused particularly on dysgraphia. A survey was conducted with a group of learning-disabled children, in which each of them was asked to identify certain words. The study investigates which patterns of text a dysgraphic child finds difficult to understand, and by how much of an error factor a dysgraphic child is distinctly identifiable from a non-dysgraphic one.
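An error factor based on Hamming distance (the measure named in the title) can be sketched on binarized feature vectors; the vectors below are illustrative, not extracted from real handwriting:

```python
# Hamming distance between binarized feature vectors: the fraction of
# mismatched positions serves as an error factor (toy vectors).
def hamming(a, b):
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

reference = [1, 0, 1, 1, 0, 0, 1, 0]   # features of the model word
sample    = [1, 1, 1, 0, 0, 0, 1, 1]   # features of a child's attempt
error_factor = hamming(reference, sample) / len(reference)
```

A sample whose error factor exceeds a chosen threshold would be flagged as distinctly different from the reference; the threshold itself must be calibrated on labeled data.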
Double-Faced Data Hiding Techniques in Images using RIT: A Survey
Mr. Santosh Kale
RDH is the technique of reversible data hiding: the original image can be losslessly recovered after the embedded data is extracted, while the image content is kept confidential. In this survey paper, we study different reversible data hiding methods. All existing methods embed data by reversibly vacating room from the encrypted images, which may result in errors on data extraction or image restoration. The literature survey also shows that difference expansion (DE), interpolation, prediction and sorting, and histogram modification are the generalized methods for hiding data, but these methods were originally implemented only in plain images; in the recent past they have been applied to encrypted images for enhanced security. All these methods have their own advantages, but no single approach is feasible as well as applicable to all cases, such as data hiding, security and privacy, and image recovery. We have concluded that there is a need for high security as well as for maintaining the quality of the original image during transmission and exchange of images.
Given the growing computing abilities of smartphones, idle phones constitute a significant computing infrastructure. For an enterprise that provides its employees with smartphones, we argue that a computing infrastructure leveraging idle smartphones being charged overnight is an energy-efficient and cost-effective alternative to running certain tasks on traditional servers. Every evening, many smartphones are plugged into a power source to recharge their batteries. While parallel execution models and schedulers exist for servers, smartphones face a unique set of technical challenges due to heterogeneity in CPU clock speed, variability in network bandwidth, and lower availability than servers. In this paper, we address a number of these challenges to develop CWC, a distributed computing infrastructure using smartphones. We implement and evaluate a prototype of CWC that utilizes a novel scheduling algorithm to minimize the makespan of a set of computing tasks. We also investigate whether continuous usage of the CPU affects CPU efficiency. Our evaluations using a testbed of 18 Android phones reveal that CWC's scheduler yields a makespan that is 1.6x faster than other, simpler approaches.
Optimizing Information Rate in Polynomial Based Image Secret Sharing Scheme Using Huffman Coding
Nehal Markandeya, Prof. Dr. Sonali Patil
Image secret sharing is a technique to share a secret image among a group of participants. The individual shares are useless on their own, but when a specified number of shares are combined they reveal the original secret image. These image shares have a size greater than or equal to that of the original image, so they require high bandwidth during transmission. Because of the excessive bandwidth requirements, there is a need to reduce the size of the image shares, for which Huffman compression coding is used. The purpose of this paper is to reduce the image share size in Thien and Lin's secret sharing scheme using the Huffman coding technique. The experimental results show high PSNR values, indicating good image quality. Up to 40% size reduction is achieved using the compression technique, which reduces storage space and saves time while transmitting images over the network.
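Thien and Lin's scheme evaluates polynomials over the prime field mod 251; a simplified Shamir-style sketch of the share-and-recover arithmetic for k = 2 (the pixel value and coefficient below are illustrative, and the Huffman compression of the shares is omitted):

```python
# Polynomial image secret sharing sketch for k = 2 over GF(251), the prime
# modulus Thien and Lin use: any two shares recover the pixel by Lagrange
# interpolation at x = 0 (toy values).
P = 251

def make_shares(pixel, coeff, n):
    # degree-1 polynomial f(x) = pixel + coeff*x; share i is (i, f(i) mod P)
    return [(x, (pixel + coeff * x) % P) for x in range(1, n + 1)]

def recover(s1, s2):
    (x1, y1), (x2, y2) = s1, s2
    inv = pow(x2 - x1, -1, P)            # modular inverse (Python 3.8+)
    coeff = ((y2 - y1) * inv) % P        # recovered slope
    return (y1 - coeff * x1) % P         # f(0) = the secret pixel

shares = make_shares(200, 7, 4)          # pixel 200, coefficient 7, 4 shares
pixel = recover(shares[0], shares[2])
```

Any pair of the four shares yields the same pixel, while a single share reveals nothing about it; repeating this per pixel (or per pixel pair, as in the actual scheme) produces the image shares that are then Huffman-compressed.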
An image can be represented with a minimum number of bits by using image compression. When images are transferred over the network, they require space for storage and time for transmission. The present work investigates image compression using block truncation coding. Three algorithms were selected, namely the original Block Truncation Coding (BTC), Absolute Moment Block Truncation Coding (AMBTC), and Huffman coding, and a comparison was performed between them. The BTC and AMBTC techniques rely on dividing the image into non-overlapping blocks; they differ in the way the quantization levels are selected in order to remove redundancy. In Huffman coding, an input image is split into equal rows and columns, and at the final stage the individually compressed images are combined, which not only provides a better result but also keeps the information content secure. It has been shown that image compression using Huffman coding provides better image quality than compression using BTC or AMBTC. Moreover, Huffman coding is considerably faster than BTC.
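The original BTC quantization of a single block replaces the pixels with two levels chosen to preserve the block's mean and standard deviation; the 4x4 block below is invented:

```python
# Basic BTC for one 4x4 block: keep a 1-bit-per-pixel bitmap plus two
# reconstruction levels that preserve the block mean and standard deviation.
import math

block = [45, 50, 60, 70, 52, 55, 65, 80, 40, 48, 62, 75, 44, 51, 63, 78]
n = len(block)
mean = sum(block) / n
std = math.sqrt(sum((p - mean) ** 2 for p in block) / n)
q = sum(p >= mean for p in block)          # pixels at or above the mean
bitmap = [int(p >= mean) for p in block]   # the 1-bit-per-pixel plane
# low level a and high level b preserving the first two moments
a = mean - std * math.sqrt(q / (n - q))
b = mean + std * math.sqrt((n - q) / q)
decoded = [b if bit else a for bit in bitmap]
```

The decoded block has exactly the original mean (and standard deviation), which is the "moment-preserving" property that distinguishes original BTC from AMBTC, where absolute moments are preserved instead.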
Web mining is extracting information from web resources and finding interesting patterns that can be useful from the ever-expanding database of the World Wide Web. Whenever we talk about data, we conclude that there is a huge range of data on the World Wide Web. Due to the heterogeneity and unstructured nature of the data available on the WWW, web mining uses various data mining techniques to discover useful knowledge from Web hyperlinks, page content, and usage logs. Web content mining is a component of data mining. Its main uses are to gather, categorize, organize, and provide the best possible information available on the Web to the user requesting it. This paper presents a preliminary discussion of Web content mining, contributions in the field of web mining, and the prominent successful tools and algorithms.
Finding the Routing Table by Dynamic Processing Method in Distance Vector Routing
Jyotsana Pandey Ashish
Distance vector routing is a routing protocol used in packet-switched networks. It uses distance to decide the best path for forwarding packets. DVRP (Distance Vector Routing Protocol) uses routing hardware to find distances to all nodes within a network. The method relies on two factors, distances and vectors: the distance is the hop count needed to reach the destination, and the vector is the trajectory of the message through the given nodes. DVRP uses the Bellman-Ford algorithm to calculate paths and update routing tables. This paper proposes an approach for calculating and updating routing tables based on dynamic processing.
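The Bellman-Ford table update at the heart of distance vector routing can be sketched as below. The topology, costs and node names are hypothetical, and a real DVRP implementation additionally handles timers, triggered updates and count-to-infinity:

```python
# Sketch of one distance-vector (Bellman-Ford) update round: a node relaxes
# its distance estimates using its neighbours' advertised vectors.
INF = float("inf")

def dv_update(my_table, neighbour_tables, link_cost):
    """One round: D(dest) = min over neighbours n of cost(n) + D_n(dest)."""
    updated = dict(my_table)
    for n, table in neighbour_tables.items():
        for dest, d in table.items():
            cand = link_cost[n] + d
            if cand < updated.get(dest, INF):
                updated[dest] = cand
    return updated

# Node A with neighbours B (link cost 1) and C (link cost 4).
a = {"A": 0, "B": 1, "C": 4}
b = {"A": 1, "B": 0, "C": 2, "D": 5}
c = {"A": 4, "B": 2, "C": 0, "D": 1}
a = dv_update(a, {"B": b, "C": c}, {"B": 1, "C": 4})
print(a)   # {'A': 0, 'B': 1, 'C': 3, 'D': 5}
```

Note how A learns a cheaper route to C through B (cost 3) and a route to a previously unknown node D, purely from the neighbours' vectors.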
Usage of Data Mining Techniques for combating cyber security
Farhad Alam Sanjay Pachauri
Cybersecurity is concerned with protecting computer and network systems from corruption due to malicious software, including Trojan horses and viruses. Securing our network systems is becoming imperative as massive amounts of sensitive information are transmitted across them. In this research paper, data mining applications for cybersecurity are explored. We discuss various cyber-terrorism acts and attacks committed across networks, such as malicious intrusion, credit card fraud, identity theft and infrastructure attacks. Data mining techniques such as classification, anomaly detection and link analysis are being applied to detect or prevent these attacks. Recommendations are made, and suggestions for further study are indicated, which are valuable in finding security breaches.
Evaluating Prediction of Customer Churn Behavior Based On Artificial Bee Colony Algorithm
Riddhima Rikhi Sharma Rajan Sachdeva
In a competitive environment, it becomes necessary to focus on retaining churn-prone customers as well as attracting new ones. Various data mining algorithms have been used to distinguish loyal customers from churners. This paper applies the artificial bee colony (ABC) algorithm to obtain accurate results. The use of ABC with two best values, a local best and a global best, makes it more effective at obtaining satisfactory results. PSO is used to search for the best solution with two best values, pbest and gbest, iterating from initial velocities and positions.
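For reference, the pbest/gbest velocity and position update the abstract alludes to looks roughly as follows. This sketch minimises a toy one-dimensional function; the constants (w, c1, c2), swarm size and iteration count are illustrative, not the authors' settings:

```python
# Minimal PSO sketch: velocity update uses the personal best (pbest) and the
# swarm's global best (gbest), as mentioned in the abstract.
import random

def pso_minimise(f, n_particles=10, iters=50, w=0.5, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(-10, 10) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                       # personal best positions
    gbest = min(pbest, key=f)           # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest + [gbest], key=f)
    return gbest

best = pso_minimise(lambda x: x * x)
print(best)   # close to the minimiser at 0
```

In the churn setting, the position would encode model parameters and f would be a classification error measure rather than this toy quadratic.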
Data Hiding System with Mosaic Image for Protected Communication
Manjunatha N Mrs. Reshma M
A new secure image transmission technique is proposed, which transforms a given large-volume secret image into a secret-fragment-visible image, called a mosaic image, of the same size. The mosaic image, which looks like an arbitrarily selected target image and may be used as a camouflage of the secret image, is produced by dividing the secret image into fragments and transforming their color characteristics to match those of the corresponding blocks of the target image. Skillful techniques are designed to conduct the color transformation process so that the secret image may be recovered losslessly. A scheme for handling overflows/underflows in the converted pixel color values by recording the color differences in the untransformed color space is also proposed. The information required for recovering the secret image is embedded into the created mosaic image by a lossless data hiding scheme using a key. Good experimental results demonstrate the feasibility of the proposed technique. Index Terms: color transformation, data hiding, image encryption, secure image transmission, mosaic image
Recursive Formulation Based Parallel Self Timed Adder Design Based Architectural Design and CMOS Implementation Approach
Badavath Ravikumar, Fouziya Yasmeen
As technology scales down to lower nanometer values, power, delay, area and frequency become important parameters for the analysis and design of any circuit. In this paper, an asynchronous parallel self-timed adder based on a recursive formulation for performing multi-bit binary addition is implemented. A practical implementation is provided along with a completion detection unit. The implementation is regular and does not have the practical limitation of high fanout. A high fan-in gate is required, but this is unavoidable for asynchronous logic and is managed by connecting the transistors in parallel. Simulations have been performed using industry-standard toolkits that verify the practicality and superiority of the proposed approach over existing asynchronous adders.
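In software terms, the recursive formulation can be modelled as repeated half-adder passes that iterate until the carry word becomes zero, which is the condition the hardware's completion detection unit senses asynchronously. This is an illustrative behavioural model, not the CMOS implementation:

```python
# Behavioural sketch of the recursive-addition idea: start from half-adder
# sum/carry terms and iterate
#   S <- S XOR C,  C <- (S AND C) << 1
# until the carry word is zero; the final S is the sum.
def recursive_add(a, b):
    s, c = a ^ b, (a & b) << 1      # initial half-adder outputs
    rounds = 0
    while c:
        s, c = s ^ c, (s & c) << 1
        rounds += 1
    return s, rounds

total, rounds = recursive_add(0b1011, 0b0110)   # 11 + 6
print(total, rounds)   # 17 3
```

The number of rounds depends on the longest carry chain, which is why average-case completion in a self-timed adder can beat fixed worst-case timing.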
Behaviour Analysis of STREE, SABR and SARDS under Different Simulation Environments: A Case Study
Roopashree H.R., Dr. Anita Kanavalli
The Wireless Sensor Network (WSN) has been a continuous target for research because of its potential use in data collection in hostile and unsecured environments. Various techniques and methods have been explored by researchers for minimizing energy consumption and maximizing security in WSNs, but this remains a key area for further work. In the future, WSNs will be connected to the Internet of Things (IoT), so researchers need to study the behavior of proposed techniques in different simulation environments. In this paper we present a case study showing how our earlier proposed techniques, i.e. STREE, SABR and SARDS, behave in different simulation environments.
Research on a Cipher Chip for the Encryption of Sensor Data: A Survey
Shali Sara Abraham , Supriya L.P
A cipher chip for the encryption of wireless sensor data is proposed to enhance the security of multimedia wireless sensor data and speed up data processing. The chip can encrypt and decrypt the corresponding plaintext and ciphertext with the help of a key, which is distributed randomly to keep the data secure. This paper gives details of the design of the chip: its design principles, how the key is randomly distributed, and an analysis of its working capability with the help of experiments. With the cipher chip, data can be encrypted and decrypted precisely. Realizing the encryption and decryption process in hardware speeds up processing and secures the information. The chip can easily be attached to sensor nodes, helping to reduce the size of the nodes and their power consumption.
The present demand on mobile devices is to save energy while expanding multimedia services. Unfortunately, battery lifetimes have not been extended as much as would be desirable, so cutting down the energy consumed by each task these devices perform is essential. JOKER is introduced as an opportunistic routing protocol. It brings novelties to both the candidate-selection and coordination phases, which permit the deployment of a network supporting multimedia traffic while increasing the nodes' energy efficiency. A comparison of JOKER and BATMAN under different workloads, such as video streaming, is presented in this paper, demonstrating JOKER's superiority in terms of QoE while reducing the power drained by routing tasks.
Foundation of Ethics and Practice in Amitābha Cult: Modern Social Perspective
Le Trong Nghia
Ethics is one of the most important areas of philosophy, studied and analyzed not only in science but also in religion. The value and benefit of ethics can help one to have good relations with the other members of society. The aim of secular ethics is to teach us how to live, work and treat others in the best way. The aim of religious ethics is to help us become not only a good person in this life but also to attain holiness in the present time and the future. Therefore, ethics is very important in any religious teaching. Each religious teaching has its own ethical theory, which stresses the importance of ethics for attaining the final goal. In this paper, I discuss the Buddhist ethical theory and its practical application in the Amitābha Cult.
Maximization Influence Flow Using Modified Greedy Algorithm
Jyoti Rani, Prof. Ashwani Sethi
The influence maximization problem is that of finding a set of seed nodes in a social network through which the spread of influence can be maximized. These seed nodes can be used in online marketing to maximize profit, since marketers can reach particular groups of customers through the online market. In this paper we modify the greedy algorithm so as to improve the results of influence maximization.
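A minimal sketch of the standard greedy seed-selection loop is shown below, estimating spread under the independent cascade model by Monte Carlo on a made-up graph. The paper's specific modification is not reproduced here; this is the baseline the modification builds on:

```python
# Greedy influence maximization sketch: repeatedly add the node with the
# largest estimated marginal spread under the independent cascade (IC) model.
import random

def simulate_ic(graph, seeds, p, rng):
    """One independent-cascade run; returns the set of activated nodes."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return active

def greedy_seeds(graph, k, p=0.3, runs=200, seed=0):
    rng = random.Random(seed)
    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    seeds = []
    for _ in range(k):
        def gain(u):
            return sum(len(simulate_ic(graph, seeds + [u], p, rng))
                       for _ in range(runs)) / runs
        candidates = sorted(nodes - set(seeds))
        seeds.append(max(candidates, key=gain))
    return seeds

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["e"]}
seeds = greedy_seeds(g, 2)
print(seeds)   # the hub 'a' is selected first
```

Each candidate is scored by averaging many cascade simulations, which is exactly the expensive step that modified greedy variants try to cut down.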
Mining Web Graphs for Large Scale Meta Search Engine Results
Bandi Krishna Dr.V.B.Narasimha
The Web is huge, and getting efficient results from a search engine is a difficult task. A meta search engine (MSE) is a search tool that sends user requests to several search engines or databases, aggregates the results, merges or re-ranks them into a single list, and displays them to users with the help of web graphs. An MSE enables users to enter a search query once and access several search engines simultaneously. This saves the user a lot of time compared to using multiple search engines separately, by initiating the search at a single point. Today, most MSEs employ only a small number of general-purpose search engines. Building a large-scale MSE using numerous specialized search engines is an area that deserves more attention. The challenges arising in building a very large-scale MSE include the automatic generation and maintenance of the high-quality search engine representatives needed for efficient and effective search results, and highly automated techniques to be added to the MSE. In this paper, we study how to merge the search results returned by multiple component search engines into a single ranked list using web graphs. Web graphs are essential for producing effective and efficient results.
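As one hedged example of result merging (a simple baseline, not necessarily the web-graph method studied in this paper), Borda-count fusion sums rank-based votes from each component engine:

```python
# Borda-count fusion of ranked result lists: each engine awards points by
# rank position, and the merged list is sorted by total points.
from collections import defaultdict

def borda_merge(ranked_lists):
    """Merge several ranked result lists into one by summed Borda scores."""
    scores = defaultdict(int)
    for results in ranked_lists:
        n = len(results)
        for rank, url in enumerate(results):
            scores[url] += n - rank      # top result gets the most points
    return sorted(scores, key=lambda u: (-scores[u], u))

# Hypothetical top-3 lists from three component engines.
engine1 = ["u1", "u2", "u3"]
engine2 = ["u2", "u1", "u4"]
engine3 = ["u2", "u3", "u1"]
merged = borda_merge([engine1, engine2, engine3])
print(merged)   # ['u2', 'u1', 'u3', 'u4']
```

A web-graph based merger would instead propagate scores along link structure, but the aggregation-then-rerank pipeline is the same.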
The most common computer authentication method is to use alphanumeric usernames and passwords. This method has been shown to have significant drawbacks. For example, users tend to pick passwords that can be easily guessed; on the other hand, if a password is hard to guess, it is often hard to remember. To address this problem, some researchers have developed authentication methods that use pictures as passwords, known as graphical passwords. This paper adds a layer of security to the normal textual password by using a graphical password to authenticate the user. As graphical passwords are vulnerable to shoulder-surfing attacks, a one-time generated password is sent to the user's mobile device. Using the instant messaging service available on the internet, the user obtains a One Time Password (OTP). The OTP describes the items in the image that the user must click. Users authenticate themselves by clicking on the various items in the image based on the information sent to them. Additionally, the scheme provides accessibility to visually impaired people.
The rationing distribution system, also called the public distribution system, distributes food items to the poor. Major commodities include rice, wheat, sugar and kerosene. In this system, QR codes are provided instead of the current ration cards. The users' database provided by the Government is stored. The smart card must be scanned by the customer to show the details of the items allocated by the government; the system then checks the customer's details against the stored data to distribute material in the ration shop. Biometric fingerprint scanning is used for security and authentication.
Development of Embedded Application for Receiver Processor Unit
B Charan Singh, Vandana Khare
Electronic Warfare (EW) systems are used to meet the tactical and strategic requirements of users on different platforms such as ships, aircraft, helicopters, ground vehicles and submarines. The ES system has broadband and narrowband operations to meet the tactical and strategic needs of EW systems. Broadband operation is achieved by employing a wide-open receiver, which achieves 100% Probability of Intercept (POI), whereas the narrowband receiver helps achieve higher sensitivity and a better Range Advantage Factor (RAF) against Low Probability of Intercept (LPI) radars. The Receiver Processor Unit (RPU) generates the controls for the Digitally Tuned Oscillator (DTO) and the switches in the Built-In Test Equipment (BITE) distribution network based on the BITE command received from the System Controller Display (SCD). The output of the RF source is fed to the programmable attenuator to adjust the power levels required to check the dynamic range of the receiver. In this paper, the authors report the development of an embedded application for the RPU to generate the controls for various devices such as the switch filter bank and homodyne receiver. The embedded application was thoroughly tested for its functionality using the given lookup table, and the results were found to be satisfactory.
Survey on Efficient Certificateless Access Control for Wireless Body Area Networks
Jeena Sara Viju Sruthy S
Recent developments and advances in the field of wireless communication have led to the emergence of Wireless Body Area Networks (WBANs). Wearable computing devices have made it easy to monitor the health of patients. WBANs are widely used, taking into account their numerous advantages, and many publications describe their various challenges. In this paper, a survey of the current state of the art of WBANs is performed based on the latest standards and publications. Open issues and challenges within each area are also explored as a source of inspiration for future developments in WBANs.
A Literature Survey on Password Extraction via Reconstructed Wireless Mouse Trajectory
Divya S, Anju J Prakash
Since mouse movement data are not encrypted, secret information such as passwords can be leaked through the displacement of the mouse. Two attacks are considered here, the prediction attack and the replay attack, which can rebuild on-screen cursor trajectories by sniffing mouse movement data. Two inference strategies are used to discover passwords from the cursor trajectories. This work surveys the different techniques and tools for privacy leakage from raw mouse data.
Statistics is defined as the science of collecting, analyzing and presenting data. Data mining is a newer discipline lying at the interface of statistics, database technology, pattern recognition, machine learning and other areas. KDD has a flavor that comes from database methodology and from computing with large data sets, while statistics has an emphasis that comes from mathematical statistics and from practical statistical analysis with small data sets. Statistical techniques are driven by the data and are used to discover patterns and build predictive models. From the user's perspective, there is a conscious choice, when solving a "data mining" problem, of whether to attack it with statistical methods or with other data mining techniques. However, since statistics provides the intellectual glue underlying the effort, it is important for statisticians to become involved. In this sense, KDD is statistics and data mining is statistical analysis; "Knowledge Discovery in Databases" is not much different. The main statistical issue in data mining (DM) and knowledge discovery in databases (KDD) is to examine whether the traditional statistical approach and methods differ substantially from the new trends of KDD and DM.
Survey on Face Recognition in the Scrambled Dataset by Using Many Kernels
Sreehara .B, Lashma .K
Facial expressions are an important factor in modern human-interaction systems, which distinguish between human expressive states at different times. Scrambling or encrypting the data in a captured image offers a route to an enhanced system. Considering the Internet of Things (IoT) oriented distribution of images/video, face encryption is a major part of privacy protection, and biometric verification is required to access the encrypted data. After the encryption procedure, the face models become chaotic signals. Traditionally, the data can be recovered using various face recognition methods. This survey paper presents the different techniques used to recover the face from the decrypted data using multiple kernel methods.
Survey on Integrated Design of Smartphone Interface with Multiple Image Processing Methods
Lija John, Vani V Prakash
The direct light from a digital projector is harmful to the speaker's face when the speaker faces the projector. A digital projector combined with a smartphone is therefore used to reduce the light shining from the projector and thus protect the speaker's face. First, the smartphone captures the speaker's location, including the projector screen. The captured image then undergoes preprocessing: face detection, background differencing, skin-color recognition and ROI extraction. After preprocessing, the location of the speaker's face is detected, and the smartphone superimposes a black mask over the speaker's face [1].
Survey on Monitoring Aquatic Debris Using Smartphone-Based Robots
Sreelekshmi .B, Vidya .N
Marine debris is an important environmental issue. It poses threats to our ecosystems, water transport and the lives of living things on both land and water, so monitoring aquatic debris matters to the entire world. Numerous techniques have been implemented for monitoring aquatic debris. The ultimate goal of this paper is to discuss existing methods for observing debris in aquatic environments.
Survey on a Lightweight Authenticated Communication Scheme for Smart Grid
Sneha .U Liji Samuel
A "smart grid" encompasses various applications and measurement units, including smart meters. Smart meters collect daily consumption data, and that data is encrypted. The main aim of this paper is to compare existing systems with the Lightweight Authenticated Communication (LAC) scheme and the various concepts LAC uses for secure communication and data exchange. This survey presents the different techniques used in the LAC scheme for secure communication and data exchange.
Adaptive Harmonic Elimination in a Grid-Connected Three-Phase PV Inverter
G Chandra Sekhar P Rama Krishna
In this paper, a simple three-phase grid-connected photovoltaic (PV) inverter topology is considered, consisting of a boost section, a low-voltage three-phase inverter with an inductive filter, and a step-up transformer interfacing the grid. Ideally, this topology will not inject any lower-order harmonics into the grid due to its high-frequency pulse width modulation operation. However, non-ideal factors in the system, such as the core-saturation-induced distorted magnetizing current of the transformer and the dead time of the inverter, contribute a significant amount of lower-order harmonics to the grid current. A novel design of inverter current control that mitigates lower-order harmonics is presented in this paper. An adaptive harmonic compensation technique and its design are proposed for lower-order harmonic compensation. In addition, a proportional-resonant-integral (PRI) controller and its design are also proposed. This controller eliminates the dc component in the control system, which introduces even harmonics into the grid current in the topology considered. The dynamics of the system due to the interaction between the PRI controller and the adaptive compensation scheme are also analyzed. The proposed system is simulated on the MATLAB/Simulink platform.
A Grid Connected Three Phase PV Quasi-Z-Source Inverter with Double Frequency Ripple Suppression Control
SivaSiddhartha Chalasani Ch Chinna Veeraiah
In a grid-connected three-phase photovoltaic (PV) system, there is a double-frequency power mismatch between the dc input and ac output. The double-frequency ripple (DFR) energy needs to be buffered by a passive network; otherwise, the ripple energy flows into the input side and adversely affects the PV energy harvest. In a conventional PV system, electrolytic capacitors are usually used for this purpose due to their high capacitance. However, electrolytic capacitors are considered to be among the most failure-prone components in a PV inverter. In this project, a capacitance reduction control strategy is proposed to buffer the DFR energy in three-phase Z-source/quasi-Z-source inverter applications. Without any extra hardware components, the proposed control strategy can significantly reduce the capacitance requirement and achieve a low input-voltage DFR. Consequently, highly reliable film capacitors can be used. The increased switching device voltage stress and power loss due to the proposed control strategy are also discussed. The proposed system is simulated on the MATLAB/Simulink platform.
Raspberry PI 3 Based Control and Monitor Remote Machine Automation
Rekha Onkar Nehete Prof. A.S. Bhide
The whole world is moving toward automation, and people live more comfortably and easily as many advanced appliances become available in the market. As technology expands, more energy is required to operate these appliances, so energy conservation is a key need nowadays: appliances should be switched off when not needed. Machines at remote places need to be controlled and monitored from elsewhere; in large industries, machines may be located anywhere globally, and a person cannot always go there to operate and monitor them. This project therefore builds a system which can control and monitor the functioning of machines globally over the internet, i.e. the Internet of Things. Here the Raspberry Pi acts as the main controller, which obtains input from the user through the internet and takes appropriate action. The Raspberry Pi controller is connected to the appliances and machines through relays. The Weaved cloud service enables a secure internet connection through which many users can access the remote machines; it binds the public IP address of the Raspberry Pi to a fixed IP address provided by the server.
Detection of File Level and Block Level Deduplication and attacks in Cloud Computing Environment
Ms. Samita Mokal Prof. Nilima D. Nikam Prof. Vaishali Londhe
In cloud computing, security and storage space management techniques are the most important factors for improving performance. Secure deduplication is a technique for eliminating duplicate copies of stored data while providing security for that data, reducing storage space and upload bandwidth in cloud storage. The basic idea in this paper is that we can eliminate duplicate copies of stored data and limit the damage of stolen data if we decrease the value of the stolen information to the attacker. Dekey is a new construction in which users do not need to manage any keys on their own but instead securely distribute the convergent key shares across multiple servers to guard against insider attackers. User profiling and decoys then serve two purposes: first, validating whether data access is authorized when abnormal information access is detected, and second, confusing the attacker with bogus information.
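File-level deduplication can be sketched as content hashing: identical contents map to the same digest and share one stored blob. The file names and contents below are made up, and key management (such as Dekey's convergent key shares) is omitted from this sketch:

```python
# Sketch of file-level deduplication via content hashing: each unique
# content is stored once; files with identical bytes share one blob.
import hashlib

class DedupStore:
    def __init__(self):
        self.blobs = {}      # digest -> content (stored once)
        self.files = {}      # filename -> digest

    def put(self, name, content: bytes):
        digest = hashlib.sha256(content).hexdigest()
        self.blobs.setdefault(digest, content)   # skipped if duplicate
        self.files[name] = digest

    def get(self, name):
        return self.blobs[self.files[name]]

store = DedupStore()
store.put("report_v1.doc", b"quarterly numbers")
store.put("report_copy.doc", b"quarterly numbers")   # duplicate content
store.put("notes.txt", b"meeting notes")
print(len(store.files), len(store.blobs))   # 3 files, 2 stored blobs
```

Block-level deduplication applies the same idea to fixed or variable-size chunks of each file rather than whole files, catching partial overlaps at the cost of more index entries.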
Brain MR Image Segmentation for Tumor Detection using Artificial Neural Network
Prof. Vrushali Borase Prof. Gayatri Naik Prof. Vaishali Londhe
Brain cancer is a very serious disease causing the deaths of many individuals, so a detection and classification system must be available for diagnosis at early stages. Cancer classification has been one of the most challenging tasks in clinical diagnosis. At present, cancer classification is done mainly by examining the cells' morphological differences, which do not always give a clear distinction between cancer subtypes; unfortunately, this may have a significant impact on whether a patient can be cured effectively. This paper presents a system which uses computer-based procedures to detect tumour blocks and classify the type of tumour using an Artificial Neural Network algorithm on MRI images of different patients. Medical image processing is a fast-growing and challenging field, and its techniques are widely used for medical diagnosis. Detecting a brain tumour using image processing involves four stages, namely image pre-processing, image segmentation, feature extraction, and classification. Image processing and neural network techniques are used to improve the performance of detecting and classifying brain tumours in MRI images.
An Overview of Distributed System for Wireless Sensor Network
Nalin Chaudhary Abhishek Choudhary Manoj Kumar
In this paper we present an overview of distributed systems for wireless sensor networks. In terms of computation, WSN localization algorithms can be classified into centralized and distributed schemes, and each category is further divided into corresponding methods for solving the localization problem. In the centralized scheme, sensor nodes send control messages to a central node whose location is known. We describe what a distributed system and a wireless sensor network are, and discuss the relation between distributed computing theory and sensor network applications.
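As a concrete example of a centralized localization step, the central node can trilaterate an unknown sensor from three anchors' known positions and measured ranges. The linearised two-equation solve and the synthetic coordinates below are illustrative, not taken from any particular scheme in the survey:

```python
# Trilateration sketch: subtracting the three range-circle equations
# pairwise yields two linear equations A p = b in the unknown (x, y).
import math

def trilaterate(anchors, dists):
    """anchors: [(x1,y1),(x2,y2),(x3,y3)]; dists: measured ranges to each."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21            # Cramer's rule for the 2x2 system
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.dist(a, true_pos) for a in anchors]   # ideal noise-free ranges
est = trilaterate(anchors, dists)
print(est)   # approximately (3.0, 4.0)
```

With noisy ranges and more than three anchors, a least-squares version of the same linear system is used instead of the exact solve.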
Order-preserving encryption (OPE) is an efficient tool for encrypting the relevance scores of an inverted index for ranked search over encrypted cloud data. Cloud computing has grown rapidly over the last few years, so its security is a major issue. Existing systems are based on searchable encryption (SE), which is popular for securing data in cloud computing. Considering the large number of data users and documents in the cloud, it is necessary for the search service to allow multi-keyword queries and to provide results based on similarity ranking for effective data retrieval. The proposed system designs secure methods of probabilistic order-preserving encryption and schemes for search over encrypted data.
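For intuition only, a toy (and insecure) order-preserving mapping shows why ranking still works over ciphertexts; this is emphatically not the probabilistic OPE scheme proposed here:

```python
# Toy order-preserving mapping: a keyed, strictly increasing random table
# preserves < between plaintext relevance scores, so the server can rank
# encrypted scores without learning their actual values.
import random

def ope_table(key, domain):
    """Assign each domain value a strictly increasing pseudorandom code."""
    rng = random.Random(key)
    code, table = 0, {}
    for x in sorted(domain):
        code += rng.randint(1, 100)   # strictly positive step keeps order
        table[x] = code
    return table

enc = ope_table(key=42, domain=range(11))   # toy score domain 0..10
scores = [3, 9, 1, 7]
ranked = sorted(scores, key=lambda s: enc[s], reverse=True)
print(ranked)   # same order as a plain descending sort: [9, 7, 3, 1]
```

Real OPE schemes must also hide the plaintext distribution, which is what the probabilistic variants this paper pursues are designed to improve.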
Ashirwad: A Low-Cost, Efficient ADFC (Autonomous Dry Floor Clean-up) Robot
Rupinder Kaur, Kamalpreet Kaur Harpreet Kaur
Artificial intelligence is reaching local areas such as homes, offices, hospitals and other public places at a rapid pace. Robots are stepping forward to do the tasks of their creators, human beings. Many organizations are exploring this field, in which design, construction, operation and application are equally important. These mechanical beings are programmed to perform various tasks with close to 100% efficiency. This paper focuses on a simplified approach to reduce the cost and improve the efficiency of the robot so as to serve the community better. Rotating brushes and simultaneous mapping of the path give a better clean-up than previous cleaners, which are costly yet use static brushes and random paths. The design eliminates the use of vacuum cleaners, as they are expensive. The rotating brushes projecting at the front collect the mess into a dustbin attached at the bottom of the robot. Stepper motors are used for the mobility of the wheels and brushes. An ultrasonic sensor is used for obstacle detection; it gathers information from the surroundings to help the robot react. The robot is programmed to reach every part of the floor. Its wiping brush rotates to give better results than a static one. A graphical user interface is provided to switch between different speed modes.