Digital Signature Authentication for Android Smart Phones
Spandana.M, Bhaskar.T
Due to innovations in mobile technologies, gestures play an important role in authentication on mobile devices. With the increasing popularity of such authentication, security risks are also growing, as is evident from incidents of unauthorized access of various kinds. It is therefore essential to have a strong mobile-based authentication system. Gestures allow users to interact with an app by manipulating the screen objects it provides. Security systems are rapidly being developed, as are solutions such as remote control systems. However, even with these solutions, serious problems can still result after a mobile device is lost. In this work, we present an upgraded lock-screen system that supports convenient user authentication and provides good security for smartphones, and we suggest an upgraded authentication system for Android smartphones.
Encoding is the process of converting data into a format required for further information processing; it uses a code to change original data into a form that can be used by an external process. Interleaving is a method of making a system more efficient, fast and reliable by arranging data in a non-contiguous manner. Interleaving divides memory into small chunks and is used as a high-level technique to solve memory issues for motherboards and chips; by increasing the bandwidth with which data can access chunks of memory, the overall performance of the processor and system increases. Modulation is the addition of information (the signal) to an electronic or optical signal carrier, and can be applied to direct current (mainly by turning it on and off), to alternating current, and to optical signals. To minimize the impact of an unintentional disruption, it is important to identify its presence. Jamming makes itself known at the physical layer of the network, more commonly known as the MAC (Media Access Control) layer. The increased noise floor results in a degraded signal-to-noise ratio, which will be indicated at the client. It may also be measurable from the access point, where network management features should be able to report noise floor levels that exceed a predetermined threshold. From there the access points must be dynamically reconfigured to transmit on a different channel in reaction to the disruption identified at the physical layer.
In application-based WSN scenarios, the energy and bandwidth of the sensors are valuable resources and must be consumed efficiently. Data collection at the sink by individual sensor nodes results in flooding of requests, which leads to maximum energy use. To reduce this problem, a new data aggregation technique is used in this paper, which enhances the performance of the iLEACH scheme by utilizing LZW compressive sensing based inter-cluster data accumulation, also called a hybrid data aggregation strategy, where nodes are grouped on the basis of the available clustering procedures and cluster heads are likewise grouped so as to exploit cluster-head-based data aggregation. The proposed technique helps to reduce energy consumption and to transmit the aggregated data in an efficient way. It has been designed and implemented in MATLAB while taking various WSN issues into account. The comparison with iLEACH and LEACH shows the effectiveness of the proposed technique.
Enhancing Massive Data Analytics with the Hadoop Ecosystem
Misha Sheth, Purna Mehta, Khushali Deulkar
With the advent of the Information Age, organizations are establishing their virtual presence, doing away with static locations and easing into a presence on the cloud. There is a rise in the volume of data in various industries, and distributed storage of this data demands efficient parallel processing. With the arrival of big data also came the realization that this data can be utilized for intelligent decision making. In addition to the complexity of the tools and infrastructure required to manage huge volumes of data, there is an urgency to identify technologies that can properly take advantage of these volumes. Big data evolution is driven by fast-growing cloud-based applications developed using virtualized technologies. Tools to enable data processing in a distributed environment subsequently emerged, leading to the MapReduce framework. In this paper, we survey the various technologies that implement this framework, with an emphasis on the components of Apache's Hadoop Ecosystem: Pig, Hive and JAQL, and their uses in data analytics.
Every user reminder application available in the market for mobile phones is time- and date-based, and the user has to make a manual typed entry to set up a reminder. While the reminder is on, the device constantly tries to match the device time and date with the saved reminder time and date, and the user is alerted by a text pop-up and an optional alarm sound if they match. But when setting up many reminders the user may not know the time and date; instead, he knows the location where he wants the reminder. Some reminders are tied to a location rather than to a particular date or time. In this project, we have tried to design an application that helps the user set up reminders easily using voice commands and that alerts him about the reminder when he enters the geographical sector specified in the reminder.
New Active Power Filter Topology for Grid Support and Harmonic Mitigation in Interconnecting Renewable Power Generation Systems
Maloth Laxman , Nomula Ganesh
Renewable generation affects power quality due to its nonlinearity, since solar generation plants and wind power generators must be connected to the grid through high-power static PWM converters. This paper presents an active power filter implemented with a four-leg voltage-source inverter. A novel predictive control scheme is implemented to control the inverter. The main features of this control scheme are: 1) to predict the future behavior of the variables to be controlled, with the controller using this information to select the optimum switching state to be applied to the power converter; 2) to control the power converter so as to inject the power generated from the RES into the grid; and 3) to act as a shunt APF to compensate current unbalance, load current harmonics, load reactive power demand and load neutral current. The compensation performance of the proposed active power filter and the associated control scheme under steady-state and transient operating conditions is demonstrated through simulation results.
Gesture Recognition for 3D Human Computer Interface: A Review
Ramandeep Kaur, Vijay Laxmi
In today’s world every individual is surrounded by smart technology, and everyone needs technology they can interact and work with. Research is ongoing to develop new man-machine interfaces that are more user friendly. Touch-based interaction has become a common method to operate mobile phones and tablet PCs. Another gesture recognition technology uses a camera that reads the movements of the human body and communicates the data to a computer, which uses the gestures as input to control devices or applications. Gesture recognition enables humans to interface with the machine (HMI) and interact naturally without any mechanical device, i.e. without any physical contact. This paper presents a survey of the different gesture recognition approaches currently in use and under research.
Sentiment analysis or opinion mining aims to use automated tools to detect subjective information such as opinions, attitudes, and feelings expressed in text. An important part of our information-gathering behavior has always been to find out what other people think. With the growing availability and popularity of opinion-rich resources such as personal blogs and online review sites, new challenges and opportunities arise as people now can, and do, actively use information technologies to seek out and understand the opinions of others. This survey covers techniques and approaches that are used for sentiment analysis.
A Survey on Various Watermarking Techniques for Video Data Copyrighting
Sakshi Batra , Dr. Rajneesh Talwar
As the last few years have witnessed blistering growth in video coding technology, one of the prevailing fields of interest is to build a framework with authentication and copyright protection methodology embedded within an efficient video codec. In this paper, we survey the available and accessible video watermarking techniques. In the recent past, there has been a discernible acceleration in information exchange over the web and in the broad utilization of digital media. The mounting interest in digital watermarking throughout the last decade is undeniably due to the growing need for copyright protection, which has driven an enormous rise in the applications of video watermarking in copy control, broadcast monitoring, fingerprinting, video authentication, copyright protection, etc. The main aspects of such techniques are information hiding, capacity, security and robustness. Robustness is an essential element of video watermarking techniques, since in its presence it is not possible to eliminate the watermark without severe degradation of the cover content. In this paper, we introduce the fundamentals of video watermarking and the attributes needed to design a robustly watermarked video for valuable applications, and we focus on the various domains of video watermarking techniques.
Decision Tree Based Denoising And Enhancement In Images
Remya P V, Anoop T R
Images are often corrupted by noise during image acquisition and transmission. In this paper, an efficient denoising scheme and its VLSI architecture for the removal of such noise are proposed. To achieve low complexity, a decision-tree-based impulse noise detector is used to detect the noisy pixels, and an edge-preserving filter reconstructs the intensity values of those pixels. Furthermore, histogram equalization is used to enhance the quality of the images. The performance is comparable to that of higher-complexity methods, while the design requires only low computational complexity and is suitable for many real-time applications. The final architecture therefore consists of denoising and image enhancement. The method is then applied to different types of noise, and the results for images corrupted by impulses and for the equalized images are compared using peak signal-to-noise ratio, mean square error and structural content.
There is an increasing demand for high bandwidth and high data rates without loss of information in any communication system, and optical communication systems can meet almost all of these requirements. For long-distance transmission, optical amplifiers such as the Erbium-doped fiber amplifier (EDFA), semiconductor optical amplifier (SOA), Raman amplifier and parametric amplifier are used at the receiver side. It is very important that the parameters of these amplifiers are analyzed in such a way that they perform well for the specific optical communication application. The pumping scheme at the receiver side is also a potential candidate for 'last mile' transmission. In this paper, the performance of the Erbium-doped fiber amplifier is analyzed for parameters such as bit rate, bit error rate, fiber length, signal-to-noise ratio (SNR), Q-factor, gain and noise figure in the OptSim 5.0 software environment, in the L-band (1520-1610 nm) and at higher bit rates, to obtain an optimum signal at the receiver side; the effect of changing various parameters on system performance is also shown in order to improve optical communication efficiency.
Approaches For Automated Detection And Classification Of Masses In Mammograms
C. Rekha, G. Gayathri
Breast cancer is one of the most common cancers among women around the world. Several techniques are available for the detection of breast cancer, and mammography is one of the most effective tools for early detection. The goal of this research is to increase the diagnostic accuracy of image processing and machine learning techniques for optimum classification between normal and abnormal regions in digital mammograms. GLCM texture feature extraction is known to be one of the most common and powerful techniques for texture analysis. This paper presents an evaluation and comparison of the performance of two different classification methods used to classify normal and abnormal patterns. The experimental results suggest that the Artificial Neural Network outperforms the other method.
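To make the GLCM step concrete, a minimal sketch follows; it computes a few Haralick-style texture statistics with scikit-image (the function names graycomatrix/graycoprops assume scikit-image 0.19 or newer) and is illustrative rather than the authors' implementation, with the patch and feature choices assumed.

```python
# Illustrative sketch only: GLCM texture features of a mammogram patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch):
    """Return a small Haralick-style feature vector for an 8-bit grayscale patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Example with a synthetic patch; real use would crop a region of interest
# from the mammogram before feeding the features to an ANN classifier.
patch = (np.random.rand(64, 64) * 255).astype(np.uint8)
print(glcm_features(patch))
```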
Electronic-voting approach with an open cloud computing architecture
Mohan Reddy Palugulla
The concept of cloud computing has passed into every area and into the academic lexicon in an ambiguous manner, as cloud dust is being sprinkled on an excess of emerging products. Cutting through the complexity and guarding against the caprice of the moment, this paper examines the notion behind the hype of cloud computing and evaluates its relevance to electronic government and electronic voting information systems. The adoption of a cloud computing approach for electronic voting solutions is investigated, reviewing the architecture within the previously described context. Taking a further step, this paper proposes a high-level implementation of an electronic voting system supported by a cloud computing architecture and cryptographic technologies, and additionally identifies issues that require further research.
On the Use of Side Information for Text Mining using Clustering and Classification Techniques-A Survey
Subamanikandan A, Arulmurugan R
In many text mining applications, side information is available along with the text documents. Such side information may be of different kinds, such as links in the documents, document provenance information, user-access behavior from web logs, or other non-textual attributes. Such attributes may contain a large amount of information useful for clustering purposes. However, their relative importance is difficult to estimate, especially when some of the information is noisy. In such cases, it can be risky to incorporate side information into the mining process, because it can either improve the quality of the representation for the mining process or add noise to the process. In this paper, we design an algorithm that combines classical partitioning algorithms with probabilistic models in order to create an effective clustering approach. We then show how to extend the approach to the classification problem.
A Survey of Cloud Computing Service and Security issues
Amit Kumar Jha, Divakar Singh
This paper surveys the concepts of cloud computing and its security issues. Cloud computing is considered one of the emerging arenas of computer science in recent times, and it provides excellent facilities to business entrepreneurs through flexible infrastructure. Although cloud computing is facilitating the Information Technology industry, research and development in this arena is yet to be satisfactory. Our contribution in this paper is an up-to-date survey focusing on the cloud computing concept and the most important security issues.
Three Phase Grid-Tied Single Stage Reconfigurable Solar Converter Controlled By SVPWM Technique
T Vasu, Sri.A.Hema Sekhar
This project addresses a dc/dc converter and dc/ac converter suitable for an energy storage system with an additional function of galvanic isolation. An energy storage device such as an electric double-layer capacitor is directly linked to one of the dc buses of the dc/dc converter without any chopper circuit. Nevertheless, the dc/dc and dc/ac converters can continue operating when the voltage across the energy storage device droops along with its discharge. Theoretical calculation and experimental measurement reveal that power loss and peak current impose limitations on the permissible dc-voltage range. This project verifies that the dc/dc converter can charge and discharge the capacitor bank properly. Moreover, the dc/dc and dc/ac converters can charge the capacitor bank from zero to the rated voltage without any external precharging circuit. In this project, a new converter called the reconfigurable solar converter (RSC) for photovoltaic (PV)-battery applications, mainly utility-scale PV-battery applications, is presented. The main idea of the new converter is to use a single-stage three-phase grid-tied solar PV converter to perform both dc/ac and dc/dc operations. This converter solution is appealing for PV-battery applications because it reduces the number of conversion stages, thereby improving efficiency and reducing cost, weight and volume. A combination of analysis and simulation is used to show the attractive performance features of the proposed RSC.
A Review on Evaluation of Digital Image Watermarking Techniques
Kamalpreet Kaur, Khushdeep Kaur
Watermarking is an efficient strategy that solves many problems within digitization. By embedding intellectual property data, such as the creator, licence model, creation date or other copyright information, within the digital object, the digitizer can show that they are the creator and spread this information with each copy, even when the digital object has been uploaded to a third-party site. This can also be used to decide whether a given work has been tampered with or copied. This paper describes methods to establish whether an application needs watermarking strategies and criteria for selecting the most suitable type. The paper concentrates on evaluating the performance of SVD-, DCT- and DWT-based image watermarking for secure transmission.
Automatic Toll Gate Management and Vehicle Access Intelligent Control System Using Arm7 Microcontroller
M. Jyothirmai, D. Prashanth, J. Vamsikrishna
The conventional or traditional way of collecting the toll from vehicle owners or drivers is to stop the car at the toll gate station and pay the amount to the toll collector in the toll booth, after which the gate is opened either mechanically or electronically for the driver to pass through the toll station. In order to avoid the problems and inconvenience of this manual process, we introduce an automated way of collecting tolls and managing traffic, called electronic toll gate stations using RFID technology.
In this paper we use an ARM-based LPC2148, a PIC18F452, RF434, ZigBee and GSM. The system consists of three subsystems: the central database system, the toll gate unit and the vehicle unit. The vehicle unit consists of an active RF434 module, a GSM modem, a keypad and an ignition control unit. The RF434 sends the necessary vehicle identification information to the toll gate unit on user request. GSM sends a vehicle-start intimation to the user and also receives the command from the user to stop the vehicle, and the keypad is used for password authentication before the vehicle can be started. The toll gate unit contains the RF434 receiver, which reads the necessary vehicle identification information. When a vehicle comes into the vicinity of the toll gate, the tag attached to the vehicle communicates with the toll gate station, and the tag information is sent to the central database station using the ZigBee wireless communication protocol. On the other side, the central database system receives this information and checks the database for the required details and account balance. If the details match and a sufficient amount is found, a success message is sent to the corresponding toll gate station via ZigBee. At the toll gate, if the received information indicates success, the gate is opened for the vehicle to pass and is closed automatically based on an IR sensor interfaced at the toll gate.
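A rough sketch of the central-database decision step described above is given below; the tag table, toll amount and send_to_gate callback are hypothetical placeholders, since the actual logic runs on the ARM/PIC hardware over ZigBee.

```python
# Simplified sketch of the central-database check described above.
# The tag table, toll amount and send_to_gate() callback are hypothetical.
TOLL_AMOUNT = 50
vehicles = {  # tag_id -> account record
    "TAG001": {"owner": "A. Kumar", "balance": 300},
    "TAG002": {"owner": "B. Rao", "balance": 20},
}

def process_tag(tag_id, send_to_gate):
    record = vehicles.get(tag_id)
    if record is None:
        send_to_gate("FAIL: unknown tag")           # gate stays closed
    elif record["balance"] < TOLL_AMOUNT:
        send_to_gate("FAIL: insufficient balance")  # gate stays closed
    else:
        record["balance"] -= TOLL_AMOUNT
        send_to_gate("SUCCESS: open gate")          # gate opens, closes on IR sensor

process_tag("TAG001", print)
process_tag("TAG002", print)
```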
Utilization Of STATCOM Control For FSIG-Based Wind Farms Under Asymmetrical Grid Faults
T Ananda, M.Lokanadham
The stability of fixed-speed induction generator (FSIG)-based wind turbines can be improved by a StatCom, which is well known and documented in the literature for balanced grid voltage dips. Under unbalanced grid voltage dips, the negative-sequence voltage causes heavy generator torque oscillations that reduce the lifetime of the drive train. In this paper, investigations of an FSIG-based wind farm in combination with a StatCom under unbalanced grid voltage faults are carried out by means of theory, simulations and measurements. A StatCom control structure with the capability to coordinate the control between the positive and the negative sequence of the grid voltage is proposed. The results clarify the effect of positive- and negative-sequence voltage compensation by a StatCom on the operation of the FSIG-based wind farm. With first priority, the StatCom ensures the maximum fault-ride-through enhancement of the wind farm by compensating the positive-sequence voltage. The remaining current capability of the StatCom is controlled to compensate the negative-sequence voltage, in order to reduce the torque oscillations. The theoretical analyses are verified by simulations and measurement results on a 22-kW laboratory setup.
Improving Mystery For Information Protection In Cloud Computing
P.Sampath Kumar, Dr. N. Chandra Sekhar Reddy
Cloud computing provides data processing services to store and retrieve client data, and it involves the use of machine hardware and software delivered over a network. The primary uses of cloud computing are data storage and storage capacity for cloud clients. IT services are provided to a client over a network as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), or Software as a Service (SaaS). Nowadays, cloud security implementation is an important component, using various cryptographic algorithms such as DES, AES, RSA and ECC. Cloud computing faces issues such as data privacy and other data security concerns. Cryptography, i.e. the encryption of data, is a tool for data and machine security. This paper discusses the different security mechanisms and cryptographic algorithms that address the data security and privacy issues in cloud storage, with the goal of securing the data stored in the cloud framework. Cloud computing is a developing technology that accesses remote servers through the Web to maintain data and applications; it combines the advantages of grid and utility computing. This paper expresses the significance of cloud computing and the various security issues related to data administration. It also covers various tools for developing cloud computing and the services performed by cloud computing, together with their key components. The cloud is a virtualization of resources that maintains and manages itself. Moreover, it deals with how the client can securely access data, resources and services to satisfy their dynamically changing needs. Finally, guidelines to develop cloud computing for education are given.
An Intelligent Transport Navigation and Vehicle Speed Monitoring System Using Arm9
K. Poornima Naga Jyothi, D. Prashanth, J. Vamsikrishna
Radio Frequency Identification (RFID) has attracted considerable attention in recent years for its broad applications and as a complement to the current GPS navigation system when GPS signals are not available (such as in tunnels) or when the GPS position is ambiguous to a vehicle (such as at cloverleaf intersections). In practice, GPS does not provide sufficient information for navigation due to its low positioning accuracy (5 to 7 meters). Moreover, even combined with map-matching technologies, GPS still cannot achieve lane-level positioning and cannot provide information regarding the traffic direction in the current lane.
In this paper we use an ARM-based LPC2148, an S3C2440A, a PIC18F452, RF434, ZigBee and an RFID reader. The proposed system contains four main nodes: the vehicle unit, two reader units, and the central unit. The vehicle unit is implemented on an ARM7 and the central unit on an ARM9. The reader units are placed 50-100 meters apart. When a vehicle crosses the first reader unit, the tag attached to the vehicle is read by that reader, which starts a timer and sends an intimation to the central station; when the vehicle reaches the second reader unit, the tag is read again and the intimation and time details are also sent to the central station. From these details the central station calculates the speed of the vehicle, detects its location based on the readers' positions, and sends this information to the vehicle. If the vehicle is travelling faster than the predefined limit, the speed control unit slowly reduces the vehicle's speed. The central unit also provides the next-location information to the vehicle so that the driver can decide which route to take. The communication between the vehicle unit and the central unit uses ZigBee, whereas the communication between the reader units and the central unit uses the RF434 wireless module.
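The speed estimate itself is simple arithmetic over the two reader timestamps; the sketch below illustrates it with an assumed 75 m reader spacing and a 60 km/h limit, which are not values from the paper.

```python
# Toy illustration of the speed estimate described above; the 75 m reader
# spacing and 60 km/h limit are assumed values, not from the paper.
READER_SPACING_M = 75.0
SPEED_LIMIT_KMPH = 60.0

def estimate_speed_kmph(t_reader1_s, t_reader2_s):
    """Speed from the time a tag takes to travel between the two readers."""
    elapsed = t_reader2_s - t_reader1_s
    return (READER_SPACING_M / elapsed) * 3.6  # m/s -> km/h

speed = estimate_speed_kmph(0.0, 3.2)
print(f"{speed:.1f} km/h",
      "-> request speed reduction" if speed > SPEED_LIMIT_KMPH else "-> OK")
```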
Low Power Data Compression Algorithm For Wireless Sensor Networks Using VHDL
K.Yamini, K S N Raju, K Miranji
This paper proposes and evaluates a new data compression algorithm inspired by Run Length Encoding, called K-RLE, i.e. RLE with a precision K. This increases the compression ratio compared to RLE. In order to improve the compression results for data sources with different statistics, an in-network processing technique is introduced to save energy. In-network processing techniques allow the amount of data to be transmitted to be reduced; the best-known in-network processing techniques are data compression and data aggregation. Data compression is a process that reduces the amount of data in order to reduce the data transmitted and/or decrease the transfer time, because the size of the data is reduced.
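As a rough illustration of the K-RLE idea (not the authors' implementation), the sketch below merges consecutive sensor readings that stay within +/-K of the run's first value; K = 0 reduces to plain RLE.

```python
# Minimal sketch of K-RLE as described above: a run continues while samples
# stay within +/-K of the run's first value (K=0 degenerates to plain RLE).
def k_rle_encode(samples, k):
    runs = []
    ref, count = samples[0], 1
    for s in samples[1:]:
        if abs(s - ref) <= k:
            count += 1
        else:
            runs.append((ref, count))
            ref, count = s, 1
    runs.append((ref, count))
    return runs

readings = [20, 20, 21, 22, 20, 25, 26, 25, 30]
print(k_rle_encode(readings, k=0))  # plain RLE
print(k_rle_encode(readings, k=2))  # coarser runs, higher compression ratio
```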
The quality of an image is measured in terms of resolution; image clarity is measured by resolution. Better resolution can be obtained by using good sensors, but this can be very expensive. Instead, we can use image processing methods to obtain a high-resolution image from low-resolution images, which can be a very effective and better solution. This kind of image enhancement is called super-resolution image reconstruction.
This paper focuses on the definition, implementation and analysis of well-known super-resolution techniques. Image super-resolution, a process to enhance image resolution, has important applications in satellite imaging, high-definition television, medical imaging, etc. Many existing approaches use multiple low-resolution images to recover one high-resolution image. As a result of the analysis, a critical examination of the techniques and an evaluation of their performance are achieved.
Super-resolution image restoration has been one of the most important research areas in recent years; its goal is to obtain a high-resolution (HR) image from low-resolution (LR) blurred, noisy, under-sampled and displaced images.
Iris Segmentation and Recognition Using Log Gabor Filter And Curvelet Transform
M.Jayanthi, B.Shalini
Biometric methods have played important roles in personal recognition during the last twenty years. These methods include face recognition, fingerprint and iris recognition. Recently, iris imaging has found many applications in security systems. The aim of this paper is to design and implement a new iris recognition algorithm. In this paper, new feature extraction methods based on log-Gabor filters and the curvelet transform for identifying iris images are provided. The iris is the annular region between the sclera and the pupil of the human eye. In this region there exists an extraordinary texture with many prominent features, on which the recognition mainly relies. The existing approach adopted the Scale Invariant Feature Transform (SIFT) to extract the local feature points in both Cartesian and polar coordinate systems. Since it is very likely that many local patterns of the iris are similar, the recognition accuracy of the SIFT-based system is not as good as those of the traditional methods. A novel fuzzy matching strategy with invariant properties, which can provide a robust and effective matching scheme for two sets of iris feature points, together with a nonlinear normalization model, is adopted to provide more accurate positions before matching. An effective iris segmentation method is proposed to refine the detected inner and outer boundaries into smooth curves. For feature extraction, instead of log-Gabor filters we propose the curvelet transform to detect the local feature points from the segmented iris image in the Cartesian coordinate system and to generate a rotation-invariant descriptor for each detected point. The proposed matching algorithm, which is based on the PFM method, is used to compare two sets of feature points by using information comprising the local features and the position of each point.
Design of A High Throughput Elliptic Curve Scalar Multiplier
Ashna Paul, Ms. Divya S
Cryptography provides a method for securing and authenticating the transmission of information over insecure channels. Elliptic Curve (EC) Cryptography is a public-key cryptography that can replace RSA because it provides comparable security with far fewer key bits. An elliptic curve scalar multiplication module is present in the majority of secure communication systems. The most important operation in an elliptic curve cryptosystem is the computation of the scalar multiplication kP, for a given integer k and point P on the elliptic curve, here implemented using a Karatsuba multiplier. This work aims to design and implement an elliptic curve scalar multiplier on a single field programmable gate array (FPGA). The hardware complexity is reduced using a polynomial-basis representation of the finite field and a projective coordinate representation of the elliptic curve.
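For orientation, a toy double-and-add scalar multiplication kP is sketched below over a small prime-field curve; the FPGA design itself works over a binary field with a Karatsuba multiplier, which this sketch does not model.

```python
# Illustrative double-and-add scalar multiplication k*P on the toy curve
# y^2 = x^3 + 2x + 2 over GF(17); real designs use GF(2^m) with a
# hardware Karatsuba field multiplier, which is not modeled here.
P_MOD, A = 17, 2

def ec_add(p, q):
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                       # point at infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, point):
    result = None                         # neutral element
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)      # doubling step
        k >>= 1
    return result

print(scalar_mult(7, (5, 1)))             # 7*P on the toy curve
```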
In software development, testing plays an important role; hence, it is necessary to have knowledge about testing. This paper provides an overview of testing and test automation. Simplifying testing effort is the main objective of test automation. We present a process of test automation using a keyword-driven approach: input in the form of natural language is converted to machine-readable code snippets, which are executed under the framework to produce reports. The technique is based on looking for keywords that describe actions on a target and calling the functions associated with them. Using this framework, we can improve the reusability of automated tests, and the effectiveness and efficiency of testing can be increased by automating test automation.
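A minimal sketch of a keyword-driven runner is shown below; the keywords and actions are invented for illustration and are not the framework described in the paper.

```python
# Hedged sketch of a keyword-driven test runner: each keyword in a plain-text
# step maps to a Python function. The keywords and actions are hypothetical.
def open_url(target):  print(f"opening {target}")
def type_text(target): print(f"typing '{target}'")
def click(target):     print(f"clicking {target}")

KEYWORDS = {"open": open_url, "type": type_text, "click": click}

def run_test(steps):
    report = []
    for step in steps:
        keyword, _, target = step.partition(" ")
        action = KEYWORDS.get(keyword.lower())
        status = "PASS" if action else "FAIL: unknown keyword"
        if action:
            action(target)
        report.append((step, status))
    return report

for step, status in run_test(["open http://example.com",
                              "type hello world",
                              "click submit-button"]):
    print(status, "-", step)
```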
Digital Image Steganography- Then, Now & Analysis of its Techniques
Pratik Shinde, Amrita Palmal, Rohan Singh
Steganography is the process of hiding confidential data files inside carrier/host files and transmitting the resulting files from the sender to the receiver. It is the art of inconspicuously hiding data within data, so that unwanted recipients do not suspect or decipher the original, confidential message. Although people have hidden secret messages in plain sight using various novel techniques throughout the ages, the recent surge in computer usage and technological advances has propelled the need for security and confidentiality to the forefront. Therefore, it is essential to develop an efficient tool that incorporates all the security and confidentiality needs and withstands malicious intrusion attacks. It is important to understand that messages are not secure just by being hidden: steganography is not about keeping the message contents hidden, but rather about keeping their existence hidden. For this purpose, the report discusses several useful algorithms, their implementation, their advantages and shortcomings, and, more specifically, how these algorithms can be employed to develop highly effective steganography solutions using images as the carrier medium.
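As one representative technique such a study covers, the sketch below shows plain LSB embedding and extraction on a grayscale image array; it is illustrative only and offers no robustness against compression or attack.

```python
# Minimal LSB embedding sketch: hides a byte string in the least-significant
# bits of a grayscale image array. Illustrative only, not a robust scheme.
import numpy as np

def embed_lsb(cover, payload):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = (np.random.rand(64, 64) * 255).astype(np.uint8)
stego = embed_lsb(cover, b"secret")
print(extract_lsb(stego, 6))   # b'secret'
```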
Our planet is blessed with various species of fauna and flora. It is well known that plants play a crucial role in preserving the earth's ecology and environment by maintaining a healthy atmosphere and providing sustenance and shelter to innumerable insect and animal species. Plants are also important for their medicinal properties and as alternative energy sources such as bio-fuel. Today many types of plants are at the brink of extinction, and in our day-to-day life herbal plants play a vital role in maintaining human health. Plant classification has broad application prospects in agriculture and medicine and is especially significant to biodiversity research. In recent times, computer vision has been successfully applied to automated systems for plant cataloguing. The proposed system tries to automate the process of plant leaf classification, so that the leaf can be identified without any prior knowledge of the species. This will help botanists in their studies and speed up the process of identifying plant species. India is a rich source of plant species, which is the basis of Ayurveda, so the classification and identification of various plant species is an important step. Our idea is to develop an automated tool that detects and classifies the plant leaf species after comparing it with the trained sets. These trained sets are used by an Artificial Neural Network (ANN) after image processing; the network is trained for edge detection and vein analysis. Automation was difficult earlier, but thanks to the drastic development in technology it is now possible in just a few steps. This paper proposes an automated tool for plant leaf recognition from a digital image of the leaf. Manual identification requires prior knowledge of the species and is a lengthy process, so the automated technique helps to speed up the traditional method of plant leaf classification. The paper also compares different methods used in plant leaf classification.
Fingerprint Matching with Ridge Ends and Virtual Core Point using Enhanced Concentric Ring Algorithm
Gurpreet Singh, Vinod Kumar
Fingerprints are the most common and widely accepted biometric feature for person identification and verification in the field of biometric identification. Fingerprints consist of two main types of features: (i) ridge and furrow structure, i.e. ridge endings and bifurcations, and (ii) the core point, the point of maximum curvature in the central region of the fingerprint. This paper presents the implementation of a minutiae-based approach to fingerprint identification and verification, and serves as a review of the different techniques used in the various steps of developing a minutiae-based Automatic Fingerprint Identification System (AFIS). The technique presented in this paper is based on the extraction of ridge terminations and virtual core minutiae from the thinned, binarized and segmented version of a fingerprint image.
Mixed DWT-DCT Approach Based Image Compression Technique
Mahinderpal Singh, Meenakshi Garg
Image compression is a method that reduces the size of the data, or the amount of space required to store it. Digital images in their raw form require an enormous amount of storage capacity. There are various transformation techniques used for data compression; the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT) are the most commonly used. The DCT is a method for transforming a signal or image from the spatial domain to frequency components; it has a high energy compaction property and requires fewer computational resources. The DWT, on the other hand, is a multi-resolution transformation. In this paper, we propose a mixed DWT-DCT algorithm for image compression and reconstruction that takes advantage of the benefits of both transforms.
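A rough sketch of the mixed DWT-DCT idea follows: one DWT level, a block DCT on the approximation band, and coarse quantization; the wavelet, quantization step and discarded detail bands are illustrative choices, not the paper's.

```python
# Rough sketch of a DWT-then-DCT compression path; parameters are illustrative.
import numpy as np
import pywt
from scipy.fftpack import dct, idct

def dct2(block):  return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
def idct2(block): return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def compress(image, q=20):
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    LL_q = np.round(dct2(LL) / q)                 # quantized DCT of LL band
    return LL_q, q, image.shape

def reconstruct(LL_q, q, shape):
    LL = idct2(LL_q * q)
    zeros = np.zeros_like(LL)                     # detail bands discarded here
    return pywt.idwt2((LL, (zeros, zeros, zeros)), "haar")[:shape[0], :shape[1]]

img = np.random.rand(64, 64) * 255
rec = reconstruct(*compress(img))
print("MSE:", float(np.mean((img - rec) ** 2)))
```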
A voltage level change in a system, due to a drop in the load voltage or a change in the loads, causes an increase in reactive power demand. If that demand is not met by the power system, the bus voltage decreases further. This decreased bus voltage causes a rapid decline of the voltage at a particular location, which in turn has a cascading effect on nearby regions and can ultimately lead to voltage collapse. In this paper a FACTS [1] [2] (Flexible Alternating Current Transmission System) controller, namely the STATCOM (Static Compensator), is used to keep operation within the specified voltage limits. The STATCOM is used in power systems for a dual purpose: to absorb reactive power from, or inject reactive power into, the transmission line.
Analysis of Supervised algorithm for Voice Query Classification
Amol Kamble
Categorization of voice queries is useful for finding the intent of the user; by finding the intent it is easy to monitor that user. By performing classification, a search engine can find the class of the query and use it to retrieve results that are already classified in that class. This classification is also useful for targeted advertisement based on search queries. For this classification two supervised algorithms are used and analyzed, and the analysis is based on different parameters.
Spurious Minutia Removal Technique using Euclidean distance approach
Komal Sharma, Rachna Rajput
Fingerprints are the oldest and most widely used form of biometric identification, and everyone is known to have unique, immutable fingerprints. As most Automatic Fingerprint Recognition Systems are based on local ridge features known as minutiae, marking minutiae accurately and rejecting false ones is very important. However, fingerprint images get degraded and corrupted due to variations in skin and impression conditions, so image enhancement techniques are employed prior to minutiae extraction. A critical step in automatic fingerprint matching is to reliably extract minutiae from the input fingerprint images. This paper proposes the classification of false minutiae for better matching results; the fake minutiae are rejected on the basis of the three most common cases.
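A simplified sketch of Euclidean-distance-based rejection follows; the distance threshold and minutiae list are illustrative, and the paper's three-case analysis is reduced here to a single proximity test.

```python
# Hedged sketch of distance-based spurious-minutia rejection: minutiae closer
# than a threshold D (in pixels) are treated as artifacts of broken ridges and
# one of the pair is dropped. The threshold and point list are illustrative.
import math

def remove_spurious(minutiae, d_thresh=10.0):
    kept = []
    for (x, y, kind) in minutiae:
        too_close = any(math.hypot(x - kx, y - ky) < d_thresh
                        for (kx, ky, _) in kept)
        if not too_close:
            kept.append((x, y, kind))
    return kept

points = [(30, 40, "ending"), (32, 41, "ending"),   # spurious pair
          (80, 90, "bifurcation"), (120, 60, "ending")]
print(remove_spurious(points))
```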
Novel and Hybrid Technique for Efficient Intrusion Classification
Richa Shivhare, Sushil Chaturvedi
An intrusion detection system (IDS) is one way of protecting a computer network. This kind of technology enables the users of a network to be aware of incoming threats from the Internet by observing and analyzing network traffic. The proposed technique involves four steps: first, DBSCAN clustering is applied to form clusters; based on the obtained clusters, the network is trained with the Back Propagation algorithm. We also apply an Information Gain based feature selection method to identify the important features of the network traffic. The network is trained once with all features and then with the reduced features, which shows that we attain a high detection rate in less time. The developed network is used to identify the occurrence of various types of intrusions in the system. The performance of the proposed approach is tested using the KDD Cup'99 data set from the MIT Lincoln Labs. Simulation results show that the proposed approach detects intrusions with a high detection rate, a low false alarm rate and high efficiency in terms of time.
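The sketch below mirrors this pipeline on synthetic data (the paper uses KDD Cup'99): DBSCAN grouping, an information-gain-style ranking via mutual information, and a backpropagation-trained MLP; all parameter values are assumed.

```python
# Sketch of the described pipeline on synthetic data, not the paper's setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cluster import DBSCAN
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=20, n_informative=6, random_state=0)
clusters = DBSCAN(eps=3.0, min_samples=5).fit_predict(X)      # unsupervised grouping
X = np.column_stack([X, clusters])                            # append cluster id as a feature

gain = mutual_info_classif(X, y, random_state=0)              # information-gain proxy
top = np.argsort(gain)[::-1][:10]                             # keep the 10 best features

X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("detection accuracy:", round(net.score(X_te, y_te), 3))
```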
A Study on Semantic Web Languages and Technologies
Anthony Narzary, Gypsy Nandi
The World Wide Web (WWW) has changed the way people communicate with each other and how information is spread and retrieved. The term Semantic Web covers techniques that promise to dramatically improve the current WWW and its use. The Semantic Web area has seen rapid development in recent years with the continual improvement of technologies. The Semantic Web is propagated by the World Wide Web Consortium (W3C), an international standardization body for the web. The main purpose of the Semantic Web is to help users better locate, organize and access information on the web. The language used should be a natural language so that users find it comfortable to deal with any web pages or applications. This paper gives an overview of the Semantic Web languages and technologies in use today.
Network Video Capture and Motion Detection Based On Embedded Linux with GSM Service
G.Mounisha, J.Vamsikrishna, D.Prashanth
In this paper, the structure of a video capture system based on the S3C2440 processor is presented, and the embedded system, video capture, short message service (SMS) alarm, and client video monitor are introduced. Video4Linux is used to get the camera video data, which is transferred to the web server, and the data is displayed in the client browser. The system can also be connected to mobile phones, using SMS to control alarm equipment. The system can be applied in intelligent anti-theft, intelligent transportation, intelligent home and medical applications, as well as all kinds of video surveillance systems. Compared with video capture systems based on a digital signal processor (DSP), this system has the advantages of fewer modules, lower cost, higher intelligence, higher system stability, and higher security.
With the development of broadband, computer networks, and image processing technology, video capture has been widely used in image acquisition, security, health care, intelligent communities, alarms, transportation and so on. But it also has many problems, such as high cost, low intelligence, poor stability, and weak security. In order to solve these problems, the S3C2440 microprocessor is adopted in this embedded video acquisition system, combined with the Linux operating system; video capture is realized by Video4Linux.
Effect Of Flow Characteristics On The Heat Transfer Performance Across Low-Finned Fin Banks
Seif A. Ahmed
Air is a cheap and safe fluid, widely used in electronic, aerospace and air-conditioning applications. Because of its poor heat transfer properties, it usually flows through extended surfaces, such as finned surfaces, to enhance convective heat transfer. In this paper, experimental results are reviewed and numerical studies of air forced convection through extended surfaces are presented. The thermal and hydraulic behavior of a reference trapezoidal finned surface, experimentally evaluated by the present authors in an open-circuit wind tunnel, has been compared with numerical simulations.
Evaluation of Fusion Methods: Pan-Sharpening Methods
Tanya Mathur, Swati Chauhan
Image fusion in the field of remote sensing deals with the combination of multispectral and panchromatic images; the idea of such a combination is to derive the best information from both. A multispectral image is spectrally rich whereas a panchromatic image is spatially sharp, so their combination gives better information in terms of both spatial and spectral quality. This paper aims at evaluating the various fusion methods available so far and helps to analyze the comparisons between them. To be considered the best, an approach must have high computational efficiency, preserve the high spatial resolution and minimize color distortion.
Color constancy is the ability to estimate the color of the light source. The color of the light source may affect the appearance of objects in the scene. Humans have a natural tendency to recognize the actual color of an object despite variations in the color of the light source; however, it is not easy for computer vision systems to recover the actual color of objects in a scene. Several algorithms have been proposed to estimate the effect of the color of the light source on a digital image. A review of color constancy techniques is presented in this paper, along with a comparative analysis of the various techniques.
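As one classical example of the algorithms such a review covers, the sketch below implements the gray-world estimate: assume the average scene reflectance is achromatic and rescale each channel by its mean.

```python
# Gray-world color constancy, one of the classical algorithms in this area.
import numpy as np

def gray_world(image):
    """image: HxWx3 float array in [0, 1]; returns a cast-corrected copy."""
    means = image.reshape(-1, 3).mean(axis=0)
    gain = means.mean() / means                  # per-channel correction gains
    return np.clip(image * gain, 0.0, 1.0)

img = np.random.rand(8, 8, 3) * np.array([1.0, 0.8, 0.6])   # warm-cast scene
print("channel means before:", img.reshape(-1, 3).mean(axis=0).round(3))
print("channel means after: ", gray_world(img).reshape(-1, 3).mean(axis=0).round(3))
```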
A Survey On Flower Image Retrieval Based On Saliency Map And Feature Extraction
Ashwin G. Parmar, Mukti S. Pathak
Content-based Image Retrieval (CBIR) is a growing topic in image processing. The main purpose of a CBIR system is to help users retrieve relevant images based on their contents, using feature extraction techniques for color, texture and shape together with a saliency map. In this paper various extraction methods are discussed, analyzed and compared. To extract the color feature from the image, color moments are used. To extract the texture feature, the image is converted to gray-scale and a Gabor filter is applied to it. To extract the shape feature, Zernike moments and the Fourier descriptor are used.
Fuzzy logic has played an extremely vital role in the fields of computer science, artificial intelligence, control theory and mathematics. Fuzzy logic is a way of organizing beliefs or ideas that cannot be defined precisely but which depend upon their contexts, reflecting the human way of thinking, reasoning and perception. In this review paper we also cover the basics of fuzzy logic as well as the fuzzy logic system (fuzzy inference system) used as a decision-making technique under a linguistic view of fuzzy sets.
Clustering is an unsupervised learning problem. Better clustering improves the accuracy of search results and helps to reduce retrieval time. Clustering dispersion, known as entropy, is the disorder that remains after retrieving search results; it can be reduced by combining the clustering algorithm with a classifier. Clustering with weighted k-means results in unlabelled data. This paper presents a clustering algorithm called Minkowski Weighted K-Means, which automatically calculates feature weights for each cluster and uses the Minkowski metric (Lp). Unlabelled data can then be labeled by using neural networks and support vector machines. A neural network is an interconnected group of nodes used for classifying data, whereas an SVM is a classification function that distinguishes between members of the two classes in the training data. For classification we use neural networks and SVMs because they can recognize the patterns. The whole work is carried out in the MATLAB 7 environment.
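For reference, the weighted Minkowski distance that the algorithm minimizes can be written as follows (notation ours, following the usual Minkowski Weighted K-Means formulation):

```latex
% Weighted Minkowski (L_p) distance of object y_i from centroid c_k,
% with per-cluster feature weights w_{kv} raised to the same exponent p:
d_p(y_i, c_k) \;=\; \sum_{v=1}^{V} w_{kv}^{\,p}\, \lvert y_{iv} - c_{kv} \rvert^{p}
```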
In today’s world, where data and sensitive information are distributed over networks, secure and strong cryptographic techniques should be employed to maintain the integrity, authentication, and privacy of data. The aim of this paper is to propose a new design for a hybrid encryption algorithm that makes use of both symmetric-key and asymmetric-key algorithms; we use RSA, AES, and SHA-1 in this scheme.
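A hedged sketch of such a hybrid scheme is given below using the Python cryptography package: AES protects the data, RSA-OAEP wraps the AES session key, and a SHA-1 digest accompanies the message; the use of AES-GCM and the key sizes are illustrative choices, not necessarily the paper's.

```python
# Hedged sketch of a hybrid scheme: AES for data, RSA for the session key,
# SHA-1 as an integrity digest. Parameter choices are illustrative.
import hashlib, os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"confidential payload"
session_key, nonce = AESGCM.generate_key(bit_length=256), os.urandom(12)

ciphertext = AESGCM(session_key).encrypt(nonce, message, None)   # AES part
wrapped_key = public_key.encrypt(session_key, oaep)              # RSA part
digest = hashlib.sha1(message).hexdigest()                       # SHA-1 part

# Receiver side: unwrap the session key, decrypt, verify the digest.
recovered_key = private_key.decrypt(wrapped_key, oaep)
recovered = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
print(recovered == message and hashlib.sha1(recovered).hexdigest() == digest)
```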
Gamma Distributions Model For The Breast Cancer Survival Data Using Maximum Likelihood Method
K. H. Khan, M. Saleem
The breast cancer censored data of 254 patients was considered for the survival rate estimates. The data [12, 18] comes from patients treated at the chemotherapy department, Bradford Royal Infirmary, over ten years. In this paper, a Gamma probability distribution model is used to obtain the survival rates of the patients (see [2], [6], [13]). The maximum likelihood method has been used, through the unconstrained BFGS optimization method [5, 8, 9, 10] (BFGS: Broyden-Fletcher-Goldfarb-Shanno method), to find the parameter estimates and the variance-covariance matrix for the Gamma distribution model. Finally, the survival rate estimates for the parametric Gamma probability model are compared with the non-parametric (Kaplan-Meier [15]) method.
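For orientation, the censored-data log-likelihood that such a fit maximizes can be written as below (notation ours; delta_i = 1 for an observed death and 0 for a censored time, with f and S the Gamma density and survivor function):

```latex
\ell(\alpha,\beta) \;=\; \sum_{i=1}^{n} \Big[ \delta_i \log f(t_i;\alpha,\beta)
  + (1-\delta_i)\log S(t_i;\alpha,\beta) \Big],
\qquad
f(t;\alpha,\beta) = \frac{t^{\alpha-1} e^{-t/\beta}}{\beta^{\alpha}\,\Gamma(\alpha)},
\quad
S(t;\alpha,\beta) = 1 - \frac{\gamma(\alpha,\, t/\beta)}{\Gamma(\alpha)}
```

where gamma(., .) denotes the lower incomplete gamma function; BFGS then maximizes this log-likelihood over the shape and scale parameters.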
Information-Theoretic Outlier Detection For Large-Scale Categorical Data
Jesica Fernandes, Srijoni Saha, Jasmine Faujdar, Prof. Nitin Shivale
Outlier detection is an important problem that has been studied within various research and application domains. It aims to detect objects that are considerably distinct, exceptional and inconsistent with the majority of the data in the input data sets. Many outlier detection techniques have been developed specifically for certain application domains. Identifying abnormal data that form a non-conforming pattern is referred to as outlier or anomaly detection, and it leads to knowledge discovery. Many outlier detection methods have been proposed based on clustering, classification, statistics and frequent patterns. Among them, information theory offers a somewhat different perspective, although its computation is still based on a statistical approach. Outlier detection from unsupervised data sets is more challenging since there is no inherent measurement of distance between the objects. We propose two practical, one-parameter outlier detection methods, named ITB-SS and ITB-SP, which require no user-defined parameters for deciding whether an object is an outlier or not; users need only provide the number of outliers they want to detect in a given data set. Experimental results show that ITB-SS and ITB-SP are more effective and efficient than mainstream methods and can be used to deal with both large and high-dimensional data sets where existing algorithms fail to work. Outlier detection, often known as anomaly detection, is an advanced technology for a wide range of real-time applications, such as medical, industrial, e-commerce, security and engineering purposes. Outliers arise due to faults in systems, changes in the system, human errors, and behavioral or instrumental errors. Detection of these outliers helps in the identification of frauds and faults before they arise and affect the system severely. Data sets such as transaction data, financial records in commercial banks and demographic data contain non-numerical attributes, known as categorical data. Existing unsupervised methods are applicable to numerical data sets; however, they do not work with categorical data.
Controlled Bilateral Filter And CLAHE Based Approach For Image Enhancement
Gursharn Singh, Anand Kumar Mittal
Image enhancement is the process of improving the visual clarity of an image in digital image processing. In this paper, we propose a new algorithm using CLAHE and unsharp masking with a bilateral filter. Enhancement of the contrast and sharpness of an image is required in many applications, such as medical radiography, enhancing movie features and observing the planets. Unsharp masking is a good tool for sharpness enhancement; it is an anti-blurring filter. When the basic unsharp masking algorithm is used for sharpness enhancement, the resulting image suffers from two problems: first, a halo appears around the edges of the image, and second, a rescaling process is needed for the resulting image. The aim of this paper is to enhance the contrast and sharpness of an image simultaneously while solving these problems. In the proposed algorithm, two parameters controlling the contrast and sharpness can be adjusted to produce the desired output.
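A hedged OpenCV sketch of this combination is given below: CLAHE for contrast, then unsharp masking whose smoothed estimate comes from an edge-preserving bilateral filter; the clip limit, tile size, filter and gain parameters are illustrative, and the input file name is hypothetical.

```python
# Hedged OpenCV sketch: CLAHE for contrast, then unsharp masking built on a
# bilateral (edge-preserving) blur. All parameter values are illustrative.
import cv2

def enhance(gray, clip=2.0, tiles=(8, 8), amount=0.6):
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles)
    contrast = clahe.apply(gray)                              # contrast enhancement
    smooth = cv2.bilateralFilter(contrast, d=9, sigmaColor=75, sigmaSpace=75)
    sharp = cv2.addWeighted(contrast, 1 + amount, smooth, -amount, 0)
    return sharp                                              # contrast + sharpness

img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)           # hypothetical file name
if img is not None:
    cv2.imwrite("enhanced.png", enhance(img))
```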
Haptics is a recent enhancement to virtual environments allowing users “to touch” and feel the simulated objects they interact with. Current commercial products allow tactile feedback through desktop interfaces (such as the FEELIt mouse or the PHANToM arm) and force feedback at the fingertips through haptic gloves (such as the CyberTouch and the CyberGrasp).
It is at present difficult to simulate complex virtual environments that have realistic behavior. This task is aided by the recent introduction of haptic toolkits.
The simulation of natural phenomena is an important goal for this technology. Current systems may not be able to generate forces at the speed required to simulate real-world situations. Accurate and high-speed devices must be perfected in order to create real-world simulations.
Study Of Fault Prediction Using Quad Tree Based K-Means Algorithm And Quad Tree Based EM Algorithm
Swapna M. Patil, R.V.Argiddi
This paper presents a comparative study of two clustering algorithms, namely K-Means and EM. A quad tree is used as a common algorithm to initialize both clustering algorithms. The dataset is then clustered and classified separately by the K-Means and EM algorithms. The motive of this paper is to demonstrate the effectiveness of EM over K-Means: classification and clustering of the dataset done via EM is seen to have fewer faults compared with clustering and classification done via the K-Means algorithm.
In recent years, cloud computing has become a popular paradigm for hosting and delivering services over the internet. The key technology that makes cloud computing possible is server virtualization, which enables dynamic sharing of physical resources. Through virtualization, a cloud service provider can ensure the QoS delivered to the user while achieving higher server utilization and energy efficiency. Virtualization introduces the problem of virtual machine placement and also increases the overhead of load balancing. This paper discusses the various algorithms dealing with VM placement and load balancing in a cloud environment.
A Literature Survey On Secure De-Duplication Using Convergent Encryption Key Management
Ms. Madhuri A. Kavade, Prof. A.C.Lomte
One vital challenge of today’s cloud storage services is the management of the ever-increasing quantity of data. To make data management scalable, de-duplication has been a well-known technique to save storage space and upload bandwidth in cloud storage. Instead of keeping multiple data copies with the same content, de-duplication eliminates redundant data by keeping only one physical copy and referring other redundant data to that copy.
Nowadays the most pressing challenge is to perform secure de-duplication in cloud storage. Although convergent encryption has been extensively adopted for secure de-duplication, a critical issue in making convergent encryption practical is to efficiently and reliably manage a huge number of convergent keys. We first introduce a baseline approach in which each user holds an independent master key for encrypting the convergent keys and outsourcing them to the cloud. However, such a baseline key management scheme generates an enormous number of keys as the number of users increases and requires users to dedicatedly protect the master keys, which is inefficient and unreliable. For this reason, we formally address the problem of achieving efficient and reliable key management in secure de-duplication. We propose Dekey, a new construction in which users do not need to manage any keys on their own but instead securely distribute the convergent key shares across multiple servers.
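To make the convergent-encryption building block concrete, a minimal sketch follows (not Dekey itself): the key is a hash of the content, so identical files produce identical ciphertexts and can be de-duplicated; the fixed nonce is used purely to keep this toy example deterministic.

```python
# Minimal convergent-encryption sketch: key = hash of the content, so equal
# files yield equal ciphertexts and the cloud can de-duplicate them.
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(data: bytes):
    key = hashlib.sha256(data).digest()           # convergent key = H(content)
    ciphertext = AESGCM(key).encrypt(b"\x00" * 12, data, None)
    tag = hashlib.sha256(ciphertext).hexdigest()  # de-duplication fingerprint
    return key, ciphertext, tag

k1, c1, t1 = convergent_encrypt(b"same file content")
k2, c2, t2 = convergent_encrypt(b"same file content")
print(t1 == t2)   # True: the cloud stores only one physical copy
```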
Histogram equalization is a technique basically used for image enhancement. It is a simple and effective technique applied to images captured in very bright or dark environments, which thus become low-contrast images. The basic purpose of histogram equalization is to produce an output image that looks much better than the input image. This paper presents what a histogram is, how it is applied to an image for enhancement, and the techniques used to perform Histogram Equalization (HE).
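A bare-bones version of the transform is sketched below: map each intensity through the normalized cumulative histogram (CDF) of an 8-bit image; the synthetic low-contrast image is only for demonstration.

```python
# Bare-bones histogram equalization for an 8-bit grayscale image.
import numpy as np

def hist_equalize(img):
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())     # normalize to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)            # intensity mapping
    return lut[img]

low_contrast = (np.random.rand(32, 32) * 60 + 100).astype(np.uint8)   # dull image
print("range before:", low_contrast.min(), low_contrast.max())
equalized = hist_equalize(low_contrast)
print("range after: ", equalized.min(), equalized.max())
```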
An approach to Re-evaluate CERS method for News Mining
Swati Vashisht, Dr. V. Bibhu, Tushar sharma, Smratika Sharma
With the current growth rate of URLs, we are in an age of online information overload, and the same holds for many other domains such as web service data analysis. Text mining has been a key research topic for online information retrieval and information extraction. From online news and blog articles a human can often deduce information and knowledge for the prediction of market movements and other interesting activities occurring all around the world. However, this recognition and comprehension process is very complex and requires some contextual knowledge about the domain in which trends are to be detected.
The analysis of news sources represents an important challenge of our times. News not only reflects the different processes happening in the world but also influences the economic, political and social situation. A news source contains an enormous amount of information which can be compiled together and analyzed.
In this paper we proposed an approach that applies clustering methods on news articles and then CERS (Cross Entropy Reduction Sampling) technique to make a news article more effective to search and less cumbersome to get exact knowledge.
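For illustration only, the clustering front-end of such a pipeline could be sketched with TF-IDF vectors and k-means as below; the CERS sampling step itself is specific to the paper and is not shown, and the sample articles are invented.

```python
# Clustering news articles with TF-IDF + k-means (scikit-learn sketch).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

articles = [
    "Stock markets rallied after the central bank cut interest rates.",
    "The central bank signalled further rate cuts to support growth.",
    "The national team won the championship final in extra time.",
    "Fans celebrated the team's championship victory across the city.",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(articles)            # sparse TF-IDF matrix

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label, text in zip(kmeans.labels_, articles):
    print(label, text[:50])
```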
Watermarking is the process of embedding data, called a watermark, into a multimedia object such that the watermark can be detected or extracted later to make an assertion about the object. The multimedia object may be audio, an image or a video. The aim may be to provide data authentication, data integrity, copyright protection and so on, depending upon the need. Watermarking can be done either in the spatial domain or in the frequency domain; the latter proves to be more efficient than working in the spatial domain. Working in the frequency domain requires transformation tools, and in this paper we discuss these various techniques, with particular emphasis on wavelet transforms.
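As a small illustration of frequency-domain embedding, a simple additive wavelet-domain watermark could be sketched with PyWavelets as below; the choice of the diagonal detail band, the Haar wavelet and the strength factor are assumptions for the sketch, not a scheme from the paper.

```python
# Additive wavelet-domain watermark embedding (PyWavelets sketch).
import numpy as np
import pywt

def embed_watermark(image: np.ndarray, watermark: np.ndarray, alpha: float = 0.05):
    """Embed a small watermark into the diagonal detail band of a 1-level DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    wm = np.resize(watermark.astype(float), cD.shape)    # fit watermark to band
    cD_marked = cD + alpha * wm                          # additive embedding
    return pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")

if __name__ == "__main__":
    host = np.random.randint(0, 256, (128, 128))
    mark = np.random.randint(0, 2, (32, 32))             # binary watermark
    watermarked = embed_watermark(host, mark)
```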
We briefly describe the intersection of game theory and the bargaining problem as a topic of study, and explain the term Nash bargaining game. Relatively little work has been recorded in the field of game theory with the bargaining problem. This paper presents a survey of various approaches used in the field of game theory with the bargaining problem and elaborates on the Nash bargaining game. Finally, we propose a future scope of the bargaining problem in the field of zoology.
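As standard background on the Nash bargaining game, the two-player Nash solution maximizes the product of the players' gains over their disagreement payoffs; for a divisible surplus this reduces to giving each player their threat point plus half the remaining excess, as sketched below (the numbers are illustrative).

```python
# Two-player Nash bargaining over a divisible surplus (illustrative sketch).
# The Nash solution maximizes (u1 - d1) * (u2 - d2) subject to u1 + u2 <= S,
# which for a transferable surplus gives each player d_i plus half the excess.

def nash_bargaining_split(surplus: float, d1: float, d2: float):
    """Return the Nash bargaining payoffs for surplus S and threat points d1, d2."""
    if d1 + d2 > surplus:
        raise ValueError("no feasible agreement better than disagreement")
    excess = surplus - d1 - d2
    return d1 + excess / 2.0, d2 + excess / 2.0

print(nash_bargaining_split(10.0, 2.0, 3.0))   # -> (4.5, 5.5)
```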
A Service-oriented Architecture Based on Windows Communication foundation for Health Monitoring and Tracking
Maheep Sharma, Mrs. Pooja Sapra
WCF (Windows Communication Foundation), developed by Microsoft, is an ideal SOA (Service-Oriented Architecture) implementation platform. WCF is designed using service-oriented architecture principles to support distributed computing where services have remote consumers. SOA provides a uniform way for consumers of services, such as web-based applications, to use those services, and SOA is applied here to build the web-based management system for Lifeline due to its flexibility and reusability. All services are programmed on .NET, and the results of the test system constructed on .NET show that Lifeline is valuable in practice. The services are deployed, discovered and consumed as a collection of endpoints, and a WCF client connects to a WCF service via an endpoint. Web services are applications that can be published, located, and invoked across the Internet, and they are implemented using SOA. WCF services provide better reliability and interoperability.
The analysis of distance variation on microstrip Broadband bow tie antenna (BBTA)
Shruti Taksali, Suman Agrawal
This paper focuses on the effect of varying the distance between the triangles of a bow tie antenna, which basically consists of two triangular metal elements facing each other, separated by a distance and arranged in the configuration of a bow tie. It is generally fed through a coaxial cable at the centre of the separation distance to achieve impedance matching. It is a preferred structure because of its simplicity and its capability of transmitting and receiving a broad range of frequencies, and hence it is known as a broadband bow tie antenna (BBTA).
A Novel Approach for Facial Expression Recognition Rate (FER) By Using Tensor Perceptual Color Framework
Voruganti Ravi Kumar, Sk Subhan , Devireddy Venkatarami Reddy
This paper proposes facial expression recognition in perceptual color space. A tensor perceptual color framework (TPCF) is introduced for facial expression recognition (FER), based on the information contained in color facial images. TPCF enables multilinear image analysis in different color spaces and demonstrates that the color components provide additional information for robust FER. Using this framework, the components (in RGB, CIELab, CIELuv or YCbCr space) of color images are unfolded to 2-D tensors based on multilinear algebra and tensor concepts, from which features are extracted by Log-Gabor filters. The mutual information quotient method is employed for feature selection, and features are classified using multiclass linear discriminant analysis classifiers. Experimental results show that color information has significant potential to improve emotion recognition performance due to the complementary characteristics of the image textures. The perceptual color spaces (CIELab and CIELuv) perform better overall for FER than the other color spaces, providing more efficient and robust performance on facial images with illumination variation.
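For illustration, mode-n unfolding of a color image tensor (height x width x channel) into a 2-D matrix can be done with NumPy as sketched below; this shows only the unfolding step, not the Log-Gabor feature extraction or classification of the paper.

```python
# Mode-n unfolding of a 3-way image tensor (height x width x color channel).
import numpy as np

def unfold(tensor: np.ndarray, mode: int) -> np.ndarray:
    """Return the mode-n unfolding: fibers along 'mode' become matrix rows."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

image = np.random.rand(64, 48, 3)        # e.g. an RGB face crop
rows = unfold(image, 0)                  # 64 x (48*3)
cols = unfold(image, 1)                  # 48 x (64*3)
chan = unfold(image, 2)                  # 3  x (64*48) -- one row per channel
print(rows.shape, cols.shape, chan.shape)
```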
A gesture recognition system is a system that recognizes and differentiates between ‘gestures’. These gestures can be any type of facial or body gestures: various facial expressions constitute facial gestures, and similarly the various gestures that can be made using our hand, or the palm to be more specific, are called ‘hand gestures’.
In this project, we aim to develop a system that is able to recognize some of these hand gestures and differentiate between them, thus triggering a certain event corresponding to each gesture. We propose to develop a system that can successfully recognize two-dimensional hand gestures, using an IR sensor matrix for the purpose of recognition. We also plan to connect the recognition system to a computer keyboard emulation system so that appropriate gestures can cause corresponding keyboard events such as pressing and releasing a key.
We reduce the number of required IR sensors to two and thus reduce the power consumption, which is a critical issue in such systems. Even using the limited information from only two IR sensors, our system can achieve accurate gesture recognition using the proposed IR feature set and classifier.
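As a hypothetical toy example (not the paper's feature set or classifier), the direction of a swipe can in principle be inferred from which of the two IR sensors fires first; the sensor names, timestamps and threshold below are all assumptions for illustration.

```python
# Swipe-direction classification from two IR sensor trigger times (hypothetical sketch).
# left_t / right_t are the timestamps (in seconds) at which each sensor first
# detects the hand; the minimum gap threshold is an illustrative assumption.

def classify_swipe(left_t: float, right_t: float, min_gap: float = 0.02) -> str:
    """A hand sweeping left-to-right triggers the left sensor first, and vice versa."""
    gap = right_t - left_t
    if abs(gap) < min_gap:
        return "tap"                    # both sensors fired almost simultaneously
    return "swipe_right" if gap > 0 else "swipe_left"

print(classify_swipe(0.10, 0.18))       # -> swipe_right
print(classify_swipe(0.20, 0.11))       # -> swipe_left
```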
A New Topology for Speed Control of Sensorless BLDC Motor with Reduced Commutator Switches and Improved Input Power Factor
G. Venu, Dr. S. Tara Kalyani
The Brushless Direct Current (BLDC) motor is one of the motor types rapidly gaining popularity. BLDC motors are used in aerospace applications, the medical field, industrial automation equipment and instrumentation. As the name implies, BLDC motors do not use brushes for commutation; instead, they are electronically commutated. This paper proposes a new optimized drive for speed control of a BLDC motor. In this operation mode, the drive approximates a voltage follower and the line current follows the line voltage waveform to a certain extent. The reduction in low-order harmonics and the improved power factor are achieved without the use of any voltage or current sensors. The simplicity and reduced parts count of the proposed topology make it an attractive low-cost choice for many variable speed drive applications.
The ability to track, trace and control anything by anyone from anywhere on the planet has been mankind’s unfulfilled desire. The usefulness of GSM and GPS has made them popular in their own contexts; integrating these technologies can prove to be a flamboyant solution for many unsolved problems. The idea of this paper is to integrate these two technologies into one system and provide an effective application for vehicle tracking as well as personal tracking. To implement a multi-tracking system, the following two technologies are used: GSM (Global System for Mobile Communications), a set of standards describing second-generation (2G) technologies, and GPS (Global Positioning System), a satellite-based navigation system consisting of several satellites revolving around the Earth. The system provides a solution for tracking and tracing multiple movable objects at the same time, hence the name Multi-Tracking System. We can see the current location of the object along with other add-on features; for vehicles there is live tracing and tracking via GPS and control of subsystem parts via the GSM network using SMS or GPRS. The whole system is implemented in Microsoft .NET technology: C#.NET is used for the system components and ASP.NET for the web-based parts.
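For illustration, the great-circle distance between two GPS fixes, useful for computing the distance travelled between tracking updates, can be obtained with the standard haversine formula; this is general background, not code from the described system.

```python
# Great-circle distance between two GPS fixes (haversine formula).
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres between two latitude/longitude points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(round(haversine_km(19.0760, 72.8777, 28.7041, 77.1025), 1))  # Mumbai -> Delhi
```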
Electronic Waste Study & Facts in Bihar
Dr. Umesh Kumar, Dr. Ramesh Kumar, Prof. Surendra Sharma, Dr D N Singh
Increased market-driven consumer behavior, increased purchasing power, competitiveness, fast-changing socio-economic patterns, the demography of the social set-up and the desire for an easier lifestyle have changed the scenario, producing sophisticated, toxic and deadly waste that requires special attention in its handling and care for the preservation of humankind. The world’s convergence into a global village has resulted in the transmigration of products, and developed nations’ clever move of channelling old, obsolete, discarded products to underdeveloped and developing nations in the name of technology transfer is also responsible for e-waste piling up. The disparity between the haves and the have-nots is shrinking, but this is drawing the world towards a situation where everyone is being forced to sit on a pool of deadly waste materials that will not respect national boundaries but will pollute the environment and atmosphere for all. Within a single nation, too, the situation of waste and waste handling varies depending on demography and per capita income. States with less regard for norms and legislative frameworks are worse affected than those that have a sensible framework and are willing to address the situation attentively. The state of Bihar is among the worst affected, and its situation is similar to that of most underdeveloped regions.
It is an era of technology in which the world is developing; therefore, people are highly dependent on carrying out their business (money) transactions through internet-based services. Most fare and billing systems are now available with online support. Hence, to integrate all applications into one system, we introduce a Universal Transport Billing System using an RFID (Radio Frequency Identification) card. In the present public transportation system we use paper tickets printed by a small machine with a keypad, and the details of the passengers using public transport are not recorded. The system requires user details in order to provide security during the identification process and to deliver notifications to the user via SMS. In our proposed system, automatic ticketing is done with the help of an RFID card. This card can be used for almost every transport mode, such as buses, cabs, metros, auto-rickshaws and so on. It carries the passenger’s details; the card is placed on the RFID card reader and the destination is entered with the keypad. The fare is deducted from the passenger’s account according to the distance the passenger has travelled. Thus, users will not have to carry cash and the system becomes more convenient. The main advantage of the system is that the transaction and fare calculation are automated and secured; the disadvantage is the maintenance and the initial cost of installing the system. Overall, the system aims to bring consistency to the public transport system, concluding in uniform access for passengers in their daily rides through an automated server that is updated every time a passenger travels using an RFID-based ticket.
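A toy sketch of the distance-based fare deduction is shown below purely for illustration; the tariff, minimum fare, card ID and account balance are all hypothetical values, not parameters of the proposed system.

```python
# Distance-based fare deduction sketch (all account data and rates are hypothetical).
FARE_PER_KM = 1.50          # currency units per kilometre (assumed tariff)
MIN_FARE = 5.00

accounts = {"RFID-0042": 120.00}          # card id -> prepaid balance

def charge_trip(card_id: str, distance_km: float) -> float:
    """Deduct a distance-based fare from the card's account; return the new balance."""
    fare = max(MIN_FARE, FARE_PER_KM * distance_km)
    if accounts[card_id] < fare:
        raise ValueError("insufficient balance")
    accounts[card_id] -= fare
    return accounts[card_id]

print(charge_trip("RFID-0042", 12.5))     # 12.5 km trip -> fare 18.75, balance 101.25
```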
A Block Based Novel Digital Video Watermarking Scheme Using DCT
Mohini Shinde, Sadhana Todkar, Pradnya Ubale
In recent years, almost all work is done over the internet, and a large number of images, audio files and videos are distributed over the network, so there is a chance of piracy of this data and of illegal operations such as duplication and modification. Information hiding techniques therefore become more important, and for this we use a digital video watermarking scheme. In a digital watermarking scheme, some digital data such as a logo, name or label, called the watermark and representing the author’s ownership, is embedded into the desired host image. Information is added to protect copyright, to authenticate, to prevent misuse, and so on. Afterwards, the watermark is extracted from the watermarked data by an extraction process.
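As a small illustration of block-based DCT embedding, one watermark bit can be hidden by nudging a mid-frequency coefficient of an 8x8 block; the coefficient position and strength below are assumptions for the sketch, not the paper's exact scheme.

```python
# Block-based DCT watermark embedding sketch (one 8x8 block, mid-band coefficient).
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    return dct(dct(block.T, norm="ortho").T, norm="ortho")

def idct2(block):
    return idct(idct(block.T, norm="ortho").T, norm="ortho")

def embed_bit(block8x8: np.ndarray, bit: int, alpha: float = 10.0) -> np.ndarray:
    """Embed one watermark bit by nudging a mid-frequency DCT coefficient."""
    coeffs = dct2(block8x8.astype(float))
    coeffs[4, 3] += alpha if bit else -alpha        # (4,3): arbitrary mid-band slot
    return idct2(coeffs)

block = np.random.randint(0, 256, (8, 8))
marked = embed_bit(block, bit=1)
```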
High Security System Provided By Steganographic Technique Using Palm and Iris Scan
Minakshi Kumari, Prof. Somesh Kumar Dewangan
The project proposes to implement a biometric security system based on a combination of iris and palm print with a steganographic technique for authentication purposes. Here, the data hiding approach conceals secret personal information within the user’s biometric to further enhance privacy protection.
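For illustration only, a very simple least-significant-bit (LSB) embedding, one common steganographic primitive, can hide a short byte string in a grayscale cover image as sketched below; this is a generic technique, not necessarily the method of the project.

```python
# LSB steganography sketch: hide a byte string in the least significant bits
# of a grayscale cover image (NumPy, illustrative only).
import numpy as np

def embed_lsb(cover: np.ndarray, secret: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(secret, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for the secret")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite the LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
secret = b"iris+palm template id"
stego = embed_lsb(cover, secret)
print(extract_lsb(stego, len(secret)))
```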
A Novel Reference on Driving a Three-Phase Brushless DC Motor with a New Matrix Converter
Thirumuru Sravya Saranya, T.Ravi Kumar, S.Sridhar
This paper presents a novel matrix converter based drive technique for three-phase brushless DC (BLDC) machines. In contrast to existing matrix converter based drives, the proposed technique uses a topology that has fewer semiconductors and a unique switching sequence to drive three-phase BLDC machines directly from single-phase mains supply lines without a DC link capacitor. The paper describes in detail both the topology and the unique switching sequence that is essential for the proposed technique. A model based on Matlab/Simulink is also presented to demonstrate the theoretical performance of the proposed drive system. The performance of the technique is evaluated using a model of a 320 W BLDC machine, which is operated in both torque and speed control modes, and simulated results indicate that the performance of the machine is comparable to existing techniques. The proposed technique is simple to implement, cost effective with a low component count, and can be easily adapted to applications such as PM synchronous motor drives or loads which require single-phase to three-phase power conversion.
Renewable generation affects power quality due to its nonlinearity, since solar generation plants and wind power generators must be connected to the grid through high-power static PWM converters. This paper presents an active power filter implemented with a four-leg voltage-source inverter, with a novel predictive control scheme to control the inverter. The main features of this control scheme are: 1) controlling the power converter to inject the power generated from renewable energy sources (RES) into the grid, and 2) acting as a shunt APF to compensate current unbalance, load current harmonics, load reactive power demand and load neutral current. The compensation performance of the proposed active power filter and the associated control scheme under steady-state and transient operating conditions is demonstrated through simulation results, which are presented for both wind- and solar-type renewable power generation systems.
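As general background (not the paper's exact controller), finite-control-set predictive current control typically predicts the next-step current for each candidate switching state and applies the one minimizing a cost function; a single-phase RL-load sketch under assumed parameters is shown below.

```python
# Finite-control-set predictive current control sketch for one phase of an
# inverter feeding an RL load (parameters and candidate voltages are assumed).
R, L, Ts, VDC = 0.5, 5e-3, 50e-6, 400.0           # ohm, henry, seconds, volts

CANDIDATE_V = [-VDC, 0.0, +VDC]                    # simplified switching states

def predict_current(i_now: float, v_applied: float) -> float:
    """Forward-Euler prediction of the load current one sample ahead."""
    return i_now + (Ts / L) * (v_applied - R * i_now)

def best_voltage(i_now: float, i_ref: float) -> float:
    """Pick the candidate voltage whose predicted current is closest to the reference."""
    return min(CANDIDATE_V, key=lambda v: abs(i_ref - predict_current(i_now, v)))

print(best_voltage(i_now=2.0, i_ref=5.0))          # expects a positive voltage step
```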
Energy Efficient Clustering Approach For Wireless Sensor Network
Priyanka Tripathi, Prof.Rajni Dubey
In the modern era of computing, wireless networks play a vital role in building a strong infrastructure for faster and improved communication. Recent developments in sensor devices have broadened the area of sensor networks, leading to new protocol designs, specifically targeting energy consumption in wireless sensor networks (WSNs). These sensor nodes are low-power, lightweight and energy-efficient. WSNs with hundreds or thousands of sensor nodes can sense data from multiple locations and forward it to a particular user's location; such data are routed using several routing algorithms. In this work, we discuss several routing aspects with regard to the energy model of wireless sensor networks for IEEE 802.15.4. The IEEE 802.15.4 standard contains two types of devices, the first being the FFD (Fully Functional Device), which is further classified into three categories: PAN coordinator, coordinator and device. We also compare the results of the default AODV with a modified AODV to make energy consumption more efficient.
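For context, WSN routing studies commonly evaluate energy consumption with a first-order radio model; a minimal sketch with typical textbook constants (not necessarily those used in this paper) is given below.

```python
# A commonly used first-order radio energy model for WSN nodes (illustrative;
# the constants are typical textbook values, not those of the paper).
E_ELEC = 50e-9        # J/bit spent in transmitter/receiver electronics
EPS_AMP = 100e-12     # J/bit/m^2 spent in the transmit amplifier

def tx_energy(bits: int, distance_m: float) -> float:
    """Energy to transmit 'bits' over 'distance_m' (free-space d^2 loss model)."""
    return E_ELEC * bits + EPS_AMP * bits * distance_m ** 2

def rx_energy(bits: int) -> float:
    """Energy to receive 'bits'."""
    return E_ELEC * bits

packet_bits = 4000
print(tx_energy(packet_bits, 50.0), rx_energy(packet_bits))
```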
Mobile Cloud Computing (MCC) is a combination of cloud computing and mobile networks. It is a model in which mobile applications are built, powered and hosted using cloud computing technology. The capabilities of mobile devices have been improving more quickly than those of computers, and many researchers focus on the areas of mobile computing and cloud computing. Mobile computing means accessing shared data or infrastructure through portable devices such as PDAs, smartphones and tablets, independently of physical location, while cloud computing means virtual computing, distributed computing and resource sharing. Mobile devices use the cloud for both application development and hosting; most mobile applications are cloud-based, e.g. browsers such as IE and social networking apps such as Facebook, which are accessible through the cloud (internet). This provides the user with an interface to the data and services on the cloud platform. Mobile computing, however, must cope with a far more limited energy budget than regular cloud computing.
Cloud communications and cloud-based computing are seemingly on a steep growth curve. Cloud computing is an increasingly popular paradigm for accessing computing resources. In practice, cloud service providers tend to offer services that can be grouped into three categories: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Cloud computing, with the revolutionary promise of computing as a utility, has great potential to transform how IT services are delivered and managed. Yet, despite its great promise, even the most seasoned professionals know little about cloud computing or how to define it.
FPGA Implementation of Binary Coded Decimal Digit Adder and Multiplier
Amruta Bhamburkar
Decimal arithmetic has a high impact on the overall performance of today’s financial and commercial applications. Decimal addition and multiplication are the main operations used in any decimal arithmetic algorithm. Decimal digit adders and decimal digit multipliers are usually the building blocks for higher-order decimal adders and multipliers. FPGAs provide an efficient hardware platform that can be employed for accelerating decimal algorithms.
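For context, a single-digit BCD addition follows the classic decimal-correction step: add the two nibbles in binary and, if the result exceeds 9, add 6 and generate a carry. A behavioral Python sketch of this (not the FPGA implementation itself) is shown below.

```python
# One-digit BCD adder sketch: binary sum plus a +6 correction when the result
# exceeds 9, mirroring the classic hardware decimal-correction step.
def bcd_digit_add(a: int, b: int, carry_in: int = 0):
    """Add two BCD digits (0-9) and a carry; return (sum_digit, carry_out)."""
    assert 0 <= a <= 9 and 0 <= b <= 9 and carry_in in (0, 1)
    s = a + b + carry_in
    if s > 9:
        s += 6                      # decimal-adjust: skip the six invalid codes
        return s & 0x0F, 1          # low nibble is the corrected digit
    return s, 0

print(bcd_digit_add(7, 5))          # -> (2, 1)  because 7 + 5 = 12
print(bcd_digit_add(3, 4))          # -> (7, 0)
```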
Animated Pedagogical Agents: Face-to-Face Interaction in Interactive Learning Environments
Aanshi Varshney
Recent years have witnessed the birth of a new paradigm for learning environments: animated pedagogical agents. These lifelike autonomous characters cohabit learning environments with students to create rich, face-to-face learning interactions. This opens up exciting new possibilities; for example, agents can demonstrate complex tasks, employ locomotion and gesture to focus students’ attention on the most salient aspect of the task at hand, and convey emotional responses to the tutorial situation. Animated pedagogical agents offer great promise for broadening the bandwidth of tutorial communication and increasing learning environments’ ability to engage and motivate students. This article sets forth the motivations behind animated pedagogical agents, describes the key capabilities they offer, and discusses the technical issues they raise. The discussion is illustrated with descriptions of a number of animated agents that represent the current state of the art. The potential of emotional interaction between human and computer has also recently interested researchers in human–computer interaction, although the instructional impact of this interaction in learning environments has not been established. This study therefore examined the impact of the emotion and gender of a pedagogical agent as a learning companion (PAL) on social judgments, interest, self-efficacy, and learning.
Implementation of CAN and ZigBee Networks Based Industrial Monitoring and Control Applications on ARM7 Processor
B.Tulasi Swathi, D. Prashanth, B. Kishore Babu
In an industrial plant, physical process systems consist of machines and process equipment. They are individual devices or larger subsystems of their own. Process systems can be in different operational states, such as ‘maintenance’, ‘starting up’ or ‘operating’. In each state, they provide a set of capabilities that can be combined to perform the various stages of the process. In the course of control system design, control tasks identified in co-operation with users and other engineering disciplines are allocated to the control system and human operators. The automated parts should form a structured set of control activities corresponding to the physical equipment and processing tasks. In this paper we implement CAN and ZigBee network based industrial monitoring and control applications on ARM. The ARM microcontroller has two interconnected CAN interfaces with advanced acceptance filters. The system consists of four subsystems. Subsystems one and two act as sensor nodes: they acquire the sensor information, process the data and transmit it through the CAN bus to the third subsystem. The third subsystem, based on the LPC2129, receives the data from the sensor nodes, processes it and transmits it to the central receiver node via wireless ZigBee communication. Subsystem four is the central receiver system, which collects the data from the node receiver unit and forwards it to a data-logging server.
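For illustration only, sending a sensor reading as a CAN frame from a Linux host could look like the following python-can sketch; the channel name, arbitration ID and payload layout are assumptions, not details of the described ARM firmware.

```python
# Sending a temperature reading as a CAN frame (python-can sketch; the channel,
# arbitration ID and payload layout are illustrative assumptions).
import struct
import can

def send_temperature(bus, temperature_c: float) -> None:
    payload = struct.pack(">f", temperature_c)            # 4-byte big-endian float
    msg = can.Message(arbitration_id=0x120, data=payload, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # A SocketCAN interface "can0" is assumed to be configured on the host.
    with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
        send_temperature(bus, 36.5)
```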
When we consider a computer-based automatic facial feature extraction system that can identify faces, gestures, etc., and estimate gender, age, expressions and so on, the system demands a dependable, fast and reliable classification process. This paper presents an approach to extract effective features for a face detection and gender classification system. The proposed algorithm converts the RGB image into the YCbCr color space to detect the skin regions in the color image; a Gaussian-fitted skin color model is then used to obtain the likelihood of skin for any pixel of the image. For facial feature extraction we use Gabor filters at five scales and eight orientations, and to solve the classification problem the system deploys AdaBoost and SVM based classifiers. Biometrics is an advanced way of person recognition, as it establishes a more direct and explicit link with humans than passwords, since biometrics uses measurable physiological and behavioral features of a person. In various biometric applications, gender recognition from facial images plays an important role, and it has been successfully investigated here on image sequences. Gender recognition is important for a wide range of applications in the field of human-computer interaction. In this paper, we propose a gender recognition system based on neural networks. The system comprises two modules: a face detector and a gender classifier. The human faces are first detected and localized in the input image; each detected face is then passed to the gender classifier to determine whether it is male or female. Both the face detection and gender classification modules employ the same neural network architecture; however, the two modules are trained separately to extract different features for face detection and gender classification.
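For illustration, skin regions are often located in YCbCr space by simple chrominance thresholds; the sketch below uses commonly cited Cb/Cr ranges from the skin-detection literature, not the paper's Gaussian-fitted model.

```python
# Skin-region detection in YCbCr space (NumPy sketch). The Cb/Cr thresholds are
# commonly cited ranges, used here only for illustration.
import numpy as np

def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

image = np.random.randint(0, 256, (48, 48, 3), dtype=np.uint8)
mask = skin_mask(image)              # boolean map of likely skin pixels
```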
Automated 3d-Image Extraction in Brain MRI Using Geometric Transformable Prototype
D Chamundeshwari , J Thirupathi , J Srinivas
This paper describes a new method, called adaptive hybrid transformable geometric segmentation, that uses knowledge of tissue intensity properties and intensity inhomogeneities to correct and segment MR images. Use of the expectation-maximization (EM) algorithm leads to a method that allows more accurate separation of tissue types as well as better visualization of magnetic resonance imaging (MRI) data. We also describe an unsupervised fuzzy segmentation method, based on a new objective function, which appears well adapted and efficient for functional MRI data segmentation; the proposed segmentation method is more robust than the FCM and BCFCM algorithms. The proposed segmentation uses an automatic algorithm for robust white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) segmentation to facilitate accurate measurement of brain tissues. Both qualitative and quantitative results on synthetic and real brain MRI scans indicate superior and consistent performance. One popular family of brain tissue segmentation methods is based on normalizing the brain scans by registering (or aligning) them to a predefined reference view of brain tissues.
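As a simplified stand-in for the EM-based idea (not the paper's adaptive hybrid method), intensity-only tissue clustering with an EM-fitted Gaussian mixture can be sketched with scikit-learn as below, with three components loosely corresponding to CSF, GM and WM.

```python
# Intensity-based tissue clustering with an EM-fitted Gaussian mixture
# (scikit-learn sketch; a simplified illustration only).
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_tissues(slice_2d: np.ndarray, n_tissues: int = 3) -> np.ndarray:
    intensities = slice_2d.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=n_tissues, random_state=0).fit(intensities)
    labels = gmm.predict(intensities)
    # Relabel so that class index increases with mean intensity (CSF < GM < WM).
    order = np.argsort(gmm.means_.ravel())
    remap = np.zeros(n_tissues, dtype=int)
    remap[order] = np.arange(n_tissues)
    return remap[labels].reshape(slice_2d.shape)

synthetic_slice = np.random.randint(0, 255, (64, 64))
label_map = segment_tissues(synthetic_slice)
```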
Transient Analysis of Unreliable Server MX/G/1 Queue with Bernoulli Vacation Schedule and Second Optional Repair under Controlled Admissibility Policy
Deepa Chauhan
In this paper, we study a model that addresses control of the arrival process together with a second optional repair and a Bernoulli vacation schedule. The paper deals with an MX/G/1 queueing system in which, after completing a service, the server either goes on a vacation of random length with probability p or continues to serve the next customer, if any, with probability 1 - p. Both service times and vacation times follow general distributions. The server is subject to random breakdowns according to a Poisson process, followed by instantaneous repair; if the server cannot be restored by the first essential repair, a subsequent optional repair is needed. Both essential and optional repair times follow exponential distributions. Unlike the usual batch-arrival queueing model, there is a restriction over the admissibility of batch arrivals, so that not all arriving batches are allowed to join the queue at all times; the restricted admissibility policy differs between a busy period and a vacation period. We obtain the time-dependent probability generating functions in terms of their Laplace transforms and the corresponding steady-state results explicitly. In addition, performance measures such as the expected queue size and the expected waiting time of a customer are obtained, and numerical results for the various performance measures are displayed via graphs.
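As a purely illustrative companion, the Bernoulli vacation mechanism alone can be explored by Monte Carlo simulation; the sketch below deliberately omits batch arrivals, breakdowns, optional repair and the admissibility policy of the paper, and the distributions and parameters are assumed.

```python
# Monte-Carlo sketch of a much-simplified single-server queue with a Bernoulli
# vacation schedule: after each service completion the server takes a vacation
# with probability p. Only the vacation mechanism is illustrated.
import random

def mean_wait(lam=0.5, p=0.3, n_customers=200_000,
              service=lambda: random.expovariate(1.0),
              vacation=lambda: random.expovariate(2.0)):
    t_arrival, server_free, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        t_arrival += random.expovariate(lam)       # Poisson arrivals
        start = max(t_arrival, server_free)        # FIFO single server
        total_wait += start - t_arrival
        server_free = start + service()            # generally distributed service
        if random.random() < p:                    # Bernoulli vacation decision
            server_free += vacation()
    return total_wait / n_customers

print(round(mean_wait(), 3))
```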
Matching Anonymized User Profiles In Mobile Social Networks
M. Harini, M.Srilakshmi, Dr. S.PremKumar
In this project, we study user profile matching with privacy preservation in mobile social networks (MSNs) and introduce a new family of profile matching protocols. We first propose an explicit Comparison-based Profile Matching protocol (eCPM) that runs between two parties: a leader, who initiates the communication, and a communicator, who responds. The eCPM allows the leader to obtain the result of the attribute comparison between their profiles, while preventing disclosure of the attribute values. We then propose an implicit Comparison-based Profile Matching protocol (iCPM) that allows the leader to directly obtain some messages, rather than the comparison result, from the communicator. The messages, which are unrelated to the user profile, are divided into multiple categories by the communicator, and the leader implicitly chooses the category of interest, which remains unknown to the communicator.
Two messages per category are prepared by the communicator, and only one of them can be obtained by the leader, according to the result of the comparison on the candidate attribute. We additionally generalize the iCPM to permit complicated comparison criteria spanning multiple attributes, yielding the implicit Predicate-based Profile Matching protocol (iPPM). eCPM reveals the comparison result to the leader and provides only conditional anonymity, while iCPM provides full anonymity. We enhance eCPM into eCPM+ by combining it with a unique prediction-based adaptive anonymity amendment strategy.
Directive Contrast based Medical Image Fusion Using DWT
Mr.Yogesh Bute, Prof.V.N.Patil
Nowadays, multimodality medical image fusion has drawn a lot of attention, given the increasing rate at which multimodality medical images become available in many clinical application fields. The main motivation is to capture the most relevant information from the source images in a single output, which plays an important role in medical diagnosis. CT and MRI scans contain details of hard and soft tissues respectively: for medical diagnosis, CT provides better information on denser tissue with less distortion, while MRI offers better information on soft tissue with more distortion. In this paper, a fusion method for multimodal medical images based on the Discrete Wavelet Transform (DWT) is proposed, using a directive-contrast fusion rule. The source medical images are first transformed by the DWT, the low- and high-frequency components are then combined, and the inverse DWT is performed to reconstruct the fused image. Experimental results and a comparative study show that the proposed fusion framework provides an effective way to enable more accurate analysis of multimodality images.
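A minimal DWT-fusion sketch with PyWavelets is shown below for orientation; for simplicity it uses an average rule for the approximation band and a maximum-absolute rule for the detail bands, instead of the paper's directive-contrast rule.

```python
# DWT-based image fusion sketch (PyWavelets). Simplified fusion rules are used:
# averaging for the approximation band, max-absolute for the detail bands.
import numpy as np
import pywt

def fuse_dwt(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    cA1, details1 = pywt.dwt2(img_a.astype(float), wavelet)
    cA2, details2 = pywt.dwt2(img_b.astype(float), wavelet)
    fused_cA = (cA1 + cA2) / 2.0                              # low-frequency: average
    fused_details = tuple(
        np.where(np.abs(d1) >= np.abs(d2), d1, d2)            # high-frequency: max-abs
        for d1, d2 in zip(details1, details2)
    )
    return pywt.idwt2((fused_cA, fused_details), wavelet)

ct_like = np.random.rand(128, 128)
mri_like = np.random.rand(128, 128)
fused = fuse_dwt(ct_like, mri_like)
```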
Simulation and Analysis of 4 bit applications using 9T full adder
Maneesh Kumar Singh, Rajeev Kumar
The most timing-critical part of a logic design usually contains one or more arithmetic operations, in which addition is commonly involved. Addition is a fundamental arithmetic operation and the basis for operations such as multiplication, and the basic adder cell can be modified to function as a subtractor by adding another XOR gate. Therefore, the 1-bit full adder cell is the most important and basic block of the arithmetic unit of a system, and to improve the performance of a digital computer system one must improve the applications built on the basic 1-bit full adder cell. In this paper we simulate the performance of a 4-bit adder-subtractor, a 4-bit carry-skip adder and a 4-bit multiplier designed using the 9T full adder. All simulations use TSMC 0.18 µm CMOS technology.
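For context, the logic realized by a 1-bit full adder and its use in a 4-bit ripple-carry structure can be sketched behaviorally as below; this says nothing about the 9T transistor-level realization evaluated in the paper.

```python
# Behavioral sketch of a 1-bit full adder and a 4-bit ripple-carry adder built
# from it (logic level only).
def full_adder(a: int, b: int, cin: int):
    s = a ^ b ^ cin                          # sum bit
    cout = (a & b) | (cin & (a ^ b))         # carry-out
    return s, cout

def ripple_carry_add4(a: int, b: int, cin: int = 0):
    """Add two 4-bit numbers bit by bit; return (4-bit sum, carry-out)."""
    total, carry = 0, cin
    for i in range(4):
        s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        total |= s << i
    return total, carry

print(ripple_carry_add4(0b1011, 0b0110))     # 11 + 6 = 17 -> (0b0001, carry 1)
```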
Intelligent Driver Assistance System using Image Processing
Amruta Kulkarni
Road safety is an imperative issue to be considered towards the aim of decreasing road accidents and the fatalities thereof. The nature of traffic in India is identified to be highly heterogeneous. Considering this, an Intelligent Driver Assistance System is proposed in this paper. The said system, comprising three phases: Image Acquisition, Image Processing and Reporting, employs a sequence of efficient image processing algorithms. The three modules of the Image Processing phase work towards different aims to provide complete driver assistance in totality. These modules are: prediction of crash, least-congestion route suggestion and lane departure detection. The proposed system makes use of onboard sensors for image acquisition. The Canny edge detection algorithm is employed on the grayscale-converted images, yielding better results than other first-order derivative filters. In the crash prediction module, the system engages Kalman filters for trajectory analysis of the equipped vehicle and the surrounding vehicles in a defined proximity. The proposed system is superior to its counterparts by virtue of its integrated modules providing comprehensive driver assistance.
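For illustration, the grayscale conversion and Canny edge detection step could look like the following OpenCV sketch; the file name, blur kernel and hysteresis thresholds are illustrative assumptions rather than the system's actual parameters.

```python
# Grayscale conversion followed by Canny edge detection (OpenCV sketch).
import cv2

frame = cv2.imread("road_frame.jpg")                 # hypothetical acquired frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)       # convert to grayscale
blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)        # suppress sensor noise first
edges = cv2.Canny(blurred, 50, 150)                  # low/high hysteresis thresholds
cv2.imwrite("road_edges.png", edges)
```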
Dynamic Data Storage for Trustworthy Cloud
Prof. M.M.More, Mahesh R. Bhujbal, Ashitosh K. Chourasiya, Balwantsingh D. Chauhan, Pankaj D.Pawshe
Every day, many organizations face the problem of the nonstop growth of users and the ever-increasing demand for high-performance data storage. Dynamic data is expensive and important to an organization, and handling that data requires highly qualified and responsible people. A cloud service provider (CSP) offers a lease facility through the Storage-as-a-Service cloud model, which enables users to store their data in a secured form on remote servers. Storage as a Service (SaaS) helps to reduce cost and the maintenance burden at the organization's end, so organizations outsource their data to remote servers to minimize the burden of huge local data stores. The data owner creates the specific level of security, and if there is any misbehavior by the owner, the owner pays the CSP for it. Nowadays the adoption of cloud computing keeps growing worldwide, yet cloud computing raises many security issues related to securing data and examining how the cloud is utilized by users. In this paper, we use a reliable cloud storage infrastructure that provides data owners with all the facilities offered by the cloud and establishes mutual trust between the user and the cloud.
Our proposed system offers the following remarkable features:
It permits the owner to store dynamic and sensitive data and to perform full block-level dynamic operations (i.e. modification, insertion, deletion and append).
It permits registered users to access the owner's files and the latest version of the updated dynamic data.
It permits the owner to control access to the outsourced data (i.e. to grant and revoke access).
A Secured and Controlled Billing System by Agent for Cloud Computing Environment
Prof. M.M.More, Saurabh Amrutkar, Swati Daundkar, Rohit Jagtap, Rasika Parge
Day by day the use of cloud computing is increasing, so it is a challenging task for the cloud service provider to provide services and to maintain the usage data of the provided cloud storage services in a very secure way for bill generation. The transactions should be reliable, effective and trustworthy, with no interruption. In previous systems the cloud faces many security issues and is considered unreliable by the client because of inflexible communication between the CSP and the client. In this paper, we propose a billing system which provides an effective and trustworthy solution to such problems. The system uses the concept of a CTA for confirmation of billing: the CTA stores the information needed to resolve disputes between the client and the CSP in an efficient way, and this mediator is responsible for checking whether the services are provided according to the contract. The mediator helps the client and the CSP verify everything; acting as a third party, it is unbiased towards both the CSP and the client.
Implementation Aspects to Secure Critical Data in Public Cloud Network Using OPNET Simulator
Kajal Singhai, Rajesh Kumar Chakrawarti
Public clouds allow the public to access applications, software and platforms at low cost without any dedicated security mechanism, but within such deployments critical data sometimes requires privacy from global users. At present, the only existing mechanism used to protect critical data from attackers is the firewall, combined with the creation of a virtual private network, which is a costly mechanism for short periods. This paper describes public cloud networks, critical data, the existing mechanism, and the proposed architecture with its implementation tool.
Wireless Data Acquisition and Transmission System Design Using ARM9
N. Anjali, D. Prashanth, B. Kishore Babu
Generally, industries use traditional network communications such as RS232 and RS485, which are limited to short distances, and data acquisition based on a single chip has limited processing capacity and real-time reliability. With the development of ARM processors, efficient data acquisition and control can be achieved in various fields. The remote I/O data acquisition system developed in this project measures temperature, humidity and gas using WSN (wireless sensor network) technology. Wi-Fi is a wireless technology widely used in mobile devices, workplaces, homes and computer systems all around the world; it is a spread-spectrum, OFDM-based radio technology used to build wireless LANs. Wi-Fi is based on IEEE 802.11 and certified by the Wi-Fi Alliance, so the term is often used as a synonym for IEEE 802.11. This system is developed on the ARM9 (S3C2440A) hardware platform combined with a Wi-Fi network. It consists of two nodes: a transmitter node and a receiver node. The transmitter node acquires sensor information, processes the data and transmits it through Wi-Fi to the receiver node. The receiver node, also implemented on ARM9, receives the data from the sensor node, processes it, retransmits the controlling signal information via Wi-Fi communication and displays the sensor information on a PC-based GUI implemented using MATLAB. The Keil IDE software is used to build the hex files for the C programs.
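Purely as an illustration of streaming sensor readings over a Wi-Fi link, a minimal TCP sender is sketched below; the receiver address, port and JSON payload format are assumptions, not details of the ARM9 firmware.

```python
# Minimal TCP sketch of the transmitter node streaming sensor readings over
# Wi-Fi to the receiver node (address, port and payload format are assumed).
import json
import socket
import time

RECEIVER_ADDR = ("192.168.1.50", 5000)     # hypothetical receiver node address

def read_sensors():
    """Placeholder for the real temperature/humidity/gas acquisition."""
    return {"temperature_c": 25.4, "humidity_pct": 48.0, "gas_ppm": 120}

def run_transmitter(n_samples: int = 10, period_s: float = 1.0) -> None:
    with socket.create_connection(RECEIVER_ADDR, timeout=5) as sock:
        for _ in range(n_samples):
            packet = json.dumps(read_sensors()).encode() + b"\n"
            sock.sendall(packet)           # one JSON line per reading
            time.sleep(period_s)

if __name__ == "__main__":
    run_transmitter()
```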