Effective Allocation of IPTV Services Through Virtualization in Cloud
Tulasi Arjunappa, Dorababu Sudarsa
Virtualized cloud-based services can exploit statistical multiplexing across applications to deliver significant cost savings to the operator. However, achieving similar benefits with real-time services can be a challenge. IPTV is a system in which a digital television service is delivered using the Internet Protocol over a network infrastructure, which may include delivery over a broadband connection. For residential users, IPTV is often provided in conjunction with Video on Demand and may be bundled with Internet services such as Web access and VoIP. The commercial bundling of IPTV, VoIP and Internet access is referred to as Triple Play; adding a mobile voice service leads to the Quadruple Play capability. IPTV is generally supplied by a broadband operator using a closed network infrastructure. This closed-network approach is in competition with the delivery of TV content over the public Internet, a type of delivery widely called TV over Internet or Internet television. In businesses, IPTV may be used to deliver television content over corporate LANs and business networks.
A Review on Region Detection And Matching For Object Recognition
Manpreet Kaur, Silky Narang
Object recognition is based on objective and subjective dimensions, where the objectivity is guided by parameters like colour, texture, shape, size and scale, whereas the subjectivity is guided by the perception and cognition of the image interpreters. In this paper we aim to study various object recognition methods based on region detection and matching. The motivation is to search for and propose a method which can identify similar and significant objects in different images with the best accuracy and the least execution time.
Trust Based Opportunistic Routing Protocol for VANET Communication
Mayuri Pophali, Shraddha Mohod, T.S.Yengantiwar
Vehicular ad-hoc networks (VANETs) have gained importance in the networking area because of their special characteristics: high mobility, self-organization, and no restrictions on network size. These characteristics make the VANET environment challenging for developing efficient routing protocols. For better network performance, VANETs require special support that makes the network fast, secure and efficient; the best solution for this is opportunistic routing (OR). This paper builds a trusted opportunistic forwarding model in VANET, incorporating a trust mechanism into OR to enhance routing security and protect the network from malicious attacks, with the goal of throughput, delay and security better than existing protocols. The TMCOR and TOMCOM routing protocols are proposed, which are trusted minimum-cost opportunistic unicast and multicast routing protocols respectively.
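As an illustration of the trusted forwarding idea described above, here is a minimal sketch of trust-filtered next-hop selection. The trust threshold and the cost model are hypothetical placeholders, not the TMCOR/TOMCOM formulation from the paper:

```python
def select_forwarder(candidates, trust, min_trust=0.5):
    """Trusted opportunistic forwarding sketch: drop candidate
    forwarders whose trust value is below a threshold, then pick
    the remaining candidate with the lowest expected cost.

    candidates: list of (node, cost) pairs
    trust: dict mapping node -> trust value in [0, 1]
    """
    trusted = [(n, c) for n, c in candidates if trust.get(n, 0.0) >= min_trust]
    if not trusted:
        return None  # no trustworthy forwarder available
    return min(trusted, key=lambda nc: nc[1])[0]
```

Here node "b" has the lower cost but is excluded as untrusted, so "a" is chosen.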
Dynamic Query Forms for Non-Relational Database Queries
S.BhaskaraNaik, B.VijayaBhaskar Reddy
Modern scientific databases and web databases maintain large and heterogeneous data. These real-world databases contain hundreds or even thousands of relations and attributes. Traditional predefined query forms are not able to satisfy the various ad-hoc queries users pose on those databases. This paper proposes DQF, a novel database query form interface which is able to dynamically generate query forms. The essence of DQF is to capture a user's preference and rank query form components, assisting him/her to make decisions. The generation of a query form is an iterative process guided by the user. At each iteration, the system automatically generates ranking lists of form components and the user then adds the desired form components into the query form. The ranking of form components is based on the captured user preference. A user can also fill in the query form and submit queries to view the query result at each iteration. In this way, a query form can be dynamically refined until the user is satisfied with the query results. We utilize the expected F-measure for measuring the goodness of a query form, and a probabilistic model is developed for estimating this goodness in DQF. Our experimental evaluation and user study demonstrate the effectiveness and efficiency of the system.
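The F-measure-based ranking described above can be sketched as follows. The precision/recall estimates are assumed to come from the probabilistic user-preference model; the component names are illustrative, not from the paper:

```python
def f_measure(precision, recall, beta=1.0):
    """Standard F-measure: harmonic-style mean of precision and recall."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

def rank_components(candidates):
    """Rank candidate form components by the expected F-measure of
    the query form after adding each one.

    candidates: list of (name, estimated_precision, estimated_recall)
    """
    scored = [(name, f_measure(p, r)) for name, p, r in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)
```

The top-ranked components would then be offered to the user at each iteration.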
Detection and Prevention Of Black Hole Using Clustering In MANET Using Ns2
Gurnam Singh, Gursewak Singh
A MANET contains wireless mobile nodes that communicate together without any network infrastructure or central base station. This is the reason it is widely used in areas that have limited infrastructure; large groups of people can communicate fruitfully through the mobile nodes of a MANET. Nodes in MANETs are autonomous and self-managed in the absence of infrastructure. Mobile ad-hoc networks are exposed to numerous attacks because of (a) their dynamic behavior and (b) the fact that any node can join and leave the network at any time. A black hole node is a malicious node that drops packets in the network by giving a false reply to any route request, even though it holds no path to the destination. The existing method identifies the black hole attack based upon the sequence number in the RREP message.
The proposed method eradicates the malicious black hole node at the distributed level. The NS2 tool is used to implement our methodology. Simulation results show an increased detection rate of malicious nodes, which leads to improved network performance by lowering the packet drop ratio.
A Framework for Polarity Classification and Emotion Mining from Text
Sanjeev Dhawan, Kulvinder Singh, Vandana Khanchi
Nowadays online social networks are so popular that they have become a major component of an individual's social interaction. They are also emotionally rich environments where users share their emotions, feelings, ideas and thoughts. In this paper, a novel framework is proposed for characterizing emotional interactions in social networks. The aim is to extract the emotional content of texts in online social networks: to determine whether a text is an expression of the writer's emotions and, if so, which emotion it conveys, such as happiness, sadness, anger, disgust, fear or surprise. For this purpose, text mining techniques are applied to comments/messages from a social network. The framework provides a model for the data collection, feature generation, data preprocessing and data mining steps. In general, the paper presents a new perspective for studying the expression of emotions in online social networks. The technique adopted is unsupervised; it mainly uses the k-means clustering algorithm and the nearest neighbor algorithm. Experiments show high accuracy for the model in both determining the subjectivity of texts and predicting emotions.
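A minimal sketch of the nearest-neighbour assignment step of such a pipeline, using cosine similarity between a bag-of-words message vector and emotion "centroids". The seed lexicon is an invented illustration, not the paper's feature set, and the full k-means stage is omitted:

```python
from collections import Counter
import math

# Illustrative seed lexicon standing in for learned cluster centroids.
EMOTION_SEEDS = {
    "happy": ["happy", "joy", "great", "love"],
    "sad": ["sad", "cry", "miss", "alone"],
    "angry": ["angry", "hate", "furious"],
}

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def predict_emotion(text):
    """Assign the emotion whose seed vector is nearest to the text."""
    vec = Counter(text.lower().split())
    seeds = {e: Counter(words) for e, words in EMOTION_SEEDS.items()}
    scores = {e: cosine(vec, s) for e, s in seeds.items()}
    return max(scores, key=scores.get)  # ties fall back to dict order
```

In practice the centroids would come from the unsupervised clustering stage rather than a hand-written lexicon.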
A new automatic finite element mesh generation scheme of all quadrilaterals over an analytical curved surface by using parabolic arcs
H.T. Rathod , Bharath Rathod, K.V.Vijayakumar , K. Sugantha Devid
This paper presents a new mesh generation method for a simply connected curved domain of a planar region whose curved boundary is described by one or more analytical equations. We first decompose this curved domain into simple subregions in the shape of curved triangles. These simple regions are then triangulated to generate a fine mesh of linear triangles in the interior and curved triangles near the boundary of the curved domain. We then propose an automatic triangular-to-quadrilateral conversion scheme. Each isolated triangle is split into three quadrilaterals according to the usual scheme, adding three vertices in the middle of the edges, which are either straight segments or curved arcs, and a vertex at the barycentre (the point located at the average of the three vertices) of the element. We approximate the curved arcs by equivalent parabolic arcs. To preserve mesh conformity, a similar procedure is also applied to every triangle of the domain to fully discretize the given curved domain into all quadrilaterals, thus propagating uniform refinement. This simple method generates a high-quality mesh whose elements conform well to the requested shape by refining the problem domain. Examples on a circular disk, on a cracked circular disk and on a lunar model are presented to illustrate the simplicity and efficiency of the new mesh generation method. We have appended the MATLAB programs which incorporate the mesh generation scheme developed in this paper. These programs provide valuable output on the nodal coordinates, element connectivity and graphic display of the all-quadrilateral mesh for application to finite element analysis.
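The triangle-to-quadrilateral split described above (three edge midpoints plus the barycentre yielding three quadrilaterals) can be sketched for a straight-sided triangle as follows; the handling of parabolic arcs from the paper is omitted:

```python
def split_triangle_to_quads(a, b, c):
    """Split triangle abc into three quadrilaterals by inserting the
    midpoints of the three edges and the barycentre (the average of
    the three vertices). Each returned quad is a 4-tuple of points
    (vertex, edge midpoint, barycentre, edge midpoint)."""
    mid = lambda p, q: ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    mab, mbc, mca = mid(a, b), mid(b, c), mid(c, a)
    g = ((a[0] + b[0] + c[0]) / 3.0, (a[1] + b[1] + c[1]) / 3.0)
    return [(a, mab, g, mca), (b, mbc, g, mab), (c, mca, g, mbc)]
```

Applying this to every triangle of a conforming triangular mesh produces an all-quadrilateral mesh, since neighbouring triangles share the same edge midpoints.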
Design and Development of a Filtration Tool for an Online Job Recruitment Portal
Ayangbekun, Oluwafemi J. , Yusuf, Maryam O.
Online recruitment provides a number of benefits such as reduced advertising cost, unlimited access to job adverts, sophisticated management tools and easier interaction between advertiser and jobseekers. But every system with merits also has demerits. The main demerit of this system is that it does not distinguish between qualified and unqualified job applications sent to the advertiser, which clutters the advertiser's profile or mail, depending on where applications are received. The aim of this study is to provide a filtration tool, an online test containing questions from the Graduate Management Admission Test, in order to distinguish qualified from unqualified applications and determine the eligibility of the jobseeker for the position, and to provide a common profile for jobseekers, such as a basic curriculum vitae (CV), to make the shortlisting process easier for advertisers rather than looking through thousands of CVs in different formats.
The work included the design of an online job recruitment portal, with PHP, HTML, JavaScript, CSS and MySQL as the languages used to develop the system. The system is a web-based application that links job seekers to job adverts of their choice and allows them to apply for the position(s). If the test is passed, the job seeker's application is sent to the advertiser for shortlisting, and the advertiser then picks the preferred candidate for the position.
Iris Segmentation & Recognition in Unconstrained Environment
Sonia, Arpit Bansal
Recently, iris recognition systems have gained increased attention, especially in non-cooperative environments. One of the crucial steps in an iris recognition system is iris segmentation, because it significantly affects the accuracy of the feature extraction and iris matching steps. In this paper, a new algorithm is proposed to segment iris images captured at visible wavelengths in an unconstrained environment. The proposed algorithm uses various types of filters to smooth the iris image, and it reduces the error percentage even in the presence of noise such as iris obstruction and specular reflection. The algorithm starts with the acquired iris image and determines the expected region of the iris using the Fuzzy C-means clustering technique. Canny edge detection is used to detect the edges of the iris or eye, and the Hough transform is employed to estimate the iris radius and center. After edge detection and the Hough transform, a new efficient algorithm is developed to detect the upper and lower eyelids. Finally, the non-iris regions are removed; the results of the proposed algorithm on our iris image database demonstrate that it improves segmentation accuracy and time.
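The circular Hough transform step mentioned above can be sketched as a voting accumulator. This is a generic textbook version for a known radius, not the paper's algorithm; a full pipeline would also scan candidate radii and operate on real Canny edge maps:

```python
import math
from collections import defaultdict

def hough_circle_center(edge_points, radius):
    """Circular Hough voting for a known radius: every edge point
    votes for all candidate centres lying `radius` away from it,
    and the most-voted cell is returned as the circle centre."""
    votes = defaultdict(int)
    for (x, y) in edge_points:
        for deg in range(0, 360, 5):  # coarse angular sampling
            a = round(x - radius * math.cos(math.radians(deg)))
            b = round(y - radius * math.sin(math.radians(deg)))
            votes[(a, b)] += 1
    return max(votes, key=votes.get)
```

For iris segmentation the estimated centre and radius localize the pupil/iris boundaries before eyelid detection.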
Big data and cloud computing are both among the most popular topics of recent times for organizations across the globe. Big data is a rapidly expanding research area spanning the fields of computer science and information management, and it has become a ubiquitous term in understanding and solving complex problems in disciplines such as engineering, applied mathematics, medicine, computational biology, healthcare, social networks, finance, business, education, transportation and telecommunications. Cloud computing is a service delivered over the internet for computation, data access and storage, offering scalability, flexibility and minimal cost. It is a next-generation platform for computation which offers various services and applications to the user without the user physically acquiring them.
Multiscale Modeling For Image Analysis of Brain Tumor Detection And Segmentation Using Histogram Thresholding
Vani Vijayarangan
Brain-image-based modeling of tumor growth combines techniques from tumor simulation and medical imaging. In this context, we present a new technique to adapt a healthy brain atlas to MRIs of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling in combination with registration procedures is employed. In a first step, the tumor is grown in the atlas based on a novel multiscale, multiphysics model coupling growth simulation from the cellular level to the biomechanical level, accounting for cell proliferation and tissue deformation. Large deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Knowledge of the size of a tumor plays a significant role in the treatment of malignant tumors, and manual segmentation of brain tumors from magnetic resonance images is a demanding and time-consuming task. This article presents a new method for the detection of tumors in the brain using segmentation and histogram thresholding. The proposed technique can be effectively applied to detect the contour of the tumor and its geometrical dimensions, and can prove to be a versatile tool for practitioners, especially the surgeons working in this field.
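Histogram thresholding, the segmentation primitive named above, needs a way to pick the threshold. One standard automatic choice (not necessarily the authors' exact procedure) is Otsu's method, which maximizes the between-class variance of the grey-level histogram:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the grey level that maximizes the
    between-class variance of the histogram, separating background
    from the brighter candidate (e.g. tumor) region."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, 0.0
    w0 = 0       # pixels at or below threshold t
    cum = 0.0    # grey-level mass at or below t
    for t in range(levels):
        w0 += hist[t]
        cum += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        m0 = cum / w0                           # mean of lower class
        m1 = (total_sum - cum) / (total - w0)   # mean of upper class
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Pixels above the returned threshold would then be kept as the candidate tumor mask before contour extraction.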
Adaptive Data Rate Oriented Speed and Capacity Based Vertical Handoff Algorithm
Apurva Agarwal, Ravi Sankar Shukla, Pradeep Kumar Keshwani
In a wireless cellular environment the handoff algorithm determines when handoff will occur and to which base station. Heterogeneous networks such as UMTS, WLAN and WiMAX, where vertical handoff is required, further challenge the cellular network. This paper provides an adaptive multiple-attribute vertical handoff algorithm which allows wireless networks to select a mobile terminal using fuzzy logic concepts. The main contribution of this paper is to propose adaptive data-rate-oriented, speed- and capacity-based vertical handoff schemes which consider not only velocity and signal strength but also network bandwidth utilization to achieve load balancing, named adaptive data rate adjustment. This enables the user to achieve quality of service in a heterogeneous network and provides a free-moving experience. A performance study using a heterogeneous network with UMTS, WLAN and WiMAX as an example shows that our proposed vertical handoff algorithm is able to determine when handoff is required and selects the best access network, optimized for network conditions, quality-of-service requirements, user preference, service cost and mobile terminal conditions.
Performance Analysis and Study of Audio Watermarking Algorithms
Seethal Paul, Sreelakshmi T.G
An audio watermark is a signature, embedded within the original signal, which should be inaudible to the human ear and resistant to any attempts to remove it. All audio watermarking schemes share various parameters, in particular robustness, security, transparency, complexity, and capacity. This research work aims to devise a watermarking algorithm that improves robustness, imperceptibility and speed without affecting the quality of the original host signal. In echo hiding, the signal-to-noise ratio obtained after watermark embedding is 15.3521 and the BER is 6.25%. For LSB coding, the SNR obtained for the watermarked signal is 33.8329 and the BER is 6.25%. For chirp spread spectrum, the SNR obtained from simulation results is 32.9604 and the error rate is 2.50%.
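The SNR and BER figures quoted above are the standard evaluation metrics for watermarking, and can be computed as follows (a generic sketch, not the paper's test harness):

```python
import math

def snr_db(original, watermarked):
    """Signal-to-noise ratio in dB of a watermarked signal relative
    to the original host: the embedding distortion is the 'noise'."""
    sig = sum(x * x for x in original)
    noise = sum((x - y) ** 2 for x, y in zip(original, watermarked))
    return float("inf") if noise == 0 else 10 * math.log10(sig / noise)

def ber(sent_bits, recovered_bits):
    """Bit error rate between embedded and extracted watermark bits."""
    errors = sum(a != b for a, b in zip(sent_bits, recovered_bits))
    return errors / len(sent_bits)
```

Higher SNR means a less audible watermark; lower BER means more robust extraction.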
Data dissemination and effective Path Management on Wireless Sensor Network
Tarun Mahajan, Sheetal
Energy efficiency is very important in wireless sensor networks. When data is transmitted through a WSN, an ant-based algorithm is typically followed, and the same data may be transmitted multiple times. To solve this problem, selective data transmission can be used. We suggest a range-based algorithm rather than an ant-based algorithm: in the ant-based algorithm the major effort goes toward validating the path, but the method involved uses too much energy and too many resources, whereas the range-based algorithm uses the concept of signal strength.
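The abstract does not give the range-based algorithm's details; the usual way to turn signal strength into a range estimate is the log-distance path-loss model, sketched here with illustrative defaults:

```python
def estimate_range(rssi_dbm, rssi_at_1m_dbm=0.0, path_loss_exp=2.0):
    """Estimate node distance (metres) from received signal strength
    using the log-distance path-loss model:
        RSSI(d) = RSSI(1 m) - 10 * n * log10(d)
    The reference RSSI and path-loss exponent n are environment
    dependent; the defaults here are illustrative only."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exp))
```

A node could use such estimates to restrict forwarding to neighbours within a chosen range instead of probing paths with ant agents.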
A survey and comparative analysis on different algorithms for Blind Source Separation
Purnima Mittal
Blind separation of speech signals was the original purpose of the BSS problem, which has become a focus of signal processing research in recent years. Separation of speech has important theoretical significance in voice communications, acoustic target detection, etc. Blind source separation is a well-established signal processing problem; the sources to be estimated must present some diversity in order to be efficiently retrieved. Assuming the transmitted signals are mutually independent in a linear multiple-input multiple-output (MIMO) memoryless system, the transmitted signal is subjected to additive white Gaussian noise. The received signals are hence corrupted by inter-user interference (IUI), and we can model them as the outputs of a linear MIMO memoryless system. In this paper, we survey different techniques of blind signal separation.
Dynamic histogram equalization, PCA & MAX-DCT based multi-focus image fusion
Shivdeep kaur, Dr. Rajiv Mahajan
The idea of image fusion in multi-focus cameras is to combine data from various images of the same scene in order to produce a fully focused image. Discrete cosine transform (DCT) based image fusion methods are more appropriate and acceptable for real-time systems using DCT-based standards for still images or video. This review paper presents an organized approach to the fusion of multi-focus images based on the variance calculated in the DCT domain. A new technique is proposed which combines PCA, max-DCT and dynamic histogram equalization to improve the outcome. The proposed algorithm is implemented in MATLAB using the image processing toolbox, and experiments show that it outperforms the available techniques.
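The variance-driven selection at the heart of such fusion schemes can be sketched as follows. For brevity this uses plain pixel variance per block; the reviewed methods compute the variance on DCT coefficients of 8x8 blocks instead:

```python
def variance(block):
    """Population variance of a flat list of pixel values."""
    n = len(block)
    m = sum(block) / n
    return sum((x - m) ** 2 for x in block) / n

def fuse_blocks(blocks_a, blocks_b):
    """For each pair of co-located blocks from the two source images,
    keep the block with the higher variance: in multi-focus fusion
    the sharper (in-focus) block shows more local variation."""
    return [a if variance(a) >= variance(b) else b
            for a, b in zip(blocks_a, blocks_b)]
```

The chosen blocks are then reassembled into the fused image; the proposed method additionally applies PCA and dynamic histogram equalization around this core step.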
A Review on Performance Improvement of Wireless System via Smart Antennas
Chander Shekher, Preeti Gulati
Smart antennas have made great progress in the field of communication. There are different working principles for the different types of antennas in use, and each type has its own advantages and applications, in areas such as wireless network systems and OFDMA. Smart antennas enhance the capacity of mobile and cellular systems, and they help SDMA (space division multiple access) to increase range. Multipath mitigation reduces the errors caused by multipath fading, and smart antennas also offer the significant advantage of very high security.
Detection and Isolation of Packet Droppers in Wireless Networks
Dr.K.Kungumaraj , J.Vijayabharathi
Wireless ad hoc networks realize end-to-end communications in a cooperative manner. In this paradigm, multiple nodes coordinate to form a multi-hop route, when communication needs to take place between a source and a destination that are not within communication range. Thus, intermediate nodes are willing to carry traffic other than their own. For ad hoc networks deployed in hostile environments, a protocol-compliant behavior on behalf of all nodes of the network cannot be assumed. Selfish and/or malicious users may misconfigure their devices to refuse forwarding any traffic, in order to conserve energy resources or degrade the network performance. We address the problem of identifying and isolating misbehaving nodes that refuse to forward packets. We develop a comprehensive system called Audit-based Misbehavior Detection (AMD) that effectively and efficiently isolates both continuous and selective packet droppers. The AMD system integrates reputation management, trustworthy route discovery, and identification of misbehaving nodes based on behavioral audits. It consists of three modules: the reputation module, the route discovery module, and the audit module. All three modules are tightly integrated to ensure that multi-hop communications take place over paths free from malicious nodes.
Automatic Discovery of Personal Name Aliases from the Web Using Lexical Pattern-Based Approach
Ms. Trupti M. Marawar , Prof. Wyomesh Deepanker , Prof. Kailash Patel
An individual is typically referred by numerous name aliases on the web. Accurate identification of aliases of a given person name is useful in various web related tasks such as information retrieval, sentiment analysis, personal name disambiguation, and relation extraction. We propose a method to extract aliases of a given personal name from the web. Given a personal name, the proposed method first extracts a set of candidate aliases. Second, we rank the extracted candidates according to the likelihood of a candidate being a correct alias of the given name. We propose a novel, automatically extracted lexical pattern-based approach to efficiently extract a large set of candidate aliases from snippets retrieved from a web search engine. We define numerous ranking scores to evaluate candidate aliases using three approaches: lexical pattern frequency, word co occurrences in an anchor text graph, and page counts on the web. To construct a robust alias detection system, we integrate the different ranking scores into a single ranking function using ranking support vector machines.
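The lexical-pattern frequency score described above can be sketched as a simple count of pattern-mediated co-occurrences in search snippets. The patterns and the string extraction here are deliberately simplified illustrations, not the automatically extracted patterns of the paper:

```python
from collections import Counter

def score_aliases(snippets, name,
                  patterns=("also known as", "aka", "alias")):
    """Count how often each candidate alias follows the target name
    via a lexical pattern such as '<name> aka <alias>'. Returns the
    candidates sorted by pattern frequency, highest first."""
    counts = Counter()
    for snip in snippets:
        low = snip.lower()
        for pat in patterns:
            marker = f"{name.lower()} {pat} "
            idx = low.find(marker)
            if idx != -1:
                # crude alias extraction: text up to the next period
                alias = low[idx + len(marker):].split(".")[0].strip()
                counts[alias] += 1
    return counts.most_common()
```

In the full system this frequency score is one of several features (anchor-text co-occurrence, page counts) combined by a ranking SVM.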
Recent studies in the fields of computer vision and pattern recognition show a great amount of interest in content retrieval from images and videos. This content can be in the form of objects, color, texture and shape, as well as the relationships between them. The semantic information provided can be useful for content-based image retrieval as well as for indexing and classification purposes. Text data is particularly interesting because text can easily and clearly describe the contents of an image. Since the data is embedded in an image or video in different font styles, sizes, orientations and colors, and against complex backgrounds, the problem of extracting text becomes a challenging one. In this paper we present a method to extract text from images with complex backgrounds.
Degradation of Keratinous Waste Products by Keratinolytic Bacteria Isolated from soil
Harison Masih , Sandeep Singh
Keratins are the most abundant proteins in the epithelial cells of vertebrates and represent the major constituents of skin and its appendages such as nails, hair, feathers, and wool. Worldwide, poultry processing plants produce millions of tons of feathers as a waste product annually, consisting of approximately 90% keratin. These feathers constitute a sizable waste disposal problem, and several different approaches have been undertaken to dispose of these feather wastes. In the present work, soil samples were collected from barber shop and chicken shop waste dumps in Allahabad. Among the bacterial isolates, three strains (S-1, S-2 and S-3) showed evidence of keratin degradation. Strains S-1 and S-2 were identified as B. subtilis and strain S-3 was identified as B. licheniformis. Keratin degradation was evident on the feathers, sheep wool and hair used as substrates. Strain S-3 was found to be the best strain for the degradation of keratin waste.
Multi-Level Encryption using SDES Key Generation Technique with Genetic Algorithm
S.Devi, Dr.V.Palanisamy
Data transmission in real-time environments is carried out by trustworthy parties, but over internet media it must rely on secure communication channels. The growth of the internet has multiplied the security issues involved in exchanging information electronically. Cryptography is the process of scrambling data into an unknown format, done with the help of encryption and decryption algorithms. The two basic ideas behind cryptographic techniques are substitution and transposition. The Caesar cipher substitution method is used to alter the plaintext characters with the help of an automatic key. This paper presents a multi-stage encryption algorithm; at the end of each stage an intermediate cipher is produced. The transposition is performed using the crossover method of genetic algorithms. The final ciphertext is derived from the combined effect of basic arithmetic and logic operations.
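The two building blocks named above, Caesar substitution and a crossover-style transposition, can be sketched as one encryption stage. The single-point crossover and the key handling here are illustrative; the paper's multi-stage scheme and SDES key generation are not reproduced:

```python
def caesar(text, key):
    """Substitution stage: shift each letter by the key, preserving case."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

def crossover(text, point):
    """Transposition stage borrowed from genetic algorithms: a
    single-point crossover that swaps the two halves of the text."""
    return text[point:] + text[:point]

def encrypt(plaintext, key, point):
    return crossover(caesar(plaintext, key), point)

def decrypt(ciphertext, key, point):
    # Undo the crossover (rotate back), then shift letters back.
    return caesar(crossover(ciphertext, len(ciphertext) - point), -key)
```

Chaining several such stages with different keys yields the intermediate ciphers the abstract describes.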
Mobile Ad Hoc Networks (MANETs) make use of advanced network routing protocols that categorize the nodes and manage the communication of packets. ALERT (Anonymous Location-based Efficient Routing Protocol) has remained highly efficient in terms of efficiency and maintenance of consistency. The ALERT protocol is here applied to packet communication to achieve secure encryption and decryption of data, which can also help achieve a high degree of security within the network. ALERT based on cryptography proves to be reliable for multipath transmission in networks of varied complexities.
Comparative Study on Thresholding Techniques of Discrete Wavelet Transform (DWT) to De-noise Corrupted ECG Signals
Mitra DJ, Shahjalal M, Kiber MA
Electrocardiogram (ECG) signals, which are recordings of bioelectric potentials caused by rhythmic cardiac activity, are usually corrupted by various sources of noise. In this paper, a Discrete Wavelet Transform (DWT) based technique is used to de-noise ECG signals. A comparative study is made of soft and hard thresholding of the DWT for several ECG signals at different levels of signal-to-noise ratio (SNR). The SNR improvement (SNRimp) and Percent Root-mean-square Difference (PRD (%)) are analyzed. The results show that the soft-thresholding technique performs better than hard thresholding for de-noising ECG signals.
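The two thresholding rules compared in the paper act on the wavelet detail coefficients and can be stated directly (the DWT itself and the threshold selection are omitted here):

```python
def hard_threshold(coeffs, t):
    """Hard thresholding: keep coefficients whose magnitude exceeds
    the threshold, zero the rest."""
    return [c if abs(c) > t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Soft thresholding: additionally shrink surviving coefficients
    toward zero by t, removing the jump at |c| = t that hard
    thresholding leaves (which is why it tends to de-noise ECG
    more smoothly)."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0
            for c in coeffs]
```

Reconstructing the signal from the thresholded coefficients completes the de-noising step.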
Interactive Disk Platform for Teaching Kinematics of Machines
Akande Theresa T, Momoh John J
A course disk in either CD or DVD format can be used as a teaching aid. The potential of DVD technologies has been disregarded by most of the higher education sector, especially in Nigeria. Educational research on new information and communication technologies centers more on analyzing the educational potential of products that dominate the marketplace than on developing tools particularly for teaching and learning. In Nigeria today, one of the cardinal yet ambiguous tasks in education is the evolution and employment of teaching and learning aids that effectively channel information in a format conducive to students' preferred learning styles. Because students need teaching that provides a better understanding of conceptual theory and of computational work that is tedious manually, this platform has been developed to provide various animations, images, e-books and online resources on the kinematics of machines, using advanced multimedia tools. All the teaching and learning materials acquired for kinematics of machines have been put in digital format and a DVD-R has been produced which contains all of the kinematics-of-machines software and text materials. All the video material has been digitized and various compression options are being investigated. The interactive platform is easy to integrate, and access is provided for students through a common icon-based interface. This platform will help overcome the concerns of many academics about the Internet and the difficulty of integrating high-quality media with a web site.
Ontology is an essential technology in data and knowledge engineering, because ontologies provide many advantages over object-oriented concepts, such as knowledge sharing, reusability, interoperability, and knowledge-level validation and verification. An ontology is a collection of concepts that represent knowledge in a domain, together with a common terminology providing the types, methods and relationships between those concepts. Ontologies are used as structural frameworks in many fields such as artificial intelligence, information science and the semantic web. This work presents concept identification for ontology building through automatic and semi-automatic processes. Most ontology learning techniques are developed using classifiers, NLP, and probabilistic and statistical learning; for concept identification we use statistical learning applied to text.
WordNet is a dictionary of word meanings/concepts, so there must be a standard representation of the concepts in order to simulate a lexical matrix on a machine. Marathi is an Indo-Aryan language spoken by about 71 million people, mainly in the Indian state of Maharashtra and neighboring states. In WordNet, which is basically a semantic network, the different lexical categories of words (nouns, verbs, ...) are organized into 'synsets' (sets of synonyms). Each synset represents a lexical concept, and synsets can be linked by different types of relations (hypernymy, antonymy, etc.). The WordNet for Marathi contains 36,842 unique words grouped into more than 26,988 synsets, of which 24,398 are linked [1]. We would like to increase the number of unique words, with special effort on words oriented toward Marathi culture, for example cultural words such as Lawani, Abhang, Fugaddi, Ringan, etc.
Improvement in Divide and Conquer Scheme Using Relay Nodes
Archana Kumari, Vishal Walia, Rahul Malhotra
A wireless sensor network is a network of a large number of sensor nodes which can perform sensing, computation, and transmission of sensed data. Long-distance transmission by sensor nodes is not energy efficient. One approach to extending the network lifetime while protecting network connectivity is to set up a small number of costly but more powerful relay nodes whose main task is to communicate with other sensor or relay nodes. There are many issues in wireless sensor networks, such as energy consumption, node deployment and network lifetime. In this paper, we focus on the divide-and-conquer scheme, which helps to increase throughput and reduce packet loss in the network, and propose a new technique to overcome these problems. Experimental results show the throughput and packet loss of all the regions.
Low Power Booth Multiplier Using Radix-4 Algorithm On FPGA
Prof. V. R. Raut , P. R. Loya
As the scale of integration keeps growing, more and more sophisticated signal processing systems are being implemented on a VLSI chip. These signal processing applications not only demand great computation capacity but also consume considerable amounts of energy. While performance and area remain two major design goals, power consumption has become a critical concern in today's VLSI system design. Multiplication is a fundamental operation in most arithmetic computing systems, and multipliers have large area, long latency and considerable power consumption. Previous work on low-power multipliers focuses on low-level optimizations and has not considered well the arithmetic computation features and application-specific data characteristics. The binary multiplier is an integral part of the arithmetic logic unit (ALU) subsystem found in many processors. Booth's algorithm and others such as the Wallace tree suggest techniques for multiplying signed numbers that work equally well for both negative and positive multipliers. This work proposes the design and implementation of a Booth multiplier using VHDL, and compares the power consumption and delay of radix-2 and modified radix-4 Booth multipliers. The modified radix-4 Booth multiplier has lower power consumption than the conventional radix-2 Booth multiplier.
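The radix-4 (modified) Booth recoding that the paper implements in VHDL can be illustrated in software: the multiplier is recoded into signed digits in {-2, -1, 0, +1, +2}, halving the number of partial products relative to radix-2. This Python sketch mirrors the arithmetic only, not the hardware design:

```python
def booth_radix4_multiply(m, r, bits=16):
    """Multiply two signed integers via radix-4 Booth recoding.
    For each bit pair of r (plus the bit to its right), a signed
    digit selects a partial product 0, +/-m or +/-2m."""
    recode = {0b000: 0, 0b001: 1, 0b010: 1, 0b011: 2,
              0b100: -2, 0b101: -1, 0b110: -1, 0b111: 0}
    r_ext = r & ((1 << bits) - 1)   # two's-complement view of r
    product = 0
    prev = 0                        # implicit bit to the right of bit 0
    for i in range(0, bits, 2):
        # examine bits r[i+1], r[i] and the previous bit r[i-1]
        triple = (((r_ext >> i) & 0b11) << 1) | prev
        digit = recode[triple]
        product += (digit * m) << i  # partial product weighted by 4^(i/2)
        prev = (r_ext >> (i + 1)) & 1
    return product
```

In hardware the same recoding selects shifted/negated versions of the multiplicand, which is where the radix-4 design saves partial-product rows and hence power.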
Physical Unclonable Functions (PUFs) are circuits that exploit chip-unique features as signatures, serving as good silicon biometrics. These signatures are based on semiconductor fabrication variations that are very difficult to control or reproduce. Chip-unique signatures, together with a strong challenge-response authentication algorithm, can be used to identify and secure chips. This paper expands the security avenues covered by PUFs and FPGAs by introducing a new class of concept called "Soft PUFs." We propose a robust challenge-response authentication solution based on a PUF device that provides stronger security guarantees to the user than could previously be achieved. By exploiting the silicon uniqueness of each FPGA device and incorporating special authentication algorithms in the existing FPGA fabric, FPGA-based embedded systems can be used for new security-oriented and network-oriented applications that were not previously possible.
A Novel Method for Classification of Lung Nodules as Benign and Malignant using Artificial Neural Network
Rohit B Kuravatti, Sasidhar B, Dr. Ramesh Babu D R
Automated segmentation and classification of lung nodules into benign and malignant is a challenging task and is of vital interest for medical applications such as diagnosis and surgical planning. It improves accuracy and assists radiologists in better diagnosis. In this paper, a new method is proposed for the classification of lung nodules using Artificial Neural Networks based on shape, margin and texture features. In order to reduce the complexity of the algorithm and the computational load, using fewer features is particularly important, while maintaining acceptable detection performance. The proposed algorithm was tested on LIDC (Lung Image Database Consortium) datasets and the results were satisfactory in terms of classification accuracy.
An Evaluation of the Array Antenna System by Direction of Arrival (DOA) Estimation
Manish Kumar Goyal, Prof. Puran Gour
In this paper, we evaluate the array antenna system by a Direction of Arrival (DOA) estimation scheme and, based on the evaluation, clarify problems related to the hardware and transmission environment of the array antenna system [1]. Array antenna systems are often modeled without any tolerances, and the effects of variations in element characteristics, mutual coupling, manufacturing errors, etc., are rarely considered. DOA estimation using an array antenna is categorized based on the principles of beam steering and null steering. Both methods require the steering vector of the array for estimation.
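The beam-steering principle mentioned above can be illustrated with a minimal sketch. A uniform linear array with half-wavelength element spacing is assumed (the paper does not fix the geometry), and the estimator simply scans candidate angles and picks the one whose steered output power is largest:

```python
import cmath
import math

def steering_vector(theta_deg, n_elems, d_over_lambda=0.5):
    """Steering vector of a uniform linear array for arrival angle theta."""
    phi = 2 * math.pi * d_over_lambda * math.sin(math.radians(theta_deg))
    return [cmath.exp(-1j * phi * n) for n in range(n_elems)]

def doa_beamscan(snapshot, n_elems, angles=range(-90, 91)):
    """Classical delay-and-sum (beam-steering) DOA estimate: scan candidate
    angles and return the one maximizing output power |a(theta)^H x|^2."""
    def power(theta):
        return abs(sum(a.conjugate() * x
                       for a, x in zip(steering_vector(theta, n_elems),
                                       snapshot))) ** 2
    return max(angles, key=power)

# A noise-free single-snapshot plane wave arriving from 20 degrees:
x = steering_vector(20, 8)
```

With half-wavelength spacing there are no grating lobes, so the scan has a unique global maximum at the true arrival angle for a single noise-free source.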
An Analysis of a Wideband Phased Array Antenna System with a Microstrip Filter
Rakesh Kumar Sharma, G. Kumar, Braj Bihari Soni
A wideband phased array antenna system is designed for multi-channel and multi-function operation. Wideband phased arrays consist of multiple fixed antenna elements that can be excited differently in order to shape the radiation pattern. The phased array antenna system is designed to work from 3 to 12 GHz, so each element is designed to operate within this bandwidth. Antenna elements are other important components of a phased array antenna. The dielectric substrate changes the effective dielectric constant of the microstrip line, causing different phase shifts on the microstrip line.
Comparison between NICE and NICE-1 Algorithms In Virtual network systems
P. Suganya, G. Thailambal, M. Umadevi
Cloud computing has gained considerable attention in recent times due to its ease of use and customizability. However, cloud security is a never-ending issue faced by users: attackers take hold of the cloud system to gain access to third-party information. Network Intrusion Detection and Countermeasure selection (NICE) is one technique that adopts multi-phase distributed vulnerability detection with the help of OpenFlow network programming APIs. NICE-1 adopts a similar architecture but makes use of an efficient host-based IDS involving a firewall that can manage any number of attackers. The latter is more compatible and cost effective when compared to the former.
Depletion of Energy Attacks in Wireless Sensor Networks
S Himabindhu, G Sateesh
Network survivability is the ability of a network to remain connected under failures and intrusions, which is a major concern in the design and performance evaluation of wireless ad hoc sensor networks. Ad hoc low-power wireless networks are an active research area in both sensing and ubiquitous computing. The proposed method discusses energy-draining attacks at the routing protocol layer, which drain battery power. Routing protocols, even those designed to be secure, lack protection from these attacks, which we call energy depletion attacks; they permanently disable networks by quickly draining nodes' battery power. These attacks are not protocol specific, yet they are damaging, hard to detect, and easy to carry out using as few as one malicious insider sending only protocol-compliant messages.
This paper is a review on the calculation of the Bit Error Rate/Symbol Error Rate of wireless mesh networks. WMNs have emerged as a flexible, reliable and cost-effective way of providing broadband Internet access over wide areas through multi-hop communication. In this work the BER (Bit Error Rate) performance is shown analytically and by means of simulation for Rayleigh fading multipath channels and the AWGN channel. This work focuses on the error performance of phase modulation schemes in different channel conditions and on methods to reduce bit error rates.
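The standard closed-form BER expressions for coherent BPSK, which such reviews typically compare, can be evaluated directly. The sketch below assumes coherent detection and, for the fading case, a flat (frequency-nonselective) Rayleigh channel:

```python
import math

def ber_bpsk_awgn(ebno_db):
    """Theoretical BPSK bit error rate in AWGN:
    Pb = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))."""
    ebno = 10 ** (ebno_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebno))

def ber_bpsk_rayleigh(ebno_db):
    """Theoretical BPSK BER on a flat Rayleigh fading channel:
    Pb = 0.5 * (1 - sqrt(g / (1 + g))), with g the average Eb/N0."""
    g = 10 ** (ebno_db / 10)
    return 0.5 * (1 - math.sqrt(g / (1 + g)))
```

At any given average SNR the Rayleigh curve lies well above the AWGN curve, which is exactly the gap (exponential versus inverse-linear decay) that diversity and coding techniques aim to close.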
Secure Data hiding in Multi-level using Digital Invisible Ink based on Spread Spectrum Watermarking
T. Ezhil Sindhu, Guide - P. Sripriya
Watermarking schemes have been proposed to protect data, or in other words to prevent illegal copying, but this notion of security is yet to be well defined, and the approach of watermarking is sometimes considered too theoretical. Hence, a new formulation of watermark security is proposed, called the effective key length. This approach makes it very hard for an attacker to gain access to the original data. Digital watermarking is used as the embedding process for data hiding, in addition to security keys, because finding the key should not be the easiest way to break the system; therefore multiple levels of security are proposed in this paper. The analysis paves the way for the spread spectrum scheme to initiate the theoretical and practical application of the effective key length. For data hiding, ISS (Improved Spread Spectrum) offers better security and robustness and is adopted to implement the digital invisible ink method.
QBIC, MARS and VIPER: A Review on Content Based Image Retrieval Techniques
Amit Singla, Meenakshi Garg
With the spread of networks and the development of image technology, older information retrieval techniques no longer meet today's demands. Recently, content-based image retrieval has become a rapidly developing topic and its techniques have attracted great attention. In this paper, the basic components of a content-based image retrieval system are discussed. Image retrieval methods based on color, texture, shape and image semantics are studied, discussed and compared. Other related techniques such as relevance feedback and performance evaluation are also discussed, and the problems and challenges of each available technique are drawn out. In commerce, government, academia, and hospitals, large collections of digital images are being created and stored. Many of these collections are the product of digitizing existing collections of hardcopy photographs, diagrams, drawings, paintings, and prints. Usually, the only way of searching these collections was by keyword indexing, or simply by browsing. Digital image databases, however, open the way to content-based searching. In this paper we review some technical aspects of current content-based image retrieval systems.
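The color-based retrieval component common to systems such as QBIC can be illustrated with a minimal sketch: a quantized joint color histogram compared by histogram intersection. The bin count and the flat list-of-RGB-tuples pixel format are assumptions of this sketch, not details from the reviewed systems.

```python
def color_histogram(pixels, bins=4):
    """Quantize each RGB channel into `bins` levels and build a joint
    normalized color histogram (a common CBIR color feature)."""
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        idx = ((r * bins // 256) * bins * bins
               + (g * bins // 256) * bins
               + (b * bins // 256))
        hist[idx] += 1
    n = len(pixels)
    return [h / n for h in hist]

def histogram_intersection(h1, h2):
    """Swain-Ballard histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

Histogram features are invariant to rotation and small occlusions, which is why color indexing remains a first-stage filter even in systems that add texture and shape features afterwards.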
Caesar Cipher Cryptographic Method Along with Bit Manipulation to Encrypt and Digest a Message for RFID Credit Card Security
Rohit Sharma, Dr. P.K. Singh
Securing RFID credit cards poses many sophisticated problems. In this paper, the authors use a new approach to make RFID credit cards more secure. The main objective is to provide a method for encrypting and digesting the incoming and outgoing data of an RFID credit card. We present a cryptographic approach that excludes repetitive terms in a message when it is encrypted and digested [1], so that it is not easy for an adversary to retrieve or predict the original message from the encrypted message. Using this cryptographic approach, we improve security by encrypting the plaintext into ciphertext. If we use Caesar cipher substitution or a cryptographic hash function individually, the resulting ciphertext is easy to crack. We propose an approach combining substitution and digestion: the fundamental weakness can be eliminated by combining the Caesar cipher with a cryptographic hash function.
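A minimal sketch of the substitution-plus-digest combination follows. SHA-256 stands in for the unspecified hash function, and the paper's bit-manipulation and repetition-elimination steps are omitted; the point is only that the receiver re-computes the digest of the decrypted text to detect tampering.

```python
import hashlib

def caesar_encrypt(text, shift):
    """Shift alphabetic characters; other characters pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

def protect_message(plaintext, shift):
    """Encrypt with the Caesar substitution and attach a digest of the
    plaintext (SHA-256 here is an assumption, standing in for the paper's
    unspecified hash)."""
    cipher = caesar_encrypt(plaintext, shift)
    digest = hashlib.sha256(plaintext.encode()).hexdigest()
    return cipher, digest

def verify_message(cipher, digest, shift):
    """Decrypt (a negative shift) and accept only if the digest matches."""
    plaintext = caesar_encrypt(cipher, -shift)
    if hashlib.sha256(plaintext.encode()).hexdigest() == digest:
        return plaintext
    return None
```

The Caesar layer alone is trivially breakable by frequency analysis, and the digest alone hides nothing; the combination at least binds confidentiality to an integrity check, which is the weakness-elimination argument the abstract makes.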
Data Mining Approaches for Diabetes using Feature selection
Thangaraju P, NancyBharathi G
Data mining is the process of analyzing data from different perspectives and summarizing it into useful information; it is applied to find patterns that help in the important tasks of medical diagnosis and treatment. This project aims at mining diabetes types with the help of feature selection techniques, with the main interest being to identify research goals. Data mining has played an important role in diabetes research: it is a valuable asset for diabetes researchers because it can unearth hidden knowledge from the huge amount of diabetes-related data. The results can be used for both scientific research and real-life practice to improve the quality of health care for diabetes patients. This article describes applications of data mining for the analysis of blood glucose and diabetes mellitus data.
The main purpose of this paper is to predict how people of different age groups are affected by diabetes based on their lifestyle activities, and to find out the factors responsible for an individual being diabetic. The Best First Search technique is implemented in this approach to fetch data from the database; it retrieves data efficiently, within a fraction of a second and without delay. The greedy forward selection method is also implemented in this approach to update the data in the database.
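The greedy forward selection step can be sketched as follows. The attribute names and the relevance-minus-redundancy score are purely illustrative stand-ins for whatever evaluation function the study uses; only the control flow (start empty, add the best single feature, stop when nothing improves) is the technique itself.

```python
def greedy_forward_selection(features, score, k):
    """Greedy forward selection: start with an empty subset and repeatedly
    add the single feature that most improves the subset's score, stopping
    early when no remaining feature improves it."""
    selected = []
    remaining = list(features)
    for _ in range(k):
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break                    # no feature improves the subset further
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical diabetes attributes with a toy objective:
# total relevance minus a small size penalty (both values are made up).
relevance = {'glucose': 0.9, 'bmi': 0.6, 'age': 0.5, 'pulse': 0.1}
def score(subset):
    return sum(relevance[f] for f in subset) - 0.05 * len(subset) ** 2
```

With the toy score, the weakly relevant attribute is dropped automatically: the size penalty outweighs its contribution, which is the usual argument for feature selection before training a classifier.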
Code controlling in DTN’s for progressive packet arrival dynamically
Uma Upadhya, Dr. Shubhangi, D.C Rekha B Venkatapur
In Delay Tolerant Networks (DTNs) the core challenge is to cope with the lack of persistent connectivity and yet be able to deliver messages from source to destination. In particular, routing schemes that leverage relays' memory and mobility are a customary solution to improve message delivery delay. When large files need to be transferred from source to destination, not all packets may be available at the source prior to the first transmission. This motivates us to study general packet arrivals at the source, derive a performance analysis of replication-based routing policies, and study their optimization under two-hop routing. We determine the conditions for optimality in terms of probability of successful delivery and mean delay, and we devise optimal policies, so-called piecewise-threshold policies. We account for linear block codes and rateless random linear coding to efficiently generate redundancy, as well as for an energy constraint in the optimization.
Weight Optimization and FEA Analysis of Al-Si Metal Matrix Composite Drive Shaft
Bhimagoud Patil, Fayaz Kandagal, Vinoth M.A
The main concept of this project is to reduce the weight of an automotive drive shaft, which is done through the fabrication of an Al-Si matrix reinforced with SiC-Cenosphere composite material by stir casting with varying percentages of Cenosphere. Aluminium metal matrix composites have been used in many automotive components because of properties such as low weight, corrosion resistance, high specific stiffness, the ability to produce complex shapes, and high specific strength. The effect of the weight percentage of Cenosphere is studied for the automotive drive shaft application. In automobiles the drive shaft is mainly used for the transmission of motion from the engine to the differential. The model of the drive shaft assembly was created using CATIA software. In the present work an attempt has been made to estimate deflection and stresses under the applied boundary conditions; the analysis is carried out in Abaqus.
The cloud can be considered a platform or infrastructure responsible for the execution of services and applications in a reliable and elastic fashion with predefined quality parameters, providing availability, agility, adaptability, reliability, etc. In brief, a 'cloud' is an elastic execution environment of resources involving multiple stakeholders and providing a metered service at multiple granularities for a specified level of quality (of service). Cloud service providers offer various services to their users, such as SaaS, PaaS and IaaS; these services are available to cloud users depending upon trust agreements across the globe. Trust is another factor and one of the most challenging issues in cloud computing, for which various models have been proposed in the last few years. The objective of this paper is to impart a clear idea of the various services provided by the cloud, their comparison, and the role of various trust mechanisms in the cloud.
Comparative Study on Currency Recognition System Using Image Processing
S. Surya, G. Thailambal
Currency plays the main role as a medium of exchange; the governments of many countries issue it as banknotes and coins, e.g. Indian rupees, dollars, yuan, and dinars, with different appearances such as pictures of leaders, colors, sizes, serial numbers, and watermarks. At the same time, people find it difficult to recognize currencies from different countries. The purpose of this paper is to help solve these difficulties by comparing methodologies. Currency recognition systems based entirely on image analysis are not sufficient on their own. Our system is based on image processing and makes the recognition of all kinds of banknotes easy and user friendly. In this paper we use Indian currency, i.e., Rupees.
Performance Analysis of MFCC and LPCC Techniques in Automatic Speech Recognition
Dr. Mukesh Rana, Saloni Miglani
As a result of the rapid development of computers, various types of information exchange between man and computer have been devised. At present, inputting data into a computer by speech and converting it into another form, e.g. text, with the help of an automatic speech recognition system is one of the most developed scientific fields. As each language has its specific features, different speech recognition systems are investigated for different languages. In this paper, we consider two feature extraction algorithms, MFCC and LPCC. The performance of the two algorithms is compared to achieve better performance with a high recognition rate and low computational complexity; the major advantage of comparing these two algorithms is that it improves the reliability of the system.
Character Recognition from Born Digital Images using Correlation
Soumya Soman
Character recognition can be performed on various image datasets. In this method, binarization and edge detection are carried out separately on the three colour planes of the image. From the binarized image, connected components (CCs) are obtained and thresholded based on their area and aspect ratio; CCs that contain sufficient edge pixels are retained. The text components are represented as nodes of a graph, with the centroids of the individual CCs as the nodes. Long edges are broken from the minimum spanning tree of the graph. Pairwise height ratio is also used to remove likely non-text components, and a new minimum spanning tree is created from the remaining nodes. Horizontal grouping is performed on the CCs to generate bounding boxes of text strings. Overlapping bounding boxes are removed using an overlap area threshold; non-overlapping and minimally overlapping bounding boxes are used for text segmentation. Vertical splitting is applied to generate bounding boxes at the word level. After segmentation and localization, the characters are recognized using correlation.
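The final correlation step can be sketched as normalized cross-correlation of a segmented glyph against stored templates. The 3x3 binary glyphs below are toy stand-ins for real character templates; a real system would first normalize the glyph to the template size.

```python
def ncc(a, b):
    """Normalized cross-correlation between two equal-size flattened glyphs."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def recognize(glyph, templates):
    """Pick the template character best correlated with the glyph."""
    return max(templates, key=lambda ch: ncc(glyph, templates[ch]))

# Hypothetical 3x3 binary templates, flattened row-major:
templates = {
    'I': [0, 1, 0, 0, 1, 0, 0, 1, 0],
    'L': [1, 0, 0, 1, 0, 0, 1, 1, 1],
    'T': [1, 1, 1, 0, 1, 0, 0, 1, 0],
}
```

Because the correlation is mean- and magnitude-normalized, it tolerates uniform brightness changes and a few flipped pixels, which is why correlation is a reasonable last stage after the binarization and segmentation described above.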
Conserve Energy by Optimally Executing Mobile Applications in Mobile device or Offloading to Cloud
Bhavatharini M G, Veena A, Thejas Chikkalingappa M G
Mobile systems have limited resources, such as battery life, storage capacity, and processor performance. These restrictions can be overcome by computation offloading. Energy efficiency is a fundamental consideration for mobile devices. Mobile Cloud Computing (MCC) is a model for flexible growth of mobile device capabilities via universal wireless access to cloud storage and computing resources. Rather than conducting all computational and data operations locally, MCC takes advantage of the abundant resources in clouds to store and process data for mobile devices. Our objective is to conserve energy for the mobile device by optimally executing mobile applications either in the mobile device (mobile execution) or by offloading to the cloud (cloud execution). Our framework for energy-optimal execution derives a condition under which offloading is beneficial, and the amount of energy saved by optimal execution of the application. The proposed system demonstrates significant gains in execution speed and battery life of mobile phones.
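The offloading condition can be sketched with a simple energy model: cloud execution wins when the radio energy to ship the input data (plus idle energy while waiting) is less than the compute energy saved locally. All power, rate, and latency figures below are illustrative assumptions, not values or formulas from the paper.

```python
def local_energy(cycles, power_compute_w, cpu_hz):
    """Energy to run the application on the handset: P_compute * C / f."""
    return power_compute_w * cycles / cpu_hz

def offload_energy(data_bits, power_tx_w, rate_bps,
                   cloud_latency_s, power_idle_w):
    """Energy for cloud execution: radio energy to ship the input data
    plus idle energy while waiting for the cloud result."""
    return power_tx_w * data_bits / rate_bps + power_idle_w * cloud_latency_s

def should_offload(cycles, data_bits, cpu_hz=1e9, rate_bps=1e6,
                   power_compute_w=0.9, power_tx_w=1.3, power_idle_w=0.3,
                   cloud_latency_s=0.05):
    """Offloading is beneficial when cloud execution costs less energy."""
    return offload_energy(data_bits, power_tx_w, rate_bps, cloud_latency_s,
                          power_idle_w) < local_energy(cycles,
                                                       power_compute_w, cpu_hz)
```

The model captures the qualitative condition: compute-heavy applications with small inputs favor the cloud, while data-heavy applications with light computation favor local execution.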
DWT Domain Data Encryption with Asymmetric key Cryptography
Miss. Snehal C. Dinde, Dr. Mrs. Shubhangi B. Patil
This paper discusses an attempt to provide a secret communication technique by hiding confidential data in an image. Here steganography is used along with cryptography to strengthen security: steganography hides the existence of the data while cryptography protects the information by transforming it into an encrypted form. The asymmetric key cryptography technique RSA is used to encrypt the secret data, and then a Haar DWT transformation algorithm is used to embed the encrypted data. Experimental results show image quality parameters such as PSNR and mean absolute error.
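The embedding pipeline can be illustrated with a one-level 1-D Haar transform (an average/difference variant chosen so the inverse is exact; the orthonormal form divides by the square root of 2 instead). The RSA step is omitted here and the payload is shown as raw bits, so this is a sketch of the DWT-domain embedding idea, not the paper's full scheme.

```python
def haar_dwt(signal):
    """One-level 1-D Haar DWT: pairwise averages (approximation) and
    pairwise half-differences (detail). Assumes even-length input."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse of haar_dwt: each pair is (a + d, a - d)."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def embed_bits(detail, bits, strength=0.5):
    """Additively embed one payload bit per detail coefficient:
    +strength for a 1 bit, -strength for a 0 bit."""
    return [d + (strength if b else -strength) for d, b in zip(detail, bits)]
```

Embedding in the detail (high-frequency) coefficients keeps the distortion in regions where the eye is least sensitive, which is the usual reason DWT-domain hiding reports good PSNR.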
Adaptive Constructive Algorithm for Artificial Neural Networks
Prof. (Ms) A. B. Shikalgar, Prof. (Mrs) A. N. Mulla, Prof. T. A. Mulla
The generalization ability of artificial neural networks (ANNs) is greatly dependent on their architectures. Constructive algorithms provide an attractive automatic way of determining a near-optimal ANN architecture for a given problem, and many such algorithms have been proposed in the literature and shown to be effective. Our research work aims at developing a new constructive algorithm (NCA) for automatically determining ANN architectures. NCA puts emphasis on both architectural adaptation and functional adaptation in its architecture determination process, whereas most previous studies consider only architectural adaptation. It uses a constructive approach to determine the number of hidden layers in an ANN and the number of neurons in each hidden layer. To achieve functional adaptation, NCA trains the hidden neurons of the ANN using different training sets, created by employing a concept similar to that used in boosting algorithms. The purpose of using different training sets is to encourage the hidden neurons to learn different parts or aspects of the training data so that the ANN can learn the whole training data better. The convergence and computational issues of NCA are analytically studied. Experimental results show that NCA can produce ANN architectures with fewer hidden neurons and better generalization ability compared to existing constructive and non-constructive algorithms.
Disaster Event Detection and Reporting System Development Using Tweet Analysis
Sunil D. N, B. Pavan Kumar, P. Nirupama
The implementation of a modern seismic network involves many different research and technological aspects related to the development of sophisticated data management and processing, and its communication systems need to rapidly generate useful, robust, and secure alert notifications. Here we provide a general technical and seismological overview of ISNet's complex architecture and implementation. Twitter has received much attention recently, and an important characteristic of Twitter is its real-time nature. We investigate the real-time interaction of events such as earthquakes on Twitter and propose an algorithm to monitor tweets and to detect a target event. To detect a target event, we devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. Subsequently, we produce a probabilistic spatiotemporal model for the target event that can find the center of the event location.
Novel Resonant Pole Inverter for Brushless DC Motor Drive System using Fuzzy Logic controller
Gaurav Kumar Mishra, A.K Pandey
The brushless DC motor (BDCM) has been widely used in industrial applications because of its low inertia, fast response, high power density, high reliability, and maintenance-free reputation. Brushless DC motors have a permanent-magnet rotor, and the stator windings are wound such that the back electromotive force (EMF) is trapezoidal; they therefore require rectangular-shaped stator phase currents to produce constant torque. These motors possess a high torque-to-weight ratio, operate at very high speed, are very compact, and are electronically controlled. Their key advantage is the removal of brushes, eliminating many problems associated with them. BDCM drive research has focused on motor control strategies; nevertheless, most converter topologies employ the hard-switching technique, which causes high switching losses and severe electromagnetic interference. Recently, a number of soft-switching techniques providing zero-voltage switching (ZVS) or zero-current switching (ZCS) conditions have been successfully developed. This paper proposes fuzzy logic and PI based speed control of a brushless DC motor using a soft-switching inverter, in which all switches work under zero-voltage switching conditions.
A New Algorithm for QR Code Watermarking Technique For Digital Images Using Wavelet Transformation
Alikani Vijaya Durga, S Srividya
In digital image processing, copyright protection and authentication have become more significant; to achieve them, different digital watermarking techniques have been introduced for security. We propose an algorithm in the wavelet domain in which an invisible digital watermark is embedded into a QR code image. In our method we embed a binary logo image, considered as the watermark, into a selected wavelet subband. The experimental results show that our method is robust to attacks under different conditions and can achieve viable copyright protection and authentication.
A Novel Power Saving Framework for Emissive Displays Using Contrast Enhancement Based On Histogram Equalization
Geetha Manikya Prasad R, S Srividya
This paper proposes a novel power-constrained contrast-enhancement framework for emissive displays based on histogram equalization (HE). To reduce the overstretching artifacts of conventional HE, a log-based histogram is applied. A power-consumption model is developed for emissive displays, and an objective function is formulated that contains a power term and a histogram-equalizing term. The proposed framework provides both contrast improvement and power saving. The proposed algorithm is extended to enhance video sequences as well as still images in terms of contrast and perceptual quality.
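The log-based histogram idea can be sketched as a standard HE mapping computed over log(1 + h) instead of the raw counts h, which damps dominant bins and so limits overstretching. This is one reading of the approach, not the paper's exact objective-function formulation, and it omits the power term entirely.

```python
import math

def log_he_mapping(hist, levels=256):
    """Histogram-equalization transfer function built from the logged
    histogram: cumulative sum of log(1 + h), rescaled to the gray range.
    Assumes a non-empty histogram with at least one nonzero bin."""
    logged = [math.log(1 + h) for h in hist]
    total = sum(logged)
    mapping, cum = [], 0.0
    for lh in logged:
        cum += lh
        mapping.append(round((levels - 1) * cum / total))
    return mapping            # mapping[input_level] -> output_level
```

Compared with plain HE, a single dominant bin (e.g. a large dark background) contributes log(1 + h) rather than h to the cumulative curve, so it no longer consumes most of the output range.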
Natural disasters are increasing worldwide due to global warming and climate change. We focus on the landslide disaster. This disaster is largely unpredictable and occurs within very short spans of time, so technology has to be developed to capture the relevant signals with minimum monitoring delay. Wireless sensors are a cutting-edge technology that can quickly respond to rapid changes of data and send the sensed data to an analysis centre in areas where cabling is inappropriate. The heart of this project lies in the combination of GSM, Zigbee and sensors. Every sensor has a Zigbee transmitter mounted on it; when a landslide happens, the sensor detects it and transfers data through a router to the coordinator. The coordinator has a GSM module and a Zigbee receiver: it receives the data and transmits it via GSM to the control centre, which forwards the information via GSM to the rescue team. We can also check the status of a sensor by sending a message. Monitoring landslides is very helpful for protecting people and avoiding accidents: many accidents occur on highways, hill stations and railway tracks because of landslides, so with the help of this system we can warn the main centre about where a landslide has happened. We can also check the status of tunnels and landslide-prone areas.
A Review on Fingerprint Recognition Technique Using Real Minutiae Identification
Komal Sharma, Vinod Kumar Singla
Fingerprint matching is one of the most efficient and successful techniques for human identification, being easy to implement and giving accurate results. This paper presents a review of the study and implementation of a fingerprint recognition system based on minutiae matching, which is used quite frequently in fingerprint algorithms and techniques. The approach mainly involves extracting unique identification points, called minutiae points, from the captured fingerprint images and then matching fingerprints based on the number of minutiae pairings between the two fingerprints in question. Spurious minutiae are removed by identifying falsely perceived minutiae at the extraction stage.
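The pairing-count idea can be sketched as greedy nearest-point pairing within a pixel tolerance. Real matchers also align the two prints and compare minutiae orientation angles and types; this toy version uses bare (x, y) coordinates and an assumed tolerance.

```python
def match_minutiae(set_a, set_b, tol=10.0):
    """Greedily pair minutiae points (x, y) that lie within `tol` pixels of
    each other; the score is the fraction of possible pairs actually matched
    (1.0 for identical point sets, 0.0 for no pairings)."""
    used = set()
    pairs = 0
    for (xa, ya) in set_a:
        for j, (xb, yb) in enumerate(set_b):
            if j not in used and (xa - xb) ** 2 + (ya - yb) ** 2 <= tol ** 2:
                used.add(j)
                pairs += 1
                break
    return pairs / max(len(set_a), len(set_b))
```

Removing spurious minutiae before this step matters precisely because every false point inflates the denominator and can also produce accidental pairings, degrading both false-accept and false-reject rates.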
Development of a Computer-Based Test Platform with a FOSS
P. O. Ajewole, J. J. Momoh
With the advances in information and communication technology, increases in school enrolment, and the need for more reliability in students' assessment, Computer-Based Testing (CBT) has recently been gaining popularity as a means of administering tests in educational institutions and corporate organisations. This paper reports the development of a computer-based test platform for the Centre of Entrepreneurship Development and Vocational Studies (CEDVS) of The Federal Polytechnic, Ado-Ekiti, Nigeria. A free and open source software (FOSS) package called MOODLE was used to develop the platform, which was test-run to administer tests to students taking entrepreneurial courses in the Polytechnic. Out of 202 students tested, 89% preferred CBT to paper-based tests, 92% agreed that the platform was interactive and easy to use, while 12% believed that the use of CBT would make test taking difficult.
Introducing Health Care Analytics for Cancer Treatment
Nitesh Dugar, Surendra Yadav
Hospital supervision, or healthcare administration, is the field concerned with the management, leadership and administration of hospitals, health care systems and hospital networks. The healthcare business these days generates huge amounts of complex information concerning patients, medical devices, electronic patient records, sickness diagnosis, hospital resources, etc. These huge amounts of data and knowledge are a key resource to be processed and analyzed for knowledge extraction, enabling cost savings and better decision making. Data mining brings a collection of tools and techniques that can be applied to this data to uncover hidden patterns that offer healthcare professionals an additional source of knowledge for better administrative decisions.
Ternary wide-bandgap chalcogenides LiGaS2 and BaGa4S7 for the mid-IR
Rena J. Kasumova, .A. Safarova, N.V. Kerimova
In this work we study parametric interaction in the presence of phase changes of the interacting pump, signal and idler waves, considered in mid-infrared materials, namely LiGaS2 and BaGa4S7 crystals. The threshold pump intensity is analyzed for these crystals under the conditions of the experiments, and the values of the refractive indices and the phase-matching angle have been calculated. It is shown that LiGaS2 and BaGa4S7 compounds can be used for nanosecond/picosecond pumping of optical parametric converters at 1.064 μm (Nd:YAG laser systems) without two-photon absorption being a concern.
Preserving Data Integrity and Public Auditing for Data Storage in Cloud Computing
M. Pavani, D. Jayanarayana Reddy, Dr. S. PremKumar
In this project we utilize and combine a public-key-based homomorphic authenticator with random masking to achieve privacy preservation in a public cloud data auditing system that meets all the above requirements. When there are multiple auditing tasks, we further extend the technique of bilinear aggregate signatures to the multi-user setting, where the third-party auditor can perform the auditing tasks simultaneously.
The main issue concerns the outsourced data. Cloud computing introduces new and challenging security threats to users' outsourced data, since cloud service providers are separate entities. Outsourcing can leave users' data insecure due to the actions of unauthenticated strangers. In the existing scenario, although the cloud infrastructure is good, with powerful computing devices, there are internal and external threats that put data integrity at risk. The status of the outsourced data is not clear: even a well-provisioned cloud does not provide the exact status of users' outsourced data.
Ranking is an important problem in various applications, and many natural language processing tasks involve ranking a set of items. Sometimes we want the top items to be not only good individually but also diverse collectively; such ranking approaches are used to avoid redundant information as far as possible. Manifold Ranking with Sink Points (MRSP) is used to address both diversity and importance in ranking. We applied MRSP to two application tasks, update summarization and query recommendation. Experiments show strong performance of MRSP compared to existing ranking approaches.
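The underlying manifold-ranking iteration can be sketched on a small similarity graph. The sink-point mechanism of MRSP is only indicated in the docstring, and the graph weights and query vector below are illustrative, not from the paper.

```python
def manifold_ranking(W, y, alpha=0.85, iters=200):
    """Iterate f <- alpha * S f + (1 - alpha) * y on a similarity graph W,
    where S = D^{-1/2} W D^{-1/2} is the symmetrically normalized affinity
    matrix and y marks the query item(s). MRSP's sink points would be
    modeled by zeroing the rows of already-ranked items so they stop
    spreading score, which induces diversity in the ranking."""
    n = len(W)
    deg = [sum(row) for row in W]
    S = [[W[i][j] / ((deg[i] * deg[j]) ** 0.5) if deg[i] and deg[j] else 0.0
          for j in range(n)] for i in range(n)]
    f = y[:]
    for _ in range(iters):
        f = [alpha * sum(S[i][j] * f[j] for j in range(n)) + (1 - alpha) * y[i]
             for i in range(n)]
    return f

# Toy 4-node graph: nodes 0-2 form a cluster, node 3 hangs off weakly.
W = [[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 0.1], [0, 0, 0.1, 0]]
```

Since alpha < 1 and S has spectral radius at most 1, the iteration contracts to the closed-form solution (1 - alpha)(I - alpha S)^{-1} y, so the loop is just a simple way to approximate it.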
The objective is to design an efficient vehicle identification system using the vehicle number plate, implemented for a housing society or a private colony. The system first loads the image of the vehicle registration plate; OCR is used for character recognition. The original image, stored in the application folder, is compared with the loaded image. The system then verifies whether the image is original or fake: if the image is authenticated, the gate opens, indicated by a glowing green LED; otherwise the gate remains closed and a message is sent to the security personnel.
Cloud computing is a new way of delivering computing resources and services. The health care environment is changing faster than ever before due to the demand to deliver higher-quality medical services for less money and the increased competition between health care providers. Cloud technology is used to create a network between patients, doctors, and healthcare institutions by providing applications and services and by keeping the data in the cloud. This paper mainly emphasizes the challenges, need, benefits, and advantages of using cloud computing in health care systems.
Information systems play a crucial role in today's healthcare scenario: as more and more patient data is accumulated, they have become essential for correct and early diagnosis, leading to early and effective treatment. Hence the security of patient data is gaining ever more importance. Implementing security techniques in health centers is necessary to provide controlled access, confidentiality, and integrity for patient records. This paper presents a cryptographic method to improve the trustworthiness of medical images and healthcare information without compromising their quality to the end user. It also analyzes the security measures implemented to provide controlled access in the current project using the MD5 algorithm, identifies its limitations, and suggests probable solutions.
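The integrity-checking role that MD5 plays in such a system can be sketched minimally as follows. This is an illustrative sketch, not the paper's implementation; the record contents and function names are hypothetical, and note that MD5's known collision weaknesses are precisely the kind of limitation the paper discusses.

```python
import hashlib

def record_digest(record_bytes: bytes) -> str:
    """Return the MD5 hex digest used as an integrity tag for a record."""
    return hashlib.md5(record_bytes).hexdigest()

def verify_record(record_bytes: bytes, stored_digest: str) -> bool:
    """Recompute the digest and compare it with the stored tag."""
    return record_digest(record_bytes) == stored_digest

# A tampered record no longer matches its stored digest (toy data).
original = b"patient_id=102;scan=MRI;date=2014-03-01"
tag = record_digest(original)
assert verify_record(original, tag)
assert not verify_record(original + b";edited", tag)
```

Because MD5 collisions can be constructed deliberately, production systems would pair such tags with access control or move to a stronger hash.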
The wireless nature of such networks allows users to access network resources from nearly any convenient location within their primary networking environment. Registered users can connect to the network from anywhere a router or another connected user is available, without being identified or tracked. The onion routing network Tor is undoubtedly the most widely employed technology for anonymous web access and also provides good security for anonymous data transmission. Wireless mesh networks (WMNs) are a new wireless networking paradigm. Unlike traditional wireless networks, WMNs do not rely on any fixed infrastructure; instead, hosts rely on each other to keep the network connected. Wireless Internet service providers are choosing WMNs to offer Internet connectivity, as they allow fast, easy, and inexpensive network deployment. One main challenge in the design of these networks is their vulnerability to security attacks. In this paper, we investigate the principal security issues for WMNs, study the security goals to be achieved, identify the new challenges and opportunities posed by this networking environment, and explore approaches to securing its data communication.
A network has many connections, and the server has n requests to handle. To reduce the server's workload, we create subordinate sub-servers that handle requests from clients. Each client can log in and get the required information from a sub-server. A report is generated for all requests handled by the sub-servers, and the server maintains the log. A fully distributed load balancing algorithm is presented to cope with the load imbalance problem. The DSBCA algorithm is compared against a centralized approach in a production system and against a competing distributed solution from the literature. The real-time results indicate that our proposal is comparable with the existing centralized approach and considerably outperforms the prior distributed algorithm in terms of load imbalance factor, movement cost, and algorithmic overhead. The performance of our proposal, implemented in the Hadoop distributed file system, is further investigated in a cluster environment. The main reason for forming such clusters is that clustered overlays enable their participants to find and exchange data relevant to their queries with less effort.
A Review on Basic Principles of an E-Assessment System
S.S. Vora, S.A.Shinde
Assessment is a fundamental activity of all learning environments. Therefore all learning management systems (LMS) provide assessment facilities (e.g., the creation, administration, and evaluation of multiple-choice tests as well as programming assignments). This paper elaborates the principles of service-oriented paradigms applicable to the routine evaluation of programming assignments as well as multiple-choice questions. The most prominent aspect of this assessment solution is that it can assess programs written in any programming language; moreover, it can be easily interfaced with different existing learning management systems. This paper also presents the design of a flexible e-Assessment system based on Design Methodology Management (DMM) technology, which provides a framework for the system's development.
Image Compression Techniques Review: DCT, DWT, VQ and Fractal Compression
Mahinderpal Singh, Meenakshi Garg
Image compression is one of the most widespread techniques for applications that require transmission and storage of images in databases. In this paper we discuss image compression techniques: the need for compression, their characteristics, principles, classes of compression, and various image compression algorithms. The paper covers available image compression algorithms based on Wavelets, JPEG/DCT, Vector Quantization, and Fractal compression, and sums up the advantages and disadvantages of these algorithms for compressing grayscale images.
Robust Feature Based Automatic Text-Independent Gender Identification System Using Ergodic Hidden Markov Models(HMMs)
R. Rajeswara Rao
In this paper, robust features for an automatic text-independent Gender Identification system are explored. Through different experimental studies, it is demonstrated that time-varying speech information can be effectively captured using Hidden Markov Models (HMMs). A study of the effect of feature vector size shows that a size in the range of 18-22 captures gender-related information effectively for a speech signal sampled at 16 kHz. It is also established that the proposed system requires significantly less data during both training and testing. Gender Identification using robust features is studied for different numbers of states and mixture components and for different training and test durations. I demonstrate the Gender Identification studies on the TIMIT database.
This paper presents the details of searching and extracting information from the web and discusses the main tasks involved in web mining. It focuses mainly on the types of Web content mining: unstructured, structured, and semi-structured. Finally, some tools used for mining are also discussed.
Tutorial review on existing image compression techniques
Bhavna Pancholi, Reena Shah, Mitul Modi
Image compression is a technique that can reduce the storage space of images and videos; moreover, it helps to improve storage and transmission performance. Image compression reduces image fidelity when an image is compressed at low bitrates, so compressed images suffer from block artifacts. The prime focus of an image compression technique is to reduce the number of image pixel elements without affecting the original image, which can be achieved by removing the redundancy present in the image. This paper reviews different existing lossless and lossy compression techniques such as Huffman coding, LZW coding, DCT, DWT, and VQ compression, and based on the review provides guidelines for choosing the best algorithm for an image.
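Huffman coding, one of the lossless techniques reviewed above, assigns shorter bit codes to more frequent symbols. A minimal sketch (illustrative only; real codecs also serialize the code table and pack bits into bytes):

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table: frequent symbols get shorter codes."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        counter += 1
        heapq.heappush(heap, (f1 + f2, counter, merged))
    return heap[0][2]

def huffman_encode(data: bytes) -> str:
    codes = huffman_codes(data)
    return "".join(codes[b] for b in data)

# 7 symbols at 8 bits each would be 56 raw bits; Huffman needs only 10.
bits = huffman_encode(b"aaaabbc")
assert len(bits) == 10
```

The prefix property (no code is a prefix of another) is what makes the bit stream decodable without separators.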
Many images are degraded by bad weather conditions due to smoke, fog, dust, and ash, which obstruct the clarity of images. Image processing techniques improve the quality of an image and extract the maximum information from the degraded image. Fusion is the process of combining two or more images into a single image; the method is a fusion-based strategy that derives two inputs from the original hazy image by applying white balance and a contrast-enhancing procedure. The method operates in a pyramidal fashion, which is straightforward to implement, and the resulting image is clearer and more enhanced than before. This thesis reports a detailed study of a set of image fusion algorithms and their implementation, and demonstrates the utility and effectiveness of a fusion-based technique for dehazing based on a single degraded image.
Researchers have tried to make regression testing more effective and efficient by developing regression test selection (RTS) techniques, but many problems remain, such as unpredictable performance, incompatible process assumptions, and inappropriate evaluation models. RTS techniques try to maximize average regression testing performance rather than optimize aggregate performance over many testing sessions. Regression testing is a crucial problem in software development, as it can be used not only for testing the correctness of a program but often also for tracking the quality of its output. Both the research community and industry have paid much attention to this problem. This paper surveys current research on regression testing and current practice in industry, and tries to find out whether there are gaps between them. The observations show that although some issues concern both the research community and industry, gaps do exist.
“The Advanced Prolong Montroller & Recording System” using Wireless, Distributed DAS Technology
Mr. Manesh V.Raut , Prof. Rupali S.Khule
In most industries there is a mounting need for high-quality, efficient machine and plant environment monitoring systems. Moreover, many industrial operations must be monitored, recorded, analyzed, or controlled in order to implement proper, cost-effective, wastage-free, and maintenance-free production and to reduce repair/replacement costs. Therefore, "The Advanced Prolong Montroller & Recording System" using wireless, distributed DAS technology is proposed here as a sound piece of system engineering.
In this project, a wireless network for in-field, on-time monitoring of multiple area parameters such as temperature and humidity is proposed. The network is arranged so that operation-related data from the areas of interest sinks to one central point, called the master station. This master device also automatically manages communication with all remote monitoring stations distributed across different areas. Furthermore, it communicates with a PC for data saving/recording and analysis in the form of tables and graphs.
Optimization of Repeater Spacing in Optical Fiber Communication
Mr.Ashwin V. Goswami, Mr.Mayur Parmar
Repeater spacing in fiber-optic communication is optimized taking into consideration various parameters such as fiber attenuation, Stimulated Brillouin Scattering (SBS), Stimulated Raman Scattering (SRS), photodiode sensitivity, and input power. In our work, we have taken different values of input power and changed different receiver properties such as the APD multiplier and different filters. We have also obtained results for various fiber lengths and tried to optimize the distance between two repeaters.
A Review of Invisible Image Steganography Using Mid Band DCT Coefficients
Shinu, Vijay Laxmi
Data hiding techniques play an important role with the rapid growth of intensive transfer of multimedia content and private communications. Steganography is the art of hiding information in ways that prevent detection by unintended users: it is used to transfer data from one party to another over a public channel while hiding the very existence of the message, so that an observer cannot detect the presence of the hidden message (the embedded message) within a larger one (the cover file). Many different carrier files can be used; digital images are the most popular on the Internet. For hiding secret information in images, there exists a large variety of steganography techniques, some simple and some more complex, each with its strong and weak features, and different applications have different requirements of the technique used. This paper gives an overview of image steganography, its uses, and its techniques. It also attempts to identify the requirements of a good steganography algorithm and briefly reflects on which techniques are more suitable for which applications.
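The embed/extract cycle common to all these techniques can be illustrated with the simplest spatial-domain variant, LSB embedding. This is a toy sketch for intuition only, not the mid-band DCT method the title refers to; the pixel list stands in for a real grayscale image.

```python
def embed_lsb(pixels, message: bytes):
    """Hide message bits in the least-significant bit of each pixel value.
    `pixels` is a flat list of 0-255 intensities (a toy cover image)."""
    bits = "".join(f"{byte:08b}" for byte in message)
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | int(bit)   # overwrite only the LSB
    return stego

def extract_lsb(stego, n_bytes: int) -> bytes:
    """Read back n_bytes worth of LSBs and reassemble them into bytes."""
    bits = "".join(str(p & 1) for p in stego[: 8 * n_bytes])
    return bytes(int(bits[i : i + 8], 2) for i in range(0, len(bits), 8))

cover = list(range(64))                 # toy 8x8 "image"
stego = embed_lsb(cover, b"hi")
assert extract_lsb(stego, 2) == b"hi"
# Each pixel changes by at most 1, keeping the distortion invisible.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

DCT-domain methods embed in frequency coefficients instead of raw pixels, trading capacity for robustness against compression, which is why the surveyed work prefers mid-band coefficients.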
Hadoop – from where the journey of unstructured data begins…
Mrs. R. K. Dixit, Sourabh Mahajan, Suyog Kandi
Hadoop is a “flexible and available architecture for large-scale computation and data processing on a network of commodity hardware”; it is a processing technology for large-scale applications. Nowadays unstructured data is growing at an ever faster rate: big data is generated daily from social media, sensors used to gather climate information, digital pictures, purchase transaction records, and much more. Ultimately there is a need to process multi-petabyte datasets efficiently, where failure is expected rather than exceptional and the number of nodes in a cluster is not constant. The Hadoop platform was designed to solve such problems, providing applications for both operations and analytics. Today Hadoop is among the fastest growing technologies, providing advantages for businesses across industries worldwide.
Probabilistic Broadcasting based on Neighbor Coverage for CBR and VBR Traffic in MANET
Arathy O, Binu Mathew
Mobile ad-hoc networks are formed by nodes which can move dynamically without any controlling stations. These are communication networks in which every node can send and receive data within the network. After the network is deployed, the next step is to find a path from source to destination; routing protocols are used for path discovery. Broadcasting is the primary way to establish a path in a MANET, but it suffers from collision and contention, so the route discovery process must be modified to reduce network overload. This work proposes a new method to find routes in an ad-hoc network. The protocol selected for modification is Ad-hoc On-demand Distance Vector (AODV), in which a route is selected only when a node needs to send data to another. In the proposed method, route selection uses neighbor knowledge to reduce routing overhead in the network, based on a rebroadcast delay and a rebroadcast probability. Simulation results show that this approach can improve the average performance of broadcasting in various network scenarios, and the method is simple and easily implemented in a MANET.
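The neighbor-coverage idea behind the rebroadcast probability can be sketched as follows. This is a simplified illustration, not the paper's exact formula: a node rebroadcasts a route request with probability proportional to the fraction of its neighbors the sender's broadcast did not already reach.

```python
def rebroadcast_probability(my_neighbors: set, sender_neighbors: set,
                            min_prob: float = 0.1) -> float:
    """Fraction of our neighbors NOT already reached by the sender's
    broadcast; nodes that add little new coverage rarely rebroadcast.
    `min_prob` is an assumed floor so no node is silenced entirely."""
    if not my_neighbors:
        return min_prob
    uncovered = my_neighbors - sender_neighbors
    return max(min_prob, len(uncovered) / len(my_neighbors))

# Node B heard a RREQ from A; three of B's four neighbors already got it.
p = rebroadcast_probability({"C", "D", "E", "F"}, {"C", "D", "E"})
assert p == 0.25        # only F (1 of 4 neighbors) gains from a rebroadcast
```

In the full protocol, a rebroadcast delay computed from the same neighbor knowledge orders the candidate forwarders, so well-placed nodes fire first and suppress redundant rebroadcasts.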
Extended Round Robin Load Balancing in Cloud Computing
Priyanka Gautam, Rajiv Bansal
Cloud computing is a new platform where developers can store and use their content; although the area is new, it is expected to soon become very popular among developers and users. To avoid clusters and waiting in queues, load balancing is required. Many researchers are already working on this, and we have also worked in the same area, using the round-robin scheduling technique for load balancing.
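The basic round-robin dispatch that such schemes extend can be sketched in a few lines. This is an illustrative baseline (server names are invented), not the extended algorithm the paper proposes:

```python
from itertools import cycle

class RoundRobinBalancer:
    """Dispatch incoming requests to servers in a fixed rotating order."""
    def __init__(self, servers):
        self._servers = list(servers)
        self._next = cycle(self._servers)   # endless rotation over servers

    def assign(self, request):
        """Return (server, request): each call advances the rotation."""
        return (next(self._next), request)

balancer = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
placements = [balancer.assign(f"req-{i}")[0] for i in range(6)]
# Six requests spread evenly: each VM receives exactly two.
assert placements == ["vm-1", "vm-2", "vm-3", "vm-1", "vm-2", "vm-3"]
```

Plain round robin ignores server load and job size, which is exactly the weakness that extended variants address, e.g. by weighting the rotation or skipping busy servers.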
Highly Secure Invertible Data Embedding Scheme Using Histogram Shifting Method
Parvathy S, Abubeker K M
This work presents a simple, easy-to-implement, secure invertible data hiding method utilizing the advantages of both cryptography and steganography. The secret message to embed is first encrypted using the CAESAR cipher method, and the encrypted message is further compressed using Huffman encoding. These encoded bits are embedded inside a host image, creating the stego image. The data hiding method used here is histogram shifting: the maximum and minimum points of the histogram of an image are selected, and pixel intensities are slightly modified to embed data into the image. In the conventional histogram shifting method the embedding capacity is limited by the peak value of the histogram; the proposed technique overcomes this limitation by compressing the secret data bits before hiding, making it possible to embed more data. Encrypting the secret data before compression gives the added advantage of increased security for the secret message. Finally, the stego image is compressed before transmission, which allows efficient utilization of the allotted bandwidth. The performance of this method is tested by plotting capacity versus the PSNR of the marked image. Experimental results show that this method gives improved visual quality of the marked image, as well as higher capacity and PSNR than other conventional data hiding methods.
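The first stage of the pipeline, the Caesar cipher, can be sketched minimally. This is an illustrative sketch of the classical cipher only (the plaintext is invented); the Huffman compression and histogram shifting stages follow it in the actual scheme.

```python
def caesar_encrypt(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)          # leave spaces/digits/punctuation unchanged
    return "".join(out)

def caesar_decrypt(text: str, shift: int) -> str:
    """Decryption is encryption with the negated shift."""
    return caesar_encrypt(text, -shift)

cipher = caesar_encrypt("attack at dawn", 3)
assert cipher == "dwwdfn dw gdzq"
assert caesar_decrypt(cipher, 3) == "attack at dawn"
```

On its own the Caesar cipher is trivially breakable (26 keys); in this scheme it serves as a lightweight scrambling layer before compression and embedding rather than as the main security mechanism.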
Fault Detection and Tolerant System (FDTS) for SaaS Layer in Cloud Computing
Shweta Jain, Prof. Ashok Verma, Prof. Rashween Kaur Saluja
The increasing popularity of cloud computing as an attractive alternative to classic information processing systems has increased the importance of its correct and continuous operation, even in the presence of faulty components. Fault tolerance is a major concern in guaranteeing the availability and reliability of critical services as well as application execution. In order to minimize failure impact on the system and on application execution, failures should be anticipated and proactively handled; fault tolerance techniques are used to predict these failures and take appropriate action before a failure actually occurs. In this paper, we introduce an innovative, system-level, modular perspective on creating and managing fault tolerance in clouds. We propose a high-level approach at the SaaS layer that hides the implementation details of the fault tolerance techniques from application developers and users. In particular, the service layer hides the fault tolerance mechanism from the user, who needs no knowledge of the fault tolerance techniques available in the cloud or of their implementations.
The fault-tolerance technique applied uses heartbeat and gossip algorithms to detect whether the application is working smoothly. If the application is detected to be down, the proposed work deploys an application recovery mechanism at the SaaS layer, which will try to start the application, recover it from failure, or reinstall it so that users can continue to use it smoothly with minimum downtime.
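The heartbeat side of this detection can be sketched as follows. This is an illustrative timeout-based monitor, not the paper's implementation; the application name and timeout value are invented, and a real deployment would combine it with gossip to share liveness information between monitors.

```python
class HeartbeatMonitor:
    """Mark an application instance as failed if no heartbeat arrives
    within `timeout` seconds of the previous one."""
    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_beat = {}          # app_id -> time of last heartbeat

    def beat(self, app_id: str, now: float):
        """Record a heartbeat from a running application instance."""
        self.last_beat[app_id] = now

    def failed(self, app_id: str, now: float) -> bool:
        """An unknown app, or one silent past the timeout, counts as failed."""
        last = self.last_beat.get(app_id)
        return last is None or now - last > self.timeout

monitor = HeartbeatMonitor(timeout=5.0)
monitor.beat("billing-app", now=100.0)
assert not monitor.failed("billing-app", now=103.0)   # beat 3 s ago: alive
assert monitor.failed("billing-app", now=106.0)       # 6 s silence: recover
```

Once `failed()` returns true, the SaaS-layer recovery mechanism described above would attempt restart, recovery, or reinstallation in that order.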
A Review Paper on Energy Efficient Routing in Wireless Sensor Networks
Jamna Kaur, Rachna Rajput
Wireless sensor networks have been studied extensively in recent years. Such networks are made of several thousand sensors propagated over a geographical area, with many applications including military, environment monitoring, disaster response, fire fighting and protection, and home applications. Sensors are very simple, identical electronic devices equipped with a processor, a small storage memory, and a communication channel. The sensors communicate with each other through wireless links, most often over radio frequency channels. The routing problem in sensor networks has been studied by many researchers. Sequential Assignment Routing (SAR), proposed in [5], takes the energy constraints into account by building a tree rooted at the central node; the tree grows toward the sensors along paths with enough residual energy.
The World Wide Web consists of millions of interconnected web pages that provide information to users in any part of the world. The web is expanding both in size and in the complexity of its pages, so it is necessary to retrieve the web pages most relevant, in terms of information, to the query entered by the user in a search engine. In recent years, semantic search for relevant documents on the web has been an important research topic. Many semantic web search engines have been developed, such as Ontolook and Swoogle, which help in searching meaningful documents on the semantic web. To relate entities, texts, or documents having the same meaning, a semantic similarity approach is used, based on matching the keywords extracted from the documents. In this paper we present a ranking scheme for semantic web documents that finds the semantic similarity between the documents and the query specified by the user.
CMOS Digital Based Technology for Static Power Reduction in Nanoscale Circuits
Srivnivas kolli
In this paper an overview of the main issues in analog IC design in scaled CMOS technology is presented. Decreasing the MOS channel length and the gate oxide thickness has brought undoubted advantages in terms of chip area, speed, and power consumption (mainly exploited in the digital parts), but it also introduces drawbacks in terms of power leakage and reliability. Moreover, the lower supply voltage required by scaled technology has led analog designers to find new circuit solutions to guarantee the required performance. Power gating with high-K transistors is then investigated to analyze the effects of such a combination. Finally, the results are compared and the effectiveness of the various leakage reduction techniques is analyzed: changing the threshold voltage proved to have the most impact on performance and less impact on leakage reduction, while power gating offered no significant performance drop and the highest impact on leakage power reduction.
Implementation of Discrete Wavelet Transmission for Color Image Transmission in OFDM
Leman Kumar, Yojana Yadav
Orthogonal Frequency Division Multiplexing (OFDM) is widely used for high-speed broadcast purposes and is one of the most useful technologies for present and future wireless communications. Data bits are encoded onto multiple sub-carriers using multicarrier modulation and sent simultaneously. A modified OFDM system for robust progressive image transmission is analyzed: an image frame is compressed using the DWT, and the compressed data is arranged in data vectors, each with an equal number of coefficients. Simulation results are presented for the bit error rate (BER) and peak signal-to-noise ratio (PSNR) over an AWGN channel, and parameters such as PSNR, BER, and SNR are evaluated for the color image.
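The DWT compression step can be illustrated with the simplest wavelet, the Haar transform: one level splits each image row into pairwise averages (approximation) and pairwise differences (detail). This is a generic sketch with invented data, not the paper's specific DWT configuration.

```python
def haar_dwt(signal):
    """One level of the Haar wavelet transform on an even-length sequence:
    pairwise averages (approximation) and pairwise differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Invert the transform: each (average, difference) pair restores two samples."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

row = [10, 12, 8, 8, 30, 28, 2, 2]        # one row of a toy image
approx, detail = haar_dwt(row)
# Small detail coefficients can be quantized or dropped for compression;
# the inverse transform reconstructs the original exactly when none are.
assert haar_idwt(approx, detail) == row
```

Applying the transform recursively to the approximation band, and in both directions for 2-D images, yields the multi-level decomposition whose coefficient vectors are mapped onto OFDM sub-carriers.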
Efficient Routing in VANETs using Traffic Awareness
B. V. Visweswar Reddy, Dr. P. Bhargavi
The basic idea in VANETs is to have an ad-hoc connection between nearby vehicles. Routing in vehicular networks is endangered by harmful nodes that aim to compromise the delivery of messages; compromised nodes can severely impact network performance by mounting a number of attacks. To minimize these problems, a way of securing the beacon-less routing algorithm for vehicular environments (S-BRAVE) against selective forwarding attacks using neighbouring nodes as guard nodes is developed. The guards watch for the message to be sent by the next forwarder; if that vehicle does not forward the message, they take responsibility for sending it to the next hop. To increase the packet delivery ratio, the S-BRAVE routing algorithm is extended with traffic awareness of the roads, so that routing packets through denser environments gives a higher probability of delivering them to their destinations.
This paper examines the performance of Discrete Wavelet Transform based OFDM (DWT-OFDM) and Fast Fourier Transform based OFDM (FFT-OFDM) systems. A new interleaving scheme is applied to the system for efficient data transmission over an AWGN channel. The approach is evaluated on DWT-OFDM and FFT-OFDM with and without the interleaving process, using signal-to-noise ratio (SNR) and bit error rate (BER) as system parameters. The simulation results show that the new interleaving scheme, using a PN sequence generator and different gain inputs, gives a better bit error rate for the DWT-based OFDM system.
A Wheeling and Steering based route reconstruction approach in congested MPLS network
Mrs. Babita
MPLS Traffic Engineering (MPLS-TE) provides various network recovery mechanisms such as rerouting and protection. Conventional routing algorithms easily lead to a situation where some segments of a network are quite congested while other segments along alternative routes are underutilized. This thesis proposes a new route reconstruction approach, a wheeling-and-steering based route reconstruction approach for congested MPLS networks. MPLS uses its routing technology to implement TE, and the new algorithm builds on the advantages of MPLS to achieve TE load balancing in an MPLS network. The network is simulated in NS2; the simulation shows that this approach can effectively balance the load between different links and improve network performance under various parameters, as demonstrated in the network communication simulation in this paper.
Various Techniques for Denoising EEG signal : A Review
Simranpreet Kaur, Sheenam Malhotra
The electroencephalogram (EEG) signal is a biological non-stationary signal which contains important information about various activities of the brain. Analysis of EEG signals is useful for the diagnosis of many neurological diseases such as epilepsy, tumors, and various problems associated with trauma. EEG measured by placing electrodes on the scalp usually has very small amplitude, so analyzing the EEG signal and extracting information from it is a difficult problem, made more complicated by artifacts such as line noise, eye blinks, eye movements, heartbeat, breathing, and other muscle activity. Proper diagnosis of disease requires faultless analysis of the EEG signals. The denoising problem is quite varied due to the variety of signals and noise. The discrete wavelet transform provides an effective solution for denoising non-stationary signals such as EEG thanks to its shrinkage property.
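The shrinkage property mentioned above reduces, in its soft-thresholding form, to a one-line rule on the wavelet detail coefficients: coefficients below a threshold (mostly noise) are zeroed, while larger ones (mostly signal) shrink toward zero. A minimal sketch, with illustrative coefficient values rather than real EEG data:

```python
def soft_threshold(coeffs, t):
    """Wavelet shrinkage: kill coefficients with magnitude <= t,
    shrink the rest toward zero by t."""
    return [0.0 if abs(c) <= t else (c - t if c > 0 else c + t)
            for c in coeffs]

# Detail coefficients of a noisy segment: small values model noise,
# the large spikes model genuine signal features (toy numbers).
details = [0.2, -0.4, 5.0, 0.1, -6.5, 0.3]
clean = soft_threshold(details, t=0.5)
assert clean == [0.0, 0.0, 4.5, 0.0, -6.0, 0.0]
```

In a full denoising pipeline the signal is first decomposed with a DWT, the detail bands are thresholded as above (with `t` usually estimated from the noise level), and the inverse transform reconstructs the cleaned signal.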