Calculating the Area of the Union of Iso-oriented Rectangles Using MapReduce
Seyed Vahid Sanei Mehri, Ehsan Akhtarkavan, Saeed Erfanian
In this paper we propose a faster algorithm for solving the problem of the 'area of the union of iso-oriented rectangles'. For this purpose we use MapReduce, a powerful tool for parallel data processing, to divide the task among P separate processors. We utilize the interval tree data structure and the sweep-line technique to obtain a solution with time complexity
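As context for the technique named above, here is a minimal single-process sketch of the sweep-line computation of the area of a union of iso-oriented rectangles. It is not the paper's algorithm: it replaces the interval tree with simple coordinate compression and omits the MapReduce partitioning across P processors.

```python
def union_area(rects):
    """rects: list of (x1, y1, x2, y2) iso-oriented rectangles."""
    if not rects:
        return 0.0
    # Sweep-line events: +1 opens a rectangle's y-interval, -1 closes it.
    events = []
    for x1, y1, x2, y2 in rects:
        events.append((x1, 1, y1, y2))
        events.append((x2, -1, y1, y2))
    events.sort()

    ys = sorted({y for _, _, y1, y2 in events for y in (y1, y2)})
    count = [0] * (len(ys) - 1)            # coverage count per elementary y-slab

    def covered():                          # total y-length covered by >= 1 rectangle
        return sum(ys[i + 1] - ys[i] for i, c in enumerate(count) if c > 0)

    area, prev_x = 0.0, events[0][0]
    for x, delta, y1, y2 in events:
        area += covered() * (x - prev_x)    # add the strip since the last event
        prev_x = x
        for i in range(len(count)):         # update coverage of affected slabs
            if y1 <= ys[i] and ys[i + 1] <= y2:
                count[i] += delta
    return area

print(union_area([(0, 0, 2, 2), (1, 1, 3, 3)]))  # 7.0: the overlap is counted once
```

An interval tree replaces the linear slab update with a logarithmic one, which is where the asymptotic gain comes from.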
Stress Analysis and Optimization of Crankshafts Subject to Static Loading
Mr.B. Varun
The crankshaft is one of the critical components for the effective and precise working of the internal combustion engine. In this paper a dynamic simulation is conducted on a crankshaft from a single-cylinder 4-stroke diesel engine. A three-dimensional model of the diesel engine crankshaft is created using SolidWorks. Finite element analysis (FEA) is performed to obtain the variation of stress magnitude at critical locations of the crankshaft. Simulation inputs are taken from the engine specification chart. The dynamic analysis is done using the FEA software ANSYS, which yields the load spectrum applied to the crank pin bearing. The analysis is used to find the critical locations in the crankshaft. Stress variation over the engine cycle and the effects of torsion and bending loads are investigated. Von Mises stress is calculated both theoretically and with ANSYS. The relationship between frequency and vibration modes is explained through modal and harmonic analysis of the crankshaft in ANSYS.
Effectiveness of Location Based Routing Protocols against Wormhole Attack in MANETs
Devendra Kumar, Deepak Kumar Xaxa
The mobile nodes in MANETs dynamically change their topology and hence require an efficient mechanism to communicate with each other. Several routing protocols have been proposed for the MANET environment, categorized as non-location-based and location-based routing protocols. Among them, location-based routing protocols are preferred in MANETs as they route more efficiently than non-location-based protocols. MANETs are also vulnerable to several kinds of attacks, such as the black hole, wormhole, Sybil, flooding and gray hole attacks. The wormhole attack is one of the stronger active attacks, difficult to avoid or detect in any network, in which two or more attacker nodes tunnel network traffic between a source and destination pair from one part of the network to another. In this paper we study two location-based routing protocols, ALERT and GPSR, and measure their effectiveness against the wormhole attack using parameters like throughput, end-to-end delay, packet delivery ratio, packet dropping ratio and normalized routing load. The performance analysis is done for 10, 20, 30, 40 and 50 nodes using the network simulator NS-2.35, and a comparative study of the above parameters is presented for all five scenarios.
DISCLOSURE PROTECTION OF SENSITIVE ATTRIBUTES IN COLLABORATIVE DATA MINING
V. Uma Rani, Dr. M. Sreenivasa Rao, V. Theresa Vinayasheela
In collaborative data mining, data sets from various parties are submitted to a third party, where they are combined, the privacy of each data set's sensitive attributes is protected, and data mining is carried out. In privacy-preserving data mining, there is a need to extract knowledge from databases without disclosing information about individuals. Each participant has sensitive and non-sensitive data in its local database. Therefore the most important challenge in privacy-preserving multi-party collaborative data mining is how multiple parties can conduct data mining without disclosing each participant's sensitive data. In this paper we propose a two-level encryption algorithm for protecting sensitive attributes from disclosure, together with a generalization algorithm. This approach guarantees a high level of privacy with less complexity than existing methods and also proves to be fast and efficient over dynamic queries.
K-Means is the most popular clustering algorithm in data mining, and the sizes of data sets are increasing tremendously day by day. Thanks to recent developments in inexpensive shared-memory architectures such as Graphics Processing Units (GPUs), general-purpose applications can be implemented on the GPU using the Compute Unified Device Architecture (CUDA). The cost effectiveness of the GPU, and CUDA considerations such as thread divergence and coalesced memory access, make this attractive: a shared-memory architecture is much more efficient than a distributed-memory architecture.
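As a reference point for the GPU discussion above, here is a minimal NumPy sketch of one Lloyd iteration of K-Means. The assignment step is the data-parallel kernel to which CUDA implementations typically map one thread per point; the shapes and data are illustrative.

```python
import numpy as np

def assign_labels(X, C):
    # Squared Euclidean distance of every point to every centroid, shape (n, k);
    # each point's row is independent, hence one GPU thread per point.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)                    # nearest-centroid index per point

def update_centroids(X, labels, C):
    # Reduction step: mean of the points assigned to each centroid
    # (keep the old centroid if a cluster happens to be empty).
    return np.stack([X[labels == j].mean(axis=0) if np.any(labels == j) else C[j]
                     for j in range(len(C))])

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))                  # toy data set
C = X[rng.choice(len(X), 3, replace=False)]     # initial centroids
for _ in range(10):                             # a few Lloyd iterations
    labels = assign_labels(X, C)
    C = update_centroids(X, labels, C)
```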
Machine learning is a technique of artificial intelligence whereby a system is made to learn through user-defined instructions and rules. It is a method by which a computer system learns new ways of developing and growing through the data provided to it.
Improving Intellectual Skills of Students by Analyzing their Performance and Classifying them Based on Bloom’s Taxonomy Using K-Means Clustering Algorithm
Ms. S. Sangeetha, Ms. M. Thiruvarutselvi
Teaching is the process of carrying out activities that are effective in getting students to learn. The cognitive knowledge of the students in a classroom is not uniform: some students are good at reasoning and some are not. This paper suggests aligning the questions in periodical test papers with Bloom's Taxonomy, assessing each student's answer script, and categorizing the students under the different levels of Bloom's Taxonomy using the K-Means clustering algorithm. This activity helps the lecturer gauge the reasoning strength of each student in the classroom, so that more attention can be paid to improving the intellectual skills of students with weak reasoning power. The aim of this paper is to improve students' performance by enhancing their ability.
Survey Paper for Dynamic Resource Allocation using Migration in Cloud
Ms. Renu Krishnan, Ms. Silja Varghese
Cloud computing is a rampant technology nowadays because of its scalability, flexibility, availability of resources and other features. In cloud computing, resource multiplexing is done through virtualization, which acts as a backbone for provisioning the requirements of a cloud-based solution. The problems arising in cloud computing through virtualization must be solved; in the present cloud computing environment, load balancing is one of the challenging issues. To present a better approach for solving the problem of VM resource scheduling in a cloud computing environment, CPU and network usage calculations are used. A load predictor forecasts the future load, and resources are allocated accordingly. The multiplexing of virtual machines onto physical machines is managed using the Usher framework. A skewness metric is introduced to measure the unevenness of a server's resource utilization: minimizing skewness improves the overall utilization of servers, and migration is used for load balancing. Using this migration, green computing can be achieved.
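A minimal sketch of the skewness idea mentioned above, using the commonly cited definition skewness(p) = sqrt(sum_i (r_i / r_mean - 1)^2), where r_i is the utilization of resource i on server p and r_mean is the average across that server's resources (the surveyed work's exact formula may differ):

```python
import math

def skewness(utilizations):
    """utilizations: per-resource utilization of one server, e.g. [cpu, mem, net]."""
    mean = sum(utilizations) / len(utilizations)
    return math.sqrt(sum((r / mean - 1.0) ** 2 for r in utilizations))

# An unbalanced server scores high; placing a VM whose demands complement the
# existing load (e.g. memory-heavy on a CPU-heavy server) lowers skewness.
print(skewness([0.9, 0.2, 0.3]))   # ~1.15, CPU-heavy and unbalanced
print(skewness([0.5, 0.5, 0.5]))   # 0.0, perfectly even utilization
```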
Text mining is a technique to find meaningful patterns in available text documents. Pattern discovery from text and the organization of documents are well-known problems in data mining, and the analysis of text content and categorization of documents is a complex task. In the search for an efficient and effective technique for text categorization, various categorization and classification techniques have recently been developed, some supervised and some unsupervised. This paper discusses different methods of text categorization and cluster analysis for text documents. In addition, a new text mining technique is proposed for future implementation.
Security and Privacy in Private Cloud Storage
V.B. Maral, Mehul Sanghvi, Pankaj Rana, Wajeda Pathan, Heena Parihar
Cloud storage is a newly emerging technology in which users can outsource their data to the cloud and access it remotely from anywhere, freeing the user from the burden of data storage and its maintenance. However, data outsourced to cloud storage is not under the possession of the user, so there is always a risk to its integrity. Cloud storage should let users store data on the cloud as if they were storing it on a local system, without worrying about integrity. For this purpose an external third party auditor (TPA) is introduced, which performs the task of public auditing, so that users can rely on the TPA for the security of outsourced data. The TPA performs the auditing of data, but this should not introduce new risks to the user's data security through the TPA reading the contents of the data while auditing. In this paper we propose a privacy-preserving public auditing scheme which stores the user's data securely on the cloud. The proposed scheme is highly efficient and provably secure against both cloud service providers and the external third party auditor (TPA).
AN IMPROVED AODV ROUTING PROTOCOL BASED ON LOCAL ROUTE REPAIR MECHANISM
Arpit Dhingra, Anil Kumar
The dynamic nature of ad-hoc networks, coupled with the mobility of nodes, results in link breakages due to the ever-changing topology. When there is a substantial increase in the degree of mobility of the wireless network, more link errors are likely to occur; when this happens, route repair is typically performed to establish a new route. Such route repair mechanisms suffer from problems like high control overhead and long packet delays, making them inefficient under frequent failures of intermediate connections in an end-to-end communication. When an intermediate link breaks, it is preferable to discover a new route locally without resorting to an end-to-end route discovery. In this paper we propose ASRODV, an algorithm based on a local route repair (LRR) mechanism in which the repair is confined to the vicinity of the broken link. The algorithm overcomes the limitations of standard AODV by decreasing the recovery reaction time and the route maintenance overhead. In the original AODV routing protocol, on the occurrence of a link break, an error message is sent to the source stating that a link failure has taken place, and further communication is stopped temporarily. In ASRODV, nodes in the active path between source and destination act as a virtual source when a link breakage occurs, and the search process continues until a route to the original destination is re-established. A comparison has been carried out between the original routing protocol (AODV) and the improved algorithm (ASRODV) on various QoS parameters by simulating them in the network simulator ns-2. The results demonstrate that network performance can be significantly improved using the proposed local repair scheme.
In various fields and applications, such as medicine and education, the use of images is becoming increasingly popular. The problem is that noise is inevitably introduced during the image acquisition process. Another problem, which arises after denoising, is the destruction of image edge structures and the introduction of artifacts. Several techniques have been proposed for image denoising as well as for edge preservation. In this paper we review some of the techniques that can be used in image denoising. The paper outlines a brief description of noise, the types of noise, and image denoising, and then reviews different techniques and their approaches to removing that noise. The aim of this review is to provide brief, useful knowledge of denoising techniques so that practitioners working with images can easily select the optimal technique for their needs.
Enhance Performance of Random Testing using Randomized Algorithm
Jaypal D. Rangari, Swapnili P. Karmore
This paper implements and enhances the performance of software random testing. Random testing is a basic software testing technique that can be used to improve software reliability as well as to discover software failures. It is a black-box technique in which programs are tested by generating random, independent inputs. The proposed method uses both the Monte Carlo and Las Vegas randomized algorithms: Monte Carlo executes quickly but can sometimes give a false result, while Las Vegas takes longer but always gives a correct result. The proposed method maintains two result sets, one of executed test cases and one of failed test cases. Test cases are first run with the Monte Carlo algorithm, producing the executed and failed result sets; the failed set is then re-tested with the Las Vegas algorithm, because Monte Carlo sometimes gives false results. We present a technique that improves the performance of random testing. The results are very promising and give evidence that our approach is likely to be useful in improving the efficiency of random testing.
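A minimal sketch of the two-stage idea described above: a fast Monte Carlo pass over random inputs followed by a Las Vegas-style re-check of the reported failures. The system under test, its injected bug and the oracle properties are all hypothetical stand-ins.

```python
import random

def system_under_test(x):
    # Hypothetical SUT: squares its input, with an injected bug at x == 37.
    return -1 if x == 37 else x * x

ORACLE_PROPS = [                                             # hypothetical oracle
    lambda x: system_under_test(x) >= 0,                     # squares are non-negative
    lambda x: system_under_test(x) == system_under_test(x),  # deterministic
    lambda x: system_under_test(x) % 1 == 0,                 # integral result
]

def monte_carlo_verdict(x):
    # Fast probabilistic check: samples only two properties, so a failing
    # input can be misjudged (a "false result").
    return all(prop(x) for prop in random.sample(ORACLE_PROPS, 2))

def las_vegas_verdict(x):
    # Full check of every property: slower, but the verdict is always correct.
    return all(prop(x) for prop in ORACLE_PROPS)

passed, failed = [], []
for _ in range(1000):                       # stage 1: cheap Monte Carlo screening
    t = random.randint(0, 100)
    (passed if monte_carlo_verdict(t) else failed).append(t)

# Stage 2: re-test only the reported failures with the always-correct check.
confirmed_failures = [t for t in failed if not las_vegas_verdict(t)]
print(len(failed), "suspected /", len(confirmed_failures), "confirmed failures")
```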
Controllability of mixed Volterra-Fredholm type impulsive integro-differential inclusions in Banach spaces
R. Murugesu, T. Tamil Selvan, B. Gayathiri
The paper establishes a sufficient condition for the controllability of semilinear mixed Volterra-Fredholm type impulsive integro-differential inclusions in Banach spaces. We use Bohnenblust-Karlin's fixed point theorem combined with a strongly continuous operator semigroup. Our main condition depends only upon the local properties of the multivalued map on a bounded set. An example is also given to illustrate our main results.
This paper presents an explicit integration scheme to compute the stiffness matrix of an eight-node linear convex quadrilateral element for plane problems, using symbolic mathematics and an automatic all-quadrilateral mesh generation technique. In finite element analysis of boundary problems governed by second-order linear partial differential equations, the element stiffness matrices are expressed as integrals of products of global derivatives over the linear convex quadrilateral region. These matrices can be shown to depend on the material properties and on a matrix of integrals whose integrands are rational functions with a polynomial numerator and a linear denominator in the bivariates (ξ, η) over the eight-node 2-square (−1 ≤ ξ, η ≤ 1). In this paper we compute these integrals in exact and digital forms using the symbolic mathematics capabilities of MATLAB. The proposed explicit finite element integration scheme is illustrated by computing the Prandtl stress function values and the torsional constant for a square cross section using eight-node linear convex quadrilateral finite elements. An automatic all-quadrilateral mesh generation technique for the eight-node linear convex quadrilaterals is also developed for this purpose. We present a complete program which automatically discretises an arbitrary triangular domain into eight-node linear convex quadrilaterals and applies the generated nodal coordinates and element connectivity data to the above-mentioned torsion problem.
Humans can accurately interpret and analyze an expressed emotion, which remains a hurdle for any machine or computer. This project, in the area of machine vision, aims at designing a system that detects and successfully recognizes human emotions without any human intervention. It makes use of Eigenfaces to quantify the test and training image databases into appropriate vectors. It applies principal component analysis to the face images, reducing the dimensionality of the training set and retaining as eigenvectors only those features that are critical for face recognition.
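A minimal NumPy sketch of the Eigenfaces step described above: PCA on a matrix of flattened face images, keeping the top principal directions as the basis onto which test and training images are projected. The face matrix here is random stand-in data.

```python
import numpy as np

def eigenfaces(faces, n_components=20):
    """faces: (n_images, h*w) matrix of flattened training images."""
    mean = faces.mean(axis=0)
    centered = faces - mean                  # remove the average face
    # SVD of the centered data: rows of Vt are the principal directions
    # ("eigenfaces"), avoiding the huge (h*w x h*w) covariance matrix.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return mean, Vt[:n_components]

def project(image, mean, basis):
    # Low-dimensional descriptor used to compare test and training images.
    return basis @ (image - mean)

rng = np.random.default_rng(1)
faces = rng.random((50, 64 * 64))            # stand-in for a real face database
mean, basis = eigenfaces(faces)
weights = project(faces[0], mean, basis)     # 20-dimensional feature vector
```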
Virtual Sense: An auto email downloading and news reading system for visually impaired people
Prajakta Made, Snehal Kadam, Darshana Jain
The Internet has become a very important factor in our day-to-day life. It is a wide medium for communication and the exchange of ideas for people in any nook and corner of the world. In this paper an idea is proposed to develop a speech-interactive system providing web application services. The main aim is to provide these services to those who are unable to use the current systems efficiently. The proposed work focuses on web applications: it is tedious for disabled people to access the internet, and this system will help them download news or even access their mail through speech. The idea is to incorporate several applications, such as Email Reader/Sender, News Reader, Web Content, Blog, RSS Reader, Local System File Reader, Text/Document Reader, and Voice Command System. The proposed system provides access to these applications so that they can be used by disabled people without the use of the hands, forming an interface between the computer and the user. This is an attempt to develop web applications accessed through speech interaction.
An Implementation of Image Watermarking by Using Gradients
Dnyandevi W. Shrirao, Prof. Sulbha Patil, Prof. Sonal Honale
A digital image can easily be copied from one place to another, distributed from person to person, and uploaded to the internet, where it can appear widely and be copied again. To protect an image, a watermark is applied. Watermarking is a method of embedding data into an image; the data must be placed into actual pixels. Watermarking is of two types: 1) visible watermarking and 2) invisible watermarking. The proposed system uses invisible watermarking because it is more secure than visible watermarking. In this scheme the watermark is placed on the gradient vectors, which makes the system more secure than other watermarking schemes: other schemes are coefficient-based, so filtering and compression techniques can separate the watermark from the image. The proposed system is thus more secure, and gradient vectors are used to embed the data.
IVRS BASED COLLEGE AUTOMATION
Pinkey J. Ratnani, Priyanka L. Patil, Rashmi A. Sonawane, Rohidas B. Sangore
An Interactive Voice Response (IVR) system is a telephone-based system which allows users to enter information and make menu selections using dual-tone multi-frequency (DTMF) signaling. Users interact with a computer by using their telephone as a terminal. The objective of the system is to reform the services provided to users by eliminating the need for an operator. An IVR can be tailored to the particular needs of an organization, depending on the information contained in its database. An IVRS is well suited to handling repetitious enquiries.
Blind Noise Level Estimation and Blind Denoising Using Principal Component Analysis
Bheem Prasad Ram, Sachi Choudhary
Noise is an important factor in image processing: it affects the visual quality of images and image processing applications such as color analysis, restoration and enhancement. This paper considers blind noise, which has no parameter that can be measured directly in practice but is nevertheless present in the image. The method estimates the blind noise level of an image using principal component analysis on weak-texture patches selected by a low-rank texture-strength measure, estimating the noise present in each channel (red, green, blue). This estimated channel noise is then re-added to the image using the Bayer pattern. The blind noisy image is denoised using adaptive non-local means with principal component analysis and color demosaicking, and finally the blind noise level of the denoised image is estimated again. The result is better visual quality and a smooth, sharp image.
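A simplified sketch of the PCA noise-level idea used above: on weak-texture patches, the smallest eigenvalue of the patch covariance approximates the noise variance. The full method iterates the weak-texture selection; here a simple gradient-energy ranking stands in for it, on synthetic grayscale data.

```python
import numpy as np

def estimate_noise_sigma(img, patch=7, keep=0.1):
    h, w = img.shape
    patches, energies = [], []
    for i in range(0, h - patch, 2):
        for j in range(0, w - patch, 2):
            p = img[i:i + patch, j:j + patch]
            patches.append(p.ravel())
            gy, gx = np.gradient(p)
            energies.append((gx ** 2 + gy ** 2).sum())   # texture-strength proxy
    patches = np.array(patches)
    # Keep the flattest ("weak texture") patches and take the smallest
    # eigenvalue of their covariance as the noise-variance estimate.
    flat = patches[np.argsort(energies)[: max(64, int(keep * len(patches)))]]
    smallest = np.linalg.eigvalsh(np.cov(flat, rowvar=False))[0]
    return float(np.sqrt(max(smallest, 0.0)))

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 128), (128, 1))      # smooth synthetic image
noisy = clean + rng.normal(0, 10, clean.shape)           # known sigma = 10
print(estimate_noise_sigma(noisy))                       # prints roughly 10
```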
Complete Performance Analysis of L-band Optical Communication System for NRZ and RZ Formats
Navdeep Kaur, Gurpreet Kaur
The performance of an optical communication system is largely determined by the modulation format; here various formats are studied in the L band at different bit rates. OptiSystem software is used to simulate the optical link, with WDM multiplexing and demultiplexing, comparing the return-to-zero (RZ) and non-return-to-zero (NRZ) modulation formats. It has been found that in the L band the RZ format covers longer distances at higher data rates and gives better results than NRZ, and the higher quality factor is obtained with the RZ format. The study of optical communication systems is multidisciplinary, involving a wide range of areas including optical link design, channel modelling, and communications and information theory. With RZ at 35 Gbps, the 1625 nm wavelength gives a bit error rate of 5.187e-010.
This paper is concerned with the design of an application-specific microcontroller (ASuC) that can be implemented on any FPGA. The ASuC is a collection of several generic processor modules that can be integrated to serve any particular application in the most optimized way. The design is written in Verilog HDL using the Xilinx Design Suite 14.1. The proposed architecture and simulation waveforms are presented in this paper.
A Novel Scheme for Intrusion Detection & Anticipation of Black Hole & Gray Hole Attacks In AODV Based MANET using ZED
Arti Tiwari, Prof. Nilmani Verma
A mobile ad hoc network is typically a collection of network elements which combine to create a network requiring no preset infrastructure. Security is the most commonly cited concern for wireless ad hoc systems, and wireless networks pose distinctive security issues. The new security design for mobile ad hoc networks protects the data from a number of security threats while incurring very low computational complexity. Discovering the point at which a malicious node is present allows us to prevent that attack in further transmissions, giving a proper technique to detect malicious attacks such as the black hole and gray hole attacks. The proposed technique first safeguards the system from repudiation, i.e. it prevents a sender or receiver from denying having sent or received a data packet, using the Zero Knowledge Protocol (ZKP) as the authentication structure to ensure the authenticity of the sender node. The scheme can also detect the cloning attack. Second, if a further attack occurs, the structure helps provide a remedy for the discovery and rejection of malicious attacks, using an Extended Data Routing Information (EDRI) table along with the AODV routing table; it also maintains a history of previously malicious nodes with regard to gray behavior. The EDRI procedure is presented but not yet implemented within NS-2, which would supply valuable performance metrics such as packet delivery ratio, packet delivery rate by number of nodes, latency, and average end-to-end delay.
Optimizing Node Position using Ant Colony Optimization Algorithm (ACO)
A. P. Leela Vinodhini
Recent research has offered evidence showing how a malicious user could perform co-residence profiling and public-to-private IP mapping to target and exploit customers who share physical resources. Two steps are relied upon for this attack: resource placement on the target's physical machine, and extraction. The approach in this paper, in part inspired by mussel self-organization, relies on user account and workload clustering to mitigate co-residence profiling. Users with similar preferences and workload characteristics are mapped to the same cluster. To obfuscate the public-to-private IP map, each cluster is managed and accessed by an account proxy. Each proxy uses one public IP address, which is shared by all clustered users when accessing their instances, and maintains the mapping to private IP addresses. The paper gives a risk assessment for the mussel-inspired behavior and presents arguments to show how the strategy increases the effort required for an adversary to carry out a directed attack against a target set. Experimental results from the risk assessment suggest a reduced per-individual chance of being randomly victimized by a non-directed attack.
A Report on Dhruval Enterprise Resource Planning Solution Implementing Financial Module
Lokesh Sharma, Vijayakumar Kubsad, Prasanta Kumar Paul
Enterprise Resource Planning is a business/technology term for an information system based on a common database and common application tools that allow real-time information to be accessed, shared and compared easily and immediately across organizations, agencies, divisions or departments. The objective of this application is to improve administrative and operational functions by making them less complicated and more efficient; this, in turn, improves the ability to manage resources well and give better customer service. It includes application modules for CRM, accounting, taxation, payroll, inventory, HRMS and other important aspects of a business. The application is built on the strong backend database MySQL Server. Dhruval's user management has multi-layer protection, so users are able to access data only to the extent of the privileges they have been given. The package also includes a backup facility, including automatic backup.
The cloud computing model for delivering computing services offers cheap access to a variety of standardized services from various providers. But after outsourcing a service to the cloud, the owner no longer controls the platform on which the service runs; the user is bound to trust the cloud provider with the correctness, privacy, and integrity of its data and computations. Cryptographic mechanisms can reduce such trust by allowing the user to protect its data and computations, as well as to verify aspects of remote computation. The benefits of cloud storage are easy access (access to your data anywhere, anytime), scalability, resilience, cost efficiency, and high reliability of the data. Every organization is moving its data to the cloud, meaning it uses the storage service provided by the cloud provider, so there is a need to protect that data against unauthorized access, modification or denial of service. To secure the cloud means to secure the computations and storage databases hosted by the cloud provider. This paper describes different security issues in the cloud and different cryptographic algorithms adoptable for better cloud security.
Information Technology (IT) is an important field of study in a number of areas. Due to the numerous advantages of IT, organizations are trying to implement IT applications to support their businesses. This research therefore aims to provide a better and clearer understanding of IT strategy within organizations by reviewing and analyzing the current IT literature. Based on this review, we propose an IT strategy template. The proposed template for effective IT strategy is believed to provide organizations with a practical synopsis of IT strategy, which will in turn assist them in succeeding with IT institutionalization within their businesses.
This paper examines the prospect of developing a fully automated parking system for two-wheelers and cars. The proposed system improves upon the existing parking system by enhancing its security features and automating the parking process, thus eliminating the need for manual intervention. For authentication and owner identification the parking system has an inbuilt Bluetooth reader. The user has to turn on his/her mobile's Bluetooth for identification and registration; the Bluetooth reader fetches the user's Bluetooth address and transfers it to the database. The user has to turn Bluetooth on again at the time of exit. This eliminates the use of tokens or paper bills. Space management and automation are performed with the help of an ARM microcontroller, which controls the mechanical motors to park the vehicle at an appropriate parking location [5].
RIVER: A RELIABLE INTER-VEHICULAR ROUTING PROTOCOL FOR VEHICULAR ADHOC NETWORKS
R.Mallika Devi, Dr.B.Srinivasan
Vehicular Ad hoc Networks (VANETs), an emerging technology, would permit vehicles on roads to form a self-organized network without the help of a permanent infrastructure. As a prerequisite to communication in VANETs, an efficient route between communicating nodes in the network must be established, and the routing protocol must adapt to the rapidly changing topology of vehicles in motion. This is one of the goals of VANET routing protocols. In this paper, we present an efficient routing protocol for VANETs, called the Reliable Inter-Vehicular Routing (RIVER) protocol. RIVER utilizes an undirected graph that represents the surrounding street layout, where the vertices of the graph are points at which streets curve or intersect, and the graph edges represent the street segments between those vertices. Unlike existing protocols, RIVER performs real-time, active traffic monitoring and uses this data, together with other data gathered through passive mechanisms, to assign a reliability rating to each street edge. The protocol then uses these reliability ratings to select the most reliable route. Control messages are used to identify a node's neighbors, determine the reliability of street edges, and share street-edge reliability information with other nodes.
An Enhanced Level Set Segmentation for Gray Images Using Fuzzy Clustering and Lattice Boltzmann Method
Savita Agrawal, Deepak Kumar Xaxa
In the last decades, image segmentation has proved its applicability in various areas like satellite image processing and medical image processing. In the present scenario researchers try to develop hybrid image segmentation techniques to generate efficient segmentations. Due to the development of parallel programming, the lattice Boltzmann method (LBM) has attracted much attention as a fast alternative approach for solving partial differential equations. In this paper, we first design an energy functional based on the fuzzy c-means objective function which incorporates a bias field that accounts for the intensity inhomogeneity of real-world images. Using the gradient descent method, the corresponding level set equations are obtained, from which we deduce a fuzzy external force for the LBM solver based on the model by Zhao. The result is a fast, robust method for denoising that is independent of the position of the initial contour, effective in the presence of intensity inhomogeneity, highly parallelizable, and able to segment objects with or without edges. Assessment on medical and real-world images demonstrates the performance of the proposed method in terms of speed and efficiency. The work proposed in this paper concentrates on gray-level images.
SMART VISION SYSTEM FOR BLIND
Prof. R. R. Bhambare, Akshay Koul, Siddique Mohd Bilal, Siddharth Pandey
Visual impairment and blindness caused by various diseases have been hugely reduced, but many people remain at risk of age-related visual impairment. Visual information is the basis for most navigational tasks, so visually impaired people are at a disadvantage because the necessary information about the surrounding environment is not available. With recent advances in inclusive technology, it is possible to extend the support given to people with visual impairment during their mobility. In this context we propose a system, named Smart Vision, whose objective is to give blind users the ability to move around in unfamiliar environments, whether indoor or outdoor, through a user-friendly interface. This paper focuses mainly on the development of the computer vision module of the Smart Vision system.
In today's internet world, where nothing is protected and digital products are sent and received over the open network very frequently, the security of information is very important. Many techniques are being developed to make messages and images secure with a high level of security. Some encryption techniques use a selected part of an image for encryption, while others apply the encryption algorithm to the whole image bit by bit. In this paper, existing work on visual cryptography encryption techniques is surveyed. Six visual cryptography encryption techniques are studied and analyzed to compare the performance of the encryption and decryption methods and to assess how well they ensure secure communication.
Protection from Vampire Attacks on Routing Protocol
Pushpalata D. Chandore, Devendra Vatsa, Nitesh Rastogi
Ad-hoc low-power wireless networks are an exciting research direction in sensing and pervasive computing. Prior security work has focused primarily on denial of communication at the routing or media access control levels. This paper examines resource depletion attacks at the routing protocol layer, which disable networks by quickly draining nodes' battery power. These "Vampire" attacks are not specific to any particular protocol, but rather rely on the properties of many well-known classes of routing protocols. We find that all examined protocols are susceptible to Vampire attacks, which are destructive, hard to detect, and easy to carry out using as few as one malicious insider sending only protocol-compliant messages. In the worst case, a single Vampire can increase network-wide energy usage by a factor of O(N), where N is the number of nodes in the network. We discuss methods to mitigate these types of attacks, including a new proof-of-concept protocol that bounds the damage caused by Vampires during the packet forwarding phase.
Design and Simulation of Microstrip Patch Antenna for Wireless Communication
Gyanender Kumar, Charanjeet Singh, Deepender Dabas
This paper presents the design and simulation of broadband microstrip patch antennas using the High Frequency Structure Simulator (HFSS). Two different configurations of broadband microstrip patch antenna, a simple rectangular patch antenna and an E-shaped patch antenna, are analyzed. The performance of the designed antennas is analyzed in terms of return loss, bandwidth and gain. The substrate used in both configurations is FR-4, with dielectric constant 4.4. After analysis, the return losses and gains of the two antennas are calculated: the return losses of the rectangular patch and E-shaped patch are -21.93 dB and -25.714 dB, and their gains are 6.787 dB and 8.284 dB, respectively.
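For context, these are the standard transmission-line-model design equations for a rectangular patch (textbook formulas, not taken from the paper, which reports simulated results): with resonant frequency f_r, substrate permittivity ε_r and substrate height h,

```latex
W = \frac{c}{2 f_r}\sqrt{\frac{2}{\varepsilon_r + 1}}, \qquad
\varepsilon_{\mathrm{eff}} = \frac{\varepsilon_r + 1}{2}
  + \frac{\varepsilon_r - 1}{2}\left(1 + 12\,\frac{h}{W}\right)^{-1/2},

\Delta L = 0.412\, h \,
  \frac{\left(\varepsilon_{\mathrm{eff}} + 0.3\right)\left(\frac{W}{h} + 0.264\right)}
       {\left(\varepsilon_{\mathrm{eff}} - 0.258\right)\left(\frac{W}{h} + 0.8\right)}, \qquad
L = \frac{c}{2 f_r \sqrt{\varepsilon_{\mathrm{eff}}}} - 2\,\Delta L .
```

With FR-4 (ε_r = 4.4) these give the initial patch dimensions that a simulator such as HFSS then refines.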
Cloud computing is emerging as a critical information and communication technology that will heavily impact our daily life in the future; it is an emerging trend in computing. This is happening at a time when increasing attention is being paid to the need to manage energy consumption across the entire information and communication technology (ICT) sector. Data center energy has received much attention recently because very large data centers are used in industry today, and environmentally these systems can produce e-waste, harmful gases and heat. This paper focuses on different techniques to save overall energy consumption in data centers, an approach known in enterprises as green cloud computing. Here we explain and suggest virtualization techniques for saving energy, thus facilitating green cloud computing.
Mine Adverse Drug Reaction from Patient Electronic Database using Particle Swarm Optimization
M. Maheswari, Ms. J. Preethi
Data mining is the process of extracting or mining knowledge from large amounts of data, and it is used in many real-world applications: data are mined from a data set and analyzed for decision making, and various data mining algorithms are applied to improve the decision-making process. To discover causal relationships from electronic patient data sets, causality among infrequent events must be captured. The proposed system finds the optimum membership functions of a fuzzy system using particle swarm optimization (PSO). PSO has no evolution operators such as crossover and mutation; the particle swarm optimization concept consists of, at each time step, changing the velocity of each particle toward its personal-best (pbest) and global-best (gbest) locations, so that all particles move toward the best solutions found so far. The mined adverse drug reaction signals are useful for discovering critical patient conditions, and they reduce the computational complexity for drug safety professionals.
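A minimal NumPy sketch of the PSO velocity/position update described above. The fitness here is a stand-in sphere function; in the paper's setting it would instead score candidate fuzzy membership functions against the patient database.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_particles, iters = 4, 30, 100
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia, cognitive, social weights

def fitness(p):                                # stand-in objective (minimize)
    return float((p ** 2).sum())

x = rng.uniform(-5, 5, (n_particles, dim))     # particle positions
v = np.zeros_like(x)                           # particle velocities
pbest = x.copy()                               # personal best positions
pbest_f = np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()         # global best position

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    # No crossover/mutation: velocities are pulled toward pbest and gbest.
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f                     # update personal bests
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()     # update global best

print(gbest, fitness(gbest))                   # converges near the optimum at 0
```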
REVIEW OF CATTLE MONITORING SYSTEM USING WIRELESS NETWORK
Anselemi B. Lukonge, Dr. Shubi Kaijage, Ramadhani S. Sinde
The cattle industry is an integral part of the world economy. The continued production of quality beef requires new and improved methods for long-term monitoring of animal health; additional benefits can be realized from this class of technology, such as the ability to identify the presence of disease early and thereby prevent its spread. An important element of health assessment is the ability to monitor vital data such as core body temperature. This paper reviews existing cattle monitoring systems, along with their advantages and disadvantages, and the importance of employing wireless sensor networks for monitoring cattle core body temperature and location on ranches.
Brain Tumor Detection and Classification Using Histogram Equalization And Fuzzy Support Vector Machine Approach
K. Vinotha
There are a number of different quantitative models that can be used in a medical diagnostic decision support system; the complexity of the diagnostic task is thought to be one of the prime determinants of model selection. Using histogram equalisation, the input image is pre-processed, and the suspicious portion of the image is segmented using a Markov random field segmentation algorithm. Features are extracted based on texture, fractal and histogram features; finally, classification is done using the support vector machine approach.
Vehicle Number Plates Detection and Recognition using improved Algorithms: A Review with Tanzanian Case study
Cosmo H. Munuo, Dr. Michael Kisangiri
Invented in 1976, Number Plate Recognition (NPR) has since found wide commercial applications, making its research prospects challenging and scientifically interesting. A complete NPR system functions in several steps, viz. license plate localization, sizing and orientation, normalization, character recognition and geometric analysis. This paper is a review of the preliminary stages of NPR; it explains the number plate localization, sizing and orientation, and normalization sections of Number Plate Detection and Recognition for the Tanzanian case study. MATLAB R2012b is employed in these processes. The input includes front and rear photographic images of vehicles; for proximity and simulation purposes the acceptable image angle is 90° ± 15°. The captured image is converted to gray scale and binarized, and edge detection algorithms are used to enhance edges. The output of this stage provides the input to feature extraction, segmentation and recognition.
Demographical Implementation of Graph Classification in Educational Network Using Graph Mining Technique
S. A. Amala Nirmal Doss, Dr. S. P. Victor
Graphs are becoming increasingly important in modeling complicated structures, such as circuits, images, chemical compounds, protein structures, biological networks, social networks, the Web, workflows, and XML documents. Graph mining offers a convenient way to study structured data at different levels of implication. Our conventional setup initially focuses on a dataset and its entities. This paper performs a detailed study of classified data for graph classification into variant clusters in the field of graph mining, carried out with identification and analysis strategies. We implement our integrated graph mining techniques with a real-time educational network domain, and we perform survey analysis for the successful implementation of the proposed research technique in several sampling domains with a maximum level of improvement. In the near future we will implement cluster mining techniques for analyzing graph substructure behaviors.
Use of Dispersion Compensating Fiber in Optical Transmission Network for NRZ Modulation Format
Gurpreet Kaur, Navdeep Kaur
In this paper, dispersion compensating fibers are used to compensate for the positive dispersion accumulated over the length of the fiber. A post-compensation scheme is employed for dispersion compensation, with the NRZ modulation format. The performance of this scheme is analyzed, and the scheme is then optimized over different wavelengths of the C band. The investigation is based on detailed simulative analysis using OptiSystem.
UNDERWATER VIDEO PROCESSING FOR DETECTING AND TRACKING MOVING OBJECT
Srividya M. S., Hemavathy R., Shobha G.
In this paper, we present a vision system capable of analyzing underwater videos to detect and track moving objects. The video processing system consists of three subsystems: video texture analysis, object detection, and tracking. Moving object detection is based on an adaptive Gaussian mixture model, and tracking is carried out by applying the Kalman filter. Unlike existing methods, our approach provides a reliable method in which moving objects are detected in unconstrained environments and under several scenarios (murky water, algae on the camera lens, moving plants, low contrast, etc.). The proposed approach was tested on 20 underwater videos, achieving an overall accuracy as high as 85%.
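A minimal OpenCV sketch of the pipeline just described: adaptive Gaussian-mixture background subtraction for detection, then a constant-velocity Kalman filter smoothing the detected centroid. The input file name is a hypothetical placeholder and the parameters are illustrative.

```python
import cv2
import numpy as np

bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

kf = cv2.KalmanFilter(4, 2)                    # state: x, y, vx, vy; measurement: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2

cap = cv2.VideoCapture("underwater.avi")       # hypothetical input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                     # foreground mask from the GMM model
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    pred = kf.predict()                        # predicted object position this frame
    if contours:
        c = max(contours, key=cv2.contourArea)       # largest moving blob
        x, y, w, h = cv2.boundingRect(c)
        kf.correct(np.array([[x + w / 2], [y + h / 2]], np.float32))
    cv2.circle(frame, (int(pred[0, 0]), int(pred[1, 0])), 5, (0, 0, 255), -1)
```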
This work focuses on how a malignant user could attack customers who share physical resources, performing co-residence profiling and public-to-private IP mapping to select targets. The malicious user's attack has two steps: resource (VM) placement on the target's physical machine, and extraction from the target's physical machine. The proposed work depends on user account and workload clustering, inspired by mussel self-organization, to reduce co-residence profiling: users with similar behavior and workload types belong to the same cluster. In order to obscure the public-to-private IP map, each cluster is supervised and accessed through an account proxy. Each proxy uses one public IP address, which is shared by all clustered users when accessing their instances, and maintains the mapping to private IP addresses. This work describes the set of abilities and attack paths an attacker needs to launch a targeted co-residence attack. It shows how our approach disturbs the critical steps in the attack path in most cases and then performs a risk assessment to determine the probability that an individual user will be victimized, given that a successful non-directed exploit has occurred.
Survey Paper for WARNINGBIRD: Detecting Suspicious URLs in Twitter Stream
Mr. Sulabh S., Mr. Siva Shankar S.
Today online social networks play an important role in daily life. There are various online social networks, such as Twitter, and these have shown tremendous growth in recent years. These kinds of social networks allow users to make social connections with others. Apart from this, there are security issues and violations. This paper investigates correlations of URL redirect chains extracted from several tweets in the Twitter stream. Because attackers have limited resources and usually reuse them, their URL redirect chains frequently share the same URLs. We develop methods to discover correlated URL redirect chains using the frequently shared URLs and to determine their suspiciousness. Numerous tweets are collected from the Twitter public timeline and a statistical classifier is built using them. Evaluation results show that our classifier accurately and efficiently detects suspicious URLs.
ASPECT EXTRACTION & SEGMENTATION IN OPINION MINING
Mily Lal, Kavita Asnani
Opinion mining or sentiment analysis is the computational study of people's opinions, appraisals, attitudes, and emotions towards entities and their aspects. Opinion mining has recently become one of the significant areas of research in web mining and Natural Language Processing (NLP). With the growth of e-commerce, web documents are increasingly being used for decision making by individuals and organizations. This paper focuses on the identification of aspects in customer opinions. Most recent works concentrate on explicit aspects only, and very few have dealt with implicit aspects; here both explicit and implicit aspects are considered. A multi-aspect review sentence is segmented into multiple single-aspect segments, because different opinions can be expressed on multiple aspects simultaneously in the same review. Opinions are then polled to determine positive or negative comments for efficient decision making.
Enhancing Accuracy in Cross-Domain Sentiment Classification by using Discounting Factor
K. Aarthi, C. S. Kanimozhi Selvi
Sentiment analysis involves building a system to collect and examine opinions about a product expressed in blog posts, comments, reviews or tweets. Automatic classification of sentiment is important for applications such as opinion mining, opinion summarization, contextual advertising and market analysis. Sentiment is expressed differently in different domains, and it is costly to annotate data for each new domain. In cross-domain sentiment classification, the features or words that appear in the source domain do not always appear in the target domain, so a classifier trained on one domain might not perform well on another because it fails to learn the sentiment of the unseen words. One solution to this issue is a thesaurus which groups different words that express the same sentiment; feature expansion is then required to augment a feature vector with additional related features and reduce the mismatch between features. The proposed method creates a thesaurus that is sensitive to the sentiment of words expressed in different domains. It utilizes both labeled and unlabeled data from the source domains and unlabeled data from the target domain. It uses pointwise mutual information to compute a relatedness measure, which in turn is used to create the thesaurus. Pointwise mutual information is biased towards infrequent elements/features, so a discounting factor is multiplied into the pointwise mutual information to overcome this problem. The proposed method then uses the created thesaurus to expand feature vectors. Using these extended vectors, a lasso-regularized logistic regression binary classifier is trained to classify the sentiment of reviews in the target domain. It gives improved prediction accuracy compared with the existing cross-domain sentiment classification system.
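A minimal sketch of pointwise mutual information with a discounting factor, as used above. The discount shown is the common minimum-frequency form from the distributional-similarity literature; the paper's exact factor may differ, and the counts are toy data.

```python
import math
from collections import Counter

def discounted_pmi(pair_counts, x_counts, y_counts, total):
    pmi = {}
    for (x, y), c_xy in pair_counts.items():
        c_x, c_y = x_counts[x], y_counts[y]
        raw = math.log((c_xy * total) / (c_x * c_y))
        # The discount shrinks PMI for infrequent elements, countering
        # PMI's bias toward rare co-occurrences.
        disc = (c_xy / (c_xy + 1.0)) * (min(c_x, c_y) / (min(c_x, c_y) + 1.0))
        pmi[(x, y)] = max(raw, 0.0) * disc       # keep positive PMI only
    return pmi

# Toy counts: the word "excellent" co-occurring with candidate features.
pairs = Counter({("excellent", "battery"): 8, ("excellent", "plot"): 1})
words = Counter({"excellent": 20})
feats = Counter({"battery": 30, "plot": 2})
print(discounted_pmi(pairs, words, feats, total=1000))
# The rare pair ("excellent", "plot") has the higher raw PMI but is
# discounted below the well-attested pair ("excellent", "battery").
```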
AUTOMATED BLOOD VESSEL SEGMENTATION IN RETINAL IMAGE
Pradeepa B., Mr. J. Benadict Raja
The first step in the development of an automated system for ophthalmic diagnosis is automatic segmentation of blood vessels from retinal images. Automated blood vessel segmentation in retinal images provides information about diseases like diabetes, stroke, hypertension and arteriosclerosis. Abnormal images, cotton wool spots and hard exudates make the vessel extraction process a difficult task. A Support Vector Machine (SVM) based supervised method is proposed for extracting blood vessels in retinal fundus images. In this work, the green channel image is enhanced by a Gabor filter, and then various features are extracted by thresholding for the SVM classifier. The method will be implemented on the publicly available digital retinal images for vessel extraction (DRIVE) database.
Improving Efficiency of IDS using Alert Correlation
Thakar Vivek R, Prof. Vrushank Shah, Prof. Yatin Patel
Intrusion Detection Systems are designed to monitor a network environment and generate alerts whenever abnormal activities are detected. However, the number of these alerts can be very large, making their evaluation a difficult task for a security analyst. Alert management techniques reduce the alert volume significantly and potentially improve the detection performance of an Intrusion Detection System. We aim to improve the effectiveness and efficiency of an Intrusion Detection System by significantly reducing false positive alerts and increasing detection ability. The proposed technique addresses issues relating to the optimality of decision making through correlation in a multiple-sensor framework, based on the Dempster-Shafer rule. Moreover, the reliability factor of each Intrusion Detection System is also taken into account in order to minimize the chance of a false diagnosis of the final network state. A considerable number of simulations were conducted to determine the optimal performance of the proposed prototype. In this paper we combine evidence from two homogeneous and one heterogeneous IDS using the Dempster-Shafer algorithm.
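A minimal sketch of the evidence-fusion step named above, Dempster's rule of combination over the two-hypothesis frame {attack, normal}; the sensors' mass assignments are hypothetical.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: fuse two mass functions over subsets of the frame."""
    conflict, fused = 0.0, {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb               # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources fully disagree")
    # Renormalize by 1 - K so the fused masses again sum to one.
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

A, N = frozenset({"attack"}), frozenset({"normal"})
U = A | N                                     # the whole frame = uncertainty

ids1 = {A: 0.7, N: 0.1, U: 0.2}               # hypothetical masses from sensor 1
ids2 = {A: 0.6, N: 0.2, U: 0.2}               # hypothetical masses from sensor 2
print(combine(ids1, ids2))                    # fused: attack 0.85, normal 0.10, unsure 0.05
```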
A Survey on Pre-Emptive Migration of a Video Process Using Genetic Algorithm on Virtual Machine
Ranjitha K N, Sandhya S, N K Cauvery
In a distributed computer system, performance is estimated by how efficiently the work is divided across the participating nodes. Process migration is a specialized form of process management whereby processes are moved from one computing host to another. The most common application of dynamic process migration is load balancing. Load balancing is a computer networking method for distributing workloads across multiple computing resources, such as computers, a computer cluster, network links, central processing units or disk drives. Using it, both resource utilization and job response time are improved, avoiding situations in which some nodes are heavily loaded while others are idle. To develop an effective load balancing algorithm, many important issues must be considered, such as load level comparison, load estimation, system stability, the amount of information exchanged among nodes, estimation of job resource requirements, selection of jobs for transfer, and selection of remote nodes. In this paper, different memory migration techniques and uses of genetic algorithms in the field of dynamic load balancing are surveyed.
Survey Paper for Resisting Web Proxy Based HTTP Attacks by Locality Behavior
Lakshmi B, Silja Varghese
A novel attack detection scheme is proposed to prevent web proxy-based HTTP attacks, with detection carried out at the server. The scheme utilizes locality behaviors, such as temporal and spatial locality, to obtain features of proxy-to-server traffic. A Gaussian-mixture and Gamma-distribution hidden semi-Markov model is used to describe the time-varying traffic behavior of web proxies. A soft control scheme is used as the attack response: it converts malicious traffic into relatively normal traffic by behavior reshaping rather than rudely discarding it, so that legitimate users still get service while attackers are prevented from receiving it. DDoS attacks are a key threat to internet applications; compared with existing network security systems, the primary aim of the proposed system is to protect the origin server from web proxy-based HTTP attacks.
OBJECT DETECTION AND TRACKING IN VIDEOS : A REVIEW
Nagalakshmi C. K., Hemavathy R., Shobha G.
The detection of moving objects is important in many tasks, such as video surveillance and moving object tracking. In this paper, a review is made of video surveillance scenarios with real-time moving object detection and tracking. The design of a video surveillance system is directed at automatic identification of events of interest, especially the tracking and classification of moving objects. Object detection and tracking are used to establish correspondence between objects, or object parts, in consecutive frames and to extract temporal information about objects such as trajectory, posture, speed and direction. Tracking means detecting objects frame by frame in a video, and it is used in many areas such as video surveillance, traffic monitoring and people tracking. In a static environment, segmentation of objects is not complex; in a dynamic environment, conditions such as illumination changes, shadows and tree branches waving in the wind make object segmentation a difficult and significant problem that must be handled well for a robust visual surveillance system.
Fraud Detection in Social Security and Social Welfare Data Mining
A. Jenifer Sophia, S. Parthiban
The importance of the social security and social welfare business has been increasingly recognized in more and more countries. It impinges on a large proportion of the population and affects government service policies and people's quality of life. Typical welfare countries, such as Australia and Canada, have accumulated a huge amount of social security and social welfare data. Emerging business issues such as fraudulent outlays, and customer service and performance improvements, challenge existing policies as well as techniques and systems including data matching and business intelligence reporting systems. The need for a deep understanding of customers and customer–government interactions through advanced data analytics has been increasingly recognized by the community at large. So far, however, no substantial work on the mining of social security and social welfare data has been reported. For the first time in data mining and machine learning, and to the best of our knowledge, this paper draws a comprehensive overall picture and summarizes the corresponding techniques and illustrations for analyzing social security/welfare data, namely, social security data mining (SSDM), based on a thorough review of a large number of related references from the past half century. In particular, we introduce an SSDM framework, including business and research issues, social security/welfare services and data, as well as challenges, goals, and tasks in mining social security/welfare data.
Studying and Comparing Automated Testing Tools: Ranorex and TestComplete
Neha Dubey, Mrs. Savita Shiwani
Test automation tools enable developers and testers to easily automate the complete process of testing in software development. The intention of this research paper is to carry out a comparison and study of the concepts, builds and features of automated tools such as Ranorex and AutomatedQA TestComplete, based on criteria such as the effort involved in generating test scripts, the capability to play back the scripts, result reports, and cost. The basic objective is to investigate the features and concepts supported by these two functional testing tools in order to assess independently the pros and cons of each tool and the possible guidelines for its further development.
Augmented Reality is a hot trend in the mobile industry, allowing the addition of external data on top of camera input. Augmented Reality is used with great interest in areas such as gaming, navigation, tourism, and education. Using digital image processing algorithms and computer vision, different iPhone and Android applications have been created. Our concept resembles the way we see and the way our brain understands. We propose here to augment a real-world object by capturing it and applying several image processing techniques to obtain an effective augmented image.
AN EFFICIENT MULTILEVEL PRIORITY PACKET SCHEDULING FOR WIRELESS SENSOR NETWORK
C. Vijayakumaran, K. Janaky
Scheduling real-time and non-real-time packets at the sensor nodes is significantly important to reduce the processing overhead, energy consumption, communication bandwidth, and end-to-end data transmission delay of a Wireless Sensor Network (WSN). Most existing packet scheduling algorithms for WSNs use First-Come First-Served (FCFS), non-preemptive priority, or preemptive priority scheduling. However, these algorithms incur a large processing overhead and data transmission delay and do not adapt dynamically to changes in data traffic. In this paper, we propose a Dynamic Multilevel Priority (DMP) packet scheduling scheme. In the proposed scheme, each node, except those at the last level of the virtual hierarchy in the zone-based topology of the WSN, has three levels of priority queues. Real-time packets are placed into the highest-priority queue and can preempt data packets in other queues. Non-real-time packets are placed into the two other queues based on a threshold on their estimated processing time. Leaf nodes have two queues, for real-time and non-real-time data packets, since they do not receive data from other nodes; this reduces end-to-end delay. We evaluate the performance of the proposed DMP packet scheduling scheme through simulations for real-time and non-real-time data. Simulation results illustrate that the DMP packet scheduling scheme outperforms conventional schemes in terms of average data waiting time and end-to-end delay.
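A minimal sketch of the three-level priority queueing just described: real-time packets go to the highest-priority queue, and non-real-time packets are split between the two lower queues by a threshold on estimated processing time. The threshold value and packet names are illustrative assumptions.

```python
from collections import deque

PR1, PR2, PR3 = 0, 1, 2              # real-time, short non-real-time, long non-real-time
THRESHOLD = 5.0                      # estimated processing-time cutoff (assumed units)

class DMPScheduler:
    def __init__(self):
        self.queues = {PR1: deque(), PR2: deque(), PR3: deque()}

    def enqueue(self, packet, real_time, est_proc_time):
        if real_time:
            self.queues[PR1].append(packet)        # highest priority, preempts others
        elif est_proc_time <= THRESHOLD:
            self.queues[PR2].append(packet)
        else:
            self.queues[PR3].append(packet)

    def dequeue(self):
        for pr in (PR1, PR2, PR3):                 # serve in strict priority order
            if self.queues[pr]:
                return self.queues[pr].popleft()
        return None

s = DMPScheduler()
s.enqueue("sensor-event", real_time=True, est_proc_time=1.0)
s.enqueue("bulk-report", real_time=False, est_proc_time=9.0)
s.enqueue("status-ping", real_time=False, est_proc_time=2.0)
print(s.dequeue(), s.dequeue(), s.dequeue())       # sensor-event, status-ping, bulk-report
```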
A Systematic Approach to Cloud Security Using SeDas Platform
Karthik D U, Madhu C, Sushant M
Cloud computing is one of the cutting-edge technologies used by most people. Cloud users are asked to submit personal private information to the cloud over the Internet. When users do this, they expect that cloud service providers (CSPs) will preserve the privacy of the data. Self-destructing data mainly aims at protecting the user's data privacy: all the data and the decrypting key self-destruct after a user-specified time, without any human intervention. Along with privacy, it is also possible to achieve confidentiality and integrity. The user data are encrypted while uploading the file to the cloud using cryptographic algorithms, and in order to provide integrity we use the MD5 or SHA algorithm to detect modification of data. In this paper, we present a system that uses the active storage technique to achieve self-destruction of data based on the T10 OSD standard. A recovery mechanism is also provided for legitimate users to obtain their data back by requesting it from the cloud admin. A new key is sent to the legitimate user by email or mobile; using this key, the user logs in to the SeDas platform to get the data back. This approach is therefore efficient to use and makes it possible to achieve all the privacy-preserving goals described.
Methodology for Selection of Spindle Drives for Milling Machines
Mikho Mikhov, Marin Zhilevski
This paper describes a methodology for the selection of spindle electric drives for milling machines with digital program control. The offered algorithm takes into account the features of the technological process, the tools used, the processed material, and the type of mechanical gear. A concrete example is presented, illustrating the practical application of this methodology. A number of models for computer simulation of electric drive systems with dual-zone speed regulation have been developed, allowing study at various reference speeds and loads applied to the motor shaft. The research carried out, as well as the results obtained, can be used in the development of such electric drives for the studied class of machine tools.
Delivering Of Real Time Services through Internet Protocol Using Cloud Computing
Nikshape S S, Deepak S S
Cloud computing is a set of services and resources offered through the Internet. Using cloud computing technology, services can be delivered to the user by means of virtualization. By making use of statistical multiplexing, virtualized cloud-based services can yield significant cost savings across different applications; achieving similar cost savings for real-time services is a challenging task. In this paper we seek to lower the provisioning costs of real-time IPTV services through a virtualized IPTV architecture and intelligent time-shifting of selected services. We consider live TV and video-on-demand as applications, exploit the deadline associated with each of these two applications to effectively multiplex both services, and provide a generalized framework to compute the amount of resources required to support multiple services without missing the deadline of any service. The problem is to find the best cost function among several cost functions, each of which reflects the cost required to provide the service. Solving this problem gives the number of servers required at different time instants to support the multiple services. We also implement a simple mechanism for time-shifting scheduled jobs in a simulator and study the reduction in server load using real traces from an operational IPTV network; the results show that we are able to reduce the load as predicted by the framework.
Opportunistic Routing Protocols In Human Working Day Model Delay Tolerant Networks
Shermina V Anthru, T.P Jayakumar
Delay-tolerant networks are used to enable communication in sparse mobile ad-hoc networks and other challenged environments where traditional networking fails and new routing and application protocols are required. Experience with DTN routing and application protocols has shown that their performance is highly dependent on mobility and node characteristics. Opportunistic routing protocols strive to improve wireless performance by making good use of communication opportunities arising by chance. Evaluating DTN protocols across many scenarios requires suitable simulation tools. This project presents an opportunistic-networking routing simulator specifically designed for evaluating DTN routing and application protocols. It also allows users to create scenarios based on different synthetic movement models as well as real-world traces, and offers a framework for implementing routing and application protocols, with six well-known routing protocols included.
Cloud computing uses the Internet and central remote servers to maintain data and applications. It allows consumers and businesses to use applications without installation and to access their personal files from any computer with Internet access. This technology allows for much more efficient computing by centralizing data storage, processing and bandwidth. The appearance of cloud computing has made a tremendous impact on the Information Technology (IT) industry over the past few years, and the IT industry now needs cloud computing services to provide the best opportunities to the real world. Cloud computing is in its initial stages, with many issues still to be addressed. Security is one of the major issues hampering the growth of the cloud. The idea of handing over important data to another company is worrisome, and consumers need to be vigilant in understanding the risks of data breaches in this new environment. The objective of this paper is to introduce a detailed analysis of cloud computing security issues and challenges, focusing on the cloud computing types and the service delivery types.
Location Privacy using Traffic-Aware Mix zones in Vehicular or Mobile Networks
Anju Pathrose, T.Poornima
In VANETs, for the purpose of safety, vehicles need to periodically broadcast safety messages providing precise position information to nearby vehicles. However, this frequent messaging (e.g., every 100 to 300 ms per car) greatly facilitates the tracking of vehicles, as it suffices to eavesdrop on the wireless medium. As a result, the driver's privacy cannot be protected. In order to protect personal location information, we propose the mix-zone concept. The Cryptographic Mix-zone (CMIX) protocol is used here to improve the location privacy of the mix zone. We propose to do so using pseudonym changes and cryptography. The paper concludes with an investigation, based on current results, of upcoming elements to be integrated into our secure VC architecture.
Godbless Swagarya , Shubi Kaijage , Ramadhani S. Sinde
A SURVEY ON WIRELESS SENSOR NETWORKS APPLICATION FOR AIR POLLUTION MONITORING
Godbless Swagarya , Shubi Kaijage , Ramadhani S. Sinde
As countries become industrialized, the pollution level of our environments increases, and this pollution becomes a major problem for the health of the population and also affects the ecosystem. Although some standards for waste emissions into the air are set by environmental authorities, monitoring and controlling those standards is still a challenge in most industries, especially chemical industries and cement factories. To avoid such adverse imbalances in nature, an air pollution monitoring system is of utmost importance. Some of the pollution monitoring systems in cement factories are OPSIS, Uras26, Magnos27 and CODEL; these systems are installed to monitor emissions from the chimney only. Wireless sensor networks are an excellent technology that can sense, measure, and gather information from the real world and, based on some local decision process, transmit the sensed data to the user. These networks allow the physical environment to be measured at any point and greatly improve the quality of environmental monitoring. This paper reviews the importance of employing wireless sensor networks for air pollution monitoring in cement factories, along with their advantages and disadvantages.
Testing is one of the most important phases in the software development life cycle. It plays a very important role in the success of a software product by improving its quality. Many testing challenges are involved in Web-based applications, but interoperability and integration are the most important ones. Nowadays the importance and complexity of Web-based applications are increasing, and they are evolving and emerging rapidly. This paper introduces the integration and interoperability issues of Web-based applications.
Various Methods to improve the visual appearance of Black & White and Colored Images
Baljit Kaur, Monika Tuteja, Shally Gujral
The aim of image enhancement is to improve the visual appearance of an image, or to provide a better transformed representation for future automated image processing. Many images, such as medical images, satellite images, aerial images and even real-life photographs, suffer from poor contrast and noise. It is necessary to enhance the contrast and remove the noise to increase image quality. Enhancement may be used to restore an image that has suffered some kind of deterioration due to the optics, electronics and/or environment, or to enhance certain features of an image.
Web Usage Mining For extracting Users’ Navigational Behavior
Divya Racha
In this era of the internet, web sites are increasing tremendously in volume and are also becoming more complex. Web usage mining is the application of different data mining techniques to web data: it tracks users' navigational behaviour using their history of records and extracts information of user interest using patterns. The obtained results are used in different applications such as recommender systems, business intelligence and site improvement. This paper emphasizes the idea of better understanding user profiles and site objectives, as well as the way users browse web pages, in order to better serve them with a list of web pages relevant to them, by comparing against the user's historic pattern using different web usage mining techniques. This paper also discusses different applications of web usage mining.
The recognition of wood species is needed in many areas, such as the construction industry and furniture manufacturing. Wood is traditionally classified by human experts, but human identification of wood type is not accurate and manual identification is a time-consuming process. So in this paper, an intelligent recognition system for the identification of wood species was developed. This paper uses image enhancement as a preprocessing technique, along with a method which divides the image into several blocks, known as image blocking. Features are extracted from each block using gray-image and edge detection techniques. In this paper, the GLCM (gray-level co-occurrence matrix) is used as the texture classification technique. The GLCMs are generated to obtain three features: contrast, entropy and correlation. The classification technique used to classify the wood species is correlation. Our experimental results showed that the proposed method can increase the recognition rate up to 95%, which is faster and better than the existing system, which gives an 85% recognition rate.
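A minimal sketch of the blocking-plus-GLCM feature extraction described above, using scikit-image; the block size, the gray-level quantization and the stand-in image are assumptions, and since graycoprops offers no entropy property, entropy is computed directly from the normalized matrix:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

def glcm_features(block, levels=8):
    """Contrast, entropy and correlation of one image block via its GLCM."""
    # Quantize the 8-bit block to a small number of gray levels first.
    q = (block.astype(np.uint16) * levels // 256).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    correlation = graycoprops(glcm, "correlation")[0, 0]
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return contrast, entropy, correlation

# Blocking: split a stand-in grayscale image into 32x32 tiles and extract
# the three features from each tile.
image = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
feats = [glcm_features(image[r:r+32, c:c+32])
         for r in range(0, 128, 32) for c in range(0, 128, 32)]
print(len(feats), "blocks, first block features:", feats[0])
```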
A wireless sensor network can get separated into multiple connected components due to the failure of some of its nodes, which is called a “cut”. In this article we consider the problem of detecting cuts by the remaining nodes of a wireless sensor network. We propose an algorithm that allows (i) every node to detect when the connectivity to a specially designated node has been lost, and (ii) one or more nodes (that are connected to the special node after the cut) to detect the occurrence of the cut. The algorithm is distributed and asynchronous: every node needs to communicate with only those nodes that are within its communication range. The algorithm is based on the iterative computation of a fictitious “electrical potential” of the nodes. The convergence rate of the underlying iterative scheme is independent of the size and structure of the network
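The flavour of the potential-based computation can be sketched as follows; the update rule is one common form of such schemes, and the constants and topology are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def node_potentials(adj, source, s=100.0, iters=500):
    """Iteratively compute the fictitious 'electrical potential' of each node.

    adj: symmetric 0/1 adjacency matrix; source: the designated node.
    Nodes cut off from the source see their potential collapse toward zero.
    """
    deg = adj.sum(axis=1)
    x = np.zeros(adj.shape[0])
    for _ in range(iters):
        nxt = adj @ x
        nxt[source] += s              # the source injects a constant "current"
        x = nxt / (deg + 1.0)         # each node averages over its neighbourhood
    return x

# Path network 0-1-2-3-4 with node 0 as the specially designated node.
A = np.diag(np.ones(4), 1)
A = A + A.T
print(np.round(node_potentials(A, source=0), 2))   # all positive: no cut

A[2, 3] = A[3, 2] = 0                # fail the 2-3 link: nodes 3, 4 are cut off
print(np.round(node_potentials(A, source=0), 2))   # their potentials drop to ~0
```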
A Survey and Comparative Study of Routing Protocols in Wireless Sensor Network
Sakshi Sharma, Malti rani
Due to real-time applications, wireless sensor networks are a very active research field in computer networking. A WSN consists of tiny, autonomous, low-cost, low-power sensor nodes with sensing and wireless communication capabilities. The design of routing protocols for WSNs is influenced by challenging factors like energy consumption, scalability, fault tolerance and quality of service. In this paper, a survey of different routing protocols for WSNs is given and a comparative study is presented. The routing protocols discussed are classified as: location-based protocols, layered and in-network processing protocols, data-centric protocols and mobility-based protocols.
Automatic Texture analysis of cartilage for early detection of osteoarthritis
Anjani, Sanjeev Kubakaddi
Osteoarthritis (OA) is the most commonly occurring chronic disease affecting the elderly population. OA is characterized by degenerative change in the knee cartilage, so early detection of OA is very useful for medical treatment. Osteoarthritis symptoms are swelling, loss of movement, severe pain and a decrease in the function of the joint. This work segments the cartilage automatically from the knee MRI. A texture algorithm is then applied to the segmented cartilage region to check the biochemical properties of the cartilage. The fluid content in the cartilage region increases with Joint Space Width (JSW). Micro Texture Descriptor (MTD) algorithms are used for the texture analysis.
Pruning the Cloud Internal Data Stealing By Treachery Attacks
Shobha Agasibagil, Mr. G.Lingana Gowda
Cloud computing is Internet-based computing that provides computing services such as data, applications and software; these services are delivered to local devices through the Internet. Cloud computing enables multiple users to store, share and access personal and business information. Insiders are users who have valid authority on the cloud, while attackers are treated as remote users from the security perspective, and whether an attacker is a remote user should be checked by the security systems. If an authorized user's access details are stolen, an attacker can enter and access the cloud as a valid user. To distinguish the real user's data access from the attacker's, decoy information technology is used in the field of cloud computing: on detection of abnormal access to information, the system confuses the attacker with bogus information.
Realization of computerized text taxonomy through a supervised learning system
Mr. Suresh G S, Mrs. Sharayu Pradeep
The exponential growth of the Internet has led to a great deal of interest in developing useful and efficient tools and software to assist users in searching the web. Text is cheap, but the information of knowing to which class a text belongs is expensive. Automatic categorization of text can provide this information at low cost, but the classifiers themselves must be built with expensive human effort, or trained from texts which have themselves been manually classified. Text classification is the process of classifying documents into predefined categories based on their content. Document retrieval, categorization and filtering can all be formulated as classification problems. Traditional information retrieval methods use keywords occurring in documents to determine the class of a document. In this paper, we propose an association analysis approach for classifying text using the generation of frequent item word sets (features), known as Frequent-Pattern (FP) Growth. A Naive Bayes classifier (a supervised classifier) is then applied to the derived features for the final categorization.
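A toy sketch of the pipeline named above (frequent word sets as features, then Naive Bayes), using the mlxtend and scikit-learn libraries; the corpus, the support threshold and the exact feature-construction step are our assumptions:

```python
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["cheap flights and hotel deals", "flights hotel booking deals",
        "machine learning for text", "text classification with learning"]
labels = ["travel", "travel", "cs", "cs"]

# 1) Binary word-presence matrix, the input expected by FP-Growth.
vec = CountVectorizer(binary=True)
X = vec.fit_transform(docs)
df = pd.DataFrame(X.toarray().astype(bool), columns=vec.get_feature_names_out())

# 2) Frequent word sets: the "features" of the described approach.
itemsets = fpgrowth(df, min_support=0.5, use_colnames=True)
feature_words = sorted({w for s in itemsets["itemsets"] for w in s})
print("frequent-itemset vocabulary:", feature_words)

# 3) Naive Bayes trained only on words occurring in frequent itemsets.
nb_vec = CountVectorizer(vocabulary=feature_words)
clf = MultinomialNB().fit(nb_vec.transform(docs), labels)
print(clf.predict(nb_vec.transform(["cheap hotel deals"])))
```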
Fast Greedy Algorithm for Routing in Delay Tolerant Network
Aditya Pancholi, Sapna Grover
Delay-tolerant networking (DTN) is an approach to computer network architecture that seeks to address the technical issues in heterogeneous networks which may lack continuous network connectivity. Traditional routing algorithms try to establish a complete route from source to destination and then forward the actual data; due to the lack of end-to-end connectivity, this is not possible in a DTN. Also, security guarantees are difficult to establish in a network without persistent connectivity. This paper gives a fast greedy algorithm that intelligently selects the next carrier node(s), optimizing the chances of successful delivery.
Integrity Check Mechanism in Cloud Using SHA-512 Algorithm
Mrs.Shantala C P, Mr.Anil Kumar
Cloud computing is an alternative to traditional information technology due to its services. In this paper, a new three-way integrity algorithm is introduced. Here we check the accuracy of the cloud service provider (CSP) and the third-party auditor (TPA). It provides an efficient data integrity mechanism between the client and the cloud by using RSA with a digital signature on the message digest instead of on the whole data, to make computations faster. Our public auditing system can be constructed from the above auditing scheme in two phases: a setup phase and an audit phase. The problem of verifying the correctness of data storage in the cloud becomes even more challenging when the data are dynamic; this dynamic feature makes traditional integrity assurance techniques futile and entails new solutions. We propose an extensible and effective distributed scheme with explicit dynamic data support to ensure the correctness of users' data in the cloud.
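A minimal sketch of the digest-then-sign idea using Python's hashlib and the pyca/cryptography package; the key size, padding choice and auditing flow are assumptions (note that the sign call hashes the 64-byte digest once more, which keeps the RSA input small regardless of file size):

```python
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Client side: hash the file once and sign only the 64-byte digest,
# so the (slow) RSA operation never touches the full data.
data = b"file contents stored in the cloud" * 1000
digest = hashlib.sha512(data).digest()

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = key.sign(digest,
                     padding.PSS(mgf=padding.MGF1(hashes.SHA512()),
                                 salt_length=padding.PSS.MAX_LENGTH),
                     hashes.SHA512())

# Auditor side (TPA): recompute the digest from the stored data and verify.
def verify(stored_data, signature, public_key):
    d = hashlib.sha512(stored_data).digest()
    public_key.verify(signature, d,
                      padding.PSS(mgf=padding.MGF1(hashes.SHA512()),
                                  salt_length=padding.PSS.MAX_LENGTH),
                      hashes.SHA512())   # raises InvalidSignature on tampering
    return True

print(verify(data, signature, key.public_key()))
```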
Group Co-operative Schemes for Optimal Multicast Capacity-Delay Scaling in MANET
Geetha.C, Mr. Justin Gopinath
Mobile ad hoc networks (MANETs) are complex distributed systems that comprise wireless mobile nodes which dynamically self-organize into arbitrary ad-hoc network topologies, allowing devices to seamlessly network in areas with no preexisting communication infrastructure. A recent interest of the research community is to understand how capacity changes under delay constraints in mobile ad hoc networks. In this project, we concentrate on capacity-delay scaling optimality for the multicast traffic pattern under an i.i.d. mobility model in mobile ad hoc networks. Assuming that n nodes move in a unit square, with each serving as a source that sends identical packets to k destinations, we propose four group schemes for which the achievable capacity and delay are analyzed:
(1) Non-cooperative non-redundancy scheme,
(2) Non-cooperative redundancy scheme,
(3) Cooperative non-redundancy scheme,
(4) Cooperative redundancy scheme.
With the intelligent cooperation scheme, each destination acts equivalently as a relay and helps other destinations get more opportunities to receive packets, at some sacrifice of capacity. The project work is to design and implement the above group schemes and to simulate them using NS2. The results will be compared with and without the proposed algorithms to show the increase in performance of the proposed algorithm.
V.K. Pathak, Kamal Mehta, Seema Sahu & H.L. Manker
Mtbf Analysis Of The Two-Unit Series System With Repair Facility
V.K. Pathak, Kamal Mehta, Seema Sahu & H.L. Manker
This paper discusses mean time between failures and availability analysis of a two-unit series repairable system with three states under consideration. The first two states are taken as up states and the third state is a down state. A single repair facility is available which repairs the units on a first-come-first-served basis. Taking the failure rates as identically distributed exponential random variables and the repair rate as exponential, the mean time between failures (MTBF) and the steady-state availability are calculated. Denoting the failure rate of the components as λ and the repair rate as μ, expressions for the steady-state availability and MTBF have been derived using the semi-Markov process and the regenerative point technique. The difference equations are developed and solved using the Laplace transform.
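The Laplace-transform derivation can be cross-checked numerically. Below is a sketch with an illustrative three-state chain (two up states, one down state); the transition structure and the rates λ and μ are assumptions standing in for the paper's exact model:

```python
import numpy as np

lam, mu = 0.01, 0.5          # assumed failure and repair rates (per hour)

# Sub-generator over the transient (up) states {0, 1}:
#   0 -> 1 at 2*lam (either of the two units fails),
#   1 -> 0 at mu    (repair completes), 1 -> down at lam (second failure).
Q_up = np.array([[-2 * lam,        2 * lam],
                 [      mu, -(mu + lam)]])

# MTBF = mean time to absorption from state 0:
# solve  -Q_up @ t = 1  for the vector of expected hitting times.
t = np.linalg.solve(-Q_up, np.ones(2))
print(f"MTBF from 'both up': {t[0]:.1f} h")  # equals (3*lam + mu)/(2*lam**2)

# Steady-state availability over the full chain, with repair from the
# down state at rate mu: solve pi @ Q = 0 with sum(pi) = 1.
Q = np.array([[-2 * lam,     2 * lam,  0.0],
              [      mu, -(mu + lam),  lam],
              [     0.0,          mu,  -mu]])
A = np.vstack([Q.T, np.ones(3)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)[0]
print(f"availability: {pi[0] + pi[1]:.4f}")
```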
Ambulance Controlled Traffic System Using RFID Technology with LabVIEW Simulation
S. Chandrakanth Sagar, Dr. M. Narayana
Traffic management on the road has become a severe problem of today's society because of the growth of urbanization. This leads to traffic jams at traffic junctions, which in turn cause delays to ambulances. In order to overcome this problem, this paper presents a simple ambulance-controlled traffic system. The main objective of this system is to control the traffic, allowing an ambulance to arrive at a particular location without having to stop anywhere until the destination is reached. The system includes RFID technology and LabVIEW software. An RFID reader reads the ID number from the corresponding ambulance RFID tag, which is then sent to the LPC 1768H microcontroller, programmed with the help of embedded C instructions. This microcontroller is capable of communicating with input and output modules. The RFID readers provide the information to the microcontroller, which compares the received ID with the default IDs stored in its memory. If the obtained ID matches any of the stored IDs, a green signal is given along the path of the ambulance; otherwise no change in the signal takes place. The signal will not change from green until the same tag is detected by the other reader on another route, after which normal traffic signal operation resumes. The system includes simulation observation: the operation performed on the hardware circuit is observed likewise on the front panel of LabVIEW. Moreover, the designed system has a simple architecture, fast response time, ease of understanding of the working module, user friendliness and scope for further expansion.
Development of Wireless Communication Networks: From 1G to 5G
Shivam Jaiswal , Ajay Kumar , Neha Kumari
Wireless technology is growing very fast these days. Until very recently, a wired network was needed to get online, and even wired telephones are becoming a thing of the past. Mobile networks have evolved tremendously in the last 4 decades [1]. The cellular concept was introduced with 1G ('G' stands for generation) networks. Today, 4G technology is getting ready to storm the markets and research on 5G has begun. Mobile communication is continuously one of the hottest areas, developing at a booming speed, with advanced techniques emerging in all fields of mobile and wireless communications. 5G communication systems are being developed to solve the various problems the current communication systems (3G, 2.5G) are facing. 4G and 5G will be intelligent technologies that reduce the number of different technologies to a single global standard. This paper describes the evolution of mobile wireless communication networks from 1G to 5G, how they differ from each other, and what advantages and disadvantages they have.
Refinement of visual secret sharing scheme without image size expansion
Ms. Smita Patil, Prof. Ms. Jyoti Rao
The basic idea of Visual Cryptography is to encrypt a secret image into n meaningless share images. The Visual Cryptography technique cannot leak the encrypted information of the shared secret through any insufficient combination of the n share images. The share images are printed on transparencies and distributed as shares such that, when the shares are superimposed, the concealed secret image is revealed. The human visual system can recognize the shared secret image without using any computational device; it needs neither knowledge of cryptography nor complex computation. A Visual Cryptography technique for multiple secrets is proposed, which encrypts more than one secret into the equivalent number of share images. The traditional visual secret sharing scheme uses a pre-defined pattern book to generate shares, which leads to pixel expansion on the share images. Thus, to minimize the pixel expansion problem in the VC scheme, a new system is devised which can share two binary secret images on two rectangular share images without pixel expansion. The proposed approach not only has good contrast, but also has excellent recovery quality for the secret image, and the critical problem of pixel expansion is minimized.
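For reference, the traditional (2,2) scheme with its characteristic 2x2 pixel expansion, which the proposed system is designed to avoid, can be sketched as follows (the mask patterns and toy secret are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_shares(secret):
    """Traditional (2,2) visual secret sharing with 2x2 pixel expansion.

    secret: binary array, 1 = black.  Each secret pixel becomes a 2x2 block:
    white pixels get identical blocks on both shares (stack = half black),
    black pixels get complementary blocks (stack = fully black).
    """
    patterns = np.array([[[1, 0], [0, 1]],        # two complementary 2x2 masks
                         [[0, 1], [1, 0]]])
    h, w = secret.shape
    s1 = np.zeros((2 * h, 2 * w), dtype=np.uint8)
    s2 = np.zeros_like(s1)
    for r in range(h):
        for c in range(w):
            p = patterns[rng.integers(2)]         # random choice hides the pixel
            s1[2*r:2*r+2, 2*c:2*c+2] = p
            s2[2*r:2*r+2, 2*c:2*c+2] = 1 - p if secret[r, c] else p
    return s1, s2

secret = np.array([[1, 0], [0, 1]], dtype=np.uint8)
a, b = make_shares(secret)
stacked = a | b                 # superimposing transparencies acts as OR
print(stacked)  # all-black 2x2 blocks where secret is black, half-black where white
```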
H.T. Rathoda, Bharath Rathod , K. T. Shivaram, K. Sugantha Devi
A New Automated Scheme of Quadrilateral Mesh Generation for Finite Element Analysis
H.T. Rathoda, Bharath Rathod , K. T. Shivaram, K. Sugantha Devi
This paper presents a novel mesh generation scheme of all-quadrilateral elements for a convex polygonal domain. The scheme converts the elements in a background mesh into quadrilaterals through the operation of splitting. We first decompose the convex polygon into simple sub-regions in the shape of quadrilaterals. These simple regions are then triangulated to generate a fine mesh of six-node triangular elements. We then propose an automatic triangular-to-quadrilateral conversion scheme, in which each isolated triangle is split into three quadrilaterals according to the usual scheme, adding three vertices at the middle of the edges and a vertex at the barycentre of the element. To preserve mesh conformity, a similar procedure is applied to every triangle of the domain to fully discretize the given convex polygonal domain into all quadrilaterals, thus propagating uniform refinement and quadrangulation. This simple method generates a high-quality mesh whose elements conform well to the requested shape by refining the problem domain. Examples are presented to illustrate the simplicity and efficiency of the new mesh generation method for standard and arbitrarily shaped domains. We have appended MATLAB programs which incorporate the mesh generation scheme developed in this paper; these programs provide valuable output on the nodal coordinates, element connectivity and a graphic display of the all-quadrilateral mesh for application to finite element analysis.
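The core splitting operation is easy to state concretely; a minimal sketch follows (coordinates and vertex-ordering conventions are ours):

```python
import numpy as np

def split_triangle_to_quads(tri):
    """Split one triangle into three quadrilaterals, as described above:
    insert a vertex at each edge midpoint and one at the barycentre.

    tri: 3x2 array of corner coordinates.  Returns three quads, each a
    4x2 array (corner, adjacent midpoint, barycentre, other midpoint).
    """
    v0, v1, v2 = np.asarray(tri, dtype=float)
    m01, m12, m20 = (v0 + v1) / 2, (v1 + v2) / 2, (v2 + v0) / 2
    g = (v0 + v1 + v2) / 3                   # barycentre
    return [np.array([v0, m01, g, m20]),     # quad attached to corner v0
            np.array([v1, m12, g, m01]),     # quad attached to corner v1
            np.array([v2, m20, g, m12])]     # quad attached to corner v2

for q in split_triangle_to_quads([[0, 0], [1, 0], [0, 1]]):
    print(q.tolist())
```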
This paper presents a study of the market position of Windows 8 and how it integrates both desktop and mobile. Our work in this paper is the study of Windows 8 for desktop, Windows 8 for mobile, and the business model of Windows 8. At the end we conclude the merits and demerits of Windows 8 desktop and mobile on the basis of our study, and also how Windows 8 is better than its predecessors.
FSR: Ferry-based Secure Routing Algorithm for Delay Tolerant Networks
Sapna Grover, Aditya Pancholi, Sonika Arora
A delay-tolerant network (DTN) is a collection of infrastructure-less nodes with no fixed communication medium. These nodes cooperate dynamically to meet certain immediate needs; therefore, each node acts as a router besides being merely a host. Security issues have thus become more challenging in these networks due to their dynamic nature, and these networks are vulnerable to different kinds of attacks, because of which security has always been a major concern.
This paper uses a ferry-based [11] mechanism for providing security and maintaining consistency throughout the network.
Development of PCU Value of Vehicle under mix Nature Traffic Condition in Cities on Congested Highways
A.R.Khanorkar, S.D.Ghodmare
Highways in India are different from the other roads of the country. Traffic on Indian roads consists of a mix of vehicle types with widely different static and dynamic characteristics: bicycles, two-wheelers, three-wheelers, light commercial vehicles, cars and trucks. This work aims to study traffic flow on Indian highways by evaluating the Passenger Car Unit (PCU) of different vehicle categories at different highway sections around Nagpur city, i.e., to work out the PCU for different types of vehicles under non-homogeneous traffic conditions. Field data were collected at four highway links outside urban areas, covering the different vehicle types in non-homogeneous traffic over a wide range of mixed traffic volumes and highway widths. The PCU is used to account for the shoulder area, since the free speed of a vehicle increases linearly with width. It was observed from the study of traffic volume and roadway conditions that the PCU value of a vehicle changes significantly with changes in the volume of mixed traffic and the width of the roadway. The capacity of highways also increases with the use of the shoulder area, and its positive effect on the PCU value of a vehicle type increases with increasing lane width. This results from the high proportion of road space occupied (high area occupancy) by heavy vehicles, even when few heavy vehicles are present on the road; because of this, the maximum PCU value for both trailers and trucks increases at low proportions and remains constant at high proportions. The relationship between volume and speed at different highway sections was found to follow a second-degree curve, and this relationship is used to calculate the capacity of highways.
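For reference, one widely used formulation of PCU under mixed traffic, Chandra's speed-area method, is sketched below; the authors' exact field procedure may differ in detail:

```latex
% Chandra's speed-area method for the PCU of vehicle type i,
% relative to the standard passenger car c:
\mathrm{PCU}_i = \frac{V_c / V_i}{A_c / A_i}
% V_c, V_i : mean stream speeds of the car and of vehicle type i
% A_c, A_i : projected rectangular plan areas (length x width)
%            of the car and of vehicle type i
```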
Reuse of software is required to improve quality and productivity. If some changes are needed in existing software, the software can be reused rather than created from scratch. Software reuse is the process of creating software systems from existing software rather than building them from scratch. This paper gives details about various concepts of reuse. For making some changes in software, there is no need to change the whole software; the particular changes can be made by reusing the existing software. This paper explains the things in software that can be reused, various techniques of reuse, and the advantages and drawbacks of reuse. Component-based reuse is the most popular form of reuse, and the quality of software depends on the quality of the reusable components.
An NLP Method for Discrimination Prevention Using both Direct and Indirect Method in Data Mining
Sampada U. Nagmote, Prof. P. P. Deshmukh
Today, data mining is an increasingly important technology. It is a process of extracting useful knowledge from large collections of data. There are some negative views about data mining, among them potential privacy invasion and potential discrimination. Discrimination means treating people unequally or unfairly on the basis of their belonging to a specific group. If data sets are divided on the basis of sensitive attributes like gender, race, religion, etc., discriminatory decisions may ensue. For this reason, antidiscrimination laws for discrimination prevention have been introduced into data mining. Discrimination can be either direct or indirect. Direct discrimination occurs when decisions are made based on sensitive attributes; it consists of rules or procedures that explicitly mention minority or disadvantaged groups based on sensitive discriminatory attributes related to group membership. Indirect discrimination occurs when decisions are made based on non-sensitive attributes which are strongly correlated with biased sensitive ones; it consists of rules or procedures that, while not explicitly mentioning discriminatory attributes, could intentionally or unintentionally generate discriminatory decisions.
Re-ranking of Images using Semantic Signatures with Duplicate Images Removal & K-means clustering
Sayali Baxi, S.V.Dabhade
Image search engines mostly use keywords and rely on surrounding text for searching images. The ambiguity of query images is hard to describe accurately using keywords; e.g., if "apple" is the query keyword, the categories can be "red apple", "apple laptop", etc. Another challenge is that, without online training, low-level features may not correlate well with high-level semantic meanings, and low-level features are sometimes inconsistent with visual perception. The visual and textual features of images are therefore projected into their related semantic spaces to get semantic signatures. In the online stage, images are re-ranked by comparing the semantic signatures obtained from the semantic space of the query keywords. The semantic space of a query keyword can be described by just 20-30 concepts (also referred to as "reference classes").
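A small sketch of the duplicate-removal and clustering stages using scikit-learn; the feature vectors, the similarity threshold and the number of reference classes are stand-in assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
features = rng.random((200, 64))        # stand-in visual features of result images

# 1) Duplicate removal: drop any image whose feature vector is nearly
#    identical (cosine similarity above a threshold) to an earlier one.
keep, DUP_THRESHOLD = [], 0.98
for i, f in enumerate(features):
    if not keep or cosine_similarity(f[None], features[keep]).max() < DUP_THRESHOLD:
        keep.append(i)
unique = features[keep]

# 2) K-means over the remaining images: the cluster centres play the role
#    of the 20-30 "reference classes", and each image's distances to the
#    centres form a compact semantic signature used for re-ranking.
km = KMeans(n_clusters=25, n_init=10, random_state=0).fit(unique)
signatures = km.transform(unique)       # shape: (n_images, 25)

query_sig = signatures[0]
order = np.argsort(np.linalg.norm(signatures - query_sig, axis=1))
print("re-ranked indices (most similar first):", order[:10])
```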
MR Image Reconstruction with L1norm Minimization and Total Variation Denoising
Danny Joseph, Manu Raju
This paper is a modest venture to introduce compressed sensing magnetic resonance image reconstruction. The algorithm is devised by minimizing an L1-norm regularization and total variation (TV), combined with total variation denoising. MR image reconstruction has gained a tremendous far-reaching impact in the present technologically advanced world. The original problem is bifurcated into two entities, the L1 norm and total variation [1,2], so that it can be solved effectively; this expedites the reconstruction of the MR image through an iterative framework. An additional application of a denoising technique to this method was found to be very efficient and reliable. A comparative view of the computational complexity and reconstruction accuracy of the present method against earlier approaches demonstrates its effectiveness based on the numerical results.
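In standard notation, the reconstruction problem being split is of the following form (the notation here is ours, not the paper's):

```latex
% Unconstrained CS-MRI reconstruction objective combining the two
% regularizers named above with a k-space data-fidelity term:
\hat{x} = \arg\min_{x} \; \tfrac{1}{2}\,\lVert F_u x - y \rVert_2^2
          + \lambda_1 \lVert \Psi x \rVert_1
          + \lambda_2 \, \mathrm{TV}(x)
% F_u : undersampled Fourier operator      y : acquired k-space samples
% \Psi : sparsifying transform (identity for a plain L1 penalty)
% \lambda_1, \lambda_2 : regularization weights; the splitting solves the
% L1 and TV subproblems alternately inside an iterative framework.
```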
Algorithmic Reduction and Optimization of Logic Circuit in Area and Power Tradeoffs with the Help of BDD
Gaurav Sharma
The design complexity and increasing speed of very-large-scale integration (VLSI) chips imply a significant increase in power consumption, so many different design approaches have been developed by researchers to reduce power. This paper presents an algorithmic technique based on hybridizing symbolic manipulation techniques based on BDDs with more traditional explicit solving algorithms. To validate the approach, the graph colouring problem has been selected as a hard-to-solve problem, and an optimized solution based on hybrid techniques has been implemented. Experimental results on a set of benchmarks derived from the CAD-for-VLSI area show the applicability of the approach to graphs with millions of vertices in a limited CPU time. Boolean functions can be graphically manipulated to reduce the number of nodes, and hence the area, when implemented as binary decision diagrams; here, the ordering of BDD nodes plays a very important role. Most algorithms for the variable ordering of OBDDs have focused on area minimization; hence, for minimizing power consumption, a suitable input variable ordering is required. To find an optimal variable ordering, three algorithms are used in this paper, namely a genetic-algorithm-based technique, a branch and bound algorithm and a scatter search algorithm. Experimental results show a substantial reduction in area and power. The switching activity of the circuit is also calculated, and a comparison is made between all the above techniques.
Valves in an engine have numerous functions to perform and experience enormous stress even in normal conditions; hence the testing of valves is very important. The present work is focused on the design of a combined valve oscillation testing machine to check two- and four-valve cylinder heads. Four- as well as six-cylinder engine heads can also be tested on the same machine. Conventionally, valves are tested on two different assembly lines, separate for two and four valves; the combined approach has been adopted to save space, cost and testing time. The concept incorporates mechanical, automation and pneumatic principles; as a result, it improves efficiency and speed of operation and reduces manpower. The aim is to elaborate the concept with the aid of design. The concept is simple and reliable. The design calculations have been carried out and found to be suitable for the application stated above.
Implementation of Cooperative Caching in Social Wireless Networks
S.L.Suganya, Dr.R.Indra Gandhi
This paper introduces cooperative caching policies for minimizing the content provisioning cost in Social Wireless Networks (SWNETs). SWNETs are formed by mobile devices, such as data-enabled phones and electronic book readers, sharing common interests in electronic content and physically gathering together in public places. Electronic object caching in such SWNETs is shown to be able to reduce the content provisioning cost, which depends heavily on the service and pricing dependences among various stakeholders, including content providers (CP), network service providers, and end consumers (EC). Drawing motivation from Amazon's Kindle electronic book delivery business, this paper develops practical network, service, and pricing models which are then used for creating two object caching strategies for minimizing content provisioning costs in networks with homogeneous and heterogeneous object demands. The paper constructs analytical and simulation models for analyzing the proposed caching strategies in the presence of selfish users that deviate from network-wide cost-optimal policies. It also reports results from an Android phone-based prototype SWNET, validating the presented analytical and simulation results.
Performance analysis of different Image Enhancement Algorithms
Mr. Gharu Anand Nandlal, Prof. Rastogi Nitesh
This project proposes different enhancement algorithms for blurred images. It is useful to apply image enhancement methods to increase the visual quality of images as well as to enhance interpretability and visibility. An Empirical Mode Decomposition (EMD) based blurred-image enhancement algorithm is presented for this purpose. EMD is a signal decomposition technique which is particularly suitable for the analysis of non-stationary and non-linear data.
In EMD, each spectral component of a blurred image is first decomposed into Intrinsic Mode Functions (IMFs). The lower-order IMFs capture fast oscillation modes (high spatial frequencies in images) while the higher-order IMFs typically represent slow oscillation modes (low spatial frequencies in images). The enhanced image is then constructed by combining the IMFs of the spectral channels with different weights in order to obtain an enhanced image with increased visual quality. The weight estimation is carried out automatically using a genetic algorithm that computes the weights of the IMFs so as to optimize the sum of the entropy and the average gradient of the reconstructed image.
Prof. (Ms) A. A. Patil, Prof. T. I. Bagban, Prof. S. J. Patil
Trust-based security for Ad-Hoc network
Prof. (Ms) A. A. Patil, Prof. T. I. Bagban, Prof. S. J. Patil
A mobile ad-hoc network (MANET) is a decentralized, infrastructure-less wireless network of mobile nodes. Since the nodes are mobile, the network topology may change rapidly and unpredictably over time. The network is decentralized: all network activity, including discovering the topology and delivering messages, must be executed by the nodes themselves, i.e., routing functionality is incorporated into the mobile nodes. Due to multi-hop routing and the absence of centralized administration in an open environment, MANETs are vulnerable to attacks by malicious nodes. In order to decrease the hazards from malicious nodes, a simple trust model is built to evaluate neighbours' behaviour in forwarding packets. Extended from the Ad-hoc On-demand Distance Vector (AODV) routing protocol, a trust-based reactive multipath routing protocol, Ad-hoc On-demand Trusted-path Distance Vector (AOTDV), is proposed for MANETs. This protocol is able to discover multiple loop-free paths as candidates in one route discovery. These paths are evaluated by two aspects: hop counts and trust values. Among these paths, the shortest path that meets the data packets' requirements for dependability or trust is chosen. Several experiments have been conducted to compare the AODV and AOTDV protocols, and the results show that AOTDV improves the packet delivery ratio and reduces the impairment from black hole attacks.
A Literature Survey on Mobile Cloud Computing: Open Issues and Future Directions
Nitesh Kaushik, Gaurav, Jitender Kumar
Given the advances in mobile phones, users have started to consider the mobile phone a personal information processing tool, and so want to execute various operations on top of mobile devices. Researchers have long recognized that mobile hardware is necessarily resource-poor relative to static client and server hardware. Mobile cloud computing (MCC), which combines mobile computing and cloud computing, is a good solution to this problem and has become one of the industry buzzwords and a major discussion topic since 2009. This paper presents a review of the background and principles of MCC, its characteristics, recent research work, and future research trends.
Intelligent technique for mining customers review by the help of opinion mining
Shubham pandey, Prateek gupta
In this work an intelligent technique is developed to analyze and summarize customer comments with the help of opinion mining and part-of-speech tagging. Comments made by customers are sometimes not easy for a computer to understand, so the intelligent technique helps to overcome this problem, making any web-based business easy and transparent for both customers and merchants. In the proposed work, through several steps, namely feature identification, sentiment analysis, and summarization, the orientation of each comment can be checked and the user can know whether the comment is in favour of or against the product.
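A minimal sketch of the two stages named above using NLTK; the resource names (which vary across NLTK versions), the noun-based feature heuristic and the VADER sentiment scorer are our assumptions, not the authors' pipeline:

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# Resource identifiers as used by classic NLTK releases; newer versions
# may use slightly different names.
for pkg in ("punkt", "averaged_perceptron_tagger", "vader_lexicon"):
    nltk.download(pkg, quiet=True)

comment = "The battery life is excellent but the camera quality is poor."

# Feature identification: treat nouns flagged by the POS tagger as
# candidate product features.
tokens = nltk.word_tokenize(comment)
features = [w for w, tag in nltk.pos_tag(tokens) if tag.startswith("NN")]
print("candidate features:", features)

# Sentiment analysis: VADER's compound score decides favour/against.
score = SentimentIntensityAnalyzer().polarity_scores(comment)["compound"]
print("orientation:", "in favour" if score > 0 else "against", f"({score:+.2f})")
```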
Shadreck Mudziwepasi, Phumzile Nomnga, Mfundo Shakes Scott
Development of an e-Diary Service for Deployment on the UFH network
Shadreck Mudziwepasi, Phumzile Nomnga, Mfundo Shakes Scott
Diaries play a critical role in the management of work schedules in societies and working environments; the result of their use is proper decision-making and time management. This research project focused on the development of an e-Diary Service system for employees and students at the University of Fort Hare (UFH), situated in the town of Alice in the Eastern Cape province of South Africa. The development of the e-Diary was aimed at addressing the problems encountered in the use of hard-copy diaries for the management of work schedules at UFH. It has always been a problem using a hard-copy diary for managing work schedules because of the irregularities and inconsistencies associated with its use. These include, among others, the misplacement or loss of recorded information and the failure to meet set targets and attend important scheduled events due to the lack of a reminder alert in a hard-copy diary. The system was thus developed to provide a suitable solution to these irregularities and help its users to manage their work schedules in an easy, fast, safe and cost-effective manner. Its functions include, among many others, the ability to allow users to record, update, view and delete their work schedules, while also providing the important functionality of sending reminders to users about their work schedules via electronic mail or the Short Message Service (SMS).
PROTECTED DESIGN FOR TOPOLOGY TO MANAGE AND AUTHORIZE MOBILE AD HOC NETWORK
Mr.J.Rajesh B.E.,(M.E), Mrs.A.Kamakshi* M.Tech
Wireless sensor networks are the fastest-emerging modern networks, merging sensing, computation, and communication into a single tiny device. The power of wireless sensor networks lies in the ability to deploy large numbers of tiny nodes that assemble and configure themselves. Usage scenarios for these devices range from real-time tracking, to monitoring of environmental conditions, to ubiquitous computing environments. The most straightforward application of wireless sensor network technology is to monitor remote environments for low-frequency data trends. In MANETs, cooperative communication poses significant security challenges: during cooperative communication each user transmits its own bits, and a trade-off is observed here. In the existing system, authentication and topology control issues are the focus. Authentication and topology control are directly associated in MANETs and are considered jointly; JATC (joint authentication and topology control) is used to improve throughput in the system. Even though JATC improves throughput, energy protection is low, communication between networks is limited, and spam attacks are not avoided. To improve the throughput, a secured algorithm is proposed: the proposed Secure Adaptive Distributed Topology Control Algorithm achieves efficient throughput in the scheme. A secure decentralized clustering algorithm for wireless ad-hoc sensor networks is implemented; this algorithm operates without a central controller, operates asynchronously, and does not require knowledge of the sensors' locations.
This paper focuses on the improvement of the Expectation Maximization (EM) algorithm. An attribute selection method is used for experimentation on EM clustering. For attribute selection we used the GainRatioAttributeEval evaluator with the Ranker search method for EM clustering, which gives better results than those obtained without using the attribute selection method.
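A rough scikit-learn analogue of this experiment (mutual information stands in for Weka's gain-ratio ranking, and a Gaussian mixture fitted by EM stands in for Weka's EM clusterer; the dataset and cut-off are illustrative):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.mixture import GaussianMixture

X, y = load_iris(return_X_y=True)

# Rank attributes against the class and keep the top two.
scores = mutual_info_classif(X, y, random_state=0)
top = np.argsort(scores)[::-1][:2]
print("selected attribute indices:", top)

# EM clustering (Gaussian mixture) with all attributes vs. selected ones.
for name, data in [("all attributes", X), ("selected", X[:, top])]:
    gm = GaussianMixture(n_components=3, random_state=0).fit(data)
    print(f"{name}: log-likelihood per sample = {gm.score(data):.3f}")
```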
A Review Paper on Preventing DDOS Attack and Black Hole Attack with MANETs Protocols
Kanchan, Harwant Singh Arri
Users want wireless connectivity regardless of their geographic position, so wireless networks are gaining popularity day by day. A MANET requires its mobile nodes to be able to move freely into and out of the network. Wireless links may break down due to mobility and the changing topology formed by the collection of mobile nodes in a MANET. Routing is a major subject and challenge in MANETs, and we aim to improve routing performance and consistency. Various routing protocols have been proposed, such as IAODV and IDSR. The current work is to build a hybrid protocol to counter the DDoS attack and the black hole attack; the results will be verified on performance metrics like packet delivery fraction, throughput, and end-to-end delay. Simulation is done in Network Simulator 2 (NS2).
We are creating a web application for sentiment analysis. There are a number of social networking services available on the internet, such as Facebook, Twitter, WhatsApp, etc. On these websites we can send and receive messages, post comments and tag images, but we cannot analyze or classify these comments into categories like positive, negative and neutral. In our web application (product review by sentiment analysis) we can classify an incoming comment or message as positive, negative or neutral. A basic task in sentiment analysis is classifying the polarity of the text at the sentence or feature level: whether the opinion expressed in a sentence or about an entity is positive, negative, or neutral. The purpose of this project is to build an algorithm that can accurately classify messages with respect to a query term and, from the average over messages, generate a graph. In this web application, a message can be converted into actual text.
In sentiment analysis several classifiers are used. Naive Bayes is a simple model which is used in our web application to classify messages and comments as positive or negative.
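A compact sketch of the Naive Bayes classification step with scikit-learn; the sample messages and the three-way labels are illustrative:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny labelled sample standing in for collected comments.
messages = ["I love this product", "great quality and fast delivery",
            "worst purchase ever", "totally useless, waste of money",
            "it arrived on time", "the package was delivered"]
labels = ["positive", "positive", "negative", "negative", "neutral", "neutral"]

# Bag-of-words counts feed a multinomial Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

incoming = ["really great product", "waste of time"]
print(dict(zip(incoming, model.predict(incoming))))
```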
Rainfall Prediction Using Data Mining techniques: A Survey
Shoba G, Dr. Shobha G.
Data mining is the study of how to determine underlying patterns in data. Data mining techniques like machine learning are deployed alongside conventional methods. Different data mining techniques such as GRNN, MLP, NNARX, CART, RBF, ARIMA and so on are used for the prediction of rainfall. In this paper, various data mining algorithms used for rainfall prediction models are analyzed. It is difficult to name one particular algorithm as best suited for prediction; sometimes, when certain algorithms are combined, they perform better and are more effective.
In the world-renowned game of Cricket, for giving umpiring decisions like Stumping and Run Out, the Third Umpire currently has to review the various angular video footages thoroughly and laboriously. According to the current Laws of Cricket, the Third Umpire is expected to give the correct decision within 30 seconds, although it is at his discretion to take as much time as required. This process evidently takes a lot more time than 30 seconds. Hence, a software system for assisting the Third Umpire is required that will help cut short the above time lag. The Third Umpire will then just provide the video feeds as input to the system and it will compute a result, which will aid the Third Umpire in reaching a decision quickly. Our idea is to use image processing techniques, which can provide the means of building such a software system for the above-mentioned scenario. The software system will make use of an object tracking algorithm for tracking the wickets and, depending on the displacement of the wickets, the frames of that moment will be extracted using a frame extraction algorithm.
Prof. Pankaj Khambre, Aishwarya Pathak, Aarti Deshpande, Shringar Jha
Cross lingual information retrieval using tf-idf and similarity measures
Prof. Pankaj Khambre, Aishwarya Pathak, Aarti Deshpande, Shringar Jha
This project demonstrates a simple and pragmatic approach to the creation of comparable corpora using Cross-Lingual Information Retrieval (CLIR). CLIR research is becoming more and more important for Information Retrieval (IR) on the Web, as the Web is a truly multilingual environment and CLIR is necessary for global information exchange and knowledge sharing. In this project, the aim is to identify the same news story written in multiple languages (the problem of cross-language news story detection). For example, in a multilingual environment such as India, where the same news story is covered in multiple languages, a reader might want to refer to the local-language version of a news story, and such stories are also rich sources of both parallel and comparable text. In this paper we have followed the corpus-based approach for the retrieval of the most relevant news.
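The tf-idf-plus-similarity matching at the heart of this approach can be sketched with scikit-learn as follows; the corpus is illustrative, and the query-translation step of CLIR is omitted:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# English-side story corpus; in the CLIR setting the query would first be
# translated into English before this matching step.
stories = ["Election results announced in the capital today",
           "The cricket team won the final match",
           "Heavy monsoon rains flood several districts"]

query = "flooding after monsoon rain in many districts"

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(stories)
sims = cosine_similarity(vec.transform([query]), doc_matrix)[0]

# Rank stories by cosine similarity to the query, most relevant first.
for i in sims.argsort()[::-1]:
    print(f"{sims[i]:.3f}  {stories[i]}")
```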
Increasing Performance of Cooperative Opportunistic Routing in MANET using Spatial Reuse
Raghunath M. Kawale, Prof. M. D. Ingle
A Mobile Ad hoc Network is a self-configuring, infrastructure-less network for wireless communication. Opportunistic routing is a recent technique that achieves high throughput in the face of lossy wireless links. The earlier opportunistic routing protocol, ExOR, ties the MAC with routing, imposing a strict schedule on routers' access to the medium. The main concept of cooperative communications is to make use of the broadcast nature of the wireless medium. In this paper, we improve the performance of cooperative opportunistic routing in MANETs using spatial reuse. Here MORE is used instead of ExOR, since ExOR ties the MAC with routing and adds a strict schedule on routers' access to the medium. MORE randomly mixes packets before forwarding them; this randomness ensures that routers which hear the same transmission do not forward the same packets. Thus, MORE needs no special scheduler to coordinate routers and can run directly on top of 802.11.
A STUDY ON SOFTWARE RELIABILITY, RELIABILITY TESTING AND GOMPERTZ MODEL
Sandeep Sharma
Reliability is defined as the ability of a system or component to perform its required functions under stated conditions for a specified period of time. There are various parameters which improve the reliability of software. It is necessary to maintain the reliability of software to keep track of correct information about a company; this information includes resources, money, employees, transaction details and much more. Nowadays the demand for complex systems has increased rapidly, and the size and complexity of computer systems have grown during the past decades in a very impressive manner. Due to this increase in size and complexity, it becomes difficult to maintain the reliability of a system. Software reliability is closely related to safety engineering and system safety, in that they use common methods for their analysis and may require input from each other. Software reliability focuses on the costs of failure caused by various threats, software faults and more. Various approaches can be used to improve the reliability of software; however, it is hard to balance development time and budget against software reliability. The best approach to assure software reliability is to develop high-quality software through all stages of the software life cycle.
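For reference, the Gompertz software reliability growth model named in the title is commonly written as follows (the parameter naming here is ours):

```latex
% Gompertz software reliability growth model: expected cumulative number
% of faults detected by test time t.
m(t) = a\, b^{\,c^{t}}, \qquad a > 0,\; 0 < b < 1,\; 0 < c < 1
% a    : total expected fault content  (m(t) -> a  as  t -> infinity)
% b, c : shape parameters estimated from the observed fault-detection data
```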
Ants release a chemical called pheromone while searching for food and are capable of finding the shortest path to their target. Observing this behaviour, the ACS (Ant Colony System) algorithm was proposed. ACS has earlier been used to solve many problems, such as the travelling salesman problem. We propose to use ACS to solve multi-stage graphs, where the ant colony system can effectively give optimal solutions. This approach can be used in real-world applications like grid computing, data mining, etc.
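A compact sketch of ACS-style search applied to a toy multistage graph; the graph, the parameter values and the pheromone-update details are our assumptions, not a definitive implementation:

```python
import random

random.seed(1)

# Multistage graph as stage-to-stage cost dictionaries (assumed toy data):
# stages[k][(i, j)] = cost of the edge from node i in stage k to node j
# in stage k + 1.
stages = [{(0, 0): 4, (0, 1): 1},
          {(0, 0): 3, (1, 0): 6, (1, 1): 1},
          {(0, 0): 2, (1, 0): 5}]

ALPHA, BETA, RHO, Q, ANTS, ITERS = 1.0, 2.0, 0.5, 10.0, 8, 30
tau = {k: {e: 1.0 for e in lvl} for k, lvl in enumerate(stages)}  # pheromone

def walk():
    """One ant walks stage by stage, choosing edges by pheromone x heuristic."""
    node, path, cost = 0, [], 0.0
    for k, lvl in enumerate(stages):
        edges = [e for e in lvl if e[0] == node]
        weights = [tau[k][e] ** ALPHA * (1.0 / lvl[e]) ** BETA for e in edges]
        e = random.choices(edges, weights)[0]
        path.append((k, e))
        cost += lvl[e]
        node = e[1]
    return path, cost

best_path, best_cost = None, float("inf")
for _ in range(ITERS):
    tours = [walk() for _ in range(ANTS)]
    for k in tau:                               # pheromone evaporation
        for e in tau[k]:
            tau[k][e] *= (1.0 - RHO)
    for path, cost in tours:                    # deposit, favouring cheap tours
        for k, e in path:
            tau[k][e] += Q / cost
        if cost < best_cost:
            best_path, best_cost = path, cost

print("best cost:", best_cost, "via edges:", [e for _, e in best_path])
```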
Vehicle Health Monitoring System In A Developing Nation.
Ebole Alpha F., Prof. Omotosha O. J
Vehicle Health Monitoring in a Developing Nation (VHMDN) is the collection of data relevant to the present and future performance of a vehicle system and its transformation into information used to support operational decisions. The idea was conceived because the Standards Organisation of Nigeria (SON), a body designed to regulate the strength of materials, has not worked to expectation in the country. This paper embraces an integration of sensors, communication technologies, and artificial intelligence to provide vehicle-wide abilities to diagnose problems and recommend solutions, as well as to estimate the total remaining life span of the material. In Africa, the major concentration of vehicle parts is mainly on fairly used components or parts, popularly called "TOKUBO", whose lifetime cannot be quantified; the idea was based on the fact that new parts are very expensive and may be of lower standard or imported from an undeveloped nation. The introduction begins with a brief history of VHM technology development. Recent research has begun to recognize that the VHM problem is fundamentally one of statistical pattern recognition (SPR), and a paradigm to address such a problem is described in detail herein, as it forms the basis for the organization of this theme issue.
Shweta gonde, Assistant Professor Uday Chouarisia, Assistant Professor Raju Barskar
A SURVEY ON WEB IMAGE SEARCH USING RERANKING
Shweta gonde, Assistant Professor Uday Chouarisia, Assistant Professor Raju Barskar
The explosive growth and widespread accessibility of community-contributed media content on the Internet have led to a surge of research activity in multimedia as well as web image search. Approaches that apply text-search techniques to multimedia search have achieved limited success, as they entirely ignore visual content as a ranking signal. Visual reranking, or image reranking, which reorders visual documents based on multimodal cues to improve the initial text-only search, has received increasing attention in recent years. This type of problem is challenging because the initial search results often contain a great deal of noise, and learning knowledge or visual patterns out of such a noisy ranked list to guide the reranking process is difficult. Based on how this knowledge is extracted, we present here a survey of web image search using reranking by means of the connectivity among the images (multimodality), together with a brief description of the various types of reranking methodologies and approaches, in particular image-content-based reranking, and a portrayal of the practical methods which can be used to implement such a system.
A survey on phishing detection and prevention technique
Archit Shukla , Lalit Gehlod
The first and greatest casualty of fraud is trust. According to [7], just over two-thirds (68%) of fraud victims say they are less willing to trust others after their fraud experience, and 63% are less willing to make future investments. As far as people with criminal intentions are concerned, identity theft is a conventional idea: a con man in a police uniform does not cause many victims to become suspicious, and they will comply with whatever they are told. Similarly, phishing is a form of online identity theft that aims to steal sensitive information from users, such as online banking passwords and credit card information. This paper focuses on phishing attacks and includes a study of different contributions of recent research on phishing detection and prevention techniques. In addition, based on this review, a new model for the detection and prevention of phishing attacks is given in this paper.