Design of Low Power Variable Latency Adder and Its Implementation in Decimation Filter
Sasikumar.J N.Kirthika
In this paper, we design a new variable-latency adder and implement it in a decimation filter. There are multiple ways to implement a decimation filter; this design combines a CIC (cascaded integrator-comb) filter and an HB (half-band) filter as the decimator to perform sample-rate conversion, and details the implementation steps needed to realize the design in hardware. A low-power design approach is applied to the adder sections of the CIC and half-band filters. This adder operates at lower power while maintaining the same throughput, and its performance is compared against a conventional adder. The variable-latency (VL) adder can also be modified to overcome the effects of negative bias temperature instability (NBTI) on circuit delay. The filter is designed in VHDL and verified using an FPGA (field programmable gate array) board and the Mentor Graphics tool.
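The integrate-decimate-comb flow of the CIC stage mentioned in the abstract can be sketched in software. This toy single-stage decimator is illustrative Python only, not the paper's VHDL or its variable-latency adder design:

```python
# Minimal single-stage CIC decimator sketch (illustrative; the paper's
# design is a VHDL hardware implementation with a low-power adder).
def cic_decimate(samples, r):
    """Integrate, downsample by r, then comb (first difference)."""
    # Integrator: running sum over the input stream.
    acc = 0
    integrated = []
    for s in samples:
        acc += s
        integrated.append(acc)
    # Decimate by keeping every r-th sample.
    decimated = integrated[r - 1::r]
    # Comb: first difference of the decimated stream.
    prev = 0
    out = []
    for s in decimated:
        out.append(s - prev)
        prev = s
    return out
```

For a constant input, each output sample equals the sum of `r` input samples, which is the expected unit response of a single CIC stage.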
Grid Computing: Issues, Challenges, Need and Practice
Ayushi Pathak Nisha Kaushik
In the last few decades, rapid improvements in computing, data, and communication technology have made the concept of grid computing more prominent. The popularity of the Internet and the availability of high-speed networks have gradually changed the way we do computing. These technologies enable the cooperative use of a wide variety of geographically distributed resources as a single, more powerful computer. This method of pooling resources to solve large-scale problems is called grid computing. Grid computing has evolved not only in topology but also in widespread data communication through resource sharing; however, it is still lacking in trust and reliability. Because a large proportion of people communicate and share data over the grid, the trust users place in it is diminished, and it has not yet fully convinced them. This paper describes the concepts underlying grid computing and the trust model on which its reliability depends.
Network crimes are the most unpredictable calamity of the cyber world. Unauthorized access, hacking, the spreading of viruses, the smashing of computer networks on a very large scale, and brutal weapons such as email bombing and logic bombs that disrupt the behavior of computer networks are just a few recent incidents. Securing information systems against them means protecting their availability, confidentiality, and integrity. The relative priority and significance of availability, confidentiality, and integrity vary from one information system to another. Legislation should address computers and computer material as unique objects and cover all computer-related aspects in the categories of crimes by means of computers and crimes against computers. The Union Government of India has now started to establish special cybercrime units to control such crimes.
Use Of Information Technology In Rural Development
Gosawi P.R., Madan B.S., Chopane V.V,
A rural development programme aims at three things: economic development of the village community; restoration of ecological balance in the village; and improvement of the economic and social conditions of the resource-poor and disadvantaged sections of the village community through access to income-generating opportunities. The speed of technological development in the field of communication has no doubt changed the face of media and society in the past two to three decades. These developments are so fast that social scientists and researchers find it difficult to assess, assimilate, understand, and interpret their impact on different communities and social groups. The development of the Internet and multimedia applications has given new impetus to the external communication of economic and social agents. Not only businesses and public authorities are concerned, but also the not-for-profit, educational, and cultural worlds. This paper defines a step-by-step procedure for developing a rural area in terms of technology.
A DETAILED REVIEW ON INCREASING RELIABILITY OF SOFTWARE BY AN INNOVATIVE APPROACH
K Manideep, A L Siridhara,
Software reliability is an important facet of software quality: the probability of failure-free operation of a computer program for a specified period of time in a specified environment. Software reliability is dynamic and stochastic. It differs from hardware reliability in that it reflects design perfection rather than manufacturing perfection. This article provides an overview of software reliability, which can be categorized into modeling, measurement, and improvement, and then examines different modeling techniques and metrics for software reliability; however, no single model is universal to all situations. The article also surveys the various ways to improve software reliability across the software development life cycle.
A Survey on SQL Injection attacks, their Detection and Prevention Techniques
Nithya, R. Regan, J. Vijayaraghavan
SQL injection is a technique that exploits a security vulnerability in the database layer of an application. The vulnerability is present when user input is incorrectly filtered for string-literal escape characters embedded in SQL statements, or is not strongly typed and is thereby unexpectedly executed. SQL injection smuggles a SQL query or command in as input, typically via web pages. Such attacks occur when data provided by the user is not properly validated and is included directly in a SQL query. By leveraging these vulnerabilities, an attacker can submit SQL commands that access the database directly. In this paper we present all SQL injection attack types, along with the different techniques and tools that can detect or prevent these attacks. Finally, we assess how well current techniques and tools address each SQL injection attack type.
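Parameterized queries are among the standard prevention techniques such surveys cover. This minimal sketch, with a hypothetical `users` table, contrasts a vulnerable string-built query with a placeholder-based one:

```python
import sqlite3

# Hypothetical schema for illustration; not from the surveyed paper.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # Vulnerable: input like "' OR '1'='1" rewrites the query logic.
    q = f"SELECT * FROM users WHERE name='{name}' AND password='{password}'"
    return conn.execute(q).fetchall()

def login_safe(name, password):
    # Placeholders keep the input as data, never as SQL grammar.
    q = "SELECT * FROM users WHERE name=? AND password=?"
    return conn.execute(q, (name, password)).fetchall()
```

The classic tautology attack `' OR '1'='1` succeeds only against the string-built version, since `AND` binds tighter than `OR` and the injected clause is true for every row.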
Gesture Recognition: A Communication and Future to Tablets
Kunal Gandhi, Ayushi Pathak, Nisha Kaushik
The term “gesture” means non-verbal communication through visible body actions that convey a particular message; correspondingly, “gesture recognition” is a field of computer science that aims to recognize human gestures, by means of mathematical algorithms, in order to interpret a particular instruction. Gesture recognition is an initiative from the technology’s end to understand the human perspective through sign language. The algorithm essentially aims to determine what instruction the human intends to give at a particular instant. The recognized gesture is an approximation of the actual instruction in the person’s mind at that moment; that is, the gesture should resemble the instruction the human wishes to give the machine. This reduces complexity for the human.
AN ADAPTIVE TECHNIQUE FOR FINGER CODE GENERATION USING SEGMENTATION WITH SPFB BASED TECHNIQUE
Dr. S. Pannirselvam , P. Raajan ,
Fingerprint recognition and retrieval is a widely used biometric application for identification and security purposes. Various methods have been proposed for fingerprint identification and recognition, and each has advantages as well as disadvantages that make the resulting system only partially efficient. Processing complexity and other issues affect the performance of existing systems and make them inefficient. In this paper we present a Singular Point Feature Block (SPFB) based segmentation of fingerprints for generating a finger code that enables better recognition. The feature set is generated from singularity points such as the core point, delta point, and reference point, together with the global feature entropy. In the first phase, the orientation field is estimated, singular points are detected, and the image is segmented into blocks generated around the singularity points; a fingerprint code is thus obtained from the singular-point features. In the next phase, the feature set is generated and matched against various fingerprint images; this method produces efficient results and is cost-effective. Euclidean distance is used to measure the distance between the features of two images. Images from the standard database FVC2004 DB3 are used for experimentation, and FRR and FAR have been computed for performance evaluation. The proposed approach is compared with existing methods and provides better results.
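The Euclidean-distance matching stage described above can be sketched as follows. The feature vectors here are made-up examples; the paper's actual finger codes come from its SPFB segmentation:

```python
import math

# Illustrative matching stage only; feature vectors are hypothetical.
def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(probe, gallery, threshold):
    """Return ids of gallery codes within threshold of the probe."""
    return [gid for gid, code in gallery.items()
            if euclidean(probe, code) <= threshold]
```

Sweeping the threshold over a labeled gallery is also how FRR/FAR curves of the kind the paper reports are typically produced.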
The use of image processing techniques and computer-aided diagnosis (CAD) systems has been demonstrated to be an effective way to improve radiologists' diagnoses, especially in medical image processing. Screening is justified when there is evidence that it will extend lives at reasonable cost and acceptable levels of risk. In this paper, we present an automatic CAD system for early detection of lung cancer by analyzing chest computed tomography (CT) images. Detecting lung cancer in its early stage helps medical treatment limit the danger. Most traditional medical diagnosis systems are built on huge quantities of training data and take a long time to process. To reduce these problems, a Hidden Markov Model is proposed. This method increases diagnostic confidence and also reduces the time required.
One of the main success factors of grid computing is how quickly and correctly a computer can locate resources such as memory and CPU in the grid. As the size of the grid increases, this search can take a considerable amount of time, which can degrade application performance. In this paper we present a middleware grid component, Grid Resource Software (GRS), that end-user systems can use to identify grid resources in less time.
A NOVEL TECHNIQUE OF REAL TIME VIDEO MONITORING SYSTEM
Dharani.G, Jeevarathinam.K, Kavin.K, Pritiya.S
The project entitled “A Novel Technique of Real Time Video Monitoring System” has been developed using Java as the front end. The main objective of this project is to implement video monitoring in the international banking domain in order to make banks more secure. The project uses five modules: user interface design, video capture, frame grabbing, image comparison, and frame-to-video conversion. The goal of user interface design is to make the user’s interaction as simple and efficient as possible. Video capture is the technology of electronically capturing, processing, storing, transmitting, and reconstructing a sequence of still images representing scenes in motion. The capture is typically a digital image taken by the host operating system or by software running on the computer, but it can also be made by a camera or by a device intercepting the video output of the display. The existing system records all the transactions of a particular day or session, so a user can find a theft only by monitoring the full video records of a location, which is a long process. In the proposed system, recording starts only when changes take place relative to the original image, and stops when the scene matches the original image.
ACTIVE SHAPE MODEL AND TOUCH LESS HAND GEOMETRY FOR PALM PRINT RECOGNITION
Gaganpreet kaur, Sandeep kaur, Dr. Dheerendra singh,
Palm print is a relatively new physiological biometric that has attracted researchers due to its stable and unique characteristics. The rich feature information of the palm print offers a powerful means of personal recognition. Palm print verification systems have long been studied, and many research activities have been carried out on them. This paper introduces an active shape model and touch-less hand geometry for palm print recognition.
DESIGN OF THE MATHEMATICAL MODEL FOR ANALYSIS OF HUMAN RESPIRATORY SYSTEM
Ankit Kajaria,
A proposed model of the human respiratory system is described and analyzed in this article. We introduce a simple measurement technique that can track transient changes in the human respiratory system. The present work provides a methodology for autoregressive exogenous (ARX) modelling of the human respiratory system. The proper model structure and model parameters are determined for the human respiratory system. The estimated model parameters reflect dynamic changes in the respiratory system, so the differences between a normal and a diseased person can be detected using this model. The results are promising, and the models obtained for the physiological system are able to distinguish a normal patient from a diseased patient by diagnosing the change in the model parameters.
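The identification step can be illustrated with a first-order ARX model, y[t] = a·y[t-1] + b·u[t-1], fitted by ordinary least squares. The paper's actual model orders and respiratory data are not reproduced here; this is a minimal stand-alone sketch:

```python
# First-order ARX identification by ordinary least squares.
# Model: y[t] = a*y[t-1] + b*u[t-1]; solve the 2x2 normal equations.
def fit_arx1(u, y):
    s_yy = s_yu = s_uu = s_y1y = s_u1y = 0.0
    for t in range(1, len(y)):
        s_yy += y[t - 1] * y[t - 1]    # sum of y[t-1]^2
        s_yu += y[t - 1] * u[t - 1]    # cross term y[t-1]*u[t-1]
        s_uu += u[t - 1] * u[t - 1]    # sum of u[t-1]^2
        s_y1y += y[t - 1] * y[t]       # regressor-output products
        s_u1y += u[t - 1] * y[t]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_uu * s_y1y - s_yu * s_u1y) / det
    b = (s_yy * s_u1y - s_yu * s_y1y) / det
    return a, b
```

On noise-free data generated by the model itself, the fit recovers the true parameters exactly; on measured respiratory data, the estimates track the dynamic changes the abstract refers to.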
Purelet approach and ICA Based Poisson Noise Reduction in MRI Data Set
Dr.S.Vasuki, P.Karthikeyan, G.Akshaya Karthika
In this paper we propose a hybrid method for Poisson noise removal in MRI (magnetic resonance imaging) datasets. Care must be taken when reducing Poisson noise, because it is strongly signal-dependent. The performance of Independent Component Analysis (ICA) and PURE-LET (Poisson unbiased risk estimation, linear expansion of thresholds) is assessed directly. In PURE-LET we use a linear expansion of thresholds and approximate the mean square error to derive a reliable transform-domain thresholding. The hybrid method combines ICA and the PURE-LET method: first, ICA is applied for dimensionality reduction of the multivariate data; subsequently, wavelet-threshold denoising is applied. Performance is compared in terms of PSNR (peak signal-to-noise ratio) and denoising speed.
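The PSNR figure of merit used for the comparison has a standard definition. A plain reference implementation for 8-bit images represented as flat lists:

```python
import math

# Standard PSNR for 8-bit data: 10*log10(peak^2 / MSE).
def psnr(original, denoised, peak=255.0):
    mse = sum((a - b) ** 2
              for a, b in zip(original, denoised)) / len(original)
    if mse == 0:
        # Identical images: PSNR is unbounded by convention.
        return float("inf")
    return 10.0 * math.log10(peak * peak / mse)
```

Higher PSNR between the clean reference and the denoised output indicates better noise removal, which is how the abstract's ICA, PURE-LET, and hybrid variants would be ranked.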
A Modified Version Of Extended Playfair Cipher (8x8)
Gaurav Shrivastava Manoj Chouhan, Manoj Dhawan
In this paper we present some modifications to the Playfair cipher substitution technique. We propose a method to enhance the Playfair cipher for more secure and efficient cryptography. We use an 8x8 Playfair cipher, apply a simple columnar transposition technique with multiple rounds on top of the 8x8 Playfair substitution, and arrange the special symbols based on an overall character frequency analysis (CFA).
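The columnar transposition stage layered on top of the 8x8 Playfair substitution can be sketched as follows. The Playfair step itself is omitted, and the numeric-key convention here is an assumption, not the paper's exact scheme:

```python
# Simple columnar transposition with multiple rounds (illustrative).
# key is a permutation of column indices, e.g. [1, 0] for two columns.
def columnar_transpose(text, key, rounds=1):
    """Write text row-wise into len(key) columns; read columns in key order."""
    for _ in range(rounds):
        cols = ["" for _ in key]
        for i, ch in enumerate(text):
            cols[i % len(key)] += ch   # fill columns row by row
        # Read out whole columns in the order given by the key.
        text = "".join(cols[k] for k in key)
    return text
```

Running several rounds, as the paper does, compounds the scrambling: each round permutes the positions produced by the previous one.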
Secure Authentication Methods for Preventing Jamming Attacks In Wireless Networks
Y. Madhavi Latha, P. Rambabu,
The open nature of the wireless medium leaves it vulnerable to intentional interference attacks, typically referred to as jamming. This intentional interference with wireless transmissions can be used as a launch pad for mounting denial-of-service attacks on wireless networks. Typically, jamming has been addressed under an external threat model. However, adversaries with internal knowledge of protocol specifications and network secrets can launch low-effort jamming attacks that are difficult to detect and counter. In this work, we address the problem of selective jamming attacks in wireless networks. In these attacks, the adversary is active only for a short period of time, selectively targeting messages of high importance. We illustrate the advantages of selective jamming in terms of network performance degradation and adversary effort by presenting two case studies: a selective attack on TCP and one on routing. We show that selective jamming attacks can be launched by performing real-time packet classification at the physical layer. To mitigate these attacks, we develop three schemes that prevent real-time packet classification by combining cryptographic primitives with physical-layer attributes. We analyze the security of our methods and evaluate their computational and message overhead.
Subscriber Theoretic Pricing for Video Streaming In Mobile Networks
P. Supraja, , Sd. Afzal Ahmed , P. Babu P.Radhika,
Mobile video surveillance represents a new paradigm that encompasses, on one side, ubiquitous video acquisition and, on the other, ubiquitous video processing and viewing, addressing both computer-based and human-based surveillance. Mobile phones are among the most popular consumer devices, and the recent developments of 3G networks and smartphones enable users to watch video programs by subscribing to data plans from service providers. Due to the ubiquity of mobile phones and phone-to-phone communication technologies, data-plan subscribers can redistribute the video content to non-subscribers. Such a redistribution mechanism is a potential competitor for the mobile service provider and is very difficult to trace given users' high mobility. We analyze the optimal price setting for the service provider by investigating the equilibrium between the subscribers and the secondary buyers in the content-redistribution network. We model the behavior between the subscribers and the secondary buyers as a noncooperative game and find the optimal price and quantity for both groups of users. Based on the behavior of users in the redistribution network, we investigate the evolutionarily stable ratio of mobile users who decide to subscribe to the data plan. Such an analysis can help the service provider preserve his/her profit under the threat of the redistribution networks and can improve the quality of service for end users.
An Adaptive Architecture for Autonomic Orchestration of Web Services
Mrs.C.Sathya, Mrs.T.Hemalatha
Web services are an emerging paradigm in which loosely coupled software components are published, located, and invoked on the web as part of distributed applications. The advantage of web service composition is the ability to create and consume a value-added service by composing, in an autonomic manner, simple and complex software components deployed at different locations. Centralized web service composition approaches suffer from performance bottlenecks and a single point of failure; moreover, web services are distributed across geographical boundaries and may be removed or upgraded at any time. The solution to these problems is dynamic composition of web services. The proposed framework can handle any kind of user, irrespective of role, as either a provider or a consumer of the architecture. The proposed system composes related web services autonomically and on the fly, thereby achieving dynamic composition.
Telephone services supported over a LAN access network are an emerging service. A software implementation of a private branch exchange on the LAN allows the attached telephones to make calls and to connect to other telephone services, including Public Switched Telephone Network, Voice over Internet Protocol, and Voice over LAN services. It is well known that IEEE 802.11 WLAN is highly inefficient for transporting voice data, so boosting voice capacity over multiple WLANs is an area that deserves further attention from the research community. Accordingly, the proposed scheme uses simple software upgrades to enhance VoIP services.
Survey Paper on Training of Cellular Automata for Image
Swati Chauhan
A cellular automaton (CA) is a system of finite automata that provides a discrete computational model for understanding the complex behaviour of images (computational and real). This article surveys how researchers can train cellular automata to learn the best rules for reaching an optimal solution in a large search space. Many researchers have provided different methods (SFFS, GA-based, 3-state representation, etc.) for training. A CA trained with a genetic algorithm (GA) is capable of performing various difficult tasks. This survey also introduces the different fields where CA are used with modification techniques (B-rules CA, 2-cycle CA).
Cloud storage enables users to store their data remotely and enjoy on-demand, high-quality cloud applications without the burden of local hardware and software management. Though the benefits are clear, such a service also relinquishes users' physical possession of their outsourced data, which inevitably poses new security risks to the correctness of the data in the cloud. To address this problem and achieve a secure and dependable cloud storage service, we propose in this paper a new cryptosystem for fine-grained sharing of encrypted data named ciphertext-policy attribute-based encryption (CP-ABE). Ciphertexts are labeled with sets of attributes, and private keys are associated with access structures that control which ciphertexts a user is able to decrypt. Attribute-based encryption (ABE) has been envisioned as a promising cryptographic primitive for realizing secure and flexible access control. The proposed scheme extends the attribute-set-based encryption (ASBE) algorithm with a hierarchical structure to improve scalability and flexibility while inheriting ASBE's fine-grained access control. At the same time, the integrity of the data in cloud storage is ensured using MD5 (Message-Digest Algorithm 5) and the Reed-Solomon algorithm.
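The MD5 integrity check mentioned at the end of the abstract can be sketched with the standard library. The surrounding storage protocol (who keeps the digest, when it is verified) is an assumption for illustration:

```python
import hashlib

# Client-side integrity sketch: keep a digest locally, verify the blob
# the cloud store returns later against it.
def digest(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

def verify(data: bytes, expected: str) -> bool:
    return digest(data) == expected

blob = b"outsourced record"
tag = digest(blob)   # stored by the data owner before upload
```

Any bit-flip in the retrieved blob changes the digest, so `verify` fails; Reed-Solomon coding, as in the abstract, would then let the damaged portion be reconstructed.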
Multiple Benefits Through Compression For Large Data Stored In Cloud
M.Ravi Kumar, S.Manoj Kumar
As IT infrastructures continue to grow, cloud computing is a natural extension of virtualization technologies that enables scalable management of virtual machines over a plethora of physically connected systems. Cloud computing provides on-demand access to computational resources which, together with pay-per-use business models, enables application providers to scale their services seamlessly. Cloud computing infrastructures allow creating a variable number of virtual machine instances depending on application demand. However, even when large-scale applications are deployed over pay-per-use, high-performance cloud infrastructures, cost-effective scalability is not achieved, because idle processes and resources (CPU, memory) go unused but are still charged to application providers. Over- and under-provisioning of cloud resources are still unsolved issues. Here we present data compression techniques for squeezing the data that is about to be deployed on cloud servers in order to reduce its size. These compression techniques are valuable when managing large amounts of data, and are especially useful for cost reduction in industries that maintain huge data warehouses and in big educational universities. This work attempts to establish formal measurements for under- and over-provisioning of virtualized resources in cloud infrastructures, specifically for SaaS (software as a service) platform deployments, and proposes a resource allocation model for deploying SaaS applications over cloud computing platforms that takes their multitenancy into account, thus creating a cost-effective, scalable environment. The aim of this paper is therefore twofold: first, to evaluate cloud security by compressing data in encrypted form to ensure sufficient security; and second, to present a viable solution that creates a cost-effective cloud environment for large-scale systems.
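A compress-before-upload step of the kind described can be sketched with zlib standing in for whichever compressor the deployment actually chooses (the paper does not fix a specific codec):

```python
import zlib

# Squeeze a record before it is deployed to cloud storage, and restore
# it on retrieval. zlib is an illustrative stand-in codec.
def pack(data: bytes) -> bytes:
    return zlib.compress(data, level=9)

def unpack(blob: bytes) -> bytes:
    return zlib.decompress(blob)

record = b"warehouse row," * 1000   # repetitive data compresses well
blob = pack(record)
```

For the redundant, structured data typical of warehouses, the stored size drops sharply, which is the cost saving the abstract argues for; compression of already-encrypted data, by contrast, yields little, so compress-then-encrypt is the usual ordering.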
Performance Analysis of Content Based Image Retrieval
Arul Murugan A, Jothi Vignesh C, Nijanthan S, Rajasekar Vignesh E,
This report provides a comprehensive justification of the architectural design of the content-based image retrieval system, which adopts a plug-in framework. It presents a thorough explanation of how diverse image feature extraction algorithms were implemented and adopted seamlessly in the system; moreover, it also describes how these algorithms are dynamically applied when the user modifies the query parameters. Furthermore, it illustrates the system's performance and accuracy by providing performance metrics and the outcomes of comparisons with other systems. Finally, it reports the work contributions of team members and the project management disciplines used to make the project a success.
A Novel Cloud Computing Architecture Supporting E-Governance
M.Shahul Hameed, A.Appunraj, Dr.T.Nalini
In this paper, slave and master data centers are used as a technique to spread a network service workload between two or more devices. Benefits include scalability, reliability, efficiency, redundancy, and minimized response time. The large number of service requests needed to fulfill the demands of millions of users worsens the latency problem: the cloud service provider may be physically far away from its clients, compelling data to travel through several mediums and pieces of network equipment. In this paper we therefore propose an intelligent and energy-efficient cloud computing architecture based on distributed data centers (DDC), which form a client's instance in the nearest neighborhood and fulfill the client's requests with optimized latency.
In telecommunications, 4G is the fourth generation of mobile communication technology standards, a successor of the third generation (3G) standards. A 4G system provides mobile ultra-broadband Internet access, for example to laptops with USB wireless modems, to smartphones, and to other mobile devices. Conceivable applications include amended mobile web access, IP telephony, gaming services, high-definition mobile TV, video conferencing, 3D television, and cloud computing. A recently proposed concept for achieving this in 4G standards, i.e. LTE-Advanced, is cooperative communication. Cooperative communication allows a single-antenna mobile (i.e. a single user) to temporarily (logically) share the antennas of other users in the system, creating a virtual multiple-antenna array that achieves diversity gain and the other benefits of multiple-input multiple-output (MIMO) systems in a cost-effective manner. The scope of this research paper is to evaluate the performance of the decode-and-forward cooperative communication protocol in terms of its bit error rate (BER), throughput, and outage probability.
While enabling interoperation with the Internet brings tremendous opportunities in service creation and information access, the security threat of the Internet also dauntingly extends its reach. In this paper, we wish to enlighten the community that the long-realized risk of interoperation with the Internet is becoming a reality: smartphones, interoperable between the telecom networks and the Internet, are dangerous conduits for Internet security threats to reach the telecom infrastructure. The damage caused by subverted smartphones could range from privacy violation and identity theft to emergency call center DDoS attacks and national crises. We also propose techniques spanning a solution space that includes smartphone hardening approaches, Internet-side defense, telecom-side defense, and coordination mechanisms that may be needed between the Internet and telecom networks.
Demand for broadband services is continuously exploding, so mobile wireless networks must greatly expand their capacities. Here we review the economic and technical challenges wireless networks face in meeting this exploding demand. The paper first gives a brief technical background on mobile wireless networks and the basic methods for deepening their capacity. Methods of capacity expansion fall into three general categories: the deployment of more radio spectrum; more intensive geographic reuse of spectrum; and increasing the capacity of each MHz of spectrum within a given geographic area. We find that, without significantly increased allocations of spectrum, wireless capacity expansion will be wholly inadequate to accommodate expected demand growth.
This paper describes several existing data link layer protocols that provide real-time capabilities on wired networks, focusing on token-ring and Carrier Sense Multiple Access based networks. Existing modifications to provide better real-time capabilities and performance are also described. Finally the pros and cons regarding the At-Home Anywhere project are discussed. The ability to stream data between web-services is vital for the implementation of complex workflows on the Grid. In this work we investigate various protocols as to their suitability for streaming, taking into consideration issues such as security, reliability and speed over different Grid configurations. To perform these comparisons we have implemented a Server/Client web service that provides a simple API through which Java applications can stream data. As an added bonus, this architecture can be reused by scientific programmers who want to stream data without having to deal with protocol or web service trivialities. Finally, an evaluation of streaming as an alternative to file transfer on Grid environments is offered, based on a number of test case scenarios. Real-time Transport Protocol (RTP) provides a mechanism for sending real-time data such as video and multimedia. Compressing the RTP data packets and coupling the result with Asynchronous Transfer Mode (ATM) technology provides a means to deliver real-time application data over a network
Multiuser Short Message Service Based Wireless Electronic Notice Board
Gowtham.R Kavipriya.K , Kesavaraj.G , Nathe Aen Mr. S.Maragatharaj,
This is a model for displaying notices on an electronic notice board in colleges by sending messages as SMS from a mobile phone; it is a wireless transmission system with very few errors and little maintenance. The hardware board contains the AT89C52 microcontroller at the heart of the system. The microcontroller is interfaced with a GSM modem via a MAX232 level converter, which converts RS232 voltage levels to TTL voltage levels and vice versa. The hardware also has a 64K EEPROM chip, the AT24C64, used to store the timings and messages to be displayed, and a DS1307 real-time clock to keep track of time. A 16x2 character LCD display is attached to the microcontroller for display. Microcontroller coding is done using Embedded C and Keil; PC coding is done using Visual Basic. Multiple users are authorized, via passwords, to update notices on the electronic notice board, and a PC with an administrator can be used to monitor the system.
This study presents the processes undertaken in the design and development of an intelligent omnidirectional mobile robot using four custom-made mecanum wheels. Each mecanum wheel developed consists of nine rollers made from Delrin. All mecanum wheels are independently powered using four precision-gear DC motors, and the wheel/motor assemblies are mounted directly to the robot chassis, made from an aluminum frame. A four-channel, high-power H-bridge using two LMD18200 motor driver ICs was designed, built, and interfaced to a Basic Stamp (BS2) microcontroller board. A basic mobility algorithm using the Basic Stamp software was developed to test the basic mobility capabilities and test the qualitative view of the
An R-Tree Node Splitting Algorithm Using MBR Partition for Spatial Query
Dr.V.Khanaa, Dr.Krishna Mohanta
The optimization of spatial indexing is an important issue, considering that spatial databases in such diverse areas as geographical, CAM, and image applications are growing rapidly in size and often contain on the order of millions of items. To handle these multi-dimensional data, the R-tree is widely used as the data structure. The node splitting algorithm used in the R-tree process affects query performance and results in an inefficient R-tree structure, as it generates uneven nodes. To overcome these drawbacks, we propose an algorithm that balances uneven node splitting to meet the demands of the R-tree process. The proposed algorithm inserts the node into a sibling instead of splitting or re-inserting the overflowing node, which reduces the overhead of the splitting process, the tree-adjustment operations, and the number of disk accesses.
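The MBR bookkeeping that any R-tree splitting or sibling-redistribution heuristic relies on can be sketched as follows. The rectangle convention `(xmin, ymin, xmax, ymax)` is an assumption for illustration, not the paper's notation:

```python
# Minimal MBR helpers of the kind R-tree insertion/splitting uses.
# A rectangle is (xmin, ymin, xmax, ymax).
def mbr_union(a, b):
    """Smallest rectangle covering both a and b."""
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def enlargement(node_mbr, entry_mbr):
    """Extra area needed for node_mbr to absorb entry_mbr;
    the usual cost when choosing a node (or sibling) for an entry."""
    return area(mbr_union(node_mbr, entry_mbr)) - area(node_mbr)
```

Redirecting an overflowing entry to the sibling with the smallest `enlargement`, rather than splitting, is one way to realize the paper's balanced-split idea.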
Graph Clustering In Social Networks Based On Attribute Similarities
S. Kalaichezhian
The goal of graph clustering is to partition a large graph into clusters based on node similarities and graph structure. Grouping of similar data is done based on measures such as the distance between two vertices, the weight of a node, etc. Here a service-oriented architecture is used in a web service to obtain data from different web pages effectively and efficiently. The obtained data are then clustered based on attribute similarity. The final clustered graph is used to produce a statistical study about an organization. The clusters help to identify different communications and interactions between community members, and thus to identify a community, or a group of members, in a social network. In this paper, we use a modified similarity-based graph clustering algorithm, based on the distance or difference between two web pages and the weight of the node, to group the data set.
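A minimal sketch of combining structural similarity and attribute similarity, as the abstract suggests; the Jaccard measure and the mixing weight `alpha` are illustrative assumptions, not the paper's exact formula:

```python
def jaccard(a, b):
    """Jaccard similarity of two sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def combined_similarity(neigh_u, neigh_v, attrs_u, attrs_v, alpha=0.5):
    """Weighted mix of structural similarity (shared neighbours)
    and attribute similarity between two nodes."""
    return alpha * jaccard(neigh_u, neigh_v) + (1 - alpha) * jaccard(attrs_u, attrs_v)
```

Pairs scoring above a chosen cutoff would be placed in the same cluster; tuning `alpha` trades off topology against attributes.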
Identifying High Throughput Paths in Wireless Mesh Networks with Bandwidth Guarantees
Regan. R, Divya.K, Madheswari.R,
Wireless mesh networks (WMNs) have emerged as a key technology for next-generation wireless networking. Because of their advantages over other wireless networks, WMNs are undergoing swift progress and inspiring numerous applications. To accelerate hop-by-hop routing, we develop a mechanism for calculating the available bandwidth of a path in a distributed manner. Unfortunately, available bandwidth is not isotonic, the property required for reliable hop-by-hop routing. To solve this problem, we introduce an isotonic parameter that captures the available bandwidth metric, so that packets can consistently traverse the maximum-bandwidth path according to the routing tables constructed in the nodes along the path. To the best of our knowledge, our protocol is the first WMN hop-by-hop routing scheme that can identify bandwidth-assured paths.
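A common isotonic proxy for path bandwidth is the bottleneck (minimum) link bandwidth: taking the minimum over a path's links preserves ordering when paths are extended, so hop-by-hop routing tables stay consistent. A sketch of this idea, illustrative rather than the paper's protocol:

```python
def path_bandwidth(path, link_bw):
    """Available bandwidth of a path = bandwidth of its bottleneck link."""
    return min(link_bw[(u, v)] for u, v in zip(path, path[1:]))

def best_path(paths, link_bw):
    """Pick the path with the maximum bottleneck bandwidth."""
    return max(paths, key=lambda p: path_bandwidth(p, link_bw))
```

With a max-min metric like this, each node can forward toward the widest path using only local routing-table entries.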
Improved Security Architecture for Upkeep of Routing Services on Ad Hoc Networks
Regan. R, Ilakkiya veera, Kousalya.J,
Nowadays, people depend increasingly on crucial applications and wireless networks that can be used anytime and anywhere. The presence of active routing protocols enables ad hoc networks to form quickly. WANETs suffer from security attacks and intrusions even under several defense mechanisms. To provide both secure options and network procedures, we present SAMNAR, a Survivable Ad hoc and Mesh Network Architecture. Its objective is to provide essential preventive, reactive and tolerant security structures adaptively under damage and disturbance. Using SAMNAR, we design a path selection scheme for wireless ad hoc network routing. Our results show that survivability of routing services is achieved even under several attacks and intrusions.
Secure Migration of Various Database over A Cross Platform Environment
R.Vinodha, Mr.R.Suresh
In this paper we aim to bring up the idea of a cross-platform environment to compare heterogeneous database systems, which may use different ways of storing data, such as file formats, protocols and query languages. The biggest challenge for any organization occurs during the migration of large databases, which can easily run to terabytes. We consider different RDBMSs, i.e. Oracle Database 10g, DB2, SQL Server, MySQL, MS Access, etc., for which high-speed migration was not previously available, and migration performance is increased. This migration is independent o
An Artificial Neural Network Based Vessel Detection On The Optic Disc Using Retinal Photograph
Sharmila .S,
Diabetic retinopathy is caused by complications of diabetes and can eventually lead to blindness. It affects up to 80% of all patients who have had diabetes for 10 years or more. Despite these statistics, research indicates that at least 90% of new cases could be reduced with proper and vigilant treatment and monitoring of the eyes. The longer a person has diabetes, the higher the chances of developing diabetic retinopathy. The aim of the project is to detect abnormal vessels in the optic disc of the human eye, and so help prevent eye-related disease, by measuring features (shape, position, orientation, brightness, contrast) and applying segmentation; by substituting the values of the feature measurements, the vessels are detected. The existing system uses a support vector machine (SVM) to categorize each segment as normal or abnormal. The SVM is used to analyze data and recognize patterns, but it cannot detect the vessels automatically, its accuracy is unclear, and the prediction of disease needs better knowledge. The proposed system uses a neural network algorithm to train on the features and predict the disease; it is accurate and faster, and can find the disease based on ranking its features.
Tracking System for Wireless Devices in Wifi Environment
Deepali Khatwar, Vaishali Katkar
Growing convergence among mobile computing devices and embedded technology sparks the development and deployment of “context-aware” applications, where location is the most essential context. In this paper, we introduce a forensic surveillance tool for wireless networks. Our system can reveal the locations of WiFi-enabled mobile devices within the coverage area of a high-gain antenna. It features a mobile design that can be quickly deployed to a new location for instant usage without training. We present a comprehensive set of theoretical analyses and experimental results which demonstrate the coverage and localization accuracy of the design framework.
Design of Ultra High Band Pass Filter By Using Dielectric Resonator
Rajni Yadav, Kavita Dagar, Bhanu Yadav,
This paper presents the design of a bandpass filter using a combination of a simple transmission line and cylindrical dielectric resonators for wideband applications. Four dielectric resonators of the same high-permittivity material (FR4 epoxy), each with a diameter of 1 mm, are identified as contributing to the ultra-wide bandwidth of the filter. This new approach increases the coupling effect while minimizing the insertion loss in the passband. Simulation results agree closely with the measured values. To show that the new approach contributes more advantages and is viable at the desired application band, the return and insertion losses of the filter were analyzed. Bandpass filters play a significant role in wireless communication systems: transmitted and received signals have to be filtered at a certain center frequency with a specific bandwidth. In designing microstrip filters, the first step is to carry out an approximate calculation based on lumped components such as inductors and capacitors. After obtaining the required specifications, we realized the filter structure with the parallel-coupled technique. Experimental verification shows how closely the theoretical results match the measurements.
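The "approximate calculation based on lumped components" mentioned above typically starts from the resonant frequency of an ideal LC tank, f0 = 1/(2π√(LC)). A small sketch of that first-step calculation, with illustrative component values:

```python
import math

def center_frequency(L, C):
    """Resonant (center) frequency of an ideal LC tank:
    f0 = 1 / (2 * pi * sqrt(L * C)), with L in henries and C in farads."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Illustrative values: L = 1 nH, C = 1 pF gives f0 of roughly 5 GHz
f0 = center_frequency(1e-9, 1e-12)
```

In practice the lumped prototype only sets the starting point; the distributed (microstrip plus resonator) structure is then tuned in simulation.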
An Efficient Framework for Name Disambiguation In Digital Library
J. Pricilla
In a digital library, a number of authors may have the same name, yet belong to different domains. This leads to name ambiguity while searching for books in the digital library. To address this problem, a unified probabilistic framework is proposed. Using this framework, book titles are analyzed to find the similarity as well as the strength of the relationship between them. Book titles with strong relationships and higher similarity are grouped into the same cluster, and this process continues until all the books in the library are clustered. Users can get results by domain-name search in addition to author-name search, obtaining results by specifying both the author name and the domain. For this purpose, the data in the digital library are also partitioned according to the domains of the authors, and hence authors with the same name are determined easily.
Extraction of HTML Documents From Heterogeneous Web Pages Using Clustering Techniques
Sruthi Kamban K.S, M.Sindhuja,
The World Wide Web is a vast and rapidly growing source of information. Most of this information is in the form of unstructured text, which makes the information hard to query. To make queries easy and to provide results accurately, template extraction is used. In the existing system, the techniques used to extract the data are not efficient and suffer from delay, poor accuracy, and duplicate data. The proposed system uses a hypergraph technique for extracting templates from a large number of web documents generated from heterogeneous templates, making web search more efficient in terms of cost, performance and time. In addition, the proposed approach makes use of a clustering technique to retrieve web documents based on the similarity of the underlying template structures, so that the template for each cluster is extracted simultaneously, providing a goodness measure with fast approximation for clustering.
Dynamic loading is widely used in designing and implementing software. Its benefits include modularity and generic interfaces for third-party software such as plug-ins. Using dynamically loaded components requires local file-system access on the end host. The following problems occur in local and remote dynamic component loading. In a local system, the file may not exist in the specified path or in the specified search directories, allowing the components to be hijacked. In a remote system, the browser may automatically download arbitrary files to the user's Desktop directory without any prompting, a vulnerable program may start up via a shortcut, or an archive file may contain a document and a malicious component. In the existing system, the administrator has to analyze the profile to check for unsafe components. The proposed system lets the user construct a profile for unsafe components.
To avoid unexpected equipment failures and obtain higher diagnostic accuracy in the predictive maintenance of induction motors, an on-line health monitoring system plays an important role in improving system reliability and availability. Among different fault detection techniques, work on motor current signature analysis using only stator current spectra has been well documented. In addition, recent developments in MEMS technology show an increasing trend toward integrating vibration analysis for fault diagnostics. Vibration-based detection using accelerometers is gaining popularity due to high reliability, low power consumption, and low cost. An electric machine monitoring system based on a wireless sensor network (ZigBee™/IEEE 802.15.4
In this generation, elevators play a vital role in day-to-day human activities, and group elevator scheduling is efficient for mid-rise and high-rise buildings. This paper solves the group elevator scheduling problem with advance traffic information. A formulation is developed in which we schedule elevator processing based on the following priorities: first, a PIR sensor, which works based on the number of persons waiting for the elevator; second, an RFID card reader, which gives priority to higher officials who carry the card so they get the elevator faster; and finally, an emergency mode, which takes the highest priority over the other signals. In existing work, detailed car dynamics are embedded in simulation models for performance evaluation. Taking advantage of advance information, a new door action control method is suggested to increase the flexibility of elevators. In view of the existing system, all the methods mentioned above are combined into a single process.
Communication has been made possible by the human desire to communicate from one corner of the world to another. Along the way, we have used many resources for communication and reached out around the globe. Today, communication has been developed for various sectors of a country. In India, primarily in the education sector, many schools and colleges provide internet facilities for students. Our paper deals with a security protocol, a secure educational network to educate students without restricting the internet as the only medium of education. We call it the Secure Data Monitor Protocol.
Every one of us is aware of the five basic senses: seeing, feeling, smelling, tasting and hearing. These senses have evolved through millions of years. Whenever we encounter a new object or experience, our natural senses try to analyse that experience, and the information obtained is used to modify our interaction with the environment. But in this new age of technologies, the most important information that helps one make the right decision is something that cannot be perceived and analysed by our natural senses: the data in digital form, available to everyone through sources like the internet. The sixth sense technology concept is an effort to connect this data in the digital world to the real world.
Mobile application development has entered a new stage, largely driven by the advent of highly influential iOS- and Android-based smartphones and tablet devices. In sales, smart mobile devices are outpacing conventional computer clients. While mobile app development is still primarily the province of consumer applications, the time has come for enterprise development teams to prepare their applications to run on intelligent mobile devices. There are a few major categories of mobile app delivery available today: native apps, running directly on the device; web-based apps, employing the device's web browser; and hybrids of the native and web-based types. Each approach is suitable for different environments, and each has pros and cons in comparison with the others. Many factors play a part in deciding a mobile development strategy, such as development skills, required device functionality, the importance of security, offline capability, interoperability, multiplatform support, and deployment method, and these must be taken into account.
Heterogeneous Mobility Management and Vision for Wireless Networks
Madhurima V, T.Venkat Narayana Rao, Lallu Nayak
The enormous demands of the market are pushing the development of mobile communications faster than ever before, leading to the emergence of new advanced techniques. This paper describes the need for enhancements in mobility management for current and future communication networks, and the integration of these heterogeneous networks for smooth handoff and better quality of service, in the context of the next evolutionary step for wireless communication networks. IP-based service technologies are becoming increasingly important in wireless communications. Cellular networks will be used as an access method to the Internet and other IP-based networks. The macro- and micro-mobility solutions for Mobile IP are analyzed, and a comparative study is done among the HMIP, Cellular IP and HAWAII protocols.
Software Quality Prediction: A Review and Current Trends
P. Ranjeet Kumar *, R. Ramesh, T.Venkat Narayana Rao, Shireesha Dara
Software engineering is the discipline of analysis, design, development and maintenance of software. Quality is the main determinant of the success of a software design. Work on the testability of components, and of component-based software, has proposed several techniques for increasing the testability of component-based software systems. This work aims at reviewing these techniques to understand their similarities and differences, which helps in evaluating the proposed techniques according to their contribution to solving the concerned problems. The major quality attribute of a software product is the degree to which it can be relied upon to perform its intended function. Evaluation, prediction, and improvement of this attribute have been of concern to designers and users of computers and software from the early days of their evolution. The idea behind code reuse is that a partial or complete computer program written at one time can be incorporated into another program at a later time. Simulators also calculate the mean relative error between the original effort and the calculated effort. Selecting the right software tool is one of the most important decisions taken by a company; the success of the company depends predominantly on it. Moreover, it is hard to recognize which tool is most appropriate, and technology transfer is an issue of major significance for organisations wishing to use reuse technology. This paper gives an overview of software models, software reuse technology, and the trends currently in use.
In recent years, reversible logic has emerged as a promising technology with applications in low-power CMOS, quantum computing, nanotechnology, and optical computing. The basic gates such as AND, OR, and EXOR are not reversible. This paper presents various designs of reversible logic gates used for reversible operation and their application in carry-bypass and carry-select adder blocks. It also includes simulation results of forward and backward computation of the reversible TSG, Fredkin and Toffoli gates. These gates are then used to design four-bit carry-bypass and carry-select adder blocks. The design methodology uses Tanner Tool version 14.1 and a 0.25-micron technology file. It is shown that the adder architectures designed using the TSG, Fredkin and Toffoli gates are better optimized than the existing four-bit carry-bypass and carry-select adders in terms of power dissipation. Four new reversible gates are also proposed in this paper.
Protection of Data Base Security via Collaborative Inference Detection
Alok Kumar Shukla, Ajay Sharma, Dharmendra Kumar,
In many applications, such as defense, commercial and marketing departments, we need a strongly secured database. Database security is needed in order to protect the identity and authentication process of users. We propose a novel security mechanism to overcome inference problems and risks in securing the database. Our approach detects inference violations for both single users and multiple users. An agent is located between the user's input query and the database. This process achieves high authorization, communication accuracy and trust in communication, and prevents data leakage by inference. Here the work is focused on access to employee information. The inference probability for each employee keeps increasing with each query request. When a user poses a query, the detection system examines the user's query log for the last three days and calculates the probability. If the probability exceeds the specified threshold, the query is denied for that day. Also, to monitor activities, the security officer can generate a log.
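One simple way to model the growing inference probability and threshold-based denial described above is to combine the disclosure probabilities of recent queries. The combination rule (independent partial disclosures) and the threshold value here are illustrative assumptions, not the authors' exact formula:

```python
def should_deny(query_log, new_query_prob, threshold=0.8):
    """Deny the query if the accumulated inference probability over the
    recent log (e.g. the last three days) would exceed the threshold.
    Independent partial disclosures combine as 1 - prod(1 - p)."""
    combined = 1.0
    for p in query_log + [new_query_prob]:
        combined *= (1.0 - p)
    return (1.0 - combined) > threshold
```

Each answered query raises the accumulated probability, so a user asking many small, individually harmless queries is eventually blocked.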
Biometric recognition is the recognition of an individual using biometric features, such as iris, face, palm, fingerprint, voice, etc. Each biometric has its strengths and weaknesses, and the choice of biometric feature depends on the requirements of the system. In this paper, the fingerprint feature is used, and the various steps for recognising an individual are described, along with the strategies/methods used for each step of fingerprint recognition. Fingerprint recognition has many applications, such as user identification, time-and-attendance systems, device access control, forensic identification, etc.
Role Based Query Modification for Relational Database systems
Ahmed H. Araf, Nawal El-Fishawy, Mervat M. Mousa
This paper presents a role-based query-modification access control model, implemented at the database level, by which fine-grained access control is handled by the underlying DBMS. The proposed model is a combination of the role-based access control (RBAC) model and the case-statement query modification algorithm. While RBAC is used to specify the security policy for an organization, queries are modified to reflect the policy rules specified by the RBAC model. The proposed model provides cell-level granularity for relational database access control through a database-level implementation that can't be bypassed and is independent of the underlying DBMS. It considers insert, update and delete statements in the modification. It reduces the database size required to store the privacy metadata, which improves performance by reducing the execution time of a given query. It also simplifies security administration and the maintenance of users and their security policies.
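A case-statement query modification can be sketched as a rewrite that wraps each protected column in a CASE expression, returning NULL for cells the role's predicates do not allow. The SQL shape generated below is a simplified illustration of the approach, not the paper's full algorithm:

```python
def modify_query(column, table, role_predicates):
    """Rewrite 'SELECT column FROM table' so each cell is masked (NULL)
    unless one of the role's predicates allows it."""
    whens = " ".join(f"WHEN {pred} THEN {column}" for pred in role_predicates)
    return f"SELECT CASE {whens} ELSE NULL END AS {column} FROM {table}"
```

Because the policy is folded into the query text itself, every access path through the DBMS sees the same cell-level restriction.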
A Review on TCP Westwood Protocol for Simulated and Internet Environments
Vijay P Reshamwala, Kaushika D Patel
This paper discusses various versions of TCP, their congestion control algorithms, changes to TCP's existing congestion control, and the limitations of older TCP variants. We discuss a newer version of TCP, called TCP Westwood, with a sender-side modification of the congestion window control scheme: TCP Westwood continuously estimates, at the sender side, the packet rate of the connection by monitoring the ACK reception rate. We review the comparison of the performance of TCP Reno with TCP Westwood over good and lossy links, and discuss the fairness and friendliness issues of TCP Westwood. We also discuss a mechanism called agile probing that improves the startup performance of TCP Westwood in the slow-start phase, as well as its performance in the congestion-avoidance phase and when a large amount of bandwidth suddenly becomes available.
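The sender-side estimation described above can be sketched as follows: each ACK yields a bandwidth sample (bytes acknowledged over the inter-ACK interval), which is low-pass filtered; after a loss, ssthresh is set from the estimate as BWE × RTTmin / segment size. The simple EWMA gain below is an illustrative constant, not Westwood's exact filter:

```python
def westwood_update(bwe, acked_bytes, interval, gain=0.9):
    """Update the bandwidth estimate from an ACK arrival:
    sample = bytes acked / time since the last ACK, then EWMA smoothing."""
    sample = acked_bytes / interval
    return gain * bwe + (1 - gain) * sample

def set_after_loss(bwe, rtt_min, seg_size):
    """After a loss, derive ssthresh (in segments) from the estimate."""
    return max(2, int(bwe * rtt_min / seg_size))
```

Setting ssthresh from the measured rate, rather than blindly halving the window, is what lets Westwood recover better on lossy links.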
Quantification Analysis of Chaotic Fractal Dimensions
Hemant Kumar Singh, Prof. Deepa Gupta
Fractals arise from various sources and have been observed in nature. One of the substantial characteristics of fractals is that they can be described by a non-integer dimension. The geometry and mathematics of fractal dimension have contributed useful tools to a variety of scientific specialities. The fractal dimension quantifies the dimension of curves and trajectories. In recent years, various numerical methods have been developed for quantifying the dimension directly from observations of the natural system. The purpose of this paper is to quantify the dimensions of fractals that arise in nature using two fractal quantifiers, the compass dimension and the box-counting dimension, thereby deducing an algorithm based on chord length and the number of solution steps used in computing fractality. Results demonstrate that a trajectory's fractal dimension can be closely approximated. We expect this paper to make fractal theory better understood, and to expand fractal applications in numerous fields.
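The box-counting dimension mentioned above can be computed directly from observed points: count the grid boxes of side ε that the set touches, and take the slope of log N(ε) against log(1/ε). A minimal two-scale sketch (illustrative, not the paper's algorithm):

```python
import math

def box_count(points, eps):
    """Number of eps-sized grid boxes the point set touches."""
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in points})

def box_dimension(points, eps1, eps2):
    """Slope of log N(eps) vs log(1/eps) between two scales."""
    n1, n2 = box_count(points, eps1), box_count(points, eps2)
    return math.log(n2 / n1) / math.log(eps1 / eps2)

# A straight line should have dimension ~1
line = [(i / 1024, i / 1024) for i in range(1024)]
```

In practice one fits the slope over many scales rather than two, which smooths out discretization noise.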
Microscopic Image Analysis for Plant Tissue Using Image Processing Technique
Prof. Satange D. N, Ms. Pasarkar Anagha A, Ms. Chauhan Priyanka D
In this paper we analyze microscopic images of plant tissue (xylem) using image processing techniques. We capture the images with the help of a compound laboratory microscope (CLM), an advanced piece of lab equipment that uses light to create magnified images of specimens, which are then projected onto an imaging mechanism. The deformation of plant tissue is observed at the scale of the two principal failure mechanisms: cell breakage and cell separation. Several fundamental image-processing algorithms are implemented, such as edge linking and boundary detection of plant cells, image enhancement, segmentation, noise removal, and morphological operations. A segmentation-based method approximately counts the number of white and black nanoparticles in a microscopic image.
OVERVIEW OF TEXT SUMMARIZATION EXTRACTIVE TECHNIQUES
Mrs Pimpalshende A. N.
Text summarization plays an important role in the areas of natural language processing and text mining. As information resources, both online and offline, increase exponentially, the major challenge is to find relevant information from a large amount of data in a short time. Text summarization aims to create a compressed summary while retaining the main characteristics of the original set of documents. Many approaches use statistics and machine learning techniques to extract sentences from documents. An extractive summarization method consists of selecting important sentences, paragraphs, etc. from the original document and concatenating them into a shorter form; the importance of sentences is decided based on statistical and linguistic features. An abstractive summarization method consists of understanding the original text and re-telling it in fewer words: it uses linguistic methods to examine and interpret the text, and then finds new concepts and expressions to best describe it, generating a new, shorter text that conveys the most important information from the original document. This paper presents a survey of extractive text summarization techniques. We explore different types of summarization, evaluation strategies and metrics, features, extractive summarization techniques, approaches, and problems in text summarization.
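A minimal sketch of the statistical, extractive approach described above: score each sentence by the summed corpus frequency of its words, then keep the top-scoring sentences in their original order. This is a deliberately simplified illustration of frequency-based extraction, not any specific surveyed method:

```python
import re
from collections import Counter

def summarize(text, k=1):
    """Score sentences by summed word frequency; keep the top-k in order."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)
    ranked = sorted(range(len(sentences)),
                    key=lambda i: -sum(freq[w] for w in
                                       re.findall(r"[a-z']+", sentences[i].lower())))
    keep = sorted(ranked[:k])
    return ". ".join(sentences[i] for i in keep) + "."
```

Real systems add stop-word removal, sentence-position features, and length normalization, but the scoring-and-selection skeleton is the same.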
DATA MINING TECHNIQUES FOR SOFTWARE EFFORT ESTIMATION TO IMPROVE COST EFFICIENCY
Ms. K. Gayathiri, Dr. T. Nalini, Dr. V. Khanaa
Software cost estimation can be described as the process of predicting the most realistic effort required to complete a software project. Due to the strong relationship of accurate effort estimates with many crucial project management activities, the research community has focused on the development and application of a vast variety of methods and models to improve the estimation procedure. The rapidly increasing need for large-scale and complex software systems leads managers to regard software effort estimation (SEE) as one of the most vital activities, closely related to the success or failure of the whole development process. A data mining tool is used to select a subset of highly predictive attributes, such as project size, development, and environment-related attributes, with which a significant increase in effort estimation accuracy can typically be obtained.
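Effort estimation models of the kind surveyed here typically relate effort to a size attribute by a power law; Basic COCOMO's organic mode, for example, uses E = 2.4 × KLOC^1.05 person-months. A sketch using those published constants, given purely as an illustration of the model family:

```python
def estimate_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO (organic mode) effort in person-months: E = a * KLOC^b.
    Attribute selection would tune a and b, or add multiplicative drivers."""
    return a * kloc ** b
```

Data-mining-based attribute selection, as the abstract describes, effectively decides which size and environment attributes feed such a model and how its coefficients are calibrated.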
RELIABILITY APPROACH OF DATA GATHERING PROTOCOL FOR MOBILE USERS IN WIRELESS SENSOR NETWORK
Miss. Chaitali G. Taral, Dr. P.R. Deshmukh, Prof. G.S. Thakare
In today’s world, people interact with the network and collect data from it with the help of handheld devices. Here we propose a novel approach for mobile users to collect data network-wide. Most traditional Wireless Sensor Network (WSN) architectures consist of static nodes densely deployed over a sensing area. The route structure for data collection must be updated every time the mobile user moves. With our approach, we perform only limited modifications to update the route structure, while the route performance remains bounded and controlled compared to the optimal performance in the WSN. The proposed protocol for updating the route structure is easy to implement and uses three steps: initialization of the data collection tree, updating of the data collection tree, and routing of data. Our analysis shows that the proposed approach is scalable in maintenance overhead, performs routing efficiently, and provides continuous data delivery during the movement of the user. Finally, we examine the efficiency of our protocol by varying the network settings, demonstrating an efficient way of collecting data using a wireless sensor network.
Integration of 2D Secure Barcode in Identity Cards: A New Approach
Manisha Bajpai, Arun Prakash Agrawal
This paper introduces an ID card management system that integrates a 2-dimensional barcode to produce more secure, reliable identification cards. The system captures the personalized data, including the signature and photo of the holder, dynamically generates an image of a 2D barcode containing the information provided, and affixes this barcode image to the ID card. The card can then be used to validate and authenticate the holder. The main advantage of using a 2D barcode is its data encoding capacity: a 2D barcode is able to encode up to 500 bytes per square inch. Some of the data is used for error-correction encoding, which provides the capability to tolerate holes, cuts and dirt marks and keeps the 2D barcode readable. There are a variety of 2D barcodes available. This paper uses one of the standard 2D barcodes, PDF417, to demonstrate ID card management. The paper discusses 2D barcodes and their encoding methodology, the system workflow, and the architecture of the proposed ID card management system.
An overview of Mobile Ad Hoc Networks for the Proactive, Reactive and Hybrid Routing Protocol
Sumedha S. Deshpande, Krutika V. Asare, Swapnil Deshpande
A Mobile Ad-Hoc Network (MANET) is an autonomous collection of distributed mobile users. MANETs can have widely varying characteristics that greatly impact the behavior of the different routing protocols created for these networks. In this paper, we have compared the delay, throughput, routing load and end-to-end delay of MANET proactive routing protocols (WRP, GSR, DSDV, HSR, CGSR), reactive protocols (AODV, DSR, TORA, ABR, LMR) and hybrid protocols (ZRP, FSR, LANMAR).
Hybrid DWT-DCT Coding Techniques for Medical Images
Mrs. Mishra Keerti, Mrs. Verma Deepti, Mr. Verma R.L.
In this paper, a hybrid image compression coding technique using the discrete cosine transform (DCT) and the discrete wavelet transform (DWT) is applied to medical images. The aim is to achieve higher compression rates by applying different compression thresholds to the LL and HH band wavelet coefficients. The DCT is applied on the HL and LH bands while maintaining the quality of the reconstructed images. After this, the image is quantized to calculate a probability index for each unique quantity, so as to find a unique binary code for each unique symbol for encoding.
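The DWT stage can be illustrated with one level of the Haar transform, the simplest wavelet: pairwise averages form the low band and pairwise differences the detail band, and zeroing small detail coefficients is the lossy thresholding step the abstract describes. A minimal 1-D sketch, not the paper's exact transform:

```python
def haar_1d(signal):
    """One level of the Haar wavelet transform: (averages, details)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def ihaar_1d(avg, det):
    """Inverse of one Haar level (exact reconstruction)."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def threshold(coeffs, t):
    """Zero out small detail coefficients (the lossy compression step)."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]
```

For a 2-D image the same pairing is applied along rows and then columns, producing the LL, LH, HL and HH bands named above.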
This paper introduces a hybrid watermarking scheme for medical images. We propose a watermarking scheme that can recover the original image from the watermarked one. Medical images require special safety and confidentiality because critical judgments are made on the information they provide. Transmission of medical images via the internet or mobile phones demands strong security and copyright protection in telemedicine applications. In telemedicine, confidentiality of a medical image can be achieved by hiding the Electronic Patient Record (EPR data) in the corresponding medical images. We use a technique named the Class Dependent Coding Scheme [2] and modified difference-expansion watermarking using LSB replacement in the difference of the virtual border. The payload is formed by image hashing using MD5 [1]. The proposed techniques aim at increasing the data hiding capacity, so they can be used to protect various images, such as military or medical images. The paper discusses the perspectives of digital watermarking in a range of medical data management and distribution issues, and proposes a complementary and/or alternative tool that simultaneously addresses medical data protection, archiving, and retrieval, as well as source and data authentication. The scheme imperceptibly embeds in medical images multiple watermarks conveying the patient's personal and examination data, keywords for information retrieval, the physician's digital signature for authentication, and a reference message for data integrity control. Experimental results indicate the efficiency and transparency of the scheme, which conforms to the strict requirements that apply to regions of diagnostic significance.
USE OF ARTIFICIAL NEURAL NETWORK TO IDENTIFY CARCINOMA CELL
Miss. Kuralkar Samruddhi S. *, Dr. Deshmukh P.R
Artificial neural networks have featured in a wide range of medical fields, often with promising results. This paper reviews the benefits of artificial neural networks (ANNs) as decision-making tools in the field of cancer. Artificial neural networks are used to find relationships between inputs and outputs, or to recognize patterns in data; here, the task is to recognize cancer-cell patterns. The paper addresses a system that performs cell characterization to find the percentage of cancer cells in a given image with high accuracy. The Harris corner detection algorithm scans the whole image and performs the classification of cancer cells. Artificial neural networks aggregate the analyzed data from these images to produce a diagnosis prediction instantaneously and with high accuracy, where digital images serve as the input data. Hence, during surgery such automated systems help the surgeon identify, with high accuracy, the infected parts or cells to be removed in cases of cancerous growth, thereby increasing the probability of survival of the patient. This proposal describes one such automated system for cancer cell classification, a tool assisting the surgeon in differentiating cancerous cells from normal cells, i.e., finding the percentage of carcinoma cells, instantaneously during surgery. Here the pathological images serve as the input data. Finally, the algorithm was applied to selected pathological images for classification. The design can be extended to estimate the number of carcinoma cells per unit area.
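The Harris corner response used in the scanning step above can be sketched in a few lines of NumPy; the test image, gradient scheme, and 3x3 window are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2 per pixel."""
    a = img.astype(float)
    # central-difference gradients
    ix = np.zeros_like(a); iy = np.zeros_like(a)
    ix[:, 1:-1] = (a[:, 2:] - a[:, :-2]) / 2.0
    iy[1:-1, :] = (a[2:, :] - a[:-2, :]) / 2.0
    # 3x3 box window for the structure tensor M
    def box(m):
        p = np.pad(m, 1)
        return sum(p[i:i + m.shape[0], j:j + m.shape[1]]
                   for i in range(3) for j in range(3))
    sxx, syy, sxy = box(ix * ix), box(iy * iy), box(ix * iy)
    return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2

# a bright square on a dark background: corners should score highest
img = np.zeros((12, 12)); img[4:8, 4:8] = 1.0
r = harris_response(img)
peak = np.unravel_index(np.argmax(r), r.shape)
print(peak)
```

Edges give a near-zero (or negative) response while corners give a large positive one, which is why the peak lands near a corner of the square.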
Wireless Cellular Communications Security Issues and Types of Attacks with WAP
K.V.N.R. Sai Krishna
Cellular Communication has become an important part of our daily life. Besides using cell phones for voice communication, we are now able to access the Internet, conduct monetary transactions, send text messages etc. using our cell phones, and new services continue to be added. Therefore, it is important to provide users with a secure channel for communication. This paper will give a brief introduction to the various generations of cellular networks. For those not familiar with the cellular network architecture, a brief description of the new 3G cellular network architecture will be provided. Limitations of cellular networks, their security issues and the different types of attacks will be discussed. Then the steps taken in the new 3G networks to combat the different security threats will be provided. Also, the security features of the Wireless Application Protocol (WAP) used to access the Internet will be discussed.
Image Segmentation Using Cellular Automata: A Technical Survey
Priyanka Shotrya Sanjeev Bhardwaj
Image segmentation is an integral part of image processing applications like medical image analysis and photo editing. In this paper, we discuss the most commonly used image segmentation techniques using cellular automata. The segmentation technique to be chosen depends on the properties of
“The Pertinence of Intelligent System in the field of Digital Image Forensics”
Miss. Pande Ankita V Prof. Shandilya V.K.
Images and videos have become the main information carriers in the digital era. The expressive potential of visual media and the ease of their acquisition, distribution and storage are such that they are more and more exploited to convey information, even sensitive information. As a consequence, today images and videos represent a common source of evidence, both in everyday-life controversies and in trials. The simplest video in TV news is commonly accepted as a certification of the truthfulness of the reported news. In a similar way, video surveillance recordings can constitute fundamental probationary material in a court of law. Together with undoubted benefits, the accessibility of digital visual media brings a major drawback. Lately, the reliability of digital visual information has been questioned, due to the ease of counterfeiting both its origin and content. Digital image forensics is a brand new research field which aims at validating the authenticity of images by recovering information about their history. Two main problems are addressed in this paper: the identification of the imaging device that captured the image, and the detection of traces of forgeries. Digital images have the problem of being easy to edit and to tamper with. As a result, digital forensics is now an important field in image processing.
“Digital Image Processing Approach for Fruit and Flower Leaf Identification and Recognition”
Miss. Pande Ankita V Prof. Shandilya V.K
Fruit and flower trees are broadleaf and usually deciduous, meaning they lose their leaves annually in the fall. Knowing what types of fruit trees occur naturally will narrow the list of possibilities when trying to identify a tree. Observing the leaf of a tree or shrub is an excellent way to determine the species. Characteristics such as shape, size, texture and how the leaf is arranged can help identify the type of plant when fruit has not yet formed. Feature extraction from fruit leaf images is still a problem in automatic plant leaf identification. This paper presents the identification of local fruit trees through leaf structures using image processing techniques. In this study, the chain code method is used to obtain the shape of an object, together with a Computer-Aided Plant Species Identification Technique (CAPSI) based on image matching of leaf shape. The image pre-processing begins with converting the RGB image to a grey-scale image and applying a thresholding technique before removing the noise. The Sobel operator is applied to the binary image to recognize the edges of the image before thinning them. The feature extraction process is then conducted using the chain code technique. The last stage is to recognize the leaf features using a linear comparison technique.
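The chain code step above can be illustrated with an 8-connected Freeman encoding of an ordered boundary; the tiny square boundary below is a made-up example, not a leaf contour:

```python
# 8-connected Freeman chain code: one direction index per boundary step.
# Directions are (row, col) deltas; 0 = east, counted counter-clockwise.
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
        (0, -1), (1, -1), (1, 0), (1, 1)]

def chain_code(boundary):
    """Encode an ordered list of boundary pixels as Freeman chain codes."""
    codes = []
    for (r0, c0), (r1, c1) in zip(boundary, boundary[1:]):
        codes.append(DIRS.index((r1 - r0, c1 - c0)))
    return codes

# clockwise walk around a 2x2 block of pixels (hypothetical boundary)
square = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
print(chain_code(square))  # [0, 6, 4, 2]
```

The resulting code sequence is a compact, translation-invariant shape descriptor that the matching stage can compare between leaves.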
Achieving Energy Efficiency by the Enhancement of MAC Protocol to Prevent NAV Attack
Pathak Vivek Gangal Ambrish
The performance of an ad-hoc network is measured by the number of packets successfully delivered to the destination. In multi-hop wireless networks, every node acts as an intermediate node to forward packets to other nodes. To increase the overall network performance, the number of packets delivered successfully to the destination must be increased. In a congested network, coordination between the active nodes must be maintained to increase network performance. In this paper, we discuss the medium access protocol MACA, which solves the hidden terminal and exposed terminal problems. We review MACA and highlight the NAV attack that is possible in this protocol, and we propose a new technique for the prevention of the NAV attack. When the NAV attack is triggered in the MACA protocol, overall network performance degrades. There is an algorithm to detect selfish nodes; its authors also simulated their approach in NS-2, which shows the efficiency of the algorithm [11]. The attacks exploit the local estimation and global aggregation of the metric to allow attackers to attract a large amount of traffic [10].
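The virtual carrier-sense mechanism at the heart of the NAV attack can be modeled in a few lines; this is a simplified sketch (real RTS/CTS handling has far more state), with all timings hypothetical:

```python
# A station that overhears a reservation frame updates its NAV (network
# allocation vector) and defers until it expires. An attacker that forges
# huge duration fields keeps every neighbour silent: the NAV attack.
class Station:
    def __init__(self):
        self.nav = 0.0          # time until which the medium is reserved

    def overhear(self, now, duration):
        self.nav = max(self.nav, now + duration)

    def can_transmit(self, now):
        return now >= self.nav

s = Station()
s.overhear(now=0.0, duration=2.0)        # legitimate RTS reservation
print(s.can_transmit(1.0), s.can_transmit(2.5))  # False True
s.overhear(now=3.0, duration=1000.0)     # forged duration field
print(s.can_transmit(10.0))              # False -> medium starved
```

A prevention scheme of the kind the paper proposes would cap or validate the advertised duration before honouring it.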
Improving Decision Support Systems with Data mining, Data warehousing, OLAP and OLTP Technologies
Ms. Ramteke M. A.*, Prof. Dhande S. S.
Rapidly changing markets and global competition require up-to-date, high-quality and complete information for decision support. Data mining, data warehousing, on-line analytical processing (OLAP) and on-line transaction processing (OLTP) are important elements supporting the decision-making process, which has increasingly become a focus of the database industry. This paper provides a review of data mining, data warehousing, OLAP and OLTP technologies, with an emphasis on the components of data warehouse architecture. A "data warehouse" is an organization-wide snapshot of data, typically used for decision making. The use of data mining to facilitate decision support can lead to improved performance of decision making. OLAP technology is used to perform complex analysis of the data in a data warehouse, while OLTP technology is used to perform updates on operational or transactional systems. Online transaction processing helps manage applications based on transactions involving high volumes of data. A Decision Support System (DSS) is a computer-based information system designed to facilitate the decision-making process for semi-structured tasks. DSS systems and warehouses are typically separate from the OLTP system. Unlike a traditional relational database model or OLTP system, an OLAP system is optimized to provide data to end users in a meaningful format through a DSS application.
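The OLAP-versus-OLTP distinction above can be illustrated with a tiny rollup: OLTP systems write individual transaction rows, while a DSS/OLAP query aggregates them along dimensions. The rows and column names below are hypothetical:

```python
from collections import defaultdict

# Transactional rows, as an OLTP system would record them one at a time
rows = [
    {"region": "North", "quarter": "Q1", "sales": 120},
    {"region": "North", "quarter": "Q2", "sales": 150},
    {"region": "South", "quarter": "Q1", "sales": 90},
    {"region": "South", "quarter": "Q1", "sales": 60},
]

# OLAP-style rollup: aggregate the measure along (region, quarter)
cube = defaultdict(int)
for r in rows:
    cube[(r["region"], r["quarter"])] += r["sales"]

print(cube[("South", "Q1")])  # 150
```

A warehouse would precompute and store such aggregates so decision-support queries do not touch the operational system.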
An Aperture Coupled Feed Approach to Gain and Bandwidth Enhancement of Microstrip Patch Array Antennas
Fathima Anees S Mrs. Rajini A R
Communication has become indispensable in the modern world and antennas, being the electronic eyes and ears of the world, have become an integral part of our communication technologies. Among the various types of available antennas, microstrip antennas have been one of the most innovative developments in the era of miniaturization and are increasingly finding applications in a wide range of microwave systems. The aim of this paper is to design and compare the performance of rectangular and circular microstrip patch array antennas using ADS (Advanced Design System) Momentum. A microstrip line feed is used to feed each element. The performance is further enhanced by replacing the microstrip line feed with aperture coupling, since aperture coupling eliminates the direct electrical connection between the feed and the patch and allows independent optimization of both, improving the radiation pattern and bandwidth. These arrays are designed to operate at a frequency of 2.4 GHz. Various parameters like return loss, two- and three-dimensional radiation patterns, gain and directivity of the designed antenna are obtained using ADS Momentum for three designs using different substrates. The purpose of this antenna is to obtain high gain with better bandwidth and reduced losses, especially for WLAN applications.
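The initial patch dimensions for a 2.4 GHz design are usually obtained from the standard transmission-line model equations before electromagnetic simulation; the FR-4 substrate values below are an illustrative assumption (the paper compares several substrates):

```python
import math

def patch_dimensions(f0, er, h):
    """Transmission-line model design equations for a rectangular
    microstrip patch: returns (width, length) in metres."""
    c = 3e8
    w = c / (2 * f0) * math.sqrt(2 / (er + 1))               # patch width
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / w) ** -0.5
    # fringing-field length extension
    dl = 0.412 * h * ((eeff + 0.3) * (w / h + 0.264)) / \
         ((eeff - 0.258) * (w / h + 0.8))
    l = c / (2 * f0 * math.sqrt(eeff)) - 2 * dl              # patch length
    return w, l

# 2.4 GHz patch on FR-4 (er = 4.4, h = 1.6 mm), a common starting point
w, l = patch_dimensions(2.4e9, 4.4, 1.6e-3)
print(round(w * 1000, 1), round(l * 1000, 1))  # width and length in mm
```

These closed-form values only seed the design; the aperture-coupled feed and array spacing are then tuned in the field solver.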
IMAGE RECONSTRUCTION OF WATER WAVES USING BISPECTRUM TECHNIQUES
Miss. Solio Anchal A. Dr. Ladhake S.A
CBIR refers to techniques for retrieving similar images from image databases based on automated feature extraction methods. In recent years, the medical imaging field has grown, generating much more interest in methods and tools to control the analysis of medical images. CBIR for medical images has become a major necessity with the growing technological advancements. The content of an image has to be carefully extracted and classified with efficient techniques for easy retrieval. Contents in an image can take various forms like texture, color and shape, of which shape is regarded as the most efficient metric. Medical images are usually fused, subject to high inconsistency and composed of different minor structures, so there is a necessity for feature extraction and classification of images for easy retrieval. Various methods have been proposed for medical image retrieval systems, such as shape-based methods, texture-based methods, the continuous feature selection method, and storage and access methods.
Image segmentation is a method of partitioning an image into meaningful parts, which can then be put to detailed study. This method is largely useful for practical applications such as machine vision, bacterial study, face detection, video surveillance, fingerprint detection, etc. Graph partitioning is a method developed from graph theory which has multiple applications in various fields. This process considers an image as a graph with V vertices and E edges, which can be partitioned into k components with specific properties. This enables the detailed study of an image for various purposes. The variation in pixels, texture, etc., from one component to another can be easily identified. This paper proposes an effective graph partitioning method that overcomes the existing disadvantages.
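Viewing the image as a graph can be made concrete with a minimal sketch: pixels are vertices, 4-neighbour similarity edges connect them, and a flood fill yields the k components. The tiny intensity grid and threshold are illustrative assumptions:

```python
from collections import deque

def segment(img, thresh):
    """Partition a 2-D intensity grid into components: 4-neighbour pixels
    join the same segment when their intensity difference is <= thresh."""
    h, w = len(img), len(img[0])
    label = [[-1] * w for _ in range(h)]
    k = 0
    for sr in range(h):
        for sc in range(w):
            if label[sr][sc] != -1:
                continue
            q = deque([(sr, sc)])          # BFS over the similarity edges
            label[sr][sc] = k
            while q:
                r, c = q.popleft()
                for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < h and 0 <= nc < w and label[nr][nc] == -1
                            and abs(img[r][c] - img[nr][nc]) <= thresh):
                        label[nr][nc] = k
                        q.append((nr, nc))
            k += 1
    return label, k

img = [[0, 0, 9],
       [0, 9, 9],
       [0, 9, 9]]
labels, n = segment(img, thresh=1)
print(n)  # 2
```

Real graph-cut methods partition by optimizing a cut criterion rather than hard thresholding, but the graph view of the image is the same.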
Anomaly Detection: An Application of Visual Surveillance
M. Aishwarya, S. Shanu
Recent advances in camera and video technologies have made it possible to network numerous video cameras together in order to provide visual coverage of extensive public spaces such as airports and train stations. As the size of the camera network grows and the level of activity in the public space increases, it becomes infeasible for human operators to monitor the multiple video streams and identify all events of possible interest. Consequently, a timely challenge for computer vision researchers is to design camera sensor networks capable of performing visual surveillance tasks automatically or at least with minimal human intervention. This paper presents the technology behind intelligent surveillance systems. While surveillance systems can provide many services, this paper describes the development of an application that uses cameras to detect changes in homes, buildings, offices, etc. Basically, the proposed system will detect changes in a room (whenever anybody enters) with the help of a motion detection algorithm and will automatically send an email to a computer in a security room.
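A common baseline for the motion detection step is simple frame differencing; the sketch below uses that approach with made-up frame sizes and thresholds (the paper does not specify its algorithm's parameters):

```python
import numpy as np

def motion_detected(prev, curr, diff_thresh=25, pixel_frac=0.01):
    """Flag motion when enough pixels change between consecutive frames."""
    delta = np.abs(curr.astype(int) - prev.astype(int))
    changed = np.count_nonzero(delta > diff_thresh)
    return changed > pixel_frac * delta.size

prev = np.zeros((120, 160), dtype=np.uint8)   # empty room, grey-scale frame
curr = prev.copy()
curr[40:80, 60:100] = 200                     # someone enters the scene
print(motion_detected(prev, curr))  # True
```

On a positive detection, the application would then trigger its alert path (here, sending an email to the security-room computer).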
This paper describes dynamic routing with security, using a cryptographic algorithm, which can be used in multi-organization systems. Security is a major issue for data communication over worldwide networks. The main objective of this paper is to propose dynamic routing with security based on a strong algorithm, the Blowfish algorithm, which provides strong security from the client to the server system. The dynamic routing avoids sending two consecutive packets on the same link and updates the routing information from the router's neighbors in the network. The aim is to provide strong security from many organizations to the head of an organization; banks, colleges, companies and universities all need strong security for data communication. The main concerns of this paper are therefore to provide the strongest security and the least transmission time from the clients to a server.
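The "no two consecutive packets on the same link" rule above can be sketched as randomized next-hop selection that excludes the previously used link; the single-node topology here is a simplification for illustration only:

```python
import random

def pick_next_hop(neighbors, last_hop, rng=random):
    """Choose a next hop at random, avoiding the link used by the
    previous packet whenever an alternative exists."""
    choices = [n for n in neighbors if n != last_hop] or neighbors
    return rng.choice(choices)

neighbors = ["B", "C", "D"]     # hypothetical neighbours of the router
last = None
path_links = []
for _ in range(6):              # route six consecutive packets
    hop = pick_next_hop(neighbors, last)
    path_links.append(hop)
    last = hop
# no two consecutive packets used the same outgoing link
print(all(a != b for a, b in zip(path_links, path_links[1:])))  # True
```

In the full scheme each packet's payload would additionally be encrypted (the paper uses Blowfish) before being handed to this routing layer.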
A SURVEY OF BROADCASTING PROTOCOLS IN WIRELESS NETWORK
Nayak Divya, Prof. Shukla Suwesh, Shrivastava Manish
Broadcasting is a fundamental operation in which a single message is to be sent from a source to all other nodes in the network. This paper presents a brief survey of broadcasting in wireless networks and the different types of broadcasting techniques. Broadcasting plays an important role in designing communication protocols for various types of networks. For example, in highly parallel systems such as Networks of Workstations (NOW) or grid computing, it is essential to distribute the input data or computational results quickly to all nodes in the system. The factors affecting the performance of broadcasting are also discussed. Different broadcasting methods have different advantages, hence a suitable method must be chosen for each application. We discuss the advantages and disadvantages of all the methods and compare them on the basis of different criteria.
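The simplest technique such surveys start from is blind flooding, where every node rebroadcasts a message exactly once; the four-node topology below is a made-up example:

```python
from collections import deque

def flood(adj, source):
    """Blind flooding: each node rebroadcasts the message exactly once.
    Returns (set of nodes reached, total number of transmissions)."""
    seen = {source}
    tx = 0
    q = deque([source])
    while q:
        node = q.popleft()
        tx += 1                      # this node broadcasts once
        for nbr in adj[node]:
            if nbr not in seen:      # duplicate receptions are dropped
                seen.add(nbr)
                q.append(nbr)
    return seen, tx

adj = {'A': ['B', 'C'], 'B': ['A', 'C', 'D'],
       'C': ['A', 'B', 'D'], 'D': ['B', 'C']}
reached, tx = flood(adj, 'A')
print(len(reached), tx)  # 4 4
```

Flooding reaches every connected node but costs one transmission per node; the more sophisticated schemes the survey compares (probabilistic, counter-based, neighbour-knowledge) aim to cut that transmission count.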
ARTIFICIAL VISION SYSTEM FOR AUTOMATIC NUMBER PLATE IDENTIFICATION
Miss. Kuralkar Samruddhi S. Dr. Deshmukh
During the recent years, Intelligent Transportation Systems (ITS) are having a wide impact in people’s life as their scope is to improve transportation safety and mobility and to enhance productivity through the use of advanced technologies. License Plate Recognition (LPR) is an integral part of ITS. The popularity of License Plate Recognition System is mainly because of its Successful applications in Traffic congestion and monitoring, Revenue control related to road usage, Campus security Systems, Access Control Systems, etc
In this paper, an algorithm for vehicle license plate identification is proposed, based on an image segmentation technique and connected component analysis in conjunction with a character recognition neural network. The algorithm was tested on gray-level vehicle images with different backgrounds. The camera focused on the plate, while the angle of view and the distance from the vehicle varied according to the experimental setup. First, image enhancement is performed, and the result is used for edge detection. After edge detection, the license plate region is detected. Character segmentation is done using a line scanning technique. In addition to the algorithms based on gray-level image processing, color information also plays an important role in license plate localization, where the unique color or color combination between the license plate and the vehicle body is considered the key feature for locating the plate. The mechanism is able to deal with difficulties arising from illumination variance, noise distortion, and complex, dirty backgrounds. Numerous captured images including various types of vehicles with different lighting and noise effects have been handled. A study of the different parameters of the training and recognition phases showed that the proposed system reaches promising results in most cases and achieves high success rates; plate location was confirmed by the experiments.
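The line scanning step for character segmentation is typically a vertical-projection analysis: characters are split wherever a column of the binarized plate contains no ink. The toy plate strip below is an illustrative assumption:

```python
import numpy as np

def segment_columns(binary):
    """Split a binary plate strip into character spans wherever the
    vertical projection drops to zero (blank columns)."""
    proj = binary.sum(axis=0)          # ink pixels per column
    spans, start = [], None
    for x, v in enumerate(proj):
        if v > 0 and start is None:    # entering a character
            start = x
        elif v == 0 and start is not None:  # leaving a character
            spans.append((start, x))
            start = None
    if start is not None:
        spans.append((start, len(proj)))
    return spans

# toy strip: two "characters" separated by one blank column
strip = np.array([[1, 1, 0, 1],
                  [1, 0, 0, 1],
                  [1, 1, 0, 1]])
print(segment_columns(strip))  # [(0, 2), (3, 4)]
```

Each resulting span would then be normalized and fed to the character-recognition neural network.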
The World Wide Web's constant growth and transformation offer great opportunities and pose many challenges for exploitation. Web mining has emerged as a possible answer to these challenges by offering an approach to organizing and sharing the Web that explicitly considers user needs and requirements. Web mining is a very active research topic which combines two lively research areas: data mining and the World Wide Web. Web mining research relates to several research communities, such as databases, information retrieval and artificial intelligence. Web mining can basically be divided into three categories: web content mining, web structure mining and web usage mining; these three categories deal with different features of a web page. Web content mining deals with discovering useful information or knowledge from web page contents, web structure mining deals with discovering and modeling the link structure of the web, and web usage mining is used to discover interesting usage patterns from web data. For the survey, we focus on representation issues, the process, the learning algorithm, and the application of recent works as the criteria. We conclude the paper with some research issues.
In recent days, different access control methods have been proposed to secure ATM transactions from unauthorized access. This paper describes a method of implementing two-way authentication. The first step is the normal PIN verification; if the password is correct, the process proceeds to the second step, the two-way authentication. In that step, if the authorized person replies YES through their mobile, the corresponding transaction takes place. Otherwise, the system switches on the buzzer, automatically closes the door of the ATM centre, and the LCD shows the details of the ATM theft to the higher authorities.
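The two-step flow above amounts to a small decision procedure, sketched here with hypothetical names and return values:

```python
def atm_transaction(pin_ok, owner_reply):
    """Two-step verification described above: PIN check first, then an
    out-of-band yes/no confirmation from the cardholder's mobile."""
    if not pin_ok:
        return "reject"            # first factor failed
    if owner_reply == "YES":
        return "transact"          # second factor confirmed
    return "alarm"                 # buzzer on, door closed, staff notified

print(atm_transaction(True, "YES"))  # transact
print(atm_transaction(True, "NO"))   # alarm
print(atm_transaction(False, None))  # reject
```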
Increasingly developed social sharing websites, like Photobucket, Pixable, Picasa, Piczo, Instagram and Kodak Gallery, allow users to create, share, annotate and comment on media. Photo sharing websites help in publishing or transferring users' photos online, enabling users to share their photos with others publicly or privately. Nowadays, there has been an increase in photo sharing websites which allow users not only to create, share, annotate and comment on multimedia contents, but also provide useful information to improve media retrieval and management. Experiments conducted on the COREL image dataset demonstrate the effectiveness and efficiency of the proposed integrated framework and the conversion strategies.
SECURITY AND KEY DISTRIBUTION IN BINDING OF IP TO DNS
Arvind Kumar Gupta, Pravin Tripathi, Sat
The mapping or binding of IP addresses to host names became a major problem in the rapidly growing Internet, and the higher-level binding effort went through different stages of development up to the currently used Domain Name System (DNS). DNS Security is designed to provide security by combining the concepts of both digital signatures and asymmetric (public key) cryptography. Here the public key is sent instead of the private key. DNS Security uses a message digest algorithm to compress the message (a text file) and a PRNG (pseudo-random number generator) algorithm for generating the public and private keys. The message is combined with the private key to form a signature using the DSA algorithm, which is sent along with the public key.
The receiver uses the public key and the DSA algorithm to form a signature. If this signature matches the signature of the received message, the message is decrypted and read; otherwise it is discarded.
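The digest-then-sign-then-verify flow above can be sketched with the standard library; HMAC stands in here for the DSA signing step (real DNSSEC-style signing uses an asymmetric key pair, not a shared key), and the key and record data are hypothetical:

```python
import hashlib, hmac

def sign(message, key):
    """Compress the message with a digest, then sign the digest.
    (HMAC is a stand-in for the paper's DSA signing step.)"""
    digest = hashlib.md5(message).digest()
    signature = hmac.new(key, digest, hashlib.sha1).digest()
    return digest, signature

def verify(message, signature, key):
    """Receiver recomputes the digest and checks the signature."""
    digest = hashlib.md5(message).digest()
    expected = hmac.new(key, digest, hashlib.sha1).digest()
    return hmac.compare_digest(signature, expected)

key = b'shared-or-derived-key'      # DSA would use a private/public pair
_, sig = sign(b'zone record data', key)
print(verify(b'zone record data', sig, key))   # True
print(verify(b'tampered record', sig, key))    # False
```

The essential property survives the substitution: any change to the message invalidates the signature, so the receiver discards tampered records.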
A STUDY OF FUNDAMENTALS AND PRINCIPLES OF SOFTWARE ENGINEERING PROCESS
Niraj Nake
Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software. A software development process is concerned primarily with the production aspect of software development as opposed to the technical aspect, such as software tools. These processes exist primarily for supporting the management of software development and are generally skewed toward addressing business concerns. Many software development processes can be run in a similar way to general project management processes. A software process is defined as a set of activities that leads to the production of a software product. Although most software is custom built, the software engineering market is gradually shifting towards component-based development. No ideal approach to the software process has yet been developed. Some fundamental activities, like software specification, design, validation and maintenance, are common to all process activities.
Cloud services eliminate the need to install and manage client-rich applications; extending their scope across the private sector would help companies reduce high infrastructure and maintenance costs. In this paper we survey the security-related issues in the field of cloud computing. We discuss the different techniques regarding data privacy and the advancements in these techniques. This paper evaluates the different data privacy techniques involved in cloud computing and then proposes the most optimized technique. It focuses on the usage of cloud services and the security issues involved in building cross-domain, Internet-connected collaborations.
SIMULATION AND EXPERIMENTAL STUDIES ON HYDRODYNAMICS CHARACTERISTICS OF COCURRENT THREE PHASE FLUIDISATION USING FUZZY LOGIC
Anish T. Achankunju, Sivalingam A., Kannadasan T.
Simulation and hydrodynamic studies were carried out in a solid-liquid-gas co-current fluidized bed. The experiment was carried out in a vertical Perspex column of 5.4 cm I.D., 6 cm O.D. and 160 cm height. Water was used as the continuous phase and air as the dispersed phase. Glass beads of diameter 0.2 cm, 0.4 cm and 0.6 cm and gypsum particles of diameter 0.1201 cm, 0.1676 cm and 0.2099 cm were used as the solids for the co-current studies. The column consists of three sections: a gas-liquid disengagement section, a test section and a gas-liquid distributor section. The experiment was conducted with both glass beads and gypsum particles, keeping the gas velocity constant while varying the liquid velocity. The effect on individual phase holdup of various particle sizes at specific liquid and gas flow rates was studied. The pressure drop in the fluidized bed was measured using a mercury manometer. The individual phase holdups were determined: liquid holdup and solid holdup decrease with increasing liquid velocity, whereas gas holdup increases with increasing liquid velocity. With the experimental data, simulation studies were carried out using fuzzy logic.
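A common way to back individual holdups out of manometer data in such beds is the axial pressure-gradient balance, combined with the constraint that the three holdups sum to one. The densities and measured values below are hypothetical illustrations, not the paper's data:

```python
# Phase-holdup balance for a three-phase bed:
#   dP/dz = (eps_s*rho_s + eps_l*rho_l + eps_g*rho_g) * g
#   eps_s + eps_l + eps_g = 1
# Given the pressure gradient and the solid holdup, solve the remaining
# 2x2 linear system for liquid and gas holdup.
g = 9.81
rho_s, rho_l, rho_g = 2500.0, 1000.0, 1.2   # glass beads, water, air (kg/m^3)

def holdups(dp_dz, eps_s):
    rhs = dp_dz / g - eps_s * rho_s          # = rho_l*eps_l + rho_g*eps_g
    total = 1.0 - eps_s                      # = eps_l + eps_g
    eps_g = (rho_l * total - rhs) / (rho_l - rho_g)
    eps_l = total - eps_g
    return eps_l, eps_g

eps_l, eps_g = holdups(dp_dz=9000.0, eps_s=0.2)   # hypothetical readings
print(round(eps_l, 3), round(eps_g, 3))  # 0.417 0.383
```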
PERFORMANCE EVALUATION OF LOCALIZATION TECHNIQUE IN WIRELESS SENSOR NETWORK
Urvashi Singh Manish kumar Jha
The process of estimating the geographical location of sensor nodes, called localization, is an important research area in WSNs. Accurate localization or tracking of wireless devices is a crucial requirement for many emerging location-aware systems. Fields of application include search and rescue, medical care, intelligent transportation, location-based billing, security, home automation, industrial monitoring and control, location-assisted gaming, and social networking. This work uses the Classical Multidimensional Scaling (CMDS) technique to estimate the mapping of nodes. The mapping is then refined with a neural network (NN) to further improve the accuracy of the estimated node positions. Comparison results show improved localization with the NN implementation.
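The CMDS step above has a compact closed form: double-centre the squared-distance matrix and take the top eigenvectors. The four-node square layout below is an illustrative assumption:

```python
import numpy as np

def classical_mds(d, dim=2):
    """Classical multidimensional scaling: recover node coordinates
    (up to rotation/translation) from a pairwise distance matrix."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                  # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:dim]         # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# hypothetical sensor layout: 4 nodes on a unit square
pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
est = classical_mds(d)
# pairwise distances are preserved even though the axes may be rotated
print(np.allclose(np.linalg.norm(est[:, None] - est[None, :], axis=-1), d))  # True
```

With noisy RSSI- or hop-count-derived distances the recovery is only approximate, which is where the NN refinement stage described above comes in.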
Today, the popular technology in networking is the MANET (Mobile Ad-hoc Network), which has brought tremendous change to wireless networks. A mobile ad-hoc network is a wireless network of autonomous mobile nodes which are self-configuring and dynamic. Security is a main concern for protecting the communication between mobile nodes. The security design of a mobile ad-hoc network faces many problems, such as the shared wireless medium, severe resource constraints, and a highly dynamic network environment. In this paper we identify the security issues relevant to this problem, analyze the challenges to security design, and evaluate the state-of-the-art security proposals. A security solution is given according to the survey.
ACTIVE SOURCE ROUTING PROTOCOL FOR MOBILE NETWORK
Anil Kumar Prasad Anamika Bhusan, Divya Gupta
An ad-hoc mobile network is a collection of mobile nodes that are dynamically and arbitrarily located in such a manner that the interconnections between nodes are capable of changing on a continual basis. The primary goal of an ad-hoc network routing protocol is correct and efficient route establishment between a pair of nodes so that messages may be delivered in a timely manner. LAR is an on-demand protocol based on DSR (Dynamic Source Routing). The Location Aided Routing protocol uses location information to reduce the routing overhead of the ad-hoc network. Normally the LAR protocol uses GPS (Global Positioning System) to obtain this location information. With the availability of GPS, mobile hosts know their physical location. Ad hoc networks are a new wireless networking paradigm for mobile hosts. Unlike traditional mobile wireless networks, ad hoc networks do not rely on any fixed infrastructure; instead, hosts rely on each other to keep the network connected. Military tactical and other security-sensitive operations are still the main applications of ad hoc networks, although there is a trend to adopt ad hoc networks for commercial uses due to their unique properties. One main challenge in the design of these networks is their vulnerability to security attacks. In this paper, we study the threats an ad hoc network faces and the security goals to be achieved. We identify the new challenges and opportunities posed by this new networking environment and explore new approaches to secure its communication. In particular, we take advantage of the inherent redundancy in ad hoc networks (multiple routes between nodes) to defend routing against denial of service attacks.
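The way LAR cuts routing overhead can be sketched with its request-zone test: a node forwards a route request only if it lies in the rectangle spanned by the source and the destination's expected zone. The coordinates and radius below are hypothetical GPS-derived values:

```python
def in_request_zone(node, source, dest, radius):
    """LAR scheme-1 style check: forward a route request only if the
    node lies inside the rectangle spanned by the source and the
    expected zone (a circle of `radius` around the destination)."""
    (nx, ny), (sx, sy), (dx, dy) = node, source, dest
    xmin, xmax = min(sx, dx - radius), max(sx, dx + radius)
    ymin, ymax = min(sy, dy - radius), max(sy, dy + radius)
    return xmin <= nx <= xmax and ymin <= ny <= ymax

src, dst, r = (0, 0), (10, 10), 2   # hypothetical coordinates
print(in_request_zone((5, 5), src, dst, r))    # True  -> rebroadcast
print(in_request_zone((-4, 5), src, dst, r))   # False -> drop the request
```

Nodes outside the zone silently drop the request, so far fewer broadcasts are needed than with blind flooding over the whole network.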
EMPIRICAL ANALYSIS OF CORRELATION BETWEEN C&K METRICS AND TESTABILITY
Jaspreet kaur, Raj Kumari
Software quality is crucial in the development of software systems. Software metrics help in identifying potential problem areas, and finding the problems in those areas decreases the cost. The C&K metric suite is one of the best known and most popular metric suites for measuring the design of object-oriented programs. Since testing determines the conformance of the software's implementation to its specification, it is one of the most important phases in object-oriented development. This paper focuses on the analysis of the impact of various C&K metrics on testing.
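Two of the C&K metrics can be computed directly from class definitions via introspection; the sketch below uses the common simplification of weighting every method as 1 for WMC, and the toy classes are illustrative assumptions:

```python
def wmc(cls):
    """Weighted Methods per Class, with every method given weight 1."""
    return sum(1 for name, v in vars(cls).items() if callable(v))

def dit(cls):
    """Depth of Inheritance Tree (the root `object` class is depth 0)."""
    return max((dit(b) for b in cls.__bases__), default=-1) + 1

class Shape:
    def area(self): ...
    def perimeter(self): ...

class Square(Shape):
    def area(self): ...

print(wmc(Shape), wmc(Square))  # 2 1
print(dit(Shape), dit(Square))  # 1 2
```

High WMC and deep DIT values are the kind of design signals the paper correlates with testing effort.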