Enhancement in the neighbor-based diskless checkpointing approach
Parase Dipali B, Dr Mrs Apte S S, Shegadar A R
In this paper we provide a solution to the stable-storage requirement of disk-based checkpointing systems. Fault tolerance is essential for distributed and parallel systems. To handle multiple processor failures we use a diskless checkpointing approach, enhancing the neighbor-based variant: instead of storing a checkpoint in stable storage, it is stored in a peer processor's main memory. To reduce the memory overhead we use a parity technique.
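The parity technique mentioned above can be sketched as a simple XOR scheme: one parity block over the peers' checkpoints lets the system rebuild any single lost checkpoint. This is an illustrative sketch, not the authors' implementation, and the checkpoint contents below are hypothetical:

```python
from functools import reduce

def parity(checkpoints):
    """XOR-combine equal-length checkpoint byte strings into one parity block."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*checkpoints))

def recover(survivors, parity_block):
    """Rebuild a single lost checkpoint: XOR-ing the survivors with the parity
    block cancels the surviving terms and leaves the missing checkpoint."""
    return parity(survivors + [parity_block])

# Three peers each hold a checkpoint in main memory; one parity block
# lets the group survive the loss of any single peer.
ckpts = [b"state-A1", b"state-B2", b"state-C3"]
p = parity(ckpts)
restored = recover([ckpts[0], ckpts[2]], p)   # peer holding ckpts[1] failed
```

Storing only the parity block instead of full peer copies is what keeps the memory overhead low.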
Over recent years, a great deal of effort has been devoted to age estimation and gender recognition from face images. It has been reported that age can be estimated accurately under controlled conditions such as frontal faces, neutral expression, and static lighting. However, it is not straightforward to achieve the same accuracy in real-world environments because of considerable variations in camera settings, facial poses, and illumination conditions. In this paper, we apply a recently proposed machine learning technique called covariate shift adaptation to alleviate the lighting-condition change between laboratory and practical environments. Through real-world age estimation experiments, we demonstrate the usefulness of the proposed method.
Online Identification Using RLS Algorithm and Kaczmarz’s Projection Algorithm for a Bioreactor Process
S.Sundari, Alamelu Nachiappan
Control theory and automation technology are widely used in industry. Many control algorithms are based on mathematical models of dynamic systems, and mathematical modeling and parameter estimation are the basics of automatic control. A recursive least squares (RLS) parameter estimation algorithm and Kaczmarz's projection algorithm are applied to ARMAX and OE models. The estimated parameters are compared with the true ones. The proposed method has been applied to identify the parameters of a bioreactor process.
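The recursive least squares idea can be illustrated on a toy first-order ARX model (this is not the authors' bioreactor model; the true parameters a = 0.8, b = 0.5 and the noise-free simulation are assumptions for demonstration):

```python
import numpy as np

def rls_identify(y, u, lam=1.0):
    """Recursive least squares for a first-order ARX model
    y[t] = a*y[t-1] + b*u[t-1]; returns the estimate of (a, b)."""
    theta = np.zeros(2)
    P = np.eye(2) * 1000.0                    # large initial covariance
    for t in range(1, len(y)):
        phi = np.array([y[t - 1], u[t - 1]])  # regressor vector
        k = P @ phi / (lam + phi @ P @ phi)   # gain
        theta = theta + k * (y[t] - phi @ theta)
        P = (P - np.outer(k, phi @ P)) / lam
    return theta

# Simulate a system with true parameters a = 0.8, b = 0.5.
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1]

a_hat, b_hat = rls_identify(y, u)
```

On noise-free data the estimates converge to the true parameters, which is the comparison the abstract describes.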
Detection & Prevention of Wormhole Attack on AODV Protocol in Mobile Ad hoc Networks (MANETs)
Dimple Saharan
In the past few years mobile ad hoc networks (MANETs) have emerged as the networks for next-generation wireless communication. MANETs do not require an underlying infrastructure and have a dynamically changing network topology. Due to their high dynamism and mobility these networks are more vulnerable to attacks. Network performance can be improved if the network is clustered by grouping together nodes that are in close proximity. The primary goal is to enhance the performance of the network and to improve the durability of the nodes and hence the network lifetime. In this paper the effect of the wormhole attack on the AODV routing protocol in MANETs is analyzed and a prevention mechanism is presented to secure the network.
Energy-based Controller with Optimization Tuning by Using Nelder-Mead Algorithm for Overhead Cranes
Nguyen Quang Hoang, Vu Van Khoa
This paper presents a combination of a nonlinear PD controller and the Nelder-Mead algorithm to design an optimal controller for a nonlinear overhead crane system. The nonlinear PD controller is derived based on the passivity of the system, and the Nelder-Mead algorithm is exploited to find optimal parameters for the controller. The system dynamic model is derived using the Lagrangian equations. Simulations are conducted in the Matlab environment to determine the optimal control parameters and to verify the performance of the controller. The simulations demonstrate that the controller effectively moves the trolley to the desired position as fast as possible while suppressing the oscillation of the payload at the end of the operation. The robustness of the controller against uncertainties in cable length and payload is also indicated by the simulations.
A Wireless LAN Protocol for Initial Access Authentication
Sandhya K, Nagaraju Rayapati
Nowadays WLAN-enabled devices are in widespread use, so an efficient initial link setup mechanism is equally important. In this paper a fast access authentication process is implemented which is faster than the current 802.11i. Through experiments, it is observed that the inefficiency of 802.11i is due to its design from the framework perspective, which introduces too many messages. Because of the number of round-trip messages in 802.11i, the authentication delay is intolerable under some scenarios. To overcome this, an efficient initial access authentication protocol, FLAP, is proposed which completes authentication and key distribution in two round-trip messages. The proposed FLAP protocol is more secure than the 4-way handshake protocol. Simulations are conducted in which authentication delay, throughput, packet delivery ratio (PDR), and packet drops are measured for different scenarios and compared between 802.11i and FLAP. The results show that FLAP is more advantageous when the WLAN gets crowded.
STUDY ON WEB STRUCTURE MINING AND ITS PAGE RANKING ALGORITHM
T. Shanmugapriya, K.Kalaiselvi
Web Structure Mining deals with the hyperlink structure of documents on the web. The main Web Structure Mining algorithms are PageRank, Weighted PageRank, Hyperlink-Induced Topic Search (HITS), Link Editing, and Topological Utility Frequency Mining. This study focuses on the PageRank algorithm. The following sections describe the PageRank algorithm's structure, computation, problems, and pros and cons.
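The PageRank computation the study focuses on can be sketched with power iteration over a toy link graph (the three-page graph below is hypothetical, chosen only to show the mechanics):

```python
def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank over an adjacency dict {page: [outlinks]}.
    d is the damping factor; ranks always sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}     # teleportation share
        for p, outs in links.items():
            targets = outs if outs else pages      # dangling node: spread evenly
            share = rank[p] / len(targets)
            for q in targets:
                new[q] += d * share
        rank = new
    return rank

# Toy web: A links to B and C, B links to C, C links back to A.
r = pagerank({'A': ['B', 'C'], 'B': ['C'], 'C': ['A']})
```

Here C ends up ranked above B because it receives links from both A and B, which is exactly the "importance flows along hyperlinks" intuition behind the algorithm.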
Presenting an executable model of the enterprise architecture in order to evaluate reliability
Maryam Nazemi Ashani , Ali HarounAbadi , sayed Javad Mirabedini
In this study the Unified Modeling Language has been used, and a reliability parameter has been added as a generalized stereotype. Then an executable model of the enterprise architecture products is built using Colored Petri Nets. Using the simulation results of this model, the minimum and maximum of reliability (the reliability interval) can be computed. Compared with former methods, the proposed approach has the advantage that by computing reliability early in the planning phase, feeding the results back into the model, and modifying it, the onerous costs of implementation can be avoided.
Reduction of Ambiguity due to Synonyms and Homographs in Punjabi Language
Navdeep Kaur, Vandana Pushe
We present a probabilistic generative model for learning semantic parsers from ambiguous supervision. Our approach learns from natural language sentences paired with world states consisting of multiple potential logical meaning representations. It disambiguates the meaning of each sentence while simultaneously learning a semantic parser that maps sentences into logical form. Compared to a previous generative model for semantic alignment, it also supports full semantic parsing.
A novel approach to make NLP predictive and non-ambiguous in Punjabi Language
Parneet Kaur, Er.Vandana Pushe
This paper presents a probabilistic generative model for learning semantic parsers from ambiguous supervision. Our approach learns from natural language sentences paired with world states consisting of multiple potential logical meaning representations. It disambiguates the meaning of each sentence while simultaneously learning a semantic parser that maps sentences into logical form. Compared to a previous generative model for semantic alignment, it also supports full semantic parsing.
Study of Public and Private Clouds in Indian Environment
Rakesh Patel , Mili Patel, Proff. Anupam R. Chaube
Cloud computing is a general term used to describe a new class of network-based computing that takes place over the internet. These platforms hide the complexity and details of the underlying infrastructure from users and applications by providing a very simple graphical interface.
Basically, clouds fall into three categories: IT organizations can choose to deploy applications on public, private, or hybrid clouds. In this paper we study private and public clouds.
Punjabi Speech Synthesis System for Android Mobile Phones
Jagmeet Kaur, Parminder Singh
Mobile phone usage is approximately 3.5 times that of personal computers. Android has the biggest share among smartphone operating systems such as Symbian and Windows because it places very few restrictions on developers building applications. Text-to-speech (TTS) synthesis is an application that reads written text aloud. TTS systems on Android are available for many languages but not for Punjabi. Our present work is to develop a Punjabi text-to-speech synthesizer that can produce output speech on a mobile device. While porting this TTS system to a resource-limited device such as a mobile phone, practical aspects like application size and processing time are considered. The concatenative speech synthesis technique has been used, with phonemes as the smallest single units for concatenation.
Stock Market Behavior Prediction Using Pattern Matching Approach
Prakash Kumar Sarangi, Birendra Kumar Nayak
In this paper we propose a new model for predicting stock market behavior using a pattern matching approach. The fluctuation of the stock market is characterized by the numbers 0 and 1, '0' denoting a non-increasing state and '1' denoting an increasing state. The behavior of the stock market is thus put into a sequence of 0's and 1's, which is converted to a sequence of the nucleotides A, T, C, G. The sequence so obtained is matched against a DNA text sequence using BLAST. The matches are compared using Hamming distance to predict the future increasing or non-increasing behavior of the stock market. The possibility of using this approach to predict stock market behavior is explored.
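The encoding and comparison steps can be sketched as follows. The abstract does not specify the exact bit-to-nucleotide mapping, so the pairwise table below (two market states per nucleotide) is an assumption for illustration:

```python
def bits_to_dna(bits):
    """Map each pair of market states (0/1) to one nucleotide:
    00 -> A, 01 -> T, 10 -> C, 11 -> G (assumed mapping)."""
    table = {'00': 'A', '01': 'T', '10': 'C', '11': 'G'}
    return ''.join(table[bits[i:i + 2]] for i in range(0, len(bits) - 1, 2))

def hamming(s1, s2):
    """Number of positions at which two equal-length sequences differ."""
    return sum(a != b for a, b in zip(s1, s2))

# A week of non-increasing/increasing states becomes a short DNA string,
# which could then be compared against BLAST hits by Hamming distance.
market = bits_to_dna('00011011')       # -> 'ATCG'
distance = hamming(market, 'ATGG')     # one mismatched position
```

The actual candidate sequences would come from BLAST matches against real DNA text, which this sketch does not reproduce.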
Spell Checking and Error Correcting System for text paragraphs written in Punjabi Language using Hybrid approach
Amanjot Kaur, Dr. Paramjeet Singh, Dr. Shaveta Rani
A spell checker is a basic necessity for composing documents in any language. A spell checker is software that analyzes incorrect words and suggests the most probable correct words. Building a spell checker for an Indian language is a very challenging and uphill task. Punjabi is the world's 14th most used language and the mother tongue of more than 110 million people. Work on the Punjabi language is particularly challenging: on a computer a Punjabi word can be typed in different ways because the language has more than 40 different fonts. This paper describes the techniques used in a spell checker. Natural language processing (NLP) is a field of computer science concerned with the interactions between computers and human (natural) languages; modern NLP algorithms are based on machine learning, especially statistical machine learning.
Interactive Pick and Place Robot Control using Image Processing Method
Mr. Raskar D. S., Prof. Mrs. Belagali P. P.
In this paper we describe a straightforward technique for tracking a human hand based on images acquired by an active stereo camera system. We demonstrate the implementation of this method for a pick-and-place robot as part of a multi-modal man-machine interaction system: by detecting the hand position, the robot can interpret a human pointing gesture as the specification of a target object to grasp.
Review Based Entity Ranking using Fuzzy Logic Algorithmic Approach: Analysis
Pratik N. Kalamkar, Anupama G. Phakatkar
Opinion mining, also called sentiment analysis, is the field of study that analyzes people's opinions, sentiments, evaluations, appraisals, attitudes, and emotions towards entities such as products, services, organizations, individuals, issues, events, topics, and their attributes. The holistic lexicon-based approach does not consider the strength of each opinion, i.e., whether the opinion is very strongly negative (or positive), strongly negative (or positive), moderately negative (or positive), weakly negative (or positive) or very weakly negative (or positive). In this paper, we propose an approach to rank entities based on the orientation and strength of the entity's reviews and users' queries by classifying them into granularity levels (i.e. very weak, weak, moderate, strong and very strong), combining the opinion words (i.e. adverbs, adjectives, nouns and verbs) that relate to an aspect of interest of a certain product. We use a fuzzy logic algorithmic approach to classify opinion words into different categories, and syntactic dependency resolution to find relations for the desired aspect words. Opinion words related to certain aspects of interest are considered to find the entity score for that aspect in the review.
Corner Defect Detection Based On Inverse Trigonometric Function Using Image of Square Ceramic Tiles
Ravindra Singh, Gyan Chand Yadav
Today ceramic tiles are widely used in the construction of floors in homes, offices, shops and many other places. As user demand grows, the ceramic tile industry increases production, but maintaining quality control remains an important factor directly related to production. If quality is maintained manually, it takes a lot of time and some minor defects are missed. In this paper we detect corner defects of ceramic tiles using image processing and inverse trigonometric functions: if the angle of each corner of a square ceramic tile is equal to 90˚, the tile is normal. Our proposed algorithm is a component of an Upgraded Automatic Quality Maintaining Machine (UAQMM). With the proposed algorithm we increase the efficiency and decrease the total computation time.
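One plausible sketch of the inverse-trigonometric corner check is below: the corner angle is recovered as the arccosine of the normalized dot product of the two edge vectors meeting at the corner. The tolerance and the point coordinates are illustrative assumptions, not the paper's values:

```python
import math

def corner_angle(p, a, b):
    """Angle (degrees) at corner point p formed by edge points a and b,
    via the inverse cosine of the normalized dot product."""
    v1 = (a[0] - p[0], a[1] - p[1])
    v2 = (b[0] - p[0], b[1] - p[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def is_defective(angle_deg, tol=2.0):
    """A square tile corner should measure 90 degrees within tolerance."""
    return abs(angle_deg - 90.0) > tol

# Corner at the origin with edges along the x and y axes: a perfect corner.
angle = corner_angle((0, 0), (10, 0), (0, 10))
```

In a full pipeline the three points would come from detected corner and edge pixels in the tile image rather than being given directly.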
The Future Version of IP - IPV6
Mr. P Ravindra, Mr. Dr.V.V.Krishna Mr. S.Srinivasulu
In the process of Internet evolution, the transition from IPv4 to IPv6 has become inevitable and fairly urgent. The Internet Assigned Numbers Authority has finally exhausted the global IPv4 address space, which leaves the community no choice but to push forward the IPv6 transition process. IPv4 and IPv6 networks will both exist during the transition period, yet the two are not compatible in nature. It is therefore indispensable to maintain the availability of, as well as to provide inter-communication between, IPv4 and IPv6. Years ago a series of transition techniques were proposed; however, because of their technical immaturity, they failed to cover the solution space well, and some were even obsoleted by the IETF due to their flaws. This paper reconsiders the basic problems and key difficulties in the IPv4-to-IPv6 transition and introduces the principles of tunneling and translation techniques. The paper then surveys the mainstream tunneling and translation mechanisms raised since 1998.
Usability testing is the process by which the human-computer interaction characteristics of a system are measured and weaknesses are identified for correction. It is a process by which products are tested by those who will use the system. The needs for usability testing are someone who is a user of the design (or who acts like a user), something to test (a design in any state of completion), and someplace where the user and design can meet and be observed. Gathering usability testing requirements differs from gathering functional test requirements because they are much more loosely defined. Often many of these requirements are implicitly stated, and requirements vary depending on the business domain of the system and its target users. Studying the user work profile can help the test team decide how usability testing should be done. Usability testing is a kind of quality testing that assesses how easily the user interfaces can be used.
Wavelength Assignment in WDM Optical Networks
Mudasir Ali , Parminder Singh Saini
This paper analyses the wavelength assignment problem in WDM optical networks. The random wavelength assignment algorithm is compared with the first-fit wavelength assignment algorithm, and both strategies are also compared with the wavelength conversion case. The comparisons are made on the basis of blocking probability, keeping the number of channels and the number of links constant while varying the load per link (in Erlangs). The blocking probability on a link is calculated using the Erlang B formula. It is seen that the blocking probability for random wavelength assignment is always greater than for the first-fit algorithm. The blocking probability for WDM networks employing wavelength converters is the lowest, but converters increase the overall cost of the optical network.
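The Erlang B blocking probability used in these comparisons can be computed with the standard recurrence B(0) = 1, B(k) = A·B(k-1) / (k + A·B(k-1)), which avoids the factorials of the closed form. The traffic values below are illustrative only:

```python
def erlang_b(load_erlangs, channels):
    """Blocking probability from the Erlang B formula, via the standard
    recurrence B(0) = 1, B(k) = A*B(k-1) / (k + A*B(k-1))."""
    b = 1.0
    for k in range(1, channels + 1):
        b = load_erlangs * b / (k + load_erlangs * b)
    return b

# Blocking on a link offered 2 Erlangs: drops sharply as channels are added.
single = erlang_b(2.0, 1)   # = A / (1 + A) = 2/3 for one channel
five = erlang_b(2.0, 5)
```

Sweeping `load_erlangs` while holding `channels` fixed reproduces the kind of blocking-versus-load curves the paper compares across assignment strategies.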
Analyze & Classify Intrusions to Detect Selective Measures to Optimize Intrusions in Virtual Network
K. Naveen Kumar, Madan Pojala, P.Venkateswarlu Reddy
Cloud computing provides a rich set of features and allows users to temporarily install software and applications in a virtual machine (VM) to finish a task with software that is required but not available in the cloud. However, some attackers abuse this feature to introduce vulnerable applications into the VM. These applications are distributed over the virtual network and deny some services running on the VM that are unknowingly accessed by multiple users. To prevent vulnerabilities in VMs we introduce a network agent that periodically scans the VMs for vulnerabilities and reports them to an attack analyzer. The analyzer builds an attack graph to determine the attack type, and the network controller, with the help of a VM profiler, applies selective countermeasures to mitigate it.
The electric drive systems used in industrial applications are increasingly required to meet higher performance and reliability requirements. The DC motor is an attractive piece of equipment in many industrial applications requiring variable speed and load characteristics due to its ease of controllability, and microcontrollers provide a suitable means of meeting these needs. In this paper, the use of the AT89C51 microcontroller for speed control of a DC motor fed by a DC chopper has been investigated. The chopper is driven by a high-frequency PWM signal. Controlling the PWM duty cycle is equivalent to controlling the motor terminal voltage, which in turn directly adjusts the motor speed. A speed control system for a DC motor is developed around the AT89C51 microcontroller. The motor is operated in four quadrants, i.e. clockwise, counter-clockwise, forward brake and reverse brake, with speed control in each.
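The duty-cycle-to-voltage relationship described above reduces to V_avg = D · V_supply for an ideal chopper. The sketch below illustrates that arithmetic; the 24 V supply and the proportional no-load speed model are assumptions for demonstration, not measurements from the paper's setup:

```python
def chopper_output_voltage(v_supply, duty):
    """Average motor terminal voltage of an ideal DC chopper: V_avg = D * V_supply."""
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return duty * v_supply

def duty_for_speed(target_rpm, max_rpm):
    """Approximate duty cycle for a target speed, assuming speed is roughly
    proportional to average terminal voltage (idealized no-load model)."""
    return min(max(target_rpm / max_rpm, 0.0), 1.0)

# Halving the duty cycle halves the average terminal voltage,
# and (under the proportional model) roughly halves the speed.
v_avg = chopper_output_voltage(24.0, 0.5)
duty = duty_for_speed(1500, 3000)
```

A real drive would close the loop with speed feedback, since load torque and armature resistance make the speed-voltage relation only approximately linear.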
Eigenface Approach For The Recognition Of Face And Detection Of The Face
Er. Anup lal Yadav, Er. Sahil Verma, Er. Kavita
As one of the most successful applications of image analysis and understanding, face recognition has received significant attention, especially during the past few years, and it is attracting much interest in the network multimedia information access community. Areas such as network security, content indexing and retrieval, and video compression benefit from face recognition technology because "people" are the center of attention in a lot of video. Network access control via face recognition not only makes it virtually impossible for hackers to steal one's "password", but also increases the user-friendliness of human-computer interaction. Indexing and retrieving video data based on the appearance of particular persons is useful for users such as news reporters, political scientists, and moviegoers. For videophone and teleconferencing applications, face recognition also enables a more efficient coding scheme. In this paper, we give an introductory overview of this new information-processing technology.
Ideas for Some Improvement in A.C.O. On The Behalf Of Presently Working A.C.O.
Er. Anup lal Yadav, Er. Sahil Verma, Er. Kavita
Various natural systems teach us that very simple individual organisms can create systems able to solve complex problems such as optimization. Ant Colony Optimization (ACO) is an agent-based technique which simulates the natural behavior of ants and develops mechanisms of cooperation and learning. Ant Colony Optimization is a metaheuristic approach for solving hard combinatorial optimization problems. The main idea of ACO is to model a problem as the search for a minimum-cost path in a graph. Ant Colony Optimization has been successfully applied to scheduling, vehicle routing, and sequential ordering problems. A review of several Ant Colony Optimization algorithms is done in this paper.
A Review of Various Encryption Techniques
Harshraj N. Shinde, Aniruddha S. Raut, Shubham R. Vidhale, Rohit V. Sawant, Vijay A. Kotkar
In today's world the secure transmission of important or personal data is of great concern, and encryption plays an important role in securing information while it is being sent. Encrypting data means converting plain text to cipher text. Many encryption techniques are available, but the big question is which one is best, or which is suitable for given requirements. In this paper we analyze various encryption techniques, viz. AES, BLOWFISH, DES, 3DES, RC4, RC6, RSA, UMARAM and UR5.
Web services have been playing a significant role in application development and integration. SOA enables the development of flexible large-scale applications in open environments by combining web services. Many web services exist which have similar functional characteristics, and consumers need facilities for selecting the required web services according to their functional and QoS characteristics. An important issue is how to conveniently, accurately and efficiently retrieve web services from large-scale repositories. The goal of the multi-agent approach is to support web service discovery with QoS registration, verification, certification, and confirmation. In this paper, we propose a novel approach for designing and developing a multi-agent-based architecture and its QoS-based matching, ranking, feedback and selection algorithms for evaluating web services.
Revised Reliable Algorithm with Results & Algorithm for Thinning Numeral Pattern
Er. Anup, Er. Sahil Verma, Dr. Kamal Sharma
Thinning algorithms have played an important role in the preprocessing phase of OCR systems. Many algorithms for vectorization by thinning have been devised and applied to a great variety of pictures and drawings for data compression, pattern recognition and raster-to-vector conversion. The vectorization algorithms often used in pattern recognition tasks also require one-pixel-wide lines as input, but parallel thinning algorithms which generate one-pixel-wide thinnings can have difficulty preserving the connectivity of an image or can generate spurious branches. A few of the most common thinning algorithms have been implemented and evaluated on the basis of performance parameters.
Comparative analysis of image quality assessment using HVS Based Model
Dr. Anil Panghal, Puja Chugh A.P
Image quality assessment means estimating the quality of an image, and it is used in many image processing applications. Image quality can be measured in two ways: subjective and objective methods. In this paper, we focus on full-reference (FR) objective image quality metrics, where the quality of a distorted test image is obtained by comparison with a reference image which is assumed to be perfect in quality. We evaluate the MSSIM IQA metric for colour images at different noise density levels.
Hyperspectral image classification has been used for many purposes in remote sensing, vegetation research, environmental monitoring and land cover classification. A hyperspectral image consists of many layers, each of which represents a specific wavelength. This paper aims to classify hyperspectral images to produce an accurate thematic map. Spatial information in the hyperspectral images is collected by applying morphological profiles and local binary patterns. The support vector machine (SVM) is an efficient algorithm for classifying hyperspectral images, and a genetic algorithm is used to obtain the best feature subset for classification. The classes and thematic map are generated using feature extraction. Experiments are carried out with the AVIRIS Indian Pines and ROSIS Pavia University datasets. The method achieves an accuracy of 93% for Indian Pines and 92% for Pavia University.
In this paper we present the concepts and methods developed for car-to-car communication systems intended to avoid vehicle crashes. Among such communication systems, the Controller Area Network (CAN) is a computer network protocol and bus standard designed to allow microcontrollers and devices to communicate with each other without a host computer. Modern automobiles are no longer mere mechanical devices; they are pervasively monitored and controlled by dozens of digital computers coordinated via internal vehicular networks. We use the CAN to evaluate the feasibility of employing such an in-vehicle network in modern cars for cooperative driving. A CAN module and multiple sensors are added to the system for passenger and vehicle safety. Cooperative and safe driving is achieved by letting drivers analyze the data of nearby vehicles on their dashboard.
Design & Implementation of Binary Phase Shift Keying Demodulation through phase locked loop
Roopa.V, R.Mallikarjuna Setty
This paper describes BPSK demodulation using a Costas loop implemented on a DSP kit. The demodulation process can be divided into three major subsections. First, since the incoming waveform is suppressed-carrier in nature, coherent detection is required; the methods by which a phase-coherent carrier is derived from the incoming signal are termed carrier recovery. Next, the raw data are obtained by coherent multiplication and used to derive clock-synchronization information. The raw data are then passed through a filter, which shapes the pulse train so as to minimize intersymbol-interference distortion effects. This shaped pulse train is then routed, along with the derived clock, to the data sampler, which outputs the demodulated data. The code for the PLL and the Costas loop is written in C and implemented on the ADSP 21060 DSP kit [4,5]. The various input and output plots are observed on the SHARC (Super Harvard Architecture) simulator and also on the emulator.
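The coherent-multiplication stage can be sketched as follows. This sketch idealizes the Costas loop's carrier recovery by assuming the carrier phase is already known, and the carrier frequency, sample rate, and bit pattern are illustrative assumptions:

```python
import math

def bpsk_modulate(bits, fc, fs, samples_per_bit):
    """Suppressed-carrier BPSK: bit 1 -> +cos, bit 0 -> -cos."""
    out = []
    for i, bit in enumerate(bits):
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / fs
            out.append((1 if bit else -1) * math.cos(2 * math.pi * fc * t))
    return out

def bpsk_demodulate(samples, fc, fs, samples_per_bit):
    """Coherent detection: multiply by the (assumed recovered) carrier,
    integrate over each bit period, and decide each bit from the sign."""
    bits = []
    for i in range(0, len(samples), samples_per_bit):
        acc = 0.0
        for n, s in enumerate(samples[i:i + samples_per_bit]):
            t = (i + n) / fs
            acc += s * math.cos(2 * math.pi * fc * t)
        bits.append(1 if acc > 0 else 0)
    return bits

# 1 kHz carrier sampled at 8 kHz, one full carrier cycle per bit.
bits = [1, 0, 1, 1, 0]
rx = bpsk_demodulate(bpsk_modulate(bits, 1000.0, 8000.0, 8), 1000.0, 8000.0, 8)
```

In the paper's system the reference carrier would instead come from the Costas loop's phase tracking, and the integrate-and-dump would be replaced by the pulse-shaping filter and clocked sampler described above.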
Battery Capacity Management in Wireless Sensor Network Rechargeable Sensor Nodes
Mrs. Anasuya N. Jadagerimath
A Wireless Sensor Network (WSN) consists of distributed autonomous sensors that monitor physical or environmental conditions and cooperatively pass their data through the network to a main location. Each node is capable of computation, communication and sensing. Size and cost constraints on sensor nodes result in corresponding constraints on resources such as energy and memory. Sensor node battery capacity management considerably eases the challenges of recharging nodes and prolonging WSN lifetime. Allocating energy according to the node's charge level along with energy harvesting, optimally choosing the node sampling rate, and timing the use of natural energy sources are methods by which node battery capacity can be maintained in an energy-efficient way, saving energy during WSN operation and data transmission by sensor nodes. This paper gives an overview of energy saving in WSNs, objectives leading to the conservation of power in wireless nodes, challenges and constraints in WSN operation, and methods to maintain the battery reserves of WSN sensor nodes.
Neural control/interfacing/interaction is a powerful means of developing a robust bridge between humans and machines. In this paper we emphasize neural interfacing as an evolving trend in wireless communications by considering one of its important applications, i.e. cyborgs. A cyborg is a cybernetic organism (i.e. an organism that is a self-regulating integration of artificial and natural systems). In the second half of the paper we discuss the operational features of cyborgs. In an attempt to promote greater interaction between humans and computers, companies that develop (cybernetic) robotics technologies engage in a variety of seductive strategies that embody the cyborg discourse. Some of these strategies persuade individuals to concede to particular philosophies, such as the argument that technical artifacts and instrumental reasoning are necessary for effective social development. We review the experiments conducted and those proposed for the future, and in the process give a brief description of the advantages and disadvantages of this technology.
This paper focuses on approaches and techniques for online promotion in web marketing, including search engine optimization (SEO), online advertisements and new methods of increasing the visitors to a website. In the last decade, search engine marketing has emerged as one of the fastest growing industries. Search engine marketing is categorized as paid and non-paid: paid marketing includes pay-per-click (PPC) and paid inclusion, while non-paid marketing includes natural SEO and the link popularity index, in which search engines analyze a web page, including its title, content and keyword density, and come up with a ranking for where to place the result on the pages. This is still a relatively new market, and most companies lack the knowledge or the tools to manage their marketing strategies in order to maximize the return on their investments (ROI). These techniques can be effectively applied to the technical aspects of managing paid and non-paid search engine marketing required in the Service Oriented Architecture (SOA) dimension. In this paper an attempt has been made to analyse the following factors: online promotion techniques, search engine optimization, on-page optimization, paid and non-paid search marketing, and the factors that affect SEO.
Switching attacks in wireless networks using denial of service phenomenon
Srinivas Kolli, B. V. Srikanth, Dr.P.Venkateswarlu
Sensor networks offer economically viable solutions for a variety of applications. For example, current implementations monitor factory instrumentation, pollution levels, freeway traffic, and the structural integrity of buildings. Other applications include climate sensing and control in office buildings and home environmental sensing systems for temperature, light, moisture, and motion. Ad-hoc low-power wireless networks are an exciting research direction in sensing and pervasive computing. Prior security work in this area has focused primarily on denial of communication at the routing or medium access control levels. This paper explores resource depletion attacks at the routing protocol layer, which permanently disable networks by quickly draining nodes' battery power. These "vampire" attacks are not specific to any particular protocol, but rather rely on the properties of many popular classes of routing protocols. We find that all examined protocols are susceptible to vampire attacks, which are devastating, difficult to detect, and easy to carry out using as few as one malicious insider sending only protocol-compliant messages.
A Comprehensive Analysis of Cloud Computing Including Security Issues and Overview of Monitoring
Deepika Upadhyay, Sanjay Silakari, Uday Chourasia
Cloud computing is a widely adopted paradigm nowadays. The primary reasons more organizations are adopting the cloud are cost reduction and dynamic resource allocation. Characteristics such as scalability, elasticity, multi-tenancy, and the pay-per-use approach also make cloud computing the most wanted and widely popular paradigm today. But along with these characteristics, the cloud inherits some serious issues such as insider attacks, security, and reliability. The cloud is deeply affected by malicious attacks; for example, in 2009 Google was hit by a DoS intrusion that took down cloud services such as Google News and Gmail for several days. So the major focus and challenge is securing the cloud: because a huge number of users and IT organizations implement cloud services, gaining users' trust is very important. Cloud monitoring helps to properly manage and control these issues in an efficient way. This paper presents a brief analysis of cloud computing, security risks in the cloud, and an overview of cloud monitoring. Current cloud monitoring platforms are surveyed based on evaluation parameters and services. Finally, research directions are outlined for cloud monitoring and attack detection using monitoring.
Extended self destructing data system with data recovery
Vinayaka R H, Dayananda P, Shwetha S
Personal data stored in the cloud may contain account numbers, passwords, notes, or other important information that could be used and misused by a hacker, a competitor, or a court of law. These data are cached, copied, and used by Cloud Service Providers (CSPs), often without the user’s authorization and control. SeDas mainly aims at protecting the privacy of user data. All these data and their copies become destructed or unreadable after a user-specified time, without any user intervention; in addition, the decryption key is destroyed after the user-specified time. SeDas, a system that meets this challenge through a novel integration of cryptographic techniques with active storage techniques based on the T10 OSD (Object Storage Device) standard, provides a recovery mechanism that lets legitimate users obtain their data back by requesting it from the cloud admin. A new key is sent to the legitimate user by e-mail or to a mobile device; using this key, the user logs in to the SeDas platform to get the data back. This approach is therefore more efficient to use and achieves all the privacy-preserving goals.
New forms of scholarship are emerging in the world. These aim at providing freely accessible research materials through scholarly communication. There are two common ways of providing open access to research outputs: through open access journals and through discipline- or institution-based open access repositories, that is, Open Access Institutional Repositories (OAIR) driven by open access publishing. In principle, open access to research outputs maximizes research access and thereby also research impact, making research more productive and effective. Today, institutional repositories are becoming major components of the technical infrastructure of successful research-based institutions. This trend is now observed in most universities, and in particular in research-based institutions in developing countries. Recently, there has been a high uptake of institutional repositories by higher learning institutions in Tanzania. Despite this, no study on the status of open access publications in the country has been undertaken.
This paper provides an overview of characteristics of open access repositories as a global publishing concept. It then applies the same, as a case study, to review open access publications in Tanzania, and summarizes the status of a growing body of evidence on adoption and usage of open access publications in the country. It also assesses the contribution of Tanzanian institutional repositories in the scholarly communication. In the assessment, performance of Tanzanian institutional repositories, as reflected through global visibility and impact of their repositories in the Directory of Open Access Repositories (OpenDOAR), is examined. In addition, the performance of Tanzanian universities in archiving and sharing research findings through institutional repositories, based on the Ranking Web of Repositories (RWR) is examined.
Findings from the examination of the identified open access repositories reveal that, of the 5 Tanzanian institutional repositories in the study, only 2 are listed in the RWR. These two are ranked at the 1060th and 1362nd positions (out of 1983) in the world ranks as at July 2014. This implies that only 40% of the identified repositories in Tanzania are visible and incorporate good practices in their web publications. It is also revealed that most studies have focused not on open access institutional repositories, but on the factors contributing to the adoption of open access scholarly communication in the country; hence the relevance of the research reported in this paper.
Finger Vein Detection Combining Segmentation, Gabor Filter and Matched Filter
Amandeep kaur
This paper proposes a method of personal identification based on finger-vein patterns. A camera captures the finger under transmitted IR light, thereby capturing the finger-vein pattern. We systematically develop a new approach to finger-vein feature extraction using repeated line tracking, Gabor filters, and a matched filter. The result of combining these methods is more accurate than the result of any individual method. The aim of this paper is to investigate finger-vein technology.
A Survey on Routing Protocols Using TCP Variants over MANETs
Manpreet Kaur, Dr. Sandeep Singh Kang
Ad hoc networks are characterized by multi-hop wireless connectivity and frequently changing network topology, so efficient dynamic routing protocols play an important role. We compare the performance of two prominent table-driven routing protocols for mobile ad hoc networks: the Destination-Sequenced Distance-Vector (DSDV) routing protocol and the Optimized Link State Routing (OLSR) protocol. A detailed simulation model with MAC and physical layer models is used to study the interlayer interactions and their performance implications. This paper demonstrates that even though OLSR and DSDV share similar proactive behaviour, the differences in the protocol mechanisms can lead to significant performance differentials. The two protocols are evaluated on packet delivery ratio, normalized routing load, normalized MAC load, and average end-to-end delay, varying the number of sources, speed, and pause time. Reliable transport protocols such as TCP are tuned to perform well in traditional networks where packet losses occur mostly because of congestion. However, networks with wireless and other lossy links also suffer from significant losses due to bit errors and handoffs. TCP responds to all losses by invoking congestion control and avoidance algorithms, resulting in degraded end-to-end performance in wireless and lossy systems. This paper therefore also compares several schemes designed to improve the performance of TCP in such networks.
A Metric-Based Calculation for Object Oriented Software Modularization Quality Measurement
Ruchi Kulkarni, Samidha Diwedi Sharma
Adopting modularization is a major concern in software development and maintenance. Measuring design quality early during software development has been regarded as a prominent way to assure the quality of software products. Several models have been proposed to estimate the quality of software systems.
This work proposes an approach for determining the design quality of object-oriented software using software metrics. The metrics for object-oriented design focus on measurements that are applied to class and design characteristics. To validate the proposed methodology, we have chosen an open source software project and extracted a set of software metrics that play a definite role in software design quality. Our metrics characterize the quality of modularization with respect to the APIs of the modules. The percentile average values calculated by these metrics provide a straightforward way to assign a design quality to any software system.
For the simulation of this work, an application was developed in Java. The system examines the modularization quality of OO software by measuring the extent to which a class in a module uses a class in some other module, and the extent of inter-module call traffic created by inheritance. Experiments are carried out on different software versions, and the results show that the approach works properly. The outcomes of the experimental study provide a strong basis for the effectiveness of our system for metric-based design quality measurement of object-oriented software.
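The inter-module coupling idea described above — the fraction of class-to-class calls that cross module boundaries — can be sketched as follows. This is an illustrative Python sketch with hypothetical class/module data, not the authors' Java implementation:

```python
# Hypothetical mapping of classes to modules, and observed (caller, callee) calls.
module = {"A": "m1", "B": "m1", "C": "m2", "D": "m2"}
calls = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]

def inter_module_ratio(calls, module):
    """Fraction of calls whose caller and callee live in different modules."""
    cross = sum(module[a] != module[b] for a, b in calls)
    return cross / len(calls)

# Two of the four calls (A->C and B->D) cross a module boundary.
print(inter_module_ratio(calls, module))  # 0.5
```

A lower ratio indicates better modularization, since most call traffic stays within module APIs.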
Mining High Utility Items from Transactional Databases Using a Systolic Algorithm
C.Mamatha Devi, M. Bhargavi
Utility mining is an emerging topic in data mining: efficient mining of high utility itemsets plays an important role in many real-life applications and is an important research issue. The UP-Growth and UP-Growth+ algorithms store information about high utility itemsets so that candidate itemsets can be generated efficiently with only two scans of the database; they prune candidates based on estimated utility values and on transaction-weighted utilization values. However, this consumes a lot of memory and time. To overcome this drawback, we propose a systolic algorithm that computes single-transaction and tree-mining steps; compared with UP-Growth, the systolic algorithm takes less time.
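The transaction-weighted utilization (TWU) pruning mentioned above can be illustrated with a minimal sketch: the utility of each transaction is summed over the items it contains, and items whose TWU falls below a minimum utility threshold can be pruned by the downward-closure property. The transactions and profit table here are hypothetical, and this is not the UP-Growth implementation itself:

```python
from collections import defaultdict

# Each transaction maps item -> purchased quantity; profit is per-unit (hypothetical).
transactions = [{"a": 2, "b": 1}, {"a": 1, "c": 3}, {"b": 2, "c": 1}]
profit = {"a": 5, "b": 3, "c": 2}

def transaction_utility(t):
    """Total utility (quantity x unit profit) of one transaction."""
    return sum(q * profit[i] for i, q in t.items())

# TWU(item) = sum of the utilities of all transactions containing the item.
twu = defaultdict(int)
for t in transactions:
    tu = transaction_utility(t)
    for item in t:
        twu[item] += tu

print(dict(twu))  # {'a': 24, 'b': 21, 'c': 19}
```

Any item with TWU below the chosen minimum utility cannot appear in a high utility itemset, which is what allows candidate pruning after the database scans.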
We review microwave structures containing ‘left-handed’ metamaterials, artificial composites with simultaneously negative effective permittivity and permeability that achieve negative values of refractive index, and then discuss some of the salient electromagnetic features of these metamaterials. This is followed by a description of some ideas regarding potential future applications of metamaterials in devices and components.
A Review on Software Testing in SDLC and Testing Tools
T.Amruthavalli, S.MahaLAkshmi, K.HariKrishnan
This paper reviews software testing and the different tools used for testing software. The testing is based on methods for its attributes, security, and usability. Testing is an important part of software engineering, where code gets deployed based on the testing, and it can be done with different parameters of different types. I do not mean to give here a complete survey of software testing. Rather, I intend to show the unwieldy mix of theoretical and technical challenges faced by testers between the state of the art and the state of practice.
Solving Knapsack Problem Using Constraint Programming
Vaishali Cooner
A logical relation among several unknown variables is known as a constraint, where each variable takes a value in a given domain. The basic idea behind the constraint programming framework is to model the problem as a set of variables with domains and a set of constraints. The knapsack problem has been studied since 1897, i.e. for more than a century. In this paper we consider a problem that represents the basic features of constraint programming, and we use constraint programming to solve the knapsack problem. Our goal is to select items with the greatest total value under a limit on the total weight.
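The item selection described above can be sketched as a small constraint-style backtracking search: branch on whether to take each item, and prune any branch that violates the weight constraint. This is a minimal illustration with a hypothetical instance, not the paper's solver:

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack by backtracking; the weight limit acts as the pruning constraint."""
    best = 0
    n = len(values)

    def search(i, weight, value):
        nonlocal best
        best = max(best, value)
        if i == n:
            return
        # Constraint: total weight must stay within capacity.
        if weight + weights[i] <= capacity:
            search(i + 1, weight + weights[i], value + values[i])  # take item i
        search(i + 1, weight, value)                               # skip item i

    search(0, 0, 0)
    return best

# Classic toy instance: the best choice takes the items worth 100 and 120.
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220
```

A full constraint programming system would add domain propagation and bounds, but the model — variables, domains {take, skip}, and a weight constraint — is the same.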
A Review of Various Gesture Recognition Techniques
Vaibhavi S. Gandhi, Akshay A. Khond, Sanket N. Raut, Vaishali A. Thakur, Shabnam S. Shaikh
Gesture recognition is a subfield of Human Computer Interaction (HCI). Human Computer Interaction has become a very attractive field in recent years. Conventional devices such as the mouse, keyboard, and joystick can now be replaced by touch-free technologies. To achieve a touch-free environment for interacting with computer systems, various algorithms and methodologies have been proposed. In this paper we survey methodologies previously proposed for hand gesture recognition and compare these techniques on the basis of parameters such as accuracy, robustness, and complexity. The evolution of hand gesture technology from glove-based sensing to the most recent model-based sensing is explained together with the advantages and limitations of each. Vision-based sensing has the advantage that no hardware component is required, and it also gives the benefit of directly using natural hand motion. For vision-based sensing, the latest approaches are listed on the basis of correctness and user friendliness.
Effect on Relaxation Time and Polarizability in Electric Field Induced Nucleation Processes
N. Singh, R. S. Chauhan
An increase in the polarizability of water vapour molecules during the nucleation of water vapour condensation and ice glaciation results in an increase of the Gibbs free energy and hence an increase in the nucleation rate, but at the same time a decrease in the relaxation time; the effective polarizability varies nearly inversely with the absolute temperature.
Radiation absorption effect on MHD, free convection, chemically reacting visco-elastic fluid past an oscillatory vertical porous plate in slip flow regime
M. C. Raju, B. Vidyasagar, S.V. K. Varma, S. Venkataramana
In this paper an analysis is presented to investigate the influence of radiation absorption, chemical reaction and heat source effects on hydromagnetic free convection heat and mass transfer flow of a visco-elastic fluid through a porous medium bounded by an oscillating porous plate in the slip flow regime with constant suction and a temperature-dependent heat source. A uniform magnetic field of strength B0 is applied at an angle α with the flow direction. Analytical solutions for velocity, temperature and concentration are obtained. Skin friction and the rates of heat and mass transfer are also derived. The results have been analyzed and presented graphically for various values of the flow parameters. It is observed that velocity increases with an increase in the porosity parameter, radiation absorption, Grashof number and modified Grashof number, and decreases with an increase in the magnetic parameter, Schmidt number, chemical reaction parameter and Prandtl number.
MPLI: A Novel Modified Parametric Location Identification for AODV in MANET
Akash Karma, Jitendra Choudhary
A wireless ad hoc network provides a short-range communication medium for mobile devices. In such networks the components satisfying infrastructural needs are not present, and each function must be performed by the nodes themselves, which lets them work as routers. Routing is one of the key functions, performed by specially designed lightweight protocols in these ad hoc networks, and it suffers from various issues including route discovery, bandwidth management, congestion, location detection, energy-efficient operation, and link handling. These and many other functions depend mainly on the positions of the mobile nodes. Over the past few years, location-based networking technologies have changed very abruptly, with horizontal and vertical growth in the number of applications and users. This dynamic change in topologies makes accurate location resolution with a limited number of parameters more difficult. This paper proposes a novel MPLI (Modified Parametric Location Identification) approach. Apart from using only the x and y coordinates, the suggested work adds more values, including angle of arrival, time, distance, and circular region quadrants, for accurate detection. It also provides timely updates of positions so as to make routing more robust and position-aware, avoiding data losses and connection termination due to mobility.
An Overview on Hybrid Cloud as an IT Service Broker
P. Soumya Sree Laxmi
Today the growth of cloud computing is creating a new role in corporate IT: that of a service broker. A hybrid cloud gives its customers a choice of where to deploy a workload, while performance, security and utility remain major concerns for many IT organizations. When you decide to move to the cloud, one of your main aims is probably to shift work from your own systems to someone else’s and to save money; yet as work increases, the cost of getting that work done also increases. A hybrid cloud delivery model can operate as a broker of IT services, giving you more flexibility to best match your business’s needs while providing service on demand. The advantage of such a hybrid cloud deployment is that an organization only pays for extra compute resources when they are needed. We look into the architecture and the reasons why hybrid clouds are emerging as better service brokers for businesses and customers.
Implementation of Personalized mobile search engine based on Ontology
Ms.Namrata G Kharate
A mobile search engine is a program that searches for and identifies items in a database that correspond to keywords or characters specified by the user, used especially for finding particular sites on the World Wide Web. A major problem in mobile search is that the interactions between users and search engines are limited by the small form factors of mobile devices. As a result, mobile users tend to submit shorter, hence more ambiguous, queries compared to their web search counterparts. In order to return highly relevant results to the users, mobile search engines must be able to profile the users’ interests and personalize the search results according to the users’ profiles. We present a personalized mobile search engine (PMSE) that captures the users’ preferences in the form of concepts by mining their clickthrough data. Due to the importance of location information in mobile search, PMSE classifies these concepts into content concepts and location concepts. The user preferences are organized in an ontology-based, multifaceted user profile, which is used to adapt a personalized ranking function for rank adaptation of future search results. To characterize the diversity of the concepts associated with a query and their relevance to the user’s need, four entropies are introduced to balance the weights between the content and location facets. Based on the client-server model, a detailed architecture and design for the implementation of PMSE is also presented. In the implemented system, the client collects and stores the clickthrough data locally to protect privacy, whereas heavy tasks such as concept extraction, training, and reranking are performed at the PMSE server. The user gets information depending on the query and the nearby location.
Identification of Malnutrition Using Supervised Data Mining Techniques – Decision Trees and Artificial Neural Networks
D.Thangamani, P.Sudha
In today’s modern world, globalization, demographic transition, lifestyle changes, and dietary meal patterns influence people’s nutrition. This work analyses malnutrition based on food intake, wealth index, age group, education level, occupation, etc. The objective is to use effective supervised machine learning techniques, decision trees and artificial neural networks, to classify a family health survey dataset. Classification and prediction techniques provide appropriate and flexible methods for processing large amounts of data for accurate malnutrition detection and prevention over the survey dataset. The result of applying supervised data mining techniques to the nutrition database is the nutrition status of children under five years of age. This work is useful for improving the nutrition level of public health with the help of government health services.
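The core of decision-tree induction used in such classification is choosing the split attribute with the highest information gain. A minimal sketch in pure Python follows; the tiny survey-style rows (wealth index, diet quality, malnourished label) are hypothetical, not the paper's dataset:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the rows on attribute index `attr`."""
    base = entropy(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return base - remainder

# Hypothetical rows: (wealth_index, diet) with label "malnourished yes/no".
rows = [("low", "poor"), ("low", "good"), ("high", "poor"), ("high", "good")]
labels = ["yes", "yes", "no", "no"]

# Splitting on wealth_index (attribute 0) perfectly separates the classes.
print(information_gain(rows, labels, 0))  # 1.0
print(information_gain(rows, labels, 1))  # 0.0
```

A decision tree repeats this choice recursively on each split's subset until the leaves are pure or a stopping criterion is met.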
CFD Analysis & Experimental Study on Heat Transfer Enhancement by various shapes of wings and Material with Forced Convection
Snehal C. Kapse, Dr. R.R Arakerimath
In recent years, vortex generators such as fins, notches, and wings have been successfully used for heat transfer enhancement in modern thermal systems such as dryers and electronic equipment. The aim of the present study is to investigate the heat transfer coefficient of rectangular plates using various shapes, such as spherical wings, tubular wings, and a bare plate, in different materials (copper, brass and mild steel plates), comparing the results using CFD analysis and developing a mathematical model of the results. A forced convection experimental setup is therefore built to study parameters such as the heat transfer coefficient, Reynolds number and Nusselt number.
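The Nusselt number mentioned above relates the convective coefficient h to the fluid thermal conductivity k and a characteristic length L through Nu = hL/k, so h can be recovered from a measured or computed Nu. A minimal sketch follows; the numeric values are hypothetical illustrations, not the study's data:

```python
def heat_transfer_coefficient(nusselt, k, length):
    """h = Nu * k / L (W/m^2.K), from the definition of the Nusselt number."""
    return nusselt * k / length

# Example: Nu = 50, air with k = 0.026 W/m.K, plate length 0.1 m.
h = heat_transfer_coefficient(50, 0.026, 0.1)
print(round(h, 2))  # 13.0 W/m^2.K
```

The same relation works in reverse when correlating experimental h values into a Nusselt–Reynolds model.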
Removal of High Density Salt and Pepper Noise from Digital Images
Sruthi Ignatious
This paper compares efficient filters for the restoration of images that are corrupted by a high density of Salt and Pepper noise. There are several non linear filters for the restoration of the images. Among these, median filters are the most popular and powerful technique for the removal of salt and pepper noise. Different variations of median filters are now available. These variations outperform the standard median filter. These filtering techniques include two stages, such as noise detection stage and noise filtering stage. The noise detection stage will identify the noisy pixel within the image and noise filtering stage will filter the image. The performance qualities of different filters are measured using Peak-Signal to Noise Ratio (PSNR) and Image Enhancement Factor (IEF) value.
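The two-stage idea above can be illustrated with the standard median filter and the PSNR measure. This is a minimal pure-Python sketch on a toy 3x3 image (not one of the paper's filter variants):

```python
import math
from statistics import median

def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3 neighbourhood."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(window)
    return out

def psnr(ref, img, peak=255):
    """Peak signal-to-noise ratio between two equal-sized grayscale images."""
    n = len(ref) * len(ref[0])
    mse = sum((a - b) ** 2 for ra, rb in zip(ref, img) for a, b in zip(ra, rb)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

# A single "salt" pixel (255) in a flat region is removed by the median.
noisy = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]
print(median_filter_3x3(noisy)[1][1])  # 10
```

The improved variants the paper compares differ mainly in the detection stage, i.e. in filtering only pixels flagged as noisy rather than every pixel.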
A wireless sensor network consists of a large number of small sensing nodes that have sensing, computational, and transmission power. Due to the insecure nature of wireless communication, these networks are vulnerable to internal and external attacks. Moreover, routing protocols are designed with power consumption, not security, as the goal, even though security plays an important role in the ability to deploy and retrieve trustworthy data from a WSN. This paper introduces the main kinds of routing protocols with their advantages and disadvantages. We also present a survey of attacks and secure routing protocols, which establishes the present status of security in WSNs, and we introduce the concept of multipath routing in WSNs to provide secure and reliable communication. At the end we propose a solution for WSN security.
A Comprehensive Paper on Performance Analysis between AODV & DSDV Routing Protocol
Mr. Raj Kumar Singh, Mr. Rakesh Kumar Khare
Keeping the routing overhead minimal is the key to designing an efficient routing protocol for a mobile ad hoc network (MANET). This type of network is an infrastructure-less, self-configuring and decentralized set of mobile nodes. The nodes move at different speeds in independent random fashion, connected by any number of wireless links, where each node is ready to pass or forward both data and control traffic unrelated to its own use (routing) to other nodes, in a flexible interdependence of wireless communication. This is in contrast to infrastructure wireless networks, where communication between network nodes takes place through a special node known as an access point, and to wired networks, in which the routing task is performed by dedicated devices called routers and switches. In this paper we compare the performance of AODV and DSDV using two parameters: PDR and end-to-end delay.
Preparation of silver nanoparticles by pulsed laser ablation in liquid medium
Leena F. Hamza, Dr.Issam M.Ibrahim
In the present work silver nanoparticles have been prepared by pulsed laser ablation (Q-switched Nd:YAG, 1064 nm, pulse energies E = 100 mJ to 400 mJ) of a pure Ag metal plate immersed in distilled water and deionized water. The synthesized nanoparticles are characterized using transmission electron microscopy (TEM) and a UV-VIS spectrophotometer. The effects of pulse energy and number of shots are reported. The silver nanoparticles exhibited a surface plasmon resonance effect at wavelength λspr = 400 nm. Pure and spherical Ag nanoparticles with average sizes in the range 1-15 nm were produced; all size measurements were confirmed by TEM.
This paper is intended as an overview of Linux security and the different threats to servers, networks and workstations. SELinux, an implementation of Linux Security Modules (LSM), implements several measures to prevent unauthorized system usage. Security is a very broad concept, and so is the security of a system. All too often, people believe that a system is far more secure than it is in practice, but the biggest problem is still the human factor: the possibility of careless or malicious users is commonly overlooked. Finally, this paper concludes with a list of some common vulnerabilities, attacks and countermeasures.
Review On: Fractal Antenna Design Geometries and Its Applications
Ankita Tiwari, Dr. Munish Rattan, Isha Gupta
In this review paper, we provide a comprehensive review of developments in the field of fractal antenna engineering. First we give a brief introduction to fractal antennas, and then proceed with their design geometries and their applications in different fields. It is shown how to quantify the space-filling ability of fractal geometries, and how this correlates with the miniaturization of fractal antennas.
Smart Home Implementation Using Data Mining
Gayatri D. Kulkarni, Priyanka V. Gode, Jadi Pratapreddy,Madhura H. Deshmukh, Nitin R. Talhar
In this project we attempt to design a home that acts as an intelligent agent. This paper mainly focuses on developing a system which will automate all domestic devices. Datasets are built for analysis purposes. Voice is recorded and then converted into textual form; the text patterns are then matched against the provided datasets. Datasets consisting of different conditions are the initial input to the system. According to the input provided at a given instant, the corresponding action is performed by the system.
Enhanced Cluster-Based Distributed Fault Tolerance Algorithm for Mobile Nodes in WSN
Avrinderpal Kaur, Er. Upasna Garg
A wireless sensor network (WSN) is a wireless network consisting of spatially distributed autonomous devices using sensors to monitor physical or environmental conditions. A WSN system incorporates a gateway that provides wireless connectivity back to the wired world and distributed nodes.
Clustering means connecting two or more computers together in such a way that they behave like a single computer. Clustering is used for parallel processing, load balancing and fault tolerance. In WSNs, the clustering technique organizes the network around a small set of cluster heads, which gather data from their local clusters, aggregate the data, and transmit it to the base station. Fault tolerance techniques attempt to prevent lower-level errors from propagating into system failures. By using various types of structural and informational redundancy, such techniques either mask a fault or detect a fault and then effect a recovery process which, if successful, prevents a system failure. In this paper we review and compare existing clustering techniques using performance factors such as time complexity, node mobility, and cluster count.
Modelling Reservoir Operation Using Multiple Regression & Artificial Neural Network
S.S.Khare, Dr. A.R.Gajbhiye
Reservoir operation frequently follows a conventional policy based on guide curves that prescribe reservoir releases. Operating policies can be derived using systems techniques such as simulation, optimisation, or a combination of the two. In recent years, artificial intelligence techniques such as the Artificial Neural Network (ANN) have emerged as an alternative to overcome some of the limitations of conventional methods. In most studies, a feed-forward structure and the back-propagation algorithm have been used to design and train the ANN model, respectively.
A case study of the Wadgaon reservoir is considered. On the basis of the available data and observations, simulation-based multiple regression (MLR) modelling and Artificial Neural Network (ANN) modelling are carried out. Forty-seven years of 10-daily inflow data and other relevant data are used for the analysis.
The main finding of the research is that the ANN procedure for deriving a general operating policy gives better and more robust performance, indicating that ANNs have great potential for deriving an optimal operating policy for the reservoir.
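The feed-forward/back-propagation structure mentioned in the abstract can be sketched as a tiny network in pure Python. This is a generic illustration on hypothetical normalized (inflow, storage) → release pairs, not the paper's trained model:

```python
import math
import random

random.seed(0)  # reproducible toy run

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One hidden layer of 2 sigmoid units and one sigmoid output (no biases,
# for brevity); trained by plain stochastic gradient descent.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden -> output

def forward(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2))) for j in range(2)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(2)))
    return h, y

# Hypothetical normalized (inflow, storage) -> release training pairs.
data = [([0.1, 0.2], 0.1), ([0.9, 0.8], 0.9)]

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

lr = 0.5
before = loss()
for _ in range(300):
    for x, t in data:
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)                    # output delta
        for j in range(2):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])      # hidden delta (pre-update w2)
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]

print(loss() < before)  # training error has decreased
```

A real reservoir model would use more inputs (10-daily inflows, demands), more hidden units, and biases, but the gradient update is the same.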
A Review of Load Balancing Algorithms for Cloud Computing
Dr.G.N.K.Sureshbabu, Dr.S.K.Srivatsa
Cloud computing is an emerging computing paradigm that aims to share data, computation, and services transparently over a scalable network of nodes. Since cloud computing stores data and distributed resources in an open environment, the amount of stored data increases quickly. In cloud storage, load balancing is a key issue: maintaining load information is costly, since the system is too huge to disperse load in a timely manner. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overwhelmed, and it helps in the optimal utilization of resources, thereby enhancing system performance. A few existing scheduling algorithms can maintain load balancing and provide better strategies through efficient job scheduling and resource allocation techniques. In order to gain maximum profit with optimized load balancing algorithms, it is necessary to utilize resources efficiently. The main aim of this paper is to discuss some of the existing load balancing algorithms in the cloud computing environment.
Impact of Mobile Applications on Health Care Information System
R. Anand, Dr. S. K. Srivatsa
This article explores how mobile devices are used to improve the health care system. M-health, mobile applications for healthcare, is a young and dynamic field that could improve the well-being of people around the world. Mobile applications can lower costs and improve the quality of healthcare, as well as shift behavior to strengthen prevention, all of which can improve health outcomes over the long term. Mobile technologies are growing rapidly in developing countries like India, and there have been several new research efforts and developments in this space. Nowadays the mobile phone is becoming an important ICT tool not only in urban regions but also in remote and rural areas. The rapid advancement of the technology, ease of use, and falling device costs make the mobile phone an appropriate and adaptable tool to bridge the digital divide. Mobile phone ownership in India is growing rapidly: six million new mobile subscriptions are added each month, and nearly everyone will own a phone by the end of 2014. By the end of 2015, three quarters of India's population will be covered by a mobile network. Many of these new "mobile citizens" live in poorer and more rural areas with scarce infrastructure and facilities, high illiteracy levels, and low PC and internet penetration. The availability of low-cost mobile phones and the already broad coverage of GSM networks in India is a huge opportunity to provide services that would trigger development and improve people’s lives. The objective of this paper is to report the status of mobile-device-based health care management systems in the world, particularly in India.
Power dissipation is a major area of concern in today’s CMOS technology. In this paper we present a six-transistor (6T) Static Random Access Memory cell for low power applications. The proposed design has a strong read static noise margin (SNM) and strong write ability. The impact of process variation on the different failure mechanisms in the SRAM cell is analyzed. A 32-bit SRAM with the proposed and standard 6T bit cells is simulated and evaluated for read SNM, write ability and power. The proposed 6T SRAM architecture is intended for the advanced microprocessor cache market using 0.18 µm technology. The goal is reduced power dissipation while maintaining competitive performance.
Detection of TCP/IP Covert Channel based on Naïve-Bayesian Classifier
Vibhor Kumar Vishnoi, Sunil Kumar
A covert channel is any method of communication that is used to illicitly transfer data, thereby breaking the security policy of a system. A network covert channel is a covert communication that hides covert messages in overt network packets; any shared resource can potentially be used as a covert channel. In recent years, with the development of various hiding methods, the network covert channel has become a new kind of threat to network security. A covert channel is an unintended design within legitimate communication whose aim is to leak information as part of rudimentary protocols. In fact, most detection systems can detect hidden data in the payload, but struggle to cope with data hidden in the IP and TCP packet headers. The vast number of protocols on the internet seems ideal as a high-bandwidth vehicle for covert communication. Due to the unwanted and malicious nature of covert channel applications, and because they pose a serious security threat to networks, it is recommended to detect covert channels efficiently. This paper reviews TCP/IP covert channel designs and their detection schemes, and presents a proposed method based on a Naïve-Bayesian classifier to detect covert channels in the TCP ISN and IP ID fields of TCP/IP packets.
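A Gaussian Naïve-Bayes classifier of the kind proposed can be sketched in a few lines: fit per-class means and variances of header-derived features, then pick the class with the highest log-posterior. The feature names and training values below are hypothetical illustrations (e.g. statistics computed over ISN increments and IP ID steps), not the paper's dataset:

```python
import math
from collections import defaultdict

def fit(samples):
    """samples: list of (feature_tuple, label). Returns per-class prior/mean/variance."""
    by_class = defaultdict(list)
    for x, y in samples:
        by_class[y].append(x)
    model = {}
    for y, xs in by_class.items():
        n = len(xs)
        means = [sum(col) / n for col in zip(*xs)]
        vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-6   # smoothed variance
                 for col, m in zip(zip(*xs), means)]
        model[y] = (math.log(n / len(samples)), means, vars_)
    return model

def predict(model, x):
    def log_post(prior, means, vars_):
        return prior + sum(-0.5 * math.log(2 * math.pi * s) - (v - m) ** 2 / (2 * s)
                           for v, m, s in zip(x, means, vars_))
    return max(model, key=lambda y: log_post(*model[y]))

# Hypothetical features: (ISN-increment entropy, IP ID step variance).
train = [((0.2, 1.0), "normal"), ((0.3, 1.2), "normal"),
         ((0.9, 8.0), "covert"), ((0.8, 7.5), "covert")]
model = fit(train)
print(predict(model, (0.85, 7.8)))  # covert
```

The naïve assumption is feature independence within a class, which keeps training to a single pass over the labelled traffic.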
Classification of Human Action in a Controlled Environment
Asmeet Bhosale, Geetanjali Kale
It is essential to monitor suspicious activities in certain places. In this paper we propose to develop an intelligent system for visual surveillance that will notify the operator or raise an alarm if a person is found performing a suspicious activity. The system makes use of computer vision concepts such as human motion analysis for posture recognition, which underlies the detection of suspicious activity.
Distributed file systems are client-based applications in which a central server stores files that can be accessed by clients with the proper authorization rights. Similar to an operating system, a distributed file system manages the overall system with naming conventions and mapping schemes. The Google File System (GFS) is a proprietary system developed by Google for its own use, deploying commodity hardware to keep up with the enormous generation of data. The Hadoop Distributed File System (HDFS), an open source community project developed largely by Yahoo!, was designed to store large data sets reliably while providing high bandwidth for streaming data to client applications. For processing the data stored in HDFS, Hadoop provides users with a programming model called MapReduce. This model allows users to reliably split a large problem into smaller sub-problems distributed onto several nodes across a number of clusters. In this paper, we describe GFS and HDFS, compare and contrast the two systems based on a list of attributes, and outline the basic functionality of the MapReduce framework.
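The MapReduce model can be illustrated with the canonical word-count example, sketched here in plain Python rather than the actual Hadoop API; in the real framework the map and reduce phases run in parallel across cluster nodes, with the shuffle performed by the runtime.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one input split."""
    return [(word, 1) for word in document.lower().split()]

def shuffle(pairs):
    """Shuffle: group intermediate values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts gathered for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Three input splits standing in for HDFS blocks on different nodes.
splits = ["the quick brown fox", "the lazy dog", "the fox"]
mapped = chain.from_iterable(map_phase(s) for s in splits)
counts = reduce_phase(shuffle(mapped))
```

The programmer supplies only the map and reduce functions; distribution, fault tolerance and regrouping are the framework's job.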
Proficient Secret Transmission of Images and Messages
S.Uma, V.Valarmathi
The concept of data hiding mainly focuses on embedding one message within another. Many refinements have been made to Least Significant Bit (LSB) embedding, but it remains a traditional approach. Challenges still remain in storing or embedding text data in images without image distortion. The proposed technique provides a solution for data embedding along with image security, by encrypting the image with a key generated by block permutation. Encrypting the given image with this key keeps it secure in transmission. In this paper, a novel scheme for separable reversible data hiding in encrypted images is proposed, which consists of image encryption, data embedding and data-extraction/image-recovery phases. The input image is obtained from the user and converted to grayscale. A key is generated to perform image encryption. Image encryption involves a sequence of steps, viz. key permutation, pixel shuffle, block shuffle, and height and width shuffle. The final image obtained is used for data hiding. The text to be hidden is encrypted, and the position to hide the data is chosen based on the main key. When the receiver has both keys, the hidden data and the original image can be recovered.
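For orientation, the traditional LSB embedding that the paper builds on can be sketched in a few lines; the flat pixel list and message below are illustrative, and the paper's separable encrypted-domain scheme is considerably more involved.

```python
def embed_lsb(pixels, message):
    """Hide each bit of the message in the least significant bit of a pixel."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for the message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract_lsb(pixels, length):
    """Recover `length` bytes from the LSBs of the first pixels."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit in pixels[8 * i: 8 * i + 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)

cover = list(range(256)) * 4          # stand-in for grayscale pixel values
stego = embed_lsb(cover, b"secret")
```

Each pixel value changes by at most 1, which is what makes LSB embedding visually imperceptible yet trivially detectable by statistical tests.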
Defense Mechanism for Denial of Service Attack to UMTS Networks Using Sim-Less Devices
V. Palaniyappan, M. Duraipandian, K. Malarvizhi
One of the basic security elements in cellular networks is the verification procedure performed by means of the subscriber identity module, which is necessary to grant access to network services and hence protect the network from unauthorized usage by implementing different types of parameters. Given the large amount of computing power available in modern clustered HLRs, it is also essential to consider the counter-intuitive result showing that the busier the HLR is, the more difficult it is to disrupt its services. Two developments expose the cellular infrastructure as a whole, and thus shape the measures needed for its defense, namely: 1) the complexity and the high level of programmability of the latest mobile phones, and 2) the interconnection between the cellular network and the Internet. Awareness of this attack can be exploited by many applications, both in the security and the network equipment manufacturing sectors.
UW-Hybrid Architecture of Underwater Acoustic Sensor Networks
Tanu Singh, Manu Singh
The ocean is one of the major research concerns these days, yet many marine applications seem comparatively slow in exploiting information and communication technology. In this paper, a new UW-Hybrid architecture for 2D and 3D communication is proposed for underwater acoustic sensor networks. The objective of this paper is to highlight the drawbacks in the existing architecture of underwater networks. This paper also focuses on the research issues of the different layers in the protocol stack of underwater networks. A cross-layer protocol stack, one solution to the problem given above, is briefly explained here to compensate for possible limitations in the layer interface.
Combination of Markov and MultiDamping Techniques for Web Page Ranking
Sandeep Nagpure
Web page re-ranking has been widely used to reduce the access latency problem of the Internet. However, if most prefetched web pages are not visited by the users in their subsequent accesses, the limited network bandwidth and server resources will not be used efficiently and may worsen the access delay problem. Therefore, it is critical to have an accurate re-ranking method during prefetching. Techniques such as Markov models have been widely used to represent and analyze users' navigational behavior. A further technique is multidamping, which utilizes the stochastic matrix for ranking pages. This paper gives a new page-ranking algorithm that combines features of the Markov and multidamping methods.
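To make the damping idea concrete, here is a minimal sketch of damped power iteration over a link graph's stochastic matrix (plain PageRank, not the paper's combined Markov/multidamping algorithm); the four-page graph is invented for illustration.

```python
def pagerank(links, damping=0.85, iters=100):
    """Power iteration on the link graph's stochastic matrix.

    `links` maps each page to the list of pages it links to. Each step mixes
    the link-following distribution with a uniform teleport, weighted by the
    damping factor.
    """
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p in pages:
            out = links[p] or pages          # dangling node: spread evenly
            share = damping * rank[p] / len(out)
            for q in out:
                new[q] += share
        rank = new
    return rank

# Hypothetical 4-page web graph: C is linked to by A, B and D.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
```

The damping factor controls the teleport/link trade-off; multidamping methods generalize by varying this parameter rather than fixing it at a single value.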
Performance analysis of Neural Network Algorithms on Stock Market Forecasting
Manasi Shah, Nandana Prabhu, Jyothi Rao
Artificial Neural Networks (ANNs) have been used extensively in stock prediction, as they provide better results than other techniques. In this paper, different ANN architectures, namely a simple feed-forward back-propagation neural network (FFBPNN), an Elman recurrent network, and a radial basis function network (RBFN), are implemented and tested to predict the stock price. The Levenberg-Marquardt back-propagation algorithm is used to train both the FFBPNN and the Elman recurrent network. These techniques were tested with published stock market data of Bombay Stock Exchange of India Ltd., and from the results it is observed that the FFBPNN gives better results than the Elman recurrent network.
License plate character recognition is an integral part of Intelligent Transportation Systems. Traditional methods used for number plate recognition were Optical Character Recognition (OCR) and formula-based recognition. As a neural network is an intelligent engine, it ensures a greater accuracy rate along with better recognition speed. In this system, recognition is done with the help of a neural network. The license plate is automatically located using principal visual word (PVW) discovery and local feature matching. Given a new image, the license plate is extracted by matching local features with the PVW. Then image preprocessing operations are performed to improve the quality of the plate images. The license plate image is then passed to character segmentation, and the segmented characters are recognized using the neural network.
New Level of Security in ATM System Using Brain Fingerprinting
N. Geethanjali
Security plays a vital role in all fields and in various applications. The existing computer security systems used in various places such as banking, passports, credit cards, smart cards, PINs (Personal Identification Numbers), access control and network security use usernames and passwords for person identification. In a distributed system such as the ATM (Automated Teller Machine), security is a main issue. The security level has grown from PINs to smart cards to biometrics. Even though security has increased, fraudulent activities have grown to an equal level. In the existing approaches, different biometric technologies, multibiometrics, multimodal biometrics and two-tier security have been introduced in ATMs to provide a higher level of security. The proposed work enhances security using brain fingerprinting technology, which acts as an uncrackable password. A brain-computer interface has been developed to record the brain signal through digital electroencephalography. Brain fingerprinting is a technique used to find the unique brain-wave pattern generated by the brain when a person encounters a familiar stimulus.
Study and Analysis of Edge Detection Techniques for Segmentation using Dental Radiograph
N.P.Ansingkar, M.G.Dhopeshwarkar
Dental biometrics is mainly used in forensic science. Segmentation plays a vital role in the interpretation of dental radiographs, which is useful in the identification of an individual, bite mark analysis and mass disasters. In this paper, we give an overview of the different discontinuities and edge detection techniques, and present a comparative study of the edge detectors. In this study, the first step is pre-processing of an image; in the second step, we enhance the image using a low-pass filter; and in the last step we apply different edge detectors to the image.
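As a concrete example of one classical edge detector covered by such comparisons, a plain-Python Sobel operator can be sketched as follows; the synthetic two-tone "radiograph" is illustrative only.

```python
def sobel_magnitude(img):
    """Apply the Sobel operator to a 2-D grayscale image (list of lists).

    Returns the gradient magnitude; border pixels are left at zero.
    """
    h, w = len(img), len(img[0])
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in range(3):
                for dx in range(3):
                    p = img[y + dy - 1][x + dx - 1]
                    gx += gx_k[dy][dx] * p
                    gy += gy_k[dy][dx] * p
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# Synthetic image: dark left half, bright right half -> one vertical edge.
image = [[0] * 4 + [255] * 4 for _ in range(8)]
edges = sobel_magnitude(image)
```

The magnitude peaks along the intensity discontinuity and is zero in flat regions, which is the property all the compared detectors exploit in different ways.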
Diagnosis and Correction of tear in films with Optimization of Motion Estimation Technique
Jilfy James, Jobi Jose
This paper presents a method for the automatic detection and correction of films damaged by tear. The method includes estimating motion vectors between two frames in an image sequence, delineating the tear boundary, and restoring the damaged frame using suitable missing-data treatment. Thus the loss of data along the tear boundary is also corrected. The paper also compares the performance of different motion estimation methods, namely exhaustive search, logarithmic search, diamond search and adaptive rood pattern search. Among these, adaptive rood pattern search is found to be the best in performance based on the study of quality measurement and the execution time required.
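The baseline among the compared methods, exhaustive (full-search) block matching, can be sketched as follows; the tiny frames and the SAD cost function below are illustrative assumptions, not the paper's implementation.

```python
def sad(frame, x, y, block, bs):
    """Sum of absolute differences between a frame patch and a block."""
    return sum(abs(frame[y + j][x + i] - block[j][i])
               for j in range(bs) for i in range(bs))

def full_search(ref, block, bs, cx, cy, search=2):
    """Exhaustive block matching: compare the current block (located at
    (cx, cy)) against every shifted position in the reference frame within
    +/- `search` pixels, and return the motion vector with the lowest SAD."""
    h, w = len(ref), len(ref[0])
    best_mv, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = cx + dx, cy + dy
            if 0 <= x and x + bs <= w and 0 <= y and y + bs <= h:
                cost = sad(ref, x, y, block, bs)
                if cost < best_cost:
                    best_mv, best_cost = (dx, dy), cost
    return best_mv

# Reference frame: a bright 2x2 patch at (3, 2); in the current frame the
# same patch sits at (4, 3), so the backward motion vector is (-1, -1).
ref = [[0] * 8 for _ in range(8)]
for j in range(2):
    for i in range(2):
        ref[2 + j][3 + i] = 200
block = [[200, 200], [200, 200]]
mv = full_search(ref, block, 2, 4, 3)
```

Logarithmic, diamond and adaptive rood pattern search all approximate this exhaustive scan with far fewer SAD evaluations, which is the speed/quality trade-off the paper measures.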
Survey of Visual Cryptography Schemes without Pixel Expansion
Asst. Prof. Jyoti Rao, Dr. Vikram Patil, Ms. Smita Patil
The basic idea of Visual Cryptography is to encrypt a secret image into n meaningless share images. The Visual Cryptography technique cannot leak the encrypted information of the shared secret through any unqualified subset of the n share images. These share images are printed on separate transparencies and distributed as shares such that, when the share images are superimposed, the concealed secret image is revealed. Thus, the human visual system can recognize the shared secret image without using any computational devices; there is no need for cryptographic knowledge or complex computation. The traditional visual secret sharing scheme uses a pre-defined pattern book to generate shares, which leads to a pixel expansion problem in the share images. Basically, the performance of a visual cryptography scheme depends on different measures such as pixel expansion, security, contrast, computational complexity, accuracy, shares generated, number of secret images, and the type of secret images encrypted by the scheme. The objective of this paper is the study and performance analysis of visual cryptography schemes without pixel expansion, with respect to the number of secret images, the type of the image, and the type of shares generated (meaningless or meaningful).
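One well-known construction without pixel expansion is the Kafri-Keren random-grid (2, 2) scheme, sketched here as an illustration of the class the paper surveys (the tiny secret image is invented for the example):

```python
import random

def random_grid_shares(secret, rng=random.Random(42)):
    """Kafri-Keren random-grid (2, 2) scheme with no pixel expansion.

    `secret` is a 2-D list of bits (1 = black, 0 = white). Share 1 is pure
    noise; share 2 equals share 1 on white pixels and its complement on
    black pixels, so each share alone reveals nothing.
    """
    s1 = [[rng.randint(0, 1) for _ in row] for row in secret]
    s2 = [[a if p == 0 else 1 - a for p, a in zip(prow, arow)]
          for prow, arow in zip(secret, s1)]
    return s1, s2

def stack(s1, s2):
    """Superimposing transparencies acts as a pixel-wise OR."""
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]

secret = [[1, 1, 0, 0], [1, 1, 0, 0]]   # tiny 2x4 secret image
s1, s2 = random_grid_shares(secret)
revealed = stack(s1, s2)
```

Every black secret pixel stacks to black, while white pixels stack to black only half the time, giving the contrast that lets the eye read the secret; the shares are exactly the size of the secret, so there is no pixel expansion.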
Improving Edge Based Color Constancy for Multiple Light Sources
Ancy Varghese, Darsana Vijay
Color constancy algorithms are used to make images independent of the illuminant. In most algorithms, illuminant independence is achieved under a uniform light source assumption. However, this condition may be violated in most real-world images, which causes performance failures in computer vision applications such as object tracking and object recognition. Local color correction for up to two light sources has been developed by various algorithms. In this paper, the derivative structure of images is used for local color correction for up to three light sources. A specular edge weighting scheme on the derivative structure of the image is applied to grid-sampled patches of the image to achieve local color correction. The estimates from the patches are combined and the result is projected onto the input image so that a per-pixel estimate is obtained. Color correction is achieved with a diagonal model using the pixel estimates. Experiments show that the specular edge weighting method reduces the error when compared with existing local correction methods that use derivative image structures.
This study proposes a step-down transformerless direct ac/dc converter suitable for universal line applications (90–270 Vrms). The topology consists of a buck-type power-factor correction (PFC) cell and a buck-boost dc/dc cell, and both work as step-down converters. A part of the input power is coupled to the output directly after the first power processing. With this direct power transfer feature and shared capacitor voltages, the converter is able to achieve efficient power conversion, high power factor, low voltage stress on the intermediate bus (less than 130 V) and low output voltage without a high step-down transformer. The absence of a transformer reduces the component count and cost of the converter. Both simulation and experimental results demonstrate the validity of the proposed converter.
Fuzzy Logic Enforced Traffic Management for High Speed Networks
T. Hari, P. Hemanth Kumar
With the increasing use of computing devices such as computers, tablets and smart phones, there is huge demand from fast-growing Internet traffic. A distributed traffic management framework has been proposed in which routers are deployed with intelligent data rate controllers to tackle high traffic levels. The traffic control protocol is unique in that other traffic control protocols have to estimate network parameters, such as link latency, bottleneck bandwidth, packet loss rate, or the number of flows, in order to compute the allowed source sending rate. The fuzzy-logic-based controller can measure the queue size directly; it avoids the potential performance issues arising from parameter estimation, and greatly reduces the consumption of computation and memory resources in routers. As a network parameter, the queue size can be monitored accurately and used to make a proactive decision on whether action should be taken to regulate the source sending rate, thus increasing the resilience of the network to traffic congestion. Through the fuzzy logic technique, QoS (Quality of Service) in communication is assured by good performance of our scheme in terms of max-min fairness, low queuing delay and good robustness to network dynamics. The results and comparisons verify the effectiveness of our traffic management scheme and set a new benchmark, showing that the fuzzy-logic scheme achieves better performance than existing protocols that depend entirely on the estimation of network parameters.
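The flavor of a queue-driven fuzzy rate controller can be conveyed with a toy two-rule Sugeno-style sketch; the membership functions, rule base and step size below are invented for illustration and are not the paper's controller.

```python
def mu_low(q, capacity):
    """Membership of the queue size in the 'low' fuzzy set (1 empty, 0 full)."""
    return max(0.0, 1.0 - q / capacity)

def mu_high(q, capacity):
    """Membership in the 'high' fuzzy set (0 empty, 1 full)."""
    return min(1.0, q / capacity)

def fuzzy_rate(q, capacity, rate, step=10.0):
    """Two-rule controller operating directly on the measured queue size:
       IF queue LOW  THEN increase the allowed sending rate by `step`
       IF queue HIGH THEN decrease it by `step`
    Defuzzified as the membership-weighted average of the two actions."""
    low, high = mu_low(q, capacity), mu_high(q, capacity)
    delta = (low * step + high * -step) / (low + high)
    return max(0.0, rate + delta)
```

Note that the controller needs only the queue occupancy, with no estimates of latency, bandwidth or flow count, which is the point the abstract makes.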
Comparative Analysis of Secure and Energy Efficient Routing Protocols in Wireless Sensor Networks
Amanpreet Kaur, Dr. Sandeep Singh Kang
A hierarchical routing architecture divides the whole network into a group of clusters, and only the cluster head is responsible for forwarding data directly to the base station. In a hierarchical routing architecture, the cluster head aggregates the data from the other nodes and sends the aggregated data to the base station. During the creation of the network topology, the process of setting up routes in WSNs is usually influenced by energy considerations, because the power consumption of a wireless link is proportional to the square, or an even higher order, of the distance between the sender and the receiver. In a hierarchical routing architecture, sensor nodes self-configure themselves for the formation of cluster heads. In this paper, we survey energy-efficient and secure routing protocols in wireless sensor networks, and a few of them are compared and evaluated.
Analysing Performance of Companding Technique for PAPR Reduction in OFDM System
Rikhee Ram, Manoj Gabhel
The major drawback of OFDM transmission is the large envelope fluctuation, quantified as the Peak-to-Average Power Ratio (PAPR). Since a power amplifier is used at the transmitter, the operating power must lie below the available power in order to operate in a perfectly linear region. Many algorithms have been developed to reduce this PAPR. A uniformly distributed nonlinear companding scheme efficiently reduces PAPR with a low Bit Error Rate (BER). However, the uniformly distributed companding scheme cannot be varied to satisfy the different performance requirements of different systems. Therefore, this work proposes a novel scheme that transforms the OFDM signals into a trapezium distribution. The uniformly distributed companding scheme is a special case of the proposed scheme. The general formulas of the proposed scheme are derived, and the trade-off between PAPR reduction and BER performance is tuned by setting the value of a parameter. Simulation results show the PAPR reduction and the BER over AWGN and multipath channels, indicating that the proposed scheme provides a favourable trade-off between PAPR reduction and BER.
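To illustrate the companding idea, the sketch below builds a toy multicarrier signal, measures its PAPR, and applies a classic µ-law compander as a stand-in for the paper's trapezium-distribution scheme (which is not reproduced here):

```python
import math

def papr_db(signal):
    """Peak-to-average power ratio in dB."""
    powers = [x * x for x in signal]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def mu_law(signal, mu=255.0):
    """Classic mu-law compander: boosts small amplitudes relative to peaks,
    which raises the average power while leaving the peak unchanged."""
    peak = max(abs(x) for x in signal)
    return [math.copysign(peak * math.log(1 + mu * abs(x) / peak)
                          / math.log(1 + mu), x) for x in signal]

# Toy multicarrier signal: N in-phase subcarriers add up coherently at t = 0
# and produce a large envelope peak -- the OFDM PAPR problem in miniature.
N, L = 16, 256
signal = [sum(math.cos(2 * math.pi * k * t / L) for k in range(1, N + 1)) / N
          for t in range(L)]
before, after = papr_db(signal), papr_db(mu_law(signal))
```

Because the compander is concave, the average power rises while the peak does not, so the PAPR in dB drops; the BER penalty of the nonlinearity is the trade-off the paper tunes.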
New Refined Model for Mechatronics Design of Solar Differential Drive Mobile Robotic Platforms
Farhan A. Salem
This paper proposes a new generalized and refined model for solar electric differential-drive mobile robotic platforms (SEDDMRP) and some considerations regarding design, modeling and control solutions. The proposed system design consists of seven main subsystems; each subsystem is mathematically described and a corresponding Simulink sub-model is developed, then an integrated generalized overall model of all subsystems is developed. The whole SEDDMRP system model is developed for research purposes and for application in the educational process, to help face the main challenges in developing mechatronic SEV systems: identifying system-level problems early and ensuring that all design requirements are met. It is also developed to give the designer the maximum output data to select, evaluate and control the overall SEDDMRP system and each subsystem's output characteristics and response, for desired overall and/or subsystem-specific outputs, under various PV subsystem input operating conditions, to meet particular SEDDMRP system requirements and performance. The obtained results show the simplicity, accuracy and applicability of the presented models in the mechatronic design of the SEDDMRP system.
Modeling and Control Solutions for Mechatronics Design of Solar Battery Electric Vehicles
Farhan A. Salem , Ali S. Alosaimy
In recent years, increasing attention and importance have been dedicated to research in the fields of alternative vehicles, fuel consumption economy, and the reduction of related emissions. As an option to conventional vehicles, increasing attention is being paid to the integration of Electric Vehicles (EV) and PhotoVoltaic (PV) panels, resulting in the Hybrid Solar Vehicle (HSV) and the pure Solar Electric Vehicle (SEV). This paper extends the writers' previous works and proposes a new generalized and refined model for the mechatronic design of pure Solar Electric Vehicles (SEV), with some considerations regarding design, modeling and control solutions. The proposed SEV system model consists of eight main subsystems; each subsystem is mathematically described and a corresponding Simulink sub-model is developed, then an integrated generalized model of all subsystems is developed and tested. The proposed SEV system model is developed to help face the challenges in developing mechatronic SEV systems, in particular identifying system-level problems early and ensuring that all design requirements are met, as well as for research purposes and application in the educational process. The model is developed to give the designer the maximum output data to design, test and evaluate the overall SEV system and/or each subsystem's output characteristics and response, for desired overall and/or subsystem-specific outputs, under various PV subsystem input operating conditions, to meet particular SEV system requirements and performance. The obtained results show the simplicity, accuracy and applicability of the presented models in helping with the mechatronic design of the SEV system.
Diabetes Prediction in Women Based On Soft Computing Techniques
S.Karthikeyeni, S.Saranya
Diabetes Mellitus, commonly known as the high blood sugar problem, is a metabolic disorder characterized by chronic hyperglycemia and is growing rapidly. It is a polygenic disease characterized by abnormally high glucose in the blood. Statistics say that 90 to 95% of the world's diabetics have Type 2 diabetes. The consequences of diabetes may be macrovascular or microvascular complications. This paper proposes a new model for the prediction of complications developing due to Diabetes Mellitus. The system compares the performance of various soft computing techniques, such as Artificial Neural Networks, Support Vector Machines, Particle Swarm Optimization and Genetic Algorithms, for the prediction of diabetes disease.
A Study of Existing Cross Site Scripting Detection and Prevention Techniques in Web Applications
Neha Gupta
Web applications provide a wide range of services to their users in an efficient manner. Web-based attacks are increasing, with the intent to harm users or the reputation of a particular organization. Most of these attacks occur through the exploitation of security vulnerabilities found in web applications. These vulnerabilities exist because developers focus more on the development of the application than on its security, due to time and budget constraints. Cross Site Scripting (XSS) is one of the major security vulnerabilities found in web applications. In 2013, XSS was ranked third in the top 10 list of attacks by OWASP (the Open Web Application Security Project). XSS flaws occur whenever an application takes insecure data and sends it to the browser without proper validation or escaping. This can result in hijacked user sessions, defaced websites, and users redirected to malicious sites. In this paper, we study different existing techniques that can be used for the detection and prevention of XSS attacks.
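The basic prevention measure the abstract alludes to, escaping untrusted data before it reaches the browser, can be shown with Python's standard-library `html.escape`; the surrounding comment template is a made-up example.

```python
import html

def render_comment(user_input):
    """Escape untrusted data before inserting it into an HTML context.

    html.escape converts &, <, >, " and ' into character entities, so an
    injected script tag is displayed as text instead of being executed.
    """
    return '<div class="comment">{}</div>'.format(
        html.escape(user_input, quote=True))

payload = "<script>alert('xss')</script>"
safe = render_comment(payload)
```

Escaping must match the output context (HTML body, attribute, JavaScript, URL); this example covers only the HTML-body case, which is the most common source of reflected XSS.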
IPV6 SLAAC related security issues and removal of those security issues
Priya Tayal
Internet Protocol version 4 (IPv4) is depleting its address space day by day, and hence the deployment of IPv6 in networks is essential. IPv6 is a new-generation protocol that is expected to solve the issues that arise in IPv4, but it also poses some security issues. The emphasis in this paper is on identifying the vulnerabilities that come with IPv6 and on how to remove them. The reconnaissance attack is employed by attackers to fraudulently enter an IPv6 network. For this reason, a unique and secured IPv6 address will be generated so that a secured network can be set up and malicious nodes cannot enter the network. Different techniques are used to make the network secure, such as i-SeRP (IPv6 Security Risk Prototype), CGA (Cryptographically Generated Addresses), Privacy Extensions, and SSAS (Simple Secure Addressing Scheme). In this paper, a unique secured IPv6 address is generated to make the network secure, using a combination of a certification authority and a PKI structure for node verification, and Diffie-Hellman for the exchange of secrets.
Implementing Big Data Management on Grid Computing Environment
Lawal Muhammad Aminu
With the current advances of today's technology in many sectors such as manufacturing, business, science and web applications, the variety of data to be processed continues to witness an exponential rise. This data is referred to as big data. Efficient management and processing of this data poses an interesting but significant problem. To utilize the numerous benefits of grid computing, big data processing and management techniques should be integrated into the current grid environment. In this paper, the definition, features and requirements of a big data platform are explored. Incorporating Hadoop is suggested, as it is the most commonly used technique for handling big data and offers reliability, ease of use, ease of maintenance and scalability.
Performance Evaluation of Secure Key Distribution Based on Quantum Mechanics Principles Over Free Space
Lawal Muhammad Aminu
Quantum key distribution (QKD) provides a perfectly secure coding method that solves the problem of key distribution; it is currently the most mature application in the field of quantum computing. Performance analysis is very important in determining the effectiveness of various QKD protocols. However, the lack of effective simulation tools for evaluating QKD protocols over free space forces researchers to rely on analytical (theoretical) or experimental (real-equipment) evaluation; the former is inaccurate while the latter is expensive. Optisystem 7.0, a commercial photonic simulator widely used in telecommunication, was used in modeling and simulating the BB84, B92 and Six-State QKD protocols. The simulation model emphasizes the experimental components of quantum key distribution. Results obtained based on the sifted key rate and failure rate show that the Six-State protocol has a low sifted key rate and a high failure rate, which is identical to results from experiments. The lack of detector implementation and the single-photon assumption reduce the accuracy of the results. The simulation can help researchers test their models before performing experiments.
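The sifting behaviour measured above can be illustrated with a toy BB84 simulation in pure Python, with no channel noise or eavesdropper (unlike a full photonic model such as the Optisystem one used in the paper):

```python
import random

def bb84_sift(n, rng=random.Random(7)):
    """Simulate BB84 sifting over an ideal channel.

    Alice sends each bit in a randomly chosen basis; Bob measures in a
    random basis. When the bases match (probability 1/2) Bob's bit equals
    Alice's; otherwise the outcome is random and the position is discarded
    during sifting. Returns the list of kept (alice_bit, bob_bit) pairs.
    """
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bits = [alice_bits[i] if alice_bases[i] == bob_bases[i]
                else rng.randint(0, 1) for i in range(n)]
    return [(alice_bits[i], bob_bits[i]) for i in range(n)
            if alice_bases[i] == bob_bases[i]]

sifted = bb84_sift(1000)
sifted_rate = len(sifted) / 1000          # expected around 0.5 for BB84
errors = sum(a != b for a, b in sifted)   # 0 without noise or an eavesdropper
```

BB84's two bases give a sifted key rate near 1/2; the Six-State protocol uses three bases, so only about 1/3 of positions survive sifting, which matches the lower sifted key rate the paper reports for it.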