STUDY ON ENSURING DATA STORAGE SECURITY SYSTEM IN CLOUD COMPUTING SERVICE
K. Kavitha
Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to centralized large data centers, where the management of the data and services may not be fully trustworthy. This unique paradigm brings about many new security challenges which have not been well understood. This paper addresses the problem of ensuring the integrity of data storage in Cloud Computing. In particular, it considers the task of allowing a third-party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of the TPA relieves the client of the burden of auditing whether the data stored in the cloud are indeed intact, which can be important in achieving economies of scale for Cloud Computing. While prior works on ensuring remote data integrity often lack support for either public auditability or dynamic data operations, this work first identifies the difficulties and potential security problems of direct extensions of prior works to fully dynamic data updates.
Artifact Reduction For Visually Challenged
Dr. B. Jai Shankar, Mr. K. Siddharth Raju, Mr. M. Santhanaraj, Mr. S. Balamurali
Compression of images and video has become an essential feature for fitting a large amount of binary data into the narrow bandwidth of available digital communication channels. Established compression algorithms formulated by means of the Discrete Cosine Transform and the Wavelet Transform suffer from the presence of undesirable frequency components, which in turn pull down the visual quality of the image or video. These frequency components create prominent visible changes, identified as artifacts. The visual quality of such images needs to be improved by eliminating these artifacts, not only for consumer electronics but also for the analysis and decision-making algorithms in real-time systems. Artifacts can be modeled from the characteristics of neighboring pixels; the most noticeable are blocking, ringing and mosquito artifacts. The proposed adaptive filter, applicable to JPEG compression algorithms, detects blocking artifacts with a new metric named the Total Blocking Error. Subsequently, adaptive control of the corrective parameter of the localized filter and of the correlation factor, used to create a new adaptive quantization matrix for correcting the rounded-off Discrete Cosine Transform coefficients at the transmitter side, reduces the blocking artifacts. The boundary pixels of each n*n block yield the measure of blocking annoyance under this new metric. Experimentation with SAR, medical and conventional images showed that the value of the metric increases with the compression ratio and is reduced by the proposed adaptive algorithm. The results show that the suggested adaptive scheme is effective in improving visual quality both quantitatively and qualitatively.
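The abstract does not define the Total Blocking Error formula; the following is a minimal sketch of a boundary-based blockiness measure in its spirit, assuming a grayscale image and the usual 8x8 JPEG block grid:

```python
import numpy as np

def blocking_error(img: np.ndarray, n: int = 8) -> float:
    """Mean absolute luminance jump across the n*n block boundaries.

    A rough stand-in for the paper's Total Blocking Error metric,
    whose exact definition is not given in the abstract.
    """
    img = img.astype(np.float64)
    h, w = img.shape
    # Differences across vertical block boundaries (columns n, 2n, ...)
    v = np.abs(img[:, n:w:n] - img[:, n - 1:w - 1:n])
    # Differences across horizontal block boundaries (rows n, 2n, ...)
    hz = np.abs(img[n:h:n, :] - img[n - 1:h - 1:n, :])
    return (v.sum() + hz.sum()) / (v.size + hz.size)
```

Consistent with the observation in the abstract, a measure of this kind grows as quantization becomes coarser, since the discontinuities between neighboring blocks become larger.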
Design Patterns’ Model for Application Development in Object Oriented Languages
Gaurav Pradip Pande, Akshay Janardan Mendki
Design patterns are one of the most efficient aids to application development, yet many developers are still reluctant to use them, and development is often carried out without enough attention paid to the reusability and extensibility of the code. Patterns should not be restricted to the building of integrated development environments and language libraries.
The proposed work presents a model which uses a combination of multiple design patterns for application development in object-oriented programming. The model uses hierarchical abstraction of objects created using the factory pattern. The façade pattern is also used, which makes functional encapsulation possible. The proposed work also presents the scenarios in which the use of patterns is likely to improve code efficiency and reusability. Along with the factory pattern, the singleton pattern may also be used in the proposed model for their respective benefits, as sketched below.
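As a toy illustration of how the three patterns named above can be combined (the class names are invented for this sketch and are not taken from the paper):

```python
from abc import ABC, abstractmethod

class Exporter(ABC):                      # common product interface
    @abstractmethod
    def export(self, data: str) -> str: ...

class PdfExporter(Exporter):
    def export(self, data: str) -> str:
        return f"PDF<{data}>"

class CsvExporter(Exporter):
    def export(self, data: str) -> str:
        return f"CSV<{data}>"

class ExporterFactory:
    """Factory implemented as a singleton, since the model combines both."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def create(self, kind: str) -> Exporter:
        return {"pdf": PdfExporter, "csv": CsvExporter}[kind]()

class ReportFacade:
    """Facade: one entry point hiding object creation and the export steps."""
    def __init__(self):
        self._factory = ExporterFactory()

    def publish(self, data: str, kind: str) -> str:
        return self._factory.create(kind).export(data)

print(ReportFacade().publish("q1-sales", "pdf"))   # -> PDF<q1-sales>
```

Client code touches only the façade, so concrete product classes can change without rippling outward, while the singleton keeps a single shared factory.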
A Study on Chaos-Based System for Improved Encryption of Images
Kukoo Anna Mathew
In the modern world, encryption technology has developed rapidly, and chaos-based cryptographic algorithms are part of this development. Chaos-based cryptography offers a variety of features over traditional encryption algorithms, such as high security, speed, and modest computational overhead and power. Chaos theory studies the behavior of dynamical systems that are enormously sensitive to initial conditions, which means that a small change in the initial state can lead to a very different final state. This paper presents a study on a chaos-based system for improved encryption of images using RSA public key cryptography.
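The abstract does not spell out the cipher construction; the following is a minimal sketch of the chaotic ingredient alone, using the logistic map (a common choice in chaos-based image ciphers) as an XOR keystream, with the RSA layer omitted:

```python
import numpy as np

def logistic_keystream(n: int, x0: float = 0.3141, r: float = 3.99) -> np.ndarray:
    """Byte keystream from the logistic map x_{k+1} = r*x_k*(1-x_k)."""
    ks = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)          # chaotic iteration
        ks[i] = int(x * 256) % 256     # quantize the state to a byte
    return ks

def chaos_xor(img: np.ndarray, x0: float, r: float = 3.99) -> np.ndarray:
    """Encrypt/decrypt (XOR is its own inverse) a uint8 image."""
    ks = logistic_keystream(img.size, x0, r)
    return (img.ravel() ^ ks).reshape(img.shape)

img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
enc = chaos_xor(img, 0.3141)
assert np.array_equal(chaos_xor(enc, 0.3141), img)
```

The sensitivity property described above is what makes the map usable as a cipher: perturbing x0 by even 1e-9 produces an entirely different keystream after a few iterations, so decryption fails without the exact key.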
Development of an Efficient Hand Gesture Recognition System for Human Computer Interaction
Umadevi N, Divyasree I.R
Human computer interaction techniques have become a bottleneck in the effective utilization of the available information. The development of user interfaces influences changes in human computer interaction (HCI). Dynamic hand posture recognition is a difficult task due to complex backgrounds. The proposed method is implemented for hand postures captured in a natural environment; challenges such as changes in illumination and similar backgrounds are taken into consideration. The working system is based on two steps, namely hand detection and hand gesture recognition. In the hand detection process, a normalized color space skin locus is used to threshold the skin pixels from the varying background. Morphological filtering and Canny edge detection are used to remove object noise and obtain the contour edges of hand gestures. Normalized cross-correlation is used for pixel-wise comparison between the input image and the repository images, and the best match is taken as the region of interest.
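A minimal sketch of the matching stage described above (normalized cross-correlation against repository gestures); the function names are invented for illustration, and the detection stage is assumed to have produced an equally sized grayscale region of interest:

```python
import numpy as np

def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized grayscale
    patches; 1.0 means a perfect match."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def best_match(roi: np.ndarray, repository) -> int:
    """Pixel-wise comparison of the detected hand region against each
    repository gesture image; returns the index of the best match."""
    return int(np.argmax([ncc(roi, ref) for ref in repository]))
```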
Performance Measures for a three-unit compact circuit
Ashish Namdeo, V.K. Pathak, Anil Tiwari
This paper analyses a three-component system with a single repair facility. Denoting the failure times of the components as T1 and T2 and the repair time as R, the joint survival function of (T1, T2, R) is assumed to be that of the trivariate distribution of Marshall and Olkin. Here, R is an exponential variable with parameter α, and T1 and T2 are independent of each other. The Laplace transform is used to find the Mean Time Between Failures, Availability and Mean Down Time, and a table of reliability measures is given at the end.
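For orientation, a sketch of the standard forms involved. The bivariate Marshall-Olkin survival function is shown; the paper's trivariate version adds analogous shock terms, and its exact parameterization is not given in the abstract:

```latex
% Marshall-Olkin bivariate exponential survival function (standard form)
\bar F(t_1,t_2) = \Pr(T_1>t_1,\,T_2>t_2)
                = \exp\!\bigl[-\lambda_1 t_1 - \lambda_2 t_2
                              - \lambda_{12}\max(t_1,t_2)\bigr]

% Repair time: \Pr(R>r) = e^{-\alpha r}.
% MTBF follows from the Laplace transform R^*(s) of the reliability R(t):
R^*(s) = \int_0^\infty e^{-st}R(t)\,dt, \qquad
\mathrm{MTBF} = \int_0^\infty R(t)\,dt = R^*(0)
```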
A Survey on Different IP Traceback Techniques for Finding the Location of Spoofers
Amruta Kokate, Prof. Pramod Patil
The problem of detecting DDoS (Distributed Denial of Service) attacks is one of the major threats in the Internet security field. To find the spoofers, a number of IP traceback mechanisms have been proposed. Because the attack source is often hidden, the problem lies in distinguishing attack traffic from normal traffic. Different techniques are used to identify the origin of a DDoS attack with the help of IP traceback, the most famous technique for finding the attack source. Many kinds of traceback techniques exist, each with its own pros and cons. This paper surveys and evaluates some of the existing and recently evolving IP traceback techniques with respect to their advantages and disadvantages.
A Generic Computer Tool for the Solution of Fluid Flow Problems: Case Study of a Thermal Radiative Flow of an Optically Thick Gray Gas in the Presence of Indirect Natural Convection
Bachir Moussa Idi, Harouna Naroua, Rabé Badé
In this paper, we present a generic computer tool based on the Nakamura finite difference scheme for solving laminar fluid flow problems. The present study is restricted to the category of one-dimensional, two-dimensional and three-dimensional fluid flows expressed in one spatial coordinate. All problems are assumed to be time dependent. The equations describing the flow and other relevant parameters are defined in a generic file which is used as input to the system. A generic interpreter is used to generate postfix codes which it then interprets in the process of calculation. As an application, we consider a two-dimensional unsteady flow of an incompressible, electrically conducting viscous fluid along an infinite flat plate. The effects of the various parameters entering into the problem are discussed extensively and shown graphically.
Dynamic Rule Base Construction and Maintenance Scheme for Disease Prediction
Ms. M. Mythili, Mr. A.P.V.Raghavendra
Business and healthcare applications are tuned to automatically detect and react to events generated from local or remote sources. Event detection refers to the action taken in response to an activity. Association rule mining techniques are used to detect activities from data sets. Events are divided into two types: external events and internal events. External events are generated on remote machines and deliver data across distributed systems; internal events are delivered and derived by the system itself. The gap between the actual event and the event notification should be minimized, and event derivation should scale to a large number of complex rules. Attacks and their severity are identified from event derivation systems. Transactional databases and external data sources are used in the event detection process. The new event discovery process is designed to support uncertain data environments, with uncertain derivation of events performed on uncertain data values.
A classical PID controller tuning method for integrating processes is presented which provides the desired phase margin, acting as the major tuning parameter along with the crossover frequency. Analytical expressions are given for the controller settings based on the Mikhalevich method, which provides a way to select the desired phase margin. The method has been tested successfully and compared on various examples on the basis of integral performance indices (ISE and IAE) to show its effectiveness.
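The Mikhalevich expressions themselves are not reproduced in the abstract; for reference, these are the standard defining relations of the two tuning parameters it mentions, plus a worked special case (the proportional-only example is an illustration, not the paper's controller):

```latex
% Gain-crossover frequency \omega_c and phase margin \varphi_m for the
% open-loop transfer function L(s) = G_c(s)\,G_p(s):
|L(j\omega_c)| = 1, \qquad
\varphi_m = 180^\circ + \angle L(j\omega_c)

% Example: integrating process G_p(s) = K e^{-\theta s}/s under pure
% proportional control G_c(s) = K_c; solving the two conditions gives
K_c K = \omega_c, \qquad
\varphi_m = 90^\circ - \frac{180^\circ}{\pi}\,\omega_c\,\theta
```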
Proficient Pattern Selection for Supervised Tagging
A.P.V.Raghavendra, I.Vasudevan, M.SenthilKumar
Pattern (feature) selection involves pinpointing a subset of the most important features that are well suited as classification features. A pattern selection algorithm may be appraised from both the efficiency and the effectiveness points of view: efficiency concerns the time necessary to discover a subset of patterns, while effectiveness relates to the quality of the subset of features. Recent methodologies for classifying data are based on metric similarities. To reduce bias measures, this project replaces that process with a graph-based algorithm; more recent approaches like the Affinity Propagation (AP) algorithm can also take general non-metric similarities as input.
Banking Expert System with Credit Card Fraud Detection Using HMM Algorithm
Priya Ravindra Shimpi, Prof. Vijayalaxmi Kadroli
Due to rapid advancement in electronic commerce technology, most banking transactions are done online. As there are many service providers in banking, the user must analyze their performance and choose the best among them. Credit cards have also become the most popular mode of payment for both online and regular purchases.
In this paper we introduce the concept of three levels of security. The first level is the static user name and password. The second level uses a Hidden Markov Model (HMM) and shows how it can be used for the detection of frauds: an HMM is initially trained with the normal behavior of a cardholder, and if an incoming credit card transaction is not accepted by the trained HMM with sufficiently high probability, it is considered fraudulent. At the same time, we try to ensure that genuine transactions are not rejected. To reduce false positives, we send a dynamic password, delivered through web services to the user's mobile phone instantly, which the user must enter to obtain authorization from the bank. If, due to heavy load on the server side, the user does not receive the password on the mobile phone within the stipulated time, then after a short interval some personal questions (either security questions or images) are asked, which can be answered by the end user.
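A minimal sketch of the HMM scoring step described above. The transition matrix A, emission matrix B, initial distribution pi and the flagging threshold are toy values invented for illustration (the real model is trained on the cardholder's history); observations are transaction amounts quantized to 0 = low, 1 = medium, 2 = high:

```python
import numpy as np

A = np.array([[0.7, 0.2, 0.1],      # hidden-state transition probabilities
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])
B = np.array([[0.8, 0.15, 0.05],    # emission probabilities per state
              [0.2, 0.6, 0.2],
              [0.05, 0.15, 0.8]])
pi = np.array([0.6, 0.3, 0.1])      # initial state distribution

def log_likelihood(obs):
    """Scaled forward algorithm; returns log P(obs | model)."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

def is_fraud(history, new_txn, threshold=0.5):
    """Flag the transaction if appending it drops the per-symbol
    log-likelihood by more than `threshold` (an assumed cutoff)."""
    base = log_likelihood(history) / len(history)
    new = log_likelihood(history + [new_txn]) / (len(history) + 1)
    return (base - new) > threshold

print(is_fraud([0, 0, 1, 0, 0], 2))   # a sudden high-value purchase
```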
Strict Privacy With Enhanced Utility Preservation By N, T Closeness Through Microaggregation
Ms. S.Nathiya, Mr. U.Gowrisankar, Ms. P.Jayapriya
Microaggregation is a technique for disclosure limitation aimed at protecting the privacy of data subjects in microdata releases, such as medical data and census data. Such data have traditionally been subjected to generalization and suppression to generate k-anonymous data sets, where the identity of each subject is hidden within a group of k subjects. Unlike generalization, microaggregation perturbs the data, and this additional masking freedom allows data utility to be improved in several ways. Existing approaches like k-anonymity and t-closeness are based on generalization and suppression; since k-anonymity does not deal with attribute disclosure, this work focuses on closeness. This paper proposes and shows how to use microaggregation to generate (n, t)-close data sets. The advantages of microaggregation with (n, t)-closeness are analyzed, and the ultimate aim of the project is to make a comparative analysis and evaluate microaggregation algorithms for t-closeness and (n, t)-closeness. There are many real-life situations in which personal data is stored: (i) electronic commerce results in the automated collection of large amounts of consumer data, which are gathered by many companies and shared with subsidiaries and partners; (ii) health care is a very sensitive sector with strict regulations. In the U.S., the Privacy Rule of the Health Insurance Portability and Accountability Act (HIPAA) requires the strict regulation of protected health information for use in medical research, and in most western countries the situation is similar.
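A minimal sketch of the microaggregation step itself, as a simplified MDAV-style heuristic; the (n, t)-closeness constraints on the sensitive attribute that the paper adds on top are not shown:

```python
import numpy as np

def microaggregate(X: np.ndarray, k: int) -> np.ndarray:
    """Simplified MDAV-style k-partition: each record is replaced by the
    centroid of a group of >= k similar records, so released records are
    identical within their group (k <= len(X) is assumed)."""
    X = X.astype(float)
    out = X.copy()
    remaining = list(range(len(X)))
    while len(remaining) >= 2 * k:
        pts = X[remaining]
        centroid = pts.mean(axis=0)
        # pick the record farthest from the centroid and its k-1 neighbors
        far = remaining[np.argmax(np.linalg.norm(pts - centroid, axis=1))]
        d = np.linalg.norm(X[remaining] - X[far], axis=1)
        group = [remaining[i] for i in np.argsort(d)[:k]]
        out[group] = X[group].mean(axis=0)
        remaining = [i for i in remaining if i not in group]
    out[remaining] = X[remaining].mean(axis=0)   # last (possibly larger) group
    return out

X = np.random.rand(10, 2)
print(microaggregate(X, k=3))
```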
Review of Efficient Routing In Delay Tolerant Network
Sonia Kumari, Pratibha Yadav, Manvendra Yadav
Delay Tolerant Networking (DTN) [2][3] has been proposed as a solution for routing in a network which is not always connected. An ad-hoc network [6] is a decentralized type of wireless network: there is no pre-existing infrastructure, such as a base station, to coordinate the flow of messages between the nodes. To deliver a message to a node, the shortest path to the destination node is computed, and ad-hoc networks can use flooding for forwarding the data. Ad-hoc routing assumes that there is end-to-end connectivity between the nodes; however, this is not the case for DTNs. DTNs, or Delay Tolerant Networks, are ad-hoc networks in which contacts are not available at all times: messages are buffered until contacts become available and are then forwarded over them. In this paper, we evaluate some of the most prominent routing protocols for Delay-Tolerant Networks, such as Epidemic [1], PRoPHET, Contact Graph Routing and Dynamic Social Grouping (DSG).
Review on Automated Text Summarizer using Top K-Rules
Ms. Priya J. Patel, Prof. Pravin G. Kulurkar
In this paper we address the automatic text summarization task. Text summarization has been shown to be an improvement over manually summarizing large data: it extracts the salient features from the text while preserving the content, and produces a meaningful summary. The goal is to design an algorithm that summarizes a document by extracting key text, refining the extraction using a thesaurus, and reducing a given body of text to a fraction of its size while maintaining coherence and semantics. This summarization can be done with a natural language processing approach integrated with rule mining.
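A minimal sketch of the extractive core described above (frequency-scored sentence selection only; the thesaurus-based modification and the Top-K rule mining named in the title are beyond this sketch):

```python
import re
from collections import Counter

def summarize(text: str, ratio: float = 0.3) -> str:
    """Frequency-based extractive summary: score each sentence by the
    frequencies of its words and keep the top fraction, in original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    words = re.findall(r'[a-z]+', text.lower())
    stop = {'the', 'a', 'an', 'of', 'to', 'and', 'in', 'is', 'it', 'that'}
    freq = Counter(w for w in words if w not in stop)

    def score(s):
        toks = re.findall(r'[a-z]+', s.lower())
        return sum(freq[t] for t in toks) / (len(toks) or 1)

    keep = max(1, int(len(sentences) * ratio))
    top = sorted(sorted(sentences, key=score, reverse=True)[:keep],
                 key=sentences.index)          # restore document order
    return ' '.join(top)
```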
Study On Cardiovascular Disease Classification Using Machine Learning Approaches
R. Subha
The diagnosis of heart disease depends in most cases on a complex grouping of clinical and pathological data. Due to this complexity, interest has increased significantly among researchers and clinical professionals in efficient and accurate heart disease prediction. In the case of heart disease, a correct diagnosis at an early stage is important, as time is very crucial, yet a number of tests are required of the patient for detecting the disease. Machine learning based methods are used to classify between healthy people and people with disease. Cardiovascular disease is the principal source of deaths worldwide, and its prediction at an early phase is significant; in order to reduce the number of deaths from heart diseases there has to be a quick and efficient detection technique. This work presents a comprehensive review of the prediction of cardiovascular disease using machine learning based approaches.
Map-based Multi-source Message Passing Model to find Betweenness Centrality using MapReduce
Nikhitha Cyril, Arun Soman
The need for large-scale graph analysis and for efficient methods to perform it is increasing. The MapReduce framework has already been accepted as a standard for large-scale analysis. Betweenness centrality is highly significant for measuring the importance of a node in a network. In this paper, we propose a map-based method to find betweenness centrality using the multi-source message passing model. The model has two phases, namely the breadth-first traversal phase and the backtracking phase. In the breadth-first traversal phase, we traverse the graph in breadth-first order by sending and receiving messages, and in the process find the shortest paths in the graph. In the backtracking phase, we traverse back to the source node and, while doing so, find the dependency of the source node on other vertices by transmitting messages. Both phases are implemented in the map-based method, which consists of an initial MapReduce job followed by a number of iterations of mapper functions until there are no more messages to send. Betweenness centrality is then found by summing the dependencies of all the source nodes on the vertex.
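For reference, a single-machine sketch of the same two phases; this is essentially Brandes' accumulation, and the MapReduce/message-passing distribution of these steps described in the paper is not reproduced here:

```python
from collections import deque, defaultdict

def betweenness(graph):
    """Two-phase computation per source: a breadth-first traversal that
    counts shortest paths, then backtracking that accumulates each
    source's dependency on every other vertex."""
    bc = dict.fromkeys(graph, 0.0)
    for s in graph:
        # Phase 1: BFS from s, counting shortest paths (sigma).
        sigma = dict.fromkeys(graph, 0); sigma[s] = 1
        dist = dict.fromkeys(graph, -1); dist[s] = 0
        preds = defaultdict(list)
        order, q = [], deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        # Phase 2: backtrack in reverse BFS order, passing dependencies.
        delta = dict.fromkeys(graph, 0.0)
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

g = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(betweenness(g))   # undirected counts are doubled; halve if needed
```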
Although tremendous progress has been made in medical image processing in the past decade for evaluating clinical information from medical images, a number of problems still exist. Adverse cases of clinical analysis have been recorded in which the physician fails to analyze the patient's condition from a single medical source image. In this paper a novel medical image fusion method based on the NSCT domain is presented which proves to be more efficient than conventional approaches. In conventional algorithms, no relevant research has been carried out on obtaining detailed low-frequency and high-frequency coefficients, which would help further in a reliable fusion process. In the proposed method, phase congruency and directive contrast are used to yield a reliable analytical treatment of the low-frequency and high-frequency coefficients, and the fused image is reconstructed from the acquired composite coefficients. In the experimental results, the performance gain of the proposed method over conventional approaches can be clearly seen; this multimodal fusion approach has been successfully applied to Alzheimer, subacute stroke and recurrent tumor cases, which shows the clinical ability of the proposed method in terms of good accuracy and better performance. An extension is done in the YCbCr color space for better analysis.
PAPR Reduction And Companding Distortion Mitigation By Piecewise Linear Companding
Kalluri Ravi Kumar, K.V.Acharyulu
In OFDM communication systems, the main disadvantage is the Peak-to-Average Power Ratio (PAPR), which limits the performance of the overall system. There are many techniques to limit this factor, depending on the nature of the system, such as clipping, partial transmit sequences, selective mapping, and companding transforms. Among these, companding is a simple methodology that compresses the input signal around inflection points to reduce the PAPR, while a decompander at the receiver expands the companded signal from the transmitter section of the OFDM system. Piecewise linear companding is based on linear equations to compress the OFDM sequence, where the companding distortion must be considered; in this paper we present an efficient companding scheme based on piecewise linear equations. The whole system is considered under the ETU channel model. The simulation results show reduced PAPR and a good BER for the OFDM system.
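A minimal sketch of the quantities involved; the segment slopes and the inflection point a below are illustrative assumptions, whereas the paper's design chooses them specifically to control companding distortion:

```python
import numpy as np

def papr_db(x: np.ndarray) -> float:
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def piecewise_compand(x: np.ndarray, a: float) -> np.ndarray:
    """Illustrative piecewise linear compander: amplitudes above the
    inflection point `a` are linearly compressed toward it (slope 1,
    then slope 0.5), preserving the phase of each sample."""
    mag = np.abs(x)
    out = np.where(mag <= a, mag, a + 0.5 * (mag - a))
    return out * np.exp(1j * np.angle(x))

# One OFDM symbol: 64 random QPSK subcarriers, IFFT to the time domain.
sym = np.random.choice([-1, 1], 64) + 1j * np.random.choice([-1, 1], 64)
t = np.fft.ifft(sym)
print(papr_db(t), papr_db(piecewise_compand(t, a=1.2 * np.abs(t).mean())))
```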
This paper presents various types of syntactic errors, their detection, and a system for their correction. The detection and correction of syntactic errors is motivated by the need to produce grammatically correct sentences. Even the best systems, however, produce syntactic errors due to a lack of linguistic knowledge. An attempt to optimize the task of detecting and correcting syntactic errors is reflected in this paper.
A New Digital Video Quality Enhancement Approach Based On Generalized Histogram Equalization Model
Ramjee Pinipe, K.V.Acharyulu
In this paper, we propose a generalized equalization model for video enhancement. Based on our analysis of the relationships between the video histogram and contrast enhancement and white balancing, we first establish a generalized equalization model integrating contrast enhancement and white balancing into a unified framework of convex programming of the video histogram. We show that many video enhancement tasks can be accomplished by the proposed model using different configurations of parameters. With two defining properties of the histogram transform, namely contrast gain and nonlinearity, the model parameters for different enhancement applications can be optimized. We then derive an optimal video enhancement algorithm that theoretically achieves the best joint contrast enhancement and white balancing result while trading off between contrast enhancement and tonal distortion.
A Dual System Capture Biometric Fingerprint Scanner
Oduah, Uzoma Ifeanyi (Dr.), Yang Wu
In this research, a dual-capture method is applied in the development of a biometric fingerprint scanner with enhanced sensitivity and resolution. The device utilizes a combination of optical imaging and capacitive imaging to capture the biometric features of the fingerprint, shuffling between the optical mode and the capacitive mode while scanning. The designed software compares and matches fingerprints using Fourier series. By capturing and matching the minutia features and fingerprint patterns, this scanner can be used in cryptography to authenticate a user.
Refocusing Computer Engineering and Computer Sciences in Educational Institutions For Economic Sustainability And Local Input Contents Of Computer And Mobile Phones In Nigeria
Yekini N. Asafe, Adebari F. Adebayo, Aigbokhan E. Edwin
This paper focuses on refocusing computer engineering and computer science education for economic sustainability and local input content of computers and mobile phones in Nigeria. There are over ten million computer systems and more than a hundred million mobile phones in use in Nigeria. All hardware, software and accessories of the computer systems and mobile phone components used in Nigeria are imported from foreign countries (e.g. China, Taiwan, the USA, Europe and Dubai) and paid for in foreign currencies exchanged from the Nigerian currency (naira). The foreign exchange outflow through this importation is adverse to the country's economic sustainability. Nigeria can improve her value added tax on locally made or assembled computers and mobile phones by fabricating some of the components and accessories from local raw materials. This could be achieved by refocusing and rebranding computer engineering and computer science education so that students acquire the skills and knowledge needed for industrial revolution and entrepreneurship for economic sustainability.
Classification of astronomical data has always been a challenging problem faced by astronomers. Classification performed by an ensemble method gives more accurate results than a single classifier. In the present study, a model for spectral classification is developed using the Salford predictive modeler in the context of the astronomical object catalogue SDSS. The random forest ensemble method is used for model development, and the classification results are analyzed. Experimental results show that the model is beneficial for spectral classification.
The use of technology in education is still very limited compared to other disciplines like medicine. In recent years, researchers have focused on serious games, because of the importance of games in learning and the enthusiasm of adults for playing them; yet this work is still at the research stage. The model of a serious game exposes its concepts, which must be matched with the concepts of education. In this paper we suggest a model of a serious game which includes most principles of an educational model, and then show how such a game can be used in education.
Power Aware and Energy Efficient Multipath Routing Scheme for WSNs
Dr.S.Saira Banu, J.Shafiq Mansoor
Wireless Sensor Networks consist of sensor nodes which move randomly. Energy-based routing in a wireless sensor network is a demanding task. Power-aware routing attains application-specified communication delays at low energy cost by adapting transmission power and routing decisions. Some of the major issues in wireless sensor networks are energy consumption, lack of authentication, data integrity and instability of the path links between sensor nodes, which reduce the appeal of the sensor network. In recent years, existing methods have focused on attaining either low energy consumption or security. The approach here combines optimized multipath routing with a power-aware, energy-based routing scheme to make wireless sensor networks more secure with minimum energy consumption. The multipath routing is constructed to achieve high throughput and load balancing; the optimal energy path is established to keep the data packet flow in the network unobstructed, and an energy consumption model is developed to produce the minimum energy consumption.
Universal Asynchronous Receiver Transmitter (UART) is a serial communication interface. This paper presents the design of UART for FPGA based systems using Verilog. The UART design has programmable features for Transmission, Reception and Baud Rate generation. It has FIFO storage, programmable serial interface characteristics, complete status reporting capabilities and error detection. The design is implemented using Hardware Description Language Verilog. The design is simulated and verified on Xilinx ISE.
A social network can be generally defined as a group of individuals connected by a set of relationships. A key characteristic of social networks is their continual change. However, the bulk of the analysis methods developed and popularized in computing have been static, in that all information about the times at which social interactions take place is discarded, although recently there has been some work on dynamic social network analysis. In this article we present an overview of the dynamic social grouping algorithm [10][11]. Delivery probabilities and social behaviour are two common criteria used to route a message in a disruption/delay tolerant network, wherein there is only intermittent connectivity between the nodes. We first discuss how the characteristics of these routing algorithms can be exploited by a malicious node to attract data packets and then drop them to degrade network performance. We then show the impact of such behaviour, called a blackhole attack, on DSG, as it leverages both the social behaviour and the delivery probabilities to make forwarding decisions. We present three solutions to mitigate blackhole attacks. The first algorithm mitigates non-collaborating blackhole nodes. In the second algorithm, we present a solution that handles collaborating blackhole nodes. The first two algorithms handle only external attacks: they do not handle the scenario in which a node that is initially good becomes malicious or selfish later. Finally, we present our third algorithm, which handles collaborative blackholes as well as internal attacks. We validate the performance of our algorithms through extensive simulation in the ONE simulator.
Analysis of a Semi-Markov Model for Software Reliability
Ashish Namdeo, V.K. Pathak, Anil Tiwari
In this paper, a semi-Markov model is analysed with reference to the famous Jelinski-Moranda model, which is probably the first model in software reliability. Fault removal resulting from the execution of a program depends on the occurrence of the associated failure. The occurrence of failures depends both on the length of time for which the software has been executing and on the execution environment or operating conditions. When different functions are executed, different faults are encountered, and the failures that are exhibited tend to be different.
A New Approach For Text Detection In Natural Scene Images
Sonia Jenifer Rayen
Text detection in natural scene images is an important requirement for several content-based image analysis tasks. Text data help in understanding an image, and support scanned document analysis, electricity meter reading, street name reading, vehicle license plate reading, sign detection and translation, and mobile text character recognition by image processing. This paper proposes an accurate and robust method for detecting texts in natural scene images. A fast and effective pruning algorithm is designed to extract Maximally Stable Extremal Regions (MSERs) as character candidates, using the strategy of minimizing regularized variations. Character candidates are classified into text candidates by the single-link clustering algorithm used here, where distance weights and the clustering threshold are learned automatically by a novel self-training distance metric learning algorithm.
Practical Energy Saving Techniques For Multi-Storey Office Buildings In Accra, Ghana
Barbara SIMONS, Christian KORANTENG, Joshua AYARKWA
The recent influx of glazing materials used as curtain walls in Ghana has resulted in increased use of energy to provide thermally comfortable interior spaces for occupants. The 'glass box' phenomenon in Ghana has arisen from contemporary architecture. The climate of Ghana (warm and humid) makes it almost impossible for these multi-storey glass boxes to be ventilated naturally throughout the year.
The current study aimed at finding means of significantly reducing the cooling loads of these multi-storey office buildings whilst ensuring thermal comfort. Four high-rise office buildings in Accra were selected for the study. Several parameters, ranging from efficient glazing and thermal mass to façade insulation and night ventilation, were probed with a view to reducing cooling loads.
The study revealed, amongst other findings, that reduced lighting loads of 2 W/m² considerably decreased cooling loads in all the buildings. Efficient glazing with a low solar heat gain coefficient also cut cooling loads by as much as 27%. Additionally, it was realized that external shading could reduce cooling loads, and architects should therefore make a conscious effort to provide it when designing these buildings.
In the past couple of years, video streaming on the web has experienced rapid growth, and it will continue to grow in importance as broadband technologies and authoring tools continue to improve. As the internet becomes an increasingly popular alternative to traditional communications media, internet streaming will become a critical segment of many content providers' communications strategies. In this paper we propose a solution for HTTP live streaming which assesses the weights of media segments to choose transmission priorities based on the current playing time, and selects the appropriate transmission path.
Intrusion Detection Alarms Filtering System Based On Ant Clustering Approach
Xiao-long XU, Zhong-he GAO, Li-juan HAN
With the increase in network attacks, network information security has become an issue of global concern. The problem with mainstream intrusion detection systems is the huge volume of alarm information and the high false positive rate. This paper presents a data mining technique to reduce the false positive rate and improve the accuracy of detection. The technique is an unsupervised clustering method based on a hybrid ant algorithm, which can discover clusters of intruder behavior without prior knowledge; the K-means algorithm is used to improve the convergence speed of the ant clustering. Experimental results show that the proposed approach has a higher detection rate and a lower false alarm rate.
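A minimal sketch of the K-means refinement stage named above; the ant phase is omitted, and in the hybrid its output would seed the initial centers, which are random here:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means; in the hybrid scheme the ant clustering's coarse
    groups would supply the initial `centers` to speed convergence."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy alarm feature vectors (e.g., normalized source/destination/port stats).
alarms = np.vstack([np.random.randn(50, 3) + 3, np.random.randn(50, 3) - 3])
labels, centers = kmeans(alarms, k=2)
```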
The Role Of E-Learning In Producing Independent Students With Critical Thinking
Mazen Ismaeel Ghareb, Saman Ali Mohammed
This research provides a thorough background on E-learning and how it has come about. It focuses on the important role of E-learning in a time when life is saturated with technology, and explains how E-learning can be an effective way of learning. The research takes UHD students from both the College of Computer Science and the College of Languages as samples, and deals with the fundamental achievements one can gain through E-learning. Life is a non-stop progression towards technology and E-government in all their different parts; in such a setting, how can E-learning help improve a nation and advance a country? This paper also attempts to understand the role of E-learning in advancing the KRI.
Speech Enhancement Using Modified Spectral Subtraction Algorithm
Pinki, Sahil Gupta
The term "speech enhancement" refers to improving the quality or intelligibility of a speech signal. Speech signals are often degraded by additive background noise such as babble noise, train noise, and restaurant noise, and in such noisy environments the listening task is very difficult for the end user. Speech enhancement is also often used to pre-process speech for computer speech recognition systems. This paper presents speech enhancement methods, namely spectral subtraction and a modified spectral subtraction, to reduce additive background noise; both are single-channel speech enhancement methods. Here, we take a clean wave file, observe the spectrogram of its noisy version, and examine the effect of the existing spectral subtraction and the proposed spectral subtraction algorithm on the file.
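A minimal sketch of the baseline method (classic over-subtraction with a spectral floor); the paper's modification is not specified in the abstract, so the alpha and beta values here are illustrative:

```python
import numpy as np

def spectral_subtract(noisy, noise, alpha=2.0, beta=0.01, n=256):
    """Frame-wise magnitude spectral subtraction with 50% overlap-add;
    `noise` is a noise-only reference used to estimate the noise spectrum."""
    noise_mag = np.abs(np.fft.rfft(noise[:n] * np.hanning(n)))
    out = np.zeros(len(noisy))
    win = np.hanning(n)
    for start in range(0, len(noisy) - n, n // 2):
        frame = noisy[start:start + n] * win
        spec = np.fft.rfft(frame)
        mag = np.abs(spec) - alpha * noise_mag        # over-subtract noise
        mag = np.maximum(mag, beta * noise_mag)       # spectral floor
        out[start:start + n] += np.fft.irfft(mag * np.exp(1j * np.angle(spec)))
    return out

fs = 8000
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 440 * t)
noise = 0.3 * np.random.randn(fs)
enhanced = spectral_subtract(clean + noise, noise)
```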
Overview of Hidden Markov Model for Text-To-Speech Synthesis Methods
Sangramsing N. Kayte, Monica Mundada
Speech synthesis is the production of artificial speech, and the system used to generate speech from text is called a text-to-speech (TTS) system. In a TTS system, text and voice models for one or more languages are given as input, and the system generates speech output corresponding to the provided voice models. Speech synthesis systems can be extremely useful to people who are visually challenged, visually impaired or illiterate, helping them get into mainstream society; more recent applications include spoken dialogue systems and communicative robots. HMM (Hidden Markov Model) based speech synthesis is the emerging technology for TTS. An HMM-based speech synthesis system consists of a training phase and a synthesis phase. In the training phase, phone and excitation parameters are extracted from a speech database and modeled by context-dependent HMMs. In the synthesis phase, the system extracts the suitable phone and excitation parameters from the previously trained models and generates the speech.
An Automated Guided Model For Integrating News Into Stock Trading Strategies
Pallavi Parshuram Katke, Asst. Prof. B.R.Solunke
The proposed automated model merges news into stock trading strategies using genetic programming. Events are retrieved from news in free text. The model is tested by deriving trading strategies based on technical indicators and the impacts of extracted events. The trading strategies take the form of rules that combine technical indicators with a news variable, and are revealed through the use of genetic programming. The news variable is present in the best trading strategies, indicating the added value of news for predictive purposes and validating our proposed model for automatically merging news into stock trading strategies.
Fake Currency Verification Using Blue Pixel Region Analysis And Image Difference
Rajarshi Paul, Dr. Sanjib Kalita
Indian paper currencies are nowadays forged by counterfeiters using various tricky techniques and are very difficult to detect: a regular bearer may easily be misled by such a note and consider it genuine for transaction purposes. Verifying an Indian currency note with image processing techniques is a challenging job. This paper demonstrates a method of comparing the RGB colors of the variable color-shifting region of a genuine note with those of a fake note, producing a differential comparison of the extracted regions and a subsequent analysis that validates the procedure.
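A minimal sketch of such a check; the blue-dominance margin and the acceptance tolerance are invented for illustration, since the paper's exact region and thresholds are not given in the abstract:

```python
import numpy as np

def blue_pixel_count(roi: np.ndarray, margin: int = 20) -> int:
    """Count pixels whose blue channel dominates red and green by `margin`
    inside the color-shifting security region (RGB uint8 array)."""
    r = roi[..., 0].astype(int)
    g = roi[..., 1].astype(int)
    b = roi[..., 2].astype(int)
    return int(np.count_nonzero((b > r + margin) & (b > g + margin)))

def verify(genuine_roi, test_roi, tol=0.15):
    """Accept the note if the blue-region ratio and the mean image
    difference stay close to the genuine reference (tol is an assumption)."""
    ratio_g = blue_pixel_count(genuine_roi) / genuine_roi[..., 0].size
    ratio_t = blue_pixel_count(test_roi) / test_roi[..., 0].size
    diff = np.abs(genuine_roi.astype(int) - test_roi.astype(int)).mean() / 255
    return abs(ratio_g - ratio_t) < tol and diff < tol

g = np.random.randint(0, 256, (50, 50, 3), dtype=np.uint8)
print(verify(g, g))   # identical regions -> True
```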
Efficient Storage of Defect Maps for Nanoscale Memory
Md. Masud Parvez
Nanoscale technology promises dramatic increases in device density, but reliability is decreased as a side-effect. With bit-error rates projected to be as high as 10%, designing a usable nanoscale memory system poses a significant challenge. Storing defect information corresponding to every bit in the nanoscale device using a reliable storage bit is prohibitively costly. Using a Bloom filter to store a defect map provides better compression at the cost of a small false positive rate (usable memory mapped as defective). Using a list-based technique for storing defect maps performs well for correlated errors, but poorly for randomly distributed defects. In this paper, we propose an algorithm for partitioning correlated defects from random ones. The motivation is to store the correlated defects as rectangular ranges in a ternary content-addressable memory (TCAM) and the random defects in a Bloom filter. We believe that a combination of a Bloom filter and a small TCAM is more effective for storing the defect map at high error rates. We show results for different correlated distributions.
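A minimal sketch of the Bloom-filter half of the scheme; the filter size and hash count are illustrative, and the TCAM side that holds the rectangular ranges of correlated defects is not shown:

```python
import hashlib

class BloomDefectMap:
    """Bloom filter over defective addresses: no false negatives, and a
    small false-positive rate (usable bits reported as defective)."""
    def __init__(self, m_bits: int = 1 << 16, k: int = 4):
        self.m, self.k = m_bits, k
        self.bits = bytearray(m_bits // 8)

    def _hashes(self, addr: int):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{addr}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def mark_defective(self, addr: int):
        for h in self._hashes(addr):
            self.bits[h // 8] |= 1 << (h % 8)

    def maybe_defective(self, addr: int) -> bool:
        return all(self.bits[h // 8] & (1 << (h % 8)) for h in self._hashes(addr))

bf = BloomDefectMap()
bf.mark_defective(0xDEAD)
print(bf.maybe_defective(0xDEAD), bf.maybe_defective(0xBEEF))  # True, (almost surely) False
```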
The main aim of this project is to monitor the faults in a motor by wireless communication. In our proposal, a WSN-based solution to both electrical and mechanical faults in an industrial machine is obtained. The industrial machine model and its various types of faults are analyzed using an ARM9 LPC2929 processor. Although tremendous progress has been made in past years, a lot of problems still need to be resolved: existing WSN-based systems provide only short-distance communication and need human interaction. The proposed system is able to resolve the issues found in existing methods and offers long-distance communication, safe operation, reduced maintenance time and increased operational reliability.
DETECTION OF LICENSE PLATE NUMBER USING DYNAMIC IMAGE PROCESSING TECHNIQUE
KUMMARI BHIKSHAPATHI, DAMA HARIBABU
It is believed that there are currently millions of vehicles on the roads worldwide. Over-speeding, vehicle theft, disobeying traffic rules in public, and unauthorized persons entering restricted areas keep on increasing, so an automatic public security system is needed. Edges are a basic feature of an image, and image edges carry rich information that is very significant for number plate detection based on morphological operations and object recognition. This paper describes the design and development of automatic number plate recognition: dynamic image processing techniques are used to recognize the license plate number from an image containing it, even for pictures prone to illumination problems. Each vehicle has its own Vehicle Identification Number (VIN) as its primary identifier; the VIN is in effect a license number which grants a legal license to participate in public traffic. The proposed work identifies the vehicle with the help of the vehicle's License Plate (LP); the License Plate Recognition System (LPRS) is one of the most important parts of the Intelligent Transportation System (ITS) for locating the LP. In this paper, drawbacks of certain existing algorithms are overcome by the proposed morphological operations for LPRS. Morphological operations are chosen due to their higher efficiency, noise-filtering capacity, accuracy, exact localization of the LP, and speed.
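A minimal sketch of the morphological idea: a closing operation that merges the dense character edges of a plate region into one blob. Pure NumPy with square structuring elements is used, and the binary edge map is assumed to come from an earlier detector:

```python
import numpy as np

def dilate(img, k=3):
    """Binary dilation with a k*k square structuring element."""
    pad = k // 2
    p = np.pad(img, pad)                       # background (0) padding
    return np.max([p[i:i + img.shape[0], j:j + img.shape[1]]
                   for i in range(k) for j in range(k)], axis=0)

def erode(img, k=3):
    """Binary erosion with a k*k square structuring element."""
    pad = k // 2
    p = np.pad(img, pad, constant_values=1)    # foreground (1) padding
    return np.min([p[i:i + img.shape[0], j:j + img.shape[1]]
                   for i in range(k) for j in range(k)], axis=0)

def plate_candidates(edges, k=5):
    """Morphological closing: gaps between character edges are bridged so
    the dense edge cluster of a license plate merges into one blob."""
    return erode(dilate(edges, k), k)
```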
This research produces a highly efficient energy saving electroluminescent enhanced blue light emitting diode.
The fabrication of this enhanced blue LED involves the growth of a complex semiconductor crystal with advanced heterostructure bandgap design and absolute optimization of the light out-coupling to maximize efficiency.
This research discusses innovative ideas which enhance the efficiency of the white light source by critically analyzing the processes involved. It covers the fabrication of a blue GaN LED chip, aimed at improving the efficacy of each building-block stage, from substrates, buffers and epitaxy, to physics, processing and devices, to lamps, luminaires and systems.
DoS Resistant Cloud Based Secure Authentication Protocol
Dr. Kalavathi, Ms. Y. Haritha
Cloud computing has become crucial in today's technical environment, and identity verification is the main functionality in cloud computing services. As of today, however, these identity protection gateways are very vulnerable to denial-of-service attacks. Numerous authentication protocols are available that are very strong at protecting identities in traditional networked applications, but they suffer from DoS attacks when used in cloud computing applications. This failure is due to the heavy utilization of cloud resources and the large amount of verification processing involved in these services. In this paper the authors propose a new authentication protocol which withstands internal and external denial-of-service attacks. The proposed solution involves multilevel authentication schemes: the technique identifies legitimate users' requests and moves them to the front of the processing queue.
Speech Classification Using MFCC, Power Spectrum And Neural Network
Mr. Ashish Gadewar, Mahesh Navale
A speech signal carries rich emotional information besides semantic information, and speech emotion recognition improves the quality of human-computer interaction, allowing easier-to-use interfaces for every level of user in software applications. Common emotions, namely exclamatory, neutral and interrogative, are discussed and recognized through a proposed framework in which Mel Frequency Cepstrum Coefficients (MFCC) and the power spectrum are used for feature extraction, and a back-propagation neural network is used for recognition of the emotional speech signals. This will further be used for transplanting that emotion into synthetic speech so that the output quality of the synthesis is improved.
Decision Forecast Modelling For Effective Execution Of Research Projects In R&D Labs Through BI Models
R.Sivakami, G.Radhakrishnan, G.Anna Poorani
Research institutions adopting Enterprise Resource Planning (ERP) packages for their day-to-day operations generate enormous data. An effective methodology for analyzing these data will add value to the research and assist the management in the decision-making process through Business Intelligence (BI) models. To accomplish this, specific rules, techniques and algorithms need to be developed in order to make the ERP system intelligent. These rules will bring out the hidden business intelligence in the ERP data and thereby improve the outcome and deliverables of projects, apart from providing support to the scientists. This forms a cyclic structure, as the outcome of the intelligence found is analyzed again once it is incorporated into the ERP database itself. BI has been used extensively in the manufacturing sector and business houses but seldom in R&D institutions and laboratories. This research work envisages using BI models to improve performance and provide support to large R&D labs like the Council of Scientific and Industrial Research (CSIR).
Comparative Study Of Different User Authentication Schemes
Aakansha S Wani, Komal Vanjari, Deepika Shinde, Prof. Rajasree R.S
Authentication is considered a significant element of security for verifying a user's identity. Many authentication schemes depend on a user name/password pair, but these are considered weak techniques of user authentication because they are prone to dictionary attacks, man-in-the-middle attacks, and so on. A more secure scheme is two-factor authentication, which does not only verify the user name/password pair but also needs a second factor such as a token device or a biometric. This paper presents a comparative study of different two- and three-factor authentication techniques that can withstand common security attacks and have good user authentication performance.
Flexure mechanisms have a massive range of industrial applications requiring high-precision, frictionless motion. There have been many studies of concepts for precision manipulators, but only some of them achieve high speed together with precision. Pro-E software is used for parametric modeling of the XY positioning table, and ANSYS is used for static and dynamic analysis. The deflection of the motion is obtained by static analysis under force. The deformation of the XY mechanism is equivalent to the deformation of an S-shaped cantilever beam, and the force-deformation curve is linear. The results are compared with mathematical calculations against the FEA results.
Efficient Content Based Image Retrieval Using Combination Of Dominant-Color, Shape And Texture Features And K-Means Clustering
Bhagyashri Amrutkar, Lokesh Singh
There is a huge demand for efficient content-based image retrieval systems because of the availability of large image databases. In this paper we present an efficient CBIR framework that extracts dominant-color, texture and edge features and clusters the feature database. Dominant color extraction uses a color quantization technique: the image is first divided into partitions (eight in our case) using the color quantization algorithm, and the eight dominant colors are obtained from those partitions. Next, for shape feature extraction, the Sobel color edge detection technique is used, and the local binary pattern (LBP) operator is applied to the grayscale image to extract the texture feature. All the features discussed above are then combined to form a single feature vector. K-means clustering is applied over the combined feature vectors of the database images. Finally, to retrieve similar images from the database, similarity matching is performed using the Euclidean distance, which compares the feature vectors of the clustered database images with that of the query image. The proposed approach provides efficient and more accurate results.
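A compact sketch of the combine-then-match idea. The descriptors here are deliberately simplified stand-ins (a 2x2x2 color histogram for dominant colors, gradient energy for edges, and a one-neighbor LBP-like code for texture), and the k-means clustering stage is omitted:

```python
import numpy as np

def feature_vector(img: np.ndarray) -> np.ndarray:
    """Concatenate simplified color, edge and texture descriptors into a
    single normalized feature vector (img is an RGB uint8 array)."""
    gray = img.mean(axis=2)
    # 1) dominant color: 2x2x2 quantized RGB histogram
    q = (img // 128).reshape(-1, 3)
    color = np.bincount(q @ np.array([4, 2, 1]), minlength=8).astype(float)
    # 2) edges: mean horizontal/vertical gradient magnitude
    gx = np.abs(np.diff(gray, axis=1)).mean()
    gy = np.abs(np.diff(gray, axis=0)).mean()
    # 3) texture: greatly simplified LBP-like code vs. the left neighbor
    lbp = (gray[:, 1:] >= gray[:, :-1]).astype(int)
    tex = np.bincount(lbp.ravel(), minlength=2).astype(float)
    v = np.concatenate([color / color.sum(), [gx, gy], tex / tex.sum()])
    return v / (np.linalg.norm(v) + 1e-12)

def retrieve(query, database, top=3):
    """Rank database images by Euclidean distance between feature vectors."""
    qv = feature_vector(query)
    d = [np.linalg.norm(qv - feature_vector(im)) for im in database]
    return np.argsort(d)[:top]
```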
Prediction Search Of Topics Based On Timestamp And Interest Of Domain With Advertising
Annamalai.R, Srikanth.J, Prakash.M
In today's scenario, web search plays an important role in various search services over the internet. To provide the best search results, the proposed model suggests three modes for the ranking process: 1. user-query-based advertisement, 2. timestamp-based analysis, and 3. domain-based search. In the first process, when the user searches a query, related advertisements are displayed in the web browser while unwanted advertisements are eliminated, making the view of the web pages more optimized and genuinely useful for the users. In the second process, we trace the session, i.e. the timestamp of a particular user; using the timestamp, the page with the maximum dwell time in seconds is ranked first. In the third process, based on the particular search query, the forum for the domain chosen as an option is displayed: before entering the search, users can opt for the domain of their interest and finely refine the search results to that interest. This proposed model is implemented by applying a stemming algorithm and timestamps.
Providing Multi Security In Privacy Preserving Data Mining
S.Nathiya, C.Kuyin, J.D.Sundari
Privacy Preserving Data Mining (PPDM) addresses the problem of developing accurate models of aggregated data without access to precise information in individual data records. In our setting, the more trusted a data miner is, the less perturbed a copy of the original data it can access.
Under this setting, a malicious data miner may have access to differently perturbed copies of the fake data through various means, and may combine these diverse copies to jointly infer additional information about the fake data that the data owner does not intend to release.
Preventing such diversity attacks is the key challenge in providing multilevel security in Privacy Preserving Data Mining services. We address this challenge by properly correlating perturbation across copies at different trust levels, and we prove that our solution is robust against diversity attacks with respect to our privacy goal. Our solution allows perturbed copies of the fake data to be generated for arbitrary trust levels on demand.
Implementing Radix Sort With Linked Buckets Using LSD & MSD And Their Comparative Analysis And Discussion On Applications
Mr. Murali Krishna Senapaty, Mrs. Padmaja Patel, Mr. Ranjeet Panigrahi
Here I present an implementation of radix sort with linked buckets to reduce memory usage for large data sets. In this research paper I discuss the various ways to implement radix sort, the problems with radix sort, a brief study of previous work on radix sort, and the use of radix sort for large data sets, and I analyse the memory usage problems of radix sort through this algorithm. The C language is used to execute and analyse the algorithm.
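A minimal LSD sketch of the linked-bucket idea, shown in Python for brevity although the paper works in C; non-negative integer keys and decimal digits are assumed:

```python
class Node:
    __slots__ = ("val", "next")
    def __init__(self, val):
        self.val, self.next = val, None

def radix_sort_linked(values, base=10):
    """LSD radix sort where each bucket is a singly linked list, so a
    bucket's memory grows only with the keys actually routed to it."""
    if not values:
        return []
    head = tail = Node(values[0])                 # build the input chain
    for v in values[1:]:
        tail.next = Node(v); tail = tail.next
    exp = 1
    for _ in range(len(str(max(values)))):        # one pass per digit
        heads = [None] * base; tails = [None] * base
        node = head
        while node:                               # distribute into buckets
            nxt, node.next = node.next, None
            d = (node.val // exp) % base
            if heads[d] is None:
                heads[d] = tails[d] = node
            else:
                tails[d].next = node; tails[d] = node
            node = nxt
        head = tail = None                        # re-link in digit order
        for d in range(base):
            if heads[d] is None:
                continue
            if head is None:
                head = heads[d]
            else:
                tail.next = heads[d]
            tail = tails[d]
        exp *= base
    out, node = [], head
    while node:
        out.append(node.val); node = node.next
    return out

print(radix_sort_linked([170, 45, 75, 90, 802, 24, 2, 66]))
```

Because each bucket is a chain of nodes rather than a pre-sized array, no space is reserved for digits that never occur, which is the memory-usage point the paper develops.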
Protecting information in potentially adverse environments is a crucial factor in the growth of information-based processes in industry, business, and administration, as information is the most important asset of an organization. Cryptography is the practice and study of techniques for secure communication in the presence of third parties (called adversaries). The Advanced Encryption Standard (AES) specifies a FIPS-approved cryptographic algorithm that can be used to protect electronic data. The AES algorithm is a symmetric block cipher that can encrypt (encipher) and decrypt (decipher) information. Like DES, AES is a symmetric block cipher; however, AES differs from DES in a number of ways. The underlying algorithm, Rijndael, allows for a variety of block and key sizes; AES uses cryptographic keys of 128, 192, or 256 bits to encrypt and decrypt data in blocks of 128 bits. This review paper concentrates on bringing together efficient implementations of the AES algorithm in a number of fields, and also focuses on ways to expedite the encryption process using different approaches.
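For concreteness, a minimal AES round trip; this assumes the third-party PyCryptodome package and CBC mode, which are illustrative choices rather than ones prescribed by the review:

```python
# Requires: pip install pycryptodome
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(32)            # 256-bit key (AES also allows 128/192)
iv = get_random_bytes(16)             # one 128-bit block
plaintext = b"organizational data is the most important asset"

ct = AES.new(key, AES.MODE_CBC, iv).encrypt(pad(plaintext, AES.block_size))
pt = unpad(AES.new(key, AES.MODE_CBC, iv).decrypt(ct), AES.block_size)
assert pt == plaintext
```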