Nowadays, posting reviews online has become a very popular way for people to express their opinions and sentiments toward the products they buy or the services they receive. Analyzing the large volume of online reviews available can produce useful, actionable knowledge of economic value to vendors and other interested parties. Here, we study the problem of mining reviews to predict product sales performance. The reviews posted by consumers carry sentiments, and these sentiments, together with the quality of the reviews, have a significant impact on the future sales performance of products or services. The sentiments are hidden in the document corpus, also known as a comments document. Document-level sentiment classification aims to automate the task of classifying a textual review, given on a single topic, as expressing either positive or negative sentiment. By extracting these sentiments, the overall feedback about a particular product can be obtained in summarized form, which helps vendors know the overall statistics and the likely future performance of their product.
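A minimal sketch of document-level sentiment classification on review text is shown below. The labelled reviews and the choice of TF-IDF features with a logistic regression classifier are illustrative assumptions, not the method described in the abstract.

```python
# Hedged sketch: tiny hypothetical labelled review set, not the paper's corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["battery life is excellent and the screen is great",
           "stopped working after a week, very disappointed",
           "good value for money, would buy again",
           "terrible build quality and poor support"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

# Bag-of-words features weighted by TF-IDF feed a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)

# Aggregate predicted polarity over new reviews to summarise overall feedback.
new_reviews = ["great product, works as advertised", "broke on arrival"]
print(model.predict(new_reviews))                     # per-review polarity
print(model.predict_proba(new_reviews)[:, 1].mean())  # share of positive sentiment
```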
Dynamic Web-Based Mobile Application For Traffic Police
Komal Kalbhor, Akshaya Misal, Shilpa Kalbhor
The use of mobile devices, such as smartphones and cellular phones, in field data collection has increased recently due to the emergence of the Global Positioning System (GPS) and Wi-Fi Internet access. Timely and handy field data collection is required for disaster management and quick response during emergencies. In this paper, we introduce a dynamic web-based system to collect field data from personal mobile phones. The main objective of this work is to demonstrate a real-time field data collection method that can be used by the Traffic Department. The purpose of this system is to allow a traffic police officer to send the collected field data to the server. The information gathered can include the current location, IMEI number, phone number, license number, and images captured at a crime scene. The data at the server side can then be used for analysis and decision making. The data captured from the Android device will be shown on Google Maps using the Google Maps API v3, and will be available to the corresponding government departments through a web service.
An Automated guided model for merging news into Stock Trading Strategies
Pallavi Parshuram Katke, Asst. Prof. B. R. Solunke
The proposed automated model merges news into stock trading strategies using genetic programming. Events are retrieved from news in free text. The model is tested by deriving trading strategies based on technical indicators and the impacts of extracted events. The trading strategies take the form of rules that combine technical indicators with a news variable and are discovered through the use of genetic programming. The news variable appears in the best trading strategy, indicating the added value of news for predictive purposes and validating the proposed model for automatically merging news into stock trading strategies.
An image pre-processing tool, created in Matlab, realizes many brightness transformations and local pre-processing methods. The proposed solution focuses on applying a neural network model for character recognition, the primary function of which is to retrieve a character stored in memory when an incomplete or noisy version of that character is presented. The idea is to create a theoretical and practical basis of preprocessing for character recognition using feed-forward neural networks. The feed-forward algorithm gives insight into the inner workings of a neural network, followed by the back-propagation algorithm, which comprises training and testing. Character recognition is one of the most interesting and challenging research areas in the field of image processing. English character recognition has been extensively studied in the last half century, and nowadays different methodologies are in widespread use for character recognition. Document verification, digital libraries, reading bank deposit slips, reading postal addresses, extracting information from cheques, data entry, applications for credit cards, health insurance, loans, tax forms, etc. are application areas of digital document processing.
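The abstract above describes the paper's MATLAB pipeline; as a minimal sketch of the same feed-forward/back-propagation idea, the snippet below trains a small multilayer perceptron on scikit-learn's built-in 8x8 digit images, which stand in for the paper's character set.

```python
# Hedged sketch: the dataset and network size are illustrative assumptions.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()                      # 8x8 grey-level character images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data / 16.0, digits.target, test_size=0.25, random_state=0)

# One hidden layer; training uses back-propagation of the cross-entropy error.
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(X_train, y_train)                            # training phase
print("test accuracy:", net.score(X_test, y_test))   # testing phase
```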
An Efficient SDMPC Metric Based Approach For Refactoring Software Code
B Ramalkshmi, D Gayathri Devi
Software engineering is about the development, design, operation, and maintenance of software, but there are some factors that make software maintenance difficult. A code clone is a similar or duplicate fragment of code in a source file, created either by replication or by modification. Code clones are one of the factors that increase software maintenance effort and also cause code bloat, so clones have to be removed. To remove a clone, a suitable refactoring has to be determined and applied. Refactoring is done to improve the quality of a software system's structure, which tends to degrade as the system evolves. While manually determining useful refactorings is a challenging task, search-based techniques can automatically discover useful refactorings. The refactoring approach uses the concept of Pareto optimality, which naturally applies to search-based refactoring. Before refactoring is done, test cases should be generated; a formal written test case is characterized by a known input and an expected output, which is worked out before the test is executed.
This paper proposes a method for removing clones through refactoring. To refactor the clones, the concept of Pareto optimality and a Pareto front is first defined. The JSync refactoring tool is used to refactor the programs. The coupling between object classes (CBO) metric represents the number of classes coupled to a given class, while the second metric, LSCC, measures class cohesion. Meaningful class coupling and cohesion metrics help object-oriented software developers detect class design weaknesses and refactor classes accordingly. The CBO, LSCC, and SDMPC metrics are used to check the accuracy of the refactored programs. The advantage of this system is that it helps developers program faster and takes less time for clone removal. It improves the design of the software and makes it easier to understand. The overall performance of the system is highly improved by the proposed approach.
Word sense disambiguation (WSD) is the process of assigning the proper sense to a word that has the potential for multiple senses, in an automated computational setting such as computational linguistics. It is one of the well-discussed problems of natural language processing. The process involves a lot of difficulty due to the lack of reasoning power and common sense in computers, so humans have to formalize algorithms to computerize the process of selecting the appropriate sense of a word within a given context. This is the main aim of WSD approaches. In this survey, we discuss different approaches and also implementations of these approaches in languages such as Chinese and Japanese and in Indian languages such as Hindi, Nepali, and Tamil.
Increasing the CIR Performance of the OFDM System without Increasing Hardware Complexity
Macharla Prasad
The symmetric symbol repeat (SSR) inter-carrier interference (ICI) self-cancellation scheme has proved to be a simple and convenient technique to reduce ICI caused by frequency offsets. It allocates data with weights (1, -1) on two symmetrically placed subcarriers and combines them to mitigate the effect of ICI. However, the data allocation factors (1, -1) are not optimal. In this paper, an optimum data allocation (1, -
Implementation of a visible and invisible video watermarking technique
K. Shamini, C. Bhagya, M. Sri Sowmya
Although tremendous progress has been made in recent years on video watermarking, a number of problems still exist. We believe the most important ones relate to compression rates, robustness against attacks, and high security for private data. In the digital image processing domain, achieving better compression rates in dual digital watermarking is still an area of concern. The proposed work embeds visible and invisible watermarks during compression on the video encoder; the corresponding embedding approach is termed an optimized compression/watermarking algorithm and system. The performance of video watermarking is better when the complexity is low, and this low complexity is achieved in our proposed work by the discrete cosine transform (DCT). Finally, the results show high correlation against different attacks in the extraction stage. The proposed algorithm overcomes the drawbacks of conventional algorithms and is more suitable for real-time applications.
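As a hedged illustration of the DCT embedding idea mentioned above, the sketch below hides one watermark bit per 8x8 luminance block by nudging a mid-frequency coefficient. The coefficient position, strength, and the simple sign-based extraction are assumptions; the paper's compression-domain embedding is not reproduced.

```python
# Hedged sketch of per-block DCT watermark embedding/extraction.
import numpy as np
from scipy.fft import dctn, idctn

def embed_block(block, bit, alpha=8.0):
    coeffs = dctn(block, norm="ortho")
    coeffs[3, 4] += alpha if bit else -alpha      # assumed mid-frequency position
    return idctn(coeffs, norm="ortho")

def extract_block(block):
    return int(dctn(block, norm="ortho")[3, 4] > 0)   # simplified blind extraction

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(8, 8)).astype(float)   # one luminance block
bit = 1
marked = embed_block(frame, bit)
print("embedded:", bit, "extracted:", extract_block(marked))
```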
AppML combines the most up-to-date techniques and ideas of modern web development. AppML is very easy to understand and develop. AppML is not a programming language; it is used to describe applications. By using AppML you can create Internet applications without programming. AppML allows the programmer to redefine both data and functions while the application is running. Since AppML applications are written in XML, they are self-describing. In this paper, we provide the key concepts related to AppML.
Airline delays caused by bad weather, traffic control problems, and mechanical repairs are difficult to predict. If your flight is canceled, most airlines will rebook you on the earliest possible flight to your destination at no additional charge. Unfortunately for airline travelers, however, many of these flights do not leave on time. The issue of delay is paramount for any airline. We therefore intend to aid airlines by predicting delays using patterns mined from historical data. This system explores what factors influence the occurrence of flight delays, along with the intensity of the delays. Our method is based on archived data from major airports in current flight information systems.
Classification in this scenario is hindered by the large number of attributes, which might occlude the dominant patterns of flight delays. The results of the data analysis suggest that flight delays follow certain patterns that distinguish them from on-time flights.
Our system also provides current weather details along with the weather delay probability. We have achieved much better accuracy in predicting delays, and we find that fairly good predictions can be made on the basis of only a few attributes.
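A minimal sketch of learning delay patterns from archived records follows. The column names, the toy data, and the choice of a random forest are hypothetical stand-ins for the attributes and model the abstract alludes to.

```python
# Hedged sketch: tiny synthetic flight table, not real archived airport data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

flights = pd.DataFrame({
    "dep_hour": [6, 9, 17, 20, 7, 18, 12, 22],   # scheduled departure hour
    "carrier":  [0, 1, 1, 2, 0, 2, 1, 0],        # label-encoded carrier
    "weather":  [0, 0, 1, 1, 0, 1, 0, 1],        # 1 = bad weather reported
    "delayed":  [0, 0, 1, 1, 0, 1, 0, 1],
})
X, y = flights.drop(columns="delayed"), flights["delayed"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("delay probability:", model.predict_proba(X_te)[:, 1])
```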
A Survey of different decoding schemes for LDPC block codes
Sonia, Geeta Arora
Low-density parity-check (LDPC) codes, a very promising class of near-optimal error correction codes (ECC), are being widely considered in next-generation industry standards. In this paper, two simple, low-complexity iterative algorithms for decoding LDPC codes are explained. These algorithms are implemented using real additions only and do not depend on the power spectral density. The survey provides fundamentals for the design of LDPC codes. Decoding algorithms for LDPC codes require large amounts of data to be processed in a short time. LDPC codes are a special type of error-correcting code known for good decoding performance and high throughput.
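To illustrate the kind of simple iterative decoder the survey covers, here is a sketch of hard-decision bit-flipping decoding on a toy parity-check matrix. Practical LDPC matrices are much larger and sparser; the matrix below is only a small example.

```python
# Hedged sketch: Gallager-style bit-flipping on a toy (7,4) parity-check matrix.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(r, H, max_iter=10):
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H.dot(r) % 2                 # which parity checks fail
        if not syndrome.any():
            break                               # valid codeword reached
        votes = H[syndrome == 1].sum(axis=0)    # per-bit count of failing checks
        r[np.argmax(votes)] ^= 1                # flip the most suspicious bit
    return r

codeword = np.array([1, 0, 1, 1, 0, 1, 0])      # satisfies H @ c = 0 (mod 2)
received = codeword.copy()
received[2] ^= 1                                 # single bit error
print(bit_flip_decode(received, H))              # recovers the codeword
```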
Cloud computing seems to be a recent technology, but the term's origin is quite unclear. In this paper, we present an overview of a project, namely a Heat Template Generator in OpenStack. Our aim is to equip our readers with an understanding of the overall functioning of the Heat component of OpenStack and help them understand the features it currently supports. We then briefly discuss the current approach to template writing and how this project improves productivity. We conclude with a brief account of how this Template Generator is the future of orchestration in cloud computing.
Efficient Multiclass Classification Model For Imbalanced Data
Mr. Roshan M.Pote , Prof. Mr. Shrikant P. Akarte
In this paper, we focus on developing an efficient mining algorithm for multiclass classification from a large collection of imbalanced data that produces well-sorted results. In the field of data mining, classification techniques can be used to find various selective features. This paper presents an innovative and efficient classification technique, which includes feature selection and map-reduce processing, to improve the effectiveness of using data to find relevant and interesting information. In the proposed system, .txt files are taken as input, a variance-based algorithm is applied, and the expected results are generated. Standard classification methods perform poorly on minority-class examples when the dataset is extremely imbalanced.
High Speed Fused Add-Multiply Operator Design Based On SQRT-CSLA Adder
Vidya Baby
Digital signal processing (DSP) applications involve various complex operations. This paper deals with high-speed operation in the design of the fused add-multiply (FAM) operator to increase performance. It includes techniques to implement the direct recoding of the sum of two numbers in Modified Booth (MB) form and involves high-speed addition using the SQRT-CSLA adder. The paper makes use of three structured and efficient recoding techniques, by means of three different schemes available in FAM designs. In comparison to existing schemes, the proposed technique yields considerable reductions in the critical delay, hardware complexity, and power consumption of the FAM unit.
An Effective Image Encryption Based On The Combination Of SCAN And ElGamal Method
Mr. Ravi Mohan , Hira Lal Dhruw, Raghvendra
The requirements of information security within an organization have undergone tremendous changes. Before the widespread use of data processing equipment, the security of sensitive documents depended on filing cabinets with combination locks for storing paper-based files or documents. However, the scenario has changed with the introduction of computers for handling business in organizations. At the same time, advances in networking and communication technology have brought business organizations worldwide together to work as one entity. Due to the impact of this globalization, vast amounts of digital documents such as text, images, video, and audio travel from one destination to another over network links. Some of these documents may be sensitive and confidential and therefore need to be protected. Image encryption is the process of encoding messages in such a way that eavesdroppers or hackers cannot read them, while authorized parties can. With the huge growth of computer networks and the latest advances in digital technologies, a huge amount of digital data is being exchanged over various types of networks, and it is often true that a large part of this information is either confidential or private.
Text Extraction From Images With Edge-Enhanced MSER And Hardware Interfacing Using Arduino
G. Nagaraju, Dr. P. V. Ramaraju, P.M.M.P.Sandeep, SK.Mansoor Nawaz, S.Kiran Bhargav, M.Swaroop kiran
Detecting text in natural images is an important prerequisite for many applications, as text characters and strings provide valuable information; however, extracting text directly from natural scene images or videos is a challenging task because of diverse text patterns and various background interferences. In this paper, an attempt is made to extract text from natural scene images. The text detection algorithm employs edge-enhanced Maximally Stable Extremal Regions (MSER) as basic letter candidates. These candidates are then filtered using geometric and stroke-width information to exclude non-text objects. Letters are paired to identify text lines, which are subsequently separated into words. The code has been implemented in MATLAB R2014a and a qualitative analysis has been carried out. To simplify the elimination of cluttered backgrounds, we also applied other techniques for extracting text easily, such as a color thresholding app and region-of-interest (ROI) selection. Based on the statements in the text extracted from natural images, the control of electric motors can also be performed; this is demonstrated by a hardware implementation of the text extraction application using an Arduino kit.
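The paper works in MATLAB; as a hedged, loosely analogous sketch, the snippet below extracts MSER letter candidates with OpenCV and applies a crude geometric filter. The input file name, aspect-ratio threshold, edge enhancement, stroke-width filtering, and Arduino interfacing are all omitted or assumed.

```python
# Hedged sketch of MSER candidate extraction with OpenCV (not the paper's code).
import cv2

image = cv2.imread("scene.jpg")                  # assumed natural-scene image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

mser = cv2.MSER_create()
regions, boxes = mser.detectRegions(gray)        # candidate letter regions

for (x, y, w, h) in boxes:
    if 0.1 < w / float(h) < 10:                  # crude aspect-ratio filter
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 1)

cv2.imwrite("candidates.jpg", image)
```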
A Hybrid Encryption Algorithm Based on DES and RSA In Bluetooth Communication
S.Prabhakar, Cha. Swamy, S.Ravi Kumar
The encryption algorithm employed by Bluetooth to protect the confidentiality of data during transport between two or more devices is a 128-bit symmetric stream cipher called E0. It may be broken under certain conditions with a time complexity of O(2^64).
To enhance the security of data transmission, a hybrid algorithm based on the Data Encryption Standard (DES) and Rivest-Shamir-Adleman (RSA) is proposed.
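A minimal sketch of the general hybrid idea, using the PyCryptodome library, is shown below: the bulk data is encrypted with DES and the DES session key is protected with RSA. The modes, key sizes, and the way the key is wrapped are assumptions for illustration, not the paper's exact construction.

```python
# Hedged sketch of DES-for-data plus RSA-for-key hybrid encryption.
from Crypto.Cipher import DES, PKCS1_OAEP
from Crypto.PublicKey import RSA
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

rsa_key = RSA.generate(2048)                       # receiver's key pair
des_key = get_random_bytes(8)                      # 64-bit DES session key

# Sender: encrypt payload with DES, wrap the DES key with the receiver's public key.
payload = b"data exchanged over the Bluetooth link"
des_enc = DES.new(des_key, DES.MODE_CBC)
ciphertext = des_enc.encrypt(pad(payload, DES.block_size))
wrapped_key = PKCS1_OAEP.new(rsa_key.publickey()).encrypt(des_key)

# Receiver: unwrap the session key with the private key, then decrypt with DES.
# (The IV des_enc.iv would be sent alongside the ciphertext in practice.)
session_key = PKCS1_OAEP.new(rsa_key).decrypt(wrapped_key)
des_dec = DES.new(session_key, DES.MODE_CBC, iv=des_enc.iv)
print(unpad(des_dec.decrypt(ciphertext), DES.block_size))
```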
Transformerless Inverter With Virtual DC Bus Concept For Cost-Effective Grid-Connected PV Power Systems
Dr.V.Balakrishna Reddy, Mohd.Mastafa
In order to eliminate the common-mode (CM) leakage current in transformerless photovoltaic (PV) systems, the concept of the virtual dc bus is proposed in this paper. By connecting the grid neutral line directly to the negative pole of the dc bus, the stray capacitance between the PV panels and the ground is bypassed. As a result, the CM ground leakage current can be suppressed completely. Meanwhile, the virtual dc bus is created to provide the negative voltage level for generating the negative ac grid current. Consequently, the required dc bus voltage is still the same as that of the full-bridge inverter. Based on this concept, a novel transformerless inverter topology is derived, in which the virtual dc bus is realized with switched-capacitor technology. It consists of only five power switches, two capacitors, and a single filter inductor, so the power electronics cost can be curtailed. This topology can be modulated with unipolar sinusoidal pulse width modulation (SPWM) and double-frequency SPWM to reduce the output current ripple, allowing a smaller filter inductor to be used to reduce the size and magnetic losses.
Secure Scheme over private Cloud data supporting Semantic Relation
Sayli Bahekar, Jyoti Nikam
Cloud computing has become a necessity in the real IT world because of its appealing features, and it has become an important tool for handling big data, but encryption makes it difficult to perform basic functionalities such as search. Existing search schemes in the cloud rely on exact keyword matching; synonyms and the semantics of the keyword are not considered. Hence a scheme is proposed which also considers the semantics of the submitted
Artificial Intelligence Based Speed Control of Induction Motor- A Detailed Study
Sarath S Kumar, Sibu C M
This paper presents a novel speed vector control scheme for an induction motor (IM) based on a robust adaptive variable structure control (VSC) law, along with its experimental validation. The design of the speed controller greatly affects the performance and efficiency of an electric drive. The combination of a non-adaptive variable structure control (VSC) law with proportional-integral (PI) speed controllers is commonly used to control an induction machine. In this proposal, in order to increase efficiency and provide high performance, the PI controllers are replaced by adaptive controllers. The mathematical model for the adaptive VSC based on a genetic algorithm is presented. Root locus and pole assignment techniques are also proposed, along with the transient response technique. The simulation results show high-performance dynamic characteristics under changes of motor speed and external load disturbances. To provide a numerical comparison between the controllers, a performance index based on speed error is used.
A Design of Low Power Low Area High Speed Full Adder Using GDI Technique
M. Lakshmi Mohini, V. Ramesh
The full adder is the basic building block of various arithmetic circuits such as compressors, multipliers, and comparators. The 1-bit full adder cell is the important basic block of the arithmetic unit of a system; hence, to improve the performance of a digital computer system, one must improve the basic 1-bit full adder cell. In this work, the full adder is designed using a hybrid-CMOS logic style. Hybrid designs are used to build a low-power full adder cell; in the hybrid logic style, more than one network is present and, in general, the design consists of three modules. Here we propose a new full adder design using the GDI (Gate Diffusion Input) technique. GDI is a method for reducing the power consumption, propagation delay, transistor count, and power-delay product (PDP). The simulations are carried out on the Tanner EDA tool and show that the design is more efficient, with less area, less power consumption, and higher speed as compared to CMOS techniques.
Design And Implementation Of High Efficient And Performance Of Modified Adder Circuit Using 128 Bit CSLA And BEC Adder For Filter Design
Mrs. K. Ramya, Mr. S. Sakthivel M.E
Low-power VLSI is a major area in VLSI design for developing products in a smart way. High performance is the keystone of the designer's idea: performance depends upon speed, reduced delay, low power consumption, and, importantly, cost. In low-power VLSI design, the adder is another key platform. The design of an adder should be efficient while the internal parameters such as area, power, delay, and cost are monitored. Along with super-fast performance, the reduction of delay is a big advantage of using these types of adders. On the other hand, forming the different inputs and processing them is difficult to implement; to overcome these difficulties, BEC-based CSLA adders are used. To process complex mathematical inputs, such as the squaring of inputs, some special circuits are used; such a circuit is the SQRT unit, and together with the CSLA it is called the SQRT-CSLA. A discussion is presented on how to implement the SQRT-CSLA and analyse the practical values; ASIC and SoC implementations are most popular for this type of design. In an existing methodology, ASIC synthesis showed that the BEC-based SQRT-CSLA design involves 48% more ADP and consumes 50% more energy than the proposed SQRT-CSLA, on average, for different bit-widths. We focus on this point: how to maximize efficiency by using less power while maximizing the output. Also, the existing implementation goes up to 64-bit widths at most, which limits how much the product efficiency can be increased; in our proposed methodology, we adopt a processing width of 128 bits.
Power Reduction with FlipFlop Grouping in Data Driven Clock Gating
T.Naresh, M.Lakshmi Kiran
In digital circuits, the clock signal is one of the factors causing dynamic power consumption. Clock gating is a method for reducing the dynamic power dissipation in sequential circuits. Here, the redundant clock pulses in a high-frequency clock signal are eliminated by ANDing an enable signal with the applied clock signal. The enable signal is determined by XORing the input and output of a sequential element such as a flip-flop. The ANDed output, the gated clock signal, serves as the clock to the existing circuit and contains clock pulses only at the switching activities of the input signal. This method can be extended to a group of flip-flops having similarly switching inputs by ORing the enable signals of all flip-flops in the group. When this group drives a combinational circuit, leakage power exists while the circuit is in stand-by mode, i.e., when there is no pulse in the gated clock signal. To eliminate this, we introduce power gating, in which the gated clock signal is given as a sleep signal to an NMOS transistor in the pull-down section. The simulations are carried out on the Tanner EDA tool and show that the design is more efficient, with less power consumption, in CMOS technology.
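A small behavioural sketch of the gating logic described above follows: per-flop enables from XOR of D and Q, a grouped OR, and an AND with the clock, so the group is clocked only when some input actually changes. This is only a software illustration of the logic, not the transistor-level design evaluated in the paper.

```python
# Hedged sketch: behavioural simulation of data-driven clock gating for a group.
def gated_clock_step(clock, d_inputs, q_outputs):
    enables = [d ^ q for d, q in zip(d_inputs, q_outputs)]   # XOR of input and output
    group_enable = any(enables)                              # OR over the group
    gated = bool(clock) and group_enable                     # AND with the clock
    if gated:                                                # pulse reaches the flip-flops
        q_outputs = list(d_inputs)
    return gated, q_outputs

q = [0, 0, 0]
for cycle, d in enumerate([[0, 0, 0], [1, 0, 0], [1, 0, 0], [1, 1, 1]]):
    pulse, q = gated_clock_step(1, d, q)
    print(f"cycle {cycle}: gated clock pulse = {pulse}, Q = {q}")
```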
Implementation of Digital Pixel Sensor Based Parallel Architecture For Automatic Braking System in Automobiles
G. Kavitha, K.S. Md. Musa Mohinuddin
While an automobile is moving on the road, if a pedestrian, an animal, or any object comes onto the road unintentionally, its motion features are extracted and used to activate the automobile's automatic braking system. Row-parallel processing of the input image achieves directional edge filtering, and the essential motion features are extracted with pixel-parallel processing based on Digital Pixel Sensor (DPS) technology.
Non-Linear Data Perturbation Method for Privacy Preserving Data Mining
Dr.M. Gopichand
To extract useful information from large databases, data mining methodologies are applied. Among the many different data mining approaches, academicians and researchers focus on association rule mining for quickly recognizing the relationships among item sets. The issue of privacy arises when the data is spread among a number of places or sites: a site owner does not wish to supply its data to other sites, but is interested in the global results obtained from the mining. A new model is suggested which adopts a scrambling-based secure computation cryptographic technique, in which any site can be treated as a trusted site, to find global association rules for horizontally partitioned databases.
Intrusion Detection in Heterogeneous Wireless Sensor Networks using Multi-layer Multi Detector Approach
Ms.Smita H.Karande, Mr.S.P.Kosbatwar
In a WSN, there are two ways to detect an anomalous object (i.e., an outlier): single-sensing detection and multiple-sensing detection. In single-sensing detection, the outlier can be successfully detected by a single sensor node. In contrast, in multiple-sensing detection, the outlier can only be detected by multiple cooperating sensor nodes. In some applications, the sensed information provided by a single sensor node might be insufficient for recognizing the attacker, because an individual sensor node can only sense a portion of the outlier; for example, the location of an outlier can only be determined from the readings of at least three sensor nodes. The goal of using a layered model is to minimize computation and the overall time required to find irregular events. The time required to detect an intrusive event is important and can be minimized by filtering the communication overhead among the different layers. This can be achieved by making the layers independent and self-managing, so that each layer can block an attack without the need for a central decision-maker. Every layer in the MLMD framework is trained separately and then deployed sequentially. We propose four protocols corresponding to the four attack groups mentioned in the mapped data.
Black Hole detection and avoidance in mobile Adhoc Networks
Asha Guddadavar , Bagali Ashvini A
This paper addresses issues in MANETs such as security and performance. It proposes a cluster-based concept to improve security and efficiency and to guarantee optimum utilization of the network resources, and evaluates the performance of the MANET in the presence of a black hole attack. The simulation of the proposed methodology is carried out using the NS2 network simulator, and the simulation results reflect the performance of the scheme for detection and prevention of the black hole attack.
Personal Emergency Notification Application for Mobile Devices
Akansha Raj, Asmita Pawar , Ganesh M. Gaikwad
In recent years, many interesting applications for mobile devices have been designed under the Android framework to improve our quality of living and to deal with home care issues. In this paper, an emergency notification application for mobile devices is designed. The application includes the positioning function of GPS and an easy-to-use interface for sending emergency notification messages and recording audio. It implements a location-awareness system which obtains the user's current location, sends this location using SMS (Short Message Service), shares the location with friends and family, and saves voice recordings for future reference. Users can take advantage of the application's emergency feature in emergency situations. To obtain the location coordinates, the application uses GPS (Global Positioning System) as the location provider. The application design has five parts: a mobile client, an application server, a database, the GPS system, and a map service. The mobile client, which consists of a mobile phone and a GPS receiver, finds the location of the user. To share this location, the mobile client sends it to the application server, from where other users can retrieve it if they have the authorization provided by the user. A personal emergency notification system is an important tool for personal security and safety.
An approach to implement Energy Efficiency for MAC Protocol in WSN
Arpita Sharma, Dr. Narendra Singh Yadav
In wireless sensor networks, energy efficiency is one of the most important factors. Energy consumption occurs in various states, namely the receiving, idle, transmitting, and sleeping states. This is why implementing an energy efficiency model is important and why energy is treated as a resource constraint.
In this research, we implemented an energy efficiency model for Sensor-MAC (S-MAC), a medium access control (MAC) layer protocol for wireless sensor networks (WSN). We traced energy consumption in four different states of the network: the sleep, receive, transmit, and idle states. We further calculated the average energy consumption across all states and added an energy breakdown for each state to support detailed energy analysis.
We calculated the average energy consumption in all states by taking different numbers of nodes and making different numbers of connections between them in the WSN, and conclude that, for the S-MAC protocol, energy consumption increases as the number of connections in the WSN increases. We also measured the energy consumption of the network when there are no connections between nodes.
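A minimal sketch of the per-state energy bookkeeping described above is shown below. The power draws and state durations are assumed illustrative values, not the simulation parameters used in the paper.

```python
# Hedged sketch: per-state energy = power x time, then averaged over nodes.
POWER_W = {"transmit": 0.660, "receive": 0.395, "idle": 0.035, "sleep": 0.000144}  # assumed

def node_energy(time_in_state_s):
    """Return the energy breakdown per state (J) and the node total."""
    breakdown = {s: POWER_W[s] * t for s, t in time_in_state_s.items()}
    return breakdown, sum(breakdown.values())

nodes = [
    {"transmit": 1.2, "receive": 3.5, "idle": 20.0, "sleep": 75.3},
    {"transmit": 0.4, "receive": 2.1, "idle": 18.0, "sleep": 79.5},
]
totals = [node_energy(n)[1] for n in nodes]
print("average energy per node (J):", sum(totals) / len(totals))
```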
iSearch (Intelligent Search) is an enterprise search solution developed using Apache Solr. The iSearch system indexes data and documents from a variety of sources, such as file systems, intranets, and document management systems. As enterprise data becomes massive, iSearch helps people find the specific information they need, in any rich-text file format, from anywhere inside their company. iSearch identifies specific content across the enterprise and enables it to be efficiently searched and displayed only to authorized users.
Denoising Of Medical Ultrasound Images In Wavelet Domain
Amit Jain
Ultrasonography is regarded as one of the best and most powerful techniques for diagnostic examination and analysis of various organs and soft tissue structures present in the human body. It is used for visualizing muscles, their shape and size, their structure, and any pathological lesions. The usefulness of ultrasound imaging is degraded by the existence of a signal-dependent noise called speckle noise. This speckle pattern further depends on the structure of the imaged tissue as well as on various imaging parameters. In the proposed work, a novel approach with an adaptive threshold estimator is suggested for image denoising in the wavelet domain, based on the modeling of the different sub-band coefficients at different stages in ultrasound imaging systems. The proposed method is more adaptive because the estimated threshold parameters depend on the image sub-band data: the calculated threshold value depends upon a scale parameter, the noise variance, and the standard deviation of each sub-band of the noisy image, where the scale parameter depends on the sub-band size and the number of decompositions. Experimental results on many ultrasound test images show that the proposed method outperforms, both qualitatively and quantitatively, some existing denoising techniques such as NormalShrink, the median filter, and the Wiener filter. Clinical validation of the results by a radiologist has also been performed.
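As a hedged illustration of sub-band adaptive soft thresholding in the wavelet domain, the sketch below uses PyWavelets with a BayesShrink-style rule (noise variance divided by signal standard deviation) as a stand-in; the paper's exact threshold estimator is not reproduced.

```python
# Hedged sketch of wavelet-domain denoising with a sub-band dependent threshold.
import numpy as np
import pywt

def denoise(image, wavelet="db4", levels=3):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # Noise std from the finest diagonal sub-band via the robust median rule.
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for detail in coeffs[1:]:
        thresholded = []
        for band in detail:
            sigma_x = np.sqrt(max(band.var() - sigma_n**2, 1e-12))
            t = sigma_n**2 / sigma_x                       # sub-band dependent threshold
            thresholded.append(pywt.threshold(band, t, mode="soft"))
        out.append(tuple(thresholded))
    return pywt.waverec2(out, wavelet)

noisy = np.random.rand(128, 128)                           # stand-in for an ultrasound frame
clean = denoise(noisy)
```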
Mammogram Image Preprocessing for detection of masses in Breast Cancer
Mrs. Sandhya G, Dr. D Vasumathi, Dr. G T Raju
Breast cancer is the most commonly observed cancer in women, in both the developing and the developed countries of the world. Cancer refers to the uncontrolled multiplication of a group of cells in a particular location of the body; a group of rapidly growing or dividing cells may form a lump or mass of extra tissue, referred to as a tumor, and cancerous cells form malignant tumors. Any malignant tumor developed from breast cells is breast cancer. Detection is central to the diagnosis and prognosis of breast cancer, and digital mammography is considered the most popular screening technique for early detection of breast cancer and other abnormalities. Digital mammograms are medical images that are difficult to interpret, so Computer Aided Diagnosis (CAD) systems are developed to improve the detection of abnormalities in mammogram images. In this paper we present the detection of abnormal masses by preprocessing breast images and extracting the region of interest (ROI). The proposed filters are able to isolate abnormal regions in the breast tissue; if any abnormalities are present, they are accurately highlighted by this filtering and mammogram preprocessing.
Re-Encryption with Attribute Key Management for Secure Transactions in Cloud
Yugesh Mothukuru and K. John Singh
Transferring information to the cloud is valuable due to its accessibility and adaptability, yet critical technical difficulties remain: sensitive information stored in the cloud must be protected from being disclosed by a cloud provider that is honest but curious. Novel adjustments to attribute-based encryption allow authorized clients to access the cloud data on satisfying the access conditions, such that the higher computational burden of cryptographic operations is allocated to the cloud provider and the aggregate cost for cloud clients is lowered. To diminish the cost of client revocation in a mobile client environment while preserving the security of client data stored in the cloud, the cloud provider may additionally perform data re-encryption. The protocol is realized on commercially popular cloud platforms to simulate real-world conditions and demonstrate the efficiency of the scheme.
PAPR and SNR performance analysis of IFDMA and LFDMA technique in a single carrier frequency division multiple access system
Suyash Kumar Singh, Manish Kumar Patidar
The single-carrier frequency division multiple access (SC-FDMA) scheme is a radio transmission method currently used for the uplink in Long Term Evolution (LTE) technology due to its high data rates and lower peak-to-average power ratio (PAPR) compared to the OFDM technique. In this paper, we analytically derive the time-domain SC-FDMA signal and numerically compare the PAPR characteristics using the complementary cumulative distribution function (CCDF) of the PAPR, with raised cosine (RC) and root raised cosine (RRC) pulse shaping, and discuss the resulting PAPR of both mapping schemes. Comparing the two forms of SC-FDMA, we find that interleaved FDMA (IFDMA) has lower PAPR than localized FDMA (LFDMA). We also discuss the signal-to-noise ratio (SNR) performance of the LFDMA and IFDMA schemes and find that the SNR performance of LFDMA is better than that of IFDMA.
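A minimal numerical sketch of the IFDMA/LFDMA PAPR comparison follows. The subcarrier counts, QPSK modulation, and the 6 dB CCDF point are assumed parameters, and pulse shaping and the analytic derivation from the paper are omitted.

```python
# Hedged sketch: Monte Carlo PAPR CCDF point for the two SC-FDMA mappings.
import numpy as np

M, N, Q, trials = 64, 256, 4, 2000            # user symbols, total subcarriers, spreading
rng = np.random.default_rng(0)

def papr_db(mapping):
    vals = []
    for _ in range(trials):
        qpsk = (rng.choice([-1, 1], M) + 1j * rng.choice([-1, 1], M)) / np.sqrt(2)
        spec = np.fft.fft(qpsk)               # M-point DFT precoding
        X = np.zeros(N, dtype=complex)
        if mapping == "ifdma":
            X[::Q] = spec                     # interleaved: every Q-th subcarrier
        else:
            X[:M] = spec                      # localized: one contiguous block
        x = np.fft.ifft(X) * np.sqrt(N)
        p = np.abs(x) ** 2
        vals.append(10 * np.log10(p.max() / p.mean()))
    return np.array(vals)

for name in ("ifdma", "lfdma"):
    papr = papr_db(name)
    print(name, "Pr[PAPR > 6 dB] =", np.mean(papr > 6.0))   # one CCDF point
```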
A survey on Efficient Geographic Multicasting Protocol
Kalyani S Kumar
There is an increasing demand, and a big challenge, to design more scalable and reliable multicast protocols for dynamic mobile ad hoc networks (MANETs). EGMP is an efficient and scalable geographic multicast protocol for MANETs that uses a virtual-zone-based structure to implement scalable and efficient group membership management. A network-wide zone-based bidirectional tree is constructed to achieve more efficient membership management and multicast delivery. The scalability of EGMP is achieved through a two-tier virtual-zone-based structure, which takes advantage of geometric information to greatly simplify zone management and packet forwarding. A zone-based bidirectional multicast tree is built at the upper tier for more efficient multicast membership management and data delivery, while intra-zone management is performed at the lower tier to realize local membership management. Position information is used in the protocol to guide the zone structure building, multicast tree construction and maintenance, and multicast packet forwarding. Compared to conventional topology-based multicast protocols, the use of location information in EGMP significantly reduces the tree construction and maintenance overhead and enables quicker adaptation of the tree structure to network topology changes. We also develop a scheme to handle the empty-zone problem, which is challenging for zone-based protocols. Additionally, EGMP makes use of geographic forwarding for reliable packet transmission and efficiently tracks the positions of multicast group members without resorting to an external location server.
The continuous stirred tank reactor (CSTR) is a typical chemical reactor system with complex nonlinear dynamic characteristics. The purpose of this paper is to control the temperature of the CSTR using a PID controller tuned with the Ziegler-Nichols method. The model design and simulation are carried out in MATLAB Simulink.
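The classic closed-loop Ziegler-Nichols rules mentioned above compute the PID gains from the ultimate gain Ku and ultimate period Pu. The sketch below applies those standard rules; the Ku and Pu values are illustrative only and not taken from the paper's CSTR model.

```python
# Hedged sketch: classic Ziegler-Nichols closed-loop PID tuning rules.
def zn_pid(Ku, Pu):
    Kp = 0.6 * Ku          # proportional gain
    Ti = Pu / 2.0          # integral time
    Td = Pu / 8.0          # derivative time
    return {"Kp": Kp, "Ki": Kp / Ti, "Kd": Kp * Td}

# Illustrative ultimate gain/period; plug the resulting gains into the PID block.
print(zn_pid(Ku=4.8, Pu=110.0))
```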
Various Bit-Rotation Technique On Key-Symmetric Matrix
Mampa Ghosh, Debjani Chakraborty
Cryptography is one of the most challenging methods in today's technological world for hiding secret information, and security is its main purpose: the aim of cryptography is to protect data or information from unauthorized users or hackers. Cryptography has two parts: an encryption algorithm and a decryption algorithm. In this paper, we propose a new encryption algorithm to encrypt plaintext into ciphertext and a decryption algorithm to do the reverse. We apply various bit-rotation techniques together with complement, shifting, and transposition operations on a key-symmetric matrix. These operations are simple and easy to implement.
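A minimal sketch of one bit-level operation of the kind described above follows: each byte is complemented and then rotated by a key-derived amount. The key schedule is assumed for illustration, and the key-symmetric-matrix transposition step is not reproduced.

```python
# Hedged sketch: complement-and-rotate byte cipher with a repeating key.
def rotate_left(byte, n):
    n %= 8
    return ((byte << n) | (byte >> (8 - n))) & 0xFF

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    out = []
    for i, b in enumerate(plaintext):
        k = key[i % len(key)]
        out.append(rotate_left(b ^ 0xFF, k % 8))        # complement, then rotate
    return bytes(out)

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    out = []
    for i, b in enumerate(ciphertext):
        k = key[i % len(key)]
        out.append(rotate_left(b, 8 - (k % 8)) ^ 0xFF)  # rotate back, then complement
    return bytes(out)

msg, key = b"secret message", b"K3Y"
assert decrypt(encrypt(msg, key), key) == msg
```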
Feature Based Detection Of Liver Tumor Using K-Means Clustering And Classifying Using Probabilistic Neural Networks
Dr. P. V. Ramaraju, G. Nagaraju, V.D.V.N.S. Prasanth, B. Tripura sankar, P. Krishna, V. Venkat reddy
Liver cancer is a chronic cancer which originates in the liver; the tumor may also originate elsewhere in the body and later migrate to the liver, causing severe damage to it. In many cases it is not possible to identify its intensity directly, but symptoms such as abdominal pain, jaundice, and liver dysfunction indicate its presence. Many of the signs and symptoms of liver cancer can also be caused by other conditions, such as high blood calcium levels (hypercalcaemia), low blood sugar levels (hypoglycaemia), breast enlargement (gynecomastia), high red blood cell counts (erythrocytosis), and high cholesterol levels. Treatment of any cancer mainly depends on tumor size and grading. Hepatocellular carcinoma is the most common type of liver cancer, and the best method of diagnosis involves a CT scan of the abdomen, which provides accurate results. The proposed method uses segmentation with K-means clustering to segment the computed tomography (CT) images, and a probabilistic neural network (PNN) to detect the tumor in its earlier stages.
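The snippet below is a hedged sketch of the K-means segmentation stage on a CT slice, clustering pixels by intensity; the random array stands in for real CT data, and the feature extraction and PNN classification stages of the paper are omitted.

```python
# Hedged sketch: intensity-based K-means segmentation of a CT slice.
import numpy as np
from sklearn.cluster import KMeans

ct_slice = np.random.rand(128, 128)                 # stand-in for a grey-level CT image
pixels = ct_slice.reshape(-1, 1)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
segmented = kmeans.labels_.reshape(ct_slice.shape)  # three intensity regions

# Pick the cluster with the highest mean intensity as a candidate lesion mask.
candidate = np.argmax(kmeans.cluster_centers_.ravel())
lesion_mask = segmented == candidate
print("candidate lesion pixels:", lesion_mask.sum())
```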
Transactions in real-time databases should be scheduled considering both data consistency and timing constraints. In addition, a real-time database must adapt to changes in the operating environment and guarantee the completion of critical tasks. The effects of scheduling decisions and concurrency control mechanisms for real-time database systems have typically been demonstrated in a simulated environment. In many time-critical applications, data may be distributed among multiple sites; such applications include command and control, industrial automation, aerospace and defense systems, telecommunications, and banking. In such applications, it is imperative that the data be available to the requesting transactions at the time it is needed. In a typical distributed database, a transaction is required to access remote data directly, at the risk of missing its deadline. Another problem can occur when the requesting transaction accesses the data but it is not temporally valid, that is, its value is "out of date" because the transaction did not read from the most recent update. A replication algorithm creates replication transactions based on the clients' data requirements in a distributed real-time database. These replication transactions copy data objects to the sites on which they are needed, just in time for the deadline. The algorithm carefully computes the parameters of the replication transactions so that any requests that read data are guaranteed to read temporally valid data. This algorithm is designed to work in a static environment in which all object locations and client data requirements are known a priori.
To Know About The Latest Bluetooth As A Wireless Technology
Er. Anup lal Yadav, Er. Sahil Verma, Er. Kavita
Bluetooth wireless technology is vastly different from 802.11b wireless local area network technology: not only is it significantly slower than 802.11b products, but it is also completely incompatible with them. To better understand this, let's take a look at what Bluetooth technology is and what exactly it was designed to do. Bluetooth is a telecommunications industry specification that describes how mobile phones, computers, and personal digital assistants can be easily interconnected using a short-range wireless connection. Bluetooth products achieve this by placing a small, inexpensive radio transmitter/receiver module in each electronic device. This module acts as the physical medium connecting these devices and also provides the necessary communication protocols for them to successfully transmit data.
As an information resource, the Internet is increasing in size, depth, and complexity. Availability of information is no longer an issue on the Internet. In an era where human time and attention are worth so much, it is becoming increasingly time consuming to sift the relevant information from an ever-growing sea of inadequate or inappropriate data and to speed up the laborious process of web browsing. Much of the information provided by search engines to web users is irrelevant, due to the lack of structural information and categorization of the documents. The issue of uncategorized data has become a persistent problem due to the inconsistencies and variations in the characteristics of the data.
In this paper, we present a way to cut the sea of data down to relevant data by forming categories and then providing summarized data with links in each specific category. The proposed approach addresses these drawbacks and returns efficient outcomes.
Spatial And Temporal Characterization Of 433 MHz Radio Based WSN In Outdoor Environment
Oraetue Chijioke Dennis, Azubogu A. C. O, Nwalozie G. C
In this paper, a 433 MHz radio (KYL500s) based Wireless Sensor Network (WSN) is used in an experimental testbed to investigate the spatial and temporal variations of the packet error rate, a wireless link quality metric, in order to determine the link quality. The packet error rate measurements were taken at intervals of 10 meters up to a distance of 600 meters in the 0°, 90°, 180°, and 270° directions, and the results obtained show that the packet error rate increases with distance. The experimental WSN was set up for a whole day, and the variation of the packet error rate with time was measured at fixed sensor distances of 50, 100, 150, 200, 250, 300, 350, 400, 450, 500, 550, and 600 meters from the sink. The temporal results showed that the link quality is worst during the periods around 10:00 and 17:00 hours.
A Survey on Different Methods for Design of Sparse FIR Filter
Navreet Kaur , Ankur Singhal
Sparsity has been an important issue in the design of FIR filters. The objective of sparse FIR filter design is to reduce the implementation complexity by reducing the number of nonzero coefficients: by increasing the number of zero-valued coefficients, the implementation of the filter becomes simpler, as the additions and multiplications corresponding to zero-valued coefficients are omitted. This paper discusses the various techniques used for designing a sparse FIR filter and presents a comparative study of several filter design methods used by researchers for this task.
Comparative study and evaluation of various data classification techniques in data mining
Vivek Verma, Ram Nivas Giri
Data mining is the knowledge discovery process in databases, designed to extract data from a dataset and transform it into a desired form. Data mining is similarly used in the search for consistent patterns and/or systematic relationships among variables, and then to validate the findings by applying the detected patterns to new subsets of data. Data classification is one of the data mining techniques used to map a large data set into appropriate classes. Classification is a form of supervised learning that is employed to predict the class of an input, where the classes are predefined. Supervised learning is the part of machine learning that focuses on modeling the input/output relationship: the goal is to identify an optimal mapping from input variables to output variables, based on a sample of observations of the values of the variables. Data classification has various applications, such as handwriting recognition, speech recognition, iris matching, text classification, computer vision, and drug design. The objective of this paper is to survey the major techniques of data classification, including artificial neural networks, decision trees, k-nearest neighbor (KNN), support vector machines, and the naive Bayes classifier, and to make a comparative analysis of these major techniques.
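As a hedged companion to the survey above, the sketch below runs a few of the listed classifiers side by side on one benchmark dataset with cross-validation; the dataset, hyperparameters, and evaluation protocol are assumptions, since the survey does not prescribe them.

```python
# Hedged sketch: cross-validated comparison of several surveyed classifiers.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-NN":          KNeighborsClassifier(n_neighbors=5),
    "SVM":           SVC(kernel="rbf"),
    "naive Bayes":   GaussianNB(),
    "neural net":    MLPClassifier(max_iter=1000, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)     # 5-fold cross-validation accuracy
    print(f"{name:13s} mean accuracy = {scores.mean():.3f}")
```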
A Survey on Coverage Problem in Wireless Sensor Network
Nidhi Chaudhary, Sahil Gupta
A wireless sensor network (WSN) is composed of a group of small, battery-equipped devices capable of sensing, communication, and computation, which can be scattered over a vast region for the purpose of detecting or monitoring that region accurately. Sensing coverage and network connectivity are two of the most fundamental problems in WSNs. Coverage in a WSN is usually defined as a measure of how well the sensing field is monitored, or how well the sensors are able to observe the physical space, while connectivity can be defined as the ability of a sensor to reach the data sink. Finding an optimal deployment strategy that provides a high degree of coverage with network connectivity is extremely challenging; therefore, maximizing network coverage as well as maintaining network connectivity using resource-constrained nodes is a non-trivial problem. In this survey article, we classify the coverage problem, analyze the relationship between coverage and connectivity, and discuss research challenges and existing problems in this area.
Search Engine Results Clustering Using TF-IDF Based Apriori Approach
Hetal C. Chaudhari, K. P.Wagh and P.N.Chatur
The Internet is being used to a great extent nowadays, and all types of data are available very easily on it. The user submits a query to a search engine and thousands of related documents are returned as the result. Web documents contain different types of data, such as text, images, and videos, so they are not properly structured and are unorganized; it becomes difficult for users to find a specific document among thousands of documents. The solution to this problem is clustering the web documents. Clustering congregates the documents showing similar context with respect to the user query: similar documents are assembled in a cluster, which reduces the user's effort in discriminating among the thousands of documents returned for a query. Ranking can then be applied so that the most relevant documents appear at the top; the documents in a cluster are ranked and arranged according to their similarity, where different functions can be used to calculate the similarity measure among documents. We combine these two concepts and propose a TF-IDF based Apriori scheme for web document clustering and ranking. In this scheme, clustering is first applied to the documents using a modified TF-IDF based Apriori algorithm, and then ranking is performed to place the most pertinent documents at the top with regard to the user query. We use online web pages returned as results for a query as the dataset for our experimental work. The approach gives a good F-measure value of 81% and is found superior to some traditional clustering approaches.
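The sketch below illustrates the cluster-then-rank idea with TF-IDF vectors and cosine similarity; the modified Apriori frequent-termset step of the paper is replaced here by a plain K-means grouping purely for brevity, and the documents and query are hypothetical.

```python
# Hedged sketch: cluster search results on TF-IDF vectors, then rank within clusters.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["jaguar speed of the animal in the wild",
        "jaguar car dealership price list",
        "big cats jaguar and leopard habitat",
        "used jaguar cars for sale"]
query = "jaguar animal habitat"

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Rank documents inside each cluster by cosine similarity to the query.
scores = cosine_similarity(vec.transform([query]), X).ravel()
for c in set(labels):
    members = [(scores[i], docs[i]) for i in range(len(docs)) if labels[i] == c]
    print("cluster", c, sorted(members, reverse=True))
```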
Design and Optimization of Planar Antenna With Horizontal Slit for ISM Band (2.4 GHz) Applications
Parthrajsinh K. Jhala, S.Sreenath Kashyap
The ever-increasing demand for communication equipment and the emergence of new technologies require an efficient antenna design of smaller size covering a wider range of frequencies for ISM band applications such as WLAN.
The ISM band considered here ranges from 2 GHz to 2.8 GHz, which includes WLAN according to the IEEE 802.11g standard that extended throughput up to 54 Mbit/s using the same 2.4 GHz band. Because of various advantages such as low profile, ease of integration, and low cost, microstrip antennas have found vast application in communication systems. Here we report an investigation of a planar antenna using the probe feed technique. The analysis is done on two different substrates, FR-4 and RT Duroid 5880, with dielectric constants 4.3 and 2.2 respectively, with the introduction of a horizontal slit along the width of the patch. The antenna is simulated using CST (Computer Simulation Technology) Microwave Studio v2011, and antenna parameters such as return loss, VSWR, gain, and bandwidth have been analyzed. The design procedure for a patch antenna for ISM band applications is illustrated.
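As a hedged aid to the design procedure mentioned above, the sketch below applies the standard transmission-line equations for a rectangular patch to the two substrates the paper considers. The substrate height of 1.6 mm is an assumption, and the slit modification and CST optimization are not represented.

```python
# Hedged sketch: textbook rectangular-patch dimensions at 2.4 GHz.
import math

def patch_dimensions(f0, er, h):
    c = 3e8
    W = c / (2 * f0) * math.sqrt(2 / (er + 1))                      # patch width
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / W) ** -0.5   # effective permittivity
    dL = 0.412 * h * (eeff + 0.3) * (W / h + 0.264) / ((eeff - 0.258) * (W / h + 0.8))
    L = c / (2 * f0 * math.sqrt(eeff)) - 2 * dL                     # patch length
    return W * 1e3, L * 1e3                                         # in millimetres

for name, er in [("FR-4", 4.3), ("RT Duroid 5880", 2.2)]:
    W, L = patch_dimensions(2.4e9, er, h=1.6e-3)                    # assumed substrate height
    print(f"{name}: W = {W:.1f} mm, L = {L:.1f} mm")
```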
Designing of Hydraulic Ram Pump
Rohan.D.Balgude , Swapnil.P.Rupanavar, Pradeep.S.Bagul, Asst Prof. Manoday.R.Ramteke
This paper describes the techniques and guidelines for successfully installing a modern hydraulic ram pump. The proposed technique illustrates the methodology that can be used for the primary design considerations and applications in various ways. The hydraulic ram pump plays an important role in areas which are not connected to the national electricity grid, and this method provides a means of continuous water supply. The basic working principle is the ramming action of air inside the pressure vessel.
Steganography is gaining importance due to the exponential growth of secret communication by potential computer users over the internet. It can be defined as the study of invisible communication, dealing with ways of hiding the existence of the communicated message. Generally, data embedding is achieved in communication channels, images, text, voice, or multimedia content for copyright, military communication, authentication, and many other purposes. In image steganography, secret communication is achieved by embedding a message into a cover image (used as the carrier for the message) to generate a stego image (the generated image carrying the hidden message). In the proposed work, we embed a video file inside an image for the secret sharing of data and also aim to obtain the same quality of image after decoding. This paper gives an overview of hiding video inside an image, its uses, and its techniques. In this project, the video is compressed and stored inside the image, and features are extracted before encryption for effective reconstruction.
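A minimal sketch of the basic embedding idea follows: the bytes of a (compressed) video stream are spread over the least significant bits of a cover image. The payload and cover are stand-ins, and the compression, feature extraction, and encryption steps described above are omitted.

```python
# Hedged sketch: LSB embedding/extraction of a byte payload in a grey-level image.
import numpy as np

def embed(cover: np.ndarray, payload: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits     # overwrite LSBs
    return flat.reshape(cover.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
video_bytes = b"\x00\x01\x02compressed video stream..."     # stand-in payload
stego = embed(cover, video_bytes)
assert extract(stego, len(video_bytes)) == video_bytes
```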
Tracking And Scheduling of State Transport Bus using RFID
Prof. A. M. Magar, Akshay A. Kulkarni, Akshay S. Kyatam
This paper deals with the implementation of a system capable of tracking and scheduling State Transport buses. For this implementation we use Radio Frequency Identification (RFID). The proposed system can save the effort of depot staff in maintaining schedules and can also save paper, in other words it is eco-friendly. The proposed system can inform depot managers and other staff members whether a bus is arriving on time, early, or late, and passengers can also check the last bus depot visited by the respective bus.
Design of Environment Monitoring and Control System
Er.Satvir Singh, Dr.Rajeshwar Singh
In the last few years, the challenge of accurately monitoring and controlling distant environmental parameters has emerged as a new field of research. The concept of the Internet of Things (IoT), where everything around us comes with internet connectivity for monitoring and control, is also emerging very fast; monitoring environmental parameters and initiating control actions from the internet is part of this concept. In our proposed work, we design an environment monitoring system capable of monitoring and controlling environmental parameters such as temperature, pressure, and humidity. We focus on the design of a low-cost system that not only remotely monitors these environment variables but also initiates control actions, such as switching devices ON/OFF, from the internet. The system uses a wireless sensor network for sensing the environment parameters in the area under supervision. Sensor nodes have been designed to measure temperature, pressure, and humidity, a control node has been designed to initiate the control actions, and the central monitoring station is based on an ARM11 Raspberry Pi board.
An Optimized Image Encryption Technique for Multikey conservative chaotic system
P.T.Bhuvana, L.Thirumal
In recent years of wireless access communication, the internet and other computer communication technologies have been radically changing the ways of communicating and exchanging information. However, along with the speed, efficiency, and cost-saving benefits of the digital revolution, there exists a new challenge for the security and privacy of communicated information; hence cryptographic algorithms are an efficient tool for encryption. In this paper, a new color image encryption algorithm is proposed by combining the Lorenz and Rössler attractors with a multi-key concept for a conservative chaotic system. The pixel values of the plain image are also modified randomly using confusion and diffusion processes to strengthen the image security. The resulting encrypted image is compared with various results obtained using single-key and multi-key algorithms. The proposed algorithm is also analyzed under different critical attacks, and the results show that the proposed system has better efficiency, image confidentiality, and high encryption and decryption speed.
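A hedged sketch of the underlying idea of chaos-based image encryption is shown below: a keystream is derived from the Lorenz attractor and XORed with pixel values. The quantization rule, step size, and initial conditions are assumptions; the Rössler coupling, multi-key schedule, and the confusion/diffusion stages of the proposed algorithm are not reproduced.

```python
# Hedged sketch: Lorenz-attractor keystream XORed with a grey-level image.
import numpy as np

def lorenz_keystream(n, key=(0.1, 0.0, 0.0), sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.002):
    x, y, z = key                                   # secret initial conditions act as the key
    stream = np.empty(n, dtype=np.uint8)
    for i in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        stream[i] = int(abs(x) * 1e6) % 256         # quantize the trajectory to bytes
    return stream

image = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
ks = lorenz_keystream(image.size).reshape(image.shape)
cipher = image ^ ks                                 # encryption
assert np.array_equal(cipher ^ ks, image)           # decryption restores the plain image
```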
An Experimental Study of Data Mining Techniques in Blood Donors Sector
Deepthi Srambical Poulose , Dinesh Kumar Sahu and Anil Rajput
In today's computer age, data storage has grown to such a size that only computerized methods can find information among the large repositories of data available to organizations, whether online or offline. Data mining was conceptualised in the 1990s as a means of addressing the problem of analyzing the vast repositories of data that are available to mankind and being added to continuously. Data mining is necessary to extract hidden, useful information from the large datasets in a given application. This usefulness relates to the user's goal; in other words, only the user can determine whether the resulting knowledge answers that goal. The growing demand for quality in the blood bank sector makes it necessary to efficiently exploit the whole potential of the stored data, not only the clinical data but also data that can improve the behaviour of blood donors.
Internet communication has become an essential part of today's infrastructure. Information is communicated in various forms and is used in many applications that require secrecy. Security is one of the most important factors when transferring different types of important documents, and the internet is one of the most important media used to transfer them, for example in bank transfers, email communication, credit card purchases and military applications. In reality, however, the internet is not a secure channel. We therefore suggest visual cryptography (VC), a simple and fast method that provides security while sharing secret documents over the internet. A document is represented as a bitmap file and expanded into two or more encoded shares, which are transferred to the receiver inside a cover image using Bit Plane Complexity Segmentation (BPCS) steganography through electronic mail. BPCS allows large secret information to be hidden in a cover image: the cover image is analysed and, according to a threshold value, the complexity of each segment is used to verify the changes made to the original image, and data is stored in the higher-complexity regions. Steganography is a data-hiding technique in which secret data is embedded in a vessel image. Most other techniques have limited hiding capacity, typically up to 15% of the vessel image, whereas the hiding capacity of BPCS steganography is up to 50-60%. The final document can be obtained only when the required number of shares is combined at the receiving side. Thus the combined use of VC and BPCS provides data security for all forms of documents transferred over the internet.
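As a small illustration of the visual cryptography half of the scheme, the NumPy sketch below generates a classic (2,2) share pair for a binary secret image; the BPCS embedding of the shares into a cover image is not shown, and the sub-pixel layout is one common textbook choice rather than the authors' exact construction.

```python
# Minimal (2,2) visual-cryptography sketch for a binary secret (1 = black).
# Each pixel expands to two sub-pixels; OR-stacking the shares recovers the
# secret (black pixels become fully black, white pixels half black).
import numpy as np

def make_shares(secret, rng=np.random.default_rng(0)):
    h, w = secret.shape
    s1 = np.zeros((h, 2 * w), dtype=np.uint8)
    s2 = np.zeros((h, 2 * w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            pat = [1, 0] if rng.integers(2) else [0, 1]   # random sub-pixel pair
            s1[i, 2 * j:2 * j + 2] = pat
            # white pixel: identical patterns; black pixel: complementary patterns
            s2[i, 2 * j:2 * j + 2] = pat if secret[i, j] == 0 else [1 - pat[0], 1 - pat[1]]
    return s1, s2

secret = np.array([[0, 1], [1, 0]], dtype=np.uint8)
a, b = make_shares(secret)
stacked = a | b            # "stacking" the transparencies
print(stacked)             # black secret pixels -> [1, 1]; white -> one 1, one 0
```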
Hidden Markov Model With Biclustering Cache Replacement Policy For Location Based Services In MANET
Ms. Bhakti D. Shelar , Mr. D. K. Chitre
System performance of the mobile client is very important in a mobile environment. Frequently accessed data items are cached to improve performance, and a cache replacement technique is used to find appropriate data items for eviction from the cache. Selecting a suitable cache replacement strategy is very important because cache size is limited. Available policies do not take the movement patterns of the client into account. Here, we propose a new cache replacement policy for location-dependent data in a mobile environment. We use a hidden Markov model as a location prediction tool and then cluster the data by location and type. This makes the policy adaptive to the client's movement pattern, unlike earlier policies that consider only the directional/non-directional data distance. Simulation results show that the proposed policy significantly improves system performance, in terms of cache hit ratio, compared to previous schemes.
Compression On Wireless Sensor Network With Performance Guarantee
Reshma Mary Abraham, Dr. P. Sriramya
Multimedia contains an enormous number of files and consumes a large amount of memory, disk space, energy and time. To overcome these problems we propose a robust information-driven data compression architecture for wireless sensor networks. The relationships between sensor values are used to improve compression performance. Two techniques are employed: a logical mapping algorithm and distributed data compression. The cluster-based, robust information-driven architecture (RIDA) uses two transforms, the discrete cosine transform and the discrete wavelet transform. This architecture can reduce energy consumption by up to 30%, save 80-95% of bandwidth, and also identify missing nodes.
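The sketch below, assuming NumPy and SciPy are available, shows only the DCT stage of such cluster-based compression: readings gathered at a cluster head are transformed, the largest coefficients are kept, and the signal is reconstructed at the sink. The wavelet variant and the information-driven reordering are not modelled.

```python
# DCT-based compression of one block of sensor readings (sketch).
import numpy as np
from scipy.fft import dct, idct

def compress(readings, keep=0.3):
    coeffs = dct(readings, norm='ortho')
    k = max(1, int(keep * len(coeffs)))           # e.g. keep 30% of coefficients
    idx = np.argsort(np.abs(coeffs))[::-1][:k]    # indices of largest magnitudes
    sparse = np.zeros_like(coeffs)
    sparse[idx] = coeffs[idx]
    return sparse                                 # only non-zeros need be sent

readings = np.sin(np.linspace(0, 2 * np.pi, 64)) + 0.05 * np.random.randn(64)
reconstructed = idct(compress(readings), norm='ortho')
print("max reconstruction error:", np.max(np.abs(reconstructed - readings)))
```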
Secure Role Based Cloud System By Integrating Trust and Role Based Encryption Scheme
Bokefode Jayant D, Ubale Swapnaja A , Pingale Subhash V. , Rajguru Abhijit A.
Nowadays cloud-based systems are attracting more attention as the volume of user data rapidly increases, and organizations are concerned about how their data is stored and secured. A role-based encryption (RBE) scheme, built on role-based access control (RBAC), gives users a way to manage and share data in the cloud by combining cryptographic techniques with access control: the data owner encrypts the data in such a way that only users holding the appropriate role can decrypt and view it. In this paper we describe a hybrid cloud in which the organization's data is stored in a private cloud while authorized users are allowed to store their data in a public cloud. The RBE scheme helps solve the resulting security issues. We integrate the RBE scheme with the AES algorithm for encryption and decryption of the data. The paper also presents a mathematical model for calculating the trust of a user, which the owner can consult before uploading any data to the cloud.
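As a hedged illustration of the AES layer only (assuming the PyCryptodome package), the sketch below encrypts a record under a per-role key so that anyone granted that key can decrypt it. The RBE key management, hybrid-cloud split and trust computation described above are not modelled here.

```python
# AES-GCM encryption under a per-role key (sketch of the symmetric layer only).
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

role_key = get_random_bytes(32)              # 256-bit key held by one role

def encrypt_for_role(key, plaintext):
    cipher = AES.new(key, AES.MODE_GCM)
    ciphertext, tag = cipher.encrypt_and_digest(plaintext)
    return cipher.nonce, ciphertext, tag

def decrypt_with_role(key, nonce, ciphertext, tag):
    cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
    return cipher.decrypt_and_verify(ciphertext, tag)   # raises if tampered

nonce, ct, tag = encrypt_for_role(role_key, b"patient record 42")
print(decrypt_with_role(role_key, nonce, ct, tag))
```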
Modelling And Simulation Of STATCOM For Compensation Of Reactive Power To Improving The Power Quality By Using PI With Fuzzy Logic Controller
B Suresh Kumar, Thirupathi Rao D, SK Gouse Basheed
A Static Synchronous Compensator (STATCOM) regulates voltage and corrects the power factor at the point of common coupling (PCC) by injecting reactive power. The device also plays a vital role as a stability aid for small and large transient disturbances in an interconnected power system. This paper deals only with the power-factor correction mode and shows that the Total Harmonic Distortion (THD) in the source current is drastically reduced when a fuzzy controller is included in the STATCOM control circuit. The use of STATCOM, a FACTS device, is increasing in power systems for enhancing power transfer capability and providing dynamic reactive power support. In this paper a new fuzzy logic control method for STATCOM is proposed and applied to damping oscillations. Simulations using MATLAB/SIMULINK are carried out to verify the performance of the proposed controller. Significant improvement in damping oscillations has been achieved with the proposed fuzzy logic supplementary controller, and the results show that the designed controller has an excellent capability to damp power system oscillations.
The main aim of this paper is to design a wireless communication link between a helmet and a bike so that the rider must wear the helmet in order to ride safely. The helmet plays a vital role here, and we therefore name the device INGENIOUS ARMOR, since it is cleverly and originally devised for its purpose, namely protection from death and head injuries. The system is designed around a microcontroller that forms the control unit of the project. The bike does not ignite unless the rider wears the helmet; this is achieved through communication between the helmet (transmitter) and the bike (receiver). Likewise, if the rider is in a drunken state, the bike does not ignite. If the rider unfortunately meets with an accident, a message is sent to family members, the police and an ambulance in order to save the rider's life to the greatest possible extent, and a buzzer warns the rider at sharp curves, accident-prone areas and school zones. These last two features use GPS and GSM modules.
Wireless Networks – Analysis on Prevention of Jamming Attacks
Ramesh Kumar Mojjada
A jammer is a wireless device that transmits on the same frequency range as a cell phone, creating strong interference at the cell tower and blocking cell phone signals and call transmission. Jammers are usually undetectable, cause effects such as poor signal reception, and may be used in any location, though they are typically deployed where cell phone use would be disruptive or where the wireless network is vulnerable to interference attacks. The best defence against jamming of any communication is to hide the information; jamming has typically been addressed under an external threat model. For short periods of time, an adversary acting as a jammer can selectively target messages of high importance. Our analysis addresses the prevention of jamming, in terms of network performance degradation and attacker effort, by applying machine learning algorithms such as clustering and classification to selective attacks on wireless network routing. We show comparatively how jamming attacks can be launched by performing real-time packet classification. A cryptographic primitive hides the data, mitigating the attacks by preventing real-time packet classification and providing secure communication.
As a huge data source, the internet contains a large amount of worthless information, and the data is usually semi-structured HTML web pages. This paper uses a new methodology to perform the extraction task automatically. It consists of two steps: the first is identifying individual data records in a page, and the second is aligning and extracting data items from the identified data records. For the first step, a method based on visual information is used to segment data records, which improves on past methods. The second step uses a novel partial alignment technique based on hierarchical parent-child matching. Partial alignment means aligning only those data fields in a pair of data records that can be aligned (or matched), and making no decision on the remaining data fields. This approach enables more reliable alignment across multiple data records. Results on a large number of web pages from diverse domains show that the proposed two-step technique is able to segment data records and to align and retrieve data that matches the relevant results well. Precision, recall and f-measure are used to evaluate the performance of the existing and proposed web data extraction methods, and the results show that the proposed method is better than the existing method.
Better Performance Codes Of UWB Based Systems For Wireless Body Area Network
Gundeep Kaur
In this paper, I propose and analyse binary ZCD (zero correlation duration) codes and PN (pseudo-noise) codes for ultra-wideband (UWB) systems in a wireless body area network (WBAN). In a WBAN there are many wireless communication devices on a human body, so multiple access interference (MAI) can occur; systems with less MAI are more efficient and less harmful. The performance is evaluated in terms of the BER (bit error rate) of the UWB system for different devices in the WBAN channel and for different lengths of ZCD and PN codes, first for two devices (one with a ZCD code and one with a PN code) and then for three devices. The simulation results confirm that the UWB system with the ZCD code achieves better performance than the PN code.
Towards Secure and Dependable Message Authentication in WSN
Kavitha M, Rekah H, Dr. Siddappa M
In recent years, wireless sensor networks (WSNs) have found applicability in almost every domain, including the military, health care, structure monitoring, forest surveillance and agriculture, as they can be deployed in unattended hostile environments; WSNs will be an integral part of human lives. However, one of the main concerns in a WSN is its limited resources. Wireless sensor networks consist of one or more base stations and a number of sensor nodes that are stimulated by external events. This paper provides a message authentication scheme for wireless sensor networks. Message authentication is one of the most effective ways to prevent unauthorized users and unwanted messages from being forwarded in a WSN. Many message authentication methods have been developed, based on either public-key or symmetric-key cryptosystems, but most of them suffer from limited scalability, high computational and communication overhead, poor resilience to node compromise attacks, and a threshold problem. To solve these problems, a new authentication scheme has been developed using elliptic curve cryptography. In this scheme any node can transmit any number of messages without a threshold problem, and message source privacy is also provided.
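For orientation, here is a generic elliptic-curve message authentication sketch (assuming the Python `cryptography` package): each node signs its report with ECDSA and any verifier checks it against the node's public key. The paper's source-privacy and threshold-free features are not reproduced here.

```python
# Generic ECDSA sign/verify of a sensor report (sketch, not the paper's scheme).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

node_key = ec.generate_private_key(ec.SECP256R1())   # done once per node
public_key = node_key.public_key()                   # distributed to verifiers

message = b"temp=23.4,node=17,seq=102"
signature = node_key.sign(message, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("message accepted")
except InvalidSignature:
    print("message rejected")
```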
A Traffic Model for Routing Logic in Network On Chip
C.G. Joy Merline, R.S. Niveditha
The increased demand for on-chip communication bandwidth as a result of the multi-core trend has made packet-switched networks-on-chip (NoCs) a more compelling choice for the communication backbone in next-generation systems. However, NoC designs involve power, area and performance trade-offs in topology, buffer sizes, routing algorithms and flow control mechanisms, so studying new NoC designs can be very time-intensive. The effectiveness of a NoC system can be evaluated by hardware implementation: a traffic generator is designed and the data flow in the NoC is checked. The traffic generator uses a random number generator for traffic creation, and Bernoulli-process-based traffic injection is used to inject congestion into the NoC. The traffic generator is also capable of generating error-free traffic using its analyser unit.
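A minimal sketch of the Bernoulli injection idea: in each cycle a packet is injected with probability p, which gives the random offered load used to stress a router model. The router and analyser hardware themselves are not modelled, and the parameters are illustrative.

```python
# Bernoulli traffic injection trace (sketch).
import random

def bernoulli_traffic(cycles, p, seed=0):
    rng = random.Random(seed)
    # 1 = inject a packet in this cycle, 0 = idle
    return [1 if rng.random() < p else 0 for _ in range(cycles)]

trace = bernoulli_traffic(cycles=10_000, p=0.3)
print("measured injection rate:", sum(trace) / len(trace))   # close to 0.3
```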
A 16-Bit Ripple Carry Adder Design Using High Speed Modified Feedthrough Logic
Avinash Singh, Dr. Subodh Wairya
This paper presents the design and simulation of a high-speed 16-bit ripple carry adder using a new CMOS logic family called feedthrough logic (FTL). FTL arithmetic circuits provide a smaller propagation delay than standard CMOS technologies, and the proposed circuit has a very small propagation delay compared to existing dynamic logic circuits. The proposed modified feedthrough logic completely eliminates the output distortion that occurs in the existing FTL structure with a reference voltage of Vdd/2. In this paper, a long chain of inverters (20 stages) and a 16-bit ripple carry adder are designed using modified feedthrough logic. A comparative analysis is then carried out by simulating the circuits in a 180 nm CMOS process from TSMC using Tanner EDA 14.11.
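For readers unfamiliar with the adder being simulated, this is a behavioural Python model of the 16-bit ripple-carry structure (sixteen cascaded full adders, each taking the previous stage's carry); it mirrors the logic, not the FTL transistor-level circuit.

```python
# Behavioural 16-bit ripple-carry adder (sketch).
def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add16(x, y):
    carry, result = 0, 0
    for i in range(16):                   # carry ripples through 16 stages
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result & 0xFFFF, carry         # 16-bit sum and carry-out

print(ripple_carry_add16(0x1234, 0x0FFF))   # -> (0x2233, 0)
```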
Secure Data Retrieval for Decentralized Disruption-Tolerant Military Networks
Ms.S.Iswary, Mr.D.Jayakumar
Mobile nodes in military environments such as a battlefield or a hostile region are likely to suffer from intermittent network connectivity and frequent partitions. Disruption-tolerant network (DTN) technologies are becoming successful solutions that allow wireless devices carried by soldiers to communicate with each other and access the confidential information or command reliably by exploiting external storage nodes. Some of the most challenging issues in this scenario are the enforcement of authorization policies and the policies update for secure data retrieval. Ciphertext-policy attribute-based encryption (CP-ABE) is a promising cryptographic solution to the access control issues. However, the problem of applying CP-ABE in decentralized DTNs introduces several security and privacy challenges with regard to the attribute revocation, key escrow, and coordination of attributes issued from different authorities. In this paper, we propose a secure data retrieval scheme using CP-ABE for decentralized DTNs where multiple key authorities manage their attributes independently. We demonstrate how to apply the proposed mechanism to securely and efficiently manage the confidential data distributed in the disruption-tolerant military network.
An Efficient Authentication Scheme for Wireless Sensor Network
Alok Kumar, Akhilesh Yadav
Sensor nodes are usually installed in public or unattended environments, and a secure authentication scheme plays an important role in the communication between the user, the wireless sensor network (WSN) and the sensor node, as it removes security flaws from the communication. An authentication scheme must be developed using all available resources without compromising on any security flaw or risk. This paper proposes an efficient authentication scheme for wireless sensor networks based on username, password and smart card, which provides better security between communicating nodes. Users are able to choose their own username and password. The scheme involves registration, login and session-key establishment phases. Most schemes do not hide the username throughout the protocol, which may lead to insider attacks. This scheme uses a combination of symmetric and asymmetric keys and maintains the authenticity of nodes through mutual authentication, which prevents man-in-the-middle attacks and makes the network more secure against other attacks such as insider attacks, password guessing attacks and replay attacks.
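The standard-library sketch below shows only the symmetric challenge-response part of mutual authentication; the smart-card storage, asymmetric keys and username hiding described above are not modelled, and the shared secret is assumed to have been set up at registration.

```python
# Symmetric challenge-response mutual authentication (sketch).
import hashlib
import hmac
import secrets

shared_secret = secrets.token_bytes(32)      # established at registration

def respond(secret, challenge):
    return hmac.new(secret, challenge, hashlib.sha256).digest()

# Gateway -> node challenge: the node proves knowledge of the secret.
challenge_g = secrets.token_bytes(16)
node_proof = respond(shared_secret, challenge_g)
assert hmac.compare_digest(node_proof, respond(shared_secret, challenge_g))

# Node -> gateway challenge: the gateway proves itself the same way.
challenge_n = secrets.token_bytes(16)
gateway_proof = respond(shared_secret, challenge_n)
assert hmac.compare_digest(gateway_proof, respond(shared_secret, challenge_n))
print("mutual authentication succeeded")
```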
Comparative Study of Some Engineering Properties of Aluminium Roof Sheets Manufactured In Nigeria and China
Obam, Sylvester Ogah, (Corresponding author) and Taku, Kumator Josiphiah
Man has utilized various natural resources through technological methods to create environmentally safe, effective roofing materials, including straw, mud, wood, tile, metal and aluminium. A good roofing material should adequately withstand the loads it is subjected to within its life span; this depends on material qualities such as tensile strength, density and composition. Roof materials are often destroyed by wind and heavy rainfall in Nigeria, and Nigeria imports most of its aluminium roofing sheets from China. This research compares some strength properties of aluminium roof sheets produced in Nigeria with those from China; both samples were subjected to laboratory tests. The mean specific gravity of the local and foreign roof sheets was found to be 2.64 and 2.61 respectively, and the local sheet was found to be 14 per cent more elastic. The mean ultimate tensile strength of the local and foreign aluminium roof sheets is 52 and 43 N/mm² respectively, and the average Young's modulus of the local and foreign materials is 190 and 225 N/mm² respectively.
Leveraging Social Networks for P2P Content-Based File Sharing in Disconnected MANETs
Bhuvaneshwari D , P Sreelakshmi
Current peer-to-peer (P2P) file sharing methods in mobile ad hoc networks (MANETs) can be classified into three groups: flooding-based, advertisement-based, and social contact-based. The first two groups of methods can easily have high overhead and low scalability. They are mainly developed for connected MANETs, in which end-to-end connectivity among nodes is ensured. The third group of methods adapts to the opportunistic nature of disconnected MANETs but fails to consider the social interests (i.e., contents) of mobile nodes, which can be exploited to improve the file searching efficiency. In this paper, we propose a P2P content based file sharing system, namely SPOON, for disconnected MANETs. The system uses an interest extraction algorithm to derive a node’s interests from its files for content-based file searching. For efficient file searching, SPOON groups common-interest nodes that frequently meet with each other as communities. It takes advantage of node mobility by designating stable nodes, which have the most frequent contact with community members, as community coordinators for intracommunity searching, and highly mobile nodes that visit other communities frequently as community ambassadors for intercommunity searching. An interest-oriented file searching scheme is proposed for high file searching efficiency. Additional strategies for file prefetching, querying-completion, and loop prevention, and node churn consideration are discussed to further enhance the file searching efficiency. We first tested our system on the GENI Orbit testbed with a real trace and then conducted event-driven experiment with two real traces and NS2 simulation with simulated disconnected and connected MANET scenarios. The test results show that our system significantly lowers transmission cost and improves file searching success rate compared to current methods.
Hybrid Steganography Using Images: Varied PVD + LSB Detection Program
Mrs. Shruti
Image steganography can be achieved by two techniques, LSB and PVD. The LSB technique provides high capacity and good quality, but the hidden data can be detected very easily by the human eye. The PVD technique usually provides less capacity and higher distortion, but it cannot be detected by the human eye. In this paper I combine the advantages of both techniques, obtaining higher capacity and lower distortion while keeping the secret data hard to detect by the human eye.
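The NumPy sketch below illustrates only the LSB half of the hybrid: each secret bit replaces the least significant bit of a cover pixel, which is exactly the high-capacity, low-robustness component discussed above; the PVD stage is omitted.

```python
# LSB embedding and extraction (sketch of one half of the hybrid).
import numpy as np

def embed_lsb(cover, secret_bits):
    stego = cover.reshape(-1).copy()
    # clear the LSB of the first len(secret_bits) pixels, then write the bits
    stego[:len(secret_bits)] = (stego[:len(secret_bits)] & 0xFE) | secret_bits
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return stego.reshape(-1)[:n_bits] & 1

cover = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
secret = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = embed_lsb(cover, secret)
assert np.array_equal(extract_lsb(stego, len(secret)), secret)
```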
Anonymous Authentication of Cloud Data Storage using Decentralized Access Control
Aatreyi Ravikumar, Harshitha S, Meghana M, Mr. Anand M
For the purpose of secure data storage in clouds, we propose a new decentralized access control scheme supporting anonymous authentication, in which the cloud verifies the authenticity of the user without knowing the user's identity. A feature of the access control is that only authorized users can decrypt the stored information. The scheme prevents replay attacks and supports creating, modifying and reading data stored in the cloud, and it also provides user revocation. The access control and authentication provided are decentralized and robust, whereas other access control schemes designed for clouds are centralized.
Efficient Technique for Providing Authentication of Short Encrypted Messages
T Vijaya, Sivagama Sundari G
With the advancement of technology, many applications depend on the existence of small devices that communicate by exchanging messages, and in such applications the exchanged messages are short. The confidentiality and integrity of these short messages is of utmost importance. In this paper, we propose two new techniques for authenticating short encrypted messages that meet the requirements of such applications. The basic notion behind the proposed techniques is to exploit the security that the encryption algorithm provides in order to design an authentication mechanism that is more efficient than the existing authentication techniques in the cryptographic literature.
Performance Evaluation Of PAPR Reduction In OFDM System Using Non Linear Companding Transform
P.Chaitanya (Pg Scholar) , Mr. Ch. Rajendra Prasad M.Tech
Orthogonal Frequency Division Multiplexing (OFDM) is an advanced communication scheme with a wide range of applications such as 3G, 4G and Wi-Fi. Although OFDM has many advantages over conventional schemes such as FDMA and CDMA, it also has drawbacks, the major one being the Peak to Average Power Ratio (PAPR). Several solutions have been proposed in the literature, but few are designed with a practical approach in mind. In our work a novel approach to reducing the PAPR is presented by introducing variable slope parameters K1 and K2 and inflexion points A (A>0) and CA (0<C<1) into the probability density function of a Non-Linear Companding Transform; based on the inflexion point and parameter analysis, a trade-off between the two conflicting measures, PAPR and bit error rate, can be achieved. The selection of the companding transform parameters plays an essential role, as the overall performance and robustness depend on it. The results of the proposed work are demonstrated through MATLAB simulation.
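For context, the NumPy sketch below only measures the PAPR of one QPSK-modulated OFDM symbol generated via the IFFT; the paper's companding transform (with parameters K1, K2, A, C) would be an extra step applied to this time-domain signal and is not reproduced here.

```python
# PAPR of a single OFDM symbol (sketch of the metric only).
import numpy as np

def papr_db(x):
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

N = 256                                             # subcarriers
bits = np.random.randint(0, 2, (N, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
ofdm_symbol = np.fft.ifft(qpsk) * np.sqrt(N)        # time-domain OFDM symbol
print("PAPR = %.2f dB" % papr_db(ofdm_symbol))
```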
A Method For Measuring The Accuracy Of Multi-Modal Medical Image Fusion Using Non-Subsampled Contourlet Transform And Pulse Coupled Neural Network
Bharati R Bharambe, Prof.D.J.Agrawal
Multimodal medical image fusion is useful in the field of medical science. The main aim is to obtain the relevant information from the medical image sources and fuse them into a single output, which forms an important aid in medical diagnosis. The fusion criterion is to minimize the error between the fused image and the input images. For medical diagnosis, the edges and outlines of the objects of interest are more important than other information, so preserving edge-like features is worth investigating for medical image fusion. With this in mind, the project proposes a new medical image fusion scheme based on the non-subsampled contourlet transform and a pulse coupled neural network, which provides more detail about edges and curves and improves the edge information of the fused image by reducing distortion. Pixel-level and decision-level fusion rules are applied to the low-frequency and high-frequency sub-bands respectively, and the fused contourlet coefficients are reconstructed by the inverse non-subsampled contourlet transform. The goal of image fusion here is to obtain useful complementary information from CT/MRI multimodality images, and with this method more complementary information can be obtained.
Comparison of Image Fusion Techniques: Spatial and Transform Domain based Techniques
Veerpal Kaur, Jaspreet Kaur
Image fusion is a process of combining image details from a number of images into a single image, where the resultant fused image has most of the details and is more informative and complete than any of the input images. Image fusion techniques can increase the quality and broaden the application domain of such data. In this paper we present a detailed study and survey of some image fusion techniques, such as primitive fusion (averaging, select maximum and select minimum), discrete wavelet transform based fusion and principal component analysis (PCA) based fusion. The study and comparison of all these techniques helps determine the better approach for future research and implementation.
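As a minimal illustration of the primitive spatial-domain rules surveyed above, the NumPy sketch below applies pixel averaging and select-maximum to two registered images of the same size; wavelet- and PCA-based fusion are not shown.

```python
# Primitive spatial-domain fusion rules (sketch).
import numpy as np

def fuse_average(a, b):
    return ((a.astype(np.float64) + b.astype(np.float64)) / 2).astype(np.uint8)

def fuse_select_max(a, b):
    return np.maximum(a, b)        # keep the brighter pixel at each position

img_a = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
img_b = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
print(fuse_average(img_a, img_b))
print(fuse_select_max(img_a, img_b))
```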
The project is designed as a user-friendly and effective system for the task of land-use/land-cover (LULC) recommendation, where the goal is to automatically provide the user with information about the object or event depicted in an image and the surroundings where the image was taken. The system improves the image viewing experience by presenting supplementary information such as location images and geographically nearby images. The information is automatically collected from databases and various sources on the Internet based on SIFT and GLCM methods.
Recently, the availability of vast amounts of information has made it more difficult to find and discover what we need, so we need tools that help us organize, examine and understand these huge quantities of information. For automatically organizing, understanding, examining and summarizing large document collections, topic modeling delivers three capabilities: (1) discover the hidden subjects in the collection, (2) annotate the documents according to these subjects, and (3) use the annotations to organize, review and examine the collection. For this purpose the Probabilistic Latent Semantic Indexing (PLSI) approach is used to automate document indexing through a statistical latent class model for factor analysis of count data. In this paper, we produce a set of query-focused summaries from search results. As there may be several subjects related to a given query in the search results, the results should first be classified into subjects, and then every subject should be summarized separately. Two types of redundancy need to be reduced in this summarization process: first, each subject summary should not contain any internal redundancy; second, a subject summary should not be similar to any other subject summary. In the summarization process we focus on the document grouping step as well as on reducing the redundancy between summaries. We also show how the PLSI approach can be used to summarize search results. The evaluation results show that our method performs well in categorizing search results and in reducing the redundancy between summaries.
Text Detection and Character Segmentation from Natural Scene Images Using Graph Cut Labeling
Shruthi .V , Sunitha. R
This paper presents an innovative scheme for detecting and segmenting characters from natural scene images efficiently. In the detection stage, Canny and Sobel edge detectors are employed to detect the edges of the text, so that only the text regions are extracted while non-text regions are filtered out using some geometric properties of text. In the segmentation phase, a graph cut labeling technique is used to extract characters rapidly from the detected text regions.
Implementation of High Efficiency Video Coding (HEVC) Decoder for Medical Video Applications
Sunitha N , Sunitha R
High Efficiency Video Coding (HEVC) is the most recent digital video coding standard of the ITU-T committee. The main goal of HEVC standardization was a significant increase in compression performance compared to the H.264 standard, in the range of a fifty percent bit-rate reduction for equal perceptual video quality. In this project we have developed and evaluated the performance of an HEVC decoder for lossless I-frame decoding. Subjective and objective quality metrics of compressed medical ultrasound video samples of different resolutions are evaluated, and we experimentally measured the performance of the HEVC decoder for medical ultrasound video sequences. The results show that, with HEVC decoding of the ultrasound video samples, videos of equivalent clinical quality can be obtained with quantization parameter values up to 36, as confirmed by the PSNR values.
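The sketch below shows the objective quality metric used in such evaluations: PSNR between an original 8-bit frame and its decoded version, computed with NumPy on synthetic data standing in for real ultrasound frames.

```python
# PSNR between an original frame and a decoded/degraded frame (sketch).
import numpy as np

def psnr_db(original, decoded, peak=255.0):
    mse = np.mean((original.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)

frame = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(frame + np.random.randint(-3, 4, frame.shape), 0, 255).astype(np.uint8)
print("PSNR = %.2f dB" % psnr_db(frame, noisy))
```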
The wireless sensor network is a widely used approach for real-time applications, which raise various fundamental problems such as limited energy resources. We propose a hybrid approach that combines a grid-based approach with a cluster-based approach; the main task is to minimize the energy consumption of the wireless sensor network. In the hybrid approach, the grid-based network depends on the deployment area, which is divided into grids of different sizes, and in the clusters the cluster head works like a base station that sends all sensed information to the sink node of the network. We use the QualNet tool to simulate the scenario of the grid-based cluster network.
Categorization of Text into Appropriate Sentiment for Automatic Synthesis of Expressive Speech
Sayeda Swaleha Peerzade
The attention of researchers is directed towards the generation of expressive speech. The text entering a text-to-speech system must therefore be analysed so that the sentiment it encompasses is known, and this sentiment-analysed text can then be used to synthesize expressive speech. This paper concentrates on identifying and categorizing the sentiment present in the text input to a text-to-speech system, which is the preliminary step in producing expressive speech. The sentence is preprocessed, classified using the classifiers, and the sentiment-tagged sentence is passed to the text-to-speech system, which selects a voice based on the sentiment and converts the text to expressive speech. A Twitter corpus is used as the dataset for training and testing the classifier because of its affect-based expressiveness.
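A toy sketch of the sentiment-tagging front end (assuming scikit-learn): a bag-of-words Naive Bayes classifier labels a sentence before it is handed to the TTS voice selector. The tiny inline corpus only stands in for the Twitter training data mentioned above, and this is not necessarily the classifier the authors used.

```python
# Bag-of-words sentiment classifier feeding a TTS voice selector (sketch).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = ["what a wonderful day", "i love this song",
               "this is terrible news", "i am so sad and angry"]
train_labels = ["positive", "positive", "negative", "negative"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

sentence = "such a lovely surprise"
sentiment = clf.predict([sentence])[0]
print(sentence, "->", sentiment)     # the TTS would then pick a matching voice
```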
Speech is the best way to express one's thoughts, and better speech quality helps communication. Speech enhancement is the process of improving the quality of speech using algorithms, with the main aim of providing noiseless communication. This paper discusses the various methods used for improving the quality of speech.
Hybrid Intrusion Detection Architecture for Cloud Environment
Sumalatha Potteti, Namita Parati
Cloud computing systems can easily be threatened by various attacks, because most of them provide service to many people who are not proven to be trustworthy, and due to their distributed nature, cloud computing environments are easy targets for intruders [1]. Intrusions have been a major problem for computing resources such as grid computing, ubiquitous computing, cloud computing and distributed computing. Intrusions are hard to detect, but a lot of work has been done on detecting and removing them; the focus of intrusion detection should mainly be on detecting intrusions at the system resources and at the network level for a predefined network. In this paper we propose a system to detect intrusions in cloud computing using a behaviour-based approach and a knowledge-based approach: if the first approach is unable to classify the data, the second approach verifies it again and compares it with the signatures in the database. The proposed system therefore yields a very low false positive alarm rate. The paper also surveys intrusion detection and prevention techniques and possible solutions in host-based and network-based intrusion detection systems, discusses different intrusion detection techniques, namely anomaly-based and signature-based techniques, and surveys different approaches to intrusion prevention systems.
Power Aware Cluster Based Routing With Sleep Scheduling in WSN (PACBR)
Broshni Cyriac, Manoj R.
Wireless sensor networks (WSNs) are an emerging technology in today's world. A WSN consists of thousands of sensor nodes that perform sensing, computation and communication. Most sensor nodes are powered by batteries, so an energy-efficient routing scheme is needed, and many cluster-based routing algorithms have been developed with energy efficiency in mind. In this paper, we propose power-aware cluster-based routing with sleep scheduling in WSNs to prolong the sensor network lifetime. The cluster head (CH) is selected based on the number of neighbouring nodes and the remaining residual energy of the sensor node. To further prolong network lifetime we also use sleep scheduling: most nodes are kept in a sleep state to conserve energy, which increases the network lifespan. The system is evaluated with Network Simulator-2 (NS-2) and compared with a system without sleep scheduling. The simulation results show that the proposed approach prolongs the network lifetime and balances the energy consumption among the nodes of the cluster.
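A minimal sketch of the cluster-head selection rule described above: each node is scored from its residual energy and neighbour count and the highest-scoring node becomes CH. The weights and the weighted-sum form are illustrative assumptions, not the paper's exact formula.

```python
# Cluster-head selection from residual energy and neighbour count (sketch).
def select_cluster_head(nodes, w_energy=0.6, w_degree=0.4):
    max_e = max(n["energy"] for n in nodes)
    max_d = max(n["neighbours"] for n in nodes)
    def score(n):
        # normalised weighted sum of residual energy and node degree
        return w_energy * n["energy"] / max_e + w_degree * n["neighbours"] / max_d
    return max(nodes, key=score)

nodes = [{"id": 1, "energy": 0.9, "neighbours": 4},
         {"id": 2, "energy": 0.5, "neighbours": 7},
         {"id": 3, "energy": 0.8, "neighbours": 6}]
print("cluster head:", select_cluster_head(nodes)["id"])
```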
Prevention And Elimination Of Vampire Attacks That Drains Life From WSN Using IDS
Archana Krishnan, Mr. Manoj. R
A mobile ad hoc network is an infrastructure-less network in which the routing operation plays an important role. Due to this infrastructure-less nature, ad hoc networks face issues at the MAC layer and in routing, security and network survivability, among others. Network survivability, the ability of a network to stay connected under attacks and failures, needs particular attention in the design and performance of wireless ad hoc sensor networks. A large portion of research effort concentrates on maximizing the network lifetime, which is measured from the time of deployment to the point when one of the nodes has exhausted its limited power source and becomes inoperational, commonly referred to as first node failure. There is a class of resource consumption attacks called vampire attacks that permanently disable the whole network by quickly draining the nodes' battery power. These attacks alter targeted packets by preparing long routes or misguiding the packets; malicious nodes use false messaging or alter routing information, which affects bandwidth and node battery power. This paper proposes a system that detects and eliminates vampire attacks using an IDS and thus keeps the network alive.
Lecture videos are becoming a ubiquitous medium for e-learning, and e-lecturing has made popular lectures more accessible. The amount of lecture video data on the World Wide Web is increasing rapidly, so an appropriate method for retrieving videos from huge lecture video libraries is required. These videos contain textual information on the slides as well as in the presenter's speech. This paper estimates the relative utility of automatically recovered text from both of these sources for lecture video retrieval and presents a content-based video search method for obtaining the most relevant results. To implement this system, we first separate the content on the presentation slides from the speaker's speech: for mining the textual information written on slides we apply an optical character recognition algorithm, and to translate the speaker's speech into text we apply an automatic speech recognition algorithm. Finally, we store the extracted text in a database against a particular timestamp and unique id by performing automatic video indexing. When a user submits a search query, the results are displayed according to the video contents. This technique helps the user find a suitable video within a short period of time.
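The sketch below shows only the indexing and search step: text recovered by OCR (slides) and ASR (speech) is stored against its timestamp in a simple in-memory inverted index, so a keyword query returns positions inside the video. The OCR/ASR stages and the real database are assumed to exist elsewhere.

```python
# Timestamped inverted index for lecture-video search (sketch).
from collections import defaultdict

index = defaultdict(list)          # word -> list of (video_id, timestamp_seconds)

def add_segment(video_id, timestamp, text):
    # `text` would come from OCR on a slide or ASR on a speech segment
    for word in text.lower().split():
        index[word].append((video_id, timestamp))

def search(word):
    return index.get(word.lower(), [])

add_segment("lec01", 120, "gradient descent update rule")
add_segment("lec01", 480, "stochastic gradient descent example")
print(search("gradient"))          # -> [('lec01', 120), ('lec01', 480)]
```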
Security in Wireless Sensor Networks using Trust Based Distance Bounding
Parvathy Menon, Associate Prof.Deepa S Kumar, Prof.M Abdul Rahiman
Determining the position of sensors is an essential part of most WSN algorithms. When neighbour recognition fails, protocol performance and communication deteriorate, and in networks affected by relay attacks the failure can be more severe. In this paper, we propose and discuss a technique that aims to circumscribe all the sensor nodes in the network using distance bounding, a secure neighbour detection method, combined with secure localization. For this, we use trust-based position identification to find the position of neighbouring sensor nodes by applying a trust-based distance bounding protocol.
A Review Paper On The Performance Analysis Of LMPC & MPC For Energy Efficient In Underwater Sensor Networks
Supreet Kaur
Due to the use of acoustic channels with limited available bandwidth, Underwater Sensor Networks (USNs) often suffer from significant performance restrictions such as low reliability, low energy-efficiency, and high end-to-end packet delay. The provisioning of reliable, energy-efficient, and low-delay communication in USNs has become a challenging research issue. In this paper, we take noise attenuation in deep water areas into account and propose a novel layered multipath power control (LMPC) scheme in order to reduce the energy consumption as well as enhance reliable and robust communication in USNs. To this end, we first formalize an optimization problem to manage transmission power and control data rate across the whole network. The objective is to minimize energy consumption and simultaneously guarantee the other performance metrics.
An Energy Efficient Multichannel Mac Provisioning In MANETS
M. Nivetha, C.Rajinikanth
This paper proposes a TDMA-based multichannel medium access control (MAC) protocol for QoS provisioning in mobile ad hoc networks (MANETs) that enables nodes to transmit their packets in distributed channels. The IEEE 802.11 standard supports multichannel operation at the physical (PHY) layer but its MAC protocol is designed only for a single channel. The single channel MAC protocol does not work well in multichannel environment because of the multichannel hidden terminal problem. Our proposed protocol enables nodes to utilize multiple channels by switching channels dynamically, thus increasing network throughput. Although each node of this protocol is equipped with only a single transceiver but it solves the multichannel hidden terminal problem using temporal synchronization. The proposed energy efficient multichannel MAC (EM-MAC) protocol takes the advantage of both multiple channels and TDMA, and achieves aggressive power savings by allowing nodes that are not involved in communications to go into power saving “sleep mode”. We consider the problem of providing QoS guarantee to nodes as well as to maintain the most efficient use of scarce bandwidth resources. Our scheme improves network throughput and lifetime significantly, especially when the network is highly congested. The simulation results show that our proposed scheme successfully exploits multiple channels and significantly improves network performance by providing QoS guarantee in MANETs.
Flexibility Analysis of A Bare Pipe Line Used For Cryo Application
Vansylic Israel Pintu J , Dr. Manivannan, Jeremiah JothiRaj
A cryogenic piping circuit designed to handle liquid hydrogen is studied. It is one of the important piping circuit networks in the fuel booster turbo pump, which forms part of the cryogenic upper stage of the Geosynchronous Satellite Launch Vehicle of the Indian Space Research Organisation at Mahendragiri. The circuit consists of piping elements such as expansion joints/loops with optimal placement of supports. This paper mainly discusses the thermal stresses induced in the piping circuit when liquid hydrogen flows through it and how these stresses can be reduced by incorporating various expansion loops. Cryogenic fluid-servicing pipelines tend to develop thermal stress due to contraction/expansion of the piping material during chilling/warming from ambient to cryogenic temperature or vice versa. The present project covers the design, analysis and detailed engineering of the cryogenic piping using a Finite Element Method (FEM) tool. Flexibility results are determined by structural analysis while varying the elbow angles.
Parametric Modeling & Study Of Structural Characteristics Of Wind Turbine Blade
A.Sethu Ramalingam, Dr.S.Rajakumar
The structural aspects of a long blade in an upwind, horizontal-axis wind turbine were studied. A hybrid composite structure using glass and carbon fibre plies was created, yielding a lightweight design with low tip deflection. The results confirmed that the design has acceptable performance with regard to tip deflection, maximum and minimum strains, and the critical load. Detailed descriptions of the structural components and ply layups are presented along with the resulting maximum and minimum strains and deflections. In addition, optimization techniques were introduced to provide insight for future studies of the blade, and a detailed review of the current state of the art in wind turbine blade design is presented. Designs with different cross sections ('I', 'C' and double 'C') are considered, together with structural analysis and the fatigue characteristics of the wind turbine blade.
Extended Two Phase Commit Protocol In Real Time Distributed Database System
Sanjeev Kumar Singh Kushwaha
The two-phase commit (2PC) protocol is widely used for commit processing in distributed database systems (DDBSs). The blocking problem in 2PC reduces the availability of the system, because a blocked transaction holds all its resources until it receives the final command from the coordinator after the coordinator's recovery. To remove blocking in 2PC, the three-phase commit (3PC) protocol was proposed. Although 3PC eliminates the blocking problem, it involves an extra round of message transmission, which degrades system performance in DDBSs. Since both 2PC and 3PC have problems that degrade system efficiency, the E2PCP protocol is introduced to enhance system performance compared with 2PC and 3PC.
To reduce blocking, we propose an extended two-phase commit protocol (E2PCP) that attaches multiple participant sites to the coordinator site to act as backup or substitute sites for the coordinator. In this protocol, after receiving responses from all participant sites in the first phase, the coordinator communicates the final decision to the backup sites in a backup phase and afterwards sends the final decision to the participants. When blocking occurs due to a coordinator site failure, a participant site can terminate the transaction by consulting the coordinator's backup sites. In this way the E2PCP protocol achieves non-blocking behaviour in most coordinator site failures.
Performance Evaluation of Various Cluster based OFDM Schemes
Ruchi Gupta, Sumita Mishra, Brajesh Kumar Gupta
As more and more people use communication equipment, the demand for high data rates increases quickly. Orthogonal Frequency Division Multiplexing (OFDM) is one of the latest modulation techniques used to combat the frequency selectivity of transmission channels, and it achieves high data rates without inter-symbol interference. The basic principle of OFDM has gained widespread popularity within the wireless transmission community. This paper deals with a communication system that uses BPSK, QPSK and QAM to transmit information with OFDM over a Rayleigh channel. The performance of the different modulation schemes (BPSK, QPSK, 16-QAM, 64-QAM) used with OFDM in the Rayleigh channel is analysed in terms of Symbol Error Rate (SER), and we also analyse the Bit Error Rate (BER) of Rayleigh fading channels in OFDM systems using BPSK, QPSK and QAM modulation schemes.
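As a small worked example of the kind of BER analysis described above, the NumPy sketch below runs a Monte-Carlo simulation of BPSK over flat Rayleigh fading with perfect channel knowledge and compares it against the closed-form BER 0.5*(1 - sqrt(g/(1+g))) at average Eb/N0 = g. The OFDM framing and the higher-order QAM cases are not included.

```python
# BPSK over flat Rayleigh fading: simulated vs. theoretical BER (sketch).
import numpy as np

def bpsk_rayleigh_ber(ebno_db, n_bits=200_000, rng=np.random.default_rng(1)):
    ebno = 10 ** (ebno_db / 10)
    bits = rng.integers(0, 2, n_bits)
    s = 2 * bits - 1                                            # BPSK symbols
    h = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2)
    noise = ((rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits))
             * np.sqrt(1 / (2 * ebno)))
    y = h * s + noise
    detected = (np.real(np.conj(h) * y) > 0).astype(int)        # coherent detection
    return np.mean(detected != bits)

for snr_db in (0, 5, 10, 15):
    g = 10 ** (snr_db / 10)
    print(snr_db, "dB  sim:", bpsk_rayleigh_ber(snr_db),
          " theory:", 0.5 * (1 - np.sqrt(g / (1 + g))))
```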
Finite Element Analysis on the Structural Behaviour of Bolted Joints
Naveenkumar P , Dr. Manivannan A
Bolted joints are among the most common elements in construction and machine design. A finite element model with three-dimensional solid elements was used to investigate the bearing failure of stainless steel bolted connections under shear. This paper presents an extension of the finite element investigation into the structural behaviour of stainless steel bolted connections, and three distinctive failure modes observed in single-lap and double-lap shear tests are successfully modelled: bearing failure, shear-out failure and net-section failure. The material behaviour of the lap joints of the plates is determined in accordance with a parametric optimization technique, in which a hole is made in the plate with several parameters. Through the parametric study, the net-section and shear-out failures are eliminated effectively. For bearing failure, material optimization is carried out using stainless steel under varying loading conditions and parameters.
The Web consists of vast chunks of information on Web sites; the user can retrieve this information by submitting a search query to Web databases and obtaining the relevant information. Web databases return multiple search records dynamically in the browser, and these search records contain Deep Web pages in the form of HTML pages. Conventional search engines do not index the hidden Web pages in Web databases. Several existing techniques have addressed the problem of how to efficiently extract structured data from the Deep Web, which refers to the hidden databases used by web sites, but information extraction and annotation remain key challenges in web mining. The retrieval of information should be done automatically and arranged in a systematic way for processing. Techniques such as wrapper induction have been introduced, and various types of annotators are used depending on the data to be labeled. This paper describes an automatic annotation approach based on different features of text nodes and data units.
An Efficient Design and Implementation of ALU using Gated Diffusion Index
K. Lakshmi Swetha, K. Kalpana
This paper presents the design of a 4-bit Arithmetic Logic Unit (ALU) using the gated diffusion index (GDI) technique. The ALU is the most crucial component of central processing units and microprocessors. 4x1 multiplexers, 2x1 multiplexers and full adders are used in the ALU design to implement arithmetic operations such as ADD and SUBTRACT and logical operations such as AND and OR, and GDI cells are used in the multiplexers and full adders. The simulation is carried out in the Tanner EDA 13.0 simulator using TSMC BSIM 250 nm technology and compared with previous designs realized with pass transistor logic and CMOS logic. CMOS uses both PMOS and NMOS transistors, and the CMOS design exhibits high power dissipation, high delay and high area occupation. The simulation shows that the proposed design is more efficient, with less power consumption and less area, and is faster compared to the pass transistor and CMOS techniques.
Design and Parametric Optimization Of Heavy Duty Leaf Spring
Vasanth G , Manivannan A
The project deals with the analysis of a leaf spring employed in a heavy-duty vehicle belonging to the medium segment of the Indian automotive market. In the design of this kind of spring, both the elastic characteristics and the fatigue strength have to be considered as significant aspects. In addition, as a consequence of the automotive industry's effort to reduce component mass, these springs have to withstand very high working stresses, so the structural reliability of the spring must be ensured. For this purpose, a static stress analysis using the finite element method has been carried out to find the detailed stress distribution in the spring, and the stresses and deflections are analysed and plotted. Nine different parameters have been chosen for the analysis. The main aim of this project is to improve the fatigue strength by reducing the shear stress of the spring, so each material is analysed for its displacements and stresses, and the material with the lowest displacement and stress is selected. The software used for the finite element meshing is ANSYS. The resulting values of deflection and stress are compared for each material, and the project examines the stress variations for all sets of parameters in order to pick the best spring among them.
Steganography in True Color Images Using Even Odd Bit Slicing
Sandeep Singh, Jaspreet Kaur
Steganography is a tool for hiding information inside an image. It usually deals with different approaches of hiding the existence of the communicated data or message in a way that it remains confidential. It maintains integrity of message between two communicating bodies. In image steganography, message image is hidden in cover image and a stego-image is generated. There are different types of steganography techniques having pros and cons. In this paper, we review the various data hiding techniques that are used to implement a steganography such as LSB, ISB, MLSB etc.
A Novel Semi Automatic Web Composition for Users Social Interactions
Mrs. K. Krishna M.Sc , M.Phil, M.E., Dr. D.Murugan M.E, Ph.D
The recent growth of, and dependence on, social networks is evident, as they help people share opinions on diverse subjects and thereby generate massive amounts of data. The analysis of this data is a complex activity. Data mining has many techniques for detecting trends and patterns in huge datasets [1], and Web composition over such complex data sets is an uphill task. Composing services needs to move beyond traditional frontiers and compose for different situations, and Web 2.0 technologies make it easier to collaborate. Adding a social dimension to these compositions can help satisfy user requirements and generate knowledge from user interactions. This paper introduces a Novel Semi Automatic Web Composition for Users Social Interactions (SAWCUSI) from a user's social perspective: it discovers and selects services based on user interactions in a social network.
Performance Analysis Of Various Phases Of SRM With Classical & New Compact Converter
S.M.Mohamed Saleem, L.Senthil Murugan
In recent years, considerable attention has been given to finding a compact and low-cost power converter topology for Switched Reluctance Motor (SRM) drives to meet emerging applications such as plotters, fans, pumps, screw rotary compressor drives, and high-speed drives above 30,000 RPM. This paper is concerned with such an attempt to formulate a new compact power converter for the SRM drive. The proposed converter has a reduced number of power electronic components, which makes it compact and also reduces the switching losses. Power factor is a vital issue in the use of power electronic converters, and it is improved here using a boost converter with a PI controller. A Simulink model is developed for a three-phase SRM using MATLAB. The performance of the proposed converter is compared with the classical converter and the analysis results are presented.
Design and Development of MANETs with Quality of Service Parameters
Meena Rao, Neeta Singh
Mobile Ad Hoc Networks (MANETs) are a self-configuring network of mobile nodes connected by wireless links. In MANETs, each mobile node works as a host as well as a router. MANETs are used in various and varied applications like setting up of conferences, e-classrooms, patient monitoring, detection of earthquakes etc. With the growth and proliferation of these devices in every aspect of society, the need for such devices to communicate in a seamless manner is becoming increasingly essential. Also, as MANETs gain popularity, their need to support real time and multimedia applications is growing as well. Real time and multimedia applications supported by MANETs have stringent Quality of Service (QoS) parameters such as efficient bandwidth utilization, minimum delay, minimum packet loss, good throughput etc. Providing QoS is difficult in MANETs due to a lack of centralized infrastructure based system, limited bandwidth availability, constant movement of nodes, contention for channel access and the highly dynamic topology of the wireless network. This paper is a study on the design and development of MANETs with necessary QoS parameters like low packet loss, good throughput, less delay.
Computational Intelligence for fault detection in power distribution network using ANN
T C srinivasa rao, Dr.S.S Tulasi Ram , Dr.J.B.V.Subrahmanyam
Early detection and location of faults in networks has been a major challenge in power systems engineering, as faults result in loss of energy and revenue and in damage to equipment and facilities. Transmission and distribution lines are vital links between generating units and consumers. Because they are exposed to the atmosphere, the chance of a fault occurring on a line is very high, and faults have to be dealt with immediately to minimize the damage they cause. This paper focuses on detecting faults in the electric power distribution network using artificial neural networks.
An analysis of neural networks with varying numbers of hidden layers and neurons per hidden layer is provided to validate the choice of network at each step. The developed neural network is capable of detecting single line-to-ground and double line-to-ground faults for all three phases. Simulation is carried out in MATLAB Simulink to demonstrate that the artificial neural network based method is efficient in detecting faults on the distribution system and achieves satisfactory performance.
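A toy sketch of such an ANN fault detector (assuming scikit-learn): a multilayer perceptron with a configurable hidden layer maps per-phase current features to a fault class. The synthetic samples and the feature choice below are assumptions standing in for the MATLAB/Simulink data, not the authors' dataset.

```python
# MLP-based fault detection on synthetic phase-current features (sketch).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# features: per-phase RMS currents [Ia, Ib, Ic]; labels: 0 = healthy, 1 = SLG fault on phase A
healthy = rng.normal([1.0, 1.0, 1.0], 0.05, (200, 3))
slg_a = rng.normal([4.0, 1.0, 1.0], 0.3, (200, 3))    # phase-A current rises under fault
X = np.vstack([healthy, slg_a])
y = np.array([0] * 200 + [1] * 200)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[3.8, 1.0, 1.1], [1.0, 0.95, 1.05]]))   # expected -> [1 0]
```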
This paper proposes the optimal placement of fuel cells and the sizing of Distributed Generators (DG) in distribution networks using optimization, with the objective of improving the reliability indices. The placement and size of the DGs are optimized using a Genetic Algorithm (GA), and the IEEE 34-bus distribution feeder is used to evaluate the proposed algorithm. The paper also proposes a new methodology using Particle Swarm Optimization (PSO) for the placement of DGs in radial distribution systems to reduce the real power losses and improve system reliability. A hybrid objective function with two parts is used for the optimal DG placement: the first part considers the power loss through an index named the Power Loss Reduction Index, and the second part considers the effect of DG on the reliability improvement of the system through an index named the Reliability Improvement Index. The proposed method is tested on the standard IEEE 12-bus test system, and the results are presented and compared with different approaches available in the literature; the proposed method outperforms the other methods in terms of solution quality and computational efficiency.