Data mining is the process of extracting potentially useful information from raw data in order to improve the quality of information services. Cloud computing can provide the infrastructure for the large and complex data involved in data mining, and at the same time new, demanding issues for data mining in cloud computing research have emerged. This paper introduces the essential concepts of cloud computing and data mining and discusses their security.
The diagnosis of heart disease in most cases depends on a complex combination of clinical and pathological data. Because of this complexity, there is significant interest among clinical professionals and researchers in the efficient and accurate prediction of heart disease. In the case of heart disease, time is very crucial: a correct diagnosis must be obtained at an early stage. A patient complaining of chest pain may undergo unnecessary treatment or be admitted to hospital, and in most developing countries specialists are not widely available for the diagnosis. Hence, an automated system can help the medical community by assisting doctors in making an accurate diagnosis well in advance, and decision support systems therefore play an important role in the diagnosis of heart disease. Accurate diagnosis at an early stage followed by proper subsequent treatment can result in significant saving of lives.
Feature Extraction and Selection of a Combination of Entropy Features for Real-time Epilepsy Detection
B. Abhinaya1, D. Charanya1, K. Palani Thanaraj2
Epilepsy is associated with abnormal electrical activity in the brain, which is detected by recording EEG (Electroencephalogram) signals. This signal is non-linear and chaotic and hence it is very time-consuming and tedious to analyse it visually. In this work, we have extracted five entropy features, namely Approximate Entropy, Sample Entropy, Fuzzy Entropy, Permutation Entropy and Multi-scale Entropy, for characterizing the focal signals. We have used the Sequential Forward Feature Selection (SFFS) algorithm to select two significant features for epilepsy classification. These two features are given as input to the Least Square Support Vector Machine (LS-SVM) classifier to differentiate normal and focal signals. The classification accuracy of our method is 82%. Moreover, the average computational time for the selected feature set is 47.94 seconds.
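As an illustration of one of these measures, the sketch below computes sample entropy of a 1-D signal with NumPy; the embedding dimension m and tolerance factor are conventional defaults, not the parameters used by the authors, and the SFFS and LS-SVM stages are not shown.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal (illustrative implementation)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(dim):
        # Count pairs of length-`dim` embedding vectors within Chebyshev distance r
        emb = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(emb) - 1):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Example on a toy signal; a real use would pass an EEG channel array.
rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=500)))
```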
MANET and VANET are active research areas, and many routing protocols have been proposed for use in them. In a MANET, nodes are connected through wireless channels and each node acts both as a router and as a host. One scenario of MANET is the vehicular ad-hoc network (VANET). For communication in a VANET, efficient routing protocols are needed; because of the highly changing network topology and frequent disconnections, it is difficult to design an efficient routing protocol for vehicles. There are two types of VANET communication: V2V (Vehicle to Vehicle) and V2RSU (Vehicle to Road Side Unit). Because accidents happen daily, VANET is one of the areas driving the refinement of Intelligent Transportation Systems (ITS), which can ensure passenger and road safety. Intelligent Transport Systems provide information when there is an emergency and report the traffic density. The existing routing protocols for VANET are not efficient enough to meet all traffic scenarios, so worthy routing protocols are required to enable communication between vehicles for passenger and road safety. This paper presents a literature survey of reactive and proactive MANET routing protocols such as AODV, DSDV, OLSR, and DSR. Analysis and characterization of these protocols is presented, which helps in further improvement of existing routing protocols.
During the next few years there will be a profound change in the generation and usage of energy, influenced by factors such as the environment, capital and production costs as well as geopolitical factors, and old power stations around the world will be phased out. Newer power stations with large output capacity will be difficult to permit in many existing political climates. With global warming now apparent, environmental impact and efficient fuel utilization have become significant factors in the adoption of newer and emerging energy conversion technologies. High installation cost is the major obstacle to the commercialization of fuel cells for distributed power generation. A fuel cell power system that contains a dc–ac inverter tends to draw an ac ripple current at twice the output frequency. Such a ripple current may shorten fuel cell lifespan and worsen the efficiency and output capacity. In this paper, an advanced active control technique is proposed that incorporates a current control loop in the dc–dc converter for ripple reduction. This reduces both the size and cost of the system. The proposed active ripple reduction method has been verified with MATLAB simulation.
A Review Based on PLC for Mixing and Filling of Liquids
Harishchandra Mahale1, Nikhil Pagar2, Swapnil Devgir3, Vaibhav T
In today's fast-moving world, many factors are important for the survival of a company, such as cost-effective production, flexibility of work, and on-time production. In this paper we survey the liquid mixing and bottle filling process using a PLC. The importance of using a PLC is that it allows the mixing of different liquids in desired amounts, high process speed, and accuracy in the amount of liquid filled into each bottle; moreover, if the liquid is a chemical that is dangerous to human health, both the mixing and filling processes can be carried out without human contact.
Digitization of biomedical signals has brought a drastic change in signal analysis. The electrocardiogram (ECG) is an important tool for the primary diagnosis of heart disease. The ECG signal, the electrical interpretation of cardiac muscle activity, is easily contaminated by different noises while being gathered and recorded. The ECG signal must therefore be filtered to remove noise and artifacts before it can be clearly represented. In this paper a new approach is presented to filter the ECG signal using a multi-resolution technique based on the wavelet transform. This method gives better results than other techniques applied in this field.
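As a rough illustration of this idea, the sketch below denoises a 1-D ECG trace by soft-thresholding the wavelet detail coefficients with PyWavelets; the wavelet, decomposition level and universal threshold are common defaults and are assumptions rather than the paper's exact parameters.

```python
import numpy as np
import pywt

def denoise_ecg(signal, wavelet="db4", level=4):
    """Multi-resolution wavelet denoising of a 1-D signal (illustrative sketch)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients, universal threshold
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    # Soft-threshold the detail coefficients, keep the approximation untouched
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Toy example: a noisy sine stands in for an ECG channel
t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
clean = denoise_ecg(noisy)
```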
Epilepsy is a neurological disorder marked by sudden recurrent episodes of sensory disturbance, loss of consciousness, or convulsions, associated with abnormal electrical activity in the brain. The sudden and seemingly unpredictable nature of seizures is one of the most compromising aspects of epilepsy. Most epilepsy patients spend only a marginal part of their time actually having a seizure and show no clinical signs of their disease during the time between seizures, the so-called inter-ictal interval. But the constant fear of the next seizure and the feeling of helplessness associated with it often have a strong impact on the everyday life of a patient (Fisher et al. 2000). A method capable of reliably predicting the occurrence of seizures could significantly improve the quality of life for these patients and open new therapeutic possibilities. Apart from simple warning devices, fully automated closed-loop seizure prevention systems are conceivable. Treatment concepts could move from preventive strategies towards on-demand therapy which resets brain dynamics and minimizes the risks associated with epilepsy.
This article gives an important guideline for an Automated Toll Collection System (ATCS) using NFC and theft vehicle detection. ATCS emerges as a converging technology where time and efficiency are important in today's toll collection systems. In this system, an NFC tag holding a unique identification number (UIN) and user details is issued by the toll authority; an active NFC tag is attached to the RC (Registration Certificate) book or smart card. When a vehicle passes through the tollbooth, the data on the NFC tag is read by the NFC reader and sent to the server for verification. The server checks the details and the toll amount is deducted from the user's account. Theft vehicle detection is done with the help of algorithms such as OCR and blob detection.
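A minimal sketch of the theft-detection idea is given below: read the number plate from a camera frame with OCR and check it against a watch list. The image file, the plate list and the use of pytesseract with Otsu thresholding are illustrative assumptions, not the authors' exact OCR/blob pipeline.

```python
import cv2
import pytesseract

stolen_plates = {"MH12AB1234", "KA05CD6789"}          # hypothetical watch list

frame = cv2.imread("tollbooth_frame.jpg")             # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

text = pytesseract.image_to_string(binary)            # OCR over the whole frame (simplified)
plate = "".join(ch for ch in text if ch.isalnum()).upper()

if plate in stolen_plates:
    print("ALERT: stolen vehicle detected:", plate)
else:
    print("cleared:", plate)
```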
Risk predictor & Health-care information manager using Android and Cloud
Pruthviraj Pardeshi1, Akshay Zagade2, Sachin Chaudhari3, v
This Android application creates, updates, deletes and retrieves records on a cloud database. Google's Android OS provides a great amount of flexibility for implementing such a pervasive application. The given information can be of any format, such as a DICOM image or a text data file. The system can be evaluated using any cloud service. The users are entities such as the administrator and the normal patient. Cloud computing lets us use our data pervasively, ubiquitously and in a distributed manner, and the app stores and retrieves data using these principles of cloud computing. Patient health record management can be done using these capabilities of Google Android, and DICOM images can also be stored. The app also predicts the risk of occurrence of a disease.
Fragmentation Study for Deduplication in Cache Backup Storage
M. Sakthivel1, Karnajoy Santal2, Bhaskar Rao K3
In backup environments, deduplication yields major advantages. Deduplication is the process of automatically eliminating duplicate data in a storage system and is the most effective technique for reducing storage costs. Deduplication predictably results in data fragmentation, because logically continuous data is spread across many disk locations. Fragmentation is mainly caused by duplicates from previous backups of the same backup set, since such duplicates are frequent due to repeated full backups containing a lot of unchanged data. Systems with in-line deduplication detect duplicates during writing and avoid storing them; such fragmentation causes data from the latest backup to be scattered across older backups. This survey focuses on various techniques to detect in-line deduplication. As per the literature, there is a need to develop deduplication methods that reduce time and storage space. A novel method is proposed to avoid the reduction in restore performance without reducing write performance and without affecting deduplication effectiveness.
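To make the mechanism concrete, the sketch below shows fixed-size-chunk, in-line deduplication keyed on SHA-256 fingerprints; the chunk size, the in-memory index and the file name are illustrative simplifications, not one of the surveyed designs.

```python
import hashlib

CHUNK_SIZE = 4096

def dedup_write(stream, chunk_store, index):
    """Store only chunks whose fingerprint has not been seen before (in-line dedup)."""
    recipe = []                          # ordered list of fingerprints, used at restore time
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in index:              # duplicate detection happens at write time
            index[fp] = len(chunk_store)
            chunk_store.append(chunk)
        recipe.append(fp)
    return recipe

index, chunk_store = {}, []
with open("backup.img", "rb") as f:      # hypothetical backup stream
    recipe = dedup_write(f, chunk_store, index)
print(len(recipe), "chunks written,", len(chunk_store), "unique chunks stored")
```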
Big Data and Large Scale Methods In Cloud Computing
Khair Unnisa Begum
In this paper I highlight what we believe to be the key technology dimensions for evaluating data management solutions. The paper defines what is meant by big data, reviews analytics techniques for text, audio, video, and social media data, makes the case for new statistical techniques for big data, highlights the expected future developments in big data analytics, examines cloud data management architectures, and covers big data analytics and visualization. It considers not only data management and analytics for vast amounts of unstructured data but also explores clustering, classification, and link analysis of big data.
Online FIR Registration and SOS System
Prof. Anindita Khade, Sanket Yerigeri, Kaustubh Sonde, Shivaganesh Pillai
This paper describes the project 'Online FIR registration and SOS system'. The system is the first of its kind and is designed to bridge the gap between the police and the common people. There are plenty of applications nowadays for shopping, travel and even gaming; however, there is no application for registering an FIR or for helping people facing emergency situations. We intend to create a system where users can register an FIR under various IPC sections and inform the police whenever they are in an emergency situation. We believe this will be a widely used system in the future and will help to bridge the gap between the police department and the people.
Preventing Gray Hole Attack using AOTMDV Routing Protocol in MANET
D. Amutha Pandiyan, R. Punitha
MANETs are autonomous and decentralized networks, accessible to both genuine users and malicious attackers. In network services, the confidentiality and integrity of data are affected by security issues. MANETs are very vulnerable to various attacks from malicious nodes, including some classical attacks (e.g., DoS attacks, wormhole attacks, gray-hole attacks and black-hole attacks). In this paper, we calculate the trust value of nodes using decision factors such as direct degree and recommendation degree. To further improve routing, the trust-based Ad-hoc On-demand Multi-path Distance Vector (AOTMDV) protocol is used, which is an extension of the Ad-hoc On-demand Multi-path Distance Vector (AOMDV) protocol. The AOTMDV protocol incorporates additional decision factors (incentive function and active degree) to improve trust. By enhancing trust, gray hole attacks are prevented, and the network throughput, packet delivery ratio and malicious attack resistance are also improved effectively.
Positioning-based services really picked up pace in the early 1990s with the introduction of emergency response lines such as 911 and 100, which raised public awareness of the location services being offered. These services also extend to transport and logistics; for example, taxis nowadays have built-in location-based services to track the vehicles.
There are multiple positioning technologies available, such as GPS, GLONASS, Time of Arrival, and Angle of Arrival, each with its own pros and cons.
We cover some of these services and review their functioning and functionality.
Modelling of Wind Diesel Hybrid System for Reverse Power Management Using BESS
C. Md. Shareef, Rajasekar Thota, N. Vaseem Raja, T. Narasimha Reddy
This paper presents the modeling of a Wind Diesel Hybrid System (WDHS) comprising a Diesel Generator (DG), a Wind Turbine Generator (WTG), the consumer load, a Ni–Cd battery based Energy Storage System (BESS) and a Distributed Control System (DCS). The models of all the aforementioned components are presented and the performance of the WDHS is tested through simulation. Simulation results, with graphs of the frequency and voltage of the isolated power system, the active power generated or absorbed by the different elements, and the battery voltage, current and state of charge, are presented for negative load and wind speed steps. The negative load step reduces the consumed load power to a level below the power produced by the WTG, so that to balance active power a negative DG power is needed (DG reverse power). As the DG speed governor cannot control the system frequency in a DG reverse power situation, it is shown how the DCS orders the BESS to artificially load the system until the DG power falls within a positive power interval. The negative wind step decreases the power produced by the WTG, returning the power system to a situation where the needed DG power is again positive, so that the BESS is not needed to load the system.
A Survey on Improving Classification Performance Using Data Preprocessing and Machine Learning Methods on NSL-KDD Data
Mr. Shobhan Kumar, Mr. Naveen D.C*
This paper gives an overview of our study in building rare-class prediction models for identifying known intrusions and their variations, and anomaly detection schemes for detecting novel attacks whose nature is unknown. Data mining and machine learning have been the subject of extensive research in intrusion detection, with emphasis on improving the accuracy of the detection classifier. The quality of the feature selection method is one of the important factors that affect the effectiveness of an Intrusion Detection System (IDS). This paper evaluates the performance of the data mining classification algorithms C4.5, J48, Naive Bayes, NB-Tree and Random Forest on the NSL-KDD dataset, with a focus on the Correlation-based Feature Selection (CFS) measure. The results demonstrate that NB-Tree and Random Forest outperform the other algorithms in terms of predictive accuracy and detection rate.
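For context, the sketch below shows the shape of such an evaluation using scikit-learn's Random Forest; the file name, column layout and label handling are assumptions about an NSL-KDD CSV export rather than the authors' exact pipeline, and CFS is not included.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import LabelEncoder

df = pd.read_csv("KDDTrain.csv")                 # hypothetical NSL-KDD export
X = df.drop(columns=["label"])
y = (df["label"] != "normal").astype(int)        # attack vs. normal

# Encode the categorical protocol/service/flag columns numerically
for col in X.select_dtypes(include="object"):
    X[col] = LabelEncoder().fit_transform(X[col])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```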
A Survey on Antidiscrimination using Direct and Indirect Methods in Data Mining
Chaube Neha Vinod, Ujwala M. Patil
Data mining is the study of data for relationships that have not previously been discovered. In sociology, discrimination is the hurtful treatment of an individual based on the group, class or category to which that person belongs rather than on individual merit, for example racial and religious intolerance. Along with confidentiality, discrimination is a very important issue when considering the legal and ethical aspects of data mining. It is more than obvious that most people do not want to be discriminated against because of their race, gender, religion, nationality, age, etc., especially when those attributes are used for making decisions about them, such as giving them a job, loan, education or insurance. For this reason, antidiscrimination techniques covering both discrimination discovery and discrimination prevention have been introduced in data mining. Discrimination can be either direct or indirect. Direct discrimination occurs when decisions are made based on sensitive attributes. Indirect discrimination occurs when decisions are made on the basis of non-sensitive attributes which are strongly correlated with biased sensitive ones. Here, discrimination prevention in data mining is tackled and new techniques are proposed that are applicable for direct or indirect discrimination prevention individually, or for both at the same time. Several decision-making tasks lend themselves to discrimination, such as education, life insurance, loan granting, and staff selection, and in many applications information systems are used for these decision-making tasks.
New IEEE Standard for Advanced Audio Coding in Lossless Audio Compression: A Literature Review
Fomazou Nselapi Auristin, Samadhan Mali
Due to the non-stop increase in communication subscribers, the loss and distortion of data (voice, video, etc.) have become prominent. Meanwhile, the IEEE-SA (Standards Association) developed a new standard for advanced audio coding (AAC) in August 2013, called IEEE 1857.2, to address those issues. This new standard for AAC is an efficient lossless audio codec (coding/decoding) technique that improves audio quality for compression and decompression, optimizes bandwidth during transmission, saves storage space, and speeds up video and audio streaming. This survey not only describes the different existing techniques for lossless audio compression but also presents the state of the art of this recent lossless standard for AAC compared with its most popular predecessors.
Performance Appraisal Of KDD Technique In Shopping Complex Dataset
Dr. K. Kavitha
The KDD field is concerned with the development of methods and techniques for turning raw data into useful information. Real-world practical application issues are also outlined. The basic assumption in predicting financial markets is that such prediction is not possible, which is consistent with remarks many financial professionals have made. In this survey, a large amount of information from the customers and vendors of a shopping mall is collected. Vendors need to extract the required combination of customers, and to find a solution, this automatic research application to improve business strategy is undertaken. This paper highlights the concepts and reviews the status and limitations.
The most accepted payment mode today is the credit card, for both online and offline purchases; it provides cashless shopping at every shop in all countries and is the most convenient way to shop online, pay bills, etc. Hence, the risk of fraudulent credit card transactions has also been increasing. In existing credit card fraud detection systems, a fraudulent transaction is detected only after the transaction is completed; it is difficult to identify the fraud, and the resulting losses are borne by the issuing authorities. The Hidden Markov Model is a statistical tool used by engineers and scientists to solve various problems. In this paper, it is shown that credit card fraud can be detected using a Hidden Markov Model during transactions. The Hidden Markov Model helps to obtain high fraud coverage combined with a low false-alarm rate.
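The sketch below illustrates the basic HMM screening idea with the hmmlearn package: fit a model to a cardholder's past transaction amounts and flag a new transaction if appending it drops the sequence log-likelihood sharply. The number of hidden states, the threshold and the use of amounts as the only observation are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Hypothetical history of past transaction amounts for one cardholder
history = np.array([[120.0], [45.0], [60.0], [300.0], [80.0], [55.0],
                    [90.0], [40.0], [150.0], [70.0]])
model = GaussianHMM(n_components=2, n_iter=100, random_state=0).fit(history)

def is_suspicious(history, new_amount, threshold=5.0):
    """Flag a transaction if it causes a sharp drop in sequence log-likelihood."""
    old = model.score(history)
    new = model.score(np.vstack([history, [[new_amount]]]))
    return (old - new) > threshold

print(is_suspicious(history, 75.0))     # expected: typical amount, not flagged
print(is_suspicious(history, 5000.0))   # expected: unusual amount, flagged
```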
An Optimal Cache Partition Based Algorithm For Content Distribution And Replication
Bandari Muniswamy1, Dr. N. Geethanjali2
Generally, users can issue two sorts of requests: elastic requests, which contain no delay constraints, and inelastic requests, which have an inflexible delay constraint. The deployment of WSNs in areas such as target tracking in battlefields and environmental control requires optimization of communication among the sensors to serve information with shorter latency and minimal energy consumption. Cooperative data caching has emerged as a productive technique to accomplish these ends simultaneously. The performance of protocols for such a network depends mainly on the selection of the sensors that take on special roles in the caching procedure and make forwarding decisions. A wireless content distribution setting is considered in which there are numerous cellular base stations, each of which contains a cache for storing content. Content is typically partitioned into two disjoint sets of inelastic and elastic content. Cooperative caching is shown to be capable of reducing the content provisioning cost, which heavily depends on the service and pricing dependencies among several stakeholders, including content providers, web service providers, and end consumers. Here, practical network, service, and economic pricing models are developed and then utilized to design an optimal cooperative caching strategy based on social community abstraction in wireless networks. Inelastic requests are served by means of broadcast transmissions, and algorithms are developed for content distribution with both elastic and inelastic requests. The developed framework includes optimal caching algorithms and analytical models for evaluating the operation of the suggested scheme. The primary contributions are: i) formulation of economic cost-reward flow models among the WNET stakeholders, ii) development of optimal distributed cooperative caching algorithms, iii) characterization of the impacts of network, user and object dynamics, and iv) investigation of the impacts of user noncooperation.
Providing Confidential Policy Conjecture To Images Uploaded By Users On Social Sites
Pushpa Rani, BG Nagar, Mandya
Today, users sharing large volumes of images through social sites inadvertently face a major problem of maintaining confidentiality. Users need tools to help them control access to their shared content. An Adaptive Privacy Policy Prediction (A3P) system is used in this paper to address the confidentiality problem. The A3P system helps the user compose confidentiality settings for their images by examining the roles of social context, image content and metadata, which act as possible indicators of the user's privacy preferences. The A3P system uses a two-level framework which, according to the user's available history on the site, determines the best available privacy policy for the user's images being uploaded. The solution relies on an image classification framework that groups images into categories which may be associated with similar policies, and on a policy prediction algorithm which automatically generates a policy for each newly uploaded image, also according to the user's social features. The generated policies follow the evolution of the users' confidentiality attitudes.
Design of PID Controller with Compensator Using Direct Synthesis Method for Unstable System
G. Atchaya1, P.Deepa2, V.Vijayan3, R.C.P
In industrial processes, an unstable system produces undesirable peak overshoot. Therefore, a PID controller with a compensator and a set-point filter is designed using the direct synthesis method. The set-point filter reduces the peak overshoot, while the PID controller with compensator improves the overall response of the system. In this method, the characteristic equation of the system with the PID controller and compensator is compared with a desired characteristic equation. A single tuning parameter is used to find the controller parameters, the compensator and the set-point filter.
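As a rough companion to this description, the sketch below simulates a set-point-filtered PID loop around an unstable first-order plant with the python-control package; the plant 1/(s-1), the gains, the derivative filter and the set-point filter time constant are illustrative values, not the tuning produced by the paper's direct synthesis procedure.

```python
import control
import matplotlib.pyplot as plt

plant = control.tf([1], [1, -1])              # unstable plant 1/(s - 1) (assumed example)
Kp, Ki, Kd, tau = 4.0, 2.0, 0.5, 0.01
# PID with a first-order derivative filter, written as a single transfer function:
# C(s) = [(Kd + Kp*tau) s^2 + (Kp + Ki*tau) s + Ki] / [s (tau s + 1)]
pid = control.tf([Kd + Kp * tau, Kp + Ki * tau, Ki], [tau, 1, 0])
closed = control.feedback(pid * plant, 1)     # unity-feedback closed loop
filt = control.tf([1], [0.5, 1])              # set-point filter to soften the overshoot

t, y = control.step_response(filt * closed)   # filtered set-point step response
plt.plot(t, y); plt.xlabel("time [s]"); plt.ylabel("output"); plt.show()
```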
Location Tracking And Detection Of Human Using Mobile Robot And Wireless Sensor Networks
Muntaha Sakeena
In the face of dangerous environmental disasters, robots are being recognized as good candidates to step in as human rescuers. Robots, especially those furnished with advanced sensors, have been gaining the interest of many researchers in rescue matters. In a distributed wireless robot system, the main objective of a rescue system is to track the location of the object continuously. This paper provides a novel idea to track and locate humans in a disaster area using a stereo vision system and ZigBee technology. The system recursively predicts and updates the 3D coordinates of a human in the robot's camera coordinate system, which makes the system cost effective. The system comprises a ZigBee network, which has many advantages such as low power consumption, self-healing, low data rates and low cost.
A Wireless Sensor Network for Agricultural Applications Using ARM7
M. Zaheer Ahmed#1, S.Md.Imran Ali#2, B.V.Ramana#3, S. Saleem#4, Thota Sneha Priya#5
Agricultural field monitoring has become very important in recent times. Farmers struggle in many ways to obtain good yield from their fields; they must work hard to maintain their fields so that the crops are not damaged and the required yield is obtained. A few technologies are being developed to monitor the status of fields, and this project is also developed to provide monitoring of the field's condition. In this project, we continuously check the water in the field using a sensor. If there is no water in the field, a sprinkler is turned ON, for which one motor is used for pumping; if there is too much water in the field, the motor is automatically turned OFF. In the rainy season, when there is excess water, the field outlet (spool) can be damaged; to avoid this, another sensor is placed at the field outlet, and if the water level is high a gate is opened by a motor, otherwise the gate remains closed. Another motor is used to sprinkle fertilizers. If a fire occurs in the field, the user receives a message that a fire accident has occurred and the sprinklers turn ON automatically to control the fire. Thus, this method is advantageous for the monitoring of fields.
Disease prediction has always been a matter of research due to the increasing number of health risks. Modern medical systems produce huge amounts of data which need to be organized. Medical data mining plays a vital role in generating efficient results when it comes to prediction-based analytics: data mining turns data into patterns which are useful for analyzing diseases. This research paper focuses on the role of the K-Means algorithm in disease prediction.
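For illustration, the sketch below clusters a handful of hypothetical patient records with scikit-learn's k-means; the feature set and the number of clusters are assumptions for the example and are not taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# age, resting blood pressure, cholesterol, max heart rate (hypothetical records)
X = np.array([[63, 145, 233, 150],
              [37, 130, 250, 187],
              [56, 120, 236, 178],
              [67, 160, 286, 108],
              [41, 130, 204, 172]], dtype=float)

Xs = StandardScaler().fit_transform(X)                      # normalize the features
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(Xs)
print("cluster labels:", km.labels_)                        # groups to inspect for risk patterns
```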
Computer-based sensors and actuators such as global positioning systems, machine vision, and laser-based sensors have increasingly been incorporated into mobile robots with the aim of configuring autonomous systems capable of taking over operator activities in agricultural tasks. However, the incorporation of many electronic systems into a robot reduces its reliability and increases its cost. Hardware minimization, as well as software minimization and ease of integration, is essential to obtain feasible robotic systems. A step forward in the application of automatic equipment in agriculture is the use of fleets of robots, in which a number of specialized robots collaborate to accomplish one or several agricultural tasks.
Big Data and Current Cloud Computing Issues and Challenges
M Sowmya Reddy
Big data applications are a great benefit to organizations, businesses, companies and many large-scale and small-scale industries. We also discuss various possible solutions for the issues in cloud computing security and Hadoop. Cloud computing security is developing at a rapid pace and includes computer security, network security, information security, and data privacy. Cloud computing plays a very vital role in protecting data, applications and the related infrastructure with the help of policies, technologies, controls, and big data tools. Big data is a data analysis methodology enabled by recent advances in technologies and architecture. Cloud computing is a set of IT services provided to a customer over a network on a leased basis, with the ability to scale service requirements up or down. Its advantages include scalability, resilience, flexibility, efficiency and the outsourcing of non-core activities.
We consider a sub-optimum joint transmit-receive antenna selection (JTRAS) scheme in multiple-input multiple-output (MIMO) systems equipped with N transmit and three receive antennas. At the transmitter, we keep one antenna fixed and select the best among the remaining N-1 antennas. After selecting two transmit antennas, we select the receive antenna for which the signal-to-noise ratio (SNR) is maximum. We assume spatially independent flat fading channels with perfect channel state information (CSI) at the receiver and an ideal feedback link. We use Alamouti transmit diversity and derive the exact closed-form expression for the pdf of the received SNR, using which we obtain the bit error rate (BER) for a BPSK constellation. We present simulation results and compare them with the derived analytical expressions. We discuss some special cases of the considered scheme and compare its performance with other available schemes in terms of the number of feedback bits and BER. We conclude that the considered JTRAS scheme reduces the number of feedback bits compared with the golden code.
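As a small companion to the selection step, the Monte-Carlo sketch below estimates the BPSK error rate with receive antenna selection (picking the branch with the largest instantaneous channel gain) over independent flat Rayleigh fading; it illustrates only the receive-selection idea, not the full Alamouti-based JTRAS scheme or its closed-form analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bits, n_rx, snr_db = 200_000, 3, 10
snr = 10 ** (snr_db / 10)

bits = rng.integers(0, 2, n_bits)
sym = 2 * bits - 1                                        # BPSK mapping: 0 -> -1, 1 -> +1
h = (rng.normal(size=(n_rx, n_bits)) + 1j * rng.normal(size=(n_rx, n_bits))) / np.sqrt(2)
noise = (rng.normal(size=(n_rx, n_bits)) + 1j * rng.normal(size=(n_rx, n_bits))) / np.sqrt(2 * snr)
r = h * sym + noise                                       # received signal on each branch

best = np.abs(h).argmax(axis=0)                           # select the branch with max |h| (max SNR)
cols = np.arange(n_bits)
detected = (np.real(np.conj(h[best, cols]) * r[best, cols]) > 0).astype(int)
print("BER:", np.mean(detected != bits))
```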
Comparative Study of Forward and Backward Chaining in Artificial Intelligence
Namarta Kapoor 1, Nischay Bahl2
An artificial intelligence system is capable of elucidating and representing knowledge along with storing and manipulating data. Knowledge is a collection of facts and principles built up by humans; it is the refined form of information. Knowledge representation aims to represent knowledge in a manner that facilitates drawing conclusions from it. Knowledge representation is a good approach because conventional procedural code is not the best way to solve complex problems. Frames, semantic nets, system architectures, rules, and ontologies are techniques used to represent knowledge. Forward and backward chaining are the two main methods of reasoning used in an inference engine, and they are a very common approach for expert systems and business rule systems. This paper focuses on the concept of knowledge representation in artificial intelligence and elaborates the comparison of forward and backward chaining.
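To make the contrast concrete, the sketch below implements a tiny forward-chaining engine: it fires any rule whose premises are already known until no new facts can be derived (backward chaining would instead start from a goal and work back through rule conclusions). The rules and facts are invented for illustration.

```python
# Each rule is (set_of_premises, conclusion)
rules = [
    ({"has_fur", "says_woof"}, "is_dog"),
    ({"is_dog"}, "is_mammal"),
    ({"is_mammal"}, "is_animal"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises are all known until nothing new is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has_fur", "says_woof"}, rules))
# -> {'has_fur', 'says_woof', 'is_dog', 'is_mammal', 'is_animal'}
```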
Proposed New Architecture Of Overlay Network Of Distributed Hash Table
Jyotsana Sharma1, Birendra Kumar2
This paper proposes a new architecture for an overlay network based on a Distributed Hash Table (DHT). We introduce a new MultiChord protocol, a variant of the Chord protocol defined over a DHT overlay network. MultiChord inherits the basic properties of the Chord protocol with some added new features.
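For background, the sketch below shows a Chord-style lookup on a consistent-hashing ring, the base mechanism that Chord variants such as MultiChord build on; the node names, ring size and use of SHA-1 are illustrative, and finger tables and the MultiChord extensions are not modeled.

```python
import bisect
import hashlib

RING_BITS = 16

def ring_hash(key):
    """Map a key or node name to a position on the identifier ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (2 ** RING_BITS)

nodes = ["node-A", "node-B", "node-C", "node-D"]
ring = sorted((ring_hash(n), n) for n in nodes)           # node positions on the ring
points = [p for p, _ in ring]

def successor(key):
    """Return the first node clockwise from the key's position (its successor)."""
    idx = bisect.bisect_left(points, ring_hash(key)) % len(ring)
    return ring[idx][1]

print(successor("some-object-key"))
```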
PID/IPD Controller Design For Electro Mechanical Systems – A Study With PSO, BFO And FA
A. Aishwarya1, T. Agalya2, S. Nivetha3, R. Ramya4
In this work, a design procedure for PID and a modified form of PID (I-PD) controllers is proposed for highly non-linear benchmark electromechanical systems, namely the Vehicle Active Suspension System (VASS) and the Magnetic Suspension System (MSS) discussed in the literature. The proposed controller design is implemented using well-established heuristic algorithms: Particle Swarm Optimization (PSO), Bacterial Foraging Optimization (BFO) and the Firefly Algorithm (FA). A weighted-sum objective function comprising the overshoot (Mp), settling time (Ts), integral square error (ISE) and integral absolute error (IAE) is chosen to guide the heuristic search for the controller parameters Kp, Ki, and Kd. The major aim of this work is to compare the performance of the considered heuristic algorithms on the controller design problem. The simulation work is implemented in MATLAB, and the performance of this study is validated using the Mp, Ts, ISE and IAE values for reference tracking and disturbance rejection operations. This study confirms that FA offers faster convergence compared with the PSO and BFO algorithms.
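To show the flavour of such heuristic tuning, the sketch below uses a bare-bones PSO to search PID gains by minimizing the integral square error of a simulated step response; the second-order plant, the search bounds and the PSO coefficients are illustrative assumptions, not the VASS/MSS models or the settings used in the paper.

```python
import numpy as np

def ise(gains, dt=0.01, T=5.0):
    """Euler-simulate PID control of G(s) = 1/(s^2 + 2s + 1) and return the ISE."""
    Kp, Ki, Kd = gains
    y = dy = integ = prev_e = 0.0
    cost = 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y                                   # unit step reference
        integ += e * dt
        u = Kp * e + Ki * integ + Kd * (e - prev_e) / dt
        prev_e = e
        ddy = u - 2.0 * dy - y                        # plant dynamics
        dy += ddy * dt
        y += dy * dt
        cost += e * e * dt
    return cost

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10, (20, 3))                     # 20 particles, gains in [0, 10]
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([ise(p) for p in pos])
for _ in range(50):
    g = pbest[pbest_val.argmin()]                     # global best position
    vel = 0.7 * vel + 1.5 * rng.random(pos.shape) * (pbest - pos) \
                    + 1.5 * rng.random(pos.shape) * (g - pos)
    pos = np.clip(pos + vel, 0, 10)
    vals = np.array([ise(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
print("best gains (Kp, Ki, Kd):", pbest[pbest_val.argmin()], "ISE:", pbest_val.min())
```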
In the era of globalization, the internet plays a vital role in all spheres of life and industry. The internet is very popular nowadays for providing people with various services related to different fields. It is a very versatile facility which can help in completing many tasks easily and conveniently with a few clicks, whether a task of daily use or a specific service which requires a lot of research and formalities beforehand, and marketing is no exception. Online marketing, also called internet marketing, involves the use of interactive, virtual spaces for promoting and selling goods and services. In fact, new synchronous, internet-based communication technologies have contributed to the restructuring of major economic sectors, including marketing. Being cost-effective, flexible and fast, and enjoying an unprecedented global reach, internet marketing has brought different businesses incredible gains. However, issues such as security and privacy have also emerged in the field of marketing from the use of these virtual spaces.
Energy Efficient Wireless Performance Monitoring System For Solar Panel
1Alfred.P.F, 2Ansalam Maattius
For a PV array, system monitoring is considered important to analyze stability and performance. A simple monitoring system involves a data logging system with wired cables for transmitting data. Removing the drawbacks observed in the existing system, this proposed work is designed for wireless monitoring of photovoltaic cells as a high-precision solar array monitoring system. It measures basic PV array characteristics such as module temperature (T), open-circuit voltage (Voc) and short-circuit current (Isc) and wirelessly transmits the data to a real-time GUI on the computer. The GUI was developed using the PROCESSING software. The commercially available WPAN hardware module ZigBee is used for the implementation, with the API protocol for exchanging information. A sensor node with an XBee and a set of analog sensors (eliminating the use of controllers at the sensor node) for measuring the current and voltage generated in the PV array has been deployed. A coordinator node with an Atmel microcontroller and an XBee is connected to a PC to analyze the parameters.
NUMERICAL INTEGRATION OVER A LINEAR CONVEX POLYHEDRON USING AN ALL HEXAHEDRAL DISCRETISATION AND GAUSS LEGENDRE FORMULAS
H.T. Rathod a*, K.V. Vijayakumar b, C.S. Nagabhushana c, H.M. Chudamani d, A.S. Hariprasad e
Numerical integration is an important ingredient in many techniques of applied mathematics, engineering and scientific applications, due to the need for accurate and efficient integration schemes over complex integration domains with arbitrary functions as integrands. In this paper, we propose a method to discretise a physical domain in the shape of a linear polyhedron into an assemblage of hexahedral finite elements. The idea is to generate a coarse mesh of tetrahedrons for the given domain, then divide each of these tetrahedrons further into a refined mesh of tetrahedrons, if necessary. Finally, we divide each of these tetrahedrons into four hexahedra, and we further demonstrate that each of these hexahedra can be subdivided into smaller hexahedra. This generates an all-hexahedral finite element mesh which can be used for various applications. To achieve this, we first establish a relation between the arbitrary linear tetrahedron and the standard tetrahedron. We then decompose the standard tetrahedron into four hexahedra. We transform each of these hexahedra into a 2-cube and discover the interesting fact that the Jacobian of these transformations is the same, and the transformations themselves are the same but in a different order, for all four hexahedra. This fact can be used to great advantage to generate the numerical integration scheme for the standard tetrahedron and hence for the arbitrary linear tetrahedron. We propose three numerical schemes which decompose an arbitrary linear tetrahedron into 4 or more hexahedra. These numerical schemes are applied to solve typical integrals over a unit cube and an irregular heptahedron using Gauss-Legendre quadrature rules. MATLAB codes are developed and appended to this paper.
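The building block of such schemes is tensor-product Gauss-Legendre quadrature over the standard 2-cube [-1, 1]^3, onto which each hexahedron is mapped; the sketch below shows that rule in NumPy with an arbitrary illustrative integrand (the Jacobians of the actual hexahedron-to-cube maps are not included).

```python
import numpy as np

def gauss_legendre_cube(f, n=4):
    """Integrate f(x, y, z) over [-1, 1]^3 with an n-point Gauss-Legendre rule per direction."""
    pts, wts = np.polynomial.legendre.leggauss(n)
    total = 0.0
    for xi, wi in zip(pts, wts):
        for yj, wj in zip(pts, wts):
            for zk, wk in zip(pts, wts):
                total += wi * wj * wk * f(xi, yj, zk)
    return total

# Example check: the integral of x^2 * y^2 * z^2 over the cube is (2/3)^3 = 8/27
print(gauss_legendre_cube(lambda x, y, z: x**2 * y**2 * z**2))
```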
A GDI Approach to Various Combinational Logic Circuits in CMOS Nano Technology
Shashank Gupta1, Subodh Wairya2
Gate Diffusion Input (GDI) provides one of the effective alternatives in low-power VLSI applications. With the help of GDI, circuits requiring a large number of transistors can be realized with considerably fewer transistors. This approach tends to optimize various performance parameters such as area, delay and power dissipation. In this paper the GDI cell is applied to realize various combinational circuits. A novel design is also proposed (an XOR circuit using only nMOS) for providing low power in digital circuits. Based on simulation results, the waveforms have been analyzed and various performance parameters have been calculated; these parameters are then compared with standard CMOS logic. The schematic and layout are drawn on a 120 nm technology file in the Dsch tool and their analysis is done in the Microwind 3.1 tool with the BSIM simulator.
An orifice plate is a mechanical element used for measuring the rate of flow by restricting the flow; hence it is often called a restriction plate. The flow exerts a force on the plate due to the impact of the jet, and the orifice plate acts as an obstacle to the flow. In this work we performed static analysis for three different orifice geometries, keeping the net impact area and orifice area the same in all three cases. We then calculated the maximum stress and maximum deformation for all three orifice geometries under the assumed working conditions, and identified the best geometry, which has the minimum stress and minimum deformation.
Multi-Document Summarization Using TF-IDF Algorithm.
Prof. Amit Savyanavar, Bhakti Mehta, Varsha Marathe, Priyanka Padvi and Manjusha Shewale
With the increase in the amount of data and information one has to deal with nowadays, going through all the documents is a time-consuming process. We are implementing an Android application that helps organizations such as law firms to manage hundreds of documents and to get summaries of these documents. We are also using the concept of ontology for this application; an ontology is basically the set of relationships between entities. The application that we are implementing allows users to search for files in the database, upload files and summarize multiple documents.
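A minimal sketch of TF-IDF-based extractive summarization with scikit-learn is shown below: sentences are scored by the sum of their TF-IDF weights and the top-scoring ones are kept in their original order. The sample sentences and summary length are illustrative, and the ontology component is not modeled.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [
    "The court hearing was postponed until next month.",
    "The contract dispute involves two shipping companies.",
    "A settlement was proposed during the second hearing.",
]

def summarize(sentences, n_keep=2):
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    scores = np.asarray(tfidf.sum(axis=1)).ravel()       # one TF-IDF score per sentence
    top = sorted(np.argsort(scores)[::-1][:n_keep])       # keep top sentences in original order
    return [sentences[i] for i in top]

print(summarize(documents))
```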
Comparison between Various Approaches for Customer Relationship Management in Data Mining
Surabhi Aggarwal1, Er. Neena Madan2
In this paper, various techniques used for CRM in data mining are defined and compared with each other. Data mining is a useful and powerful tool for any organization, especially for marketing people, and it is used in managing relationships with customers. The data mining process can also be extremely useful for medical practitioners in extracting hidden medical knowledge; it would otherwise be impossible for traditional pattern matching and mapping strategies to be so effective and precise in prognosis or diagnosis without data mining techniques.
A Survey on Moving Object Detection and Tracking Techniques
Shilpa#1, Prathap H.L#2, Sunitha M.R#3
Moving object detection and tracking are important and challenging tasks in video surveillance and computer vision applications. Object detection is the procedure of finding the non-stationary entities in the image sequences and is the first step towards tracking the moving object in the video. Object representation is the next important step. Tracking is the method of identifying the position of the moving object in the video; identifying the position is a much more challenging task than detecting the moving object. Object tracking is applied in numerous applications such as robot vision, traffic monitoring, video surveillance, video in-painting and simulation. Here we present a brief review of the numerous object detection, object classification and object tracking algorithms available.
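One common detection baseline surveyed in this area is background subtraction; the sketch below applies OpenCV's MOG2 subtractor and draws bounding boxes around sufficiently large foreground regions. The video file name and the area threshold are illustrative.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")            # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)               # foreground (moving) pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:             # ignore tiny blobs / noise
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(30) & 0xFF == 27:             # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```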
Energy Efficient Sink Relocation Scheme (EE-SRS) for Enhanced Network Lifetime in Wireless Sensor Networks
Dr. A. Vijayalakshmi Alagarsamy1, Shalini
Wireless sensor networks consist of a distributed set of sensor nodes that cooperatively monitor physical or environmental conditions and send their information to a "sink" node over multi-hop wireless communication links. Depletion of the sensor nodes' battery energy significantly affects the network lifetime of a WSN. Most researchers have aimed to design energy-aware routing protocols that minimize the usage of battery energy to prolong network lifetime. This paper proposes a sink relocation approach for efficient utilization of the sensors' battery energy, called the Energy Efficient Sink Relocation Scheme (EE-SRS), which regulates the transmission coverage range depending on the residual energy of a sensor node. The EE-SRS proposed in this paper includes an algorithm to find the optimal place for relocating the sink, deciding when and where to relocate it. The EE-SRS algorithm is developed and simulated using a network simulator, and its performance is analyzed in terms of network lifetime, throughput and packet delay.
Efficient Mining Algorithm for Reducing Number of Itemsets Using CHUD
Hari.D1, Suganya.R2 , Parameshwari.M3
Data mining plays an essential role in mining useful patterns hidden in large databases. The Apriori algorithm is used to find frequent itemsets in large databases through bottom-up generation of frequent itemset combinations. Frequent itemset mining may discover a large number of frequent but low-revenue itemsets and lose information on valuable itemsets having low selling frequencies. High utility itemset mining identifies itemsets whose utility satisfies a given threshold; it allows users to quantify the usefulness or preference of items using different values. A high utility itemset which is not included in another itemset having the same support is called a closed high utility itemset. Mining high utility itemsets uses the Apriori algorithm, which takes as input the frequent itemsets from the transactional database together with profit and price, and gives the high utility itemsets as output. To mine the closed high utility itemsets, the system adopts an efficient depth-first search algorithm named CHUD.
Keywords: Frequent item set, High Utility Itemset, Closed High Utility Itemset, CHUD.
Visual Cryptography Scheme for Colored Image using XOR with Random Key Generation
T. Ambritha1, J. Poorani Sri2, J. Jessintha Jebarani3, M. Pradhiba Selvarani4
Visual Cryptography (VC) is a cryptographic technique which allows visual information to be encrypted in such a way that decryption can be performed by the Human Visual System (HVS), without the help of computers. A Visual Cryptography Scheme (VCS) eliminates the complex computation problem in the decryption process: the secret image can be restored by a stacking operation. This property makes VC especially useful where a low computational load is required. During encryption, the image is encrypted and divided into two shares; the two shares are superimposed to reveal the secret image in the decryption process. The objective of our project is to obtain a better-quality decrypted image with the same size as the original image. The OR-based VCS degrades the contrast because of its monotone property. In the XOR-based scheme, the share images are superimposed with the help of the XOR operation, which results in perfect reconstruction. Hence, the XOR operation is proposed in the decoding process to enhance contrast and quality and to reduce noise.
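A minimal sketch of the (2, 2) XOR-based idea with NumPy is shown below: one share is random, the other is the secret XOR-ed with it, and XOR-ing the two shares reconstructs the secret exactly. The toy binary "image" is illustrative, and the pixel-expansion and colored-image handling from the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
secret = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)         # toy binary image

share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)   # random key share
share2 = np.bitwise_xor(secret, share1)                          # encrypted share

recovered = np.bitwise_xor(share1, share2)                       # perfect reconstruction
assert np.array_equal(recovered, secret)
print(recovered)
```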
An Efficient System for Scalable Data Sharing In Cloud Storage
Jenolin Rex M 1, Latha P S 2
Using cloud storage, users store their data on the cloud without the burden of local data storage and maintenance and obtain services and high-quality applications from a shared pool of configurable computing resources. Cryptography is probably the most important aspect of communications security and is becoming increasingly important as a basic building block for computer security. As data sharing is an important functionality in cloud storage, in this paper we show how to securely, efficiently and flexibly share data with others in cloud storage, using a cloud-validation based flexible distributed migration scheme with aggregate-key ciphertext encryption for data stored in the cloud. This scheme provides secure data storage and retrieval. Along with security, the access policy is also hidden to protect the user's identity. The scheme is powerful since it combines aggregate encryption and string matching algorithms in a single scheme. The scheme detects any change made to the original file and, if a change is found, corrects the error. The algorithms used here are simple, so that a large amount of data can be stored in the cloud without problems. The security, authentication and confidentiality are comparable to those of centralized approaches. A set of constant-size ciphertexts is produced such that efficient delegation of decryption rights for any set of ciphertexts is possible.
The revolution in ICTs has profound implications for economic and social development. It has pervaded every aspect of human life whether it is health, education, economics, governance, entertainment etc. Dissemination, propagation and accessibility of these technologies are viewed to be integral to a country’s development strategy.
A literature survey on Big Data Analytics in Service Industry
Dr. I. Lakshmi1
The huge explosion of data and Internet devices has led to the rapid emergence of Big Data in the recent past. The service industry, a major consumer of these Big Data applications, will see real change to its delivery processes and gain new insights into usage patterns and workflows, which in turn will support new global delivery models incorporating new technologies and the global distribution of work. The service industry will use Big Data for various decision-making information systems and for making workflows more optimal. Just as the idea of mass production led to the Industrial Revolution, Big Data is expected to drive new forms of economic activity in the service industry with connected human capital, reaching new levels of economic activity, innovation, and growth.
In our day-to-day life, quiz competitions are rapidly increasing. To get accurate results, Fastest Finger First (FFF) is used to determine the players' response time. It is widely used at the institute level as well as the commercial level. In the early days it was tedious to determine who pressed the buzzer first in fastest finger first; the solution to this problem is to use the PIC microcontroller 16F877A. This project is designed as a product, and it also has a significant impact at the industrial level for security purposes.
Achieving Data Confidentiality And Access Control Through Ciphertext Property Based on Symmetric Encryption with Undeniable Assignment
Chiranth Mohan , Yogaprakash M G
In the cloud, to achieve access control and keep data confidential, data owners can adopt attribute-based encryption to encrypt the stored data. Users with limited computing power are, however, more likely to delegate the mask of the decryption task to the cloud servers to reduce the computing cost. As a result, attribute-based encryption with delegation has emerged. Still, there are caveats and open questions in the previous relevant works. For instance, during the delegation, the cloud servers could tamper with or replace the delegated ciphertext and respond with a forged computing result with malicious intent. They may also cheat eligible users by telling them that they are ineligible, for the purpose of cost saving. Furthermore, during encryption, the access policies may not be flexible enough.
Since a policy for general circuits enables the strongest form of access control, a construction for realizing circuit ciphertext-policy attribute-based hybrid encryption with verifiable delegation is considered in our work. In such a system, combined with verifiable computation and an encrypt-then-MAC mechanism, the data confidentiality, fine-grained access control and correctness of the delegated computing results are all guaranteed at the same time. Besides, our scheme achieves security against chosen-plaintext attacks under the k-multilinear Decisional Diffie-Hellman assumption. Moreover, an extensive simulation campaign confirms the feasibility and efficiency of the proposed solution.
Clothing Color and Pattern Recognition for Impaired people
Mrs. Anuradha.S.G, Thogaricheti Ashwini
In the study of human-computer interaction (HCI), the design and use of technology based on digital computer systems focuses mainly on the interfaces between people and computers. There is ongoing research on human-computer interaction, especially for visually impaired people. This work builds on the thematic study "Blind and visually impaired people, human-computer interaction and access to graphics", which represents current research towards solutions for impaired people and brings together researchers and practitioners. Here, we approach one of the methods which can be useful for visually impaired people: recognising clothing patterns. Choosing clothing patterns is one of the challenging tasks for visually impaired people. We developed a camera-based model to detect clothing patterns. Clothing patterns are categorized into five types (plaid, striped, patternless, horizontal-vertical and irregular) and the system identifies 11 clothing colors. The system integrates a microphone, camera, Bluetooth and earpiece, and its output is given as an audio signal. To recognize clothing patterns, we propose the Hough line transform for pattern detection and Canny detection for detecting edges in the clothing pattern. We evaluate our method on the CCNY Clothing Pattern dataset and other pattern datasets, and comparison with other approaches is under study. In this project we use the OpenCV library to capture the images. Such a system would support more independence in a blind person's daily life.
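As a rough illustration of the edge/line step described above, the sketch below runs OpenCV's Canny detector followed by the probabilistic Hough transform and collects line orientations; the image file and all thresholds are illustrative, and the mapping from line statistics to the five pattern classes is only hinted at in the comments.

```python
import cv2
import numpy as np

img = cv2.imread("clothing.jpg", cv2.IMREAD_GRAYSCALE)    # hypothetical input image
edges = cv2.Canny(img, 50, 150)                            # edge map of the fabric
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                        minLineLength=40, maxLineGap=5)

angles = []
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        angles.append(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
# Many lines clustered near 0 and 90 degrees suggest a plaid/checked pattern,
# a single dominant direction suggests stripes, and few lines suggest patternless.
print(len(angles), "line segments detected")
```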