Open Source Dashboard Technology and IEDSS for Higher Education Institutions
Hemant Sharma
Open Source Dashboard Technology plays an imperative role in integrating an IEDSS (Intelligent Decision Support System) into higher education institutions for revolutionized decision making. Because of modern factors, decision making has become complex, requiring knowledge of business, data, and data representation; hence an intelligent computerized system is required. “An IEDSS is an intelligent information system that reduces the time in which decisions are made in an environmental domain and improves the consistency and quality of those decisions [Haagsma & Johanns, 1994]”. Decision-makers try to apply new strategies and use new tools to convert data into useful information that would contribute to managerial problem-solving. Decision makers require reports that are accurate, timely, and which give the whole “business picture”. Dashboard Technology and Intelligent Decision Support Systems (IEDSS) come to the rescue of decision makers. This paper proposes the integration of Open Source Dashboard Technology and IEDSS to improve the quality and efficiency of higher education systems. A dashboard enables executives to measure, monitor and manage organizational performance more effectively. If Dashboard Technology and IEDSS are applied to higher education processes, they would help to improve students’ performance and life cycle management, selection of prospective students, course design, selection of courses and measurement of their retention rates, infrastructure development, and the grant fund management of an institution. This is an approach to examine the effect of using open source dashboard technology and IEDSS for Higher Education Institutions in strategic planning. Finally, an IEDSS system for higher education planning is introduced to illustrate and evaluate the use of dashboard technologies as intelligent decision-making tools.
In this paper, the task of designing and implementing a digital pre-distortion technique has been divided into two parts: a behavioural model of the power amplifier has been developed, and a digital predistorter module has been developed. AM-AM characteristics for the proposed model are presented, and power spectral density (PSD) has been considered as the major parameter to check the performance and modelling accuracy. Parameter estimation and automatic parameter adjustment are key issues in adaptive digital predistorter design.
The Comparison of Various Decision Tree Algorithms for Data Analysis
Kiran Singh*, Raunak Sulekh**
The main objective of this paper is to compare classification algorithms for decision trees in data analysis. The classification problem is an important task in data mining, because today’s databases are rich with hidden information that can be used for making intelligent business decisions. To extract that information, classification is a form of data analysis that can be used to build models describing important data classes or to predict future data trends. Several classification techniques have been proposed over the years, e.g., neural networks, genetic algorithms, the naive Bayesian approach, decision trees, and the nearest-neighbour method. In this paper, our attention is restricted to the decision tree technique after considering its advantages compared to other techniques. There exist a large number of algorithms for inducing decision trees, like CHAID, FACT, C4.5 and CART, but in this paper five decision tree classification algorithms are considered – ID3, SLIQ, SPRINT, PUBLIC and RAINFOREST.
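For context, a hedged recap of the split criterion at the heart of ID3 (standard textbook definitions, not taken from this paper): at each node the attribute maximizing information gain is chosen,

\mathrm{Entropy}(S) = -\sum_{c} p_c \log_2 p_c, \qquad \mathrm{Gain}(S,A) = \mathrm{Entropy}(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\,\mathrm{Entropy}(S_v),

where S is the training set at the node, p_c the fraction of examples in class c, and S_v the subset of S taking value v on attribute A. The other algorithms in the list differ mainly in how they scale or restructure this computation: SLIQ and SPRINT use pre-sorted attribute lists to handle disk-resident data, PUBLIC integrates pruning into tree growing, and RAINFOREST is a framework for scaling split evaluation to very large databases.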
Watermark Image Enhancement from various Attacks by SWT Using Hybrid Meta-heuristics
Vikas Sharma, P.S Mann
Digital watermarking enables one to protect a document; it is a kind of material authentication. A major problem in hypermedia technology is attacks on digital watermarking. In digital watermarking, a single attack on a given watermarked image can be handled effectively, but handling multiple attacks on a given watermarked image, as well as watermark scrambling, needs to be improved. This paper proposes a new watermarking technique using an integrated approach of SWT with GA and PSO for watermark scrambling. The proposed methodology enhances imperceptibility and robustness of the watermarked image, which results in improving the visual quality of the watermark.
Housing Society Management, Authentication and Security App
Durga Bhavani.A, Aparna Cleetus, Jennifer V Xavier, Chaitra B.S, Priyanka.R
The standard of living in urban areas such as towns and cities has increased so much that most people are living in a housing society. When a person moves into a new housing society, he is unaware of the various facilities provided by the society and of his surroundings. Also, a person who is already a resident of the society may have issues concerning whom to contact for various repair works, how to report grievances, security issues, etc. There is no proper platform for the resident to communicate with the admin. The Housing Society Management, Security and Authentication app provides the solution by offering a platform to share issues between residents and the society managers, which leads to rapid issue resolution and fewer misunderstandings. In the current scenario there is no proper alert mechanism to inform residents in case of emergencies such as gas leakage. Keeping track of people visiting the housing society is also a herculean task. Our project resolves these issues by making use of gas sensors and RFID technology.
It is important to understand the architecture of nonlinear systems and the components that are responsible for introducing nonlinear distortion. In order to evaluate the effect of nonlinear distortion on system performance, modeling and simulation are requirements of the present scenario of modern wireless communication systems. We have shown that orthogonalization of the behavioral model is beneficial for better prediction of nonlinear distortion, as it extracts the uncorrelated component of the nonlinear output that is responsible for the degradation of system performance. Orthogonalization of the power series nonlinear model is performed.
Efficient Energy Management System for Smart Grid Using Bacterial Foraging Optimization Technique
Dr.R.Vijay, M.Nithya
Distributed devices in smart grid systems are decentralized and connected to the power grid through different types of transmission equipment. This produces numerous energy losses when power flows from one bus to another. One of the most efficient approaches to reduce energy losses is to integrate renewable energy resources as Distributed Generation (DG). The uncertainty of DG may cause instability issues. A major issue is congestion in the power grid due to sudden power consumption by customers, which affects efficient energy delivery. Energy management with DG regulation is one of the most efficient solutions to these instability issues. For the considered power system with DGs and consumers, a Locational Marginal Pricing (LMP) based unified Energy Management System (uEMS) model is considered. This model increases the profit benefits for DGs and increases the stability of the Distributed Energy System (DES). In this paper, Bacterial Foraging Optimization (BFO) is employed to reduce losses based on the Loss Reduction Allocation (LRA) method. Using the LRA method, the energy loss reduction is calculated; the model accurately rewards DG contribution and offers a good competitive market. Moreover, the total DG profit is increased by the BFO technique. The IEEE 37-bus feeder system is considered to validate the proposed uEMS model and to increase DG system stability. Furthermore, this implementation gives an idea of formulating an efficient energy management system for a future Indian scenario.
In this paper, we present the ergodic capacity of free space optical communication systems over Gamma-Gamma atmospheric turbulence fading channels with perfect channel state information for the developed system. The intensity fluctuations of the received optical signal are modeled by Gamma-Gamma atmospheric turbulence. A closed-form expression for the average capacity of heterodyne differential phase shift keying for free-space optical (FSO) communication systems over the Gamma-Gamma turbulence channel is derived.
In free space optical communication the signal power changes with the bandwidth; therefore PSD curves are very important in link design. The power spectral density (PSD) of an intensity modulation scheme, Digital Pulse Interval Modulation (DPIM), has been presented. Results generated by simulations of an experimental system are presented, which show that DPIM has a higher transmission capacity compared with digital pulse position modulation (DPPM) and is less complex to implement.
Design, Materials, Production, FEM and Experimental Analysis of an I.C. Engine Crankshaft – A Review
Ketan V. Karandikar* and Dr. Subim N. Khan
A crankshaft can be called the heart of any I.C. engine, since it is the first recipient of the power generated by the engine. Its main function is to convert the oscillating motion of the connecting rod into rotary motion of the flywheel. Despite having an intricate profile, crankshafts are mass produced. In this paper, the design considerations, finite element stress analysis and experimental stress analysis of the crankshaft are studied. The crankshafts of a mini truck, a car and a motorcycle were designed using standard design formulae, followed by the creation of 3-D models using suitable design software such as CATIA or PRO/E. Static and dynamic stress analysis was performed using ANSYS software to determine the values of total deformation and equivalent stress, based on which the areas susceptible to failure are identified. The analysis was carried out for materials commonly used for manufacturing a crankshaft, like cast iron and forged alloy steel. The work carried out by various researchers on design and FEM-based analysis of a crankshaft is reviewed in this paper. Aspects such as materials, manufacturing processes, causes of failure and experimental stress analysis are also reviewed.
Cloud computing has expanded its horizon at a large pace as commercial infrastructure in the IT industry, meeting vast requirements for computing resources. There are several issues, such as load balancing, virtual machine migration, automated service provisioning and algorithm complexity, demanding to be resolved, and each of these issues needs load balancing. Load balancing aims at distributing the dynamic workload between the nodes residing in the cloud, so that every computing resource is assigned work on an efficient and fair basis. Load balancing has become crucial for efficient performance in distributed environments. Cloud computing is an emerging technology demanding more services and better results; thus load balancing for the cloud is a very interesting and important research area. Many algorithms have been proposed to provide efficient techniques for assigning clients' requests to available cloud nodes. This paper studies cloud computing along with research challenges in load balancing. An efficient load balancing scheme ensures efficient resource utilization by providing resources to the cloud on demand from users. By implementing appropriate scheduling criteria, load balancing may prioritize users. The aim of this study is to survey various load balancing algorithms and address their challenges in a variety of cloud environments. This study provides a perspective view of the latest approaches in load balancing that will certainly help future researchers in this field.
Anti-Swing Control of an Overhead Crane by Using Genetic Algorithm Based LQR
Ümit Önen , Abdullah Çakan
Genetic algorithm based optimization of a linear quadratic regulator (LQR) controller designed for position and sway control of an overhead crane is presented in this study. The equations of motion of the two degrees of freedom (DOF) crane system are derived using the Lagrange formulation and presented as a state-space model. An LQR controller is designed using a trial and error method for position control and swing suppression of the crane system. Then, the parameters of the LQR controller are optimized by a genetic algorithm in order to obtain the best control results. Simulation studies are carried out on a nonlinear crane model created in the MATLAB/Simulink environment. The performance of the designed controller is evaluated through the simulation results and compared with a pre-designed classical PID controller.
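For orientation, a hedged sketch of what the genetic algorithm is tuning (standard LQR formulation; the crane's specific matrices come from the Lagrange-derived state-space model mentioned above): an LQR controller minimizes the quadratic cost

J = \int_0^\infty \left( x^\top Q\,x + u^\top R\,u \right) dt

over the linearized model \dot{x} = Ax + Bu, yielding the state-feedback law u = -Kx with K obtained from the associated Riccati equation. Trial and error picks the weighting matrices Q and R by hand; the genetic algorithm instead searches the Q, R space against a simulation-based fitness (e.g. settling time and sway amplitude, an assumption here), which is where the reported improvement over the hand-tuned design comes from.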
Segmentation of brain MRI – A tri-level PSO based approach
Dhanya G S , Dr. R. Joshua Samuel Raj , Sam Silva A
Brain MRI plays a very important role for radiologists in diagnosing and treating various brain diseases. One of the most important applications of graph partitioning is image segmentation. Various graph-based methods for image segmentation have been developed, but they produce unbalanced partitions and the underlying problem is NP-complete. To address such limitations, we have investigated and developed a swarm intelligence based approach in which Tri Level Particle Swarm Optimization (TLPSO) is applied to partition the graph obtained from the image to be segmented. The proposed classification method includes the stages of conversion, implementation, selection and extraction. To check the performance of the proposed algorithm, we carried out quantitative as well as qualitative evaluation. Segmentation by graph partitioning, in which PSO is combined with three levels, helps to reduce partitioning imbalance and considers local as well as global features for segmentation. This method generates better segmentation quality, and the time to convergence is reduced by a considerable extent.
Predicting and Diagnosing of Heart Disease Using Machine Learning Algorithms
Sanjay Kumar Sen
Prediction and diagnosis of heart disease have become a challenge faced by doctors and hospitals both in India and abroad. In order to reduce the large number of deaths from heart disease, a quick and efficient detection technique needs to be discovered. Data mining techniques and machine learning algorithms play a very important role in this area. Researchers are accelerating their research to develop software, with the help of machine learning algorithms, that can help doctors take decisions regarding both the prediction and the diagnosis of heart disease. The main objective of this research paper is to predict the heart disease of a patient using machine learning algorithms. A comparative study of the performance of the various machine learning algorithms is done through graphical representation of the results.
Image Denoising Technique Using Trimmed Based Median Bilateral Filtering Method
Dr.Sandeep Kumar, Divya Kapil
Most image processing techniques, such as edge detection, segmentation, object tracking and pattern recognition, do not perform well in the presence of noise. Thus, image restoration is performed as a preprocessing step before applying the image to any of the above-mentioned techniques. This work presents a decision-based unsymmetric trimmed median filter algorithm for the removal of impulse noise, applied to color images rather than gray-scale images by separating the red, green and blue planes of the color image. The proposed algorithm shows better results than the Standard Median Filter (MF), Decision Based Algorithm (DBA), Adaptive Median Filter (AMF), Hybrid Filter Based Algorithm (HMF), and Trimmed Median Filter (TMF). The performance of the system is analyzed in terms of Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), Image Enhancement Factor (IEF) and the time required for executing the algorithms at different noise densities. Simulation results show that the proposed algorithm outperforms the existing algorithms even at high noise densities for color images. Many experiments were conducted to validate the efficiency of the proposed algorithm.
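As a concrete sketch of the family of filters being compared, the following is a minimal single-plane, decision-based unsymmetric trimmed median filter for salt-and-pepper noise. The 3x3 window, the 0/255 impulse test and the mean fallback are common choices from the literature, assumed here rather than taken from the paper; for a color image it would be applied to each of the R, G and B planes separately, as described above.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class TrimmedMedianFilter {

    // Salt-and-pepper impulses take the extreme gray levels (assumption).
    static boolean isImpulse(int v) { return v == 0 || v == 255; }

    public static int[][] filter(int[][] img) {
        int h = img.length, w = img[0].length;
        int[][] out = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (!isImpulse(img[y][x])) {       // decision step: clean pixels pass through
                    out[y][x] = img[y][x];
                    continue;
                }
                List<Integer> trimmed = new ArrayList<>();
                int sum = 0, count = 0;
                for (int dy = -1; dy <= 1; dy++) { // 3x3 window around the noisy pixel
                    for (int dx = -1; dx <= 1; dx++) {
                        int yy = y + dy, xx = x + dx;
                        if (yy < 0 || yy >= h || xx < 0 || xx >= w) continue;
                        sum += img[yy][xx];
                        count++;
                        if (!isImpulse(img[yy][xx])) trimmed.add(img[yy][xx]); // trim impulses
                    }
                }
                if (trimmed.isEmpty()) {
                    out[y][x] = sum / count;       // all neighbours noisy: fall back to the mean
                } else {
                    Collections.sort(trimmed);     // median of the trimmed (unsymmetric) window
                    out[y][x] = trimmed.get(trimmed.size() / 2);
                }
            }
        }
        return out;
    }
}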
Cook's Distance and Mahalanobis Distance Outlier Detection Methods to identify Review Spam
Siddu P. Algur , Jyoti G. Biradar
In the era of Web 2.0, huge volumes of consumer reviews are posted to the internet every day. As access to the Internet has become so much easier, more people are using online applications than ever. Online marketing, and in fact the whole of e-commerce, is growing enormously day by day, if not minute by minute. Online reviews play a very important role in this field and have proved auspicious for decision making from a customer's point of view. Customers use the reviews for judging the quality of products before purchasing them. Companies or vendors use opinions to take decisions to improve their sales in line with intelligent moves made by their competitors. However, not all reviews given by customers or users are true reviews. Manual approaches to detecting and analyzing fake reviews are not practical due to the problem of information overload. The design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews. As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. The main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. Hence, a novel approach using distance-based outlier detection methods, namely Cook's distance and Mahalanobis distance, is used to identify spam reviews.
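For reference, the two outlier measures named above have standard definitions; the notation below is generic rather than taken from the paper. For a review represented as a feature vector x, with sample mean \mu and covariance matrix \Sigma over all reviews,

D_M(x) = \sqrt{(x - \mu)^\top \Sigma^{-1} (x - \mu)},

and Cook's distance for observation i in a regression with p parameters and residual variance estimate s^2 is

D_i = \frac{\sum_{j=1}^{n} (\hat{y}_j - \hat{y}_{j(i)})^2}{p\,s^2},

which measures how much all fitted values change when observation i is deleted. An unusually large D_M(x) or D_i flags a review as a candidate outlier, i.e. potential spam.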
Symmetric Stream Cipher Based On Chebyshev Polynomial
D. Sravana Kumar , P. Sirisha , CH.Suneetha
The rapid development of information technology has turned the internet into the basic means and a wide choice for communications. Due to the extensive adoption of the internet for communications, it is essential these days to conceal the message from the unintended reader. The present paper describes a new encryption algorithm using Chebyshev polynomials of the first kind. The plain text is encrypted in three rounds, each round consisting of two stages, with concatenation of the previous cipher character with different keys. To make the algorithm more secure, the key for the first round of encryption is generated from the main key (agreed upon by the sender and the receiver), and the subsequent round keys are concatenated with the previous round keys. The stream cipher proposed here has several advantages over conventional cryptosystems.
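As background, a hedged note on the mathematical object the cipher builds on; the recurrence and composition identity are standard facts about Chebyshev polynomials of the first kind, while how the paper maps them to round keys is only summarized above:

T_0(x) = 1, \quad T_1(x) = x, \quad T_n(x) = 2x\,T_{n-1}(x) - T_{n-2}(x),

together with the semigroup property T_m(T_n(x)) = T_{mn}(x) = T_n(T_m(x)). This property is what makes Chebyshev polynomials attractive in cryptography: two parties applying secret indices m and n in either order arrive at the same value, supporting key agreement, and iterated evaluation behaves like a chaotic map suitable for keystream generation.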
Optimization of Image Compression algorithms using DWT- DCT method
Nikhilesh Joshi , Dr Tanuja K Sarode
Information is generated at very high speed in today's digital era. Images have a vast share of the information generated, so it is important to reduce image file size for storage and effective communication. Image compression techniques help in reducing the storage and bandwidth requirements for transmission. Optimization of image compression algorithms is the basic crux of this paper. DCT is the most popularly used lossy image compression technique; its disadvantage is that at higher compression ratios the quality of the image is lost. DCT is applied to a set of images, followed by DWT. The resulting compression metrics are calculated and the visual quality of the images is analyzed. Experimental analysis shows a reduction in storage space requirements and effective optimization using the different methodologies.
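For reference, the transform at the heart of the lossy stage is the 2-D DCT-II, commonly applied blockwise (e.g. 8x8, as in JPEG; the paper's block size is not stated):

X(u,v) = \alpha(u)\,\alpha(v) \sum_{m=0}^{N-1} \sum_{n=0}^{N-1} x(m,n) \cos\!\left[\frac{(2m+1)u\pi}{2N}\right] \cos\!\left[\frac{(2n+1)v\pi}{2N}\right],

with \alpha(0) = \sqrt{1/N} and \alpha(k) = \sqrt{2/N} for k > 0. Compression comes from quantizing the high-frequency coefficients X(u,v) coarsely, which is also where the quality loss at high compression ratios originates and why combining DCT with DWT is explored above.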
Mining High Utility Patterns in One Phase without Generating Candidates
Rashmi Reddy M, Kavitha Juilet
Utility mining is a new development of data mining technology. Among utility mining problems, utility mining with the itemset share framework is a hard one, as no anti-monotonicity property holds for the interestingness measure. Prior works on this problem all employ a two-phase, candidate generation approach, with one exception that is however inefficient and not scalable to large databases. The two-phase approach suffers from a scalability issue due to the huge number of candidates. This paper proposes a novel algorithm that finds high utility patterns in a single phase without generating candidates. The novelties lie in a high utility pattern growth approach, a lookahead strategy, and a linear data structure. Concretely, our pattern growth approach is to search a reverse set enumeration tree and to prune the search space by utility upper bounding. We also look ahead to identify high utility patterns without enumeration, using a closure property and a singleton property. Our linear data structure enables us to compute a tight bound for powerful pruning and to directly identify high utility patterns in an efficient and scalable way, which targets the root cause of the problems with prior algorithms. Extensive experiments on sparse and dense, synthetic and real-world data suggest that our algorithm is up to one to three orders of magnitude more efficient and is more scalable than the state-of-the-art algorithms.
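A hedged recap of the itemset share framework the paper works in (standard definitions; the symbols are generic):

u(i,T) = q(i,T) \times p(i), \qquad u(X,T) = \sum_{i \in X} u(i,T), \qquad u(X) = \sum_{T \supseteq X} u(X,T),

where q(i,T) is the quantity of item i in transaction T (internal utility) and p(i) is its unit profit (external utility); X is a high utility pattern iff u(X) \ge minutil. Because adding an item to X can increase u(X), the measure is not anti-monotone, which is exactly why candidate pruning is hard and why the tight utility upper bound computed by the proposed data structure matters.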
Using predictive modelling for forecasting blood donor response
Tejaswa Chaudhary, Gulzar Ahmad, Vikramjeet Mallick, Thirunavukkarasu K.
Blood and blood products are essential for the medical treatment of all age groups. The primary source of blood products in the world is volunteer donors; thus, donor recruitment and donor retention are vital factors for a blood bank to maintain its supply. We propose that developing a better understanding of donors' motivations to donate and personalizing their donation objectives would improve a blood bank's ability to secure a more robust supply of blood. The research paper puts forward a solution to predict whether a specific donor will donate blood in the coming month or not. This will help blood banks to forecast their stock of the various types of blood and prepare accordingly.
Increasing network lifetime by energy-efficient routing scheme for OLSR protocol
Reema D, J Nagesh Babu
One of the main considerations in designing routing protocols for Mobile Ad-Hoc Networks (MANET) is to increase network lifetime by minimizing nodes' energy consumption, since nodes are typically battery powered. Many proposals have addressed this problem; however, few papers consider a proactive protocol like the Optimized Link State Routing Protocol (OLSR) to better manage energy consumption. Some of them have explored modifications to the MPR selection mechanism, whereas others have investigated multiple cross-layer parameters to increase the network lifetime. In this paper, we explore both modification of MPR selection and integration of appropriate routing metrics into the routing decision scheme to lessen the effects of the factors that lead to greater energy consumption. Our power-aware version of OLSR is validated by simulations in NS3 under a range of different mobile scenarios. Significant performance gains of 20% are obtained in network lifetime for our modified OLSR, with little to no performance gain in terms of Packet Delivery Ratio (PDR).
Mobile communication is a wireless connection between two nodes which has limited bandwidth and a high rate of data disconnection. So there is a requirement for a new MANET routing protocol with low message overhead; hence it is necessary to enhance the performance of MANET. The reduction of routing overhead decreases the usage of the bandwidth of the network, and the increase in available bandwidth will increase the throughput and decrease the latency of the nodes of the network. This paper proposes a new routing protocol which will increase the throughput and reliability and decrease the latency of the network.

1. Introduction: The mobile ad hoc network [1] does not have any fixed infrastructure or base station, which is why the name is used. There are many applications for ad hoc networks, like conferencing, emergency services, personal area networks, embedded computing, and sensor dust. A MANET is a network that allows direct communication between two nodes when radio propagation conditions exist between them. If there is no direct connection between the source and the sink nodes, multi-hop routing is used: a packet is forwarded from one node to another until it reaches the destination. A routing protocol is necessary in ad hoc networks; this routing protocol has to adapt quickly to the frequent changes in the ad hoc network topology. Ad hoc routing protocols are classified into three categories. The first category is table-driven (proactive) routing protocols such as DSDV [2], CGSR [3], GSR [4], FSR [5], and OLSR [6]. The second category is on-demand (reactive) routing protocols such as AODV [7], DSR [8], ABR [9], SSA [10], and TORA [11]. The third category is hybrid (reactive and proactive) routing protocols such as ZRP [12] and ZHLS [13].
Climate induced variation in forest fire using Remote Sensing and GIS in Bilaspur District of Himachal Pradesh
Shruti Kanga*, Sumit Kumar$ and Suraj Kumar Singh
Himachal Pradesh forests have frequent incidents of forest fire, especially in the months of March to June, which cause loss of forest wealth, biodiversity, ecology and environment. There is a scarcity of broad and efficient work identifying its fire-sensitive forest regions. Hence, the current study was carried out in the Bilaspur district of the Indian Western Himalaya to identify forest fire risk by overlaying weighted thematic layers such as elevation, slope, aspect, mean annual temperature, humidity, wind speed, accessibility to habitation and settlement, and a fuel map of the region. In the present study, climate induced variation in forest fire in Bilaspur district of Himachal Pradesh was estimated. LISS 3 and ASTER DEM data have been used for the forest fire risk map, forest density map, land use/land cover, extraction of DEM features and other spatial objects, in which the topography of the area represents the forest type and the various land forms of the study area. The fire-sensitive regions were further prioritized into very high, high, medium, low, and very low forest fire-prone areas. The Pinus roxburghii forest type, low elevation, high temperature, high slope, south-west facing aspect, the month of May and anthropogenic disturbances were identified as the major factors responsible for forest fire in the region. About half of the forest cover was identified as fire sensitive.
A Study on Development of Online Marketing Information System for Agricultural Sector of KSA
Dr. Syed Khizer
The agriculture sector is a contributor to GDP and foreign exchange for a nation. It generates labor and capital and fulfills domestic demand to support growth in other sectors as well, and it plays a key role in ensuring national food security. Access to agricultural marketing information is an essential factor in promoting competitive markets, globalization, efficient marketing, market liberalization and development of the agricultural sector. The majority of the stakeholders of the said sector are deprived of agricultural marketing information. The stakeholders of the agriculture sector of Saudi Arabia are in need of an Agricultural Marketing Information System (AMIS). In this paper we propose an AMIS for the Kingdom of Saudi Arabia. In summary, the said system is aimed at providing agricultural marketing information to all the stakeholders of the Saudi agriculture sector.
In the past few decades, the Internet and digital technology have grown rapidly. The increasing and rapid advancement of the Internet has made it extremely easy to send multimedia data accurately and quickly to its destination. It has provided various advantages like easy sharing of digital images, copying of digital images without quality degradation, and editing of digital images. Also, due to the increasing reach of the Internet, multimedia data tends to be duplicated and modified, which makes multimedia security an extreme concern. Modification and misuse of valuable data are very common, and thus delivering multimedia data safely to the intended recipient has become more important. The problems that arise include privacy, corruption or processing of the image, and counterfeiting. This paper throws light on information hiding in order to protect original data from illegal duplication, distribution and manipulation through "Digital Image Watermarking". Watermarking is an art of hiding information in another file, which could be video, audio, text or an image. The paper covers the various types of watermarking and the different watermarking techniques.
Flexible Approach for Data Mining using Grid based Computing Concepts
Abdul Ahad, Dr.Y.Suresh Babu
Nowadays, in the fields of life sciences and business, knowledge discovery has become a common task, both because of the growing amount of data being gathered and because of the complexity of the analysis that needs to be performed on it, as well as because of some unique characteristics of today's data sources, such as their heterogeneity, high dimensionality, distributed nature and large volume. The distribution of data and computation, the increasing trend towards decentralized business organizations, and the distribution of users, software, and hardware systems magnify the need for more advanced and flexible approaches and solutions. Here we present the state of the art of the major data mining techniques, systems and approaches. This paper discusses how distributed and Grid computing can be used to support distributed data mining. In particular, a distinction is made between distributed and Grid-based data mining methods.
Design and Manufacturing of Pneumatic Vehicle for Industrial Purpose
Karan R. Jagtap, Swapnil A. Patil , Shripad P. Raje , Sachin R. Pawar , Prof. Vinay J. Raje
Compressed air as a source of energy in different uses in general, and as a non-polluting fuel in compressed air vehicles, has attracted scientists and engineers for centuries. Efforts are being made by many developers and manufacturers to master compressed air vehicle technology in all respects for its earliest use by mankind. Nowadays, almost every industry is trying to develop light and efficient vehicles. Today, all the vehicles running on conventional and non-conventional fuels are known for producing a large amount of harmful gases like CO2, SO2, NO2, etc., which increase global warming. The motto of our project is to design and fabricate a vehicle running on air pressure for material handling in industries and to reduce power consumption. It is rear wheel drive. We developed the concept of the pneumatic vehicle from a pedal operated tricycle; the vehicle looks like a three wheeler in which manual operation is replaced by compressed air pressure. The following report gives a brief description of how a compressed air vehicle using this technology was made. While developing this vehicle, control of compressed air parameters like temperature, energy density, required input power, energy release and emission has to be mastered for the development of a safe, light and cost effective compressed air vehicle in the near future.
Secure Web Authentication Using OPASS to Prevent Secret Key Stealing
R. Nithyanandhan, Dr.G.Umarani Srikanth, Mr.S.Muthukumarasamy
Text passwords have been adopted as the primary means of user authentication on websites. Humans are not experts in memorizing them; therefore they rely on weak passwords. As these are static passwords, adversaries can launch attacks to steal them, and they suffer from several security drawbacks: phishing, keyloggers and malware. This problem can be overcome by a protocol named oPass, which leverages a user's cellphone and SMS to thwart password stealing. oPass greatly reduces man-in-the-middle attacks. In case users lose their cellphones, oPass still works by reissuing the SIM card and long-term password. This is an efficient user authentication protocol and is available at affordable cost.
The first thing that a visitor sees on a website is the content present on its web pages. It is generally difficult for many organisations to keep their website content as up to date as they require, and most of the time there are delays in getting new content online. The large volume of content, together with an unwillingness to deal with coding and other technicalities in the present-day context, necessitates the development and deployment of Content Management Systems. This research paper aims to justify why so many organisations are turning to CMS for their website development.
Usage of Brain Tumor Segmentation in MRI Images Using Intelligent Water Drops Algorithm
Parmeet Kaur, Harish Kundra
Image processing is a technique which processes an input image and generates the required results. In this work, a technique is proposed which improves the intelligent water drops algorithm to detect brain tumors. MRI images are taken as input and given to the IWD algorithm for tumor detection. The proposed improvement is based on an SVM classifier: the output of the IWD algorithm is given as input to the SVM classifier, which classifies the cancer and non-cancer cells in the MRI images. The proposed and existing techniques have been implemented in MATLAB, and it is observed that the accuracy of detection is increased by up to 20 percent and the execution time is reduced to 1.5 seconds.
A Proposed Method for Semantic Annotation on Social Media Images
Mrs. Sayantani Ghosh, Prof. Samir K. Bandyopadhyay
Semantic annotation attaches additional information to various concepts (e.g. people, things, places, organizations) found in content. A semantically annotated document is easy for computers to interpret, combine and reuse. The Web is the greatest information source in human history. Many researchers believe semantic annotations can be inserted into web-based documents for information extraction and knowledge mining. These annotations use terms defined in an ontology. This paper proposes an annotation process based on textual patterns for improving annotation.
Exploring The Emerging Role Of Data Mining In Retail Forecasting: Need For The Retailers
Gyaneshwar Mahto, Dr. Umesh Prasad, Dr. Rajiv Kumar Dwivedi, Santosh Kumar Srivastava
In the modern age, retail marketing is growing very fast, but as fast as it grows, different challenges arise for retailers. The arrival, growth and spread of Information Technology and related tools have revolutionized the way modern businesses are conducted at the global as well as national levels. For retailers, optimal and precise decisions are important in the given dynamic and competitive business environment. Further, forecasting decisions assume crucial significance across all areas of business, with retail being no exception, and forecasting decisions in the context of retail are associated with their own share of challenges and issues. The present working paper seeks to explore the emerging role of data mining in retail forecasting, as needed by retailers for better business insight. This work further explores the possible applications of other related IT technologies and also discusses the related issues and trends, so that the retailer needs to invest less and may be able to earn more.
A Survival Study on Pattern Recognition for Ischemic Stroke Detection on CT Images
B. Sowmya , Dr. P.R.Tamiselvi
Stroke is caused by a cerebrovascular accident which prevents the vessels from supplying blood to the brain; it occurs due to the burst or blockage of a blood vessel. For efficient diagnosis of ischemic stroke, Computed Tomography (CT) images are used with life support devices. Segmentation is one of the methods through which the ischemic stroke region is differentiated from healthy tissues in CT images. However, accurate segmentation of original CT images is not obtained in minimum response time, and pattern recognition is not carried out. Our research aims to perform early detection using segmentation and to extract the relevant features, through an optimality criterion, for classifying ischemic stroke CT images into stroke and non-stroke images. In addition, pattern recognition is carried out to differentiate between stroke and non-stroke models.
Notice App: An android application to understand and identify students’ perspective
Mr. Lucky P. Bajaj , Mr. Siddharth A. Kokalki, Mr. Abhilash.R.Ghagare
In older days, college notices were displayed on wooden notice boards mounted at different locations in the college building [1]. Generally, students are not aware of notices which are very time bound, and they are not in the habit of reading notices regularly. As technological evolution happens very rapidly, there is a need to keep up with technology. Smartphones and Android phones are nowadays easy to use and everybody is aware of their handling. In this paper, we propose an Android application to display different kinds of notices organized into various categories such as department, TPO, office, sports, cultural, hostel, etc., so that students will get all college notices anytime and anywhere. This application mainly focuses on updating and deleting notices and on specific feedback generation using association rules on a student dataset. We try to find the students' perspective by analyzing a specific feedback questionnaire.
Deterministic Routing Algorithm for Shared Memory Processing (SMP)
Soha S. Zaghloul, PhD, Ashwag Homod Alotebi, Noura Saleh AL Maghrabi, Asma Abdurahman Almulifi, Hend Ibrahim Alshaya
Parallel computing involves the simultaneous deployment of various resources and computers to solve computational problems using multiple processors. The most common parallel computing model is the shared memory processing (SMP) model, in which a number of identical processors communicate with each other by using one large logical shared memory with the same access time over the entire memory area. Parallel programming performance is affected by the run time, which is in turn affected by the number of processors and the size of the problem. Therefore, this paper presents an application of XY deterministic routing in an SMP system based on a 4x4 2D mesh topology network with two cores and two threads per core. The sequential run time and the related parallel run time for each thread were measured. The sequential time that must be achieved to compensate for the warm-up overhead time consumed by the Java Virtual Machine (JVM) was at least 60 seconds. The results revealed that the achieved sequential time was 69.057 seconds, whereas the achieved parallel times for threads 1 and 2 of the first core and threads 1 and 2 of the second core were 70.066, 68.112, 44.869, and 42.412 seconds, respectively. On the basis of the degree of parallelism, the parallel programming performance was evaluated in terms of speedup and efficiency. The performance evaluation results demonstrated that an increase in the degree of parallelism results in higher speedup and decreased efficiency.
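For reference, the two measures used in the evaluation have the standard definitions

S(p) = \frac{T_{seq}}{T_{par}(p)}, \qquad E(p) = \frac{S(p)}{p},

where T_{seq} is the sequential run time and T_{par}(p) the parallel run time on p processing elements. As a worked illustration with the figures above, the sequential time of 69.057 s against the fastest thread time of 42.412 s gives a per-thread ratio of 69.057 / 42.412 ≈ 1.63; and since E(p) falls whenever S(p) grows more slowly than p, the reported combination of rising speedup and falling efficiency is the expected pattern.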
Analytical and Experimental Performance Evaluation of Parallel Merge sort on Multicore System
Soha S. Zaghloul,PhD, Laila M. AlShehri ,Maram F. AlJouie, Nojood E. AlEissa , Nourah A. AlMogheerah
Parallel programming has evolved due to the availability of fast and inexpensive processors. This technique allows us to determine which portions of an algorithm may be executed simultaneously by exploiting different processors. Recently, the focus has shifted to implementing parallel algorithms on multicore systems in order to increase performance. One of the most common operations performed by computers is sorting, which is a permutation of elements. Merge sort is an effective divide-and-conquer sorting algorithm that is easy to understand relative to other sorting strategies. The aim of this paper is to describe and evaluate the performance of the parallel merge sort algorithm against its sequential version using the Java threading application program interface (API) environment, which allows programmers to directly manipulate threads in Java programs. The main idea of the proposed algorithm is to distribute the input data elements into several sub-arrays according to the number of threads at each level. The experiments were conducted on a multi-core processor and examined the running time, speedup, efficiency, and scalability. The experiments were also run against different array sizes and different numbers of processors. The experimental results on a multi-core processor show that the proposed parallel algorithm achieves good performance compared to the sequential algorithm.
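To make the threading idea concrete, here is a minimal sketch of a depth-limited parallel merge sort using raw Java threads. The class name, the splitting depth and the sequential fallback via Arrays.sort are illustrative assumptions, not the authors' code.

import java.util.Arrays;

public class ParallelMergeSort {

    // Sort a[lo, hi): spawn a thread for the left half until depth is exhausted.
    static void parallelSort(int[] a, int lo, int hi, int depth) throws InterruptedException {
        if (hi - lo <= 1) return;
        if (depth <= 0) {                        // leaf level: plain sequential sort
            Arrays.sort(a, lo, hi);
            return;
        }
        int mid = (lo + hi) / 2;
        Thread left = new Thread(() -> {
            try { parallelSort(a, lo, mid, depth - 1); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        left.start();
        parallelSort(a, mid, hi, depth - 1);     // right half in the current thread
        left.join();                             // wait for the left half to finish
        merge(a, lo, mid, hi);
    }

    // Standard merge of the sorted runs a[lo, mid) and a[mid, hi).
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo];
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j < hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) throws InterruptedException {
        int[] data = new java.util.Random(42).ints(1_000_000).toArray();
        long t0 = System.nanoTime();
        parallelSort(data, 0, data.length, 2);   // depth 2: up to four concurrent sections
        System.out.printf("time = %.3f s%n", (System.nanoTime() - t0) / 1e9);
    }
}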
Classification of Age variant CARPAL AREA BONES by Support vector machine with RBF Kernel
Simerjeet Kaur, Nirvair Neeru
The BoneXpert method reconstructs, from hand radiographs, the borders of 15 bones automatically and then computes "intrinsic" bone ages for each of 13 bones. It transforms the intrinsic bone ages into Tanner-Whitehouse or Greulich-Pyle bone age. The bone reconstruction method automatically rejects images with abnormal bone morphology or very poor image quality. From a methodological perspective, BoneXpert contains the following innovations: 1) a generative model for bone reconstruction; 2) prediction of bone age from shape, intensity and texture scores derived from principal component analysis; 3) the concept of consensus bone age, which defines each bone's age as the best estimate of the bone age of the other bones in the hand; 4) a common bone age model for females and males; and 5) unified Greulich-Pyle and Tanner-Whitehouse bone age modelling. BoneXpert was developed on 1559 images. Random Forest, examined here for predicting a compound's quantitative or categorical biological activity from a quantitative description of the compound's molecular structure, is an ensemble of unpruned classification or regression trees created by using bootstrap samples of the training data and random feature selection in tree induction. Prediction is made by aggregating (majority vote or averaging) the predictions of the ensemble. We built prediction models for six cheminformatics data sets. Our analysis demonstrates that Random Forest is a powerful tool capable of delivering performance that is among the most accurate methods to date. We also present three additional features of Random Forest: built-in performance assessment, a measure of the relative importance of descriptors, and a measure of compound similarity that is weighted by the relative importance of descriptors. In this work, different groups of images are used; geometric features are extracted, and principal component analysis is applied to the extracted features and classes in the classification method. Comparing the results, in our approach an SVM with RBF kernel plays the key role in the classification and prediction of images. In the experiments performed, the SVM with RBF kernel shows 78% accuracy, which is significantly different from other methodologies like linear regression and voted regression.
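For reference, the kernel named above is the standard radial basis function kernel,

K(\mathbf{x}, \mathbf{y}) = \exp\!\left(-\gamma\,\lVert \mathbf{x} - \mathbf{y} \rVert^{2}\right), \quad \gamma > 0,

which lets the SVM separate classes that are not linearly separable in the original geometric-feature space; \gamma and the SVM cost parameter C are tuning choices not specified in the abstract.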
Mitigating Denial of Service Attacks in OLSR Protocol Using Fictitious Nodes
Nagaashwini Nayak V J , Nagaveni B Biradar
With the main focus of research in routing protocols for Mobile Ad-Hoc Networks (MANET) geared towards routing efficiency, the resulting protocols tend to be vulnerable to various attacks. Over the years, emphasis has also been placed on improving the security of these networks. Different solutions have been proposed for different types of attacks; however, these solutions often compromise routing efficiency or network overload. One major DOS attack against the Optimized Link State Routing protocol (OLSR), known as the node isolation attack, occurs when topological knowledge of the network is exploited by an attacker who is able to isolate the victim from the rest of the network and subsequently deny communication services to the victim. In this paper, we suggest a novel solution to defend the OLSR protocol from the node isolation attack by employing the same tactics used by the attack itself. Through extensive experimentation, we demonstrate that 1) the proposed protection prevents more than 95 percent of attacks, and 2) the overhead required drastically decreases as the network size increases until it is indiscernible. Last, we suggest that this type of solution can be extended to other similar DOS attacks on OLSR.
Requirement engineering is an important phase of the software development life cycle. The objective of requirement engineering is to discover and collect the requirements from the client environment. The requirement engineer plays a critical role in examining and prioritizing the requirements based on their necessity and feasibility. Requirement engineering involves complex decisions about the requirements in the software development process, and selecting and prioritizing the proper requirements from among many is a critical task. The Analytical Hierarchy Process (AHP) is a multi-criteria decision making method that provides an effective quantitative approach for prioritizing requirements. The objective of this paper is to evaluate the prioritization of requirements based on feasibility using AHP.
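A hedged sketch of the AHP mechanics implied above (the standard formulation, not specific to this paper): requirements are compared pairwise on a ratio scale into a reciprocal matrix A with a_{ij} = 1/a_{ji}; the priority vector w is the principal eigenvector,

A\,w = \lambda_{\max}\,w, \qquad CI = \frac{\lambda_{\max} - n}{n - 1}, \qquad CR = \frac{CI}{RI},

where n is the number of requirements, CI the consistency index and RI the random index for matrices of order n. Judgments are conventionally accepted when CR \le 0.1, and the entries of w then give the requirement priorities.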
In this paper, we investigate some properties of the approximations determined by a class of equivalence relations, in Pawlak's single granulation point of view. A comparison of these approximations with the optimistic and the pessimistic multi-granular approximations is also presented. It has been observed that the accuracy measure and the precision of these approximations are greater than those of the two multi-granular approximations. The topology determined by them is found to be stronger than the topology determined by the pessimistic multi-granular approximations. Finally the results are verified through an example in the context of an information system.
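For readers unfamiliar with the notation, the single-granulation approximations referred to are Pawlak's (standard definitions):

\underline{R}X = \{x \in U : [x]_R \subseteq X\}, \qquad \overline{R}X = \{x \in U : [x]_R \cap X \neq \emptyset\}, \qquad \alpha_R(X) = \frac{|\underline{R}X|}{|\overline{R}X|},

where [x]_R is the equivalence class of x under relation R on universe U and \alpha_R(X) is the accuracy measure. The optimistic and pessimistic multi-granular approximations replace the single R with a family of relations, requiring the inclusion condition to hold for at least one relation or for all relations, respectively.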
A Survey on Secure Communication Protocols for IoT Systems
Khaja Moinuddin, Nalavadi Srikantha , Lokesh K S , Aswatha Narayana
The Internet of Things (IoT) integrates a large number of physical objects that are uniquely identified, ubiquitously interconnected and accessible through the Internet. IoT aims to transform any object in the real world into a computing device that has sensing, communication and control capabilities. There is a growing number of IoT devices and applications, and this leads to an increase in the number and complexity of malicious attacks. It is important to protect IoT systems against malicious attacks, especially to prevent attackers from obtaining control over the devices. A large number of security research solutions for IoT have been proposed in recent years, but most of them are not standardized or interoperable. In this paper, we investigate the security capabilities of existing protocols and networking stacks for IoT. We focus on solutions specified by well-known standardization bodies such as IEEE and IETF, and industry alliances such as the NFC Forum, ZigBee Alliance, Thread Group and LoRa Alliance.
Artificial Neural Network Model for predicting Maintainability Using MOOD and size metrics
Rajbinder Kaur, Mehak Aggarwal
One of the major challenges for the software industry today is to provide products with high degrees of quality and functionality. Maintainability is one such quality attribute, accounting for 40-70% of the total cost of a project. As technology advances, a number of metrics suites such as CK (Chidamber & Kemerer), MOOD (Metrics for Object Oriented Design) and the Lorenz and Kidd suite have been proposed to predict quality characteristics such as maintainability, reliability and usability. Currently, software development is mostly based on the object oriented paradigm. At the system level, there are patterns that represent the extent of use of encapsulation, inheritance, polymorphism or cooperation among classes, which are closely related to the quality characteristics. By finding those patterns, the developer of a project can say that a certain design is more maintainable than another. Most of the maintainability models proposed earlier are based on CK metrics. CK metrics are class based, but MOOD metrics are project based and represent all the basic mechanisms of the object oriented paradigm. The size of the project also plays a significant role in maintainability prediction; in particular, larger systems are harder to analyze and understand. Earlier, statistical models were proposed, but nowadays machine learning techniques such as Artificial Neural Networks (ANN), fuzzy and neuro-fuzzy approaches are used. An ANN is capable of modeling complex functions and has strong generalization ability. Hence, in this paper a MOOD and size metric (Lines of Code) based ANN model is proposed to predict the maintainability of a project early in the design phase.
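For context, two representative MOOD metrics as usually defined in the literature (a hedged recap; the paper's exact metric subset is not listed in the abstract):

MHF = \frac{\sum_{i=1}^{TC} M_h(C_i)}{\sum_{i=1}^{TC} M_d(C_i)}, \qquad MIF = \frac{\sum_{i=1}^{TC} M_i(C_i)}{\sum_{i=1}^{TC} M_a(C_i)},

where TC is the number of classes in the project, M_h(C_i) and M_d(C_i) are the hidden and defined methods of class C_i, M_i(C_i) its inherited methods, and M_a(C_i) = M_d(C_i) + M_i(C_i) its available methods. Being ratios over the whole system, MOOD metrics are project-level quantities, which is why they suit the system-level maintainability prediction described above, with Lines of Code added as the size input to the ANN.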
Privacy Preserving Healthcare System using secured cloud environment
Dr. M. Umashankar
Wireless devices used to measure physiological signals to help manage health have become increasingly popular in recent years, and the records they produce are conducive to follow-up health and medical care. The use of different types of wireless devices and the large number of users produce large amounts of data, which leads to security threats and privacy problems. With the help of wireless technology and cloud computing, we may solve the traditional problems that occur during data management and data transmission in health management systems. For this healthcare management application, we propose an integrated algorithmic technique for the jointly involved parties, which includes privacy and security algorithms and can also protect data integrity. The system provides better efficiency and is very useful for remote areas where hospitals are not easily accessible. This research resolves security-relevant issues in wireless sensor networks: using several patterns can reduce the database load and let users access data efficiently, while the privacy control mechanism allows users to store data securely. The results of this research show that the proposed system secures database access and maintains privacy for all patient data better than a traditional database.
A Cost-Efficient Multi-Cloud Data Hosting Scheme with High Availability
Mr. P Dayaker, Mr. Y Madan Reddy, Ms. K Ramya
Ever more enterprises and institutions are hosting their data in the cloud, with the aim of reducing IT support cost and improving data reliability. However, facing the various cloud vendors and their heterogeneous pricing policies, customers may well be confused about which cloud(s) are suitable for storing their data and which hosting strategy is cheaper. The general business-as-usual practice is that customers simply put their data into a single cloud and then just trust to good luck. Based on a comprehensive analysis of various state-of-the-art cloud vendors, this paper proposes a novel data hosting scheme which integrates the key functions desired. The first is selecting suitable clouds and an appropriate redundancy strategy to store data with minimized monetary cost and guaranteed availability. The second is triggering a transition process to re-distribute data according to variations in data access patterns and cloud pricing. We evaluate the performance of the scheme using both trace-driven simulations and prototype experiments. The results show that, compared with the major existing schemes, the proposed scheme saves around 20% of monetary cost and exhibits sound adaptability to data and price adjustments.
Paper on Searching and Indexing Using Elasticsearch
Darshita Kalyani, Dr. Devarshi Mehta
In today's era, it is inconceivable to use traditional techniques/RDBMS to analyse data that is growing so quickly. Big data offers a solution for analysing large amounts of data, and using a technique like Elasticsearch, access to data can be made faster. Elasticsearch is a search engine based on Lucene; it is a near-real-time search platform. Elasticsearch uses the concept of indexing to make search faster. This paper elaborates the search technique of Elasticsearch.
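As a minimal illustration of the indexing-then-searching workflow described above, the sketch below talks to Elasticsearch's REST API (a document PUT followed by a _search with a match query) from the plain Java 11 HttpClient. The index name articles, the document body and the local URL are assumptions for the example, and the endpoint paths follow the modern single-type (_doc) API.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EsSearchDemo {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Index one document; Elasticsearch analyzes and inverts its fields.
        HttpRequest index = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/articles/_doc/1"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(
                        "{\"title\": \"Searching and Indexing Using Elasticsearch\"}"))
                .build();
        client.send(index, HttpResponse.BodyHandlers.ofString());

        // Full-text match query against the inverted index (near real time).
        HttpRequest search = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/articles/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"query\": {\"match\": {\"title\": \"elasticsearch\"}}}"))
                .build();
        HttpResponse<String> hits = client.send(search, HttpResponse.BodyHandlers.ofString());
        System.out.println(hits.body());   // JSON hits with relevance scores
    }
}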
Selection of Appropriate Web Content Management System (WCMS)
Neeraj Rohilla
In today's scenario there are many good websites powered by easy-to-use web content management systems (WCMS) that allow making changes to a website without needing to touch a single line of code. Nowadays a substantial number of open source WCMS are available free to website developers. This research paper aims to make a comparison between the three most commonly used WCMS (WordPress, Drupal and Joomla) present in the web world today. This comparison will guide which CMS should be selected based on setup, plug-ins, themes, multilingual support, security, scalability, administration, role management, flexibility, installation time and ease of customization.
Dynamic Analysis of web system by using model-based testing and Process Crawler Model
Mrs. Nayan Mulla , Prof. Sachin B. Takmare , Prof Pramod A.Kharade
Modern business applications predominantly rely on web technology, enabling software vendors to efficiently provide them as a service, removing some of the complexity of the traditional release and update process. To increase web application accuracy and speed, a process crawler model is used. While service delivery encourages shorter, more productive and frequent release cycles, it requires continuous testing. Insight into application behaviour through explicit models can largely support development, testing and maintenance. Model-based testing permits effective test creation based on a description of the states the application can be in and the transitions between these states. As specifying behaviour models that are precise enough to be executable by a test automation tool is a hard task, an alternative is to extract them from running applications.
Power is one of the most critical components of infrastructure, crucial for the economic growth and welfare of nations, and the development of sufficient infrastructure is essential for sustained growth of the Indian economy. Electricity demand in the country has increased rapidly due to fast-growing industries and population growth, and solar power plays a major role in meeting the increasing demand. A 25 kW grid-connected photovoltaic (PV) power generation system has been installed and monitored at the Kamaraj College of Engineering and Technology in Virudhunagar since March 2014. In this project work, the various factors that affect the performance of solar PV are analyzed. The solar PV panels installed here are monocrystalline, polycrystalline and low-concentration PV modules, connected in series or parallel. In this PV plant, several modules have been damaged due to abnormal weather conditions while other panels are in good working condition. The solution to this problem is the rearrangement of the solar PV panels: by reconfiguration, the performance of the solar plant is improved, its reliability is improved, and the vulnerability to power loss through damage is reduced.
Slope Stability Prediction using Artificial Neural Network (ANN)
Arunav Chakraborty and Dr. Diganta Goswami
Artificial neural networks (ANN), usually called neural networks, are very sophisticated modeling techniques capable of modeling extremely complex functions. They are used for predicting an outcome from two or more independent variables. Predicting the stability of slopes is a very challenging task for geotechnical engineers, who have to pay particular attention to geology, ground water and the shear strength of the soils in assessing slope stability. In this paper, a prediction formula has been developed for predicting the factor of safety (FOS) of slopes using an ANN. A total of 110 cases with different geometric and soil conditions were analyzed using Bishop's Simplified Method. Of these, 100 cases were used to train the prediction model; the training process used a back-propagation learning algorithm. The prediction model is validated by comparing its results with the remaining 10 cases.
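For reference, the target quantity the network learns comes from Bishop's simplified method of slices, whose usual form is the implicit equation (textbook notation, assumed rather than taken from the paper)

FOS = \frac{1}{\sum_i W_i \sin\alpha_i} \sum_i \frac{c' b_i + (W_i - u_i b_i)\tan\phi'}{m_{\alpha,i}}, \qquad m_{\alpha,i} = \cos\alpha_i + \frac{\sin\alpha_i \tan\phi'}{FOS},

solved iteratively because FOS appears on both sides; here W_i, b_i, \alpha_i and u_i are the weight, width, base inclination and pore pressure of slice i, and c', \phi' are the effective shear strength parameters. The trained ANN replaces this iteration with a direct mapping from the geometric and soil inputs to FOS.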
A Survey on Energy Efficient Hierarchical Clustering Algorithm for Wireless Sensor Networks
Karanpreet Kaur, Dr. Manpreet Singh, Er. Gurdeep Singh
Wireless sensor networks are spatially distributed autonomous sensors that observe physical or environmental conditions, such as sound, temperature, and pressure, and cooperatively pass their data through the network to a central location. Newer networks are bi-directional, also enabling control of the sensor activity. The topology of a WSN can vary from a simple star network to an advanced wireless multi-hop mesh network. In this paper we present a comprehensive analysis of current research on wireless sensor networks. The authors classify the difficulties into three categories: the internal platform and underlying operating system; the communication protocol stack; and network services, deployment, and provisioning. A node dies in the network when it no longer has enough energy. A detailed analysis of routing protocols is presented on the basis of energy efficiency. Keywords: WSN, energy efficiency, sensors.
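As one concrete example of the energy-efficient hierarchical clustering that such surveys cover, below is a sketch of the well-known LEACH cluster-head election rule; the network size, round number, and desired head fraction are arbitrary illustration values.

    import random

    def leach_threshold(p, round_no, was_head_recently):
        """LEACH rotation threshold T(n) = p / (1 - p * (r mod 1/p))."""
        if was_head_recently:              # nodes that served recently sit out
            return 0.0
        return p / (1 - p * (round_no % round(1 / p)))

    p, round_no, n_nodes = 0.05, 7, 100    # 5% desired heads, round 7
    random.seed(1)
    heads = [n for n in range(n_nodes)
             if random.random() < leach_threshold(p, round_no, False)]
    print("cluster heads this round:", heads)

Rotating the head role this way spreads the energy cost of long-range transmission across the network, which is the core idea behind the energy-efficient hierarchical schemes the survey analyzes.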
An Efficient Method for Web Page Classification Based on Text
Dr. Tamer Anwar Ahmed Alzohairy, Dr. Mohamed Taha Abu-Kresha, Ahmed Nagy Ramadan Bakry
According to the Google index, the number of web pages now exceeds 50 billion and is increasing by millions per day. The global population of internet users is also growing rapidly, and with it a web page classification problem arises. Accordingly, an automatic classification method is required to deal with this problem on the World Wide Web (WWW). Traditional methods use text to determine the class of a document but usually retrieve unrelated web pages. In order to classify web pages effectively, we apply different feature extraction techniques with different classification methods to find an efficient method for web page classification. The three feature extraction methods used in the study are Term Occurrence (TO), Term Frequency (TF), and Term Frequency-Inverse Document Frequency (TF-IDF), and the three classifiers are K-nearest neighbor (K-NN), Naive Bayes (NB), and Decision Tree (DT). Each web page is represented by the three feature extraction methods. Principal component analysis (PCA) is used to select the most relevant features for classification, as the number of unique words in the collection is large. The output of the PCA is sent to the three classifiers to find the best method for web page classification. The experimental evaluation demonstrates that the combination of Naive Bayes (NB) and Term Frequency-Inverse Document Frequency provides efficient classification accuracy compared to the other methods.
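A minimal sketch of the reported best combination (TF-IDF features, PCA-style reduction, Naive Bayes), not the authors' experimental setup: TruncatedSVD stands in for PCA because TF-IDF matrices are sparse, Gaussian Naive Bayes is used because the reduced features can be negative, and the four toy pages and their labels are invented.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import make_pipeline

    pages = ["cheap flights hotel booking", "football league match score",
             "hotel deals travel booking", "match goals league table"]
    labels = ["travel", "sport", "travel", "sport"]

    clf = make_pipeline(TfidfVectorizer(),                      # TF-IDF features
                        TruncatedSVD(n_components=2,            # PCA-style reduction
                                     random_state=0),
                        GaussianNB())                           # Naive Bayes classifier
    clf.fit(pages, labels)
    print(clf.predict(["best travel booking site"]))            # -> ['travel']

The same pipeline shape applies at realistic scale; only the corpus, the number of retained components, and the classifier hyperparameters would change.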
Design and Implementation of UPFC Using Ten-Switch Converter with Switch Reduction
D. Aarthi, R. Manivasagam
The Unified Power Flow Controller (UPFC) is one of the most comprehensive FACTS devices, able to control three system parameters independently. In this work a novel UPFC configuration, consisting of a ten-switch converter in parallel with a series capacitor, is proposed to inject the desired series voltage. The operation of the ten-switch converter in this configuration is equivalent to the combination of the two converters in a conventional UPFC. However, the proposed configuration requires fewer power electronic switches and gate drive circuits, and its control scheme is simpler than that of the conventional UPFC. Using a series capacitor in parallel with the ten-switch converter reduces the injection voltage's THD, eliminates the output filter, and decreases the converter power rating in comparison with a conventional UPFC composed of series and shunt converters. The proposed UPFC is modeled in MATLAB/SIMULINK, and simulation results are presented to demonstrate the good operation of the new UPFC.
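Since the abstract evaluates the injected voltage's THD, here is a small stand-alone check of how THD can be estimated from an FFT; the MATLAB/SIMULINK model itself is not reproduced, and the harmonic amplitudes below are hypothetical.

    import numpy as np

    fs, f0 = 10_000, 50                          # sample rate (Hz), fundamental (Hz)
    t = np.arange(0, 0.2, 1 / fs)                # 0.2 s = 10 fundamental cycles
    # Hypothetical injected voltage: fundamental plus 5th and 7th harmonics.
    v = (np.sin(2 * np.pi * f0 * t)
         + 0.08 * np.sin(2 * np.pi * 5 * f0 * t)
         + 0.05 * np.sin(2 * np.pi * 7 * f0 * t))

    spectrum = np.abs(np.fft.rfft(v))
    k = int(round(f0 * len(t) / fs))             # FFT bin of the fundamental
    harmonics = spectrum[2 * k::k]               # bins at 2*f0, 3*f0, ...
    thd = np.sqrt(np.sum(harmonics ** 2)) / spectrum[k]
    print(f"THD = {thd:.2%}")                    # ~9.4% for these amplitudes

The same ratio, computed on the simulated injection voltage, is how a configuration like the proposed one would be compared against the conventional UPFC.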
A Design of AMBA AXI4-Lite ACE Interconnect Protocol for Transaction-Based SoC Design Techniques Integration
Chiranjeet Kumar, Dr. M. Gurunadha Babu
Chip design in the 21st century has undergone various changes due to increased customer demands, and this has led to design complexity in systems-on-chip (SoC), network-on-chip (NoC), application-specific integrated circuit (ASIC), and field-programmable gate array (FPGA) designs. This creates a need for advanced methods to resolve the complexity issue. The verification step consumes the major portion of the design time, and transaction-level modeling (TLM) and bus functional modeling (BFM) are used to reduce this effort. Transaction-level modeling describes the system using standard function calls that define all the transactions required to verify the functionality of the system at the architecture level. Transaction-based techniques were designed for software analysis; in this research work they are used for the first time for physical hardware design and its analysis, based on the AMBA ACE-Lite architecture. In the past, the AMBA AXI4 bus interconnect was used for hardware system design, but it fails to meet practical design requirements; the proposed AMBA ACE-Lite architecture yields the desired results with low complexity. With the proposed architectural design, several SoC/NoC subsystems can easily be interconnected in essentially the same manner in which transaction-based simulation models are written. The proposed methodology helps hardware design engineers deal with complexity by bringing the benefits of transaction-based verification (TBV) to this approach.
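A toy sketch of the transaction-level modeling style the abstract builds on: bus traffic is expressed as read/write function calls rather than cycle-accurate signal activity. The register block, addresses, and response strings below are illustrative, not the proposed ACE-Lite design.

    # Behavioural (TLM-style) model of an AXI4-Lite-like register slave:
    # each method call stands for one complete bus transaction.
    class AxiLiteSlave:
        def __init__(self):
            self.regs = {}

        def write(self, addr, data):            # one write transaction
            self.regs[addr] = data & 0xFFFFFFFF
            return "OKAY"

        def read(self, addr):                   # one read transaction
            return self.regs.get(addr, 0), "OKAY"

    slave = AxiLiteSlave()
    assert slave.write(0x10, 0xDEADBEEF) == "OKAY"
    data, resp = slave.read(0x10)
    print(hex(data), resp)                      # 0xdeadbeef OKAY

At this level of abstraction, interconnecting subsystems means composing such call interfaces, which is the sense in which the paper applies transaction-based techniques to hardware interconnect design.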
An Overview of Cloud Security and Proposed Solutions
Jamuna K M
Cloud computing has revolutionized the world of computing in the last few years. Its benefits, such as reduced costs, rapid application deployment, and elastic resources, have led many organizations to utilize cloud resources or host much of their data in the cloud. Recent studies show that more than 70 percent of the world's businesses now run some of their operations in the cloud. However, the security of data stored in the cloud remains a major concern. This paper gives a broad overview of cloud security threats and their proposed solutions.
Dynamic Analysis for Simulating the Effect of Power Quality on Sensitive Electronic Equipment
D. Srinivasulu, V. Sharath Babu, R N V L H Madhuri
With the increasing usage of sensitive electronic equipment, power quality has become a major concern. One critical aspect of power quality studies is the ability to perform automatic power quality data analysis. The impact of power quality on the operation of sensitive equipment has been illustrated through simulations in MATLAB SIMULINK. Such a study is essential to predict the performance of modern loads and to explain why a specific load fails during a power quality event. The findings are reported in detail in this paper.
The paper proposes a neural network solution to the indirect vector control of a three-phase induction motor, including a real-time trained neural controller for the IM angular velocity, which permits a faster reaction to variable load. The basic equations and elements of the indirect field-oriented control scheme are given. The control scheme is realized by one recurrent and two feed-forward neural networks. The first is learned in real time by the dynamic back-propagation method, and the two FFNNs are learned off-line by the Levenberg-Marquardt algorithm with data taken from PI-control simulations. The final MSE reached by the LM algorithm is of the order of 10^-10. The modeling results show better performance of the adaptive NN control system than of the PI-controlled system realizing the same computational control scheme with variable load.
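As a hedged illustration of the kind of power quality event such a study simulates (the paper itself uses MATLAB SIMULINK), a voltage sag can be modeled as a temporary drop in the supply's RMS profile, with a hypothetical ride-through threshold deciding whether sensitive equipment trips.

    import numpy as np

    fs = 5000                                        # sample rate (Hz)
    t = np.arange(0, 0.5, 1 / fs)                    # half a second of supply
    # Hypothetical event: a 40% voltage sag lasting 100 ms (0.2 s - 0.3 s).
    rms_pu = np.where((t >= 0.2) & (t < 0.3), 0.6, 1.0)

    ride_through = 0.7                               # assumed equipment limit (p.u.)
    below = np.sum(rms_pu < ride_through) / fs       # seconds below the limit
    print(f"time below {ride_through} p.u.: {below * 1000:.0f} ms")
    print("equipment trips" if below > 0.05 else "equipment rides through")

Sweeping the sag depth and duration against equipment-specific thresholds is, in essence, what the simulation-based analysis described in the abstract automates.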