A Survey on Quality of Service Routing Protocols for Mobile Ad hoc Networks
V. Sathish, V. Thiagarasu
Due to the emergence of communication through the wireless mode, infrastructure-less transmission of data has received greater importance. The mobile ad hoc network (MANET) is one such infrastructure-less means of communication. Ensuring Quality of Service (QoS) is one of the major research dimensions in the field of MANETs. This paper surveys the literature on QoS routing protocols for MANETs. From the review, it is evident that on-demand routing protocols ensure QoS better than table-driven routing protocols. It is also noteworthy that on-demand routing protocols offer more promising research directions for further exploration and for the invention of new routing protocols for MANETs.
Over the past few years, the demand for high bandwidth and high channel capacity has increased rapidly. To meet these requirements, we employ wavelength division multiplexing (WDM) over a bidirectional link. WDM performance mainly depends upon metrics such as modulation format, channel capacity, channel spacing and range. This paper presents a comparative analysis of advanced modulation formats: non-return-to-zero (NRZ), return-to-zero (RZ), carrier-suppressed return-to-zero (CSRZ) and modified duo-binary return-to-zero (MDRZ). The designed system achieved its best result at a distance of 120 km over a 4-channel bidirectional link, without using any amplifier or dispersion-compensating fiber. System performance was evaluated in terms of BER, Q-factor and eye diagram in OptiSystem 14.1. From the results, it is found that MDRZ gives dispersion-free transmission and a lower BER for long-distance transmission.
Encrypted Big Data Using AES Deduplication in Cloud Storage
N.B. Mahesh Kumar
Benefiting from cloud computing, users can achieve an effective and economical approach to data sharing among group members in the cloud, with low maintenance and little management cost. Meanwhile, security guarantees must be provided for the shared data files, since they are outsourced. Unfortunately, because of frequent changes in membership, sharing data while preserving privacy is still a challenging issue, especially for an untrusted cloud, due to the collusion attack. Moreover, in existing schemes the security of key distribution rests on a secure communication channel; having such a channel is a strong assumption that is difficult to achieve in practice. In this paper, a secure data sharing scheme for dynamic members is proposed. Firstly, a secure way of key distribution without any secure communication channels is proposed, so that users can securely obtain their private keys from the group manager. Secondly, the scheme achieves fine-grained access control: any user in the group can use the resources in the cloud, and revoked users cannot access the cloud again after they are revoked. Thirdly, the scheme is protected from the collusion attack, which means that revoked users cannot obtain the original data file even if they conspire with the untrusted cloud. By leveraging a polynomial function, a secure user revocation scheme is achieved. Finally, the scheme achieves fine efficiency: previous users need not update their private keys either when a new user joins the group or when a user is revoked from it. The results will show the effectiveness of the scheme for potential practical deployment, especially for big data deduplication in cloud storage.
Dengue fever is a seasonal, vector-borne disease which is often deadly. At present, dengue is usually diagnosed through two stages of tests in India. A patient showing the physical symptoms of dengue is first subjected to a screening test, the CBC test. The second test, the dengue serology test, is the truly confirmatory test, but may take up to 10 days to return a correct reading. We propose a simple neural-network-based model which can detect whether the patient has dengue using just the preliminary CBC test report's data. Patient data was collected from a single hospital located in Ghaziabad, India. We found that the system correctly classified the unseen test cases with a significant degree of accuracy. As future research directions, we propose the application and comparison of further pattern recognition techniques for this classification task, testing of the system in real-time hospital conditions, and the inclusion of locality-specific factors to build as general and as widely reproducible a model as possible.
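To make the approach concrete, here is a minimal sketch, not the authors' model, of a small feed-forward network trained on CBC-style features; the three feature definitions and the synthetic labels are illustrative assumptions, whereas the paper trains on real hospital records.

```python
# Minimal sketch (assumed features, synthetic data): a small neural network
# flagging possible dengue cases from CBC-style inputs.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.normal(220, 60, n),   # platelet count (x1000/uL), assumed feature
    rng.normal(7.0, 2.0, n),  # white blood cell count (x1000/uL), assumed
    rng.normal(42, 4, n),     # hematocrit (%), assumed
])
# Synthetic label loosely mimicking dengue's CBC signature (low platelets, low WBC)
y = ((X[:, 0] < 180) & (X[:, 1] < 6.5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```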
With recent increase in uses of internet and development of web technologies, proportion of web content in Hindi is increasing at a lightning impetus. These particulars can play a vital role for researchers and computer science engineers in devising systems and real-world application for government organisation and private sector which could ease their decision making process. In this paper, we present a modus operandi to perform lexical analysis on Devanagari Hindi language. This was done by building a dictionary containing Hindi words using few information retrieval techniques and implementing some error recovery strategies to recover from lexical errors (if occurred). We achieved an accuracy of approximately 88% while performing Lexical analysis on different Hindi words. Main advantage of our work is, entire processing of the Lexical analysis phase can be completed in offline mode (i.e. without any usage of internet).
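A minimal sketch of the offline dictionary lookup with a simple error-recovery step follows; the tiny Devanagari word list and the similarity cutoff are illustrative assumptions, not the paper's dictionary or exact recovery strategy.

```python
# Minimal sketch: offline dictionary-based lexical analysis with
# nearest-word error recovery (toy dictionary, assumed cutoff).
from difflib import get_close_matches

DICTIONARY = {"पानी", "किताब", "विद्यालय", "भारत", "कंप्यूटर"}

def analyze(token: str) -> str:
    if token in DICTIONARY:
        return f"{token}: valid word"
    # Error recovery: suggest the closest dictionary entry, if any
    suggestion = get_close_matches(token, DICTIONARY, n=1, cutoff=0.6)
    if suggestion:
        return f"{token}: lexical error, did you mean {suggestion[0]}?"
    return f"{token}: lexical error, no suggestion"

for w in ["किताब", "कितब"]:
    print(analyze(w))
```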
Performing Efficient Protocol for Reducing Energy Consumption in Wireless Sensor Networks
Mr. Muruganandam K., Dr. Sibaram Khara
Wireless Sensor Networks (WSNs) are one of the most rapidly developing technologies, with a wide range of applications including sensing, security provision, environmental monitoring and military applications. A WSN consists of a collection of sensor nodes, each of which senses environmental conditions and transmits data to the base station. Energy consumption is the major issue in WSNs, since each sensor node has only a limited power supply for transmitting packets in a wireless environment. In this paper, we study various routing protocols and compare them. We also study the trade-offs between energy and communication overheads, highlighting the advantages and demerits of each routing protocol with the purpose of discovering new research directions. Based on the identified research gap, we propose an optimum energy-efficient routing protocol for today's WSNs. Clustering is one of the promising techniques for reducing energy consumption. In a clustered WSN, sensor nodes are partitioned into a certain number of clusters, each of which has a cluster head (CH) and some non-cluster-head members. The CH collects information from all the cluster members and then forwards it to other CHs, while non-CHs are responsible for sensing environmental conditions and transmitting the information to the corresponding CH. The simulation results show how cluster head election criteria, such as random election and nodes with different energy levels, affect the number of cluster heads elected and the network lifetime. In this paper, we analyse three different routing protocols: LEACH, SEP and TEEN. Simulation results show the comparative effectiveness of the different clustering algorithms on network lifetime, cluster head selection and failed nodes in the network. The sensor networks are simulated using the MATLAB simulator.
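Of the three protocols, LEACH's randomized cluster-head election is the simplest to illustrate; below is a minimal sketch of its threshold rule, with the cluster-head fraction and node count as assumed parameters.

```python
# Minimal sketch of LEACH's randomized cluster-head election
# (P and N_NODES are assumed parameters).
import random

P = 0.1          # desired fraction of cluster heads per round
N_NODES = 100

def leach_threshold(round_no: int) -> float:
    # T(n) = P / (1 - P * (r mod 1/P)) for nodes not yet CH in the
    # current epoch of 1/P rounds.
    return P / (1 - P * (round_no % int(1 / P)))

eligible = set(range(N_NODES))   # nodes not yet cluster head this epoch
for r in range(int(1 / P)):
    t = leach_threshold(r)
    heads = {n for n in eligible if random.random() < t}
    eligible -= heads
    print(f"round {r}: {len(heads)} cluster heads elected")
```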
Machine Learning Approach for Marketing Intelligence: Managerial Application
Keerti S. Mahajan*, S. S. Jamsandekar*, Dr. A. M. Gurav
The Marketing Intelligence approach is a cutting-edge marketing management support system that handles knowledge through machine learning and other soft computing techniques. The range of potential applications of machine learning techniques in marketing management spans consumer behavior, optimization of product-market structure, managing the marketing mix, strategic marketing and the finance domain, illustrating the synergy between marketing and intelligent systems, especially machine learning techniques. Interactive promotion is a fitting field of marketing where intelligent systems can be applied. This suggests that marketing is a multifaceted field of decision making. A marketing decision is a combination of judgment and analysis involving a large degree of intuition, in which knowledge and expertise are required; here Artificial Intelligence (AI) can play an important role. Machine learning is an AI component that observes historical data of actions and performs experiential learning, putting the knowledge to use in similar processes in new computational settings. This paper describes the potential benefits associated with applying machine learning techniques to the field of marketing management. It also describes the fundamental techniques and introduces relevant marketing fields to which machine learning approaches, such as Data Mining, AI and Soft Computing techniques, could be applied.
Mobile Cloud Computing (MCC) brings cloud computing services to the mobile environment by providing diverse and optimal services for mobile users. MCC incorporates all the elements of mobile networks and cloud computing. In MCC, data and computing modules can be processed in the cloud, so the mobile device does not need high memory capacity, CPU speed and so on. However, mobile devices face many challenges regarding their resources, privacy, mobility and security, and these challenges have a large effect on the improvement of the service quality of MCC. In this paper, we review MCC technologies and their architecture, characteristics, applications, advantages, limitations, and the offloading techniques of mobile cloud computing.
Biometric Authentication Using the Elements in the Hair
N. Ambiga
Biometric authentication is a security process that relies on the unique biological characteristics of an individual to verify that a person is who they claim to be. Biometric authentication systems compare a captured biometric sample to stored, confirmed authentic data in a database; if the two samples of biometric data match, authentication is confirmed. Human hair consists of proteins, lipids, water, trace elements, toxic elements, other elements and pigments. The composition of these elements may vary from the hair of one human to another, so the elements present in the hair can be considered for biometric authentication: the various elements in the hair can be measured and form part of a biometric authentication system. This could be highly useful in smart cards in the future. The main components of hair are protein and amino acids, and the percentage of each element in the hair may vary from one human being to another. This paper presents the theoretical aspects of the elements in the hair and how they can be used for biometric authentication.
Fuzzy Adaptive Selection of Votes in Probabilistic Filtering Scheme in WSNs
Muhamad Akram, Tae Ho Cho
Wireless sensor networks comprise tiny nodes with limited energy and computational resources. Sensor networks are usually installed in unattended, volatile environments where they are exposed to severe security attacks. Attackers can generate two main types of attack, namely the injection of false reports and of false MACs. PVFS is a singular en-route filtering scheme which counters both attacks; however, the number of MACs attached to each report is fixed in PVFS. In this paper, we propose a scheme that supports a fuzzy adaptive choice of the votes to be included in a report before forwarding it to the base station in the probabilistic voting-based filtering scheme. The proposed method attains better energy saving when the attacks are not very hostile in terms of their frequency of occurrence, which helps extend the network lifetime.
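A minimal sketch of the idea follows, under assumed membership functions and rule outputs rather than the authors' exact fuzzy system: the observed false-report rate is fuzzified and defuzzified into the number of votes to attach.

```python
# Minimal sketch: fuzzy mapping from attack intensity to vote count
# (membership breakpoints and rule outputs are assumptions).
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def votes_for(attack_rate: float) -> int:
    # Assumed rule outputs: LOW -> 2 votes, MEDIUM -> 3, HIGH -> 5
    mu_low = tri(attack_rate, -0.4, 0.0, 0.5)
    mu_med = tri(attack_rate, 0.2, 0.5, 0.8)
    mu_high = tri(attack_rate, 0.5, 1.0, 1.4)
    num = 2 * mu_low + 3 * mu_med + 5 * mu_high
    den = mu_low + mu_med + mu_high
    return round(num / den) if den else 2   # weighted-average defuzzification

for rate in (0.1, 0.5, 0.9):
    print(f"attack rate {rate}: attach {votes_for(rate)} votes")
```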
Applications of Big Data Analytics and Machine Learning Techniques in Health Care Sectors
Mrs. J. Sukanya, M.C.A., M.Phil., S. Vijaya Kumar, M.Sc., M.Phil.
Big Data in healthcare refers to handling large and complex sets of data and making them accessible at one's fingertips. Earlier, data in healthcare used to reach a level of complexity at which extracting it and putting it to use became next to impossible, leading to slow growth in the healthcare industry. Big Data becomes much more interesting when it is used in healthcare analytics. Among its potential benefits is the early detection (analysis) of diseases, which allows doctors to treat patients proactively. It also helps in detecting fraud in healthcare more quickly and efficiently. By bringing statistical tools and algorithms into use, more accurately targeted vaccines could be developed faster. These tools help develop cost-effective ways of discovering more clinically relevant means to analyse disease and treat patients effectively. This paper analyzes the importance of big data and the various steps involved in machine learning techniques in healthcare. It also identifies how big data analytics is helping to realize the goals of diagnosing, treating, helping and healing all patients in need of healthcare.
Finding Mobile Applications in Cellular Device-to-Device Communications: Hash Function and Bloom Filter-Based Approach
P. Sumanjali, Ch. Sudhakar Reddy
The rapid growth of mobile computing technology and wireless communication has significantly increased the number of mobile users worldwide. We propose a code-based discovery protocol for cellular device-to-device (D2D) communications. To realize proximity-based services such as mobile social networks and mobile marketing using D2D communications, each device should first discover nearby devices which have mobile applications of interest, using a discovery protocol. The proposed discovery protocol makes use of a short discovery code that contains compressed information about the mobile applications on a device. A discovery code is generated using either a hash function or a Bloom filter. When a device receives a discovery code broadcast by another device, it can approximately determine the mobile applications on the other device. The proposed protocol is capable of quickly discovering a massive number of devices while consuming a relatively small amount of radio resources. We analyze the performance of the proposed protocol under the random direction mobility model and a real mobility trace. By simulation, we show that the analytical results match the simulation results well and that the proposed protocol greatly outperforms a simple non-filtering protocol.
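As an illustration of the Bloom-filter variant, here is a minimal sketch; the filter size, hash count and double-hashing construction are assumptions, not the protocol's specified parameters.

```python
# Minimal sketch: a Bloom-filter discovery code. A device compresses its
# application list into M bits; a receiver tests its own apps against the
# code. False positives are possible, false negatives are not.
import hashlib

M, K = 64, 3   # filter bits and hash count (assumed)

def _hashes(item: str):
    h = hashlib.sha256(item.encode()).digest()
    h1 = int.from_bytes(h[:8], "big")
    h2 = int.from_bytes(h[8:16], "big")
    return [(h1 + i * h2) % M for i in range(K)]   # double hashing

def make_code(apps):
    bits = 0
    for app in apps:
        for pos in _hashes(app):
            bits |= 1 << pos
    return bits

def maybe_present(code: int, app: str) -> bool:
    return all((code >> pos) & 1 for pos in _hashes(app))

code = make_code(["chat_app", "game_x", "market_app"])
print(maybe_present(code, "chat_app"))    # True
print(maybe_present(code, "other_app"))   # False, with high probability
```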
Levels of Project Management: Estimation, Planning and Tracking, and the Techniques Used
Rima Debnath, Dr. Amit Asthana
The process begins with estimating the size, effort and time required for the development of the software, and ends with the product and the other work products built in the different phases of development. Model-based techniques are among the best techniques used for estimation. Software engineering is the discipline which paves the roadmap for development within a given schedule and effort and with the desired quality. These techniques use different parameters for estimation; the estimates should be accurate, failing which wrong estimates result and consequently lead to a software crisis. The tools available for automating some of the activities are a great help in the whole development process. However, these tools isolate the processes of estimation, planning and tracking, and calibration. Moreover, software engineering is a nascent discipline, and the metrics introduced for quantifying the attributes of software are still not free of subjective judgment. Handling the large volume of data for these processes is a tiresome task.
A Secure and Reversible Watermarking Using Secret-Fragment-Visible Mosaic Images and Arnold's Cat Map
Megha C. Bute, Prof. Shafali Gupta
Image watermarking techniques are used to provide copyright protection, owner identification, image authentication and tamper detection. Watermarking is required to secure data and prevent unauthorized modification, a need that arises from the rapid development of digital technologies, internet technologies and powerful image processing tools. Reversible watermarking has become a promising technique for embedding information into important images. In this paper, we define the Region of Interest in an image and embed the data in the Region of Non-Interest. The basic idea of reversible watermarking is to select an embedding area in an image, embed the information into it, and then recover both the original image and the information. If the amount of information to be embedded is larger than the embedding area, most techniques apply lossless compression to the original values in the embedding area, and the space saved by compression is used for embedding the watermark.
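Because the title names Arnold's cat map, a minimal sketch of that scrambling step is given below; the image size is an assumption, and the map's periodicity is what makes the step reversible.

```python
# Minimal sketch: Arnold's cat map on an N x N image. Each pixel (x, y)
# moves to ((x + y) mod N, (x + 2y) mod N); the map is periodic, so the
# original image is recovered after enough iterations.
import numpy as np

def arnold_cat(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the map is defined on square images"
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

img = np.arange(16).reshape(4, 4)
print(arnold_cat(img, 1))                          # scrambled image
print(np.array_equal(arnold_cat(img, 3), img))     # True: period 3 for N = 4
```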
Energy Efficient E0 Algorithm for Wireless Transceivers
S. Suresh, R. Nagarajan, R. Prabhu, N. Karthick
Bluetooth is a major technology developed by a group called the Bluetooth Special Interest Group (SIG), formed in May 1998. Bluetooth is a proposed standard for short-range wireless communication between devices. It uses radio waves to transfer information, so it is very susceptible to attacks; to protect users' information, it uses cryptographic algorithms. Bluetooth offers methods for generating authentication keys for users and for encrypting data. The data encryption mechanism used within the Bluetooth security layer is the E0 stream cipher. A stream cipher is a symmetric cipher; in other words, the same secret key is employed for both encryption and decryption. The E0 stream cipher is a key stream generator based on linear feedback shift registers (LFSRs), and the key stream thus generated is XORed with the plaintext to produce the ciphertext. Each time two Bluetooth devices need to communicate securely, they first undergo authentication and key exchange protocols whose purpose is to agree on a shared secret, which is used to generate the encryption key. In this paper, we suggest a uniform framework for cryptanalysis of the E0 cipher. Our method requires 128 known bits of the key stream in order to recover the initial state of the LFSRs, which reflects the secret key of this encryption engine. The key stream generator comprises four LFSRs of different lengths, which are combined by a simple finite state machine with 16 memory states. The output of this state machine is the key stream sequence, or, during the initialization phase, the randomized initial start value.
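To illustrate the keystream-XOR principle, here is a minimal sketch using a single toy LFSR; the full E0 cipher combines four LFSRs through the 16-state machine described above, and the taps and key below are assumptions.

```python
# Minimal sketch: one toy Fibonacci LFSR keystream XORed with the data.
# This is NOT the full E0 cipher; taps and key are illustrative.
def lfsr_stream(state: int, taps: tuple, nbits: int):
    """Yield one keystream bit per step."""
    while True:
        yield state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))

def crypt(data: bytes, key: int) -> bytes:
    gen = lfsr_stream(key, taps=(0, 2, 3, 5), nbits=16)
    out = bytearray()
    for byte in data:
        ks = 0
        for i in range(8):
            ks |= next(gen) << i
        out.append(byte ^ ks)       # XOR both encrypts and decrypts
    return bytes(out)

ct = crypt(b"hello bluetooth", key=0xACE1)
print(crypt(ct, key=0xACE1))        # b'hello bluetooth'
```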
A Generalized Reliability Mechanism with Minimum Replicas by PRCR
M. Sravan Kumar, K. Ramesh, T. Subramanyam
The rapid progress of technology and the large-scale use of the internet have resulted in the generation of an enormous amount of data. In recent years, cloud computing has become popular in fields such as information technology and various business enterprises. The fundamental use of the cloud nowadays is for storing information and sharing resources. The cloud offers a way to store large amounts of data: it provides the storage space and offers this information to various clients on a pay-as-you-use basis. The two principal concerns in current cloud storage systems are data reliability and storage cost. To ensure data reliability, current cloud systems use a multi-replica strategy (typically three replicas), which occupies a huge amount of storage space and in effect leads to higher storage cost for data-intensive applications in the cloud. To minimize the storage space, in this paper we propose a cost-effective data reliability mechanism called Proactive Replica Checking for Reliability (PRCR), based on a generalized data reliability model. PRCR guarantees reliability with a reduced replication factor, which can likewise reduce the storage cost of replication-based approaches. Compared with the traditional three-replica strategy, PRCR can reduce storage space consumption by one-third to two-thirds of the current storage space, thus significantly lowering the storage cost.
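A minimal sketch of the replica-count arithmetic behind such a reliability model follows (the paper's generalized model is richer); the survival probabilities and the reliability target are illustrative assumptions.

```python
# Minimal sketch: with independent failures, k replicas each surviving a
# period with probability r give reliability R = 1 - (1 - r)^k.
import math

def replicas_needed(r_single: float, target: float) -> int:
    """Smallest k such that 1 - (1 - r_single)**k >= target."""
    return math.ceil(math.log(1 - target) / math.log(1 - r_single))

TARGET = 0.9999999   # assumed reliability requirement
for r in (0.999, 0.9999):   # per-replica survival; proactive checking raises r
    k = replicas_needed(r, TARGET)
    print(f"r={r}: {k} replicas -> reliability {1 - (1 - r) ** k:.9f}")
```

With the higher effective per-replica reliability that proactive checking provides, two replicas already meet the target that would otherwise require three, which is where the storage saving comes from.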
RE Flow with Prototype and Its Impact on the End-user in System Development
Akshara Dave, Dr. S.M. Shah
This paper focuses on different aspects of requirement gathering techniques, and on why and how these techniques may help the stakeholder and the system under development. The main entity in the SDLC is the customer/stakeholder. As the customer is the one for whom any project is built and for whom any IT company works, it is mandatory to keep them satisfied by providing an accurate solution to their problem. The paper explains the most widely used requirement gathering techniques and their importance; from it, one can get an idea of which technique is suitable in various environments.
Performance Metrics on Routing Protocols in Wireless Ad-Hoc Networks under Security Attacks
IR Saidu, IM Anka
Wireless networks have certainly been a revolution in today's technological landscape, forming one of the most vital and active fields in the communications industry. Wireless ad hoc networks consist of independent nodes with decentralized administration and no physical infrastructure, and have a dynamic topology in which nodes can leave or join the network at any time and move freely. In order not to compromise the confidentiality and integrity of network services, the safety of packet data can only be achieved by guaranteeing that the security requirements meet the requisite standard. The performance of the routing protocols DSDV, DSR and AODV for ad hoc wireless networks was investigated using network parameters such as average throughput, packet delivery ratio and end-to-end delay, under various scenarios of node mobility and network size. An Awk script was used to analyze the trace files and produce the average throughput, packet delivery ratio and end-to-end delay as the results of the simulation. The simulation was run by implementing code in the DSDV, DSR and AODV .cc class files to accommodate the behavioral patterns of the attacks. From the analysis of the routing protocols under security attacks, it was observed that DSDV has significantly lower performance as a result of frequent link changes and connection failures, which led to heavy overload and congestion problems. Furthermore, when comparing the two reactive routing protocols, AODV performs better than DSR.
Anonymity and the Obfuscation Issues in the Cryptographic Currency: Bitcoin
Rishav Chatterjee
Bitcoin is the cryptographic currency where all transactions are recorded in the blockchain — a public, global, and immutable ledger. Because transactions are public, Bitcoin and its users employ obfuscation to maintain a degree of financial privacy. Critically, and in contrast to typical uses of obfuscation, in Bitcoin obfuscation is not aimed against the system designer but is instead enabled by design. We map sixteen proposed privacy-preserving techniques for Bitcoin on an obfuscation-vs.-cryptography axis, and find that those that are used in practice tend toward obfuscation. We argue that this has led to a balance between privacy and regulatory acceptance.
Question Answering Mechanism Using Natural Language Processing
Shelza Lali, Priti Aggarwal, Geetanjali Sharma
A Question Answering (QA) system provides direct answers to user questions by consulting its knowledge base. This paper presents the basic architecture of a QA system based on an information retrieval system, its modules and its various types. This work also describes different architectures for various types of question answering mechanism, such as open-domain QA, closed-domain QA, rule-based QA and web-based QA. A comparison among the different architectures of question answering mechanisms is also drawn.
Performance Evaluation of Quality of Service Parameters for Scheduling Algorithms in Wireless Mesh Networks
Mayuri Panchal, Rajesh Bansode
Wireless Mesh Networks (WMNs) have achieved great popularity in recent years due to their last-mile Internet access, low deployment cost and self-configuring features. They are considered an effective solution for supporting multimedia services in the last mile owing to their automatic configuration and low-cost deployment. The main feature of WMNs is multi-hop communication, which may result in increased coverage, better robustness and more capacity. Implemented over wireless media of limited radio range, WMNs bring many challenges, such as effective medium access control, efficient routing, quality of service provisioning, call admission control and scheduling. The performance measurements reported in the result analysis use various metrics for an energy-efficient wireless communication system, including data transfer rate, packet size, protocol used, energy efficiency, number of nodes, square topology area, and distance between nodes and the base station. The parameters achieved during simulation of the prototype energy-efficient network design are: 300 initial nodes, clustered into groups of 30 to form 10 major clusters; a distance between clusters of up to 200 m; with 50 iterations/rounds, data packets received at up to 16 kbps within the clusters; a square topology area of 1250 m x 1250 m; and power consumption of up to 24 dB.
In this paper, we present an Android mobile application for the VESIT Library system, which helps college students view all the books available in the college library, something not practically possible by physically looking at each and every book. The system has two major features: first, the user can view the details of his or her issued books, and second, the user can see the number of copies of a particular book available in the library. The library database is built on SQL Server 2000. The application accesses the database using the Laravel framework to provide additional security against database hacking. Students use their college library ID as their login ID. The interface is smooth, with minimal lag, and the operations are quick. In essence, this application removes the need to visit the library just to have a look at the books. The technical details are further outlined in this paper.
Spatial Item Recommendation Using Collaborative Filtering
Yuvraj D. Chougale, Mr. B. S. Satpute
With the development of location-based social networks (LBSNs), spatial item recommendation has become an important way of helping users discover interesting locations and of increasing their engagement with location-based services. Although human movement exhibits sequential patterns in LBSNs, most current studies on spatial item recommendation do not consider the sequential influence of locations. We propose a sequential personalized spatial item recommendation framework (SPORE) which introduces a novel latent variable, the topic-region, to model and fuse sequential influence with personal interest in the latent and exponential space. The advantages of modeling the sequential effect at the topic-region level include a significantly reduced prediction space, effective alleviation of data sparsity, and a direct expression of the semantic meaning of users' spatial activities. We evaluate the performance of SPORE on two real datasets and one large-scale synthetic dataset. The results demonstrate a significant improvement in SPORE's ability to recommend spatial items, in terms of both effectiveness and efficiency, compared with state-of-the-art methods.
Big Data poses many challenges for IT deployment and academic research communities: huge amounts of data arrive from various stream-based sources, accompanied by the curse of dimensionality. Big Data is characterised by the 3V challenges, namely volume, variety and velocity. It is well known that data arrives from various sources in different formats and accumulates at very high speed, making traditional batch-based models infeasible for real-time data processing; this is the most important challenge of Big Data. Since velocity is one of these challenges, the crucial issue is to mine the most valuable, current and relevant information, and Big Data technology for mining such high-speed information is therefore gaining importance nowadays. Feature selection is employed for on-the-fly data stream mining in Big Data; it has been widely used to minimize the processing load of building the mining model. To achieve query accuracy within minimum processing time and to reduce the processing load, accelerated particle swarm optimization (APSO) is employed.
A Taxonomy of Efficient Hierarchical Parallel Algorithms Using Grid Computing
Natish Kumar, Prof. Rupika Rana
Grid computing is an effective, fully distributed paradigm that provides a high-performance computing platform for resolving difficult problems through the allocation and distribution of computational capacity. Although the Ant Colony Optimization algorithm can produce better-optimized results in a grid computing environment than other algorithms, it does not guarantee consistent results when executed many times. The Ant Colony algorithm shows unpredictable convergence time as the problem size varies, which has an adverse effect on program effectiveness. The algorithm also lacks well-organized initial values because of its unsystematic initialization, and poorly chosen values lead to bad solutions. In the proposed algorithm, the scheduling of independent jobs is considered: only one job is selected at a time and proceeds to execution on the main server, and if the main server fails during execution, that server is rejected. We have measured the performance of the FCFS/FF, SJF/FF and LJF/FF algorithms and the metaheuristic Ant Colony Optimization algorithm for job scheduling, and drawn a comparison with the proposed technique.
E-commerce applications have gained tremendous popularity nowadays, and numerous reviews and ratings are available on these sites, such as comments, reviews and descriptions of local services. This type of information is valuable for new users who judge a product online to make their decision. Sometimes, however, this information is of little help in recommending a product because of a lack of reviews and ratings for that product. For example, if a product has only two ratings, one of 5 and the other of 2, a recommender that simply averages the ratings and shows the result to the user provides poor quality of service; this is not enough to extract public opinion. To provide a good quality of recommendation service, the overall evaluation of ratings has to be improved. We propose a framework that determines the trustworthiness of users' service ratings and review features, together with spatio-temporal features, extracting the overall rating confidence and the trust of ratings and reviews, and combining them into the overall rating to provide a good quality of service.
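As a minimal sketch of why a raw average misleads for sparsely rated items, and of one assumed way to damp it (not the paper's full trust model), consider a Bayesian-style damped mean:

```python
# Minimal sketch: a damped (Bayesian) average pulls low-confidence items
# toward the global mean; global_mean and damping are assumed values.
def damped_mean(ratings, global_mean=3.5, damping=5):
    # Acts as if `damping` virtual ratings at the global mean were present.
    return (sum(ratings) + damping * global_mean) / (len(ratings) + damping)

sparse = [5, 2]                    # the two-rating product from the abstract
popular = [5, 4, 5, 4, 5, 4, 5, 5, 4, 5]

print(sum(sparse) / len(sparse), damped_mean(sparse))     # 3.5 vs 3.5
print(sum(popular) / len(popular), damped_mean(popular))  # 4.6 vs ~4.23
```

The sparse item's score barely moves from the prior, reflecting low confidence, while the well-rated item keeps most of its observed average.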
Improving the Efficiency of CIGS Thin Film Solar Cells by Changing the Layer Thicknesses
Hashem Firoozi, Mohsen Imanieh
In this study, the behaviour of solar cells with the CuIn1-xGaxSe2 structure is examined. A CIGS solar cell consists of layers of ZnO (the TCO layer), CdS (the buffer layer), CIGS (the absorbent layer) and Mo (the substrate), in which the CdS and CIGS layers form a PN junction. The CIGS solar cell is then simulated using SILVACO software. Firstly, the thickness of the absorbent layer is varied; then the thickness of the absorbent layer is fixed and the thicknesses of the CdS and ZnO layers are varied in turn, and the effect on cell performance is discussed and examined. The important parameters of a solar cell discussed here include the open circuit voltage (VOC), short circuit current (ISC), maximum power (Pmax), fill factor (FF) and efficiency (η). The simulations showed that increasing or decreasing the layer thicknesses has an impact on solar cell performance.
The main objective is to provide convenience to mobile users by providing online information about various cities in Tamil Nadu. The bandwidth provided by early networks was low, leading to network traffic congestion and reduced data rates. In the past few years, circuit-switched networks have seen tremendous growth in data traffic due to the increasing popularity of the internet, and service providers need effective means to share the scarce radio resources among many subscribers. In circuit-switched mode, a channel is allocated to a single user for the duration of the connection. Consequently, new data applications are emerging and reaching mobile users. The existing mobile applications provide a text-based information directory over WAP (Wireless Application Protocol); the same can be developed as a GPRS (General Packet Radio Service) application that uses the global internet gateway, through which it can be accessed on mobile devices.
Technique for Isolation of Malicious Nodes from the Cloud Computing Architecture
Komal Jeet Kaur
Cloud computing is an architecture in which virtual machines, cloudlets and data centers are involved in communication. In the network, malicious nodes are responsible for triggering various types of active and passive attacks, which reduce network performance in terms of various parameters. In this work, a technique is proposed for the detection and isolation of malicious nodes from the network; these malicious nodes are responsible for triggering the virtual side channel attack in the network.
An Extended ERP Model for Yemeni Universities Using the TAM Model
Marwa Abdulrahman Al-hadi, Nagi Ali Al-Shaibany
The main problems discussed for some time have been the complexity and failure of ERP systems in institutions. To solve these problems and reduce this complexity, some researchers have concentrated on the effect of perceived usefulness (PU) and perceived ease of use (PEOU) on the attitude toward using an enterprise resource planning (ERP) system, based on the theory of the technology acceptance model (TAM), whereas others have focused on studying critical success factors (CSFs). Only a few researchers have put the two together to examine the influence of critical success factors on PU and PEOU as key factors for accepting an ERP system. This paper therefore focuses on studying these CSFs and their effect at Yemeni higher education institutions using an extended technology acceptance model (TAM), and analyzes the impact of CSFs on user attitude toward using ERP at these institutions. The proposed model has thirteen constructs: 1. Vision and objectives (VO), 2. Top management support and commitment (TM), 3. Business process (BP), 4. Organizational structure (OS), 5. Budget size (BS), 6. Human resources management (HRM), 7. Project management (PM), 8. Training and education (TE), 9. Business process re-engineering (PRE), 10. Communication and connection (COM), 11. Perceived ease of use (PEOU), 12. Perceived usefulness (PU), and 13. Attitude toward using ERP (AT); sixteen hypotheses were generated to study the relationships between these constructs. A Partial Least Squares (PLS) analysis examines these relationships based on a survey of 123 users to measure the acceptance of the model. The results suggest important applied attitudes toward using ERP and develop an understanding of how to implement them in higher education institutions. We also find that users' perceived ease of use and perceived usefulness should be taken into consideration by an institution in the pre-implementation stage of an ERP system, especially at Yemeni higher education institutions.
A detailed study of Software Development Life Cycle (SDLC) Models
Sahil Barjtya, Ankur Sharma, Usha Rani
This paper provides a comparative study of the SDLC models and other hybrid methodologies of software development. It presents the advantages, disadvantages and limitations of the existing models, and describes the best uses of these models according to the situation. We describe both traditional and contemporary models: the Spiral Model, Incremental Model and V-Shaped Model are traditional models, while the Rapid Application Development Model and Agile software development fall under the contemporary category.
Efficient Texture Classifier for Hand-dorsa Vein Recognition System using Completed LBTP (C-LBTP) Feature Descriptor
C. Premavathi, Dr. P. Thangaraj
A hand-dorsa vein recognition system is a biometric authentication system that uses inherent physiological characteristics to identify individuals. Texture description and classification are important feature analysis methods in hand-dorsa vein recognition. In this paper, a new feature description method, Completed LBTP (C-LBTP), is proposed to represent selected features of the hand vein image system. C-LBTP combines the features of the Local Binary Pattern (LBP) and the Local Ternary Pattern (LTP); it is a texture descriptor used to extract local information from the input image. To find the efficient patterns among the feature vectors, a new efficient classifier based on minimum distance classification is proposed, and the classification results are checked for accuracy and reliability. The proposed method is evaluated on the NCUT dataset, which contains 2040 images, from Prof. Yiding Wang, North China University of Technology (NCUT) (Wang et al., 2010). Similarity measures of various classification methods, namely Chi-square, Cityblock, Euclidean, Chebychev and Minkowski, are computed and compared for better performance. The experimental results show that the proposed C-LBTP feature descriptor achieves good performance.
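For readers unfamiliar with the building block, here is a minimal sketch of the basic LBP code on which descriptors such as C-LBTP build; the LTP-style ternary thresholding that C-LBTP adds is omitted for brevity.

```python
# Minimal sketch: each pixel is compared with its 8 neighbours, the sign
# bits form an 8-bit LBP code, and a histogram of codes describes texture.
import numpy as np

def lbp_codes(img: np.ndarray) -> np.ndarray:
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left neighbour
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            center = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di, j + dj] >= center:
                    code |= 1 << bit
            codes[i - 1, j - 1] = code
    return codes

img = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
hist = np.bincount(lbp_codes(img).ravel(), minlength=256)  # texture feature
print(hist.sum())   # (8-2) * (8-2) = 36 coded pixels
```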
Experimental Analysis of Long Term Evolution Using Vector Signal Transceiver (VST)
Gaurav Soni, Gaurav Megh
Long Term Evolution (LTE) was introduced by the 3rd Generation Partnership Project and dominates the 4th generation of mobile telecommunication networks. LTE provides significantly increased peak data rates, with the potential for 100 Mbps downstream and 30 Mbps upstream, reduced latency, scalable bandwidth capacity, and backwards compatibility with existing GSM and UMTS technology. Future developments could yield peak throughput on the order of 300 Mbps. In this article, the performance of an LTE system is experimentally evaluated on the VST 5644 using modulation schemes such as QPSK.
A Study on VMM and Resource Allocation Strategies in Cloud Computing Environment
Navdeep Kaur, Pooja Nagpal
Cloud computing is an internet-based approach for providing shared resources on demand, with management of storage, networks, servers, services and applications that requires optimum management effort. Virtual machine migration plays a significant role in improving resource utilization, load balancing of processing nodes, application isolation, fault tolerance in virtual machines, node portability, and maximizing physical server efficiency. The main task is to balance cloud resources for better performance and service to the cloud's end users while, at the same time, serving many users through application deployments in the cloud environment. Cloud users may request or rent resources as they become necessary. This paper provides an overview of the essential components of cloud computing, with a discussion of virtual machine migration and resource allocation in cloud computing, together with their challenges and risks.
This paper presents a Punjabi chunker built using a bootstrapping approach. Bootstrapping is an approach which does not need any external input, which is why it is also known as a self-starting process. It is a semi-supervised technique in which a collection of both labeled and unlabeled data is used: it makes use of unlabeled data by training on a small amount of labeled data. Semi-supervised learning falls between supervised and unsupervised learning. Chunking is the process of breaking long strings of information into units or chunks, and is different from parsing. POS tagging, named entity recognition and sentence breaking are the main NLP applications in which chunking is used. This research work differs from a greedy algorithm because both labeled (trained) and unlabeled data sets are used to build the text chunker for the Punjabi language.
Keywords: Natural Language Processing (NLP), Part of Speech Tagger (POS), Punjabi Chunker.
Implementation and evaluation of optimal algorithms for computing association rule learning
Dr. Gurpreet Singh, Er. Sonia Jassi
Association Rule Mining is one of the most important and well-researched data mining techniques. Association rules, as the name indicates, involve finding correlations among sets of items in a transaction database. The proposed work is based on a study of automobiles and will help sellers and customers in making decisions. The objective is to find the important selling factors that affect vehicle sales by using an association rule mining algorithm. The most famous association rule mining algorithm, Apriori, is used for knowledge discovery. This research work improves the existing Apriori algorithm and reduces some of its drawbacks.
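A minimal sketch of Apriori's level-wise search on toy automobile transactions follows; the attribute values and minimum support are illustrative assumptions, not the paper's dataset.

```python
# Minimal sketch of Apriori: only itemsets whose (k-1)-subsets are all
# frequent are counted at level k.
from itertools import combinations

transactions = [
    {"diesel", "suv", "white"},
    {"diesel", "suv", "black"},
    {"petrol", "hatchback", "white"},
    {"diesel", "suv", "white"},
]
MIN_SUP = 2   # assumed minimum support count

def frequent_itemsets(txns, min_sup):
    items = {frozenset([i]) for t in txns for i in t}
    level = {s for s in items if sum(s <= t for t in txns) >= min_sup}
    result, k = set(level), 2
    while level:
        # Candidate generation: join frequent (k-1)-itemsets
        cands = {a | b for a in level for b in level if len(a | b) == k}
        level = {c for c in cands
                 if all(frozenset(s) in result for s in combinations(c, k - 1))
                 and sum(c <= t for t in txns) >= min_sup}
        result |= level
        k += 1
    return result

for s in sorted(frequent_itemsets(transactions, MIN_SUP), key=len):
    print(set(s))   # e.g. {'diesel', 'suv', 'white'} is frequent (support 2)
```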
Towards achieving Efficient Secure Multi-keyword Searchable encryption over Cloud Data using ECC
Miss Prajakta Dimble, Prof. Pramod B. Mali
Nowadays, data protection is a major issue in the cloud. As cloud computing becomes more and more popular, data outsourcing is also accumulating in large quantities. To protect the privacy of sensitive data, the data needs to be encrypted before it is outsourced; however, this reduces data utilization. To resolve this problem, traditional cryptographic techniques provide effective search over cloud data using keywords only, but they have drawbacks and limitations: they support only Boolean search techniques, and their results can be irrelevant and inaccurate because these techniques do not work on file contents. This makes retrieving the right document difficult. Firstly, a user who has no knowledge of the encrypted cloud data has to go through every retrieved file to find the most relevant one; moreover, the user receives all the files related to the queried keywords, which increases network traffic. To overcome the drawbacks of the existing traditional systems, we provide an efficient solution using the ECC algorithm, which secures sensitive data and increases performance. The mining algorithm provides ranked search, which enhances the usability of the system and returns the most relevant data with the help of keyword frequency.
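A minimal sketch of the keyword-frequency ranking idea follows, shown here over plaintext for clarity; the scheme applies it over encrypted indexes, and the documents and query below are assumptions.

```python
# Minimal sketch: documents are scored by how often the query keywords
# occur and returned best-first (the encryption layer is omitted).
docs = {
    "d1": "cloud storage security cloud encryption",
    "d2": "mobile networks routing",
    "d3": "cloud data outsourcing security security",
}

def ranked_search(query_keywords, documents, top_k=2):
    scores = {}
    for doc_id, text in documents.items():
        words = text.split()
        scores[doc_id] = sum(words.count(k) for k in query_keywords)
    hits = [(d, s) for d, s in scores.items() if s > 0]
    return sorted(hits, key=lambda x: x[1], reverse=True)[:top_k]

print(ranked_search(["cloud", "security"], docs))
# [('d1', 3), ('d3', 3)] -- only matching files, ranked, so the user no
# longer has to scan every retrieved file.
```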
This paper develops a methodology to analyse various parameters of student data and predict the probability of a student being placed in a super dream, dream or mass placement company. The method is based on fuzzy logic. Fuzzy logic is an approach to computing based on "degrees of truth" rather than the usual "true or false" (1 or 0) Boolean logic on which the modern computer is based. It is a form of many-valued logic that deals with reasoning that is approximate rather than fixed and exact, and its applications range from control theory to artificial intelligence. Fuzzy logic normalises a data set, giving values between 0 and 1. In intelligent student analysis, we normalise the student data and then analyse the normalised data; the analysis depends entirely on the data set. In this case, we deal with student data comprising CGPA, branch and attendance, with which we calculate the probability of placement for a particular student. The techniques and the system model are elaborated in this paper.
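A minimal sketch of the normalisation-and-scoring step follows; the membership ranges, attribute weights and band cut-offs are illustrative assumptions, not the paper's calibrated system.

```python
# Minimal sketch: map each attribute to a membership degree in [0, 1],
# then combine the degrees into a placement score (assumed weights).
def membership(value, low, high):
    """Linear membership: 0 at or below `low`, 1 at or above `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def placement_score(cgpa, attendance_pct):
    mu_cgpa = membership(cgpa, 6.0, 9.0)
    mu_att = membership(attendance_pct, 60, 90)
    return 0.7 * mu_cgpa + 0.3 * mu_att     # assumed attribute weights

for cgpa, att in [(8.8, 92), (8.0, 85), (6.2, 60)]:
    s = placement_score(cgpa, att)
    band = "super dream" if s > 0.8 else "dream" if s > 0.5 else "mass"
    print(f"CGPA {cgpa}, attendance {att}% -> score {s:.2f} ({band})")
```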
Research and Improvement on K-means Algorithm Based on Large Data Set
Dr. Gurpreet Singh, Er. Vanshita Sharma
Highway safety is being compromised, and there are insufficient safety measures by which we can examine traffic crashes before they occur. A technique is proposed by which we can pre-process the accident factors; to handle this pre-processing, a clustering technique is used. The existing k-means clustering algorithm is improved, and the improved k-means algorithm is applied to a traffic dataset. This dataset was collected from the National Highway Authority; to populate it, several assessments and surveys of the public and the staff of the National Highway Authority were conducted. The basic aim of this proposed work is to improve highway safety.
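For reference, a minimal sketch of the baseline k-means loop that the paper sets out to improve is given below; the two assumed accident features are illustrative, and the improved seeding and assignment details are in the paper.

```python
# Minimal sketch: the standard k-means assignment/update loop in 2-D.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                      # assignment step
            i = min(range(k), key=lambda c: (p[0] - centers[c][0]) ** 2
                                            + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        for i, cl in enumerate(clusters):     # update step: centroid of cluster
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

# e.g. (vehicle speed km/h, crash severity score) per accident record (assumed)
data = [(40, 1), (45, 2), (42, 1), (90, 8), (95, 9), (88, 7)]
print(kmeans(data, k=2))
```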
All types of liquids require a medium to store them; in a civilized area these storage media are termed liquid storage tanks. These storage tanks are made of different materials and shapes. In the majority of cases, circular liquid storage tanks are used for storing water in industrial and residential areas, and petroleum refineries require circular tanks for storing crude oil, petrol, kerosene, etc. As these are important structures, design engineers need to study the behavior of these tanks under various load conditions, for better performance, to minimize disasters and loss of life during earthquakes, and to keep the tank working even after a disaster. An attempt is made to study the behavior of a liquid storage tank under different cases, namely static analysis, frequency/modal participation analysis, time history analysis and dynamic analysis, by determining responses of the tank such as fluid pressure, hoop stress, principal stresses, tank displacement, natural frequency, mode shapes and the deformed shape due to dynamic fluid pressure and velocity.
A Comparative Study on Influence of Layered Subsoil to the Response of RCC Structure with Different types of Foundations
Roopa K, Manogna H N
Damage caused by past earthquakes has shown that the seismic behavior of a structure depends not only on the response of the superstructure but also on the response of the foundation and the subsoil. The soil-structure interaction effect is prominent for massive structures resting on relatively soft soil or layered subsoil, such as high-rise buildings, nuclear power plants and elevated highways. Soil-structure interaction is one of the major subjects of earthquake engineering and has received increasing attention in recent decades.
Optimized Software Testing Using Genetic Algorithm
Mrs. Sudha Katkuri, Prof. P. Premchand
The objective of software testing is to deliver a quality, reliable software product to the end user. Testing is an important phase of the software development life cycle (SDLC), which is used to trace errors and to check whether the developed software fulfils the user requirements. Effective software testing is required to obtain reliable software, but it is not an easy job: we face issues such as the effective generation of test cases and the prioritization of test cases. Different techniques and methodologies have been proposed for taking care of these issues. The Genetic Algorithm (GA) is an evolutionary algorithm that produces near-optimal solutions to a problem. The aim of this research paper is to apply a Genetic Algorithm to the minimization of test cases and thereby reduce the cost, time and effort needed to deliver good-quality software.
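A minimal sketch of a GA for test-suite minimisation follows; the coverage table, fitness weighting and GA parameters are illustrative assumptions rather than the paper's implementation.

```python
# Minimal sketch: each chromosome is a bit mask over test cases; fitness
# rewards requirement coverage and penalises suite size.
import random

# test case -> requirements it covers (assumed coverage table)
COVERS = [{1, 2}, {2, 3}, {3, 4}, {1, 4, 5}, {5}]
POP, GENS = 20, 40
rng = random.Random(1)

def fitness(mask):
    covered = set().union(*(COVERS[i] for i, b in enumerate(mask) if b), set())
    return 10 * len(covered) - sum(mask)      # coverage first, then brevity

def evolve():
    pop = [[rng.randint(0, 1) for _ in COVERS] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        pop = pop[:POP // 2]                  # selection: keep the fitter half
        while len(pop) < POP:
            a, b = rng.sample(pop[:5], 2)
            cut = rng.randrange(1, len(COVERS))
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < 0.2:            # mutation: flip one bit
                i = rng.randrange(len(COVERS))
                child[i] ^= 1
            pop.append(child)
    return max(pop, key=fitness)

best = evolve()
print("selected tests:", [i for i, b in enumerate(best) if b])
```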
A Comparative Study on Seismic Analysis of Multi-Storey Building with and without Steel Bracings
Nandeesh K M, Guruprasad T N
In general, structures in highly seismic areas may be susceptible to severe damage due to gravity loads and seismic loads. Columns and beams are used to transfer the major portion of the gravity loads and some part of the lateral loads to the earth, but this transfer of loads is not adequate during an earthquake. In this project, a study is conducted to check the performance of different bracing systems [steel bracing] under different conditions. For this study, a G+15-storey RC frame structure, 25 m x 25 m with 5 bays in each direction, has been analyzed for seismic zone V. The soil is considered as hard soil and as soft soil. The FE-based software ETABS is chosen for the study. In total, 13 models are analyzed with different bracing systems and different sections, and the performance is checked by calculating the time period, natural frequency, storey drift and base shear.
A Machine Learning Approach for Early Prediction of Breast Cancer
Younus Ahmad Malla, Mohammad Ubaidullah Bokari
The rapid digitization of data in the healthcare sector has resulted in the collection of mountains of data in various Electronic Health Records (EHRs). As data is the biggest asset of the modern age, its proper utilization in the healthcare sector can lead to the discovery of dreadful diseases well in time, which in turn provides high-quality care to patients at lower expenditure. Breast cancer is a primary cause of death in women, and its precise detection in the early stages is important. Precise results can be achieved through data mining algorithms, and developing machine learning models that help predict the disease can play a vital role in early prediction. These machine learning methods can be used to classify between healthy people and people with a disease. In this project, the said disease is addressed using selected machine learning algorithms in the WEKA tool, and the selected algorithms are evaluated in terms of accuracy so as to select the best classifier for early diagnosis of the disease. In this paper, three different models were applied to the breast cancer dataset: Naïve Bayes, Logistic Regression and Random Forest. Of the three, Random Forest led with an accuracy of 98% and a sensitivity of 99%, followed by Logistic Regression with an accuracy of 96% and a sensitivity of 98%, and finally Naïve Bayes with an accuracy of 91% and a sensitivity of 94%.
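As a minimal sketch of the same three-way comparison, shown in scikit-learn rather than WEKA and on the library's built-in breast cancer dataset (so the scores will differ from the paper's):

```python
# Minimal sketch: cross-validated accuracy and sensitivity (recall) for
# Naive Bayes, Logistic Regression and Random Forest.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
models = {
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "Random Forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    rec = cross_val_score(model, X, y, cv=5, scoring="recall").mean()
    print(f"{name}: accuracy {acc:.3f}, sensitivity {rec:.3f}")
```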