Graph-Based Approach for Multi-Document Summarization
Mr. Vijay Sonawane, Prof. Rakesh Salam
Summarization is the process of reducing a large source document to a shorter version that is easy to read. Document summarization is an emerging technique for understanding the main purpose of any kind of document. Summarization can be either single-document or multi-document: if a summary is generated for a single document, it is called single-document summarization; if a summary is created for multiple related documents, it is called multi-document summarization. In the proposed graph-based multi-document summarization technique, the set of documents is preprocessed, an undirected graph is constructed to calculate the similarity between sentences, a word class is attached to each sentence, sentences are ranked according to word class and sentence similarity, and the top-ranked sentences are included in the summary.
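The ranking step described above can be sketched as follows. This is a minimal illustration only: the word-class weighting is omitted, and scoring each sentence by the sum of its edge weights in the undirected similarity graph is an assumption, not necessarily the authors' exact method.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def rank_sentences(sentences, top_k=2):
    """Build an undirected similarity graph over sentences and rank each
    sentence by the sum of its edge weights; return the top_k sentences
    in their original order."""
    bags = [Counter(s.lower().split()) for s in sentences]
    scores = [(sum(cosine(bi, bj) for j, bj in enumerate(bags) if i != j), i)
              for i, bi in enumerate(bags)]
    top = sorted(scores, reverse=True)[:top_k]
    return [sentences[i] for _, i in sorted(top, key=lambda t: t[1])]
```

A sentence similar to many others gets a high score, so off-topic sentences are excluded from the summary.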
A System for Serving Disability using Facial Expression Recognition
K.Kausalya1, K.Subashini2, M.Priyanga3
In human-human communication the face conveys a great deal of information, and analyzing faces is becoming progressively important in human-computer interaction as well, for example in providing security to users. In the existing system, facial expressions are examined and compared with face images stored in a database. In the proposed system, a Tensor Perceptual Color Framework (TPCF) is used for skin-tone detection: detected colors are converted to grayscale values that are used to classify pixels as skin or non-skin. Based on this detection, features are extracted and emotions are computed, which are then used to initiate an assigned task. The proposed system identifies melancholic, smiling, and angry emotions; when a melancholic emotion is detected, alert messages are sent to the attendants, and when a smiling emotion is detected, the system responds by playing songs, etc.
The most pressing need of today's generation, given the busy lives of the people around us, is a technique that makes life safer and at the same time energy efficient. The concept of automation stems from this very need: the need to control things to make life secure. Such attempts succeed only with the right technology. We have started with this maiden effort of home automation, which encompasses the idea of controlling lights and fans. Its most attractive feature, besides the fact that it runs over the World Wide Web, is that the user end is an Android device, which has emerged as one of today's most widespread and user-friendly platforms. We have combined the Android platform with the Internet to save electricity. This effort can be applied to any electrical appliance, although here we restrict ourselves to switching lights and a fan on and off and controlling the fan's speed. The automation system needs a dedicated computer at the site of automation to serve requests.
Retinal Images Classification using Graph-Based approach for Disease Identification
Divya.K.S1 , Kamala Malar.M2
Retinal images play a vital role in several applications such as disease diagnosis and human recognition. They also play a major role in the early detection of diabetes by comparing the states of the retinal blood vessels. This paper presents an automatic approach for artery/vein (A/V) classification based on the analysis of a graph extracted from the retinal vasculature. The graph extracted from the segmented vasculature is analyzed to decide the type of each intersection point (graph node), and afterwards one of two labels is assigned to each vessel segment (graph link). Final classification of a vessel segment as artery or vein is performed by combining the graph-based labeling results with a set of intensity features.
In cloud storage, file storage is handled by third parties. Files can be integrated so that users are able to access them through centralized management. Due to the great number of users and devices in a cloud network, managers cannot effectively manage the efficiency of storage nodes; hardware is wasted and the complexity of managing files increases. In order to reduce the workload caused by duplicate files, we propose index name servers (INS). INS supports data de-duplication, optimized node selection, server load balancing, file compression, chunk matching, real-time feedback control, IP information, and busy-level index monitoring, which reduces file storage and increases performance. Using INS, files can also be reasonably distributed and the workload decreased.
Experimental Implementation Of Image Restoration Schema Using Inverse Filter Processing Techniques
A. Shakul Hamid*, Dr. S. P. Victor
Image restoration in the image-processing domain provides an analytical way of working with real-time data at different levels of implication. Our experimental setup initially focuses on images with blurring and noise. This paper performs a detailed study of the inverse-filter schema on variants of noisy, blurred images, carried out with the expected optimal output strategies. We implement our experimental image restoration techniques on a real-time object-representation task in the advertisement domain, such as the image clarity required for a sweet stall in Tirunelveli District. We also present algorithmic procedures for the successful implementation of the proposed technique in several sampling domains with a maximum level of improvement. In the near future we will apply the optimal image restoration techniques to the denoising of images.
Optical character recognition (OCR), the original method of character recognition, often gives a poor recognition rate due to errors in character segmentation. Segmentation is a very important task in every OCR system: the system separates scanned handwritten text documents into lines, words, and characters, and the accuracy of an OCR system depends mainly on the segmentation algorithm and the noise-removal technique used. Segmentation of handwritten Devnagari text is difficult compared with printed Devnagari, printed English, or any other printed document because of its structural complexity and larger character set, which contains vowels, consonants, and half-form consonants. This system addresses the segmentation of handwritten documents in Devnagari, the most popular script of the Indian subcontinent, into lines, words, and characters. An artificial neural network technique is mainly used to preprocess, segment, and recognize Devnagari characters.
A Review on Digital Watermarking Techniques, Applications and Attacks
Ms. Komal M.Shukla1, Mr. Ashish K.Mehta2
In today's digital world there is massive use of digital content in every field. Digital documents can be copied and distributed easily to large numbers of people without any cost; people can download audio, image, and video files, share them with friends, and alter their original contents. This makes unauthorized copying of digital information highly likely, so there is an urgent need to prohibit such illegitimate copying of digital media. Digital watermarking (DWM) is the dominant solution to this problem. This paper aims to provide an exhaustive survey of digital watermarking techniques, focusing especially on image watermarking, its types, and its applications in today's world.
HBase Performance Testing On Multi-node Cluster Setup
G. Nataraja Sekhar1, Smt. S. Jessica Saritha2, C. Penchalaiah
HBase is also called the Hadoop database, since it is a NoSQL database that runs on top of Hadoop. It combines the scalability of Hadoop, by running on the Hadoop Distributed File System (HDFS), with real-time data access as a key-value store and the deep analytic capabilities of MapReduce. This article introduces HBase, describes how it organizes and manages data, and then demonstrates how to set up a local HBase environment and interact with data using the HBase shell. Its data model is related to Google's Bigtable and is designed to provide quick random access to huge amounts of structured data, leveraging the fault tolerance provided by HDFS. As part of the Hadoop ecosystem, HBase provides random real-time read/write access to data in HDFS: one can store data in HDFS either directly or through HBase, and data consumers can read and access it randomly via HBase, which sits on top of HDFS and provides read and write access. HBase is a key-value store; more specifically, it is a reliable, distributed, multidimensional, sorted map. HBase stores data as cells: cells are grouped by a row key into something that looks like a row, but cells are stored individually and the storage is sparse. HBase performance testing has been done on a single-node cluster setup, but query performance is slow under a single-node setup; for this reason we introduce a multi-node cluster setup and an improvement of query performance in the HBase cluster environment.
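As a rough illustration of the data model described above, a sorted, sparse, multidimensional key-value map, here is a toy in-memory sketch in Python. It is not a real HBase client; the class and method names are invented for illustration.

```python
import bisect

class TinyHBase:
    """Toy model of HBase's data model: a sorted, sparse, multidimensional
    map of (row key, column, timestamp) -> value."""
    def __init__(self):
        self.rows = {}   # row key -> {column -> {timestamp -> value}}
        self.keys = []   # row keys kept sorted for range scans

    def put(self, row, column, value, ts):
        # Cells are stored individually; absent columns cost nothing (sparse).
        if row not in self.rows:
            bisect.insort(self.keys, row)
            self.rows[row] = {}
        self.rows[row].setdefault(column, {})[ts] = value

    def get(self, row, column):
        """Return the latest version of a cell, like HBase's default get."""
        versions = self.rows.get(row, {}).get(column, {})
        return versions[max(versions)] if versions else None

    def scan(self, start, stop):
        """Range scan over sorted row keys, the core HBase access pattern."""
        lo = bisect.bisect_left(self.keys, start)
        hi = bisect.bisect_left(self.keys, stop)
        return [(k, self.rows[k]) for k in self.keys[lo:hi]]
```

Sorted row keys are what make range scans cheap; versioned cells are why a `get` returns the value with the highest timestamp.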
The objective of designing this robot is simply to assist humans with security tasks in the future. In the present scenario there are many recent developments in robotics and communication on a large scale. The robot takes the form of a vehicle mounted with a webcam, which acquires pictures and sends them to a PC; the movement of the vehicle is controlled by a microcontroller. Our idea is to build a robot that can tackle hostage situations and other conditions too dangerous for human beings, moving humans out of direct exposure to potentially hazardous situations. A robotic system can perform many security and surveillance functions more effectively than humans. The Keil µVision software is used for writing the assembly-level program code for the robot and for transferring the hex files to the microcontroller.
Orthogonal Frequency Division Multiplexing (OFDM) is used as a powerful multiplexing technique in wireless communication to avoid frequency-selective fading and to make better use of the available bandwidth. However, OFDM suffers from a major drawback at the transmitter: the high peak-to-average power ratio (PAPR) of the transmitted OFDM signal. To overcome this drawback, the selected mapping (SLM) technique is commonly employed; it achieves PAPR reduction at the cost of increased computational complexity and side-information requirements. In this work, a modified SLM based on a phase-offset technique is proposed for a MIMO-OFDM system. Conventional SLM (C-SLM) is used as the baseline for comparison. Results show that the proposed technique is more effective in reducing PAPR. In addition, the BER performance of the proposed technique is evaluated and presented.
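A minimal sketch of conventional SLM follows, assuming QPSK symbols on 64 subcarriers and random phase sequences (with the all-ones sequence as the first candidate, so the baseline symbol is always among the candidates). The paper's phase-offset modification and the MIMO extension are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm(symbols, n_candidates=8):
    """Conventional SLM: multiply the frequency-domain symbols by candidate
    phase sequences, take the IFFT of each, and keep the candidate with the
    lowest PAPR. The chosen index is the side information."""
    best_x, best_papr, best_idx = None, np.inf, -1
    for i in range(n_candidates):
        phases = (np.ones(symbols.size) if i == 0
                  else np.exp(1j * rng.uniform(0, 2 * np.pi, symbols.size)))
        x = np.fft.ifft(symbols * phases)
        p = papr_db(x)
        if p < best_papr:
            best_x, best_papr, best_idx = x, p, i
    return best_x, best_papr, best_idx

# One 64-subcarrier QPSK OFDM symbol
bits = rng.integers(0, 4, 64)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
baseline = papr_db(np.fft.ifft(symbols))
x, reduced, idx = slm(symbols)
```

Because candidate 0 is the unmodified symbol, the selected PAPR can never exceed the baseline; more candidates improve the reduction at the cost of more IFFTs, which is the complexity trade-off the abstract mentions.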
Low cost Intelligent Parking management and Guidance System
Nikhitkumar1, Abhijith S2, Shruti Jajoo3, S G Raghavendra Prasad4, Dr. Jitendar Mungara5
In this paper we propose an intelligent parking system which aims to avoid the time and fuel wasted in finding an empty spot in a parking lot. Many existing systems only report the number of empty spots in the parking area, which is insufficient, as the user still has to locate a spot in the lot to park the vehicle. Our system not only indicates the number of empty spots but also guides the driver to the nearest spot through an Android-based application. The system also comprises a booking feature through which the user can reserve a parking spot beforehand.
Though computers are considered machines, they are expected to answer questions in ordinary English, just the way human beings can. Training a computer to answer English-language questions is an interesting and challenging problem. Automatic answering systems fall into two categories: open-domain and closed-domain systems. In this paper an answering scheme is proposed that combines the two. A closed-domain question answering system deals with questions within a specific domain; answers to such questions are not publicly available and cannot be found with a search engine, so they are maintained in a database by the domain expert. During answer retrieval, the best-matching answer found in the database is returned to the user; this matching is performed with a template-matching technique. An open-domain question answering system deals with questions covering all domains, and such systems can rely on search engines to find the required answer. In this paper we combine both approaches and propose an answering scheme that accepts input in general English. The main challenge of the proposed scheme is thus to understand the English-language question.
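The closed-domain template-matching step might be sketched as below. The knowledge base, the similarity measure (difflib's ratio), and the threshold are all illustrative assumptions, not the paper's exact technique.

```python
import difflib

# Hypothetical closed-domain knowledge base maintained by the domain expert.
QA_TEMPLATES = {
    "what are the college working hours": "9 am to 5 pm, Monday to Friday.",
    "who is the head of the cse department": "Dr. Rao heads the CSE department.",
    "when does the semester begin": "The semester begins on 1 July.",
}

def answer(question, threshold=0.75):
    """Return the stored answer whose question template best matches the
    input; fall back to the open-domain path (here, a placeholder) when
    no template is close enough."""
    q = question.lower().strip(" ?")
    best = max(QA_TEMPLATES,
               key=lambda t: difflib.SequenceMatcher(None, q, t).ratio())
    score = difflib.SequenceMatcher(None, q, best).ratio()
    return QA_TEMPLATES[best] if score >= threshold else "open-domain search"
```

The threshold is what routes a question between the two subsystems: a close match is served from the database, anything else is handed to the open-domain search-engine path.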
Augmented reality is a technology that brings the digital world to life. It offers a way for a user to interact directly with the digital environment and increases the intuitiveness of applications that would seem unfriendly at first. The technology can be attached to existing applications to make them more realistic and intuitive; one such application is interacting with the computer and accessing information from the Internet, replacing the conventional mouse and keyboard.
Fashion Accessories using Virtual Mirror
Sonal Kachare1, Shrisha Vanga2, Ekta Gupta3, Jay Borade4
In this system the customer has the benefit of shopping online in a smart way by trying on different accessories such as sunglasses, hats, necklaces, etc. using a webcam. In order to place the selected accessory image on the face or another body part automatically, a face-detection algorithm such as Viola-Jones is used. The presence of this unique feature enhances the system and makes the website stand out as completely different from other existing websites.
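Once a face bounding box has been detected (e.g. by a Viola-Jones detector), placing an accessory reduces to simple geometry. The sketch below is a hypothetical placement rule: the 40% eye-line fraction and the aspect ratio are assumptions for illustration, not values from the system.

```python
def place_sunglasses(face_box, glasses_aspect=0.3):
    """Given a face bounding box (x, y, w, h) in image coordinates,
    compute the rectangle where a sunglasses image should be overlaid:
    as wide as the face, centred on an eye line assumed to sit roughly
    40% of the way down the face."""
    x, y, w, h = face_box
    gw = w                               # glasses span the face width
    gh = int(gw * glasses_aspect)        # height from a fixed aspect ratio
    gx = x
    gy = y + int(0.40 * h) - gh // 2     # centred on the assumed eye line
    return gx, gy, gw, gh
```

The overlay rectangle tracks the face box frame by frame, which is what makes the "virtual mirror" effect work in a live webcam feed.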
Laser ignition is considered one of the most promising future ignition concepts for internal combustion engines: it combines the legally required reduction of pollutant emissions with higher engine efficiency. In technical appliances such as internal combustion engines, reliable ignition is necessary for adequate system performance. Economic as well as environmental constraints demand a further reduction in the fuel consumption and exhaust emissions of motor vehicles, and at the moment direct-injection engines show the highest potential for reducing both. The study of the ignition of combustion processes is therefore of great importance. In most cases, a well-defined ignition location and ignition time are crucial for an IC engine. Spark plugs are well suited for such tasks but suffer from disadvantages such as electrode erosion and restricted positioning. Laser ignition meets the thermodynamic requirements of a high compression ratio and a high power density well; it can enhance the combustion process and minimize pollutant formation. At present, however, the laser ignition plug is very expensive and not yet commercially available.
Conventionally, wireless-controlled robots use RF circuits, which have the drawbacks of limited working range, limited frequency range, and limited control. Using a mobile phone for robotic control can overcome these limitations: it provides robust control, a working range as large as the coverage area of the service provider, no interference with other controllers, and up to twelve controls. The key function of these robots is their ability to communicate with humans. Since communication via mobile phones is becoming increasingly common, it is essential that these robots also be able to communicate using mobile phones. In this paper we propose the concept of a home robot and describe the functions that we consider essential for it. Although the appearance and capabilities of robots vary vastly, all robots share a mechanical, movable structure under some form of control. The control of a robot involves three distinct phases: perception, processing, and action. Generally, the perceptors are sensors mounted on the robot, processing is done by the on-board microcontroller or processor, and the action is performed by motors or other actuators.
Documents on the web exist in digital format, and people spend large amounts of time searching with web browsers for useful information. The results returned by search engines are web pages containing records obtained from different web databases. These results can be used further in many applications, such as data collection and price comparison, but for these applications to succeed the search results must be machine-processable; hence it is important that result pages are annotated in a meaningful manner. The annotation process has to consider groups of data and obtain the final annotation after aggregating them. Annotation can be done for web pages, Java, PDF files, text files, XPS, mobile content, images, multimedia, etc. In information retrieval for decision support, an integrated annotation database can be founded on parameters such as document, user, and time. Automatic extraction of data from query result pages is very important for applications such as data integration and meta-querying across multiple web databases.
Securing your computer or laptop and its vital data is one of the most important concerns in today's life. We want to secure the device when we are not using it or are not near it. It often happens that our device password is known by people close to us, who can use it to log in secretly and steal or change vital data. Hence, securing our systems when we are not nearby is an important task. To do so, we need a lock on the system that only we can open. A solution is an additional password alongside the normal Windows login password: a one-time password that changes on every login and acts as extra protection for the system. In addition, using Bluetooth technology we maintain user authentication and communication to track the authenticated user.
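A one-time password layer of the kind described could follow the standard HOTP construction from RFC 4226, sketched here; the shared secret and how the login counter is synchronized are illustrative assumptions, not details from this system.

```python
import hashlib
import hmac

def session_otp(shared_secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): derive a short one-time code from a shared secret
    and a per-login counter, layered on top of the normal login password."""
    msg = counter.to_bytes(8, "big")
    digest = hmac.new(shared_secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the counter advances on every login, an observed code is useless for a later attempt, which is exactly the property the abstract wants from its additional password.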
Implementation of Hadoop Pseudo-distributed Cluster on Android using ‘chroot’
Prof. S. Y. Raut1, Balwant B. Raut, Mangesh S. Jondhale, Nilesh R. Jaware4, Prachil S. Tambe5
In this paper we discuss methods of porting Linux console-based applications to the Android platform, concentrating primarily on the migration of Hadoop from a personal computer to an Android device. There are four different approaches to porting [1]: recompiling Linux binaries against the Bionic C library, running Android under Linux, running Linux in an Android environment, and virtualization with multiple virtualized Linux instances running in parallel. We focus on implementing Linux with the help of the 'chroot' method. This paper specifically discusses the migration of Hadoop to an Android device. Hadoop, an important component of big-data processing, provides a flexible MapReduce implementation; with this port, it becomes easier to minimize the cost of cluster formation. We also discuss pseudo-distributed node formation on the ported Hadoop.
Two Phase Secured Multiparty Sum Computation Protocol (2PSMC) for Privacy preserving data mining
Selva Rathna1, Dr.T. Karthikeyan2
Secured multiparty sum computation is an important algorithm in privacy-preserving data mining for performing aggregate computation on data distributed among multiple parties. In this paper, a new protocol with improved complexity is designed to perform secured sum computation in a multiparty environment, which will enable better algorithms for privacy-preserving data mining processes such as classification, clustering, etc.
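For reference, the classic single-round ring protocol for a secure multiparty sum looks like the sketch below; the paper's two-phase 2PSMC protocol improves on this baseline, and the code does not reproduce it.

```python
import random

def secure_sum(private_values, modulus=10**9):
    """Ring-based secure sum: the initiator masks its value with a random
    R, each party in turn adds its own value to the running total, and the
    initiator removes R at the end. No party ever sees another's raw input,
    only a randomized partial sum."""
    r = random.randrange(modulus)
    running = (r + private_values[0]) % modulus   # initiator masks its value
    for v in private_values[1:]:                  # each party adds its own
        running = (running + v) % modulus
    return (running - r) % modulus                # initiator unmasks
```

The weakness motivating improved protocols is collusion: if the parties before and after a victim compare their partial sums, the victim's value leaks, which is one reason multi-phase variants exist.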
Networks are developing continuously: the number of subscribers is increasing, and new services require more network capacity. Higher capacity demands are tackled with network re-segmentation and node splitting, which lead to more effort and costly on-site re-adjustment of network devices. This not only results in increased risk of human error and network unreliability, but also requires higher operating costs. To reduce this, we need network devices that can detect and automatically fix errors and monitor the whole network; for this we require an intelligent network.
The database system has evolved from a specialized computer application into a central component of the modern computing environment and, as a result, has become an essential part of computer science. A major responsibility of the database administrator is to prepare for the possibility of hardware, software, network, process, or system failure. If such a failure affects the operation of a database system, we must recover the database and return to normal operation as quickly as possible. Recovery should protect the database and its users from unnecessary problems and avoid or reduce the need to redo work manually. Recovery processes vary depending on the type of failure that occurred, the structures affected, and the type of recovery performed. Recovery refers to the various strategies and procedures involved in protecting the database against data loss and reconstructing it after any kind of failure. This paper presents a conceptual-level description of the issues and techniques related to recovery.
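A roll-forward recovery pass over a write-ahead log, one family of strategies alluded to above, can be sketched as follows; the record layout and names are invented for illustration.

```python
def recover_db(checkpoint, log):
    """Roll-forward recovery: start from the last checkpoint and replay
    writes of committed transactions from the write-ahead log; writes of
    transactions that never committed are discarded. Records are
    ("write", txn, key, value) tuples plus ("commit", txn) markers."""
    committed = {rec[1] for rec in log if rec[0] == "commit"}
    db = dict(checkpoint)
    for rec in log:
        if rec[0] == "write" and rec[1] in committed:
            _, txn, key, value = rec
            db[key] = value
    return db
```

Scanning for commit markers first and only then replaying is what guarantees atomicity: a transaction interrupted mid-flight leaves no trace in the recovered state.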
Password Authentication in Social Networking Sites
Mr. Praveen Kumar1, S. Sirisha2, N. Breetha3, M. Lakshmi4
Earlier, plain textual passwords were used for security. A password is a word or string of characters used for user authentication. A typical computer user has passwords for many purposes: logging into accounts, retrieving e-mail, accessing applications, networks, and web sites, and reading newspapers online. The common method, the textual password, is considered secure when long, but long passwords are difficult to remember, so users pick short passwords, which are easily cracked. A fundamental task in security is to create cryptographic primitives based on hard mathematical problems that are computationally intractable. We therefore propose a new password authentication technique using session passwords and color passwords, which gives the user the option to select the password as a color or an alphanumeric grid.
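One common session-password scheme of the kind proposed, pair-based derivation from a per-login character grid, can be sketched as below; the 6x6 grid size and the row/column pairing rule are assumptions, not necessarily the authors' exact design.

```python
import random
import string

def session_password(secret, seed=None):
    """Pair-based session password: a fresh 6x6 grid of the 36 symbols
    A-Z0-9 is shown at every login; for each pair of secret characters,
    the user reads off the cell at (row of first char, column of second).
    The derived password changes with every grid, so observing one login
    reveals nothing reusable."""
    rng = random.Random(seed)                     # seed stands in for the per-login grid
    alphabet = string.ascii_uppercase + string.digits
    cells = rng.sample(alphabet, 36)              # random permutation of the 36 symbols
    grid = [cells[i * 6:(i + 1) * 6] for i in range(6)]
    pos = {ch: divmod(i, 6) for i, ch in enumerate(cells)}
    pw = ""
    for a, b in zip(secret[::2], secret[1::2]):
        row, _ = pos[a]
        _, col = pos[b]
        pw += grid[row][col]
    return pw
```

The secret itself is never typed; only the grid-derived code is, which is what defeats shoulder-surfing of a single session.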
Automated Identification of Diabetic Retinopathy Stages in Digital Fundus Image using CDR and Micro aneurysms
Dhanya C L1, Sanjeev Kubakaddi2, Dr. Shilpa Mehta3
Diabetic retinopathy (DR) and glaucoma are among the commonest complications of diabetes and are leading causes of blindness. Early detection of DR can greatly help treatment: very effective treatments are available and are used optimally when retinopathy is detected early, so screening programs for early detection are essential. Retinal fundus photographs are the main resource for DR screening. In this work we locate the optic disc and optic cup using the k-means algorithm, morphological operations, and the watershed transform. The segmented optic disc and cup are then used to compute the cup-to-disc ratio for DR screening, and microaneurysms are used to grade the disease. Microaneurysms are a clinical sign of DR; they appear as small red dots on retinal fundus images, and their detection can be used to grade DR into different stages.
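Once segmentation is done, the screening measurements described above reduce to simple arithmetic. The sketch below assumes pixel-area inputs, and the grading thresholds are purely illustrative, not clinical values from the paper.

```python
def cup_to_disc_ratio(cup_area_px, disc_area_px):
    """Cup-to-disc ratio (CDR) from segmented optic cup and disc areas
    in pixels; a high CDR is commonly treated as a glaucoma-suspect sign."""
    if disc_area_px <= 0:
        raise ValueError("disc area must be positive")
    return cup_area_px / disc_area_px

def grade_dr(microaneurysm_count):
    """Toy grading of diabetic retinopathy by microaneurysm count; the
    thresholds here are illustrative only."""
    if microaneurysm_count == 0:
        return "no DR"
    if microaneurysm_count <= 5:
        return "mild"
    if microaneurysm_count <= 15:
        return "moderate"
    return "severe"
```

In a full pipeline, the cup and disc masks from the watershed step feed the first function, and the detected red-dot count feeds the second.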
Nowadays the use of the Internet is increasing rapidly. For a broad topic, each new user may have different goals, so inferring and analyzing user search goals can improve the efficiency of a search engine and reduce search time, since unwanted data can be hidden and the user gets only goal-oriented results. Currently everyone searches the Internet, and because it contains so much information it returns ambiguous results for the same query. The proposed system provides information related to the user's goals. In this paper we present a novel framework to discover user search goals by clustering them, and a new approach that generates pseudo-documents to represent the clusters effectively. Finally, we propose a novel criterion, CAP, to evaluate the performance of the search engine.
In this paper we present Stream-Me, a mobile-enabled streaming system architecture that integrates local NAS storage, cloud storage, and peer-to-peer storage. Our aim is to provide users with video content on their mobile devices anywhere and at any time. Keywords: cloud computing; multimedia server; multimedia streaming; P2P network.
Comparative Performance Analysis of Removal of Impulse Noise using Different Methods
Namrata Shelke1, Prof. Sumit Sharma2
In this research paper we introduce a comparative performance analysis of the removal of random-valued impulse noise using different filters. Removing random-valued impulse noise from digital images while preserving edges is one of the challenging tasks in digital image processing, and eliminating noise from digital images plays a vital role. Effective detection of noisy pixels based on the median value, and efficient algorithms for the estimation and replacement of noisy pixels, are compared across methods in this paper. We compare different types of de-noising algorithms: the median filter (MF) [4], the centre-weighted median filter (CWM) [6][15], the progressive switching median filter (PSMF) [12], the signal-dependent rank-order median filter (SDROM) [9][14], the adaptive centre-weighted median filter (ACWM/ACWMF) [10][11], and the tri-state median filter (TSM) [13]. A high-performing detection stage for noisy pixels makes these methods suitable for high-density random-valued impulse noise. The methods are compared on the basis of PSNR and MSE values.
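A minimal switching median filter, the family to which several of the compared methods belong, can be sketched as follows; the 3x3 window and the fixed detection threshold are simplifying assumptions, not any one of the cited algorithms.

```python
def switching_median_filter(img, threshold=40):
    """Simple switching median filter: a pixel is replaced by the median
    of its 3x3 neighbourhood only when it deviates from that median by
    more than `threshold` (i.e. it is detected as impulse noise); clean
    pixels are left untouched, which preserves edges. `img` is a 2D list
    of ints; the one-pixel border is copied unfiltered."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            med = window[4]                       # median of 9 values
            if abs(img[y][x] - med) > threshold:  # detection stage
                out[y][x] = med                   # replacement stage
    return out
```

The detection stage is exactly what the compared methods refine (adaptive weights, progressive passes, rank-order statistics); the plain median filter is the degenerate case with threshold 0.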
Dual-Broadband Planar Antenna for 2G/3G/LTE Base Stations
M.Ramya1, Dr.A.Kannammal2
A novel dual-broadband planar antenna is proposed for 2G/3G/LTE (4G) mobile base stations. The proposed antenna consists of two elements, one for the upper band and another for the lower band, and can be arrayed without grating lobes. The lower band uses a pair of printed dipoles with a pair of parasitic elements for bandwidth enhancement; the upper band consists of a pair of folded dipoles. Both the dipoles and the microstrip line are etched on the same substrate, and the upper-band elements are nested inside the lower-band elements, forming a compact structure. The dual-broadband antenna achieves a bandwidth of around 2 GHz and a gain of around 12 dBi, which is suitable for mobile-communication base-station systems.
In this paper we present a methodology for analyzing contexts to find context information from web search history in a personal information management system. Context annotation and context degradation for context analysis are presented in the methodology, and hierarchical similarity finding between context attributes and context instances is implemented.
A Classification of Various Unicast and Multicast Routing Protocols in MANET
M.Sravan Kumar Reddy , Vikram Narayandas
An ad hoc network is a multi-hop wireless network consisting of a number of mobile nodes; mobile ad hoc networks may be used in areas with little or no communication infrastructure. This paper covers various routing protocols in MANETs. Our work presents a classification of routing protocols into unicast and multicast routing protocols in MANETs, and finally our review focuses on different tree-based routing protocols in ad hoc networks. Mobile ad hoc networks are networks in which routing is based on multi-hop routing from a source to a destination node or nodes, and each protocol is designed to perform its task as well as possible according to its design criteria. Routing in a MANET is a critical task due to the highly dynamic environment, and recently several routing protocols have been proposed for mobile ad hoc networks. A range of literature relating to the field of MANET routing was identified which highlights existing protocols as well as current thinking within the field and the directions researchers are moving in the future.
Role of Software Quality Assurance in Software Development
Deepak Kumar1, Parul2
The software industry is one of the leading industries. The scope of software applications is increasing day by day, as is our dependency on them: we see software used in medical laboratories, pharmaceutical laboratories, maintaining the databases of different organizations, helping sportspeople by visualizing their performance, and so on. So it is the prime responsibility of the software industry to provide efficient and durable products to its customers. It is a well-known fact that every organization sets out to achieve profit in its business, and the same holds in the software industry. This paper presents the various aspects of the scope of software applications through their impact on software quality.
Conventional visual secret sharing (VSS) schemes hide secret images in shares that may be printed on transparencies or encoded and stored in digital form. The shares can appear as noise-like pixels or as meaningful images, but either way they arouse suspicion and increase the interception risk during transmission. Hence, VSS schemes suffer from a transmission-risk problem, both for the secret itself and for the participants involved in the scheme. To overcome this problem, we propose a natural-image-based VSS scheme (NVSS scheme) which shares secret images via diverse carrier media to protect the secret and the participants during the transmission phase. The proposed (n, n)-NVSS scheme can share a digital secret image over n-1 arbitrarily selected natural images (called natural shares) and one noise-like share. The natural shares can be photos or hand-painted pictures in digital or printed form; the noise-like share is generated from these natural shares and the secret image. The unaltered natural shares are diverse and inoffensive, thus considerably reducing the transmission risk.
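The share-generation idea, deriving one noise-like share by combining features of the unaltered natural shares with the secret, can be caricatured with XOR as below; hashing stands in for the paper's feature-extraction step, so this is an assumption-laden sketch, not the NVSS algorithm itself.

```python
import hashlib

def make_noise_share(secret: bytes, natural_images: list) -> bytes:
    """(n, n)-style sketch: features extracted (here, hashed) from the
    n-1 unaltered natural images are XOR-ed into the secret to produce
    the single noise-like share. The natural images travel as-is."""
    noise = bytearray(secret)
    for img in natural_images:
        feat = hashlib.sha256(img).digest()      # stand-in feature vector
        for i in range(len(noise)):
            noise[i] ^= feat[i % len(feat)]
    return bytes(noise)

def recover(noise_share: bytes, natural_images: list) -> bytes:
    """XOR is its own inverse, so re-deriving the same features from the
    same natural images recovers the secret."""
    return make_noise_share(noise_share, natural_images)
```

All n shares are needed: with any natural image missing, its feature stream cannot be re-derived and the XOR never cancels, which is the (n, n) threshold property.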
The increasing volume of multimedia increases the variety of multimedia data, such as images, video, voice, and text. These data need more space for storage and more bandwidth for transmission, so lossless image compression techniques are used for flexible storage and transmission. Lossless image compression is very efficient but faces the problem of limited compression ratio. Fractal transform functions are used in a series of image compression methods; the fractal transform works by virtue of symmetry, and the symmetric nature of the fractal wavelet transform provides a block-level image compression technique. But when the computational process is poor, the performance of the fractal transform function decreases. This paper presents a review of lossless image compression based on fractal transform functions using a block-coefficient technique.
The capabilities of multi-agent systems lead us to use their features to design an approach for constructing a Fresher Recruitment System (FRS). Since no single agent can perform the complex activities of matching candidate requirements, we use a distributed problem-solving approach and combine it with multiple agents to perform each step of building the FRS. The system receives an input request from a company for the recruitment of candidates and then provides a list of eligible candidates for the company's recruitment process on the basis of the pre-conditions and parameters given by the company. We also use some cognitive parameters which make our agents proactive in selecting first the zone, then the state and then the colleges.
Towards an understanding of Li-Fi: Next generation Visible Light Communication Technology
Deepali Bajaj1, Isha Mangal2, Asha Yadav3
Li-Fi (Light Fidelity) is an optical version of Wi-Fi (Wireless Fidelity) technology. It is based on the concept of visible light communication (instead of radio-frequency waves). The technology was introduced by the German physicist Harald Haas; the idea is to communicate through illumination, i.e., to use LED light bulbs for data transmission. This paper gives an overview and the working principle of the technology, including its advantages, disadvantages and applications. Further, Li-Fi is compared with other existing communication technologies, and the scope of Li-Fi in the existing connectivity scenario is discussed.
The objective of this paper is to develop a Wi-Fi-controlled surveillance robot using a wireless network. We develop a robot that can transfer video captured by its camera to a nearby laptop connected to it via the wireless network. This type of robot is mainly used for security purposes, for example in coal mines and tunnels where human intervention is impossible. The robot can be navigated from the server system via the wireless network. The purpose of this surveillance robot is to provide multi-angle monitoring of the environment with inexpensive hardware and free software. The robot is fitted with IR sensors to enhance its ability to detect and avoid collisions.
Detection of Nutrients and chemicals in food products using sensors in smart phones
Prof. Sukhesh Kothari, Prof. Hemlata Channe
Sensors are physical devices whose function is to sense/detect changes in their environment and provide a corresponding output. There are different types of sensors available, used for discrete applications. A biosensor is a device used to detect nutrients (protein, vitamins, etc.) or chemicals (antioxidants, carbon contents, etc.). This paper presents the idea of building a cost-effective device using sensors to detect the nutrient and chemical contents in foods such as vegetables and fruits, with the output displayed on smartphones.
Comparative Seismic Analysis Of An Irregular Building With A Shear Wall And Frame Tube System Of Various Sizes
B.N.Sarath, D.Claudiajeyapushpa
Lateral load effects on high-rise buildings are quite significant and increase rapidly with height. In high-rise structures, the behavior of the structure is greatly influenced by the type of lateral system provided, so the selection of an appropriate one is important. The selection depends on many aspects, such as the structural behavior of the system, economic feasibility and availability of materials. A few of the lateral structural systems are the shear wall system, braced frame system, framed tube system, tube-in-tube system and bundled tube system. The lateral structural systems give the structure stiffness, which considerably decreases lateral displacements. In the present work, a plain frame system, a shear wall system and a framed tube system are considered for 30-, 40-, 50- and 60-story structures. The analysis has been carried out using the software ETABS. The roof displacements, internal forces (support reactions, bending moments and shear forces) of members and joint displacements are studied and compared. It is seen that the shear wall system is very effective in resisting lateral loads for structures up to 30 stories, while for structures beyond 30 stories the framed tube system is more effective than the shear wall system.
Category, Strategy and Validation of Software Change Impact Analysis
Zeba Mahmood*, Rakesh Bharti, Tahera Mahmood
Impact analysis is defined as the process of identifying the potential consequences (side-effects) of a change, and estimating what needs to be modified to accomplish a change. We propose a UML model-based approach to impact analysis that can be applied before any implementation of the changes, thus allowing an early decision-making and change planning process. We first verify that the UML diagrams are consistent (consistency check). Then changes between two different versions of a UML model are identified according to a change taxonomy, and model elements that are directly or indirectly impacted by those changes (i.e., may undergo changes) are determined using formally defined impact analysis rules (written with Object Constraint Language). A measure of distance between a changed element and potentially impacted elements is also proposed to prioritize the results of impact analysis according to their likelihood of occurrence.
Differences and Problems of Task Scheduling Algorithms - A Survey
Mohit Pal Singh Birdi, Kapil Kumar, Abhinav Hans, Navdeep Singh
Cloud computing is a computing paradigm in which applications, resources and services are provided over the internet. Software and hardware can be used on a pay-per-use basis, without buying them. The key role of scheduling is to manage different tasks in different cloud environments. Cloud computing service providers use the available resources efficiently to achieve maximum profit, which makes task scheduling a challenging issue for cloud service providers. This paper gives an introduction to cloud computing, surveys various existing scheduling algorithms in different task-scheduling environments, and discusses existing problems and future suggestions for the existing algorithms.
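As one concrete example of the scheduling problem the survey discusses, a common greedy heuristic (longest task first onto the least-loaded machine, a simplified stand-in for real cloud schedulers) can be sketched as:

```python
def greedy_schedule(tasks, n_machines):
    """Longest-task-first greedy: assign each task to the least-loaded machine."""
    loads = [0.0] * n_machines
    assignment = {}
    for task_id, length in sorted(tasks.items(), key=lambda kv: -kv[1]):
        m = loads.index(min(loads))       # least-loaded machine (VM)
        loads[m] += length
        assignment[task_id] = m
    return assignment, max(loads)         # makespan = heaviest machine load

# Hypothetical task lengths (e.g., estimated execution times in seconds).
tasks = {"t1": 4, "t2": 2, "t3": 8, "t4": 3}
assignment, makespan = greedy_schedule(tasks, 2)
```

This heuristic is not optimal in general, which is precisely why the surveyed literature proposes more sophisticated algorithms.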
Speech Enhancement Using Beamforming
Dr. G. Ramesh Babu1, D. Lavanya2, B. Yamuna2, H. Divya2, B. Shiva Kumar2, B. Ashok Kumar2
The problem of enhancing speech degraded by uncorrelated additive noise, when the noisy speech alone is available, has recently received much attention. Beamforming is one possible method of speech enhancement, because the beamformer minimizes the output signal power while maintaining signals from the desired direction. Beamforming techniques approach the problem from a spatial point of view: a microphone array is used to form a spatial filter which can extract a signal from a specific direction and reduce the contamination from signals arriving from other directions. In this paper we survey some beamforming techniques used to minimize the noise power in the output signal.
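A delay-and-sum beamformer, the simplest of the surveyed techniques, can be illustrated for the broadside case (zero steering delay) with synthetic data; the array gain shows up as a drop in residual noise power:

```python
import math, random

random.seed(7)
M, N = 8, 2000                      # microphones, samples
signal = [math.sin(2 * math.pi * 0.01 * n) for n in range(N)]

# Broadside arrival: zero inter-mic delay, independent noise per microphone.
mics = [[s + random.gauss(0, 1.0) for s in signal] for _ in range(M)]

# Delay-and-sum: align (here: no delay needed) and average across the array.
output = [sum(mic[n] for mic in mics) / M for n in range(N)]

def noise_power(x):
    return sum((xi - si) ** 2 for xi, si in zip(x, signal)) / N

single_np = noise_power(mics[0])      # one microphone alone
beamformed_np = noise_power(output)   # after delay-and-sum
```

With M independent noise sources, the residual noise power drops by roughly a factor of M while the desired-direction signal is preserved.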
Hop Voting scheme with Voting tag Verification Scheme for wireless Networks
Dr. M. Senthamil Selvi1, Adhars Ram B², Ashok K³, Babu V4
Network coding is a process in which an intermediate node encodes incoming packets before forwarding them. In the existing system, the routing and scheduling components of the algorithm are decoupled by designing a probabilistic routing table that is used to route packets to per-destination queues. The back-pressure algorithm, while being throughput-optimal, is not useful in practice for adaptive routing since its delay performance can be very bad; moreover, using it can lead to deadlock, high power consumption and weak security. We propose a system that implements a novel HOP VOTE scheme along with a tag-based authentication scheme for effective pollution-attack detection, recovery and blocking. This allows a node to verify whether a received packet belongs to specific rule criteria, even if the encryption key changes over time. Data is routed based on the id and tag creation, and with the tag-encoding scheme the communication overhead can be reduced. The scheme is fast and reliable, and it further improves the security, throughput and efficiency of the system.
Mobile phone cloning is a technique wherein security data from one mobile phone is transferred into another phone, making the second phone an exact replica (clone) of the original. As a result, while calls can be made from both phones, only the original is billed. Though communication channels are equipped with security algorithms, cloners get away with it through loopholes in the systems; so when one receives a huge bill, the chances are that the phone is being cloned. This paper discusses: (1) cloning of CDMA cell phones; (2) a pattern-recognition technique used to classify telephone users into classes, in order to identify when a call does not correspond to the patterns of a specific user; and (3) a distributed-object technique used for the implementation of this distributed system (i.e., manager and agents).
Data transfer through power lines is a technology that sends data through electric lines along with the electric current. We study the applicability of data transmission over power lines for home automation. This study builds on problems identified by previous research: two major problems, cable attenuation and excessive noise level, are eliminated in our project. We designed this technology in two phases: in the first phase we run a performance analysis of the technology, and in the second we build a system that represents real-life applications, including security issues. In data transfer through power lines we transmit an eight-bit command over the 230 V, 50 Hz supply. The project provides large coverage, since power lines are already installed everywhere.
Designing Of Novel Low Power Signed And Unsigned Multiplier Using 180nm CMOS Technology In CADENCE
V.G.Santhi Swaroop1, E.Pavani2, Ch.Vasundhara3
Power consumption is the bottleneck of system performance in VLSI design. Minimizing the power consumed by a circuit tends to improve the performance and reduce the cost of the system. Power consumption is mainly due to the increased number of transistors and leakage power; the transistor count and leakage power are reduced by using the Gate Diffusion Input (GDI) technique. Multipliers are vital components of any processor or computing machine, and the performance of microcontrollers and digital signal processors is evaluated on the basis of the number of multiplications performed in unit time. Hence, better multiplier architectures are bound to increase the efficiency of the system. This paper presents different types of multiplier architectures based on the GDI technique, and a low-power Baugh-Wooley multiplier is proposed for both unsigned and two's-complement signed multiplication. The full architecture is designed in 180nm CMOS technology using the CADENCE tool, and its power dissipation is analyzed.
Encryption Based Access Control Model In Cloud : A Survey
Rachana Chavda, Rajanikanth Aluvalu
Cloud computing is often described as "utility" computing. It enables users to remotely store their data on a server and provides services on demand. Since this computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. We can increase the security of data access in the cloud, and moreover we can encrypt the data so that third parties cannot use it. In this paper we review various encryption-based access control models for enhancing cloud security, along with their limitations, and conclude with a proposed access control model to enhance cloud security.
Motion Reconstruction Of 3-D Objects From 2-D Correspondences
Vikas Kumar, Lokesh Kashyap, Nalin Chaudhary
This paper addresses the problem of recovering 3D non-rigid shape models from image sequences. For example, given a video recording of a talking person, we would like to estimate a 3D model of the lips and the full head and its internal modes of variation. Many solutions that recover 3D shape from 2D image sequences have been proposed; these so-called structure-from-motion techniques usually assume that the 3D object is rigid. Previous work has treated the two problems of recovering 3D shapes from 2D image sequences and of discovering a parameterization of non-rigid shape deformations separately, and most techniques that address the structure-from-motion problem are limited to rigid objects.
OFDM is a bandwidth-efficient multicarrier modulation in which the available spectrum is divided into subcarriers, each carrying a low-rate data stream. OFDM has gained tremendous interest in recent years because of its robustness under severe multipath channel conditions with simple equalization, its robustness against inter-symbol interference (ISI) and multipath fading, and its high spectral efficiency. However, the peak-to-average power ratio (PAPR) is a major drawback of multicarrier transmission systems such as OFDM: a high PAPR increases the complexity of analog-to-digital (A/D) and digital-to-analog (D/A) converters and lowers the efficiency of power amplifiers. The aim of this work is to investigate the OFDM scheme, realize a fully functional system in software, and analyze how the peak-to-average power ratio can be reduced. Different PAPR-reduction techniques developed for OFDM are analyzed with respect to computational complexity, bandwidth expansion and error performance. In this work, a new companding transform technique is developed for PAPR reduction of the OFDM system, which guarantees improved BER performance while reducing the PAPR effectively and efficiently.
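The PAPR metric and the effect of a companding transform can be illustrated with a toy multicarrier signal. The µ-law compander below is a generic textbook example, not the specific transform developed in this work:

```python
import math

def papr_db(x):
    """Peak-to-average power ratio in dB."""
    peak = max(abs(v) for v in x) ** 2
    avg = sum(v * v for v in x) / len(x)
    return 10 * math.log10(peak / avg)

# Toy multicarrier signal: N phase-aligned subcarriers give a large peak at t=0.
N, T = 16, 1024
ofdm = [sum(math.cos(2 * math.pi * k * t / T) for k in range(1, N + 1))
        for t in range(T)]

def mu_law(x, mu=8.0):
    """mu-law companding: boost small amplitudes, leave the peak unchanged."""
    vmax = max(abs(v) for v in x)
    return [math.copysign(vmax * math.log(1 + mu * abs(v) / vmax)
                          / math.log(1 + mu), v) for v in x]

before = papr_db(ofdm)           # about 15 dB for this toy signal
after = papr_db(mu_law(ofdm))
```

Because companding raises the average power while leaving the peak fixed, `after` is strictly smaller than `before`; the receiver applies the inverse expansion.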
In most universities, results are published on the web or sent as PDF files. Currently many colleges use a manual process to analyze the results: the college staff has to fill in the student result details by hand and then compute the rankings accordingly. Our proposed system extracts the data automatically from PDF files and the web, creates a dynamic database and analyzes the data; for this, the system makes use of a PDF extractor, pattern-matching techniques, data mining, web mining and sorting techniques.
A Scalable Two-Phase Bottom-Up Specialization Approach for Data Anonymization Using MapReduce on Cloud
Dilipprasad.E, Ajay.R, Mr. K.Durairaj
A scalable two-phase bottom-up specialization approach to anonymizing large-scale data sets using the MapReduce framework on cloud is an emerging trend and requirement today. People do not want to share their sensitive information with unauthorized users, so they want to hide that information from unauthorized persons. Techniques such as k-anonymity, MapReduce and data anonymization are used, but existing methods also have many drawbacks, concerning privacy, third-party access and so on. In this paper we examine how bottom-up specialization for data anonymization using MapReduce on cloud is used to preserve the privacy of users' data.
Performance Analysis of an Enhanced AODV Routing Protocol
Vipin Kumar Sahu1, Prof. Aishwarya Mishra2
This paper presents a modified AODV protocol (MAODV), designed to improve the performance of mobile ad hoc networks. The paper describes the characteristics, working and deficiencies of the AODV routing protocol and of the modified AODV routing protocol. Finally, using NS2 simulation, the MAODV routing protocol is compared with the AODV routing protocol in terms of packet delivery ratio, end-to-end delay, routing load and throughput. Experimental results show that the MAODV routing protocol is more advantageous than the AODV routing protocol.
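The first two comparison metrics are straightforward to compute from a simulation trace; a minimal sketch over a hypothetical packet trace (times in seconds, `None` meaning the packet was dropped):

```python
# Each record: (packet_id, send_time, recv_time or None if dropped).
trace = [(1, 0.00, 0.12), (2, 0.10, None), (3, 0.20, 0.35),
         (4, 0.30, 0.41), (5, 0.40, None)]

delivered = [(s, r) for _, s, r in trace if r is not None]
pdr = len(delivered) / len(trace)                              # packet delivery ratio
avg_delay = sum(r - s for s, r in delivered) / len(delivered)  # end-to-end delay
```

Routing load (control packets per delivered data packet) and throughput are computed analogously from the NS2 trace file.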
Comparative evaluation of Google Glass, Vuzix M100 & Epson Moverio BT-200
Vijay Sanjos Alexander, Venkatesh Babu
Smart glasses are wearable devices which have computing capabilities and are worn like spectacles. This white paper provides a comparative study of the leading wearable glasses, including key features and a technical-specification comparison, and gives some insights into possible challenges during adoption. The paper also highlights the typical challenges faced by developers while developing applications for such eyewear.
RTOS Based Fault Diagnosis And Efficient Data Transfer On Priority Basis Using Can
K.T Indhupriya B.E, G Balaji M.Tech
CAN is a broadcast communication protocol used in automotive applications. CANalyzer is a software tool used for analysis and simulation of bus communication: it checks whether, and what type of, communication is occurring on the bus, and can send or log data. However, CANalyzer is a high-cost tool and is not suitable for small applications. This paper demonstrates the implementation of priority-based message transfer using the CAN protocol, and also detects faulty messages in the system, without using CANalyzer. The system uses the µC/OS-II RTOS and an ARM processor.
A Comparative Analysis of Traditional RDBMS with MapReduce and Hive for E-Governance system
Mr.Swapnil A. Kale 1, Prof.S.S.Dandge
India is moving towards digitization. Most state and central governments are digitizing their departments, due to which the use of E-governance applications has increased; as a result, data is being added in large volumes daily. Hadoop and MapReduce are used to handle these large volumes of variable-size data, since processing and sharing such large data by traditional methods is difficult. What is important is the speed at which one can process that data and extract actionable business-intelligence information. In this paper we present a comparison of traditional RDBMS with the MapReduce technique and the use of HiveQL to implement MapReduce on large databases. The paper also covers the use of the Sqoop connector to connect SQL and Hadoop, and illustrates how big data can transform government through increased efficiency and effectiveness of E-governance services.
Microprocessor performance has improved rapidly in recent years. In contrast, memory latencies and bandwidths have improved little, with the result that memory access time has become a bottleneck limiting system performance. A memory controller (MC) is designed and built to attack this problem. The memory controller is the part of the system that controls the memory, and it is normally integrated into the system chipset. This paper shows how to build an Advanced Microcontroller Bus Architecture (AMBA)-compliant MC as an Advanced High-performance Bus (AHB) slave. The MC is designed for system memory control, with main memory consisting of SRAM and ROM. Additionally, the problems met in the design process are discussed and their solutions given.
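The core of such an MC's AHB slave interface is address decoding: mapping an incoming HADDR onto ROM or SRAM, or signaling an error response. A behavioral sketch in Python (the memory map below is hypothetical, not the paper's actual map):

```python
# Hypothetical memory map: 32 KB ROM at 0x0000_0000, 64 KB SRAM at 0x2000_0000.
ROM_BASE, ROM_SIZE = 0x0000_0000, 0x8000
SRAM_BASE, SRAM_SIZE = 0x2000_0000, 0x10000

def decode(haddr):
    """Return (memory, offset) for an AHB address, or None (error response)."""
    if ROM_BASE <= haddr < ROM_BASE + ROM_SIZE:
        return "ROM", haddr - ROM_BASE
    if SRAM_BASE <= haddr < SRAM_BASE + SRAM_SIZE:
        return "SRAM", haddr - SRAM_BASE
    return None  # in RTL: drive an AHB ERROR response on HRESP

assert decode(0x1000) == ("ROM", 0x1000)
```

In the actual RTL this logic would be combinational, with the selected device's timing driving HREADY.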
Maintainability Assurance in Software Product Lines: An Activity-Based Model
Manoj Nainwal1, Anurag Awasthi2
Management of software maintainability in software product lines is still a problematic area. As maintainability is a very important quality attribute, there has to be a comprehensive basis for assessing and improving it. Several quality models have been proposed to quantify maintainability; nevertheless, existing approaches are not activity-based. We propose a set of activities and a set of facts to calculate maintainability in an SPL (Software Product Line) environment. Facts have a cascaded effect on activities, which in turn affect the sub-factors of maintainability. We have conceptualized the set of activities and facts in the form of an activity-based model.
Design and Analysis of Square and Circular type fractal Shapes Microstrip Patch Antenna
Dr.Yogesh Bhomia1, Dr. S.V.A.V.Prasad2,Pradeep Kumar3
This paper presents a design of a microstrip patch antenna combining circular and square slots, obtained by cutting different slots in a rectangular microstrip antenna, and studies it experimentally in IE3D software. The design is achieved by cutting multiple shapes in a square pattern, combined with circular and square slots, and placing a microstrip line feed. The design has been studied over three iterations. The radiation pattern of the proposed microstrip antennas is maintained because of the self-similarity and centro-symmetry of the fractal shapes. The fractal-shaped patch antenna is designed on an FR4 substrate of relative permittivity 4.4 and thickness 1.524 mm, mounted 6 mm above the ground plane. Details of the measured and simulated results of the case-by-case iterations are presented and discussed.
Authentication at Single-Point-of-Access in Cloud Environments
Rizwan Ahmed*, Mr. Imran Ijaz
Cloud computing is a rising innovation that is gaining attention in academia and business. Resources are pooled and shared with clients on demand. In order to provide better execution in a cloud environment, task planning is an important issue. The objective of this study is to explore the different environments of the cloud infrastructure with respect to security. It is a big challenge for a cloud service provider to implement security while meeting the highest level of service, providing quality service and a secure environment to the clients; this is how cloud service providers build trust relationships with their clients. The proposed model optimizes the security already implemented in cloud environments along with quality of service, in terms of required bandwidth and required infrastructure such as SaaS, PaaS or IaaS. To meet both quality of service and security, a cloud service provider has to do a lot of work, as the cloud has multiple integrated infrastructures. It would be a big loophole if a cloud service provider gave clients access to its actual servers or live IPs in order to provide services; there would be a loss of resources, time and availability for legitimate users. The primary objective of the study is therefore to build a model in which unauthenticated and unauthorized users are blocked at a single point, separate from the provider of the SaaS, PaaS or IaaS services, so that the resources used to authenticate and authorize a user within the cloud are reserved for legitimate users.
Our proposed system uses a clustering approach that organizes a large quantity of unstructured text documents into a small number of meaningful and coherent clusters. Similarity measures are compared and analyzed in partitional clustering of text document datasets. Clustering provides extraction, fast retrieval and filtering of information. In document clustering, clustering methods can be used to automatically group the retrieved documents into a list of useful categories. Document clustering uses sample documents as descriptors and performs descriptor-based retrieval from a set of text documents; the descriptors are the sample documents in reference to which clusters are formed.
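A minimal partitional-clustering pass over toy documents (term-frequency vectors, cosine similarity, one k-means-style assignment step with hand-picked seed documents as the descriptors) might look like:

```python
import math
from collections import Counter

docs = ["cat dog pet", "dog cat animal pet",
        "stock market money", "market trade money"]
vecs = [Counter(d.split()) for d in docs]   # term-frequency vectors

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Descriptors: the first and third documents serve as seed centroids.
centroids = [vecs[0], vecs[2]]
labels = [max(range(2), key=lambda c: cosine(v, centroids[c])) for v in vecs]
```

A full partitional algorithm would recompute centroids and iterate to convergence; this single pass already separates the animal documents from the finance documents.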
Survey of Electronic Countermeasures, Attacks and Techniques in Wireless Sensor Networks
Sundar R*
As wireless sensor networks continue to grow, so will the need for effective security mechanisms. Because sensor networks may handle sensitive information and/or operate in hostile unattended environments, it is imperative that these security concerns be addressed from the beginning of the system design. However, due to inherent resource and computing constraints, security in sensor networks poses different challenges than traditional network/computer security. There is currently enormous research potential in the field of wireless sensor network security, and familiarity with current research in this field will benefit researchers greatly. With this in mind, we survey the major topics in wireless sensor network electronic countermeasures (ECM) and attacks, present the obstacles and requirements of sensor security, classify many of the current attacks and list their corresponding defensive measures and related work, and finally discuss open challenges.
This paper covers what spoofing and MAC spoofing are, and why people use them; it is a survey of MAC spoofing. The paper gives an overview of spoofing characteristics, vulnerabilities, working and operation, detection of MAC spoofing, and the Address Resolution Protocol. MAC spoofing is computer identity theft, for good or for bad reasons, and it is relatively easy: it forges the address of a NIC (network interface card), which is built into the NIC. MAC spoofing is also relevant to next-generation IT-industry technologies and may apply to future spoofing services.
Mechanism for secure Big data stored within cloud storage by using cloud computing (Secure cloud storage)
Sandip S. Dabre1 , Mangesh S. Shegokar2
As an emerging technology and business paradigm, cloud computing has taken business computing by storm. Cloud computing platforms offer straightforward access to a company's high-performance computing and storage infrastructure through web services. We consider the problem of building a secure cloud storage service on top of a public cloud infrastructure where the service provider is not fully trusted by the customer. We describe, at a high level, an architecture that combines recent and non-standard cryptographic primitives in order to achieve our goal. We survey the benefits such an architecture would offer to both customers and service providers and provide an overview of recent advances in cryptography motivated specifically by cloud storage.
When taking financial decisions in construction projects, decision makers are tempted to think in the short term: more importance is placed on up-front costs, with less attention to future costs. In order to improve long-term decision making, life cycle cost analysis is important; its purpose is to determine the cost of a project over any number of years. Project engineers often consider only the initial construction cost, with little or no consideration of the operation, maintenance and energy costs incurred throughout the life of the building. The life cycle cost analysis (LCCA) method considers the initial cost, operation cost, energy cost, maintenance cost, repair cost and residual value to estimate cost effectively. In this paper, various economic evaluation methods used to calculate the life cycle cost (LCC) of a building are discussed and compared.
Sentiment analysis, also called opinion mining, is the field of study which analyzes people's opinions, sentiments, evaluations, appraisals, attitudes and emotions towards entities such as products, services, organizations, individuals, issues, events and topics, and their attributes, here through Twitter. People use microblogging (Twitter) to talk about their daily activities and to seek or share information. Twitter is an online social networking and microblogging service that enables users to send and read "tweets", text messages limited to 140 characters. In this paper we propose a model that can spot public opinion together with its emotions.
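A minimal lexicon-based sketch of tweet-level sentiment scoring (the tiny hand-made lexicon is illustrative only; the proposed model is not reproduced here, and a real system would use a trained classifier or a resource such as SentiWordNet):

```python
# Tiny hand-made polarity lexicon: word -> signed weight.
LEXICON = {"love": 2, "great": 1, "happy": 1, "hate": -2, "bad": -1, "sad": -1}

def tweet_sentiment(tweet):
    """Sum lexicon weights over the tweet's words and map to a polarity label."""
    words = tweet.lower().replace("#", " ").split()
    score = sum(LEXICON.get(w, 0) for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

assert tweet_sentiment("I love this phone great battery") == "positive"
```

Emotion detection extends the same idea by scoring against per-emotion lexicons instead of a single polarity axis.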
Extracting Structured Data from Unstructured Data through HiveQL
K. Balakrishna1, C. Penchalaiah3, Smt. S. Jessica Saritha2
An RDBMS can store structured data only up to some gigabytes; processing larger data is very difficult and time-consuming. To overcome these problems, Hadoop is used. Apache Hadoop is a framework for big-data management and analysis. The Hadoop core provides storage of structured, unstructured and semi-structured data with the Hadoop Distributed File System (HDFS), and a simple MapReduce programming model to process and analyze, in parallel, the data stored in this distributed system. Apache Hive is a data warehouse built on top of Hadoop that allows you to query and manage large data sets in distributed storage using a SQL-like language called HiveQL; Hive translates queries into a series of MapReduce jobs. In the existing system, unstructured data stored in HDFS cannot be retrieved in a structured format through HiveQL. This project converts Twitter data into a structured format by using HiveQL with a SerDe; HDFS stores the Twitter data using a data-streaming process.
CPS-PWM Based Cascaded H-Bridge MLI with Voltage Balancing Capability
R.Priya, C.Kanimozhi
This paper presents a specially designed cascaded H-bridge multilevel inverter that increases the number of levels with a reduced number of switches and is best suited for medium-voltage drive applications. An efficient approach to the voltage-balancing issue is addressed by implementing third-harmonic injection. To enhance the balanced distribution of power in each cell, a multicarrier phase-shifted pulse-width-modulation switching scheme is employed. The midpoint capacitor voltage in the unit cells is balanced with the third-harmonic-injection method. This new inverter topology offers a lower THD than the conventional one. The validity of the proposed inverter is verified through a MATLAB Simulink model, and a scaled-down prototype is implemented in hardware.
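The benefit of third-harmonic injection can be checked numerically: adding a 1/6-amplitude third harmonic lowers the peak of the modulating wave from 1 to sqrt(3)/2 (about 0.866), which is what frees up DC-bus headroom:

```python
import math

def peak(waveform, samples=200000):
    """Numerical peak of a periodic waveform over one period."""
    return max(abs(waveform(2 * math.pi * i / samples)) for i in range(samples))

sine_peak = peak(math.sin)                                    # plain sinusoidal PWM
thi_peak = peak(lambda t: math.sin(t) + math.sin(3 * t) / 6)  # third-harmonic injection
```

Since the third harmonic cancels in the line-to-line voltages of a three-phase system, the roughly 15% peak reduction translates directly into extended linear modulation range.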
Performance evaluation of Energy-aware routing algorithm for wireless sensor networks.
Pawandeep Kaur
Wireless Sensor Networks (WSNs) consist of many sensor nodes with restricted battery power, which transmit sensed data to the base station at a high energy cost. Numerous routing protocols have been proposed to achieve energy efficiency in heterogeneous scenarios; however, not every protocol is appropriate for heterogeneous WSNs, and the efficiency of a WSN declines as the heterogeneity of its sensor nodes varies. This paper evaluates the performance of the energy-aware routing algorithm (ERA) under numerous scenarios. MATLAB is employed for the experiments. The comparison indicates that ERA achieves quite effective results compared with other protocols.
Color Edge Detection Based On The Fusion Of Hue Component And Principal Component Analysis
Ramandeep Kaur
This paper evaluates the performance of hue-based, PCA-based and fused hue-and-PCA-based colour edge detection techniques. Edge detection is a significant part of many critical vision applications: it produces a black-and-white (binary) image in which each object is delineated by lines (either black or white). Edges are the locations in the image where sharp changes exist. Most existing techniques have neglected the use of colour while detecting edges, even though in many applications a region can be categorized by its colour; this paper shows that the majority of existing techniques fail on images with complex backgrounds.
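The paper's central observation, that a luminance-only edge detector can miss a boundary a hue-based one sees, can be demonstrated on a synthetic row of pixels whose two halves share (almost) the same gray value but differ in hue:

```python
import colorsys

# Red vs green chosen so both map to gray ~0.299 under the BT.601 luma weights.
left, right = (1.0, 0.0, 0.0), (0.0, 0.509, 0.0)
row = [left] * 4 + [right] * 4

def gray(p):   # ITU-R BT.601 luma
    return 0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2]

def hue(p):
    return colorsys.rgb_to_hsv(*p)[0]

# Simple first-difference gradient along the row for each channel.
gray_edge = max(abs(gray(row[i + 1]) - gray(row[i])) for i in range(len(row) - 1))
hue_edge = max(abs(hue(row[i + 1]) - hue(row[i])) for i in range(len(row) - 1))
```

The gray gradient is nearly zero across the red/green boundary while the hue gradient is large, which motivates fusing hue with intensity-based (e.g., PCA) edge maps.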
This paper addresses how we can maximize the utilization of solar power for cooking. The cooker is installed at some height, and its lower (base) surface consists of transparent glass through which sun rays enter the cooker after being reflected by a mirror placed on the ground. The solar cooker also contains an electric heating element, so the cooker can be heated in cloudy weather and at night. A thermostat is provided to control the temperature of the cooker while it runs on electricity.
Automated Emergency System in Ambulance to Control Traffic Signals using IoT
Dr. A. Balamurugan1, G. Navin Siva Kumar2, S. Raj Thilak3, P. Selvakumar4
Traffic congestion has become a major problem in this technical era. There are various reasons for it; one is the rapid growth of the population, as a result of which the number of cars increases annually. The increase in the number of trucks and commercial vehicles also causes congestion, which makes it difficult for an ambulance to reach the hospital on time. As a result of the rapid growth of technology and engineering, human life has become automated: electronic devices communicate among themselves to serve human purposes. One of the major fields concentrating on such automation is the Internet of Things (IoT). This project is based on IoT and the cloud, to save human life in critical situations. It establishes communication between traffic signals and the ambulance so that a traffic signal can respond to the arrival of the ambulance: when the traffic signals change their states according to the position of the ambulance, they can clear a free way for it. Thus this project acts as a life saver.
A Review of Ad Hoc Routing Protocols
Showkat Ahmad Dar, Firdous Rashid , Dr. S.Palanivel
An ad hoc network is a self-configurable network in which the mobile nodes act as routers and are connected using wireless links. An ad hoc routing protocol is a standard that controls how nodes decide which route to use. The primary goal of an ad hoc routing protocol is the establishment of a correct and efficient route between a pair of nodes, so that messages may be delivered in time. This paper presents various types of ad hoc routing protocols along with their characteristics and functionalities.
Novel Approach for Localizing Jammers in wireless network
Sundar.R1, Mahesh.C2
Jammers can severely disrupt communications in wireless networks, and a jammer's position information allows the defender to actively eliminate jamming attacks. Thus, in this paper, we aim to design a framework that can localize one or multiple jammers with high accuracy. Most existing jammer-localization schemes utilize indirect measurements (e.g., hearing ranges) affected by jamming attacks, which makes it difficult to localize jammers accurately. Instead, we exploit a direct measurement: the strength of jamming signals (JSS). Estimating JSS is challenging as jamming signals may be embedded in other signals. As such, we devise an estimation scheme based on the ambient noise floor and validate it with real-world experiments. To further reduce estimation errors, we define an evaluation feedback metric to quantify the estimation errors and formulate jammer localization as a nonlinear optimization problem, whose global optimal solution is close to the jammers' true positions. We explore several heuristic search algorithms for approaching the global optimal solution, and our simulation results show that our error-minimizing-based framework achieves better performance than the existing schemes. In addition, our error-minimizing framework can utilize indirect measurements to obtain a better location estimation compared with prior work.
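The error-minimizing idea can be sketched as follows (a minimal illustration, assuming a log-distance path-loss model with made-up parameters, and plain grid search rather than the paper's heuristic search algorithms): predict the JSS each node would see for a candidate jammer position, and pick the position minimizing the squared prediction error.

```python
import numpy as np

def predicted_jss(jammer, node, p_tx=-30.0, eta=2.0):
    """Received jamming power in dBm under a log-distance path-loss model
    (p_tx and path-loss exponent eta are illustrative assumptions)."""
    d = max(np.hypot(jammer[0] - node[0], jammer[1] - node[1]), 1.0)
    return p_tx - 10.0 * eta * np.log10(d)

def localize(nodes, measured, grid=np.arange(0, 101, 1.0)):
    """Grid search for the position minimizing squared JSS prediction error."""
    best, best_err = None, np.inf
    for x in grid:
        for y in grid:
            err = sum((predicted_jss((x, y), n) - m) ** 2
                      for n, m in zip(nodes, measured))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Synthetic check: generate noiseless JSS readings from a known jammer
# position, then recover the position from the measurements alone.
true_pos = (40.0, 60.0)
nodes = [(0, 0), (100, 0), (0, 100), (100, 100), (50, 50)]
measured = [predicted_jss(true_pos, n) for n in nodes]
est = localize(nodes, measured)
```

With noisy measurements the minimum is no longer exact, which is where the paper's evaluation-feedback metric and heuristic global search come in.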
Enterprises collect huge volumes of data from different sources such as web logs, click streams, social media, sensor data and the like. This includes numeric and textual data and needs to be converted into a format (mostly numeric) that is acceptable to data mining and machine learning algorithms. Thus, we need an efficient, scalable approach that enables not only simple conversion, but also transformations like normalization. In this paper, we present the IGATE In-Place Transformation Engine (IPTE), which can be used for conversion and transformation operations. A comparative evaluation of commercially available ETL tools is given and the need for a custom tool (IPTE) is justified. Three distinct flavours of the IPTE implementation are described -- (a) stand-alone using Java multi-threading, (b) distributed using Hadoop APIs, and (c) in-memory using Apache Spark. By comparing the performance of the transformation process on different data sizes across these three flavours, specific recommendations are made on when each should be used. Some use cases where IPTE has been used effectively are also presented.
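The kind of in-place transformation IPTE performs can be illustrated with a tiny sketch (illustrative only; IPTE itself is a Java/Hadoop/Spark engine, and the field name below is hypothetical): min-max normalization applied directly to a row-oriented table without copying it.

```python
def normalize_inplace(rows, col):
    """Min-max normalise one numeric column of a row-oriented table, in place.
    A constant column is mapped to 0.0 rather than dividing by zero."""
    values = [r[col] for r in rows]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    for r in rows:
        r[col] = (r[col] - lo) / span

# Hypothetical click-stream records; the transformation rewrites them in place.
records = [{"clicks": 10}, {"clicks": 30}, {"clicks": 50}]
normalize_inplace(records, "clicks")
```

The in-place design matters at scale: avoiding an intermediate copy is precisely what distinguishes this style of engine from a conventional extract-copy-load ETL pass.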
Study On High Strength Self Compacting Concrete Beams With Steel & Recron Fiber
P.Karthi, S.Sankar
Self-Compacting Concrete becomes dense and compacted under its own self-weight. An experimental investigation has been carried out to determine characteristics such as the workability and strength of Self-Compacting Concrete (SCC); the tests involved various fiber proportions for a particular SCC mix. The test methods used to study the properties of fresh concrete were the slump test, U-tube, V-funnel and L-box. The compressive, tensile and flexural strengths of SCC were also investigated. For the compressive strength studies, cubes of size 100 mm × 100 mm × 100 mm were used; the specimens were cured and tested at 7 and 28 days. The split tensile strength was studied for the same concrete mix using cylinders of size 100 mm × 200 mm, cured and tested at 7 and 28 days. The flexural strength was studied using beams of size 100 mm × 100 mm × 200 mm, again cured and tested at 7 and 28 days. The stress-strain relationship was studied using cylinders of size 150 mm × 300 mm, cured and tested at 7 and 28 days. The test results show that the workability characteristics lie within the limiting constraints of SCC. The variation of the different hardened-concrete parameters with respect to the various fiber contents was analysed.
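The strength values reported from such tests follow from the standard load-over-geometry formulas; a small sketch (the 450 kN and 200 kN loads below are illustrative, not results from the paper):

```python
import math

def cube_compressive_strength(load_kn, side_mm=100.0):
    """Compressive strength in MPa: failure load / loaded face area (N/mm^2)."""
    return load_kn * 1e3 / (side_mm * side_mm)

def split_tensile_strength(load_kn, dia_mm=100.0, len_mm=200.0):
    """Split tensile strength in MPa for a cylinder: 2P / (pi * d * l)."""
    return 2 * load_kn * 1e3 / (math.pi * dia_mm * len_mm)

fc = cube_compressive_strength(450.0)   # 450 kN on a 100 mm cube -> 45 MPa
ft = split_tensile_strength(200.0)      # 200 kN on a 100 x 200 mm cylinder
```

The default dimensions match the specimen sizes described in the abstract (100 mm cubes and 100 mm × 200 mm cylinders).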
Experimental Investigations on Flexural Strength and Durability Properties of Mortars Containing Cement Replacement Materials
Dr M. Vijaya Sekhar Reddy1, M. Seshalalitha2, N.Krishna Murthy3
This paper discusses the effects of using different pozzolanic materials as partial cement replacements in mortar mixes. An experimental study was carried out on mortar made with 43 grade Ordinary Portland Cement (OPC), with 12% of the OPC partially replaced by different cement replacement materials (CRMs) such as fly ash, rice husk ash, silica fume, calcined clay (grog) and slag (GGBS). The mechanical properties of the mortar mixes with these CRMs were tested to determine the effect of incorporating these mineral admixtures on mortar properties, and compared with those of the control mix. Mortar specimens were cured in water for 28 days, during which flexural strength and compressive strength were tested at ages 3, 7 and 28 days. The remaining specimens were then immersed in fresh water and in solutions of 10% sodium sulfate (Na2SO4) and 10% magnesium sulfate (MgSO4) for a period of 3 months. These specimens were also tested for variation in compressive strength at 28, 60 and 90 days to investigate the durability properties.
Face Recognition using Eigenfaces and Distance Classifiers
A V Krishna Rao Padyala, K Sai Prasanth D Neeraja
Facial recognition is a computer application composed of complex algorithms that use mathematical and matrix techniques. These acquire the image in raster (digital) format and then process and compare it pixel by pixel using different methods to obtain faster and more reliable results. These results obviously depend on the machine used for processing, owing to the huge computational power that the algorithms, functions and routines require. The main goal of this paper is to show and explain the easiest way to implement a real-time face detector and recognizer for multiple persons using Principal Component Analysis (PCA) with eigenfaces and distance measures.
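The eigenface pipeline the paper describes can be sketched compactly in NumPy (a minimal illustration on synthetic 8×8 "faces", assuming a Euclidean nearest-neighbour classifier; real systems train on actual face images): centre the training images, take the top principal axes via SVD, project every face into that subspace, and classify a probe by its nearest gallery projection.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_eigenfaces(faces, k):
    """faces: (n_samples, n_pixels). Returns the mean face and top-k eigenfaces."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # Rows of Vt from the SVD of the centered data are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def project(face, mean, eigenfaces):
    return eigenfaces @ (face - mean)

def recognize(face, mean, eigenfaces, gallery_weights, labels):
    """Nearest neighbour in eigenface space (Euclidean distance classifier)."""
    w = project(face, mean, eigenfaces)
    dists = np.linalg.norm(gallery_weights - w, axis=1)
    return labels[int(np.argmin(dists))]

# Toy gallery: two synthetic 'identities', three noisy samples each.
base_a, base_b = rng.random(64), rng.random(64)
faces = np.array([base_a + 0.05 * rng.standard_normal(64) for _ in range(3)]
                 + [base_b + 0.05 * rng.standard_normal(64) for _ in range(3)])
labels = ["A", "A", "A", "B", "B", "B"]
mean, efaces = train_eigenfaces(faces, k=2)
weights = np.array([project(f, mean, efaces) for f in faces])
probe = base_a + 0.05 * rng.standard_normal(64)
who = recognize(probe, mean, efaces, weights, labels)
```

Projecting 64-pixel images down to two eigenface coefficients is what makes the per-probe comparison cheap enough for real-time, multi-person recognition.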
Secured Key-Aggregate Cryptosystem for Scalable Data Sharing in Cloud Storage
N.R RejinPaul1 ,Alagu Lakshmi M.S2 , Sai Rasmitha D3
Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise, owing to its long list of unprecedented advantages in IT history. Data sharing is an important functionality in cloud storage. In this paper, we show how to securely, efficiently and flexibly share data with others in cloud storage. We describe public-key cryptosystems that produce constant-size ciphertexts such that efficient delegation of decryption rights for any set of ciphertexts is possible: the secret-key holder can release a constant-size aggregate key for flexible choices of ciphertext sets in cloud storage, while encrypted files outside the set remain confidential. This compact aggregate key can be conveniently sent to others with very limited secure storage.
Achieving Multi Layered security, Flexibility, Scalability, Access Control in Cloud Computing Using Hierarchical Attribute Set Based Encryption (HASBE)
Priyanka Madhirjau
At present cloud computing is becoming a very prominent technology in IT enterprises. For a company, the volume of stored data is huge and very precious, and all functions are performed through networks, so secure use of the data becomes very important. Cloud computing, an emerging computing paradigm, requires additional security, which is provided here using HASBE; this can emerge as a new security feature for various organizational platforms. We propose an attribute-based solution to improve the performance of the cloud. It is implemented using ciphertext-policy attribute-based encryption (CP-ABE), encrypting and decrypting the data in the cloud and making the data password protected, so that the cloud system achieves scalability, flexibility and multi-layered security by enabling data owners to share their data with data consumers under the control of the domain authority.
Analysis of Characteristics of 2G/3G Cellular Networks in rural and urban areas of Delhi
Jai Prakash1, Vikas Kumar2, Harish Chandra Maurya3
In recent years India has experienced significant growth in cellular penetration; according to a recent report by the Telecom Regulatory Authority of India, it is the second largest and fastest growing mobile market in the world. This paper characterizes the cellular data networks available in India. Throughput, latency, availability and other network characteristics of five cellular service providers (Idea, BSNL, Reliance, MTNL and Airtel) across six locations are evaluated, and a robust, scalable and extensible suite for conducting active client-side measurements in rural regions is designed. A variety of tests was run over 90 days to obtain key insights about cellular data; an understanding of such network parameters and their impact on performance will help explore optimization strategies at end-hosts as well as within the service provider network.
E-Field Uniformity Test Volume in GTEM Cell Based on LabVIEW
Dominic S. Nyitamen
The standard IEC 61000-4-20 is used to carry out field-uniformity measurements in a GTEM 5407 cell, with LabVIEW as the controlling software. For reliable and reproducible measurements in the GTEM, determining the field distribution to ensure field uniformity is a requirement for performing emissions and immunity tests. In this work, the determination of the electric field distribution, a lengthy process, is carried out using LabVIEW for the automatic capture and storage of all the needed parameters at each measurement point across the frequency range of interest. The selected working volume showed the primary field within the 0–6 dB window required by the standard, as well as a −6 dB cross-coupling separation. The GTEM working volume was thus prepared for total radiated power tests of a GSM900 phone.
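The pass/fail logic behind the 0–6 dB criterion can be sketched as follows (a simplified illustration: the standard's full procedure involves a defined grid of points and further conditions, and the sample values below are made up):

```python
def uniformity_ok(field_db, window_db=6.0, fraction=0.75):
    """Simplified IEC 61000-4-20-style check: at least `fraction` of the
    measured points must lie within a 0-to-window_db dB band above the
    minimum measured point."""
    floor = min(field_db)
    inside = sum(1 for v in field_db if v - floor <= window_db)
    return inside / len(field_db) >= fraction

# Illustrative E-field magnitudes (dBV/m) captured at grid points.
points = [10.1, 11.3, 12.0, 13.4, 14.9, 15.8, 12.7, 18.0]
ok = uniformity_ok(points)
```

Automating exactly this kind of per-point capture and comparison across the frequency sweep is what the LabVIEW controller in the paper does.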
Enhancement of the IEEE 802.15.4 Cluster-Tree Network with Energy Efficient Cluster Scheduling
Ahmed Ben Saleh, Martin J N Sibley, Peter Mather
In the ZigBee cluster-tree network, existing works in the literature do not provide a solution for power-efficient scheduling, and techniques for preventing network collisions are not explained. To overcome these issues, we propose in this paper Energy Efficient Cluster Scheduling for the IEEE 802.15.4 Cluster-Tree Network. In this technique, the distributed Pull-Push-Relabel (PPR) algorithm is first adapted to a ZigBee cluster-tree network. A time-division cluster scheduling technique is then applied that offers energy efficiency in the cluster-tree network by maximizing the cluster scheduling period relative to the beacon interval. Besides reducing resource requirements, it fulfills temporal requirements such as the end-to-end deadlines of all flows. Through simulation results, we show that the proposed technique reduces both the energy consumption and network collisions.
Software Application for Design of Structural Element Using Visual Basic Coding
J.Manoj Babu
The increasing reliance of engineers on computer software requires that future professional engineers be knowledgeable of sound engineering concepts, updated on the latest computer technology used in the industry, and aware of the limitations and capabilities of the computer in solving engineering problems. In "Computer Methods in Civil Engineering", a structural design program for the design of structural elements was developed using Visual Basic. Creating my own software application demonstrates creativity and integrates concepts, methods and skills from mathematics, basic engineering and specialized civil engineering subjects. This paper presents the learning objectives, requirements, methodology and outputs of my work on "Computer Methods in Civil Engineering".
Airline delays caused by bad weather, traffic control problems and mechanical repairs are difficult to predict. If a flight is canceled, most airlines will rebook the traveler on the earliest possible flight to the destination at no additional charge; unfortunately for airline travelers, many of these flights do not leave on time. The issue of delay is paramount for any airline, so we intend to aid airlines by predicting delays from patterns in historical data. This system explores which factors influence the occurrence of flight delays, along with their intensity. Our method is based on data archived at major airports in current flight information systems. Classification in this scenario is hindered by the large number of attributes, which might occlude the dominant patterns of flight delays. The results of the data analysis suggest that delayed flights follow certain patterns that distinguish them from on-time flights. Our system also provides current weather details along with the weather delay probability. We have achieved much better accuracy in predicting delays, and we find that fairly good predictions can be made on the basis of a few attributes.
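The classification step can be sketched with a tiny nearest-neighbour classifier (a minimal illustration; the feature set, labels and data below are hypothetical, not the system's actual attributes or archives):

```python
import numpy as np

def knn_predict(train_x, train_y, x, k=3):
    """Majority vote among the k nearest training flights (Euclidean distance)."""
    d = np.linalg.norm(train_x - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return int(round(nearest.mean()))

# Hypothetical features: [departure hour, weather severity 0-3, traffic load 0-3]
# Label: 1 = delayed, 0 = on time. Data is illustrative only.
X = np.array([[7, 0, 1], [8, 0, 0], [9, 1, 1],
              [17, 3, 3], [18, 2, 3], [19, 3, 2]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
pred = knn_predict(X, y, np.array([18.0, 3.0, 3.0]))   # evening, bad weather
```

Even this toy model reflects the abstract's finding: a few attributes (hour, weather, traffic) already separate delayed from on-time flights in the toy data.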
Literature Survey on Maharashtra State Transport Buses and RFID
Prof A.M.Magar, Akshay A. Kulkarni, Dayanand K. Jadhav
In this paper, we present a literature survey on Maharashtra State Transport buses and RFID. Most of Maharashtra State is covered by Maharashtra State Transport buses, which serve more than 70 lakh passengers daily through the very wide network of the M.S.R.T.C. across Maharashtra. On the other hand, the M.S.R.T.C. faces many problems such as a low load factor, a high bus-to-staff ratio and many others. Radio Frequency Identification (RFID) is one of the emerging technologies that can be used for vehicle tracking and other purposes. This paper presents some basic information about RFID and the problems faced by passengers travelling on State Transport buses.
Nowadays there is a huge increase in the usage of Online Social Networks (OSNs). However, these OSNs do not protect user walls against unwanted messages: users can easily post any undesired content on a user's wall. This is a very important issue giving rise to misuse of OSNs, which until now have provided little control over who can post on a user's private wall. Here we propose a system that prevents unwanted messages from being posted on a user's wall, by scanning messages for undesired content before they are posted. We also propose a blacklisting mechanism that blocks users who repeatedly try to post such messages, preventing them from posting any further messages on the user's wall.
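The scan-then-blacklist flow can be sketched in a few lines (a minimal illustration; the blocked-term list, the three-strike threshold, and the in-memory state are all assumptions, and a real system would use content classification rather than word matching):

```python
BLOCKED_TERMS = {"spam", "scam", "offensive"}   # illustrative word list
BLOCK_AFTER = 3                                 # strikes before blacklisting

violations = {}
blacklist = set()

def try_post(user, message):
    """Reject messages containing undesired terms; blacklist repeat offenders."""
    if user in blacklist:
        return "blocked: user is blacklisted"
    words = set(message.lower().split())
    if words & BLOCKED_TERMS:
        violations[user] = violations.get(user, 0) + 1
        if violations[user] >= BLOCK_AFTER:
            blacklist.add(user)
        return "rejected: undesired content"
    return "posted"
```

The key design point is that filtering happens before the message reaches the wall, and the blacklist check happens before the content is even inspected.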
A hypervisor is a piece of computer software, firmware or hardware that creates and runs multiple virtual machines. The hypervisor presents the guest operating systems with a virtual operating platform and manages their execution. The virtual switch, a part of the hypervisor (also called the hypervisor switch), performs the switching of VM traffic. Currently, if two VMs wish to communicate and are in the same subnet, the VM-to-VM traffic traverses from the source VM to the destination VM through the hypervisor switch; if the two VMs are in different subnets, the packets must traverse external devices such as an external switch and router, which increases the traffic in the network. We can improve the quality of the system by implementing the router functionality within the hypervisor.
Data mining has attracted a great deal of attention in recent years, due to the wide availability of huge amounts of data and the imminent need to turn such data into useful information and knowledge, which can be used for applications ranging from market analysis, fraud detection and customer retention to production control and science exploration. The real privacy concerns lie with unconstrained access to individual records, as in credit card, banking and customer-ID applications, which involve privacy-sensitive information. Because of the risk of privacy infringement while performing data mining operations, it is often not possible to utilize large databases for scientific or financial research. To address this problem, several privacy-preserving data mining techniques are used. The aim of privacy-preserving data mining (PPDM) is to extract relevant knowledge from large amounts of data while at the same time protecting sensitive information.
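One classic PPDM technique is randomization: perturb each record with noise so that individual values are masked while aggregate statistics remain usable. A minimal sketch (the salary data and noise scale are synthetic assumptions, and real deployments use calibrated mechanisms such as differential privacy):

```python
import numpy as np

rng = np.random.default_rng(42)

def perturb(values, sigma):
    """Additive Gaussian noise: individual records are masked, but aggregate
    statistics such as the mean remain approximately recoverable."""
    return values + rng.normal(0.0, sigma, size=values.shape)

salaries = rng.uniform(30_000, 90_000, size=5_000)   # synthetic sensitive data
released = perturb(salaries, sigma=10_000)

# Each record moves by ~10,000 on average, yet the noise averages out
# across 5,000 records, so the released mean stays close to the true mean.
mean_error = abs(released.mean() - salaries.mean())
```

This is the core trade-off the abstract describes: per-record privacy against aggregate utility, controlled here by `sigma`.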
Design And Implementation Of A Robot Arm Control System
Prof. Hycinth C. Inyiama1, Prof Christiana .C. Okezie2, Maxwell C. Okeke3, Tochukwu C. Akubue
The design and implementation of a Robot Arm control system is the subject of this thesis. To achieve the prototype, an appropriate payload was chosen and its torque requirements were carefully calculated; this formed the basis for selecting motors of appropriate torque. A permanent-magnet stepper motor was chosen for the prototype because it delivers a known incremental step for each pattern sent to the motor. The prototype uses four stepper motors, one each at the ankle, base, wrist and shoulder, giving the Robot Arm four axes of movement. Although a stepper motor can be used to carry out different process control tasks, the particular task assigned to the prototype is a pick-and-drop process. The actions required of the Robot Arm during a pick-and-drop process were subdivided into twelve subtasks. Each subtask led to a flowchart that guided the development of the corresponding microcontroller software. A main program was designed and developed that calls each of the twelve subroutines in sequence, at appropriate times, in order to achieve the pick-and-drop process. The 8051 microcontroller was selected because of its ready availability as well as its features that match the pick-and-drop operation. A fully expanded state transition table and a content-addressable memory were used to achieve smooth transitions of the robot's stepper motors. User friendliness was achieved by providing a liquid crystal display and a keypad. The system can also be reprogrammed to perform other tasks such as painting, welding and pounding, among others, provided the end effectors are modified to match the operation envisaged. Prototype testing and evaluation show that it met the criteria listed in the design objectives.
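The "known incremental step per pattern" property of a stepper motor comes from cycling through a fixed coil-energisation sequence. A small sketch of generating such patterns (illustrative only; the thesis drives the coils from 8051 assembly via a state transition table, and the 1.8° step angle below is an assumption):

```python
# Full-step drive sequence for a unipolar permanent-magnet stepper motor:
# each tuple is the energisation pattern for the four coils (A, B, A', B').
FULL_STEP = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]

def step_patterns(steps, reverse=False):
    """Return the list of coil patterns needed to move `steps` increments."""
    seq = FULL_STEP[::-1] if reverse else FULL_STEP
    return [seq[i % 4] for i in range(steps)]

# With an assumed 1.8-degree motor, 50 steps = 90 degrees of wrist rotation.
patterns = step_patterns(50)
```

Because each pattern advances the rotor by exactly one step, position control needs no encoder: the controller simply counts how many patterns it has issued.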
It has since been realized that sources of fossil fuel are finite and that the use of fossil fuel is degrading the environment. This has prompted the search for alternative energy sources, especially those that are either carbon neutral or have a significantly reduced carbon footprint. It has to be accepted that it is the aggregate of all these alternatives that will replace part of the fossil fuels, which explains why studies are ongoing in various alternative-energy areas. Most transesterification work found in the literature appears to deal with laboratory-scale studies. For a difference, therefore, the purpose of this study is to design a scalable biodiesel reactor for the transesterification of vegetable oil; most data available in this area are proprietary and not available in the open literature. Stainless steel has been selected for the design so as to cater for both base- and acid-catalysed processes, and the reactor has been designed accordingly. Methods for determining the extra catalyst required to neutralize the free fatty acid in restaurant waste oil, and for testing for completion of the reaction, have been outlined. Precautions in handling the chemicals used in the transesterification process are recommended.
The development of demand-side load management is an outcome of the smart grid initiative. Due to the significant amount of load in the residential sector, home energy management has received increasing interest. In a country like India, we are lagging behind in the power sector, as the demand is much greater than the supply; moreover, not a single initiative has been taken for the deployment of smart suppliers and smart users. Here, I propose a hardware design of a smart home energy management system (SHEMS). With this proposed design, it is possible to have a real-time, price-responsive control strategy for domestic loads such as the electric water heater (EWH), illumination (lights), air conditioning (fans), dryer, etc. Consumers may interact with suppliers or load-serving entities (LSEs) to facilitate load management on the supplier side. The system is designed with sensors to detect human activities, and behaviour is predicted by applying a machine learning algorithm in order to help consumers reduce their total payment for electricity. Finally, to verify the hardware system, simulation and experimental results will be checked against an actual SHEMS prototype.
Transformation Of Analysis Model To Business Process Model
Kulshan Pattanaik
Software projects have become larger over time and therefore software requirements specifications have grown as well. One way to document requirements is the use of Use Cases, which describe the interaction of an actor with the desired system. Commonly used UML diagrams are Use Case, Activity, Sequence, Collaboration, Statechart and Class diagrams. Use Case and Activity diagrams model the behavioural aspect of the system, whereas Class diagrams represent the static design of a system. UML, being visual in nature, is easy to understand and communicate. However, as the number of Use Cases increases with the described functionality, the overview, the dependencies and possibly the execution order of Use Cases are lost more and more; the task of locating specific Use Cases in large documents and avoiding contradictions becomes a hassle. The design and development of software have become much more complex in the last decade, resulting in the evolution of design and development paradigms. Object-oriented systems have thus become an integral part of the more complex Service Oriented Architecture (SOA), which addresses such complex issues. Automatic translation of UML Use Case and Activity models to BPMN design elements would ensure the consistent evolution of object-oriented systems to the service-oriented paradigm.
This paper deals with the design and development of an automated testing tool for object-oriented software. By an automated testing tool, we mean a tool that automates a part of the testing process; it can include one or more of the following: test strategy generation, test case generation, test case execution, test data generation, and the reporting and logging of results. By object-oriented software we mean software designed using the OO approach and implemented using an OO language. Testing OO software is different from testing software created using procedural languages, and poses several new challenges. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, these have been shown to be inappropriate, and new techniques have been developed. This thesis work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications rather than program code is that specifications are generally correct whereas code may be flawed. Moreover, with software engineering principles firmly established in the industry, most software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are used; UML has become the de facto standard for the analysis and design of OO software. Testing is conducted at three levels: unit, integration and system. At the system level there is no difference between the testing techniques used for OO software and for software created with a procedural language, so conventional techniques can be used. This tool therefore provides features for testing at the unit (class) level as well as the integration level, and a maintenance-level component has also been incorporated. Results of applying the tool to sample Rational Rose files have been included, and have been found to be satisfactory.
Organizational Mail Tracking With Security Using 3D Password And Database Encryption
Megha Sunil, Navami M Chandrabose, Neethu Pious
This paper proposes the design of an organizational mail server that can track unauthorized mails sent between employees of different grades in an organization, in order to reduce security issues. It directly intrudes into each employee's mail in the organization, tracking and monitoring the mail in a covert manner. The Mail Compose web page carries an additional function that prevents employees from sending invalid mail and sends the administrator a report about each such attempt; the administrator stores these reports in a separate database in an encrypted format. For secure operations, a 3D Password is used as the protection tool for the administrator login.
The motivation for every location-based information system is: "to assist with the exact information, at the right place, in real time, with a personalized setup and location sensitiveness". In this era we are dealing with palmtops and iPhones, which are going to replace bulky desktops even for computational purposes. There are a vast number of applications and usages where a person sitting in a roadside café needs to get relevant data and information. A very appealing application is surveillance, where instant information is needed to decide whether the people being monitored are a real threat or an erroneous target. A number of applications have already been created that provide the user with information regarding a place he or she wants to visit, but these applications are limited to desktops; they need to be brought to mobile devices, so that a person visiting a place need not carry travel guides along.
Impact and General Awareness of Cloud Computing In Public Domain
Mr. Rajesh Saini*; Mr. Siddarth Kaul**; Mr. Satish Kumar
In this paper, we characterize the problems of cloud adoption and their impact. In addition, and equally importantly, we describe our experience and lessons learnt in the construction of a cloud computing platform. Cloud computing is a new general-purpose Internet-based technology through which information is stored in servers and provided as a service, on demand, to clients. We describe the reasons why people switch from traditional IT to the cloud, and discuss the characteristics and types of cloud computing. The paper defines clouds, explains the business benefits of cloud computing, and outlines cloud architecture and its major components. In particular, we argue that with continued research advances in trusted computing and computation-supporting encryption, life in the cloud can be advantageous from a business intelligence standpoint over the isolated alternative that is more common today.
ASP.Net Technology Based Training And Placement Management System
Shaikh Mohammed Salman1,Kanchwala Taha Juzer2,Ansari Mohammed Abdul Aziz3, Prof. Shrinidhi G
A TPO (Training and Placement Office) is required in every college for the placement of students as well as ex-students. In the existing system, all processes are handled manually: students create and submit their CVs early in the year, the TPO has to check every student's marks and eligibility, and anyone who wants to talk to the TPO has to visit the TPO cabin to obtain the relevant information. The proposed Online Training and Placement application is meant to make it easier for users to add and retrieve information quickly. There are two types of students, current students and alumni: current students can review and enter information around the clock and from any location, while for alumni the last three years' data is maintained. Students on placement also use the system to read important announcements, obtain information on assessment, and see the assessment results recorded in the system. There are three types of users: Student, Administrator and Company. The Administrator is the master user and has the highest priority among the users. Companies register once, providing their contact information, papers and vacancies, and can then view all students who have applied for vacancies, together with information on their availability, application time, cover letters, attached CVs, etc. The system is first tested properly while the users are trained in the new procedure; proper implementation, covering all the activities needed to convert from the old system to the new, is essential to provide a reliable system that meets the organization's requirements.
Today technology is improving daily in different respects in order to provide flexible and safe movement for people. Currently the most widespread aid used by visually impaired people is the white stick; however, it has limitations. With the latest technology, it is possible to extend the support given to people with visual impairment during their mobility. This paper proposes an economical ultrasonic stick for visually challenged people, so that they gain personal independence and freedom from external help. A portable, user-friendly device is developed that can identify obstacles in the path using ultrasonic sensors and a camera. The ultrasonic sensors can scan three different directions (across 180°); the camera can be used as an alternative tool in places with low signal coverage. The device also comprises a microcontroller, a buzzer and a vibrating motor; the buzzer and vibration motor are activated when any obstacle is detected. A GPS system provides information on the user's current location, and an SMS facility lets the blind user send an SMS message to the numbers saved in the microcontroller in case of emergency.
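The obstacle-detection step reduces to timing an ultrasonic echo: the pulse travels out and back, so the one-way distance is the speed of sound times half the round-trip time. A minimal sketch (the 100 cm alert threshold is an illustrative assumption, not a value from the paper):

```python
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def echo_distance_cm(echo_time_s):
    """Distance to an obstacle in cm from an ultrasonic echo round-trip time:
    sound travels out and back, so one-way distance = speed * time / 2."""
    return SPEED_OF_SOUND * echo_time_s / 2 * 100

def obstacle_alert(echo_time_s, threshold_cm=100.0):
    """True when the buzzer/vibration motor should be driven."""
    return echo_distance_cm(echo_time_s) < threshold_cm

d = echo_distance_cm(0.01)   # a 10 ms echo corresponds to about 171.5 cm
```

On the actual device the microcontroller would time the echo pin of each sensor and drive the buzzer and vibration motor when this threshold test fires.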
Designing highly efficient websites that facilitate effective webpage access has long been a challenge. A primary reason is that web developers' understanding of how a website should be structured can differ considerably from that of the users. While various methods have been proposed to relink webpages to improve accessibility using user access data, a completely reorganized new structure can be highly unpredictable, and the cost of disorienting users after the changes remains unanalysed. This paper addresses how to improve a website without introducing substantial changes. Specifically, we propose a mathematical programming model that improves user access on a website while minimizing alterations to its current structure. Our model not only significantly improves user access with very few changes, but can also be solved effectively. In addition, we define two evaluation metrics and use them to assess the performance of the improved website on a real data set. Evaluation results confirm that user access on the improved structure is indeed greatly enhanced. More interestingly, we find that heavily disoriented users are more likely to benefit from the improved structure than less disoriented users.
Improving Privacy in Multi-Keyword Top-K Retrieval Search Over Encrypted Cloud Data
Aashi Qul Huq A1, Bhaggiaraj S
Cloud computing is an emerging technology in which the storage and retrieval of sensitive data are increasingly common. Data access should be managed in an effectively access-controlled manner that provides better security. Keyword-based search and retrieval over encrypted cloud data is a complex and tedious process, and it needs more attention to make the environment user-friendly while searching over encrypted data. In this paper, for the first time, we define and solve the challenging problem of privacy-preserving multi-keyword ranked search over encrypted data in cloud computing (MRSE). We establish a set of strict privacy requirements for such a secure cloud data utilization system. Among various multi-keyword semantics, we choose the efficient similarity measure of "coordinate matching", i.e., as many matches as possible, to capture the relevance of data documents to the search query. The proposed methodology shows that keyword search retrieval can be performed effectively over encrypted cloud data while satisfying the privacy requirements, and the experimental results show that it outperforms existing methodologies.
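The "coordinate matching" similarity named above scores a document by the number of query keywords its index contains. The real MRSE scheme evaluates this as an inner product over encrypted binary index vectors; the plaintext sketch below only illustrates the scoring and ranking, with hypothetical documents and keywords.

```python
# Plaintext sketch of coordinate-matching ranking (the actual MRSE scheme
# performs the equivalent computation over encrypted index vectors).

def coordinate_match(doc_keywords: set, query: set) -> int:
    """Score = number of query keywords present in the document's index."""
    return len(doc_keywords & query)

def top_k(index: dict, query: set, k: int) -> list:
    """Rank documents by match count and return the top-k identifiers."""
    ranked = sorted(index, key=lambda d: coordinate_match(index[d], query),
                    reverse=True)
    return ranked[:k]

# Hypothetical keyword index.
index = {
    "doc1": {"cloud", "privacy", "search"},
    "doc2": {"cloud", "storage"},
    "doc3": {"privacy", "search", "ranking", "cloud"},
}
print(top_k(index, {"cloud", "privacy", "search"}, 2))  # ['doc1', 'doc3']
```

Note that `sorted` is stable, so documents with equal scores keep their original order; a real system would add a tie-breaking rule.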
Development of WiFi-Bluetooth Communication Protocol
Prof. Manjitsing Valvi, Jay P. Chauhan, Dinsha S. Dinani
Because Wi-Fi and Bluetooth are different standards, translating packets sent from Bluetooth to Wi-Fi and vice versa is difficult, and the connections must not be broken or interrupted. Finding the right way to send and receive data is the need, and the efficiency and reliability of the data are also necessary. Each protocol has its own associated hardware and software specification, so communication between two devices using two different protocols is difficult. The proposed system allows us to develop a network through which two powerful protocols can communicate with each other, even though the standards of the protocols are different.
Comparative Analysis Between Prophet, Spray & Wait And Spray & Focus Protocols Of Wireless Opportunistic Networks
Sonam Kashyap, Jasvir Singh
In the past, wireless LANs and cellular networks were used for communication. In wireless networks with mobile nodes, MANETs (mobile ad-hoc networks) can be used for communication, but they work only when the distance between the nodes is small; if the distance increases, communication is no longer possible. To remove this limitation, the idea of opportunistic networks was developed. Such a network makes it possible to communicate between nodes irrespective of their distance and node type. An opportunistic network tries to remove the requirement of physical end-to-end connectivity while providing connectivity opportunities to pervasive devices when they are not directly connected to the Internet; pervasive devices can opportunistically exploit their mobility and contacts for data delivery. In this paper we compare three basic protocols used in this kind of network based on some common parameters.
This research has explored the use of a Mobile Virtual Laboratory (MVL) for laboratory practicals in institutions of higher learning in Nigeria by designing, implementing and evaluating a cross-platform MVL application that works on different mobile operating systems, in contrast to previous research works that deployed virtual laboratories for specific mobile operating systems only. A cross-platform Mobile Virtual Laboratory architectural framework was developed with the client side in Java and the server side in PHP. It had four layers, namely application server, database server, client (mobile device) and admin console, and the Rapid Application Development (RAD) model was used. Thereafter, science laboratory practicals were uploaded and one hundred and twenty students were purposively selected to evaluate the implementation. A user evaluation of the developed mobile virtual laboratory was conducted to assess its usability, conceptual learning and re-usability. Using Friedman's test, a significant relationship was found between mobile virtual laboratories and conceptual learning, and also with re-usability, whereas no significant relationship was observed with usability. Analysis of this study indicates that while there is strong evidence of increasing mobile device usage in Nigeria, more research needs to be carried out on how these devices can be used to positively impact educational outcomes. This research demonstrated the advantage of the developed framework over other existing techniques in being a cross-platform solution. Though targeted at tertiary institutions, it could be integrated into the teaching of science laboratory practicals at all levels.
Documents on the web exist in digital format, and people spend a large amount of time searching with web browsers to find useful information. The results returned by search engines are web pages containing results obtained from different web databases. These results can be used in many applications, such as data collection and price comparison, but for these applications to succeed the search results must be machine processable. To make them machine processable, it is important that the result pages are annotated in a meaningful manner. The annotation process has to consider groups of data and obtain a final annotation after aggregating them. Annotation can be applied to web pages, Java, PDF files, text files, XPS, mobile content, images, multimedia, etc. In information retrieval for decision support, an integrated annotation database can be based on parameters such as document, user and time. Automatic extraction of data from query result pages is very important for applications such as data integration and meta-querying that cooperate with multiple web databases.
Image Retrieval using Markovian Semantic Indexing (MSI)
J.Premkumar, Mr.P.Prasenna
This research paper proposes a unique methodology for automatic annotation, categorization and annotation-based retrieval of images. The new methodology, which we call Markovian Semantic Indexing (MSI), is presented in the context of an online image retrieval system. Assuming such a system, the users' queries are used to construct an Aggregate Markov Chain (AMC) through which the relevance between the keywords seen by the system is defined. The users' queries are also used to automatically annotate the images. A stochastic distance between images, based on their annotations and the keyword relevance captured in the AMC, is then introduced. Geometric interpretations of the proposed distance are provided, and its relation to a clustering in the keyword space is investigated. By means of a new measure of Markovian state similarity, the mean first cross passage time (CPT), optimality properties of the proposed distance are proved. Images are modeled as points in a vector space and their similarity is measured with MSI. The new method is shown to possess certain theoretical advantages and also to achieve better Precision-versus-Recall results when compared to Latent Semantic Indexing (LSI) and probabilistic Latent Semantic Indexing (pLSI) methods in Annotation-Based Image Retrieval (ABIR) tasks.
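The first step above, building a keyword Markov chain from user queries, can be sketched in a simplified form: treat the co-occurrence of two keywords in one query as a transition, and row-normalize the counts into probabilities. This is only a rough reading of the AMC construction, not the paper's exact formulation, and the queries below are hypothetical.

```python
# Simplified sketch of building a keyword transition matrix from user
# queries, in the spirit of the Aggregate Markov Chain (AMC) used by MSI.
from collections import defaultdict

def build_chain(queries):
    """Count keyword co-occurrences within each query as transitions,
    then row-normalize into a stochastic matrix."""
    counts = defaultdict(lambda: defaultdict(float))
    for q in queries:
        for a in q:
            for b in q:
                if a != b:
                    counts[a][b] += 1.0
    chain = {}
    for a, row in counts.items():
        total = sum(row.values())
        chain[a] = {b: c / total for b, c in row.items()}
    return chain

# Hypothetical query log: each query is a list of keywords.
chain = build_chain([["sunset", "beach"], ["sunset", "sky"], ["beach", "sea"]])
print(chain["sunset"])  # {'beach': 0.5, 'sky': 0.5}
```

The resulting chain captures keyword relevance: from "sunset" the chain moves to "beach" or "sky" with equal probability, because each co-occurred once in the query log.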
A New Approach for Distributed Data Mining based on SOA
K.R.Shankar1, K.Manikandan2
Distributed data mining is data mining in which computation and data are dispersed over multiple independent sites. Service-oriented architecture (SOA) provides integration of many services which coordinate and communicate with one another for their respective goals. Web-service-based SOAs are used for on-demand computing as well as for developing more interoperable intra- or inter-organizational systems. Generally, in a distributed environment, data needs to be securely protected for privacy purposes and services must be provided on time. To serve both purposes we use Distributed Data Mining (DDM) integrated with the web-service Business Process Execution Language (BPEL), with a local abstraction technique used for privacy preservation.
Encrypted Data Transmission for Secured SMS over the Web via E-Mail Platform
Mughele Ese Sophia
The rapid development of computers, and the way in which technology is used to support the personal needs of users as well as businesses worldwide, have been changed by the emergence of the internet. As the Internet becomes a more pervasive part of daily life, people fail to realize how quickly it has established itself as a powerful facilitator of the needs of the common man. However, an individual's private information can be disseminated without his or her knowledge. Many different types of email security threats exist: the security of an email can be compromised by identity theft, spoofing, imposters and modification of existing messages, and hackers may use any of these methods to gain access to a user's computer. Moreover, the computer architecture used for the exchange of email is not very reliable and hence does not provide much privacy for users. This undeniable fact is what prompted the researcher to work on secured and encrypted data transmission over the web, in order to protect information exchanged over the internet from being compromised.
A Comparative Analysis of Detecting the Moving Object with Principal Component Pursuit for Surveillance Application
Jonsta Grashia.J, M.Kamala Malar
Object detection is a fundamental step for automated video analysis in many vision applications. Object detection in a video is usually performed by object detectors or background subtraction techniques, and to automate the analysis it becomes an essential task. However, existing motion-based methods are usually limited when handling complex scenarios such as nonrigid motion and dynamic backgrounds. This paper shows that the above challenges can be addressed in a unified framework named DEtecting Contiguous Outliers in the LOw-rank Representation (DECOLOR). This formulation integrates object detection and background learning into a single optimization process, which can be solved efficiently by an alternating algorithm. It detects the moving object against a dynamic background by treating it as outliers in the low-rank representation, and deals very effectively with complicated foregrounds, dynamic backgrounds and non-rigid shapes. A comparative analysis is also made with Principal Component Pursuit (PCP) to demonstrate the efficiency of DECOLOR. Experiments on both simulated data and real sequences demonstrate that DECOLOR outperforms the state-of-the-art approaches and works effectively on a wide range of complex scenarios.
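The core idea, separating a stable background from outlier foreground pixels, can be illustrated with a toy sketch. DECOLOR itself solves a low-rank-plus-outliers optimization (requiring an SVD); the simplified version below replaces the low-rank background model with a per-pixel median over frames, and the frame data is hypothetical.

```python
# Toy illustration of background/foreground separation (NOT the DECOLOR
# algorithm): model the background as the per-pixel median over frames and
# flag large deviations as moving-object (outlier) pixels.
from statistics import median

def foreground_mask(frames, threshold):
    """Return, for every frame, a boolean mask of pixels whose deviation
    from the per-pixel median background exceeds the threshold."""
    n_pixels = len(frames[0])
    background = [median(f[i] for f in frames) for i in range(n_pixels)]
    return [
        [abs(f[i] - background[i]) > threshold for i in range(n_pixels)]
        for f in frames
    ]

# Three 4-pixel frames; in frame 1 a bright object covers pixel 2.
frames = [[10, 10, 10, 10], [10, 10, 200, 10], [10, 11, 10, 10]]
print(foreground_mask(frames, threshold=50)[1])  # [False, False, True, False]
```

DECOLOR improves on this kind of per-pixel thresholding by also enforcing that the detected outliers are spatially contiguous, which is what makes it robust to dynamic backgrounds.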
Today, security is a major problem. Almost all work is done over the internet: crucial data are sent over the web and other information is placed on the internet. While all this data is available online, many types of users interact with it, some for their own needs and others to learn how to destroy the data without the knowledge of its owner. There are various techniques used for the protection of data, but hackers (or crackers) are often intelligent enough to break the security. Basically there are two categories of hackers, distinguished by their intentions. Those with good intentions are known as ethical hackers, because they use their hacking skills and techniques ethically to secure the sensitive data of an organization. To explain this concept fully, we present an introduction to hacking, the types of hackers, a survey of the IC3, the rules of ethical hacking, and the advantages of ethical hacking.
Energy is a valuable input in all sectors of any country's economy. Today most countries draw their energy needs from a variety of sources. Among these sources, wind energy is an efficient and economically viable resource which is safe and has no side effects on the environment. Attempts are being made to develop low-cost windmills which can be used for the generation of electricity and for irrigation purposes. This paper therefore presents various causes and remedies so that the use of wind turbines can be assured as safe and free from any harmful impact on the surroundings and the environment.
FPGA Implementation Of Invisible Watermarking Algorithm Using LSB And DWT Technique
Prof. S.M.Rajbhoj, Gudaghe Aarti, Ghavate Bhagyashree, Bhabad Pallavi
The proposed paper gives an LSB information-hiding algorithm based on the lifting wavelet transform of the image. The idea behind the LSB algorithm is to insert the bits of the hidden message into the least significant bits of the pixels. Information hiding is achieved by having the secret bits replace random noise: embedding the secret information in the lowest bit plane avoids noise and attacks, and exploits redundancy so that the embedding remains visually natural. The results show that the proposed algorithm has very good invisibility of the hidden data, good security, and robustness against many attacks on the hidden content. However, the limitation of capacity has led us to an improved approach achieved through hardware implementation on a field-programmable gate array (FPGA) board. Watermarking is the process of embedding data within the domain of other data; this data can be text, image, audio or video content. MATLAB is used to convert the video into images and then into pixel format, i.e. header files. XPS and VB are used to implement the system: in XPS the hardware and software are configured, the source and header files are added and converted into bit streams, and the result is downloaded onto the FPGA to obtain the secret image. The final result is shown in VB.
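The LSB embedding step described above can be sketched in a few lines: each secret bit replaces the least significant bit of one cover pixel. This is a plain software model of the generic LSB technique, not the paper's FPGA/XPS flow, and the cover pixels and message bits are hypothetical.

```python
# Software model of LSB embedding: write message bits into pixel LSBs.

def embed_lsb(pixels, bits):
    """Return a copy of pixels with the message bits written into the LSBs."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the bit
    return out

def extract_lsb(pixels, n_bits):
    """Read the first n_bits least significant bits back out."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 121, 200, 33, 90, 57]   # hypothetical 8-bit cover pixels
secret = [1, 0, 1, 1]                 # hypothetical message bits
stego = embed_lsb(cover, secret)
print(stego)                 # [121, 120, 201, 33, 90, 57]
print(extract_lsb(stego, 4)) # [1, 0, 1, 1]
```

Each pixel changes by at most 1 in value, which is why the embedding is visually imperceptible; the same bit operations map directly onto the FPGA hardware the paper targets.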
Security is a primary issue in all areas related to computers, and proper authentication is required to make a network secure. The traditional character-based password is vulnerable and easy to hack, so that method has been integrated with new paradigms such as graphical passwords and Captcha. As the technology continues to evolve, a new security primitive based on hard AI problems, called Captcha as Graphical Password (CaRP), has emerged. CaRP is a combination of both Captcha and a graphical password, which makes it hard for intruders to hack the password.
Exploiting a Low Power Self Timed Ternary CAM Architecture Using RWOS Mechanism
Silna George1, I. Rabeek Raja2, M. Prakash
Content addressable memories (CAMs) are mostly used in computer networking devices. Power consumption is a major problem for these memories because of the parallel search operation. Compared to binary content addressable memories, the mask cell increases the complexity and the number of transistors. This paper introduces a new architecture for ternary content addressable memories. The proposed architecture, called dual TCAM, modifies the mask-cell configuration so that the area and power consumption of the design can be considerably reduced. The search output from two CAM cells is used in the mask cell to work as a TCAM. As a design example, an 8 × 4 bit TCAM is implemented and simulated with the Tanner EDA tool. The proposed TCAM achieves a significant reduction in power and in the number of transistors.
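The ternary search behavior a TCAM implements in hardware can be modeled in software: every stored word is compared against the search key, and an 'x' in a stored word masks that bit position ("don't care"). This is only a behavioral sketch of a generic TCAM lookup, not the proposed dual-TCAM circuit, and the example table is hypothetical.

```python
# Behavioral model of a ternary CAM lookup with a first-match priority
# encoder (hardware compares all entries in parallel; the loop here is
# only a software stand-in).

def tcam_match(word: str, key: str) -> bool:
    """A stored word matches if every non-masked bit equals the key bit."""
    return all(w in ("x", k) for w, k in zip(word, key))

def tcam_search(table: list, key: str) -> int:
    """Return the index of the first matching entry, or -1 if none match."""
    for i, word in enumerate(table):
        if tcam_match(word, key):
            return i
    return -1

# A tiny 4-bit-wide table, like one row block of the 8 x 4 design example.
table = ["10x1", "0xx0", "1100"]
print(tcam_search(table, "1011"))  # 0  ("10x1" matches; 'x' masks bit 2)
print(tcam_search(table, "0110"))  # 1
print(tcam_search(table, "1111"))  # -1
```

The parallel comparison of the key against every entry at once is exactly what makes hardware TCAMs fast, and also what makes their power consumption the central concern the paper addresses.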