An Approach to Utilize Memory Streams for Efficient Memory Management
Manish (Student), Sunil Dhankar (Reader)
When memory allocation is performed on relatively large datasets, it is possible to encounter an OutOfMemoryException, which indicates that memory is not available for the allocation. This exception does not occur because the system has run out of physical memory; it occurs when sufficient virtual address space is not available for the requested bytes. The root cause is the current implementation of memory allocation, which uses a single byte array as a backing store.
When the data set is relatively large, this backing store requires more contiguous memory than is available in the virtual address space. If no contiguous region is available, the process encounters an OutOfMemoryException even when enough total memory is free, just not in one contiguous run.
In this research we propose an approach for dynamically deciding the best memory allocator for every application. The proposed solution does not require contiguous memory to store the data contained in the stream. Instead, it uses a dynamic list of small blocks as the backing store, allocated on demand as the stream is used. When no contiguous region is available, allocation can still proceed from these small blocks with no contiguity requirement.
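The idea of a backing store built from a dynamic list of small blocks can be sketched as follows. This is a hypothetical Python illustration (the paper's context is a .NET-style MemoryStream); the class name and the default block size are assumptions for the example:

```python
class ChunkedStream:
    """Stream backed by a list of fixed-size blocks, allocated on demand.

    No single large contiguous buffer is ever requested, so the stream
    can grow even when the address space is fragmented.
    """

    def __init__(self, block_size=65536):
        self.block_size = block_size
        self.blocks = []     # each entry is a bytearray of block_size
        self.length = 0      # logical length of the stream
        self.pos = 0         # current read/write position

    def _ensure(self, end):
        # Allocate new small blocks on demand up to logical offset `end`.
        while len(self.blocks) * self.block_size < end:
            self.blocks.append(bytearray(self.block_size))

    def write(self, data):
        self._ensure(self.pos + len(data))
        offset = 0
        while offset < len(data):
            block, i = divmod(self.pos, self.block_size)
            n = min(self.block_size - i, len(data) - offset)
            self.blocks[block][i:i + n] = data[offset:offset + n]
            self.pos += n
            offset += n
        self.length = max(self.length, self.pos)

    def read(self, n):
        n = min(n, self.length - self.pos)
        out = bytearray()
        while n > 0:
            block, i = divmod(self.pos, self.block_size)
            take = min(self.block_size - i, n)
            out += self.blocks[block][i:i + take]
            self.pos += take
            n -= take
        return bytes(out)

    def seek(self, pos):
        self.pos = pos
```

Reads and writes that span a block boundary are split across blocks, so the caller sees one logical stream while the allocator only ever needs `block_size` bytes of contiguous memory at a time.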
Dynamic parallel K-Means Algorithm Based On Dunn’s Index Method
Hitesh Kumari Yadav, Sunil Dhankar
—Clustering can be defined as a method of unsupervised classification, in which data points are grouped into clusters based on their similarity. K-means is an efficient and widely used algorithm for partitional clustering, and it is effective in producing clusters for many real-time practical applications. In this work a modified parallel K-means algorithm is proposed that overcomes the problems of a fixed number of input clusters and serial execution. The original K-means algorithm accepts a fixed number of clusters as input, but in real-world scenarios it is very difficult to fix the number of clusters in order to get optimal outcomes.
K-means is a popular and widely used clustering technique. Much research has already been done in this area to improve K-means clustering, but further investigation is always required to answer important questions such as 'is it possible to find the optimal number of clusters dynamically while ignoring empty clusters' or 'does the parallel execution of a clustering algorithm really improve its performance in terms of speedup'.
This research presents an improved K-means algorithm which is capable of calculating the number of clusters dynamically using the Dunn's index approach, and which further executes in parallel using Microsoft's Task Parallel Library. The original K-means and the improved parallel modified K-means were run on two-dimensional raw data consisting of different numbers of records. From the results it is clear that the improved K-means is better in all scenarios, whether the number of clusters is increased or the number of records in the raw data is changed. For the same number of input clusters and different data sets, the performance of the modified parallel K-means is 18 to 46 percent better than the original K-means in terms of execution time and speedup.
This paper describes the planning and development of a location-based mobile tourism application used to find tourist places. The application runs on the Android platform; using it, people can find the exact location of a place. The frontend of the app is Android, and the backend is a XAMPP server running PHP. The app helps anyone find and visit a location in less time: through it, users can plan future visits without needing a guide, and it recommends places worth visiting. An emergency feature is also provided: when users think they are in danger, they can contact others.
Ms. Bawake Shraddha N., Ms. Barse Stella N., Kote G. B., Ms. Chechre Pritam, Mr. Chaudhari Suraj
“Formation of E-Commerce Recommendation (E-Shop)”
Recommendation techniques are very important in the fields of E-commerce and other Web-based services. One of the main difficulties is dynamically providing high-quality recommendation on sparse data. In this paper, a novel dynamic personalized recommendation algorithm is proposed, in which information contained in both ratings and profile contents are utilized by exploring latent relations between ratings, a set of dynamic features are designed to describe user preferences in multiple phases, and finally a recommendation is made by adaptively weighting the features.
Recommender systems are changing from novelties used by a few e-commerce sites into serious business tools that are re-shaping the world of e-commerce. Many of the largest commerce web sites already use recommender systems to help their customers find products to purchase. A recommender system learns from a customer and recommends the products that she will find most valuable from among those available. In this paper we explain how recommender systems help e-commerce sites increase sales.
Nayana M. Nale, Shilpa R. Landge, Shradha A. Darekar, Suvarna B. Gadhave, Yogesh S. Jorwekar
Real-Time Carpooling Application for Android Platform
—In today's world, everyone commutes from one place to another for specific reasons, e.g. school, job or excursions. Carpooling is an Android-based application that helps people share car journeys so that more than one person travels in the same car. By adopting this application, users help reduce the large amount of CO2 emitted by cars. Carpooling reduces each person's travel costs such as fuel, parking and tolls, as well as the stress of driving, and sharing journeys is a more environmentally friendly and sustainable way to travel, especially during periods of high fuel prices and high pollution. The application provides security and an easy way to find a car for a journey; because it runs on mobile devices, it is more usable and the service is more dynamic.
With the introduction of smart phones and mobile devices, technology usage has become widespread. It is very common to see someone using a mobile device on the road, in the bus, at work, at school or even at home, regardless of whether a fully functional computer is nearby. Significantly, the usage of smart phones is not limited to basic communication; they have been used as technological tools in our daily lives for years.
We can easily find such applications in many different fields, e.g. image processing, audio tuning, video editing and voice recognition. With the help of these smart applications we are able to interact with our environment much faster and consequently make life easier. Apart from the mobile phone business itself, sharing and staying connected with our environment has become very popular: a person who has a mobile phone with the latest technology wishes to communicate with friends, share feelings instantly and, of course, meet new people who understand him well.
Anonymization in Social Networks: A Survey on the issues of Data Privacy in Social Network Sites
A.Praveena, Dr.S.Smys
Recent years have seen unprecedented growth in the popularity of social network systems. Social networks are online applications which allow their users to connect through various link types. Online social networks such as Facebook and LinkedIn are used by ever more people, making digital communication technologies sharp tools for extending one's social circle. They have already become an important, integral part of our daily lives, enabling us to contact our friends and families at any time. As social networks have developed rapidly, recent research has begun to explore them to understand their structure, their use in advertising and marketing, and data mining of the online population, representing 1.2 billion users around the world. More and more social network data has been made publicly available and analyzed in one way or another.
Optimization of Dynamic Packet Length in Wireless Sensor Network
Mr. Sudip Tembhurne, Mr. Sachin Deshpande
In wireless sensor networks, packet size optimization is an important issue under energy constraints. We provide a dynamic packet length optimization scheme to achieve the best performance in terms of channel utilization and energy efficiency, applying dynamic packet length adaptation to an 802.11 wireless system. The scheme increases data delivery ratio and system throughput and decreases network congestion and end-to-end delay. In this system, the packet delivery ratio stays high (above 95%) and the link estimation error stays within 10% for 95% of links. The system provides an accurate link estimation method which achieves the best performance relative to previous works.
Modified Random Walk Algorithm to Improve the Efficiency of Word Sense Disambiguation (WSD)
Rashmi S, Hanumanthappa M
Natural language processing (NLP) is a field spanning computer science, artificial intelligence and linguistics which mainly concentrates on the interactions between human (natural) languages and the computer. One of the main challenges in NLP is ambiguity. Every language is ambiguous in nature, in that one word can have multiple meanings and multiple words can share the same meaning. Ambiguities are generally categorized into two groups: lexical and structural. Lexical ambiguity arises when there are two or more possible meanings for a single word. Structural ambiguity appears when a given sentence can be interpreted in more than one way due to an ambiguous sentence structure. Word Sense Disambiguation (WSD) is defined as the task of finding the correct sense of a word in a specific context. This paper presents our preliminary work towards building a WSD system by constructing a corpus. We include a detailed analysis of the factors that affect the WSD algorithm, propose a modified algorithm based on the random walk algorithm, and compare the working of each of these algorithms.
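One common random-walk formulation of WSD runs PageRank over a graph whose nodes are candidate senses and whose edges are weighted by sense relatedness; the highest-ranked sense of each word wins. The sketch below is a generic illustration of that formulation only, not the paper's modified algorithm, and the relatedness function passed in (e.g. gloss overlap) is the caller's assumption:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Weighted PageRank. graph: {node: {neighbor: weight}}, undirected."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            # Each node distributes its rank proportionally to edge weights.
            share = sum(rank[m] * graph[m][n] / sum(graph[m].values())
                        for m in nodes if n in graph[m])
            new[n] = (1 - damping) / len(nodes) + damping * share
        rank = new
    return rank


def disambiguate(candidates, relatedness):
    """candidates: {word: [sense, ...]}; relatedness: (sense, sense) -> weight."""
    graph = {s: {} for senses in candidates.values() for s in senses}
    words = list(candidates)
    # Connect senses of *different* words by their semantic relatedness.
    for i, w1 in enumerate(words):
        for w2 in words[i + 1:]:
            for s1 in candidates[w1]:
                for s2 in candidates[w2]:
                    w = relatedness(s1, s2)
                    if w > 0:
                        graph[s1][s2] = w
                        graph[s2][s1] = w
    rank = pagerank(graph)
    # For each word, keep its best-ranked candidate sense.
    return {w: max(candidates[w], key=lambda s: rank.get(s, 0))
            for w in candidates}
```

Senses that are well connected to the senses chosen for surrounding words accumulate rank, which is how the walk encodes "a word's correct sense is the one most coherent with its context."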
Homomorphic authenticable ring signature mechanism for Public Auditing on shared data in the Cloud
Ms. Gayatri D. Patwardhan, Prof. B. W. Balkhande
Cloud computing is the long dreamed vision of computing as a utility, where users can remotely store their data into the cloud so as to enjoy the on-demand high quality applications and services from a shared pool of configurable computing resources. By data outsourcing, users can be relieved from the burden of local data storage and maintenance. Thus, enabling public auditability for cloud data storage security is of critical importance so that users can resort to an external audit party to check the integrity of outsourced data when needed. To securely introduce an effective third party auditor (TPA), the following two fundamental requirements have to be met: 1) TPA should be able to efficiently audit the cloud data storage without demanding the local copy of data, and introduce no additional on-line burden to the cloud user. Specifically, our contribution in this work can be summarized as the following three aspects:
In the present scenario, some people need to be online all the time, but wired media is not available everywhere. In this case, boundless media is the best solution. There are many types of boundless transmission, such as radio, microwave, infrared and laser/light-wave transmission. In this paper we focus on the uses, requirements, benefits and types of unbounded communication.
There is an up-and-coming topic in the field of computer science and technology that is getting a lot of publications these days: Big Data. The term Big Data refers to a collection of large and complex data sets which are very hard to process with database management tools or data processing applications. The volumes are in the range of exabytes and above. Most companies and government agencies deal with it in today's fast-moving technology environment. Stand-alone applications as well as today's newer web-based processes generate large amounts of data. This raises some issues that have to be considered when dealing with big data, among them storage, management and processing. This data also creates new challenges which push framework developers to come up with solutions: they have introduced different frameworks for the storage, management and processing of big data that are scalable and portable, with fault-tolerance capabilities, along with newer technologies for the analysis of such large and complex data. The challenges include privacy and security, scale, and heterogeneity, to mention a few. This paper reviews the issues and challenges of Big Data from the data analytics and storage perspectives.
We also briefly discuss the frameworks and databases that can be used to tackle the challenges facing the Big Data world.
Insights into Sensor Technology and its Applications
Prof. Mayuri Bapat
A sensor is a device which converts an objective quantity to be measured into a signal which can be interpreted, displayed, stored or used to control other data; the signal produced by the sensor corresponds to the quantity being measured. Sensors are used to measure a particular characteristic of an object or device [1]. This paper provides an overview of what sensors are, the different fields where sensors are used, how sensors are used in day-to-day life, their advantages and disadvantages, and the types of sensors. The paper also gives an idea of how sensors play an important role in cyber security.
Cloud computing and the Internet of Things are tightly coupled with each other. The rapid growth of the Internet of Things (IoT) and the development of technologies have created a widespread connection of "things", resulting in the production of large amounts of data which need to be stored, processed and accessed. Cloud computing is a paradigm for big data storage and analytics; while the Internet of Things is exciting on its own, the real innovation will come from combining it with cloud computing, which can enable sensing services and powerful processing of sensing data streams. More things are being connected to address a growing range of business needs; in fact, by the year 2020, more than 50 billion things are expected to connect to the Internet, seven times the human population. Insufficient security will be a critical barrier to large-scale deployment of IoT systems and broad customer adoption of IoT applications using the cloud. Simply extending existing IT security architectures to the IoT and the cloud will not be sufficient; the IoT world requires new security approaches, creating fertile ground for innovative thinking and solutions. This paper discusses key issues that are believed to have long-term significance in IoT and cloud computing security and privacy, based on documented problems and exhibited weaknesses.
Power performance analysis on secure and efficient authentication protocols in mobile devices
S. Kharthikeyan, K. Azarudeen, S. Samsude
The mobile phone has become an important device for communication in our present life, used for everything from formal and informal talks to sharing confidential and secure information, from personal communication to large business deals. Thus, there is a need to guarantee security for applications which are used to send confidential information. To meet mobile users' demands, many cryptographic protocols are chosen based on confidentiality, integrity and authentication. Public-key cryptography is a good solution that fulfils the above-mentioned necessities; applications that use it must weigh computing power and key size when measuring the efficiency of a protocol. This work focuses on the performance attributes of the ECC algorithm, modified in its point addition and multiplication after generating a prime number, which leads to a few changes in the encryption and decryption parameters to improve performance with limited power consumption. The proposed algorithm, studied and compared with the conventional one, is highly secure for mobile communication while consuming minimal power and current. The protocols' energy efficiencies are measured based on the consumption of current and power against time. Experimental results show that the proposed protocol consumes less power and current than the conventional ECC algorithm.
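The point addition and scalar multiplication that dominate ECC computation (and hence power draw) can be illustrated on a toy curve. The sketch below uses the textbook curve y² = x³ + 2x + 2 over GF(17), which is far too small to be secure and does not include the paper's specific parameter modifications; it shows only the standard affine point operations:

```python
# Toy curve y^2 = x^3 + 2x + 2 over GF(17) (textbook example, NOT secure).
P_MOD, A = 17, 2


def point_add(p, q):
    """Add two affine points; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None  # p and q are inverses: result is the point at infinity
    if p == q:
        # Tangent slope for point doubling.
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        # Chord slope for distinct points.
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)


def scalar_mult(k, p):
    """Double-and-add scalar multiplication: computes k * p."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, p)
        p = point_add(p, p)
        k >>= 1
    return result
```

Double-and-add needs only O(log k) point operations, which is why the cost (and power profile) of ECC is governed almost entirely by how these two primitives are implemented.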
Overview of Classification and Risk Evaluation in Multi-dimensional Datasets
Dr. K. Kavitha
Data mining is the task of discovering useful and interesting patterns from huge amounts of data, where the data can be stored in databases, data warehouses and other information repositories. Data mining comprises an integration of techniques from various disciplines such as data visualization, database technology, information retrieval, high-performance computing, machine learning and pattern recognition. The classification of multi-dimensional data is one of the major challenges in data mining and data warehousing: in a classification problem, each object is defined by its attribute values in a multidimensional space. Some existing systems assume that the data analyst can identify the set of candidate data cubes for exploratory analysis based on domain knowledge. Unfortunately, there are conditions under which such assumptions are not valid, including high-dimensional databases, for which it is difficult or impossible to pre-calculate the dimensions and cubes. The proposed system automatically finds the dimensions and cubes which hold informative and interesting data. In high-dimensional datasets, the data analysis procedures need to be integrated with each other; information-theoretic measures such as entropy are used to filter out irrelevant data from the dataset in order to formulate a more compact, manageable and useful schema.
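The entropy-based filtering step can be sketched as follows; this is an illustrative Python fragment, and the attribute names and the cut-off threshold are assumptions for the example, not values from the paper:

```python
import math
from collections import Counter


def entropy(values):
    """Shannon entropy of a discrete attribute, in bits."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


def filter_attributes(rows, names, min_entropy=0.1):
    """Keep only attributes carrying enough information.

    A constant (or near-constant) column has entropy close to zero and
    contributes nothing to classification, so it is dropped from the schema.
    """
    cols = list(zip(*rows))  # transpose rows into columns
    return [name for name, col in zip(names, cols)
            if entropy(col) >= min_entropy]
```

A uniform two-valued attribute has entropy 1 bit, while a constant attribute has entropy 0; thresholding on this value is the simplest form of the information-theoretic pruning described above.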
Inferring User Search Goals with feedback Sessions using Fuzzy K-Means Algorithm
Mr. Gajanan Patil, Miss. Sonal Patil
For a broad topic or an ambiguous query, different types of users may have different search goals when they submit the query to a search engine. The inference and analysis of user search goals can be very useful in improving search engine relevance and user experience. In this paper, we propose a novel approach to infer user search goals by analyzing search engine query logs. First, we propose a framework to discover different user search goals for a query by clustering the proposed feedback sessions. Feedback sessions are constructed from user click-through logs and can efficiently reflect the information needs of users. Second, we propose a novel approach to generate pseudo-documents to better represent the feedback sessions for clustering. Finally, we propose a new criterion, Classified Average Precision (CAP), to evaluate the performance of inferring user search goals. Experimental results are presented using user click-through logs from a commercial search engine to demonstrate the effectiveness of the proposed methods.
Channel Estimation Method for Massive MIMO Using Gaussian Mixture Bayesian Learning
M. Sakthivel, P. Agila, P. Anitha
Pilot contamination poses a fundamental limit on the performance of massive multiple-input multiple-output (MIMO) antenna systems owing to failures in accurate channel estimation. To address this problem, we propose estimating only the channel parameters of the desired links in a target cell, but not those of the interference links from adjacent cells. The desired estimation is, nonetheless, an underdetermined system. In this paper, we show that if the propagation properties of massive MIMO systems are exploited, it is possible to obtain an accurate estimate of the channel parameters. Our strategy is inspired by the observation that, for a cellular network, the channel from user equipment to a base station consists of only a few clustered paths in space. With a very large antenna array, signals can be observed within extremely narrow regions in space. As a result, if the signals are observed in the beam domain (using a Fourier transform), the channel is approximately sparse, i.e., the channel matrix contains only a small fraction of large components, while the other components are near zero. This observation permits channel estimation based on sparse Bayesian learning methods, where sparse channel components can be reconstructed using a small number of observations. Results illustrate that, compared to conventional estimators, the proposed approach achieves much better performance in terms of channel estimation accuracy and achievable rates in the presence of pilot contamination.
In the modern age of the Internet, usage of social media is growing rapidly, and organizing, interpreting and supervising user-generated content (UGC) has become one of the major concerns. Posting new topics on the Internet is not a big task, but searching for topics on the web within a vast volume of UGC is one of the major challenges. In this paper we deal with web search result clustering for improving the search results returned by search engines. Several algorithms already exist, such as Lingo and K-means. Here we work on a descriptive-centric algorithm for web search result clustering called the IFCWR algorithm: a maximum number of clusters is randomly selected using Forgy's strategy, and clusters are iteratively merged until the most relevant results are obtained. Every merge operation executes the Fuzzy C-means algorithm for web search result clustering; clusters are merged based on cosine similarity, creating a new (current) solution with the new configuration of centroids. We investigate the Fuzzy C-means algorithm, perform pre-processing of the search query, and try to give the best solution.
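A minimal Fuzzy C-means, with the membership-matrix updates the merge step relies on, might look like this. This is a plain-Python sketch on Euclidean distance (the paper's variant uses cosine similarity over search results); the fuzzifier m = 2 and the random initialization are common defaults, not parameters reported in the paper:

```python
import math
import random


def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def fuzzy_cmeans(points, c, m=2.0, iters=100, seed=0):
    """Fuzzy C-means: every point belongs to every cluster with a degree.

    Returns (centers, u) where u[i][j] is the membership of point i in
    cluster j, with each row of u summing to 1.
    """
    rng = random.Random(seed)
    # Initialize random memberships, normalized per point.
    u = []
    for _ in points:
        row = [rng.random() for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    centers = []
    for _ in range(iters):
        # Update centers as membership-weighted means.
        centers = []
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(points))]
            total = sum(w)
            centers.append(tuple(
                sum(w[i] * points[i][d] for i in range(len(points))) / total
                for d in range(len(points[0]))))
        # Update memberships from inverse relative distances.
        for i, p in enumerate(points):
            d = [max(dist(p, ctr), 1e-12) for ctr in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2 / (m - 1))
                                    for k in range(c))
    return centers, u
```

Unlike hard K-means, the soft memberships give a natural merge criterion: two clusters whose membership profiles (or centroids) are highly similar are candidates for merging.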
Steganography is used to hide secret information such as passwords, text, pictures and audio behind an original cover file. The original message is converted into cipher text using a secret key and then hidden in the LSBs of the original image. The current work describes crypto-steganography for audio and video, a combination of image, audio and video steganography, using a forensics technique as a tool for authentication. The main aim is to hide secret data behind the image and audio of a video file. As video is composed of many still frames of images and audio, any frame can be selected for hiding secret information. A suitable algorithm such as AES is used for security and authentication, and image processing is used, so data security can be increased. For data embedding, the 4LSB algorithm is used.
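The 4LSB embedding step mentioned above can be sketched on raw bytes as follows. This illustrative Python fragment covers only the embedding itself, not the AES encryption or frame selection described in the work; it assumes the cover is a flat byte sequence such as a grayscale frame:

```python
def embed_4lsb(cover, message):
    """Hide each message byte in the 4 LSBs of two consecutive cover bytes."""
    if 2 * len(message) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, byte in enumerate(message):
        # High nibble into one cover byte's low bits, low nibble into the next.
        stego[2 * i] = (stego[2 * i] & 0xF0) | (byte >> 4)
        stego[2 * i + 1] = (stego[2 * i + 1] & 0xF0) | (byte & 0x0F)
    return bytes(stego)


def extract_4lsb(stego, length):
    """Recover `length` message bytes from the 4 LSBs of the stego bytes."""
    return bytes(((stego[2 * i] & 0x0F) << 4) | (stego[2 * i + 1] & 0x0F)
                 for i in range(length))
```

Because only the low nibble of each cover byte changes, no pixel value moves by more than 15, which keeps the visual distortion small while doubling the capacity of 2-LSB schemes.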
In the cloud, we describe ways to protect multimedia contents from redistribution. The web has billions of documents including video, audio and images, but there is no central management system, so duplication of content is common; it is said that each and every document has a duplicate copy. This is even more prevalent for videos, stored in multiple formats, versions and sizes, and content creators often remain unaware when their videos are modified and republished using video editing tools. This may lead to security problems, dilute the identity of owners, cause loss of revenue to content creators, and occupy an enormous amount of space on the web. It is also common in cloud storage, involving both public and private clouds, although the private cloud is said to be more secure than the public cloud. To avoid this situation, several techniques have been used to prevent duplication of content, focused mainly on 3D-video contents.
Bluetooth technology unplugs our digital peripherals. In short, it is a wireless replacement for many of the cables we currently use to transmit voice and data signals. Bluetooth radio modules use Gaussian Frequency Shift Keying (GFSK) for modulation. Bluetooth employs an FHSS spreading technique, changing frequencies at a rate of 1600 times per second - 160 times the rate at which a wireless LAN changes frequencies. This paper focuses on the attacks and security issues in Bluetooth technology.
Mr. Rahul Samant, Mr. Sachin Deshpande, Mr. Anil Jadhao
Multi Criteria Recommendation System for Material Management
Material management is related to planning, controlling and organizing the flow of material from availability to requirement. Mapping excess material to an appropriate location while considering multiple criteria is one of the administrative decision-making tasks.
Aatish B. Shah, Jai Kannan, Deep Utkal Shah, Prof. S. B. Ware, Prof. R. S. Badodekar
Fog Computing: Securing the cloud and preventing insider attacks in the cloud.
Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. With these new computing and communication paradigms, new data security challenges arise. Even though existing techniques use security mechanisms, they fail to prevent data theft attacks. To overcome this, we can use decoy technology to secure data stored in the cloud. Although Fog Computing is defined as an extension of the Cloud Computing paradigm, its distinctive characteristics in location sensitivity, wireless connectivity and geographical accessibility create new security and forensics issues and challenges which have not been well studied in Cloud security and Cloud forensics.
We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a Cloud environment.
Consider the problem of allocating resources such as machines, memory or bandwidth to meet the demands of a given set of jobs. Resource allocation should be done in such a manner that the available resources are assigned to the given jobs economically. The Resource Allocation (RESALL) problem is motivated by its applications in many real-life scenarios such as interval scheduling and workforce management, and it is NP-hard. The input consists of a set of resources and a demand profile (corresponding to the set of jobs). Each resource has a capacity, a cost, a start time and a finish time; the cost of a solution is the sum of the costs of the resources included in it. The goal is to find a minimum-cost feasible solution such that multiple units of each resource may be included and, at every timeslot, the sum of the capacities of the chosen resources is at least the demand at that timeslot. This problem definition corresponds to the MULTIRESALL version of the RESALL problem.
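The feasibility condition above, plus a naive greedy heuristic for MULTIRESALL, can be sketched as follows. Since the problem is NP-hard, the greedy routine is purely illustrative and carries no optimality guarantee; the `(capacity, cost, start, finish)` tuple layout is an assumption for the example:

```python
def is_feasible(resources, demand):
    """Check that total capacity meets demand at every timeslot.

    resources: list of (capacity, cost, start, finish), active on
    timeslots start..finish-1; demand[t] is the requirement at slot t.
    """
    for t, need in enumerate(demand):
        cap = sum(c for (c, _, s, f) in resources if s <= t < f)
        if cap < need:
            return False
    return True


def greedy_multiresall(available, demand):
    """Repeatedly add the cheapest resource covering the worst-deficit slot.

    Multiple units of the same resource may be chosen (the MULTIRESALL
    setting). Returns None if some slot cannot be covered at all.
    """
    chosen = []
    while not is_feasible(chosen, demand):
        deficit = [need - sum(c for (c, _, s, f) in chosen if s <= t < f)
                   for t, need in enumerate(demand)]
        t = max(range(len(demand)), key=lambda i: deficit[i])
        covering = [r for r in available if r[2] <= t < r[3]]
        if not covering:
            return None  # no resource is active at slot t
        chosen.append(min(covering, key=lambda r: r[1]))  # cheapest cost
    return chosen
```

Each iteration strictly reduces the deficit at the chosen slot (capacities are positive), so the loop terminates; the gap to the optimum, however, can be large, which is what the approximation algorithms for RESALL address.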
Prajakta S. Jadhav, Preeti P. Dapkekar, Shubhangi M. Jadh, Kartik K. Kondalkar, Shabnam S. Shaikh
A Review of Damaged Manuscripts Using Binarization Techniques.
Image binarization is the procedure of separating pixel values into two parts: black as foreground and white as background. In binarization, a document image is converted into a binary image using thresholding techniques. Many binarization algorithms have been proposed for different types of degraded document images. The main objective of image enhancement is to modify attributes of an image to make it more suitable for a given task. Binarization techniques are used to remove noise and improve the quality of the document, and thresholding is one such binarization technique used for this purpose. Thresholding is further divided into global and local techniques. For documents with a uniform contrast distribution between background and foreground, global thresholding has been found to be the best technique. Local thresholding is an approach for situations in which single-value thresholding does not yield proper results; to overcome this, a hybrid approach is introduced which combines local thresholding with Otsu's thresholding.
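Otsu's global thresholding, referenced above, chooses the threshold that maximizes the between-class variance of foreground and background. A compact sketch (illustrative Python, operating on a flat list of grayscale values in 0..255):

```python
def otsu_threshold(pixels):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0    # pixel count of the background class (values <= t)
    sum0 = 0  # intensity sum of the background class
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                   # background mean
        m1 = (total_sum - sum0) / w1     # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t


def binarize(pixels, t):
    """Map pixels at or below the threshold to black, the rest to white."""
    return [0 if p <= t else 255 for p in pixels]
```

In the hybrid scheme described above, this global threshold would be computed per local window (or combined with a local statistic) rather than once for the whole page.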
In India, many lives are affected every day because patients are not operated on timely and properly. Real-time parameter values are also not efficiently measured in clinics and hospitals: it is often difficult for hospitals to frequently check patients' conditions, and continuous monitoring of ICU patients is not always possible. Our system is designed to deal with these situations. It is intended for use in hospitals for measuring and monitoring various parameters like temperature, ECG and heart beat. The results can be recorded using a Raspberry Pi and displayed on an LCD display, and they can also be sent to a server using a GSM module, so doctors can log in to a website and view them.
Nowadays, people tend to seek health-related knowledge or information on the Internet through online healthcare services. The basic aim of this system is to bridge the vocabulary gap between patients and health providers by providing instant replies to the questions posted by patients. Automatically generated content is chosen for healthcare services instead of traditional community-generated systems because it is reliable, compatible, and provides instant replies. This paper proposes a scheme to code medical records using local mining and global learning approaches. Local mining aims to code a medical record by extracting the medical concepts from the individual record and then mapping them to terminologies based on external authenticated vocabularies; it establishes a tri-stage framework to accomplish this task. Global learning aims to learn missing key concepts and propagate precise terminologies among underlying connected records over a large collection.
Most often in our daily life we have to carry a lot of cards, such as credit cards, debit cards and special cards for toll systems, ERP, parking and personal identification. Smart card implementations can be seen around the world today, but they are not unified, i.e. each developer uses different programming standards and data structures, and a smart card typically provides service to the user only within a university campus or an organization.
In order to make such multiple-application access available through a single card to every individual, we propose to use RFID technology, which is cost effective. As RFID technology is used in the proposed concept, the programming standards and data structures will be unified. Unlike a smart card, the RFID card can be used by every individual to access different applications. Thus, a person need not carry a number of cards; a single card serves different purposes.
This paper presents an interactive model-based system for managing production and controlling processes in a spinning mill using embedded systems and the Internet of Things. The system consists of various sensors that measure and store the parameters from which the production rate is calculated. Apart from a comprehensive presentation of the set of modules the system is composed of, together with their interrelationships, these characteristics are analyzed and their impact on the production control system is explained. The system also covers two control processes, an air cooler controller and a moisture mixer sprayer controller, which are currently operated manually and are being automated. By automating them, the quality of the yarn produced in such industries can be controlled effectively. The system's attributes are presented with the aid of data structure diagrams, the complete algorithm for the Arduino module is provided in algorithmic form, and a survey of comparable systems is included.
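A controller like the air cooler controller is typically written with hysteresis, so the cooler does not switch rapidly on and off around a single threshold. The sketch below shows that rule in plain Python (the paper's own implementation runs on an Arduino); the setpoints are assumptions for the example.

```python
# Illustrative hysteresis rule for an air-cooler controller. Setpoints are
# assumed values, not taken from the paper.

ON_ABOVE = 32.0   # degrees C: switch the cooler on above this
OFF_BELOW = 28.0  # degrees C: switch the cooler off below this

def cooler_state(temp_c, currently_on):
    """Return the next on/off state given the measured temperature."""
    if temp_c > ON_ABOVE:
        return True
    if temp_c < OFF_BELOW:
        return False
    return currently_on  # inside the band: keep the current state

print(cooler_state(33.0, False))  # hot -> turn on
print(cooler_state(30.0, True))   # inside the band -> stay on
```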
It has always been an arduous task for the Election Commission to conduct free and fair polls in our country, the largest democracy in the world. Crores of rupees have been spent to ensure that elections are riot-free. But nowadays it has become common for some forces to indulge in rigging, which may eventually lead to a result contrary to the actual verdict given by the people. This paper presents a new voting system employing biometrics in order to avoid rigging and to enhance the accuracy and speed of the process. The system uses the thumb impression for voter identification, since the thumb impression of every human being has a unique pattern; thus it has an edge over present-day voting systems. As a pre-poll procedure, a database consisting of the thumb impressions of all eligible voters in a constituency is created. During elections, the thumb impression of a voter is entered as input to the system and compared with the available records in the database. If the pattern matches one in the available records, access to cast a vote is granted. If the pattern does not match the records in the database, or in case of repetition, access to cast a vote is denied or the vote is rejected, and the police station near the polling booth is informed about the identity of the impostor. All the voting machines are connected in a network through which data is transferred to the main host; the result is instantaneous, and counting is done at the main host itself. The overall cost of conducting elections is reduced, and so is the maintenance cost of the systems.
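The poll-day decision logic described above (grant on match, reject repeats, flag non-matches) can be sketched as follows. Fingerprint matching is reduced to an exact template lookup purely for illustration; real matchers compare minutiae features, not raw equality, and all names here are invented.

```python
# Schematic of the biometric voting flow: match a thumb-impression template,
# grant one vote per voter, and flag unknown prints. Data is invented.

registered = {"tmpl_ravi": "voter_001", "tmpl_sita": "voter_002"}
already_voted = set()

def try_vote(template):
    """Return a decision string for a presented thumb-impression template."""
    voter = registered.get(template)
    if voter is None:
        return "DENY: no match, notify police station"
    if voter in already_voted:
        return "DENY: duplicate vote rejected"
    already_voted.add(voter)
    return f"GRANT: {voter} may cast vote"

print(try_vote("tmpl_ravi"))     # first attempt -> granted
print(try_vote("tmpl_ravi"))     # repetition -> rejected
print(try_vote("tmpl_mallory"))  # unknown print -> denied
```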
Advance Dynamic Malware Analysis Using API Hooking
Ajay Kumar , Shubham Goyal
As in the real world, in the virtual world there are people who want to take advantage of you by exploiting your money, your status, your personal information, and so on. Malware helps these people accomplish their goals. The security of modern computer systems depends on the ability of users to keep software, the OS, and antivirus products up to date. To protect legitimate users from these threats, I made a tool
(ADVANCE DYNAMIC MALWARE ANALYSIS USING API HOOKING) that reports every task that a piece of software (malware) performs on your machine at run time.
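The tool itself hooks Windows API calls; as a language-neutral sketch of the same idea, the snippet below wraps a function so that every call is logged before the original runs, which is the essence of an API hook. The hooked function is ordinary Python, chosen only to illustrate the mechanism.

```python
# Conceptual API hook: interpose a logging wrapper in front of a function so
# every invocation is recorded at run time. The "API" here is a stand-in.

import functools

call_log = []

def hook(fn):
    """Wrap fn so each invocation is recorded with its arguments."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        call_log.append((fn.__name__, args))  # record before executing
        return fn(*args, **kwargs)
    return wrapper

def delete_file(path):               # stand-in for a sensitive API
    return f"deleted {path}"

delete_file = hook(delete_file)      # install the hook
delete_file("C:/important.doc")
print(call_log)
```

A real dynamic-analysis tool does the same interposition at the Win32 API boundary (e.g. by rewriting import tables or trampolining), so the malware's file, registry, and network calls all pass through the logger.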
Wireshark is a network protocol analyser. It can intercept packets transmitted over the network, compile statistics about network usage, allow the user to view content being accessed by other network users, and store usage information for offline access. This paper compares Wireshark with one other similar tool, Network Miner, a Network Forensic Analysis Tool (NFAT), on several parameters: the basic graphical user interface, packet information, and traffic analysis. Network Miner can be used as a passive network sniffer/packet-capturing tool and can parse PCAP files for offline analysis.
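Both tools begin by reading the PCAP container format. To make that concrete, the sketch below parses a PCAP file's 24-byte global header using only the standard library; the header bytes are constructed in memory rather than captured from a real network.

```python
# Parse the pcap global header that tools like Wireshark and Network Miner
# read first. The bytes are synthesized here for illustration.

import struct

# magic, version 2.4, tz offset 0, sigfigs 0, snaplen 65535, linktype 1 (Ethernet)
header = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)

magic, vmaj, vmin, thiszone, sigfigs, snaplen, linktype = struct.unpack(
    "<IHHiIII", header)

assert magic == 0xA1B2C3D4  # little-endian pcap magic number
print(f"pcap v{vmaj}.{vmin}, snaplen={snaplen}, linktype={linktype}")
```

After this header, the file is a sequence of per-packet record headers and raw frames, which is what the analysers dissect protocol by protocol.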
The cloud is redefining the IT architectures of various business domains. Organizations have clearly recognized the benefits of the cloud, such as dynamic payloads on the technical side and elastic financial models on the commercial side, which together promise greater efficiency. These benefits can be fully maximized by applying novel technologies in security and risk-management processes. The security risk factors in public cloud computing are much higher than in traditional, datacenter-based computing. In a highly shared ad hoc cloud environment with instance-to-instance network connectivity, securing applications and sensitive data is a major challenge faced by cloud providers. Because the entire stack of applications, platform, and infrastructure of the cloud is designed and managed by service providers, cloud users are uncertain about its security. This paper studies the generic security challenges in an ad hoc cloud environment; the security challenges and mitigations are discussed in Section I. A survey was conducted with cloud users from domains such as healthcare, education, and retail business, and the analysis of the survey data and its results are also discussed.
File Encryption System Based on Symmetric Key Cryptography
Ajay Kumar, Ankit Kumar
In today's scenario, files are not secure. They can be obtained through many kinds of attack by an eavesdropper, such as cracking PINs or crashing the OS with viruses and malware. Today we cannot be sure that file-protection wizards are secure and that data cannot reach an attacker. But if files are encrypted, then even when the files are accessed the original data remains confidential. Therefore, this paper presents a File Encryption System based on symmetric key cryptography. I propose a strategy in which files (even multiple files, compressed into one 'rar' file) are encrypted using Blowfish as the encryption/decryption standard, operating in Cipher Block Chaining (CBC) mode. I implemented a compression function, a 64-bit Initialization Vector (IV), CBC mode with Blowfish, and RC4 for a 256-bit keystream. This is more efficient and secure than other general encryption processes.
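Blowfish is not in the Python standard library, so the sketch below swaps in a deliberately insecure toy XOR "cipher" purely to show how the CBC chaining used above works: each plaintext block is XORed with the previous ciphertext block (the IV for the first block) before encryption. Do not use this for real security; it only illustrates the mode, not the cipher.

```python
# CBC-mode chaining demonstrated with a toy XOR block cipher. Blowfish's
# 64-bit block size is kept; everything else is for illustration only.

BLOCK = 8  # bytes: Blowfish's block size is 64 bits

def toy_encrypt(block, key):
    """Stand-in block cipher: XOR with the key (NOT secure)."""
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(plaintext, key, iv):
    assert len(plaintext) % BLOCK == 0, "pad the input to a block multiple"
    prev, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        block = plaintext[i:i + BLOCK]
        chained = bytes(p ^ c for p, c in zip(block, prev))  # CBC chaining
        prev = toy_encrypt(chained, key)
        out += prev
    return out

ct = cbc_encrypt(b"SECRET!!" * 2, b"\x13" * BLOCK, b"\x00" * BLOCK)
print(ct.hex())
```

Note that the two identical plaintext blocks produce different ciphertext blocks, which is exactly what chaining buys over ECB mode.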
Smart Energy Meter Using Android Application And GSM Network
Diya Elizabeth Paul, Prof. Alpha Vijayan
This system is implemented to provide a fully automated electricity billing system. It aims to measure and monitor the electricity consumed by consumers in a locality, transmit the consumed power to the station, and issue the bill for the consumed power automatically. It also aims to detect malpractice in the meter. Using this system, the Electricity Board can access all data regarding the power consumed in each home and at each station whenever required. From this data the Board can detect power theft; the system also charges an extra payment for excess usage of power at peak time (6.00-10.00 pm). Online payment is also possible in the new system. GSM is used for automating the system: transmission of consumed units, alerts, and bill reception are handled by the GSM module on the client side as configured by the user, and the server station is likewise served by a GSM module for transmission and reception of data.
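The peak-time surcharge described above is a simple tariff computation. The sketch below shows one way to bill hourly consumption with a higher rate in the 6.00-10.00 pm window; the per-unit price and surcharge percentage are assumptions for illustration, not figures from the paper.

```python
# Billing sketch with a peak-time surcharge. The 18:00-22:00 window comes
# from the abstract; RATE and PEAK_SURCHARGE are assumed values.

RATE = 5.0            # currency units per kWh (assumed)
PEAK_SURCHARGE = 0.5  # extra 50% during peak hours (assumed)

def bill(units_by_hour):
    """units_by_hour maps hour of day (0-23) to kWh consumed in that hour."""
    total = 0.0
    for hour, kwh in units_by_hour.items():
        rate = RATE
        if 18 <= hour < 22:          # 6.00-10.00 pm peak window
            rate *= 1 + PEAK_SURCHARGE
        total += kwh * rate
    return round(total, 2)

print(bill({10: 2.0, 19: 2.0}))  # 2 kWh off-peak + 2 kWh peak
```

The server station would run a computation like this over the units reported by each meter's GSM module, then send the resulting bill back to the consumer.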