A Review on Various Techniques of Image Compression
Manju Devi, Uma Mehta
Images are an important form of data and are used in almost every application. Some applications cannot use images directly because of the large amount of memory needed to store them. Image compression reduces the irrelevance and redundancy of image data so that the data can be stored or transmitted in an efficient form, and therefore plays an important role in storing images. In this paper we analyze different types of existing image compression methods. The natural questions are how an image is compressed and which techniques are used: broadly, lossy and lossless techniques are used to compress images.
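As a minimal illustration of the two families just named (a sketch assuming Python with numpy and zlib, not a method from any reviewed paper), the snippet below compresses the same image losslessly and after a lossy quantization step:

```python
import zlib
import numpy as np

# Smooth synthetic image: highly compressible, like natural image regions.
img = np.tile((np.arange(64) * 4).astype(np.uint8), (64, 1))
raw = img.tobytes()

lossless = zlib.compress(raw)                  # fully reversible
assert zlib.decompress(lossless) == raw

quantized = (img // 16) * 16                   # lossy: discard the low 4 bits
lossy = zlib.compress(quantized.tobytes())     # coarser data compresses further

print(len(raw), len(lossless), len(lossy))     # lossy stream is typically smallest
```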
Review on Energy Efficient LEACH Protocol in Wireless Sensor Network
Anju Godara, Pankaj Rathee
In a Wireless Sensor Network (WSN), the energy consumed by the sensor nodes determines the network lifetime and battery backup, which affect how long the nodes stay alive. LEACH is a hierarchical routing protocol designed specifically for wireless sensor networks. Due to the randomness of LEACH's cluster head (CH) selection, any sensor node may become cluster head. A larger number of clusters in the sensing field reduces cluster size, the energy consumption of nodes and clusters, and the data transferred between cluster heads and the base station, but the system fails to maintain stability. The purpose is to increase the alive time of nodes so that more data can be transferred between the BS and the CHs without failure of any node. In the improved LEACH protocol, after clusters are established, each cluster head decides, based on its energy consumption, whether to select a new cluster head; the new cluster head will be the node having the most residual energy. Thus, clusters with less load can avoid the energy consumption that results from selecting a cluster head frequently. A homogeneous sensor network consists of identical nodes, while a heterogeneous sensor network consists of two or more types of nodes organized into hierarchical clusters.
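A minimal sketch of the residual-energy-based cluster head rotation described above (the names, threshold and cost model are illustrative assumptions, not the protocol's exact formulation):

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float  # joules remaining

def rotate_cluster_head(current_head: Node, members: list[Node],
                        round_cost: float, threshold: float = 0.5) -> Node:
    """After a round, keep the current head if its remaining energy is still
    comfortable; otherwise hand over to the member with the most residual energy."""
    current_head.residual_energy -= round_cost  # head pays the aggregation/TX cost
    if current_head.residual_energy >= threshold:
        return current_head
    candidates = members + [current_head]
    return max(candidates, key=lambda n: n.residual_energy)

# Example: the head drops below the threshold, so the richest member takes over.
head = Node(0, 0.6)
members = [Node(1, 0.9), Node(2, 0.4)]
new_head = rotate_cluster_head(head, members, round_cost=0.2)
print(new_head.node_id)  # -> 1
```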
An Approach to Face Recognition & Detection Techniques
Parveen Kumar, Doulat Singh
One of the most critical decision points in the design of a face recognition system is the choice of an appropriate face representation: effective feature descriptors are expected to convey sufficient invariant and non-redundant facial information. Motion information is used to find the moving regions, and probable eye-region blobs are extracted by thresholding the image. These blobs reduce the search space for face verification, which is done by template matching. Experimental results for face detection show good performance even across orientation and pose variation to a certain extent. Face recognition is carried out by cumulatively summing the Euclidean distances between the test face image and the stored database, which shows good discrimination between true and false subjects. Since the human face is a dynamic object with a high degree of variability in its appearance, face detection is a difficult problem in computer vision; in this field, accuracy and speed of identification are the main issues.
The goal of this paper is to evaluate various face detection and recognition methods and provide a complete solution for image-based face detection and recognition with higher accuracy and better response rate, as an initial step toward video surveillance. The solution is proposed based on tests performed on various face-rich databases in terms of subjects, pose, emotions, race and lighting.
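A hedged sketch of the cumulative Euclidean-distance matching step mentioned above, with feature extraction abstracted away as plain vectors (match_face and the toy data are illustrative, not the paper's implementation):

```python
import numpy as np

def match_face(test: np.ndarray, database: dict[str, list[np.ndarray]]) -> str:
    """Return the subject whose stored images have the smallest cumulative
    Euclidean distance to the test image."""
    scores = {
        subject: sum(np.linalg.norm(test - img) for img in imgs)
        for subject, imgs in database.items()
    }
    return min(scores, key=scores.get)

rng = np.random.default_rng(0)
db = {"alice": [rng.random(64) for _ in range(3)],
      "bob":   [rng.random(64) for _ in range(3)]}
probe = db["alice"][0] + 0.01 * rng.random(64)  # noisy copy of an "alice" image
print(match_face(probe, db))  # -> alice
```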
Recognition of Fake Currency Based on Security Thread Feature of Currency
Eshita Pilania, Bhavika Arora
In the last few years, with great technological advances in color printing, duplicating and scanning, counterfeiting problems have become more serious. In the past only authorized printing houses had the ability to make currency paper, but nowadays it is possible for anyone to print fake banknotes with the help of modern technology such as a computer and a laser printer. Counterfeit notes are a burning question in almost every country, but India has been hit really hard and the problem has become very acute. Fake Indian currency of 100, 500 and 1000 rupees seems to have flooded the whole system, and a common person has no proper way to deal with it. There is a need to design a system that helps recognize paper currency notes quickly and reliably. Our system describes an approach for verification of Indian and other countries' currency banknotes; the currency is verified using image processing techniques.
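One illustrative fragment of such image-processing-based verification (an assumption of how a security-thread check might look, not the paper's exact pipeline): on a genuine note the thread appears as a dark vertical stripe, so we look for a markedly dark column in the grayscale scan:

```python
import numpy as np

def has_security_thread(gray: np.ndarray, drop: float = 0.25) -> bool:
    """gray: 2-D grayscale note image scaled to [0, 1].
    Flags the note if some column is markedly darker than the page average."""
    column_means = gray.mean(axis=0)
    return column_means.min() < gray.mean() - drop

note = np.full((200, 400), 0.8)
note[:, 180:185] = 0.1            # dark stripe where the thread is woven in
print(has_security_thread(note))  # -> True
```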
LTE Network Structure, QoS Considerations, Bearers and Interfaces
Husham Ibadi, R. S. Kawitkar
In this paper a comprehensive discussion of the entire EPS network structure and the functionality of the various nodes residing in both the radio access part and the non-radio network part is provided. The EPS interfaces, bearer establishment, QoS criteria and interworking with other networks such as GSM are also outlined.
Improved MapReduce K-Means Clustering Algorithm for Hadoop Architecture
Shweta Mishra, Vivek Badhe
A cluster is a group of data members having similar characteristics. The process of establishing a relation or deriving information from raw data by performing operations on the data set, such as grouping, is known as data mining. Data collected in practical scenarios is usually completely random and unstructured; hence, there is always a need for analysis of unstructured data sets to derive meaningful information. This is where unsupervised algorithms come into the picture, to process unstructured or even semi-structured data sets. K-Means clustering is one such technique used to give structure to unstructured data so that meaningful information can be extracted. This paper discusses the implementation of the K-Means clustering algorithm over a distributed environment using Apache Hadoop. The key to the implementation is the design of the Mapper and Reducer routines, which is discussed in the later part of the paper. The steps involved in the execution of the K-Means algorithm are also described, based on a small-scale implementation on an experimental setup, to serve as a guide for practical implementations.
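A minimal single-process sketch of the Mapper/Reducer split discussed above (in a real job these pairs would flow through Hadoop Streaming; the function names are illustrative):

```python
import numpy as np

def mapper(point: np.ndarray, centroids: np.ndarray):
    """Emit (nearest-centroid-id, point) for one input record."""
    cid = int(np.argmin(np.linalg.norm(centroids - point, axis=1)))
    return cid, point

def reducer(cid: int, points: list[np.ndarray]) -> np.ndarray:
    """Average all points assigned to one centroid to get its new position."""
    return np.mean(points, axis=0)

def kmeans_round(data: np.ndarray, centroids: np.ndarray) -> np.ndarray:
    groups: dict[int, list[np.ndarray]] = {}
    for p in data:                      # map phase
        cid, point = mapper(p, centroids)
        groups.setdefault(cid, []).append(point)
    new = centroids.copy()
    for cid, pts in groups.items():     # reduce phase
        new[cid] = reducer(cid, pts)
    return new

data = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [4.9, 5.0]])
cents = np.array([[0.0, 0.1], [5.0, 5.0]])
for _ in range(5):                      # driver iterates until convergence
    cents = kmeans_round(data, cents)
print(cents)
```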
The protection afforded to an automated information system in order to attain the applicable objectives of preserving the integrity, availability and confidentiality of information system resources is called computer security. The integrity of a message should be preserved as it travels from the sender to the recipient; it is compromised if the message is modified during transit. The principle of availability states that resources should be available to authorized parties at all times. The principle of confidentiality specifies that only the sender and the intended recipient should be able to access the contents of a message. Data security is vital to ensure the privacy of a user from others, and for an organization it is very necessary to keep its information safe from various attackers. Computer and network security is a battle of wits between a perpetrator who tries to find holes and the designer or administrator who tries to close them. Strong encryption algorithms can be used to make it impractical for an attacker to attack a node that is protected by multiple keys. This work is focused on the use of dynamic keys for securing data and data transmission.
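A hedged sketch of the dynamic-key idea (the derivation scheme is an assumption for illustration; Fernet from the `cryptography` package supplies the underlying authenticated encryption):

```python
import hashlib, base64
from cryptography.fernet import Fernet

def derive_key(master_secret: bytes, counter: int) -> bytes:
    """Derive a per-message key from a shared master secret and a counter."""
    digest = hashlib.sha256(master_secret + counter.to_bytes(8, "big")).digest()
    return base64.urlsafe_b64encode(digest)  # Fernet wants 32 url-safe b64 bytes

master = b"shared-master-secret"
for counter, msg in enumerate([b"first message", b"second message"]):
    key = derive_key(master, counter)        # the key changes every message
    token = Fernet(key).encrypt(msg)
    assert Fernet(derive_key(master, counter)).decrypt(token) == msg
```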
Web Personalization Based on Enhanced Web Access Pattern using Sequential Pattern Mining
Kuldeep Singh Rathore, Sanjiv Sharma
The development of the web has created a big challenge in directing clients to the web pages they need. Accordingly, the only option is to capture the intuition of the client and provide them with a list of recommendations. More specifically, online navigation behavior develops with each passing day; consequently, extracting information intelligently from it is a tedious task. The webmaster of an organization ought to utilize web mining methods to capture this intuition, and Web Usage Mining (WUM) is one of them. Web usage mining operates on web server logs; the logs record clients' browsing patterns on the web, which is very useful for web recommendation. Recommendation is an application of WUM: a recommendation system can be utilized to forecast the navigation pattern of a client and present pages to the client in the form of a recommendation list. In this paper, we propose a recommendation approach that recommends a list of pages based upon the client's historic pattern (recorded within the web log). This approach improves the accuracy of the pages displayed to the client.
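A toy sketch of log-based next-page recommendation, offered as one plausible shape of such a recommender rather than the paper's algorithm: count page-to-page transitions in past sessions and rank candidate next pages:

```python
from collections import Counter, defaultdict

def build_model(sessions: list[list[str]]) -> dict[str, Counter]:
    transitions: dict[str, Counter] = defaultdict(Counter)
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            transitions[current][nxt] += 1
    return transitions

def recommend(model: dict[str, Counter], page: str, k: int = 3) -> list[str]:
    return [p for p, _ in model.get(page, Counter()).most_common(k)]

logs = [["home", "products", "cart"],
        ["home", "products", "reviews"],
        ["home", "blog"]]
print(recommend(build_model(logs), "home"))  # -> ['products', 'blog']
```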
Effect of Curing Methods on the Compressive Strength of Concrete
Obam, Ogah
Different methods are usually adopted to cure concrete, and concrete strength partly depends on the method and duration of curing. The structural use of concrete depends largely on its strength, especially compressive strength. This study uses three curing methods to determine their effects on the compressive strength and density of concrete: immersion of concrete cubes in a curing tank (ponding), covering the cubes with wet rugs (continuous wetting) and the use of polythene sheet (water barrier). Laboratory experimental procedures were adopted. A total of sixty (60) cubes were cast with a 1:2:4 mix ratio and cured in the laboratory at room temperature. The results showed that the average 28-day compressive strength varies with the curing method: the cubes cured by immersion have an average compressive strength of 29.7 N/mm2, while those cured with wet rugs and polythene sheet have average compressive strengths of 26.8 and 24.7 N/mm2 respectively. Traditional curing by immersion appeared to be the best method to achieve the desired concrete strength.
A New Approach To Automatic Generation Of All Quadrilateral Meshes Over A Linear Convex Polygon With H-Refinements For Finite Element Analysis
H. T. Rathod, K. Sugantha Devi
This paper presents an automatic mesh generation scheme for a linear convex polygonal domain $\bar\Omega$ in $\mathbb{R}^2$ with boundary composed of piecewise straight lines. We can express $\bar\Omega = \bigcup_{i=1}^{N} \bar\Omega_i$, in which $\bar\Omega$ is a polygonal domain of $N$ oriented edges with end points $P_i$ and $P_{i+1}$ (with $P_{N+1} = P_1$). It is assumed that $\bar\Omega$ can be discretised into a set of $N$ triangles $\bar\Omega_i$, and we can also divide each triangle into smaller triangles, which gives us a triangular mesh with h-refinement; when this procedure is repeated for all triangles we obtain a triangular mesh with h-refinement for the linear convex polygon. Further, we can discretise each triangle into three special quadrilaterals $Q_a$, $a = 0, 1, 2$, obtained by joining the centroid to the midpoints of its sides, and we have thus created an all-quadrilateral mesh. We choose $\bar\Omega_i$ as an arbitrary triangle with vertices $(x_k, y_k)$, $k = 1, 2, 3$, in Cartesian space. We have refined this mesh two further times: first, 12 new quadrilaterals are created by joining the centroids of the 3 special quadrilaterals to the midpoints of their sides, which can be called the first h-refinement; then each of these quadrilaterals is divided into four smaller quadrilaterals, again by joining the centroids to the midpoints of sides, which we call the second h-refinement.
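The triangle-to-three-quadrilaterals step lends itself to a short geometric sketch (a generic illustration, not the paper's code): joining the centroid to the midpoints of the sides yields three quadrilaterals of the form vertex-midpoint-centroid-midpoint:

```python
import numpy as np

def triangle_to_quads(v: np.ndarray) -> list[np.ndarray]:
    """v: (3, 2) array of triangle vertices. Returns three 4-vertex quads."""
    centroid = v.mean(axis=0)
    mid = [(v[i] + v[(i + 1) % 3]) / 2 for i in range(3)]  # midpoints of sides
    quads = []
    for i in range(3):
        # quad around vertex i: vertex -> midpoint of the next side -> centroid
        # -> midpoint of the previous side
        quads.append(np.array([v[i], mid[i], centroid, mid[(i - 1) % 3]]))
    return quads

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
for q in triangle_to_quads(tri):
    print(q.round(3))
```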
Design of Tunable Method for PID Controller for Higher Order System
G. Umamaheshwari, M. Nivedha, J. Prisci Dorritt
In this paper, tuning of a PID controller for a higher order system is presented. Conventionally, the higher order system is reduced to a lower order model and then the controller is designed. In this method, however, the higher order system is considered as it is, and the controller is designed based on the desired Phase Margin (PM) method. An analytical expression for the controller design is given, based on the Mikhalevich method, which gives a way to select the desired phase margin.
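For orientation, a numeric sketch of generic phase-margin-based gain selection (a textbook procedure shown for context, not the paper's Mikhalevich derivation; the fourth-order plant is an assumed example): pick the gain-crossover frequency where the plant phase leaves the desired margin, then set the proportional gain so that |K·G(jw_c)| = 1:

```python
import numpy as np

def freq_response(num, den, w):
    jw = 1j * w
    return np.polyval(num, jw) / np.polyval(den, jw)

# Example higher-order plant G(s) = 1 / (s+1)^4 -> den = (s+1)^4 expanded
num, den = [1.0], [1.0, 4.0, 6.0, 4.0, 1.0]
pm_desired = np.deg2rad(60.0)

w = np.logspace(-2, 2, 20000)
G = freq_response(num, den, w)
phase = np.unwrap(np.angle(G))
target = -np.pi + pm_desired          # crossover where arg G = -180 deg + PM
wc = w[np.argmin(np.abs(phase - target))]
Kp = 1.0 / abs(freq_response(num, den, np.array([wc]))[0])
print(f"gain crossover ~ {wc:.3f} rad/s, proportional gain Kp ~ {Kp:.3f}")
```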
Availability and Reliability Analysis of a System Having a Main Unit with Its Sub Unit and Two Associate Units
V. K. Pathak, H. L. Manker
Reliability is one of the most vital elements in the information industry. An understanding of the failure process is central to any model development effort, and availability is closely related to reliability and can be calculated from it. Reliability measurements such as transition probability, mean time to system failure, availability and the busy period of the repairman in repairing the failed units can be used to guide managerial and engineering decisions on various projects. The basic model represents a system having one main unit along with its sub unit and two associate units. Functioning of the main unit is essential for the functioning of the system, and the associate units depend on the main unit and the sub unit; the system fails as soon as the main unit fails. There is a single repairman who is always available and repairs the failed unit. After a random period of time, the whole system goes for preventive maintenance. In this paper the failure and repair time distributions are taken to be exponential. Using a Markov regenerative process, several system characteristics are evaluated via the Chapman-Kolmogorov equations and a profit analysis is done. This kind of analysis is of immense help to the owners of small-scale industry. The inclusion of preventive maintenance in the model also increases the reliability of the functioning units.
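A minimal numeric sketch of the availability side of such an analysis (a two-state unit with exponential failure and repair, far simpler than the paper's four-unit model): solve pi·Q = 0 together with the normalization sum(pi) = 1:

```python
import numpy as np

lam, mu = 0.01, 0.5      # assumed failure and repair rates (per hour)
# States: 0 = up, 1 = down (under repair). Q is the generator matrix.
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])

A = np.vstack([Q.T, np.ones(2)])     # pi Q = 0 plus the normalization row
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
availability = pi[0]
print(f"steady-state availability ~ {availability:.4f}")  # mu/(lam+mu) ~ 0.9804
```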
VLIW (Very Long Instruction Word) machines are highly parallel ILP (Instruction Level Parallelism) based architectures that offer an alternative to scalar sequential architectures like CISC/RISC. With VLIW, the burdensome task of dependency checking by hardware logic at run time is removed and delegated to the compiler. HP and Intel introduced a new style of instruction set architecture called EPIC (Explicitly Parallel Instruction Computing), and a specific architecture called the IPF (Itanium Processor Family). EPIC is like VLIW with extra cache prefetching instructions, and it combines the capabilities of both superscalar and VLIW processors. Through this paper, we try to understand the characteristics that distinguish EPIC processors.
Brain Computer Interface Applications and Classification Techniques
T. Shashibala, Bharti W. Gawali
Brain Computer Interface (BCI) has provided a direct medium of communication between human beings and external devices. It is a boon for people with severe motor disorders, as it provides a means of control and communication for them. Recently, increased effort on out-of-the-lab BCI research based on Electroencephalogram (EEG) signals from the brain has produced many applications. In this technology, specific features of brain activity are identified and transformed into device control actions. This kind of interface helps disabled individuals become independent, thus improving their quality of life.
Design of Rectangular Microstrip Patch Antenna for Wi-Fi
Apoorva Jain
The robustness and speed of a wireless communication link depend on the quality of the signal we can receive and transmit. Antenna selection can therefore have a significant impact on the performance of a wireless link. A microstrip patch antenna has a low profile and weight along with low fabrication cost. In this project, ANSYS HFSS is used to develop and simulate a microstrip-line-fed, linearly polarized rectangular microstrip patch antenna for Wi-Fi at 2.455 GHz.
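For context, the standard transmission-line design equations for a rectangular patch at the stated frequency (textbook formulas; the FR4 substrate values er = 4.4 and h = 1.6 mm are assumptions, not necessarily the project's values):

```python
import math

c = 3e8                      # speed of light, m/s
f = 2.455e9                  # design frequency, Hz
er, h = 4.4, 1.6e-3          # assumed substrate permittivity and height

W = c / (2 * f) * math.sqrt(2 / (er + 1))                 # patch width
e_eff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 * h / W)
dL = 0.412 * h * ((e_eff + 0.3) * (W / h + 0.264)) / \
     ((e_eff - 0.258) * (W / h + 0.8))                    # fringing extension
L = c / (2 * f * math.sqrt(e_eff)) - 2 * dL               # patch length
print(f"W ~ {W*1000:.2f} mm, L ~ {L*1000:.2f} mm")
```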
Securing Privacy over Encrypted Cloud Data Using Multiple Keywords Rank Based Search
Parasa Srinivas Manikanta, V. Baby
Due to the advent of cloud computing, data owners are motivated to move their complex data systems from local databases to the commercial public cloud for greater flexibility and economic savings. To protect data privacy, sensitive data has to be encrypted before being deployed onto the cloud, which, however, makes the old-fashioned plaintext keyword match unusable. Therefore, searching over encrypted data is of prime importance. Many data users wish to search for files based on their desired keywords, and files must be retrieved according to their relevance to the keywords the user entered. In this paper, we design and solve the challenging problem of securing Privacy over Encrypted cloud data using Multiple keywords Rank based Search (PEMRS), and we initiate a set of strong privacy requirements for a secure cloud data usage system.
Data mining is the procedure of mining knowledge from data. It is an extensively studied research field, where most of the work has emphasized knowledge discovery. Outlier detection is an important task in data mining and has many real-time applications. In most applications the data contains unwanted and unrelated records; finding and removing anomalous data is very important and thereby improves quality and accuracy. Outliers may be single points or groups, depending on the data and the application. This paper focuses on the outlier concept, its taxonomy, and outlier detection using a least squares fit.
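A minimal sketch of least-squares-based outlier detection as described above: fit a line, then flag points whose residual exceeds k standard deviations (k and the toy data are illustrative):

```python
import numpy as np

def lsq_outliers(x: np.ndarray, y: np.ndarray, k: float = 2.5) -> np.ndarray:
    """Return a boolean mask marking outliers relative to the fitted line."""
    slope, intercept = np.polyfit(x, y, 1)   # least squares line fit
    residuals = y - (slope * x + intercept)
    return np.abs(residuals) > k * residuals.std()

x = np.arange(10, dtype=float)
y = 2 * x + 1
y[6] += 15                                   # inject one anomalous point
print(np.where(lsq_outliers(x, y))[0])       # -> [6]
```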
An Overview and Evolution of the Intelligent Transportation System as VANETs
Garima Dhawan, Shilpa Nagpal
Vehicular Ad-hoc Networks (VANETs) have emerged in recent years as a most attractive topic for researchers and automotive industries due to their tremendous potential to improve traffic safety, efficiency and other added services. Routing is the most challenging part of VANET research. In this paper we discuss the various routing protocols together with their advantages and disadvantages.
Mining Educational Data To Analyze The Performance Of Infrastructure Facilities Of Gujarat State
Divya R. Jariwala, Heta Desai, Samiksha H. Zaveri
Schools play a major role in our society's economics, education and environment, and good infrastructure represents a good school: a building in good shape with an adequate number of well-organized classrooms, sufficient blackboards, access to adequate clean drinking water, common toilets, girls' toilets, etc. A school with sophisticated facilities that produces good results attracts many students. The schools were analyzed with reference to staff, infrastructure, amenities and achievements, and the data was analyzed using the WEKA software, enabling educational officers to take appropriate steps to improve the quality of education in Gujarat State. An attempt has been made at broad mapping and analysis of existing infrastructure in the context of the planning scheme in Gujarat State, district wise, and at delineating development zones of educational infrastructure facilities. The thematic layers considered in this study are infrastructure accessibility, type and condition of classrooms, and the number of classrooms allocated to the educational system at primary and upper primary level. Data mining has emerged as one of the major research domains in recent decades for extracting implicit and useful knowledge; with its help one can efficiently learn from historical data and use that knowledge to predict the future behavior of areas of concern. The work aims to use data mining techniques to mine the required information, so that the present education system may adopt this as a strategic management tool.
Estimation of the Image Quality under Different Distortions
Pinki, Rajesh Mehra
The evaluation of image quality is very important. The best subjective evaluation of an image is done by human eyes, as they are good receivers; objective analysis of the image is done using full-reference metrics, and the results of the objective measurements are validated against the subjective measurements. The objective evaluation of images in this paper has been done using PSNR (Peak Signal to Noise Ratio), MSE (Mean Squared Error) and SSIM (Structural Similarity Index Metric), applied to different images. If the value of PSNR increases, the corresponding value of SSIM also increases and the value of MSE decreases. In the proposed work, if the value of MSE increases by 8%, the corresponding PSNR decreases by 10%. This does not apply to all images, because MSE, PSNR and SSIM change according to the complexity of the image: every image has a different coefficient of complexity, and a more complex image gets distorted before a less complex one.
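Two of the full-reference metrics named above follow directly from their standard definitions; a short sketch (SSIM is omitted since it needs windowed statistics):

```python
import numpy as np

def mse(ref: np.ndarray, test: np.ndarray) -> float:
    return float(np.mean((ref.astype(float) - test.astype(float)) ** 2))

def psnr(ref: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    m = mse(ref, test)
    return float("inf") if m == 0 else 10 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(f"MSE = {mse(ref, noisy):.2f}, PSNR = {psnr(ref, noisy):.2f} dB")
```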
Clustered Key Management Scheme to Maximize Life of Wireless Sensor Networks
K. Vimal Kumar Stephen, Mathivanan V.
A WSN typically has no infrastructure. It is composed of a number of sensor nodes working together to monitor an environment and obtain data about it. Structured and unstructured are the two types of WSNs; an unstructured WSN contains a dense collection of sensor nodes. These sensors are small and inexpensive, with limited processing and computing resources. They can sense, measure, and gather information from the environment and, based on some local decision process, transmit the sensed data to the user. The security and lifetime of a sensor node in a network play an important role.
Review on Various Data Mining Approaches on Smart Phone
Rajbir Singh, Divya Garg
Smartphones are ubiquitous and becoming ever more sophisticated, with growing computing, communication, and sensing power. They have been changing the landscape of people's everyday life and have opened up options for several interesting data mining applications, ranging from health and wellness monitoring, personal biometric signatures, urban computing, assistive technology and elder-care, to indoor localization and navigation. Human activity recognition is a core building block behind these applications.
Multimedia Data Hiding using Fourier series on an Experimental Basis
M. Senthilkumar, V. Mathivanan
Compressed data transmission is implemented in this work with the help of a Fourier-series information hiding mechanism. As a preliminary step, the necessary requirement is the formulation of the problem statement for designing the system. It mainly focuses on partitioning or segmenting the multimedia content into a compressed form; the compressed content is then hidden so as to maintain the originality of the data during transmission over the network.
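One hedged illustration of the flavor of Fourier-domain hiding described (a generic coefficient-parity embedding assumed for illustration, not the authors' exact scheme): payload bits are encoded in the quantized magnitudes of selected spectral coefficients:

```python
import numpy as np

def embed(signal: np.ndarray, bits: list[int], start: int = 20,
          step: float = 2.0) -> np.ndarray:
    spec = np.fft.rfft(signal)
    for i, bit in enumerate(bits):
        k = start + i
        mag, ph = abs(spec[k]), np.angle(spec[k])
        mag = (np.floor(mag / step) * 2 + bit + 0.5) * step / 2  # parity holds bit
        spec[k] = mag * np.exp(1j * ph)
    return np.fft.irfft(spec, n=len(signal))

def extract(signal: np.ndarray, n_bits: int, start: int = 20,
            step: float = 2.0) -> list[int]:
    spec = np.fft.rfft(signal)
    return [int(np.floor(abs(spec[start + i]) / (step / 2))) % 2
            for i in range(n_bits)]

rng = np.random.default_rng(2)
host = np.sin(np.linspace(0, 40 * np.pi, 1024)) + 0.1 * rng.normal(size=1024)
stego = embed(host, [1, 0, 1, 1])
print(extract(stego, 4))  # -> [1, 0, 1, 1]
```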
A Personalized Ontology Model For Web Mining Using Instance Matching
Madhuri S. Pawar, K. V. Metre
With the rapid growth of the internet, web information gathering has become a challenging point for all users. Many existing retrieval systems have been developed in an attempt to address this problem, but there is still no complete solution. An ontology describes and formalizes a standardized representation of knowledge as a set of concepts and the relationships between those concepts. In personalized web information gathering, ontologies are also used to represent user profiles. Many existing models for representing user profiles draw knowledge from either a global or a local knowledge base; the user's background knowledge can be discovered better if we integrate global and local analysis. The proposed system emphasizes the specific semantic relations in one computational model. Since an ontology contains many instances, automatic instance matching has become a fundamental issue; the instance matching approach presented is based on discriminative property values. The ontology model in a domain-specific system with instance search gives more accurate results.
A Review on Atmospheric Parameter Monitoring System for SMART CITIES using Internet
Shruti A. Kothalkar, Prof. Dr. A. S. Joshi
Objects of everyday life will be equipped with microcontrollers, microprocessors, embedded systems and transceivers for digital communication with one another and with users, becoming an integral part of the internet. In this paper, we focus specifically on an urban internet for an atmospheric parameter system. Through the mechanism of the internet, we build a modem focused on atmospheric parameters, which consists of three sensors for air pollution, noise pollution and temperature measurement, plus control of the street lights, all connected to the embedded system. This transmits the data to the server, where it is received by a PC or GUI modem. The modem is designed to support the SMART CITY vision, which aims to exploit the most advanced communication technologies to support added-value services for the administration of the city and for its citizens.
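A hedged sketch of the sensor-to-server reporting loop (the endpoint URL, field names and the read_sensors() stub are illustrative assumptions, not the paper's firmware):

```python
import time, json, random
import urllib.request

def read_sensors() -> dict:
    """Stand-in for the real air/noise/temperature sensor drivers."""
    return {"air_ppm": random.uniform(5, 50),
            "noise_db": random.uniform(30, 90),
            "temp_c": random.uniform(15, 40),
            "timestamp": time.time()}

def post_reading(reading: dict, url: str = "http://example.com/api/readings"):
    req = urllib.request.Request(url, data=json.dumps(reading).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

reading = read_sensors()
# post_reading(reading)   # uncomment on a node with network access
print(reading)
```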
Binarization and Post-Processing of Binarized Document Images for Background Text Removal
Aparna Patil, Prof. Deepak Gupta
There exists an assortment of manuscripts and books which were composed long years back; those that are essential to us should be protected against degradation. In several instances, these documents have been degraded by common causes such as uneven color illumination or ink seeping from the background to the foreground. In order to isolate the content from such corrupted images, the document images need to be processed with efficient binarization methods. In this paper we build a framework capable of isolating the content from the degraded image; computations such as gray scaling and local thresholding are performed, and image contrast reversal, edge estimation, image bimodal binarization and post-processing of binarized images are incorporated into the proposed system. After applying all these techniques, the proposed system is able to separate the foreground content from the background degradations.
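A minimal local-thresholding sketch in the spirit of the pipeline above (window size and offset are illustrative): each pixel is compared against its neighborhood mean, which copes with uneven backgrounds better than a single global threshold:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_threshold(gray: np.ndarray, window: int = 15, c: float = 0.05):
    """gray in [0, 1]; returns a binary image (True = foreground ink)."""
    local_mean = uniform_filter(gray, size=window)
    return gray < (local_mean - c)

# Synthetic degraded page: bright background gradient with darker "text" strokes
x = np.linspace(0.6, 1.0, 200)
page = np.tile(x, (100, 1))
page[40:60, 20:180:10] -= 0.3          # fake strokes
binary = local_threshold(page)
print(binary.sum(), "foreground pixels found")
```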
Design and Analysis of Front Axle for Heavy Commercial Vehicle
Pravin R. Ahire, Prof. K. H. Munde
The front axle carries the weight of the front part of the automobile, facilitates steering, and absorbs shocks due to road surface variations. It is designed to transmit the weight of the automobile from the springs to the front wheels, turning right and left as required, so proper design of the front axle beam is extremely crucial. This paper deals with the design and analysis of a front axle, in which the FE results are compared with the analytical design. The work is divided into two steps. In the first step, the front axle was designed by the analytical method, using the vehicle specification (gross weight, payload capacity, braking torque) to find the principal stresses and deflection in the beam. In the second step, the front axle was modelled in CAD software and analysed in ANSYS software.
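A toy analytical check of the kind used in the first step (all numbers are assumed: a simply supported beam with a central load and a rectangular section, not the paper's axle geometry): bending stress sigma = M·c/I and midspan deflection F·L^3/(48·E·I):

```python
F = 20_000.0        # load on the beam, N (assumed front-axle share)
L = 1.6             # span between spring seats, m
b, h = 0.06, 0.08   # rectangular section width and depth, m
E = 200e9           # steel Young's modulus, Pa

I = b * h**3 / 12               # second moment of area
M = F * L / 4                   # max bending moment, central point load
sigma = M * (h / 2) / I         # bending stress at the extreme fiber
delta = F * L**3 / (48 * E * I) # midspan deflection
print(f"sigma ~ {sigma/1e6:.1f} MPa, deflection ~ {delta*1000:.2f} mm")
```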
Multilevel Security for Cloud Storage using Encryption Algorithms
K. Satyanarayana
Cloud computing is a set of information technology services, like network, storage, hardware, software and other resources, provided to a customer over the internet. These services are delivered by a third-party provider who owns the infrastructure. The advantages of cloud storage are easy access (access to your data anywhere, anytime), scalability, resilience, cost efficiency and high reliability of the data. Because of these advantages, every organization is moving its data to the cloud, that is, it uses the storage service provided by the cloud provider. It is therefore required to protect that data against unauthorized access, modification or denial of service; hence securing the cloud means securing the operations and storage of the cloud providers. In this research paper, I propose an algorithm called the Multilevel algorithm, by using which an unauthorized user cannot access or modify the data or computations of the cloud. The plan of the work is to eliminate concerns regarding data privacy by using cryptographic algorithms to enhance security in the cloud from the different perspectives of cloud customers.
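A hedged sketch of a two-level encryption wrap (the level structure is an illustrative assumption, not the paper's exact Multilevel algorithm; Fernet from the `cryptography` package is used for each layer):

```python
from cryptography.fernet import Fernet

storage_key = Fernet.generate_key()   # level 1: key held by the storage layer
owner_key = Fernet.generate_key()     # level 2: key held only by the data owner

plaintext = b"customer record: sensitive"
level1 = Fernet(owner_key).encrypt(plaintext)     # owner encrypts first
level2 = Fernet(storage_key).encrypt(level1)      # provider wraps it again

# An attacker with only one key recovers nothing readable; reading needs both.
inner = Fernet(storage_key).decrypt(level2)
print(Fernet(owner_key).decrypt(inner))           # -> b'customer record: ...'
```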
Warning Time Analysis for Emergency Response in Sumbawanga City for the Repeat of Magnitude 7.2 Earthquakes of 1919 Using Proposed Community Earthquake Early Warning System
Asinta Manyele
Sumbawanga city, with a population of about 90 thousand, has experienced several damaging earthquakes from the Kanda Fault System, a seismically active fault on its eastern side encompassing the Lake Rukwa and Lake Tanganyika basins. The magnitude 7.2 earthquake of July 1919 is one of the historical earthquakes from the Kanda fault that generated damaging ground motions centered on Sumbawanga city while reaching several towns and cities situated up to several kilometers from its epicenter. The historical earthquake information for the region indicates that large earthquakes are expected from the Kanda fault system in the near future. A Community Earthquake Early Warning System (CEEWS) is a tool used to capture earthquake onset times and ground motion levels in communities for emergency response. To prepare for future emergency management of a magnitude 7.2 earthquake in Sumbawanga city, the study evaluates the warning times possible from the deployment of a CEEWS in the city. Warning times calculated by simulating the event indicate that Sumbawanga city would have approximately 8 seconds of warning before the arrival of strong shaking, provided the processing and transmission delays are minimal. These warning times are meant to allow appropriate emergency precautionary actions to be taken by government officials, companies and individuals during imminent earthquakes.
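A back-of-envelope sketch of how such a warning time arises (wave speeds, distance and delay are assumed textbook values, not the study's simulation inputs): warning time = S-wave arrival - (P-wave arrival + processing delay):

```python
vp, vs = 6.1, 3.55        # assumed crustal P and S wave speeds, km/s
distance_km = 60.0        # assumed epicentral distance to the city
processing_s = 2.0        # assumed detection + alert dissemination delay

t_p = distance_km / vp            # the P wave trips the sensors
t_s = distance_km / vs            # damaging S wave / strong shaking arrives
warning = t_s - (t_p + processing_s)
print(f"warning time ~ {warning:.1f} s")   # ~ 5.1 s with these assumptions
```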
Extracting User Tweet Post Location & Detecting Social and Disastrous Events using NER & POS Tags
Mr. Chetan Puri, Prof. S. M. Roakde
Mapping Twitter conversations on maps over time has become a popular means of visualising conversations around events on Twitter. Large events are the subject of most of these kinds of visualizations, where the rate of geo-tagged tweets is high enough to build interesting visualizations over the chosen time period. However, in the case of smaller events, or smaller countries where the frequency of tweets generated for events is lower, we are naturally faced with a low number of geo-tagged tweets, which makes these data unattractive for mapping and visualisation. This paper demonstrates the application of Twiloc, a tweet location detection system, for mapping the conversation around an EU Qualifiers match between Ireland and Scotland. The paper further presents a small comparison between the results obtained from Twiloc and CartoDB Twitter Maps for a Dublin Marathon tweet dataset. Twiloc uses various features for determining the location of every single tweet it receives, resulting in a considerably higher rate of tweets with associated location data, and hence enables tweet location analysis and visualization for smaller events.
The present paper proposes a novel way of extracting local texture features based on Morphological Primitive Patterns with grain components (MPP-g) on a texton-based Local Directional Pattern (LDP) for effective stone texture classification. To reduce the dimensionality, the proposed research uses an Integrated Logical Compact LDP with an OR operation on Textons (ILCLDP-T). Mathematical Morphology (MM) provides an efficient framework for analyzing object shape characteristics (such as size and connectivity) that are not easily accessed by linear approaches, owing to its geometry-oriented nature. An LDP feature is obtained by computing the edge response values in all eight directions at each pixel position and generating a code from the relative strength magnitudes. The local descriptor LDP is more consistent in the presence of noise and illumination changes, since edge response magnitude is more stable than pixel intensity. The proposed MPP-g on ILCLDP-T is rotationally invariant due to the Kirsch edge response. The present method was experimented on a dataset consisting of various brick, granite, marble and mosaic stone textures with a resolution of 256×256, collected from the Vistex and Mayang databases and also from natural sources with a digital camera. The experimental results and comparison with other methods show the supremacy of the proposed method over existing methods.
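A sketch of the basic LDP coding step described above: convolve with the eight Kirsch masks, rank the absolute responses, and set bits for the k strongest directions (k = 3 is the common choice in the LDP literature; this is the generic descriptor, not the paper's full MPP-g pipeline):

```python
import numpy as np
from scipy.ndimage import convolve

KIRSCH = [np.array(m) for m in [
    [[-3,-3, 5],[-3, 0, 5],[-3,-3, 5]], [[-3, 5, 5],[-3, 0, 5],[-3,-3,-3]],
    [[ 5, 5, 5],[-3, 0,-3],[-3,-3,-3]], [[ 5, 5,-3],[ 5, 0,-3],[-3,-3,-3]],
    [[ 5,-3,-3],[ 5, 0,-3],[ 5,-3,-3]], [[-3,-3,-3],[ 5, 0,-3],[ 5, 5,-3]],
    [[-3,-3,-3],[-3, 0,-3],[ 5, 5, 5]], [[-3,-3,-3],[-3, 0, 5],[-3, 5, 5]],
]]

def ldp_code(gray: np.ndarray, k: int = 3) -> np.ndarray:
    """Per-pixel 8-bit LDP code with bits set on the k strongest responses."""
    responses = np.stack([np.abs(convolve(gray.astype(float), m)) for m in KIRSCH])
    order = np.argsort(responses, axis=0)          # weakest..strongest per pixel
    code = np.zeros(gray.shape, dtype=np.uint8)
    for top in order[-k:]:                         # indices of k strongest masks
        code |= (1 << top).astype(np.uint8)
    return code

img = np.random.default_rng(3).integers(0, 256, (32, 32))
codes = ldp_code(img)
hist = np.bincount(codes.ravel(), minlength=256)   # texture feature vector
print(hist.nonzero()[0][:8])
```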
Automatic Detection and Classification of Cervical Cancer in Pap Smear Images using ETCM & CFE Method-Based Texture Features and Various Classification Techniques
S. Athinarayanan, M. V. Srinath
Classification of cervical cells is one of the most important and crucial tasks in medical image analysis. Accordingly, the aim of this paper is to investigate the classification of a cervical cell as normal or abnormal using individual feature extraction methods and a combined feature extraction method, together with classification techniques. Four feature extraction methods were used: two existing individual methods, namely the Gray Level Co-occurrence Matrix (GLCM) and the Texton Co-occurrence Matrix (TCM), and two proposed novel methods. Of the proposed two, one is an individual feature extraction method, the Enriched Texton Co-occurrence Matrix (ETCM), and the other is a method combining individual features, Concatenated Feature Extraction (CFE). The CFE method concatenates the features of all the individual methods (GLCM, TCM and ETCM) into one feature vector to assess their joint performance. These four feature extraction methods were then tested with three classifiers: Support Vector Machine (SVM), Radial Basis Function (RBF) network and Feed Forward Neural Network (FFNN). The examination was conducted on a set of single-cell pap smear images; the dataset contains two classes with a total of 952 images, and the distribution of images per class is not uniform. The performance of the proposed system was evaluated in terms of sensitivity, specificity and accuracy, both for the individual feature extraction methods and for the combined method, each paired with the classification techniques. Among the individual combinations, the proposed ETCM features with the SVM classifier gave better results than the other combinations (ETCM with RBF, ETCM with FFNN, GLCM with SVM, GLCM with RBF, GLCM with FFNN, TCM with SVM, TCM with RBF and TCM with FFNN). For the combined method, the proposed CFE method with the SVM classifier gave better results than all the other CFE-classifier combinations and all the individual feature-classifier combinations.
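As background to the feature families compared above, a minimal GLCM sketch from the standard definition (the quantization level and offset are illustrative): count co-occurring gray-level pairs at a fixed offset, normalize, and read off simple Haralick-style statistics:

```python
import numpy as np

def glcm(img: np.ndarray, levels: int = 8, dx: int = 1, dy: int = 0):
    """Normalized co-occurrence matrix for the offset (dy, dx); 8-bit input."""
    q = (img.astype(int) * levels) // 256      # quantize 0..255 to 0..levels-1
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast_energy(p: np.ndarray):
    i, j = np.indices(p.shape)
    return ((i - j) ** 2 * p).sum(), (p ** 2).sum()  # contrast, energy

img = np.random.default_rng(4).integers(0, 256, (64, 64))
c, e = contrast_energy(glcm(img))
print(f"contrast = {c:.3f}, energy = {e:.4f}")
```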