
Frequency: 12 issues per year
ISSN: 2250–3005 (online version)
Published by: IJCER

International Journal of Computational Engineering Research (IJCER) Volume 2, Issue 3, May-June, 2012


Image Analysis Techniques for Fingerprint Recognition

Vidyadevi G Biradar, H Sarojadevi


Fingerprint recognition is a method of biometric authentication that uses pattern recognition techniques based on an individual's fingerprint image. Fingerprint patterns are full of ridges and valleys, and these structures provide essential information for matching and classification. The steps for fingerprint recognition include image acquisition, preprocessing, feature extraction and matching. A number of pattern recognition methods have been used to perform fingerprint matching. This paper presents a survey of fingerprint matching methods, classified into minutiae-based, image-transform and hybrid approaches. Among them, minutiae-based methods are the most widely used, while hybrid methods provide more reliable matching at additional computational cost. Comparing and contrasting these methods reveals that much emphasis is placed on the design of an accurate fingerprint feature extractor to improve classification accuracy.
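The survey above classifies matchers into minutiae-based, transform-based and hybrid approaches. As a minimal illustration of the minutiae-based idea only — the coordinates, tolerances and scoring rule below are invented for the example, not taken from the paper — here is a greedy nearest-neighbor matcher in Python:

```python
import math

def match_minutiae(template, query, dist_tol=10.0, angle_tol=0.3):
    """Greedy one-to-one pairing of minutiae given as (x, y, angle-in-radians).

    Returns the fraction of minutiae matched, a crude similarity score.
    """
    unmatched = list(query)
    matched = 0
    for (x1, y1, a1) in template:
        best, best_d = None, None
        for m in unmatched:
            x2, y2, a2 = m
            d = math.hypot(x1 - x2, y1 - y2)
            # Smallest angular difference, wrapped into [0, pi].
            da = abs((a1 - a2 + math.pi) % (2 * math.pi) - math.pi)
            if d <= dist_tol and da <= angle_tol and (best_d is None or d < best_d):
                best, best_d = m, d
        if best is not None:
            unmatched.remove(best)
            matched += 1
    return matched / max(len(template), len(query))

# A print compared with a slightly shifted copy of itself scores 1.0.
probe = [(10, 20, 0.1), (40, 55, 1.2), (70, 30, 2.5)]
shifted = [(x + 3, y + 2, a) for (x, y, a) in probe]
print(match_minutiae(probe, shifted))  # → 1.0
```

Real matchers additionally align the two minutiae sets for rotation and translation before pairing; that step is omitted here.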

Computational studies of swirl effects on instabilities and pollution due to non-premixed turbulent combustion

Luthenda Gamany, Taha Janan Mourad, Agouzoul Mohamed


Considerable effort is currently being expended, by means of open CFD analysis, to examine and fight the mechanisms responsible for combustion instabilities and environmental pollution due to CO2 and NO production. To that end, this paper proposes a system based on injection of a secondary swirling air flow into a non-premixed turbulent combustion chamber fed by fuel oil no. 2. The computational studies analyse the impact of swirl intensity, using OpenFOAM's reactingFoam solver to compare the proposed system with a basic drying-furnace combustor. The quantities discussed are the temperatures and concentrations of unburned species and gas combustion products calculated at transversal and longitudinal sections of the combustion chamber. The results reveal no risk of flashback or blow-off, fast reduction of unburned products, significant thermal losses near the walls, and a reduction in CO2 production combined with a rise in NO formation, which motivates further investigation of the proposed apparatus.

Keywords: Coflow, Non-premixed turbulent combustion, OpenFOAM, reactingFoam, Swirling flow, Swirl number


Analysis of Skew Bridges Using Computational Methods

Vikash Khatri, P. R. Maiti, P. K. Singh, Ansuman Kar


In spite of increases in computing power, the analysis of skew bridge decks has not advanced to the same extent, so further research on skew bridges using different computational methods is needed. Grillage analysis is a fast and simple approach compared to the finite element method (FEM), and engineers have long used it to analyze bridge decks. The finite element method, on the other hand, is considered the better method for slab analysis because of its capability to represent the complex geometry of the structure more realistically. In the present study, a bridge deck consisting of beams and a slab is defined and modeled using both the grillage and finite element methods. The effect of grid spacing at different skew angles, on the same span of reinforced concrete bridge, is compared between the finite-element and grillage analogy methods. Maximum reaction forces, deflections, and bending and torsional moments are calculated and compared for both analysis methods. A total of nine different grid sizes (4 to 12 divisions) have been studied at skew angles of 30°, 45° and 60° to determine the most appropriate and efficient grid size. It is observed that the FEM and grillage results are not always similar for every grid size. Bending moments calculated by FEM exceed those obtained by grillage analysis for larger grid sizes, while the torsional moment shows the reverse behavior, and the difference between the reaction values of the two methods decreases as the skew angle increases. FEM gives less variation of bending and torsional moments with grid size than the grillage method, and deflection does not vary much with grid size. The appropriate grid size estimated for this narrow, long bridge is seven divisions, with a ratio of transverse to longitudinal grid spacing of about 2.

Keywords: Skew slab, FEM, Grillage analysis, Grid size

A New Omni-directional Monopole Antenna for Interference Reduction

T. S. Ghouse Basha, K. Tulasi Krishna, C. Chandrakala, V. Kishore, D. Aruna


A compact ultra-wideband (UWB) antenna with a band-notched characteristic is presented in this paper. It has a compact size of 30 mm × 31 mm and ultra-wideband operation. A 'C'-shaped slot was introduced to achieve a band-notch function from 4.8 GHz to 6 GHz to avoid interference from WLAN. The proposed antenna has an ultra-wideband frequency range from 3 GHz to 11.5 GHz for return loss below −10 dB, except for the stop band from 5 GHz to 6 GHz. Details of the antenna are presented with a parametric study for ultra-wideband applications. The bandwidth is tuned by varying the width W4 of the inner tuning stub and the height L3 of the feed, yielding the ultra-wide bandwidth. The antenna is omni-directional over the operating bandwidth and has good radiation efficiency. Fundamental parameters such as return loss, VSWR and radiation pattern are obtained, and they meet standard specifications. The method-of-moments-based IE3D simulator is used to analyze this antenna.

Key words: Coplanar waveguide, UWB antenna, VSWR, Omni directional radiation pattern

A Novel Two Stage Binary Image Security System Using (2,2) Visual Cryptography Scheme

Mr. Rohith S, Mr. Vinay G


Visual Cryptography Scheme (VCS) is an encryption method that uses combinatorial techniques to encode secret written material. The idea is to convert the written material into a binary image and encode this image into n shadow images, also called shares. Decoding only requires selecting some subset of these n shadow images, making transparencies of them, and stacking them on top of each other. The main advantage of this scheme is that its mathematical computation complexity is reduced compared to conventional cryptographic techniques. This paper presents the design of a visual cryptography scheme that gives two-stage security, so that the original image cannot be recovered at the first decoding stage. As cryptographic algorithms become more relevant, the basic model of visual cryptography becomes an inefficient tool for hiding information, and this work is an attempt to improve its efficiency. The performance of the algorithm is compared across three different binary images. Results show that no residual information is present in the shares.

Keywords: Visual Cryptography Scheme, LFSR, Binary Image Security.
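As a concrete illustration of the basic (2,2) construction the paper builds on — the textbook 1×2 pixel-expansion scheme, not the authors' two-stage design, with an invented toy secret image — a minimal Python sketch:

```python
import random

def make_shares(secret, seed=42):
    """(2,2) VCS with 1x2 pixel expansion: `secret` is a 2-D list of 0/1
    (1 = black). Each secret pixel becomes two subpixels in each share."""
    rng = random.Random(seed)
    s1, s2 = [], []
    for row in secret:
        r1, r2 = [], []
        for px in row:
            pattern = list(rng.choice([(0, 1), (1, 0)]))
            r1 += pattern
            # White: both shares carry the same pattern; black: complementary.
            r2 += pattern if px == 0 else [1 - b for b in pattern]
        s1.append(r1)
        s2.append(r2)
    return s1, s2

def stack(s1, s2):
    """Stacking transparencies is a pixel-wise OR (ink covers light)."""
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]

secret = [[1, 0], [0, 1]]
a, b = make_shares(secret)
print(stack(a, b))  # black pixels stack to [1, 1]; white pixels keep one 0 subpixel
```

Each share alone is a uniformly random pattern, which is the "no residual information" property the abstract verifies.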

Numerical Investigation of Secondary Flow In An Axial Flow Compressor Cascade

T. Suthakar, Akash Dhurandhar


A numerical study has been conducted of the steady, three-dimensional viscous flow field in an axial flow compressor cascade, focusing on secondary flow formation. The study is based on the static pressure distribution along the blades for different inflow angles. The k-ε turbulence model was used, and a finite volume method was employed to solve the governing flow equations. The present work analyzes a compressor cascade with a chord length of 5 units and a pitch of 0.55 times the chord length. For this analysis an ideal gas is taken as the fluid at atmospheric pressure and a temperature of 300 K, with αi = 26.5°, 35° and 45°, Ma = 0.66 and Re = 0.6 × 10^6. The flow analysis is carried out using a commercial CFD code (Fluent 6.3.26). The computational results support the occurrence of secondary flow in a compressor cascade under specific conditions.

Key words: secondary flow, turbulence intensity, axial compressor cascade, cascade losses


Brijesh Shah, Jigar Modh, Satish Shah


Image segmentation plays an important role in medical imaging. It is the process of dividing an image into regions with similar attributes. In this paper we propose a marker-controlled watershed segmentation algorithm that works on the gradient magnitude and uses linear operations. The linear-convolution-based image reconstruction algorithm used here enables better results than the general algorithm and morphological image reconstruction, and it also reduces the overall running time compared with other marker-controlled watershed algorithms. The method can be applied to noisy and degraded images, and the use of markers avoids the problem of over-segmentation. The algorithm works on color as well as gray-scale images. Segmentations of an X-ray image, an MR image, and mammographic images for breast cancer detection are shown in this paper. The algorithm can also be applied to high-resolution satellite images.


Smart Lighting and Control using MSP430 & Power Line Communication

Sanjay Belgaonkar, E. Elavarasi, Gurjeet Singh


Smart Lighting is a lighting technology designed for energy efficiency. It includes high-efficiency fixtures, daylighting, and automatic controls that make adjustments based on conditions such as occupancy. This smart lighting system is connected through the power line, which is also used for communication. Power Line Communication (PLC) is a technology that uses power lines as the physical medium for data transmission. PLC can offer a "no new wires" solution because the infrastructure has already been established. PLC is used for transmitting data at high speed through the power lines of a house, an office, a building or a factory. Here, the existing alternating current (AC) power wires serve as a transmission medium by which information is relayed from an AC source. The present paper deals with the design and development of a smart lighting system controlled by an MSP430 microcontroller over power line communication.

Keywords: MSP430, Power Line Communication (PLC), Sensors, Smart Lighting System

The e-Health scenario with latest trends in EMR applications: A Review of EMR techniques with healthcare framework

Onkar S Kemkar, Dr P B Dahikar


Over the past few years, information systems have become increasingly important in healthcare delivery. The use of computers in a wide range of medical applications and in healthcare management is one potential way to reduce the overall costs of healthcare delivery, and sophisticated decision support systems are envisaged to improve the quality of clinical decision making. The paper discusses what medical informatics is, the definition of the EMR, the effort required to extract patient data from heterogeneous EHRs, and how to gain new knowledge through secure distributed systems and software agents. Based on experimental work and several pilot studies, a system has been designed and developed in the field of EMR and EHR.

Keywords: medical informatics, ehealth, e-records, health informatics, EMR, EPR


Utpal Sharma, Sunil Kumar Chakraborty


Process transformation and e-governance in technical institutes are important components for improving efficiency. They also improve the students' satisfaction index, apart from making them more IT-savvy. This paper presents a case study showing how an e-governance initiative built with in-house capacity transformed the institute's processes without incurring any expenditure.


Optimal Power Flow Using Differential Evolution Algorithm With Conventional Weighted Sum Method

Rohit Kumar Verma, Himmat Singh, Laxmi Srivastava


Optimal reactive power dispatch is one of the most important tasks in today's power system operation. This paper presents optimal reactive power dispatch using a differential evolution algorithm. Optimal reactive power dispatch is a nonlinear, constrained, multi-objective optimization problem in which real power loss, voltage deviation, and fuel cost are to be minimized subject to control and dependent variables. Reactive power optimization is a mixed-integer nonlinear optimization problem that includes both continuous and discrete control variables. The suggested algorithm finds the settings of control variables, such as generator voltages, transformer tap positions, and reactive compensation devices, that optimize a given objective. A differential evolution based approach is proposed to handle the problem as a true multi-objective optimization problem. The standard IEEE 30-bus test system is used, and the results show the effectiveness of the differential evolution algorithm and confirm its potential to solve the multi-objective optimal reactive power dispatch problem. The results obtained are compared and validated against the conventional weighted-sum method to show the effectiveness of the proposed algorithm.

Keywords: Differential evolution algorithm, Power loss minimization, Voltage deviation, Multi-objective weighted sum method.
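As a sketch of the scalarization the paper pairs with differential evolution, here is a generic DE/rand/1/bin loop minimizing a weighted sum of two objectives. The toy quadratics merely stand in for real power loss and voltage deviation; the IEEE 30-bus model itself is not reproduced:

```python
import random

def de_weighted_sum(f1, f2, w1, w2, bounds, pop=20, gens=100, F=0.5, CR=0.9, seed=1):
    """DE/rand/1/bin minimizing w1*f1(x) + w2*f2(x): the conventional
    weighted-sum method collapses two objectives into one scalar cost."""
    rng = random.Random(seed)
    dim = len(bounds)
    def cost(x):
        return w1 * f1(x) + w2 * f2(x)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    C = [cost(x) for x in P]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([j for j in range(pop) if j != i], 3)
            jr = rng.randrange(dim)  # forced-crossover index
            trial = []
            for j in range(dim):
                if j == jr or rng.random() < CR:
                    lo, hi = bounds[j]
                    # Mutation: base vector plus scaled difference, clamped.
                    trial.append(min(max(P[a][j] + F * (P[b][j] - P[c][j]), lo), hi))
                else:
                    trial.append(P[i][j])
            tc = cost(trial)
            if tc <= C[i]:  # greedy selection
                P[i], C[i] = trial, tc
    best = min(range(pop), key=lambda k: C[k])
    return P[best], C[best]

# Toy objectives standing in for loss and voltage deviation (assumed, not the paper's).
loss = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
dev = lambda x: x[0] ** 2 + x[1] ** 2
x, c = de_weighted_sum(loss, dev, 0.5, 0.5, [(-5, 5), (-5, 5)])
print(x, c)  # converges near the weighted compromise x ≈ (0.5, -1.0)
```

Changing the weights w1, w2 traces out different compromise solutions, which is how the weighted-sum baseline approximates a Pareto front.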

Minimization of Reactive Power Using Particle Swarm Optimization

Vivek Kumar Jain, Himmat Singh, Laxmi Srivastava


This paper presents an efficient and reliable Particle Swarm Optimization (PSO) algorithm for solving the reactive power optimization problem, including voltage deviation, in a power system. Voltage deviation reflects the capability of a power system to maintain acceptable voltages at all buses under standard conditions and after being subjected to a disturbance. Reactive power optimization is a complex combinatorial programming problem that reduces power losses and improves voltage profiles in a power system. A multi-objective particle swarm optimization is proposed and applied to reactive power optimization on the IEEE 30-bus system. The RPO problem is formulated as a constrained multi-objective optimization problem by linearly combining two objective functions (real power loss and voltage profile improvement). The results show that particle swarm optimization effectively solves the reactive power optimization problem in a power system.

Keywords: Reactive power optimization, multi-objective particle swarm optimization, voltage deviation, loss minimization.
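The optimizer itself is standard global-best PSO; a minimal sketch follows. The objective here is a plain sphere function standing in for the combined loss-plus-deviation cost; the power-system model is not reproduced:

```python
import random

def pso(cost, bounds, particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=7):
    """Global-best PSO: each particle is pulled toward its own best position
    (cognitive term) and the swarm's best position (social term)."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    pbest = [x[:] for x in X]
    pcost = [cost(x) for x in X]
    g = min(range(particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(particles):
            for j in range(dim):
                V[i][j] = (w * V[i][j]
                           + c1 * rng.random() * (pbest[i][j] - X[i][j])
                           + c2 * rng.random() * (gbest[j] - X[i][j]))
                # Move, then clamp to the search box.
                X[i][j] = min(max(X[i][j] + V[i][j], bounds[j][0]), bounds[j][1])
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i][:], c
                if c < gcost:
                    gbest, gcost = X[i][:], c
    return gbest, gcost

best, val = pso(lambda x: sum(v * v for v in x), [(-10, 10)] * 3)
print(best, val)  # approaches the minimum at the origin
```

In the paper's setting the decision variables would be generator voltages, tap positions and shunt settings rather than free coordinates.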

Speaker Recognition System Using Combined Vector Quantization and Discrete Hidden Markov model

Ameen Khan A, N V Uma Reddy, Madhusudana Rao


This paper presents a speaker verification system using a combination of Vector Quantization (VQ) and Hidden Markov Model (HMM) to improve the HMM performance. A Malay spoken digit database which contains 100 speakers is used for the testing and validation modules. It is shown that, by using the proposed combination technique, a total success rate (TSR) of 99.97% is achieved and it is an improvement of 11.24% in performance compared to HMM. For speaker verification, true speaker rejection rate, impostor acceptance rate and equal error rate (EER) are also improved significantly compared to HMM.

Keywords- Speaker recognition, speaker verification, hidden Markov model, vector quantization

Design, Implementation and Performance Analysis of an Integrated Vedic Multiplier Architecture

Ramachandran.S, Kirti.S.Pande


Multipliers are fundamental to, and at the core of, all Digital Signal Processors (DSPs), and the speed of a DSP is mainly determined by the speed of its multipliers. Multiplication is the most fundamental operation, with intensive arithmetic computations. Two important parameters of multiplication algorithms in DSP applications are latency, the real delay of computing a function, and throughput, a measure of how many computations can be performed in a given period of time. The execution time of most DSP algorithms depends on their multipliers, hence the need for high-speed multipliers. The Urdhva Tiryakbhyam sutra performs faster for small inputs and the Nikhilam sutra for larger inputs. Here a novel integrated Vedic multiplier architecture is proposed, which itself selects the appropriate multiplication sutra based on the inputs: whichever sutra is faster for the given inputs is selected by the architecture. The simulation results show that Urdhva performs faster for small inputs, while Nikhilam performs better for large inputs (more than twice as fast for 64-bit multiplicands).

Keywords: Integrated Vedic multiplier architecture, Nikhilam sutra architecture, Urdhva tiryakbhyam sutra.
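The arithmetic of the two sutras can be mirrored in software (the paper's contribution is a hardware architecture; these base-10 routines only illustrate what each sutra computes):

```python
def urdhva(a, b):
    """Urdhva Tiryakbhyam (vertical-and-crosswise): sum digit cross-products
    per output column, then propagate carries."""
    da = [int(d) for d in str(a)][::-1]  # least-significant digit first
    db = [int(d) for d in str(b)][::-1]
    res = [0] * (len(da) + len(db))
    for i, x in enumerate(da):
        for j, y in enumerate(db):
            res[i + j] += x * y
    carry = 0
    for k in range(len(res)):
        res[k] += carry
        carry, res[k] = divmod(res[k], 10)
    return int("".join(map(str, res[::-1])))

def nikhilam(a, b, base):
    """Nikhilam: multiply via deficits from a nearby base; cheap when both
    operands are close to the base, hence its edge for large inputs."""
    da, db = base - a, base - b
    return (a - db) * base + da * db

print(urdhva(97, 96), nikhilam(97, 96, 100))  # → 9312 9312
```

An "integrated" multiplier in this spirit would inspect the operands and dispatch to whichever routine is cheaper, which is the selection logic the proposed architecture performs in hardware.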

Performance Comparison Study of AODV, OLSR and TORA Routing Protocols for MANETS

Manjeet Gupta, Sonam Kaushik


Mobile Ad hoc Networks (MANETs) are a special type of wireless network in which mobile nodes are connected through wireless interfaces to form a temporary network, without any fixed infrastructure. Because of the high node mobility and dynamic topology of MANETs, routing is an important issue in ad hoc networks. There are many routing protocols for MANETs, such as AODV, TORA, DSDV, OLSR and DSR, which fall into three classes. This paper compares these routing protocols based on performance metrics such as packet delivery fraction (PDF), end-to-end delay and throughput. Simulation in NS2 (Network Simulator version 2) is used to compare the performance of AODV, OLSR and TORA. The results show that AODV's performance in the PDF and throughput metrics is better than that of OLSR and TORA, while for the end-to-end delay metric TORA performs better than OLSR and AODV.

Keywords: AODV, MANET, OLSR, Routing Protocols, TORA.

EIA for Ramapada Sagar (Polavaram) Irrigation Project using the Model of RS and GIS

Sreeramulu. Y, Murali Krishna.I.V


At the global level, Asian countries like India and China have experienced untold environmental degradation and ecological deterioration in the past century, with little or no real solution to alleviate many of these concerns; poorly planned human interference has been the major cause. Adequate information and appropriate technology are limiting factors for effective environmental management. Hence, efforts to improve, conserve and protect the environment must include not only the resolution of political policies but also the application of a state-of-the-art scientific approach to planning and implementation. The process of Environmental Impact Assessment (EIA) was developed as an effective planning tool, and its genuine conduct will go a long way toward reducing environmental deterioration. Because of the dynamic characteristics and multivariate nature of the environment, it has often been difficult to collate, analyze and interpret its data sets. This great complexity can, however, be overcome with the engineering management system model presented in this research, based on remote sensing and geographic information systems and related technology, together with ground-truth verification.

Speaker Features And Recognition Techniques: A Review

Dr. Mahesh S. Chavan , Mrs. Sharada V. Chougule


This paper gives an overview of the various methods and techniques used for feature extraction and modeling in speaker recognition. Research in speaker recognition has evolved from short-time features reflecting the spectral properties of speech (low-level or physical traits) to high-level features (behavioural traits) such as prosody, phonetic information and conversational patterns. Low-level acoustic information such as cepstral features has dominated, because these features give very low error rates, especially in quiet conditions; they are, however, more prone to error in noisy conditions. In this paper, various features and the modeling techniques used in speaker recognition are discussed.

Computing Over a Multi Cloud for MTC Applications

Challa Vanitha Reddy, Battula Sudheer Kumar


IT organizations can now outsource computer hardware by leasing CPU time through cloud computing services. The problem is that effectiveness decreases due to the burden placed on a single cloud when working with MTC applications. This paper defines feasible solutions for MTC applications, using programming models for computing over multiple clouds instead of a single cloud for greater effectiveness.

Keywords- Cloud Computing, MTC applications, HTC


Suraya Mubeen, Dr.A.M.Prasad, Dr.A.Jhansi Rani


Smart antenna systems are attracting a lot of attention now, and will likely attract even more in the future, as they can increase the capacity of mobile communication systems dramatically. The design of smart antenna systems combines the technologies of antenna design, signal processing, and hardware implementation. A smart antenna is a phased or adaptive array that adjusts to the environment: for the adaptive array, the beam pattern changes as the desired user and the interference move, while for the phased array the beam is steered, or different beams are selected, as the desired user moves. The early smart antenna systems were designed for military applications, to suppress interfering or jamming signals from the enemy. The proposed research work gives an overall view of basic smart antennas and their techniques.

Keywords: Antenna, MISO, Diversity, Beamforming


A Comparative Analysis of Fuzzy C-Means Clustering and K Means Clustering Algorithms

 Mrs. Bharati R.Jipkate, Dr. Mrs.V.V.Gohokar


Segmentation of an image entails the division or separation of the image into regions of similar attribute. The most basic attribute for segmentation of an image is its luminance amplitude for a monochrome image and color components for a color image. Clustering is one of the methods used for segmentation. The objective of this paper is to compare the performance of various segmentation techniques for color images. K-means clustering and Fuzzy C-Means clustering techniques are compared for their performance in segmentation of color images.

Keywords: K-Means clustering, Fuzzy C-Means clustering.
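The essential difference between the two algorithms compared above is hard versus soft assignment: K-Means gives each point one label, while Fuzzy C-Means gives it a membership degree in every cluster. A stdlib-only sketch of FCM on scalar data (1-D points chosen for illustration; image pixels would be color vectors):

```python
import random

def fcm(points, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy C-Means on scalars: alternate membership updates
    u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)) and fuzzy-weighted center updates.
    Each membership row sums to 1."""
    rng = random.Random(seed)
    centers = rng.sample(points, c)
    U = []
    for _ in range(iters):
        U = []
        for x in points:
            d = [abs(x - v) + 1e-12 for v in centers]  # avoid division by zero
            U.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
                      for i in range(c)])
        centers = [sum(U[k][i] ** m * points[k] for k in range(len(points)))
                   / sum(U[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return centers, U

pts = [1.0, 1.1, 0.9, 8.0, 8.2, 7.9]
centers, U = fcm(pts)
print(sorted(round(v, 1) for v in centers))  # two centers, near 1.0 and 8.0
```

Setting every membership row to a one-hot vector would recover ordinary K-Means, which is why the two methods are natural candidates for the paper's comparison.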


Performance Analysis of Timing Attack on Elliptic Curve Cryptosystem

 Mr. Praful V. Barekar, Prof. K. N. Hande


Cryptosystems often take slightly different amounts of running time depending on the input and the key used. This timing information, extracted from the decryption process, can be used to derive information about the secret key. This class of attacks on implementations of cryptosystems is called timing attacks. Timing attacks attempt to exploit variations in the computation time of private-key operations to guess the private key. The attack is primitive in the sense that no specialized equipment is needed: an attacker can break a key simply by measuring the computation time required for user inputs and recording those inputs. This paper analyses the performance of a timing attack on an elliptic curve cryptosystem. The main advantage of elliptic curve cryptography is its smaller key size, and it is mostly used in public key infrastructure.

Keywords: Cryptosystem, Timing Attack, Running Time, Elliptic Curve Cryptography, Public key Infrastructure.
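The attack's premise, data-dependent running time, is easy to demonstrate. The sketch below counts multiplications in square-and-multiply modular exponentiation; the same principle applies to double-and-add scalar multiplication on an elliptic curve (the actual ECC operations are not reproduced here):

```python
def modexp_ops(base, exp, mod):
    """Left-to-right square-and-multiply; returns (result, multiply_count).
    The extra multiply on each 1-bit is the data-dependent work a timing
    attacker measures to guess private-key bits."""
    result, ops = 1, 0
    for bit in bin(exp)[2:]:
        result = (result * result) % mod  # square every bit
        ops += 1
        if bit == '1':
            result = (result * base) % mod  # multiply only on 1-bits
            ops += 1
    return result, ops

# Keys with more 1-bits cost more operations -- an observable side channel.
_, ops_sparse = modexp_ops(7, 0b1000001, 1009)
_, ops_dense = modexp_ops(7, 0b1111111, 1009)
print(ops_sparse, ops_dense)  # → 9 14
```

Constant-time countermeasures (e.g. Montgomery ladders, dummy operations) exist precisely to remove this key-dependent difference.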


Hand Held Emergency Wireless Telemedicine System

 Suganthi.J. N.V.Umareddy, Sridharan.B


With the rapid development of computer science and communication technologies, doctors will increasingly employ electronic communication to facilitate patient care. We have developed a portable telemedicine system that is flexible, robust and easy to use. It helps eliminate distance barriers and can improve access to medical services that would often not be consistently available in distant rural communities; it can also be used to save lives in critical care and emergency situations. Many healthcare technologies have been implemented around the globe, but very few are used for emergency cases. In this paper we introduce a Portable Emergency System based on a Locate-Diagnose-Move technique. The system is a collaboration of GSM/GPRS, GPS, wearable sensors and P2P technology. It will be useful for everyone, especially for the emergency transport and diagnosis of mobile cardiac patients, diabetes patients and elderly people, as well as accident victims. The main aim of the system is to provide urgent provisional medication and movement of the patient to hospital, which can save many lives before expert doctors are reached.

Keywords: India, TeleHealth, Telemedicine, GSM, GPRS, GPS, P2P (Peer to Peer), mobile, wireless, Cardiac


Secure and Reliable Data Transmission in Wireless Sensor Network: A Survey

Rudranath Mitra, Tauseef Khan


Wireless sensor networks are very rapidly finding application in industry as well as in consumer products, and their growth increases day by day. A sensor node senses physical events in the environment such as temperature, sound, vibration and pressure. Sensor nodes are connected to each other through a wireless medium, such as infrared or radio waves, depending on the application, and each node has internal memory to store information about event packets. The whole sensor network works in a distributed manner: nodes are deployed over a huge area in an ad hoc fashion and broadcast data packets, which finally reach the base station (sink), and vice versa. If a region cannot be sensed by any node, that region is called a blind area; if the blind area is too large, data retrieval becomes unreliable. Nodes normally work collaboratively to perform a specific task, forwarding data packets to their neighbours until the packets reach the base station, and each node can transmit only within its own transmission range. The event packets a sensor node transmits may be secret or confidential for the application, so the data transmission must be secured to maintain the confidentiality of the data packets.

Keywords: One way hash chain (OHC), Request for missing packet (RMP), Message authentication code (MAC), Base Station (BS).
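The keywords mention one-way hash chains (OHC), a common building block for authenticating sensor traffic. A minimal sketch with SHA-256 (the node name below is hypothetical, and the survey does not prescribe a specific hash function):

```python
import hashlib

def build_chain(seed, n):
    """One-way hash chain: the final value h_n is distributed first as an
    anchor; earlier values are revealed in reverse order later."""
    chain = [seed.encode()]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain  # chain[i] = H^i(seed)

def verify(revealed, anchor):
    """A receiver holding `anchor` checks a newly revealed earlier value by
    hashing it forward; forging a preimage would require inverting the hash."""
    return hashlib.sha256(revealed).digest() == anchor

chain = build_chain("node-17-secret", 5)  # hypothetical node key, for illustration
print(verify(chain[3], chain[4]))  # → True
print(verify(b"forged", chain[4]))  # → False
```

After a value is accepted, the receiver replaces its anchor with that value, so each chain element authenticates exactly one disclosure.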


Detection and Classification of Epileptic Seizures using Wavelet feature extraction and Adaptive Neuro-Fuzzy Inference System

 Dr. D. Najumnissa, Dr. T. R. Rangaswamy


Epilepsy, a neurological disorder in which patients suffer from recurring seizures, affects approximately 1% of the world population. In this work, an attempt has been made to enhance the diagnostic value of the EEG using an Adaptive Neuro-Fuzzy Inference System (ANFIS) and wavelet transform coefficients. For this study, EEGs for 20 normal and 30 seizure subjects, recorded under a standard procedure, are used. A method based on the wavelet transform and ANFIS is used to detect epileptic seizures, and a BPN algorithm is used to study and compare the datasets. An average specificity of 99% and sensitivity of 97% are obtained. The results show that ANFIS is able to detect seizures, and this method appears feasible as a real-time detector, which will improve the clinical service of electroencephalographic recording.

Keywords: ANFIS, ANN, BPN, Discrete Wavelet Transform, Epileptic seizure.
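As an illustration of the kind of wavelet features involved — a single Haar decomposition level on invented numbers, not the paper's actual EEG data or wavelet choice:

```python
def haar_dwt(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation band) and pairwise differences (detail band), the kind of
    coefficients that feed a classifier such as ANFIS. Length must be even."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# A sudden spike (a crude stand-in for seizure activity) shows up in the detail band.
eeg = [1.0, 1.0, 1.0, 9.0, 1.0, 1.0]
a, d = haar_dwt(eeg)
print(a, d)  # → [1.0, 5.0, 1.0] [0.0, -4.0, 0.0]
```

Practical pipelines recurse on the approximation band for several levels (and typically use smoother wavelets such as Daubechies), then summarize each band's coefficients as features.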


Customer and User Requirements Modeling Enhanced Software Development

 Tawfik Saeed Zeki


This paper explores various software models for business globalization and the nature of customer requirements. It discusses building a new model in which customer needs inform the system development models, as well as the system user requirements, before customer applications are built.


Enhanced Clusterhead Selection Algorithm Using LEACH Protocol for Wireless Sensor Networks

 Rudranath Mitra, Anurupa Biswas


WSNs are nowadays a vast field for research, and their growth increases day by day. Energy-efficient routing has been a challenging issue in the design of wireless sensor networks, and efficiency and security are two key topics in the design of routing protocols for WSNs. Heinzelman et al. introduced a hierarchical clustering algorithm for sensor networks called Low Energy Adaptive Clustering Hierarchy (LEACH). The proposed scheme describes two new ways to select the cluster head. Analysis shows that the enhanced LEACH protocol balances the energy expense, saves node energy and hence prolongs the lifetime of the sensor network.

Keywords: Clusterhead, Energy conservation, Energy level, Optimum distance.
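The cluster-head election in basic LEACH, which enhanced schemes modify, uses the threshold T(n) = p / (1 − p·(r mod 1/p)). A sketch of that baseline election (per-node epoch bookkeeping, i.e. excluding recent cluster heads, is omitted):

```python
import random

def leach_threshold(p, r):
    """LEACH threshold T(n) for round r, with desired cluster-head fraction p.
    Nodes that already served as CH in the current epoch would use T(n) = 0."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(node_ids, p, r, seed=3):
    """Each eligible node draws a uniform random number; drawing below T(n)
    means the node elects itself cluster head for this round."""
    rng = random.Random(seed)
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

print(round(leach_threshold(0.1, 0), 3))  # → 0.1
print(round(leach_threshold(0.1, 9), 3))  # late in the epoch T rises toward 1.0
heads = elect_cluster_heads(range(100), 0.1, 0)
print(len(heads))  # roughly p * 100 nodes become cluster heads
```

The rising threshold guarantees every node serves as cluster head once per 1/p rounds, which is the energy-balancing property the enhanced selection rules aim to improve.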

Implementing VGA Application on FPGA using an Innovative Algorithm with the help of NIOS-II

 Ashish B. Pasaya, Kiritkumar R. Bhatt


Here we have used VGA to implement basic graphics applications that can be used either in a single-user game or in advertisements dealing with real-time applications; by extending the logic in the code, even a two-player game could be developed. We chose VGA as the standard for this implementation because it is the basic graphics array and is compatible with other graphics arrays. We used an HDL in the Quartus-II software to interface the required peripherals to the NIOS-II soft-core processor through the Cyclone-II FPGA, and the proposed algorithm, implemented in C on the NIOS-II soft-core processor, contains the application logic. Finally, the results obtained for the VGA application implementation are presented.

Keywords: VGA, FPGA, HDL, Quartus-II, NIOS-II, DE2 Education Board.


 R. K. Mishra


Glass fiber reinforced resol/VAC-EHA composites have been fabricated in the laboratory to determine the dynamic behavior of glass fiber reinforced composites. A resol solution was blended with vinyl acetate-2-ethylhexyl acrylate (VAC-EHA) resin in an aqueous medium with varying volume fractions of glass fibers. The role of fiber/matrix interactions in glass fiber reinforced composites was investigated to predict the stiffness and damping properties. To study the static and dynamic response of resol, the resol/VAC-EHA blend, and the glass fiber reinforced composites, a multiquadric radial basis function (MQRBF) method is developed. MQRBF is applied for spatial discretization, and a Newmark implicit scheme is used for temporal discretization. The discretization of the differential equations generates more algebraic equations than unknown coefficients; to overcome this ill-conditioning, multiple linear regression analysis, based on the least-square error norm, is employed to obtain the coefficients. Simply supported and clamped boundary conditions are considered. Numerical results are compared with those obtained by other analytical methods.

Comparative Study of Advanced Database Replication Strategies

 A. Pramod Kumar, B.Sateesh


In this paper, two advanced replication strategies, namely materialized view replication and multi-master replication, are compared. Five partitioning algorithms, namely range partitioning, hash partitioning, list partitioning, composite range-hash partitioning and composite range-list partitioning, have been implemented for both replication strategies. The performance of all these partitioning algorithms has been evaluated for each replication strategy with simulation results.

Keywords: Partitioning Algorithm, Replication, Distributed Database, Centralized Database.
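Three of the partitioning schemes named in the abstract can be sketched in a few lines; this is an illustrative sketch of the general techniques (range, hash and list partitioning), not the authors' implementation, and all function names are our own.

```python
def range_partition(key, boundaries):
    """Return the index of the first range whose upper bound exceeds key."""
    for i, upper in enumerate(boundaries):
        if key < upper:
            return i
    return len(boundaries)          # overflow partition

def hash_partition(key, n_partitions):
    """Spread keys uniformly over n_partitions buckets."""
    return hash(key) % n_partitions

def list_partition(key, value_lists):
    """Assign a key to the partition whose value list contains it."""
    for i, values in enumerate(value_lists):
        if key in values:
            return i
    raise KeyError(f"{key!r} matches no partition list")

rows = [5, 15, 25, 35]
print([range_partition(r, [10, 20, 30]) for r in rows])   # [0, 1, 2, 3]
```

The composite range-hash and range-list variants simply apply one of the latter two functions within each range bucket.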

Vehicular Number Plate Recognition Using Edge Detection and Characteristic Analysis of National Number Plates

Bharat Raju Dandu, Abhinav Chopra


In this work, we propose a framework that uses a camera installed at the roadside to detect the vehicle number plate. A typical video component requires several adjustments after the image has been stored, such as image enhancement, number plate localization, character separation and character recognition. In this work we extract the vehicle number plate from our image and then recognize it based on the characteristics of number plates in different countries. We use Sobel edge detection for plate localization, and template matching and fuzzy logic for recognition. We make use of the characteristics of vehicle number sequences to further enhance performance. Vehicle number plate recognition can be used to decrease human effort by making systems automatic.

Keywords: Number plate, Sobel algorithm, vehicle and camera.
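The Sobel operator used above for plate localization is a pair of 3x3 gradient kernels; a minimal pure-Python gradient-magnitude pass (illustrative only, not the authors' code) looks like this:

```python
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel_magnitude(img):
    """img: 2-D list of grey levels; returns gradient magnitude (interior pixels only)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(GX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(GY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge gives a strong response along the transition column.
img = [[0, 0, 255, 255]] * 4
mag = sobel_magnitude(img)
```

Thresholding `mag` and looking for dense rectangular clusters of edge pixels is the usual route from gradients to a candidate plate region.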

Comparative Analysis of Image Registration using SIFT and RANSAC method

 Riddhi J. Ramani, N. G. Chitaliya


Image registration is a prerequisite step prior to image fusion or image mosaicking. It is a fundamental image processing technique and is very useful in integrating information from different sensors, finding changes in images taken at different times, inferring three-dimensional information from stereo images, and recognizing model-based objects. Because the traditional SIFT descriptor has a large dimension and a complex algorithm, an improved SIFT algorithm is presented which reduces the dimension, saves time and lowers complexity. To improve matching accuracy, RANSAC is applied to remove wrong matching points.

Keywords: Image Registration, SIFT, RANSAC
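The RANSAC stage mentioned above can be illustrated with a toy example that assumes a pure-translation model between matched keypoints (the full method would estimate a homography); the function and data here are our own illustration, not the paper's code:

```python
import random

def ransac_translation(matches, iters=200, tol=2.0, seed=0):
    """matches: list of ((x1, y1), (x2, y2)); returns (best translation, inliers)."""
    rng = random.Random(seed)
    best_t, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.choice(matches)       # minimal sample: one match
        tx, ty = x2 - x1, y2 - y1                      # hypothesised translation
        inliers = [m for m in matches
                   if abs(m[1][0] - m[0][0] - tx) < tol
                   and abs(m[1][1] - m[0][1] - ty) < tol]
        if len(inliers) > len(best_inliers):
            best_t, best_inliers = (tx, ty), inliers
    return best_t, best_inliers

good = [((x, y), (x + 5, y + 3)) for x in range(5) for y in range(5)]
bad = [((0, 0), (40, 40)), ((1, 2), (90, 7))]          # wrong matches (outliers)
t, inl = ransac_translation(good + bad)
```

The wrong matches never agree with the consensus translation, so they are excluded from the inlier set exactly as the abstract describes.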

Assessment of Radiation Emission from Waste Dumpsites in Lagos State of Nigeria

 Olubosede O., Akinnagbe O.B., Adekoya O.


This paper examines the total radiation emanating from waste dumpsites in two cities of Lagos State, Nigeria. This was achieved using a radiation survey meter (RADALERT50) to measure the radiation exposure rate in microsieverts per hour (μSv/hr). Readings were taken by placing the detector at gonad level, i.e. about 1 meter above ground level, in five sampling locations; this was done at intervals of 5 meters from the point of reference up to 30 meters. The results revealed that the absorbed dose rate measurements taken inside the five dumpsites are 29.80 μSv/hr, 28.05 μSv/hr, 19.29 μSv/hr, 17.53 μSv/hr and 15.78 μSv/hr. This is far lower than the average of 70 μSv/hr recommended by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR).

Keywords: Radiation Emission, Waste dumpsites, absorbed dose rate.
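The conversion from an hourly exposure rate to an annual dose, the arithmetic behind comparisons like the one above, can be sketched as follows; the rate and occupancy values here are illustrative, not the paper's data:

```python
HOURS_PER_YEAR = 24 * 365          # 8760

def annual_dose_mSv(rate_uSv_per_hr, occupancy=1.0):
    """Annual dose in mSv for a given exposure rate and occupancy fraction."""
    return rate_uSv_per_hr * HOURS_PER_YEAR * occupancy / 1000.0

# e.g. a background-level rate of 0.15 uSv/hr gives about 1.3 mSv/yr
print(round(annual_dose_mSv(0.15), 2))   # -> 1.31
```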

Performance Evaluation of different α value for OFDM System

 Dr. K.Elangovan


Orthogonal Frequency Division Multiplexing (OFDM) has recently been applied in wireless communication systems due to its high data-rate transmission capability, high bandwidth efficiency and robustness to multi-path delay. Fading is one of the major aspects considered at the receiver. In this paper, the performance of the OFDM system is evaluated for α values of 0.05, 0.005 and 0.0005 using the LMS algorithm.

Keywords: OFDM, BPSK and QPSK modulations and LMS Algorithm

Image Segmentation Using Active Contour Model

 Abhinav Chopra, Bharat Raju Dandu


Image segmentation is one of the substantial techniques in the field of image processing. It is widely used for medical purposes, such as tracking tumor growth for surgical planning and simulation. Active contours, or snakes, are used extensively for image segmentation and processing applications, particularly to locate object boundaries. Active contours are regarded as a promising and vigorously researched model-based approach to computer-assisted medical image analysis. However, their utility is limited by poor convergence into concavities and a small capture range. This paper shows the application of an external force, called gradient vector flow (GVF), that largely solves both problems. Several examples show that GVF, because of its large capture range, moves snakes into boundary concavities.

Keywords: Active contour models, edge detection, gradient vector flow, image segmentation, snakes
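The GVF field referred to above is computed by diffusing the gradient of an edge map; a compact sketch of that iteration on a toy-sized grid (pure Python, illustrative parameters, not the paper's implementation) is:

```python
def laplacian(a, y, x):
    return a[y-1][x] + a[y+1][x] + a[y][x-1] + a[y][x+1] - 4 * a[y][x]

def gvf(f, mu=0.2, iters=80):
    """f: 2-D edge map; returns the diffused vector field (u, v)."""
    h, w = len(f), len(f[0])
    # central-difference gradients of the edge map (clamped at the borders)
    fx = [[(f[y][min(x+1, w-1)] - f[y][max(x-1, 0)]) / 2.0 for x in range(w)]
          for y in range(h)]
    fy = [[(f[min(y+1, h-1)][x] - f[max(y-1, 0)][x]) / 2.0 for x in range(w)]
          for y in range(h)]
    u = [row[:] for row in fx]
    v = [row[:] for row in fy]
    for _ in range(iters):
        for a, g in ((u, fx), (v, fy)):
            nxt = [row[:] for row in a]
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    mag2 = fx[y][x] ** 2 + fy[y][x] ** 2
                    # diffuse, but stay anchored to the gradient near edges
                    nxt[y][x] = a[y][x] + mu * laplacian(a, y, x) \
                                - mag2 * (a[y][x] - g[y][x])
            a[:] = nxt
    return u, v

# A single edge point: after diffusion the field reaches pixels far from it,
# which is exactly the enlarged capture range the abstract describes.
f = [[0.0] * 9 for _ in range(9)]
f[4][4] = 1.0
u, v = gvf(f)
```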

SBPGP Security Model Using IODMRP

 Meenakshi Mehla, Himani Mann


Today's world is mobile and uses ad hoc networks. Routing follows the reactive on-demand philosophy, where routes are established only when required. Security is one of the most important concepts in ad hoc networks, so different strategies for security have been suggested. This paper proposes a theory based on PKI with IODMRP. The study should help make protocols more robust against attacks and standardize security parameters in routing protocols. PKI, PGP and SPGP play a vital role in security. It is easy to manage the security of a fixed network, but for a mobile and dynamically changing network it is very cumbersome. Thus, in this paper we focus on security with public key infrastructures and their various types, which can help maintain security in mobile ad hoc networks.


Design of Neural Architecture in 0.35 μm Technology Using Analog VLSI

 Mr.Maulik B.Rami, Prof.H.G.Bhatt, Prof.Y.B.shukla


Artificial neural networks are attempts to mimic, at least partially, the structure and function of brains and nervous systems. The human brain contains billions of biological neurons whose manner of interconnection allows us to reason, memorize and compute. Advances in VLSI technology and the demand for intelligent machines have created a strong resurgence of interest in emulating neural systems for real-time applications. Such an artificial neural network can be built with the help of simple analog components like MOSFET circuits and basic circuits built from operational amplifiers. This paper gives information about neuron behavior and how a neuron takes intelligent decisions.

Keywords: Introduction, Architecture of neuron, Analog neural components, Simulation results, Conclusion & future work.

Combining Multimedia Building Blocks In Image Analysis

 P. Shanmugam, Dr. C. Loganathan


Messages given as text need to be coded to avoid loss of secrecy in any transaction. Attempting this through any medium requires tough cryptanalysis and can be handled through known ciphers. This job gets better results when the text message is combined with other building blocks of multimedia. In this paper, a novel approach inserts a message into an image, and both are passed over a medium using elliptic curve cryptosystems. Care is taken to maintain the quality of the vehicle image carrying the text. We present the algorithm and illustrate it through standard images used in image analysis.

Keywords: Elliptic curve cryptography, Encryption, Image analysis, Multimedia, Text conversion.

An Intrinsic Dislocation Density – Finite Element Formulation Of Metal Plasticity

 Njoroge K. D., Mutuli S. M., Kihiu J. M


A computational model was developed to simulate elastic and plastic behavior in Body Centered Cubic (BCC) metals and alloys. The model provided for simultaneous simulation of the micro and macro length scales and used periodicity to link the two length scales. The model was implemented in a 3-dimensional framework, giving rise to a finite element technique incorporating intrinsic dislocation information in the simulation of the material's behavior. The technique was validated by simulating loading of thin steel strips over the elastic range and the immediate region beyond yield, and comparing the results to those obtained by conventional analysis. Stress-strain curves and slip plane percentage contribution factors were generated. Specifically, the stress-strain curves generated upheld Hooke's law and demonstrated a definite yield plateau followed by material recovery after yielding.

Keywords: Dislocation density, Multi scale, Percentage slip plane contribution

Dynamic Analysis Of Dislocation Cores In the α- Fe Lattice Using The Embedded Atom model

 K. D. Njoroge, G. O. Rading, J. M. Kihiu, M. J. Witcomb, L. A. Cornish


The Embedded Atom Method (EAM) was employed to study the structure of body centered cubic (BCC) dislocation cores. Core energies, numbers of nearest neighbour atoms, stress tensor components, resolved shear stresses and dynamic dislocation core stresses were calculated for four types of dislocation cores. A dynamic dislocation model was presented and a "path of least resistance" (POLR) mechanism suggested for the determination of the Peierls stress. It was concluded that a sequence of stress components acting on the dislocation core in a slip system was responsible for the proposed core atom motion resulting in the overall dislocation motion. A review of the resolved shear stress in the lattice was then used to corroborate the results of the dynamic dislocation model and the core atom motion mechanism model.

Keywords: Embedded Atom Method, Dislocation, Peierls Stress, Body Centered Cubic

Performance Enhancement in Mobile Computing Using Replicated Cache Agent

 Meenakshi Mehla, Reena Dahiya


With the rapid advances in wireless communications and portable computing devices, a new computing paradigm called "mobile computing" has evolved. In a mobile computing environment, users carrying portable devices have access to data and information services regardless of their physical location or movement behaviour. Wireless LANs (WLANs) allow greater flexibility and portability than traditional wired local area networks (LANs). This environment is called a MANET. In MANETs, however, many problems remain unsolved.

Keywords: WLANs, LANs, MANET, Replicated Agent, NS2

Oscillation Test Methodology for Built-In Analog Circuits

 Ms. Sankari.M.S and Mr.P.SathishKumar


This article aims to describe the fundamentals of analog and digital testing methods, to analyze the difficulties of analog testing, and to develop an approach to testing the analog components in a mixed-signal circuit environment. An oscillation-based, built-in self-test methodology for testing analog components in mixed-signal circuits, in particular, is discussed. A major advantage of the OBIST method is that it does not require any complex response analyzers or test vector generators, which are costly. Furthermore, since the oscillation frequency is considered to be digital, it can be easily interfaced with test techniques dedicated to the digital part of the circuit under test (CUT). OBIST techniques show promise in detecting faults in mixed-signal circuits and require little modification of the CUT to improve fault coverage. Extensive simulation results on some sample analog benchmark circuits are described in SPICE format.

Keywords: System on Chip (SOC), Built-In Self Test (BIST), Oscillation-Based Built-In Self Test (OBIST), Circuit Under Test (CUT), Design for Testability (DFT).

A New Approach to Data Hiding in a 2D Data Matrix and a Tilt Correction Algorithm

 Kimmy Ghanaiya, Gagandeep Kaur


Today barcodes are very popular and present all around us, not only on groceries in the supermarket but also in industrial manufacturing. Barcodes are designed for automated reading and interpretation. This paper presents two algorithms: the first, for data hiding in a 2D data matrix, provides a high level of security; the second is for decoding, independent of the orientation of the data matrix in any direction. The method also provides denoising of data matrix images captured by a webcam in real time at YUY2 640x480 resolution. Experimental results show that this technique provides more accurate results than previous methods, with improved speed.

Keywords: Blind source separation, coordinate detection and slope detection, rotation detection.

Estimation of the Periods of Occurrence of Spread-F Over Ouagadougou

 Tomiwa, A. C, Adimula, I. A


Spread-F data from ionograms collected in Ouagadougou (West Africa) were used to investigate the rate of occurrence of medium-scale irregularities in the electron concentration in the F-region of the ionosphere in all seasons of the year. When data were considered on an hourly basis, it was observed that at both the equinoxes and the solstices, irregularities occur mostly at night, the same period at which scintillation also occurs. The occurrence of spread-F was also studied during magnetic storms and during quiet periods; spread-F is observed during disturbed as well as quiet periods. At equinox, the thickness of spread-F is higher for the quiet periods than for the disturbed days throughout, but at the solstices it is only in the evening that the thickness for the quiet period exceeds that of the disturbed periods.

Keywords: ionosphere, ionogram, irregularities, equinox, solstices, quiet and disturbed periods.

Application Specific Quality of Service (QoS) Centric Parameters Simulation in Wireless Mobile Ad hoc Networks

 Padmashree S, Manoj P B


A wireless mobile ad hoc network is a collection of mobile nodes communicating with each other via wireless links without any existing infrastructure network. In this paper, modelling and simulation of the Ad hoc On-demand Distance Vector (AODV) routing protocol is performed. Parameters for a vehicular communication application are studied and analysed, for which a highly mobile ad hoc network scenario is created. Simulation is performed in ns2 (Network Simulator 2) for Quality of Service (QoS) parameters such as throughput and delay. Throughput variation also results from various other factors such as jitter, variation in the number of mobile nodes, and so on. The simulations are then plotted with xgraph.

Keywords: Wireless mobile ad hoc networks, AODV, DSDV, Quality of Service (QoS) parameters, ns2.

The Implementation of a Prosthetic Index Finger Based on EMG Signals

 Amanpreet Kaur, Gagandeep Kaur


The main goal of this paper is to provide an integrated design of the artificial index finger and to present results showing human-like behavior. The big advantages of the prosthetic index finger are its incredibly small size, volume and weight, and its low cost. This paper describes our implementation of one finger of a future biomechatronic hand with remote control. This index finger captures muscle activity like a human index finger. The purpose of this paper is to illustrate the methodology for EMG signal analysis, to provide efficient and effective ways of understanding the signal and its nature.

Keywords: EMG, Hilbert Transform, Neural Network.

Design of an Intelligent SMS Based Remote Metering System for AC Power Distribution to HT and EHT Consumers

 Mrs. Mahalakshmi N, Mr. Krishnaiah Paramesh, Ms. Elavarasi E


Electrical distribution utilities are facing problems due to high energy losses, which amount to 8% of total generation; 4% of the losses are unaccounted for. The problem is mainly associated with sub-transmission and distribution networks. This paper presents a unique solution: a unit which is tamperproof, cost-effective, fast and accurate, and which allows remote metering at any level of the distribution system. The system helps access accurate and sufficient data from metering devices to measure the electrical parameters, eliminating the use of energy meters and human intervention. The software application provides real-time parameters such as line voltage, line current, power factor, true power, apparent power and reactive power from remote substations. The remote metering system is implemented using microcontroller-based mixed-signal circuitry.

Keywords: Global System for Mobile communication, General Packet Radio Service, Remote Metering, Short Message Service.

A Study on Strength Properties of Roller Compacted Concrete with 30% Replacement of OPC 53 Grade Cement by Fly Ash

 Ganapati Naidu. P, B. Jagannadh Kumar, S.Adisesh, P.V.V.Satayanarayana


Roller compacted concrete is a zero-slump, low-water-content, dense mix consisting of coarse aggregate, sand, cementitious materials and water. Using conventional vibrators, it is very difficult to compact for larger thicknesses, and there is a chance of honeycombing and inconsistency with respect to laboratory values. To rectify this, the roller compacted concrete technique is proposed, with samples prepared accordingly. An attempt is made to prepare M15 and M20 mixes at their optimum moisture content and to test them for compressive and split tensile strength over various time periods (3, 7, 14 and 28 days). From these results, it is observed that higher strengths are obtained at early ages. When cylinders were tested, the compressive strengths for the M15 and M20 mixes at 28 days were 25 MPa and 31 MPa respectively; the split tensile strengths were 1.7 MPa and 1.9 MPa respectively. Since high strengths are obtained in compression and tension, the material can be used as base course and sub-base course for flexible and rigid pavements.

Keywords: Roller compacted concrete, Compressive strength, Split tensile strength, Grade of concrete and water cement ratio

Motion Detection Method to Compensate Camera Flicker Using an Algorithm

 Alam Inder Singh, Gagandeep Kaur


This paper presents an improved motion detection system based on the background subtraction and threshold comparison methods. Motion detection is used in many computer vision tasks, such as human tracking, pose estimation and recognition, and is a basic part of many such tasks. Our proposed method builds the background image from the 10 previous consecutive frames and detects motion via a standard webcam in real time at YUY2 640x480 resolution. Experimental results showed that the proposed method is more robust, as it can avoid noise in motion detection due to camera flicker, and is useful for reducing the number of false-positive alarms.

Keywords: background subtraction method, consecutive frames, motion detection, threshold comparison method.
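The detection rule described above can be sketched as follows: the background is the mean of the last 10 frames, and a pixel is flagged as moving when it differs from that background by more than a threshold. This is an illustrative sketch with assumed parameter values, not the authors' implementation; frames are 2-D lists of grey levels.

```python
from collections import deque

class MotionDetector:
    def __init__(self, history=10, threshold=30):
        self.frames = deque(maxlen=history)   # rolling buffer of past frames
        self.threshold = threshold

    def detect(self, frame):
        """Returns a binary mask; all zeros until the history buffer is full."""
        h, w = len(frame), len(frame[0])
        if len(self.frames) == self.frames.maxlen:
            bg = [[sum(f[y][x] for f in self.frames) / len(self.frames)
                   for x in range(w)] for y in range(h)]
            mask = [[1 if abs(frame[y][x] - bg[y][x]) > self.threshold else 0
                     for x in range(w)] for y in range(h)]
        else:
            mask = [[0] * w for _ in range(h)]
        self.frames.append(frame)
        return mask

det = MotionDetector()
static = [[50] * 4 for _ in range(4)]
for _ in range(10):
    det.detect(static)                 # fill the background history
moving = [row[:] for row in static]
moving[2][2] = 200                     # a bright moving blob
mask = det.detect(moving)
```

Averaging over the history is what suppresses single-frame flicker: a brief global brightness jitter shifts the background estimate only slightly, so fewer pixels cross the threshold.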

An Efficient Approach for Mining Frequent Itemsets with Large Windows

 K Jothimani, S. Antony Selvadoss Thanmani


The problem of mining frequent itemsets in streaming data has attracted a lot of attention lately. Even though numerous frequent itemset mining algorithms have been developed over the past decade, new solutions for handling stream data are still required, due to the continuous, unbounded and ordered sequence of data elements generated at a rapid rate in a data stream. The main challenge in data streams is the limited resources of time, memory and sample size. Data mining has traditionally been performed over static datasets, where data mining algorithms can afford to read the input data several times. The goal of this article is to analyse frequent itemset mining over large windows in a theoretical manner. By comparing previous algorithms, we propose a new method that uses analytical modelling to determine the relevant factors over data streams.

Keywords: Data Mining, Data Streams, Frequent Itemset Mining, Large Sliding Window.
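The sliding-window setting analysed above can be made concrete with a naive counter that re-scans the window on demand; this is a baseline sketch of the problem setting (class and parameter names are ours), not the algorithm the article proposes:

```python
from collections import deque
from itertools import combinations

class SlidingWindowMiner:
    def __init__(self, window_size, min_support):
        self.window = deque(maxlen=window_size)   # oldest transaction drops out
        self.min_support = min_support            # absolute count within window

    def add(self, transaction):
        self.window.append(frozenset(transaction))

    def frequent_itemsets(self, max_size=2):
        """Count every itemset up to max_size over the current window."""
        counts = {}
        for t in self.window:
            for k in range(1, max_size + 1):
                for itemset in combinations(sorted(t), k):
                    counts[itemset] = counts.get(itemset, 0) + 1
        return {s: c for s, c in counts.items() if c >= self.min_support}

miner = SlidingWindowMiner(window_size=4, min_support=3)
for t in [{'a', 'b'}, {'a', 'b', 'c'}, {'a'}, {'b', 'c'}, {'a', 'b'}]:
    miner.add(t)
freq = miner.frequent_itemsets()
```

Because every slide forces a full re-count, the time and memory cost of this baseline grows with the window, which is exactly the resource constraint the article's analytical modelling targets.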

Recent Developments in the Preparation of Non-Conventional Activated Carbons

 Prof Shantini Bokil, Prof.Dr.R.K.Rai, Prof.Dr.S.N.Kaul


The present paper was initiated with a view to investigating the adsorption characteristics of available agro-based materials to be used as adsorbents for the removal of contaminants from secondary treated industrial effluents. The main purpose is to characterise the available agro-based waste material as an adsorbent and to assess its contaminant removal efficiency. The paper highlights the effect of various evaluation parameters, such as initial concentration, time, pH, temperature, dose of the adsorbent, particle size, functional group and agitation speed, on the removal of contaminants from secondary treated industrial effluents.

Keywords: adsorption, Freundlich, Langmuir isotherms, effluents.

Credit Card Fraud: Bang in E-Commerce

 Khyati Chaudhary, Bhawna Mallick


Recent decades have seen a gigantic expansion in the use of credit cards as a transactional medium. Data mining is rising as one of the chief features of many homeland security initiatives. Often, it is used as a means of detecting fraud, assessing risk, and product retailing. Data mining is becoming increasingly common in both the private and public sectors. It involves the use of data analysis tools to find formerly unknown, believable patterns and relationships in large data sets. Credit cards offer a number of secondary benefits unavailable from cash or checks, and are safer from theft than cash. Fraud detection involves monitoring the behavior of populations of users in order to estimate, detect or avoid unwanted behavior. In this paper, we study the various factors required to distinguish transactions and characterize the factors affecting fraud detection, along with fraud prevention techniques.

Keywords: E-commerce, Credit Card, Fraud detection, Fraud Prevention

Efficient Moving Object Detection Based On Statistical Background Modeling

 Kusuma.U, S.T Bibin Shalini


Tracking vehicles is an important and challenging problem in video-based Intelligent Transportation Systems, which has been broadly investigated in the past. A robust method for tracking vehicles is implemented in this work. The proposed algorithm includes three stages: object detection, counting and tracking. Vehicle detection is a key step; the concept of moving object detection is built upon a segmentation method, and background subtraction is used in this work. Based on the segmented object shape, a prediction method based on the Kalman filter is proposed. By assuming that the vehicle moves with an almost constant acceleration from the current frame to the next, a Kalman filter model is used to track and predict the trace of a vehicle. The model can be used in traffic analysis, as it is capable of tracking and counting multiple targets in a large area, hence forming an effective, efficient and practical vehicle tracking system. The proposed method has been tested on a few traffic-image sequences, and the experimental results show that the algorithm is robust and can meet the requirements.
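The Kalman predict/update cycle at the heart of such a tracker can be sketched in one dimension; for brevity this uses a constant-velocity state [position, velocity] rather than the constant-acceleration model the abstract assumes, and all parameter values are illustrative:

```python
def kalman_step(x, P, z, dt=1.0, q=1e-3, r=1.0):
    """x = [pos, vel], P = 2x2 covariance, z = measured position."""
    # --- predict: propagate the state and covariance through the motion model
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # --- update: blend in the position measurement z
    S = Pp[0][0] + r
    K = [Pp[0][0] / S, Pp[1][0] / S]              # Kalman gain
    y = z - xp[0]                                  # innovation
    x_new = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    P_new = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
             [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return x_new, P_new

# Track a vehicle moving 2 units per frame from noiseless position readings;
# the velocity estimate converges to the true speed.
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for t in range(1, 30):
    x, P = kalman_step(x, P, z=2.0 * t)
```

The predicted state `xp` is what lets the tracker keep a vehicle's identity across frames even before the next detection arrives.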

Designing Continuous-Time Observers for Linear Hybrid Systems with Application to Three Tank Model

 Mohammad Amin Zahedi Tajrishi, Behrouz Rezaie, Reza Ghaderi


In this paper, two methods are studied and compared for designing continuous-time observers for linear hybrid systems. The methods are based on the Kalman filter and Luenberger observer design methods for linear systems. A three-tank system is considered as a hybrid system with unknown state variables. The linear hybrid model is obtained by linearizing the model of the three-tank system around its equilibrium point. In addition, it is assumed that a state feedback controller is utilized for controlling the linearized hybrid system in order to stabilize it. Simulation results depict the effectiveness and applicability of the proposed methods for state observation in hybrid systems. Comparison of the two proposed methods shows that the optimal observer obtained using the Kalman filter design method performs better than the Luenberger method.

Keywords: Linear hybrid system, Three tank system, Continuous-time observers, Kalman filter, Luenberger observer.

Single Precision Floating Point Divider Design

 Serene Jose, Sonali Agrawal


Growth in floating-point applications, and mainly their usage in reconfigurable hardware, has made it critical to optimize floating-point units. The divider is of particular interest because the design space is large and divider usage in applications varies widely. The design presented in this paper covers a range of performance, area and throughput constraints. Floating-point numbers can be represented in single or double precision. A design for a single-precision floating-point divider was done in Verilog and synthesized using the Xilinx and Synopsys tools. The path delay and device utilization were also determined successfully.

Keywords: Divider, Floating Point Unit (FPU), and Single precision.
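A behavioural sketch of what such a divider does with its operands (the paper's design is in Verilog; this Python version only illustrates the datapath, and it ignores subnormals, infinities, NaNs and rounding):

```python
import struct

def fields(f):
    """Unpack an IEEE-754 binary32 value into (sign, biased exponent, fraction)."""
    bits = struct.unpack('>I', struct.pack('>f', f))[0]
    return bits >> 31, (bits >> 23) & 0xFF, bits & 0x7FFFFF

def fp32_div(a, b):
    sa, ea, fa = fields(a)
    sb, eb, fb = fields(b)
    sign = sa ^ sb                               # XOR of sign bits
    sig = (1 << 23 | fa) / (1 << 23 | fb)        # significand quotient in [0.5, 2)
    exp = ea - eb + 127                          # subtract exponents, re-bias
    if sig < 1.0:                                # renormalise if quotient < 1
        sig *= 2.0
        exp -= 1
    return (-1.0 if sign else 1.0) * sig * 2.0 ** (exp - 127)

print(fp32_div(6.0, -1.5))   # -> -4.0
```

The three steps (sign XOR, exponent subtraction with re-biasing, significand division plus a one-bit normalisation shift) map directly onto the stages a hardware divider would pipeline.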

Reliability Analysis: The Mathematical Expression

 Pooja Parnami, Ruchi Dave, Neha Singha, Ankur Dutt Sharma


The reliability engineering discipline has undergone evolutionary development and breakthroughs during the last six decades. The need for reliable products was first sensed in both the commercial and military sectors in the early 1950s; since then, enormous progress has been made in the area of reliability engineering. Before the 1950s, the focus was either on quality control or on machine maintenance problems. The literature suggests that before World War II reliability was intuitive in nature, and the basic concept of reliability was born during this period. In this paper, with the help of mathematical simulation, reliability analysis is carried out for particular products. Reliability analysis is useful for checking the durability of any product or physical element and for establishing confidence in its quality; it is important, and a key to success, in deciding the standardization of any product, and the satisfaction associated with physical products can also be checked by these analyses. Some mathematical approaches, such as Boolean algebra and logarithmic equations, along with the relevant formulas and mathematical expressions, are introduced. The relation between failure and durability is considered to check quality and reliability. Distributions such as the Exponential, Weibull, Gamma and Lognormal can be expressed as PDF or CDF curves for reliability analysis, standardization and proof. Finally, these analyses are useful for checking the reliability of substances and make it easy to establish product quality and other features.
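The simplest of the distributions named above, the exponential, gives the basic reliability relation R(t) = e^(-λt) with MTBF = 1/λ; a small sketch with an illustrative failure rate (not the paper's data):

```python
import math

def reliability(failure_rate, t):
    """Probability a unit survives to time t under a constant failure rate."""
    return math.exp(-failure_rate * t)

def mtbf(failure_rate):
    """Mean time between failures for the exponential model."""
    return 1.0 / failure_rate

lam = 1e-4                                 # failures per hour (illustrative)
print(round(reliability(lam, 1000), 4))    # survival probability over 1000 h
```

The Weibull, Gamma and Lognormal cases generalise this by letting the failure rate vary with time instead of staying constant.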

Simulation of a Beamforming Solution for Interference Reduction in High Altitude Systems

 Bharati.L.Rathod, Mr.Hemanthkumar.P, Mr.Aaquib.Nawaz.S


Interference reduction is vital for effective communication with mobile users in rugged terrain and mountainous regions. It is proposed to outfit a high-flying airborne node with a Code Division Multiple Access (CDMA) base station in order to provide line-of-sight communications and continual coverage to remote users in a highly congested environment. One approach to increasing capacity and coverage zones for the serving wireless station is to use smart antennas. This paper simulates several beamforming algorithms, namely SMI, LMS, VSSLMS, Griffiths, VSSG, EDNSS and ENSS. The algorithms provide different ways to calculate the phase shifts applied to individual antenna elements so that the main beam is formed in the look direction and nulls, or reduced radiation, are formed in the jammer directions. Two algorithms, EDNSS and ENSS, are taken from adaptive filtering and applied to smart antenna beamforming. The ENSS algorithm achieves lower error and faster convergence compared with the other existing beamforming algorithms.

Keywords: Smart antenna, beamforming algorithms, least mean square (LMS), variable step size LMS (VSSLMS), Griffiths, variable step size Griffiths (VSSG), error data normalized step size (EDNSS), error normalized step size (ENSS).
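The plain LMS weight adaptation that the other algorithms above refine can be sketched for a small uniform linear array; the array size, step size and training signal here are illustrative assumptions, not the paper's simulation setup:

```python
import cmath, math

N = 4  # elements in the uniform linear array (half-wavelength spacing)

def steering(theta_deg):
    """Array response for a plane wave arriving from theta (broadside = 0)."""
    psi = math.pi * math.sin(math.radians(theta_deg))
    return [cmath.exp(-1j * psi * n) for n in range(N)]

def lms(snapshots, desired, mu=0.01):
    """Adapt complex weights so the array output tracks the desired signal."""
    w = [0j] * N
    for x, d in zip(snapshots, desired):
        y = sum(wi.conjugate() * xi for wi, xi in zip(w, x))   # array output
        e = d - y                                              # error signal
        w = [wi + mu * xi * e.conjugate() for wi, xi in zip(w, x)]
    return w

# Train on a signal from the look direction (10 degrees) so the beam forms there.
a = steering(10.0)
symbols = (1, -1, 1, 1, -1) * 100
snapshots = [[ai * s for ai in a] for s in symbols]
w = lms(snapshots, symbols)
gain = abs(sum(wi.conjugate() * ai for wi, ai in zip(w, a)))   # look-direction gain
```

The variable-step-size and normalized-step-size variants (VSSLMS, EDNSS, ENSS) replace the fixed `mu` with an error- or data-dependent step, which is what buys the faster convergence the abstract reports.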

Floating Point Unit Implementation on FPGA

 Deepa Saini, Bijender M'dia


As the density of FPGAs increases day by day, the feasibility of doing floating-point calculations on FPGAs has improved. Moreover, recent work on FPGA architecture has changed the design trade-off space by providing new fixed circuit functions which may be employed in floating-point computations. By using high-density multiplier blocks and shift registers, an efficient computational unit can be developed. This paper evaluates the use of such blocks for the design of floating-point units, including an adder, subtractor, multiplier and divider.