
Frequency: 12 issues per year
ISSN: 2250–3005 (online version)
Published by: IJCER

International Journal of Computational Engineering Research


Volume 4, Issue 3, March 2014

Version I
S.No. Article Title Page No. Paper Index PDF

Study of Combustion Characteristics of Fuel Briquettes

Amit Talukdar, Dipankar Das, Madhurjya Saikia


Biomass materials such as rice straw, banana leaves and teak leaves (Tectona grandis) are densified by a wet briquetting process at low pressures of 200-1000 kPa using a piston press. Wet briquetting is a process of decomposing biomass material to a desired level under a controlled environment and then pressing it into wet briquettes, or fuel briquettes. Upon drying, these wet briquettes can be used as solid fuels. This study aims to determine the combustion characteristics of the briquettes, which will help answer questions about the usefulness of the fuel in terms of the harmful gases and fly ash produced during combustion, which are common indoor air pollutants in many households, and the effectiveness of the fuel in terms of heat value.

Keywords: Biomass, fuel briquettes, calorific value, combustion rate
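Heat value is the central quantity in a study like this. As a rough illustration (not the paper's method or data), the sketch below estimates a higher heating value from an assumed ultimate analysis using Dulong's formula; the rice-straw-like composition figures are hypothetical.

```python
# Estimate the higher heating value (HHV) of a biomass fuel from its
# ultimate analysis using Dulong's formula (result in MJ/kg). The
# composition values below are illustrative placeholders, not
# measurements from the paper.

def dulong_hhv(c, h, o, s=0.0):
    """HHV in MJ/kg from mass fractions (%) of C, H, O and S."""
    return 0.3383 * c + 1.443 * (h - o / 8.0) + 0.0942 * s

# Hypothetical rice-straw-like composition (mass %).
hhv = dulong_hhv(c=38.0, h=5.0, o=36.0, s=0.1)
```

Measured calorific values (e.g. by bomb calorimetry, as combustion studies normally use) would supersede any such estimate.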

The Birefringent Property of an Optical Resin for the Study of a Stress Field Developed in a Three Point Loading Beam



The birefringent property, also known as the double-refraction phenomenon, is used in a polariscope to study the stress field developed in a three point loading beam. The model used for this analysis was made of an epoxy resin (PLM4) and a hardener (PLMH). The stress field was locked into the model by the stress freezing technique. Photoelastic fringes obtained on the analyzer of a regular polariscope were used to completely determine the stress field. A finite element analysis was also conducted to determine the stress field numerically. A whole-field comparison of the experimental and simulated photoelastic fringes, together with a local analysis using the principal stress difference, showed very good agreement between the experimental and numerical solutions.

Keywords: Birefringence, photoelasticity, fringe, stress
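The fringe orders read off the polariscope relate to stress through the stress-optic law, which the local analysis above relies on. A minimal sketch, with illustrative numbers rather than values from the paper:

```python
# Stress-optic law of photoelasticity: the principal stress difference
# at a point follows from the observed fringe order N, the material
# fringe value f_sigma and the model thickness t. Numbers below are
# illustrative, not values from the paper.

def principal_stress_difference(N, f_sigma, t):
    """sigma_1 - sigma_2 = N * f_sigma / t (units follow the inputs)."""
    return N * f_sigma / t

# e.g. fringe order 3, f_sigma = 10.5 kN/m, thickness 6 mm
delta_sigma = principal_stress_difference(N=3, f_sigma=10.5e3, t=0.006)  # Pa
```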

Optimization of Image Search from Photo Sharing Websites

Shubhada Mali, Sushama Kadam, Ganesh Shelke, Rajani Ghodake


Social networking sites such as Flickr allow users to upload images and annotate them with descriptive labels known as tags. Personalized image search retrieves images according to the intention of the user, so that the results are relevant to that individual. Personalized web search takes advantage of information about an individual, such as the tags that person attaches to images, to identify the most relevant image results for that person. The main challenge in personalization lies in collecting user profiles that describe the user. The user's preferences and the issued query are used to obtain relevant image results. The proposed system contains three components. First, a ranking-based multi-correlation tensor factorization (RMTF) model performs annotation prediction, which is treated as the user's preference inferred from how the user annotates or tags images. Second, a corpus is used to analyze users, their annotated images and the tags each user assigns to each image in order to find user-specific topics; the proposed algorithm performs topic modeling to generate these topics. Third, single-word query selection is used to search for relevant image results. Finally, the query relevance and the topic-sensitive user preferences (TSUP) are integrated into the final ranked list of relevant images.

Keywords: Relevant search, RMTF, image annotation, user preferences, user specific topics, query relevance, TSUP.

MANETs: Increasing N-Messages Delivery Probability Using Two-Hop Relay with Erasure Coding

A. Vijayalakshmi, J. Praveena


The lack of a thorough understanding of the fundamental performance limits of mobile ad hoc networks (MANETs) remains a barrier stunting the commercialization and application of such networks. The proposed system uses a two-hop relay algorithm with erasure coding to increase the message delivery probability in a MANET. Under this algorithm, each message is erasure coded into multiple frames (coded blocks) at its source node. A finite-state absorbing Markov chain framework is developed to characterize the complicated message delivery process in challenging MANETs. Based on this framework, closed-form expressions are derived for the message delivery probability under any given message lifetime and message size, adopting the blocking matrix technique so that the important issues of interference, medium contention and traffic contention are carefully integrated. To further improve the proposed system, n distinct messages are sent simultaneously from source to destination.

Keywords: Mobile ad hoc networks, delivery probability, two-hop relay, erasure coding.
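The benefit of erasure coding can be illustrated with a deliberately simplified model: if a message is coded into n blocks of which any k suffice for reconstruction, and each block independently reaches the destination within the lifetime with probability p, the delivery probability is a binomial tail. This is only a sketch of the intuition; the paper's closed-form expressions come from the absorbing Markov chain, not from this independence assumption.

```python
from math import comb

def delivery_probability(n, k, p):
    """Probability the destination collects at least k of n coded
    blocks, assuming each block independently arrives with probability
    p within the message lifetime (a simplification of the paper's
    absorbing-Markov-chain analysis)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Sending the message whole (n = k = 1) versus erasure coding (n = 8, k = 4)
p_single = delivery_probability(1, 1, 0.6)
p_coded = delivery_probability(8, 4, 0.6)
```

With the same per-block success probability, spreading the message over redundant coded blocks raises the overall delivery probability.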

Survey Paper on Virtualized Cloud-Based IPTV System

Jyoti P. Hase, Prof. R. L. Paikrao


IPTV delivers TV content over an Internet Protocol infrastructure. Virtualized cloud-based services benefit from statistical multiplexing across applications to yield important cost savings for the operator. Cloud-based IPTV lowers a provider's cost of real-time IPTV services through a virtualized IPTV architecture and through intelligent time-shifting of service delivery. It takes advantage of the differences in the deadlines associated with live TV versus video-on-demand (VoD) to effectively multiplex these services. However, achieving similar advantages with real-time services is often a challenge. The problem is constructed as an optimization formulation that uses a generic cost function, e.g., minimum-maximum, concave and convex functions, to reflect different cost regimes. The time-shifting solution to this formulation gives the number of servers needed at different time instants to support these services. A simple mechanism for time-shifting scheduled jobs is studied in a simulator, and the reduction in server load is measured using real traces from an operational IPTV network. The results show that cloud-based IPTV is able to reduce the load by 24%.


The Search of New Issues in the Detection of Near-duplicated Documents

Hasan Naderi, Narges Salehpour, Mohammad Nazari Farokhi, Behzad Hosseini Chegeni


Near-duplicate detection is the task of identifying documents that are essentially the same. Among near-duplicate detection algorithms, the fingerprinting algorithm has received attention for use in plagiarism analysis and in the repair and maintenance of social software. The idea of using fingerprints to identify duplicated material is analogous to cryptographic hash functions, which are secure against destructive attacks; such functions serve as high-quality fingerprinting functions. Cryptographic hash algorithms such as MD5 and SHA-1 have been widely applied in file systems. In this paper, using available heuristic algorithms for near-duplicate detection, sets of similar document pairs are placed within a certain threshold, and each set is identified according to whether it is near-duplicate. Furthermore, document comparison is performed by a fingerprinting algorithm, and finally the total value is calculated using the standard method.

Keywords: Near-Duplicate Detection, Fingerprinting, Similarity, Heuristics, Shingle, Hash, Random
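A common fingerprinting scheme hashes word shingles and compares documents by the Jaccard resemblance of their fingerprint sets. The sketch below is a generic illustration of that idea (using MD5, one of the hash functions mentioned above), not the specific heuristics evaluated in the paper:

```python
import hashlib

def fingerprints(text, k=3):
    """Hash every k-word shingle of the text to a 64-bit fingerprint."""
    words = text.lower().split()
    shingles = {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}
    return {int(hashlib.md5(s.encode()).hexdigest()[:16], 16) for s in shingles}

def resemblance(a, b, k=3):
    """Jaccard similarity of the two documents' fingerprint sets."""
    fa, fb = fingerprints(a, k), fingerprints(b, k)
    return len(fa & fb) / len(fa | fb)

sim = resemblance("the quick brown fox jumps over the lazy dog",
                  "the quick brown fox leaps over the lazy dog")
```

Pairs whose resemblance exceeds a chosen threshold would then be flagged as near-duplicates.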

Mathematical Modeling of Class B Amplifier Using Natural and Regular Sampled PWM Modulation

N. V. Shiwarkar, K. G. Rewatkar


Class-D amplifiers operate by converting an audio input signal into a high-frequency square wave output whose lower-frequency components can accurately reproduce the input. Their high power efficiency and potential for low distortion make them suitable for use in a wide variety of electronic devices. By calculating the outputs from a classical class-D design implementing different sampling schemes, we demonstrate a method more advanced than the double Fourier series method, which is the traditional technique employed for this analysis. This paper shows that when natural sampling is used the input signal is reproduced exactly in the low-frequency part of the output, with no distortion. Although this is a known result, our calculations introduce the method and notation that we develop further. The classical class-D design is prone to noise, and therefore negative feedback is often included in the circuit. Subsequently we incorporate the Fourier transform/Poisson re-summation method into a formal analysis of a feedback amplifier. Using perturbation expansions we derive the audio-frequency part of the output, demonstrating that negative feedback introduces undesirable distortion.

Keywords: class D, natural sampling, regular sampling, Fourier analysis, Poisson re-summation
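The natural-sampling result can be checked numerically: comparing a sine wave against a high-frequency triangular carrier produces a square wave whose fundamental matches the input amplitude. The modulation depth and carrier ratio below are illustrative choices, not parameters from the paper:

```python
import math

# Naturally sampled PWM: compare the modulating sine wave against a
# high-frequency triangular carrier; the fundamental of the square-wave
# output should match the input amplitude with no distortion.

M = 0.5            # modulation depth (illustrative)
ratio = 21         # carrier frequency / signal frequency (illustrative)
steps = 200_000    # time steps over one signal period

def triangle(x):
    """Triangle wave in [-1, 1] with period 1."""
    return 4 * abs(x - math.floor(x + 0.5)) - 1

b1 = 0.0
for n in range(steps):
    t = n / steps                      # one signal period, T = 1
    m = M * math.sin(2 * math.pi * t)  # modulating signal
    s = 1.0 if m > triangle(ratio * t) else -1.0   # comparator output
    b1 += s * math.sin(2 * math.pi * t)
b1 *= 2.0 / steps                      # numeric Fourier sine coefficient
```

The computed fundamental coefficient b1 lands on the modulation depth M, in line with the distortion-free reproduction claimed for natural sampling.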

Neighborhood Triple Connected Domination Number of a Graph

G. Mahadevan, N. Ramesh, C. Sivagnanam, Selvam Avadayappan, A. Ahila, T. Subramanian


In this paper we introduce a new domination parameter with real-life applications, called the neighborhood triple connected domination number of a graph. A subset S of V of a nontrivial graph G is said to be a neighborhood triple connected dominating set if S is a dominating set and the induced subgraph of its neighborhood is triple connected. The minimum cardinality taken over all neighborhood triple connected dominating sets is called the neighborhood triple connected domination number and is denoted by ntc. We investigate this number for some standard graphs and find lower and upper bounds for it. We also investigate its relationship with other graph-theoretic parameters.

Keywords: Neighborhood triple connected domination number. AMS Subject Classification (2010): 05C69
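For readers unfamiliar with domination parameters, the brute-force sketch below computes the classical domination number of a small graph; the neighborhood triple connected variant introduced in the paper adds a connectivity condition on the dominating set's neighborhood that is not modeled here.

```python
from itertools import combinations

def is_dominating(graph, s):
    """Every vertex is in s or adjacent to a vertex of s."""
    return all(v in s or graph[v] & s for v in graph)

def domination_number(graph):
    """Smallest cardinality of a dominating set (exhaustive search)."""
    vertices = list(graph)
    for size in range(1, len(vertices) + 1):
        for cand in combinations(vertices, size):
            if is_dominating(graph, set(cand)):
                return size
    return 0  # only reached for an empty graph

# Cycle C5 as an adjacency-set dictionary; its domination number is 2.
c5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
gamma = domination_number(c5)
```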
Version II

Bioelectrical Impedance Analysis (BIA) for Assessing TBW and FFM of Indian Females

Munna Khan, Shabana Mehfuz, Ghazala Perveen Khan


Background: Bioelectrical impedance analysis is an easy, applicable method for assessing the total body water (TBW) and fat free mass (FFM) of various groups of people. It has many advantages over other methods: it is safe, rapid, portable, easy to perform and requires minimal operator training. It has been used extensively for developing specific prediction equations for different ethnicities, ages, genders, levels of body fatness and physical activity. Regression equations play a great role in estimating body density and fatness specific to a population, owing to methodological and biological factors. Purpose: The purpose of the study was to investigate the utility of multi-frequency BIA for the estimation of TBW and FFM of Indian females at different frequencies. Earlier scientists have measured various parameters of body composition of the Indian population using MALTRON-II. However, the literature shows that prediction equations have not yet been developed for Indian females, which a healthy prediction equation for this population would require. It has been found in the study that women, particularly Indian women, have unhealthy eating habits owing to the fact that they concentrate on only one aspect, whether their food is cooked well enough to eat, without understanding the meaning of healthy food. In the present research paper, an attempt has been made to develop BIA equations using data taken from a PhD thesis (1). This data was collected by my senior, Dr. Goswami, in the college in which he was teaching. Some of the data that my supervisor, co-supervisor and I collected at DRDO have already been utilized in building a multi-compartmental model, developing generalized age- and sex-specific prediction equations, and developing the REE of Indian subjects.

Keywords: R (2.9.2) software, Bioelectrical Impedance Analysis, Prediction Equation, MALTRON-II, Multiple Regression Analysis, Total Body Water, Fat Free Mass, Impedance index.
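BIA prediction equations are typically regressions on the impedance index H²/Z. As a sketch of how such an equation is fitted (on synthetic, exactly linear data, not measurements from this study):

```python
# Classic single-predictor BIA form: TBW = a * (height^2 / impedance) + b.
# The data points below are synthetic, for illustration only; real
# prediction equations must be fitted to measured population data.

def fit_line(x, y):
    """Ordinary least squares for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# impedance index H^2/Z (cm^2/ohm) vs. TBW (litres) -- synthetic values
ht2_over_z = [35.0, 40.0, 45.0, 50.0, 55.0]
tbw = [24.0, 27.0, 30.0, 33.0, 36.0]
a, b = fit_line(ht2_over_z, tbw)
```

Multi-frequency equations extend this with additional predictors (weight, age, impedance at several frequencies) via multiple regression.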

The Applications of Computational Intelligence (CI) Techniques in System Identification and Digital Filter Design

Jainarayan Yadav, Sanjay Gurjar  


The thesis focuses on the application of computational intelligence (CI) techniques to two problems: system identification and digital filter design. In system identification, different case studies have been carried out with an equal or reduced number of orders relative to the original system, and also in identifying a black-box model. Lowpass, highpass, bandpass and bandstop FIR filters and a lowpass IIR filter have been designed using three algorithms and two different fitness functions. Particle Swarm Optimization (PSO), Differential Evolution based PSO (DEPSO) and PSO with Quantum Infusion (PSO-QI) algorithms have been applied in this work. PSO-QI is a new hybrid algorithm in which the global best particle obtained from PSO enters a tournament with an offspring produced by mutating the gbest of PSO using the quantum principle of quantum-behaved PSO (QPSO), and the winner is selected as the new gbest of the swarm. In QPSO, unlike traditional PSO, the exact values of a particle's position and velocity cannot be determined; instead, its position in the solution space is determined by mapping the probability of its appearance in the quantized search space. The results obtained from PSO-QI have been compared with the DEPSO hybrid algorithm and the classical PSO. In all of the cases, PSO-QI has outperformed the other two algorithms in its ability to converge to the lowest error value and its consistency in finding the solution every time, and has thus proven to be the best. However, the computational complexity of PSO-QI is higher than that of the other two algorithms.
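The base algorithm behind all three optimizers is standard PSO. The sketch below minimizes a simple quadratic fitness; it is a generic illustration, not the DEPSO or PSO-QI hybrids or the filter-design fitness functions used in the thesis:

```python
import random

# Minimal particle swarm optimization on a simple quadratic fitness.
# Parameters (inertia w, acceleration c1/c2) are common textbook
# defaults, not values from the thesis.

def pso(fitness, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)  # deterministic for reproducibility
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:                 # update personal best
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:                # update global best
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

In filter design, the fitness would instead measure the deviation of the candidate filter's frequency response from the desired response.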


Bi-Level Weighted Histogram Equalization with Adaptive Gamma Correction

 Jeena Baby, V. Karunakaran


In this paper, bi-level weighted histogram equalization is combined with an adaptive gamma correction method for better brightness preservation and contrast enhancement. The main idea of this method is to first divide the dimmed input image into R, G and B components and apply the probability density function and weighting constraints to each component separately. Finally, an adaptive gamma correction method is applied to each component, and their union produces a brightness-preserved and contrast-enhanced output image. The performance of this technique is calculated using the absolute mean brightness error (AMBE) measure.

Keywords: Contrast enhancement, brightness preservation, histogram equalization, peak signal to noise ratio, absolute mean brightness error, adaptive gamma correction, probability density function, cumulative density function.
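The two per-channel steps can be sketched on a single 8-bit channel: equalize through the cumulative histogram, then gamma-correct. This omits the paper's bi-level weighting constraints and adaptive gamma selection, so it is only a simplified illustration:

```python
# Histogram equalization followed by fixed gamma correction on one
# 8-bit channel. The pixel list and gamma value are illustrative.

def equalize(channel, levels=256):
    """Map grey levels through the normalized cumulative histogram."""
    n = len(channel)
    hist = [0] * levels
    for v in channel:
        hist[v] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total / n)
    return [round((levels - 1) * cdf[v]) for v in channel]

def gamma_correct(channel, gamma, levels=256):
    """Apply v -> v_max * (v / v_max) ** gamma per pixel."""
    vmax = levels - 1
    return [round(vmax * (v / vmax) ** gamma) for v in channel]

dim = [10, 20, 20, 30, 40, 40, 40, 50]      # dark, low-contrast channel
enhanced = gamma_correct(equalize(dim), gamma=0.8)
```

Applying the same pipeline to each of the R, G and B components and recombining them mirrors the per-component processing described above.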

Ultraspherical Solutions for Neutral Functional Differential Equations

A. B. Shamardan, M. H. Farag, H. H. Saleh 


This paper is concerned with the numerical solution of neutral functional differential equations (NFDEs). A continuous implicit Runge-Kutta method based on ultraspherical polynomials is proposed. A description and an outline of the algorithm are given. Numerical results are included to confirm the efficiency and accuracy of the method.

Keywords: Functional differential equations, Equations of neutral type, Implicit delay equations
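Delay terms are what distinguish these equations from ordinary ODEs: each step must look up the solution one delay interval in the past. The toy forward-Euler sketch below shows that mechanism on y'(t) = -y(t - 1); it is far simpler than the continuous implicit Runge-Kutta method the paper proposes, and is included only to make the delay lookup concrete.

```python
# Forward-Euler integration of the delay differential equation
# y'(t) = -y(t - 1) with history y(t) = 1 for t <= 0. On [0, 1] the
# history makes y' = -1, so the exact solution there is y(t) = 1 - t.

def solve_dde(t_end, h=0.001):
    steps = int(round(t_end / h))
    delay_steps = int(round(1.0 / h))   # the delay is 1.0 time units
    ys = [1.0]                          # y(0) = 1
    for n in range(steps):
        lag_index = n - delay_steps     # index of y(t_n - 1)
        y_lag = 1.0 if lag_index < 0 else ys[lag_index]  # history lookup
        ys.append(ys[-1] - h * y_lag)   # Euler step of y' = -y(t - 1)
    return ys

ys = solve_dde(1.0)
```

Neutral equations additionally involve the delayed derivative y'(t - τ), which implicit methods such as the one in the paper handle with far better stability than this explicit sketch.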

Solid Waste Management in Mahaboobnagar Municipality

Dr. C. Sarala, G. Sree Lakshmi


The 74th Amendment of the Constitution of India in 1992 made municipal authorities in the country a third tier of government. The 12th Schedule of the Constitution envisages the functions to be performed by municipal authorities, one of which is solid waste management. The Ministry of Environment and Forests has notified the Municipal Solid Waste Rules 2000 under the Environment Protection Act 1986. According to these rules, all municipal authorities were expected to improve solid waste management practices in terms of the aforesaid rules by 2003, but the situation did not improve as expected for want of adequate technical knowledge, and there is no proper management facility for solid waste. Uncontrolled dumping of municipal solid waste has been observed at roadsides. There is no processing facility or disposal practice in any urban local body. Biomedical and slaughterhouse waste is getting mixed with solid waste and altering the characteristics of the waste; hence there is a need to develop a proper management system. This paper presents the waste management system of Mahabubnagar municipality in the Andhra Pradesh state of India to implement the Municipal Solid Waste Rules 2000 expeditiously by modernizing the system of solid waste management.

Keywords: solid waste management, emissions, landfilling, biological process, greenhouse gases

Survey Paper on Creation Of dynamic query Form for mining highly optimized transactional databases

 Jayashri M. Jambukar


Modern scientific databases and web databases maintain large volumes of heterogeneous data. These real-world databases contain many relations and attributes. Traditional predefined query forms are not able to answer the various ad-hoc queries users pose on those databases. This paper proposes the Dynamic Query Form (DQF), a novel database query form interface that is able to dynamically create query forms. The significance of DQF is to capture a user's choices, classify query form components and support the user in making decisions. The creation of a query form is an iterative process conducted by the user. In each iteration, the system automatically creates ranked lists of form components, and the user then adds the desired form components to the query form. The ranking of form components is based on the captured user choices. A user may also fill in the query form and submit queries to view the query output at each step. Thus, a query form can be dynamically refined until the user is satisfied with the query output. A probabilistic model is developed for estimating the quality of a query form in DQF. An evaluation and a user study certify the effectiveness and efficiency of the system.

Keywords: Form creation, Query Form, User Interaction.

Image Performance Tuning Using Contrast Enhancement Method with Quality Evaluation

Alphy George, S John Livingston


The quality of an image plays a crucial role in various image processing applications such as recognition, identification and transmission. Therefore, identifying the quality of an image is very necessary in such areas. Restoring a good image from a degraded image improves its quality. The purpose of this paper is to find the quality value of an image using a new metric after some preprocessing steps. Here one particular type of image distortion, contrast change, is taken into account, and the contrast is enhanced using an additive gamma correction method. After this preprocessing, the quality value is found using a new image quality assessment metric called the normalized perceptual information distance. The main concepts used for this metric are Kolmogorov complexity and the normalized information distance.

Keywords: Contrast enhancement, Image quality assessment, gamma correction, normalized information distance, Kolmogorov complexity.
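The normalized information distance is defined through Kolmogorov complexity and is therefore uncomputable; a standard practical surrogate replaces K with a real compressor, giving the normalized compression distance (NCD). The sketch below uses zlib as that compressor and illustrates the underlying idea, not the paper's exact perceptual metric:

```python
import zlib

# Normalized compression distance: an approximation of the normalized
# information distance, with a real compressor standing in for
# Kolmogorov complexity. Inputs below are arbitrary byte strings.

def ncd(x: bytes, y: bytes) -> float:
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"lorem ipsum dolor sit amet consectetur adipiscing " * 20
similar = ncd(a, a)     # small for identical inputs
different = ncd(a, b)   # noticeably larger for unrelated inputs
```

For images, the byte strings would be the pixel data of the reference and distorted images.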

Customized Query Interface Integration using Attribute Constraint Matrices

Sherlin Mary Koshy, Belfin R. V. 


Query Interfaces are often the only means by which certain web databases in a domain can be accessed. It has been observed that several different kinds of query interfaces exist for the same domain and they require the user to enter the same values despite the differences in the interfaces. Integrating these query interfaces is therefore essential to allow the user uniform access to all databases in a given domain. Integration of Query Interfaces using Attribute Constraint Matrices was found to be a very efficient technique for integration. This technique however performs integration over all the interfaces in a domain, while integration of only a subset of these may be necessary. This paper presents an algorithm that when applied to the result of integration of all interfaces in a domain produces an integrated attribute constraint matrix that represents only the components of those interfaces that are required to be integrated by the user.

Keywords: Attributes, Attribute Constraint Matrix, Interface Integration, Integrated Matrix, Pruning, Query Interface, User Preference.

Analysis of Image Segmentation Methods Based on Performance Evaluation Parameters

Monika Xess, S. Akila Agnes 


Image segmentation is an important technology for image processing which aims at partitioning an image into homogeneous regions or clusters. Many general-purpose techniques and algorithms have been developed and widely applied in various application areas. However, the evaluation of these segmentation algorithms has been highly subjective, and it is difficult to judge their performance based on intuition. In this paper, image segmentation using the FCM, region growing and watershed algorithms is performed, and the segmentation results of these techniques are analyzed based on four performance metrics: GCE, PSNR, RI and VoI. This analysis provides an overview of the parameters on which different image segmentation techniques can best be evaluated.

Keywords: FCM, Region Growing, Watershed, GCE, PSNR, RI, VoI.
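Two of the four metrics are easy to state concretely: PSNR compares images through mean squared error, and the Rand index counts pixel pairs on which two segmentations agree. A minimal sketch on toy pixel and label lists (flattened images):

```python
import math
from itertools import combinations

def psnr(original, segmented, peak=255):
    """Peak signal-to-noise ratio between two equal-length pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(original, segmented)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

def rand_index(labels_a, labels_b):
    """Fraction of pixel pairs on which two segmentations agree
    (same-cluster in both, or split in both)."""
    agree = total = 0
    for i, j in combinations(range(len(labels_a)), 2):
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += same_a == same_b
        total += 1
    return agree / total

ri = rand_index([0, 0, 1, 1], [0, 0, 1, 2])
```

GCE and VoI are likewise pairwise comparisons of label maps, but with refinement-tolerant and information-theoretic formulations respectively.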

Controlling Computer Operations using Brain-Wave Computing

Shanmugapriya.B, Akshaya.T, Kalaivani.K, Anbarasu.V 


People interact with computers using devices created to serve specific purposes. For generations, humans have been fascinated by the idea of communicating with machines through devices that can peer into a person's mind. This idea has motivated recent advances in artificial intelligence and cognitive neuroscience toward the ability to interact with a machine through the brain. The proposed work is an electroencephalography (EEG) based biomedical signal processing system that performs computer operations by manipulating brain activity. Users explicitly manipulate their brain activity to produce signals that can be used to operate the computer. The brain waves are obtained with the help of scalp electrodes. The collected EEG signals are then processed to interpret the command and execute the desired task. A real-time implementation requires training the computer according to one's thoughts and actions through neural networks.

Keywords: Artificial Intelligence, Cognitive Neuroscience, Electroencephalography, Biomedical signal processing, Scalp electrodes, Neural Networks.

A Novel Approach for the Effective Detection of Duplicates in XML Data

Anju Ann Abraham, S. Deepa Kanmani  


eXtensible Markup Language (XML) is widely used for data exchange between networks and for publishing data on the web. Identifying and eliminating duplicates has become one of the challenging tasks in the areas of customer relationship management and catalogue integration. In this paper a hybrid technique is used for detecting duplicates in hierarchically structured XML data. An aggressive machine learning technique is used to derive the conditional probabilities for each new structure entered. A method known as the binning technique is used to convert the outputs of support vector machine classifiers into accurate posterior probabilities. To improve the rate of duplicate detection, network pruning is also employed. Experimental analysis shows that the proposed work yields a high duplicate detection rate, thereby achieving an improvement in precision. This method outperforms other duplicate detection solutions in terms of effectiveness.

Keywords: Binning, duplicate detection, heterogeneous structure, network pruning, posterior probability, SVM, XML
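The binning step can be sketched directly: partition held-out classifier scores into equal-width bins and use each bin's fraction of positive (duplicate) examples as the posterior for new scores falling in that bin. The scores, labels and bin boundaries below are made up for illustration:

```python
# Binning calibration: convert raw classifier scores into posterior
# probabilities via per-bin positive rates. All data here is synthetic.

def fit_bins(scores, labels, n_bins=4, lo=-2.0, hi=2.0):
    """Return per-bin positive rates over equal-width bins on [lo, hi]."""
    width = (hi - lo) / n_bins
    pos = [0] * n_bins
    cnt = [0] * n_bins
    for s, y in zip(scores, labels):
        b = min(n_bins - 1, max(0, int((s - lo) / width)))
        cnt[b] += 1
        pos[b] += y
    return [p / c if c else 0.5 for p, c in zip(pos, cnt)]

def posterior(score, bins, lo=-2.0, hi=2.0):
    """Posterior P(duplicate | score) looked up from the fitted bins."""
    width = (hi - lo) / len(bins)
    b = min(len(bins) - 1, max(0, int((score - lo) / width)))
    return bins[b]

scores = [-1.5, -1.2, -0.3, 0.2, 0.4, 1.1, 1.6, 1.8]   # held-out SVM scores
labels = [0, 0, 0, 1, 0, 1, 1, 1]                      # 1 = duplicate pair
bins = fit_bins(scores, labels)
p_hi = posterior(1.5, bins)
p_lo = posterior(-1.5, bins)
```

Platt's sigmoid fit is a common smoother alternative; binning has the advantage of making no shape assumption.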

Survey Paper on Integrity Auditing of Storage

Ugale Santosh A


Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable server resources (networks, memory, storage, CPU, applications and services) that can be rapidly provisioned and released with minimal management effort or cloud service provider interaction. The model offers the promise of massive cost savings combined with increased IT agility due to pay-per-consumption pricing. However, this technology challenges many traditional approaches to hosting-provider and enterprise application design and management. Cloud servers are already in wide use; however, data security is cited as a major barrier to the adoption of cloud storage. Users can store data and use it on demand or for applications without keeping any local copy. Users are able to upload data to cloud storage without checking or verifying its integrity; hence, integrity auditing of cloud data is an important task to ensure users' data integrity. To do this, a user can resort to a third-party auditor (TPA) to check the data on the cloud storage. The TPA has the expertise, knowledge and capabilities that users lack. The TPA audits the integrity of all files stored on the cloud storage on behalf of users and reports the results. Users should be assured that the auditing process will not create new vulnerabilities against their data and that integrity auditing will not cause any resource problems.

Keywords: Auditing, Cloud, Cloud servers, Data integrity, Data privacy, Security, Storage
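At its core, integrity auditing checks stored blocks against digests recorded before upload. The sketch below shows only that check, using SHA-256; real remote-auditing schemes (including TPA protocols) verify integrity without downloading the data, which this toy version does not capture:

```python
import hashlib

# Minimal integrity-audit illustration: the user records a SHA-256
# digest per block before upload; an auditor later recomputes digests
# over the stored blocks and reports mismatches. Block contents are
# illustrative placeholders.

def digest(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

def audit(stored_blocks, expected_digests):
    """Return indices of blocks whose digest no longer matches."""
    return [i for i, (blk, d) in enumerate(zip(stored_blocks, expected_digests))
            if digest(blk) != d]

blocks = [b"block-0 data", b"block-1 data", b"block-2 data"]
expected = [digest(b) for b in blocks]
tampered = blocks[:]
tampered[1] = b"block-1 DATA"          # simulated corruption
bad = audit(tampered, expected)
```

Public-auditability schemes replace this download-and-hash step with homomorphic authenticators so the TPA can verify integrity from short server responses.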

Lookup Table Embedded in FPGA for Network Security

Zuhir Nemer Alaaraj, Abdelrasoul Jabar Alzubaidi


This work proposes a solution to improve the security of data through flexible bitstream encryption, using the lookup table (LUT) that is embedded in the FPGA. This technique concentrates on building high data security to make it difficult for an adversary to capture the real data. The proposed synthesis can be implemented in two steps. First, the FPGA is programmed in VHDL to create a LUT with a predefined length and contents. Second, the security strategy is applied. High security and greater reliability can be achieved by using this mechanism when it is applied to data communication and information transferred between networks.

Keywords: LUT, FPGA, VHDL, embedded system, security
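A software model conveys the idea: the LUT acts as a byte substitution table, and its inverse recovers the plaintext. In the paper the table lives inside the FPGA and is defined in VHDL; the seeded shuffle below is purely illustrative and is not a description of the paper's table contents:

```python
import random

# Software model of LUT-based byte substitution: a 256-entry
# permutation table encrypts a byte stream, and the inverted table
# decrypts it. Table generation (seeded shuffle) is illustrative only.

def make_lut(seed=42):
    table = list(range(256))
    random.Random(seed).shuffle(table)   # a random byte permutation
    return table

def substitute(data: bytes, lut) -> bytes:
    return bytes(lut[b] for b in data)

def invert(lut):
    inv = [0] * 256
    for i, v in enumerate(lut):
        inv[v] = i
    return inv

lut = make_lut()
cipher = substitute(b"secret payload", lut)
plain = substitute(cipher, invert(lut))
```

A single static substitution table is, of course, a weak cipher on its own; the hardware proposal's value lies in keeping the table inside the FPGA and making it reconfigurable.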

Survey of Steganalysis Technique for Detection of Hidden Messages

Vanita J. Dighe, Prof. Baisa L. Gunjal


Still and multimedia images are subject to transformations for image compression, steganographic hiding and digital watermarking. Here we survey measures and techniques for the detection and analysis of steganographically embedded content. Both statistical and pattern classification techniques using these measures provide reasonable discrimination schemes for detecting embeddings of different levels. The measures are based on a few statistical properties of bit strings and of the wavelet coefficients of image pixels. Techniques for information hiding, known as steganography, are becoming increasingly popular and widespread. The purpose of steganography is to send secret messages after embedding them into public digital multimedia. It is preferred to hide as many message bits as possible per change to the cover object; in general, for given messages and covers, a steganographic scheme that introduces fewer embedding changes will be less detectable, i.e., more secure. Two fields of study have been proposed to improve communication security: cryptography and information hiding. Although both are applied to the protection of secret messages, the major difference is the appearance of the transmitted data.

Keywords: Steganography, Least significant bit (LSB), Exploiting Modification Direction (EMD), Diamond Encoding (DE), Optimal Pixel Adjustment Process (OPAP), Pixel Pair Matching (PPM), Adaptive Pixel Pair Matching (APPM).
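One of the simplest statistical measures alluded to above: LSB replacement of a random message drives the least-significant-bit plane toward a 50/50 balance, so the deviation of the LSB rate from 0.5 is a crude embedding indicator. The cover pixels here are synthetic, and real detectors use much richer statistics (e.g. chi-square tests over pixel-value pairs):

```python
import random

# Toy steganalysis measure on the least-significant-bit plane.
# Cover "pixels" are synthetic; a deliberately biased cover makes the
# effect of embedding obvious.

def lsb_rate(pixels):
    """Fraction of pixels with least significant bit set."""
    return sum(p & 1 for p in pixels) / len(pixels)

def embed_lsb(pixels, bits):
    """Replace each pixel's LSB with the next message bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

rng = random.Random(0)
cover = [rng.choice([100, 102, 104, 106]) for _ in range(10_000)]  # all-even LSBs
message = [rng.getrandbits(1) for _ in range(10_000)]              # random payload
stego = embed_lsb(cover, message)
bias_cover = abs(lsb_rate(cover) - 0.5)   # strongly biased before embedding
bias_stego = abs(lsb_rate(stego) - 0.5)   # near zero after embedding
```

The schemes listed in the keywords (EMD, OPAP, PPM, APPM) exist precisely to reduce such statistical footprints per embedded bit.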