Volume 13, Issue 4, 2006
Electrical and Computer Engineering


A Rule-Based Advanced Static Var Compensator Control Scheme for Transient Stability Improvement
 
        S. Abazari (PhD.)
  • M. Ehsan (PhD.)
  • M.R. Zolghadri (PhD.)
  • J. Mahdavi (PhD.)

 

This paper presents the application of a rule-based control scheme for an Advanced Static Var Compensator (ASVC) to improve power system transient stability. The proposed method uses a current reference based on the Transient Energy Function (TEF) approach and also provides continuous control of the reactive power flow. The performance of the proposed approach is compared with that of a system using a conventional control method and with that of a system without an ASVC. A single-machine system and an IEEE three-machine system are used to verify the performance of the proposed method.
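The rule-based idea can be illustrated with a generic bang-bang rule table of the kind used in TEF-motivated SVC control. This is a hypothetical sketch: the rule set, deadband and function names below are illustrative stand-ins, not the authors' actual scheme.

```python
# Illustrative sketch only: a generic bang-bang rule table of the kind
# used in TEF-motivated SVC control. The rules and thresholds here are
# hypothetical, not the paper's actual scheme.

def svc_susceptance(delta_omega, accel, b_max=1.0, b_min=-1.0, deadband=1e-3):
    """Pick the SVC susceptance command from the machine state.

    delta_omega : rotor speed deviation from synchronous speed (pu)
    accel       : rotor acceleration (pu/s)
    Returns the commanded susceptance in [b_min, b_max] (pu).
    """
    if abs(delta_omega) < deadband:
        return 0.0                  # near equilibrium: no compensation
    if delta_omega > 0 and accel > 0:
        return b_max                # machine accelerating: full capacitive
    if delta_omega > 0 and accel <= 0:
        return b_max                # still braking the first swing
    if delta_omega < 0 and accel < 0:
        return b_min                # back swing growing: full inductive
    return 0.0                      # back swing already decelerating: coast
```

A real TEF-based controller would derive the switching condition from the system's energy function rather than from fixed sign rules, but the discrete "rule table driving a continuous compensator" structure is the same.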


Performance Evaluations and Comparisons of Several LDPC Coded MC-FH-CDMA Systems
 
        H. Behroozi (PhD.)
  • J. Haghighat (PhD.)
  • M. Nasiri-Kenari (PhD.)
  • S.H. Jamali (PhD.)

 

In this paper, the application of regular Low-Density Parity-Check (LDPC) codes in Multi-Carrier Frequency-Hopping (MC-FH) CDMA systems is studied. To this end, different well-known constructions of regular LDPC codes are considered and the performance of LDPC coded MC-FH-CDMA systems based on these constructions is evaluated and compared over a frequency-selective, slowly fading Rayleigh channel. These results are also compared with those previously reported for super-orthogonal convolutionally coded MC-FH-CDMA systems. The simulation results indicate that the LDPC coded MC-FH-CDMA system significantly outperforms both the uncoded and the super-orthogonal convolutionally coded schemes. To alleviate the restrictions imposed by well-known LDPC construction methods when applied to the coded MC-FH-CDMA system considered, a new semi-random construction is proposed and its performance is evaluated in the coded scheme. The simulation results indicate that this new construction substantially outperforms the other well-known construction methods in the application considered.
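One of the "well-known constructions" the abstract refers to is Gallager's classic regular construction, which can be sketched in a few lines. The parameters and RNG seeding below are illustrative; the paper's own semi-random construction is not reproduced here.

```python
import numpy as np

# Minimal sketch of Gallager's regular (j, k) LDPC construction: stack
# j bands, each a column-permuted copy of a base band in which every
# column has weight 1 and every row has weight k.

def gallager_ldpc(n, j, k, seed=0):
    """Build a regular parity-check matrix H with column weight j and
    row weight k. n must be divisible by k."""
    assert n % k == 0
    rows_per_band = n // k
    # Base band: row i covers columns i*k .. (i+1)*k - 1.
    base = np.zeros((rows_per_band, n), dtype=np.uint8)
    for i in range(rows_per_band):
        base[i, i * k:(i + 1) * k] = 1
    rng = np.random.default_rng(seed)
    bands = [base]
    for _ in range(j - 1):
        perm = rng.permutation(n)        # random column permutation
        bands.append(base[:, perm])
    return np.vstack(bands)

H = gallager_ldpc(n=24, j=3, k=4)
# Every column of H has weight j and every row has weight k,
# as regularity requires.
```

Semi-random constructions of the kind the paper proposes typically constrain such permutations further (e.g., to avoid short cycles in the Tanner graph) rather than drawing them freely.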


Estimation of Mean Radius, Length and Density of Microvasculature Using Diffusion and Perfusion MRI
 
        M. Ashoor (PhD.)
  • M. Jahed (PhD.)
  • M. Chopp (PhD.)
  • A. Mireshghi (PhD.)

 

In theory, diffusion and perfusion information in MRI maps can be combined to yield morphological information, such as capillary density, volume and, possibly, capillary plasma velocity. This paper suggests a new method for the determination of the mean radius, length and density of the microvasculature in normal regions using diffusion and perfusion MRI. The Mean Transit Time (MTT), Cerebral Blood Volume (CBV), Apparent Diffusion Coefficient (ADC), pseudo-diffusion coefficient (D*) and the R2 and R2* values were utilized to calculate the mean capillary radius, length and density. To verify the proposed theory, a special protocol was designed and tested on normal regions of a male Wistar rat using the obtained functions. The mean radius, length and density of the capillaries in the normal regions were calculated to be 2.48 ± 0.35 microns (mean ± SD), 234 ± 12 microns and 11897 ± 219/mm³, respectively. For values of 0.01 through 0.1 for the CBV/Vol(voxel) parameter and 1 through 1000 sec for the R2²/R2*³ term, the mean capillary radius obtained with the proposed method varied from 0.076 to 7.58 microns.


Performance Analysis of Per Tone Equalization in DMT-Based Systems
 
        M.R. Pakravan (PhD.)
  • S.S. Changiz Rezaei (PhD.)

 

The Per Tone equalization algorithm is a novel discrete multitone (DMT) equalization method with practical applications in digital subscriber loop systems. Unlike time domain DMT equalizers (TEQs), which equalize all DMT tones in a combined fashion, the Per Tone equalizer equalizes each tone separately. In this paper, the performance and complexity of this technique are investigated and compared with those of other TEQs. Furthermore, the behavior of this technique under different simulation conditions, such as over different standard CSA loops and in the presence of additive white Gaussian noise with a variable power spectral density level and near-end crosstalk (NEXT), is studied and compared with that of other TEQs. Simulation results show that the Per Tone equalizer has a reduced sensitivity to synchronization delay and a much better performance than other conventional DMT equalizer design algorithms.
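The per-tone principle is easiest to see in its one-tap special case: when the cyclic prefix is longer than the channel, each tone sees a flat complex gain and a single tap per tone inverts it. The full Per Tone equalizer generalizes this to several taps per tone on sliding-window DFT outputs; the sketch below shows only the simplified one-tap case, with illustrative names and parameters.

```python
import numpy as np

# Simplified illustration: with a cyclic prefix longer than the channel
# memory, each DMT tone sees a flat gain H[k], so one complex tap per
# tone equalizes it exactly. The actual Per Tone equalizer uses a short
# multi-tap filter per tone instead of this single tap.

def dmt_one_tap_feq(x_symbols, h, n_fft):
    """Pass one DMT symbol through FIR channel h, then equalize per tone."""
    tx = np.fft.ifft(x_symbols, n_fft)          # DMT modulation
    cp = len(h) - 1
    tx_cp = np.concatenate([tx[-cp:], tx])      # add cyclic prefix
    rx = np.convolve(tx_cp, h)[cp:cp + n_fft]   # channel, then drop CP
    rx_f = np.fft.fft(rx, n_fft)                # back to tone domain
    H = np.fft.fft(h, n_fft)                    # per-tone channel gains
    return rx_f / H                             # one complex tap per tone

n_fft = 8
x = np.exp(2j * np.pi * np.random.default_rng(1).random(n_fft))
h = np.array([1.0, 0.4, 0.2])                   # toy 3-tap channel
y = dmt_one_tap_feq(x, h, n_fft)
```

When the channel is longer than the prefix, this one-tap model breaks down, which is exactly the regime where multi-tap TEQs and the Per Tone equalizer compete.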


A New Analytical Method on the Field Calculation of Interior Permanent-Magnet Synchronous Motors
 
        A. Kiyoumarsi (PhD.)
  • M.R. Hassanzadeh (PhD.)
  • M. Moallem (PhD.)

 

Although analytical methods exist for field calculation in surface-mounted permanent-magnet synchronous motors, accurate analytical methods for predicting the airgap flux density distribution in Interior-type Permanent-Magnet (IPM) synchronous motors are not available. In this paper, a novel method for the analytical prediction of the flux distribution, based on Schwarz-Christoffel transformation techniques, is proposed to evaluate the airgap flux density distribution in an IPM motor. To validate the accuracy of the new analytical method, the results are compared with transient Finite Element Method (FEM) results.
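For background, the classical Schwarz-Christoffel formula on which such techniques build (standard textbook material, not the authors' specific derivation): a conformal map $f$ from the upper half-plane onto the interior of a polygon with interior angles $\alpha_k\pi$ has the form

```latex
f(z) \;=\; A + C \int^{z} \prod_{k=1}^{n} \left(\zeta - x_k\right)^{\alpha_k - 1} \, d\zeta ,
```

where the prevertices $x_k$ on the real axis map to the polygon's vertices and the constants $A$, $C$ fix translation, rotation and scale. Applying this to a motor cross-section requires numerically locating the prevertices for the polygon formed by the airgap and magnet cavities.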


An Adaptive Secure Channel Coding Scheme for Data Transmission over LEO Satellite Channels
 
        A. Payandeh (PhD.)
  • M. Ahmadian (PhD.)
  • M.R. Aref (PhD.)

 

Both secure coding and error control coding are very extensive subjects, each with a variety of sub-disciplines. A secure channel coding (joint encryption-channel coding) scheme provides both data secrecy and data reliability in one process, to combat the problems of an insecure and unreliable channel. In this paper, a joint encryption-channel coding scheme based on concatenated turbo codes is developed for more efficient and secure transmission of LEO satellite data. Reliability and security are achieved by adapting a pseudo-random puncturing strategy to the changing distance between the satellite and the ground stations during a communication session, which also makes it possible to reduce energy consumption or to increase the bit rate of data transmission. Simulation results show the relevance and superior performance of the proposed scheme compared with a traditional data transmission system.
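The two ingredients of the adaptive puncturing idea can be sketched as follows: a secret seed makes the puncturing pattern unpredictable to an eavesdropper (secrecy), while the fraction of surviving parity bits tracks the satellite-ground distance (rate adaptation). Everything below (function names, the linear rate schedule, the distance bounds) is an illustrative assumption, not the paper's exact construction.

```python
import random

# Hedged sketch: a secret-keyed, distance-adaptive puncturing pattern.
# The 50%..100% survival schedule and the d_min/d_max bounds are
# made-up parameters for illustration.

def puncture_pattern(secret_seed, n_parity, distance_km,
                     d_min=500.0, d_max=2500.0):
    """Return the sorted indices of parity bits to KEEP for one block.

    Near the ground station (short distance, good link) more bits are
    punctured (higher code rate); at long distance more parity survives.
    """
    frac = min(max((distance_km - d_min) / (d_max - d_min), 0.0), 1.0)
    keep = int(round(n_parity * (0.5 + 0.5 * frac)))   # keep 50%..100%
    # Deterministic per (seed, block size, distance): both ends can
    # regenerate the same pattern without transmitting it.
    rng = random.Random(f"{secret_seed}:{n_parity}:{distance_km}")
    return sorted(rng.sample(range(n_parity), keep))

def apply_puncturing(parity_bits, keep_idx):
    return [parity_bits[i] for i in keep_idx]
```

Because the receiver shares the seed, it can mark the punctured positions as erasures for the turbo decoder, while an eavesdropper without the seed cannot even align the surviving bits.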


A Note on Fuzzy Process Capability Indices
 
        M. Mashinchi (PhD.)
  • M.T. Moeti (PhD.)
  • A. Parchami (PhD.)

 

The notion of fuzzy process capability indices is studied by Parchami et al. [1], where the specification limits are triangular fuzzy numbers. In this note, their results are revised for the general case, where the specification limits are $L-R$ fuzzy intervals.
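For context, one plausible reconstruction of the triangular-limit definition being generalized (the exact definition is the one in [1]; the symbols below are assumptions): with triangular fuzzy specification limits $\tilde{U} = T(u_1, u_2, u_3)$ and $\tilde{L} = T(l_1, l_2, l_3)$ and process standard deviation $\sigma$, fuzzy subtraction gives

```latex
\tilde{C}_p \;=\; \frac{\tilde{U} \ominus \tilde{L}}{6\sigma}
            \;=\; T\!\left(\frac{u_1 - l_3}{6\sigma},\;
                           \frac{u_2 - l_2}{6\sigma},\;
                           \frac{u_3 - l_1}{6\sigma}\right).
```

The generalization in this note replaces the triangular numbers $T(\cdot)$ by $L$-$R$ fuzzy intervals, with the endpoints combined by the corresponding $L$-$R$ arithmetic.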


Drawing Free Trees on 2D Grids Which are Bounded by Simple Polygons
 
        A. Bagheri (PhD.)
  • M. Razzazi (PhD.)

 

In this paper, polyline grid drawings of free trees on two-dimensional grids bounded by simple polygons are investigated. To the authors' knowledge, this is the first attempt to develop algorithms for drawing graphs on two-dimensional grids bounded by simple polygons.


Machine Learning Approaches to Text Segmentation
 
        S.D. Katebi (PhD.)
  • M.M. Haji (PhD.)

 

Two machine learning approaches are introduced for text segmentation. The first approach is based on inductive learning in the form of a decision tree and the second uses the Naive Bayes technique. A set of training data is generated from a wide range of compound text image documents for learning both the decision tree and the Naive Bayes Classifier (NBC). The compound documents used for generating the training data include both machine-printed and handwritten texts with different fonts and sizes. Eighteen Discrete Cosine Transform (DCT) coefficients are used as the main features for distinguishing text from images. The trained decision tree and Naive Bayes classifiers are tested on unseen documents and very promising results are obtained, with the latter method being more accurate and computationally faster. Finally, the results obtained from the proposed approaches are compared and contrasted with a wavelet-based approach, and it is shown that both methods presented in this paper are more effective.
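The DCT-feature-plus-Naive-Bayes pipeline can be sketched end to end. This is an illustrative stand-in, not the paper's implementation: the block size, the row-major choice of 18 AC coefficients (a zigzag scan is more common), and the synthetic "text"/"image" blocks are all assumptions.

```python
import numpy as np

# Sketch: classify 8x8 blocks as text vs. image from 18 DCT coefficients
# with a hand-rolled Gaussian Naive Bayes classifier.

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] /= np.sqrt(2)
    return C * np.sqrt(2.0 / n)

def dct_features(block, n_coeffs=18):
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T                       # 2D DCT-II
    # First 18 AC coefficients in row-major order, DC term skipped.
    return np.abs(coeffs).ravel()[1:1 + n_coeffs]

class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(0) for c in self.classes])
        self.var = np.array([X[y == c].var(0) + 1e-9 for c in self.classes])
        self.prior = np.array([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # Log-likelihood under independent per-feature Gaussians + log prior.
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(-1)
        return self.classes[np.argmax(ll + np.log(self.prior), axis=1)]

rng = np.random.default_rng(0)
# Synthetic stand-ins: "text" blocks have sharp edges (high AC energy),
# "image" blocks are smooth gradients (low AC energy).
text_blocks = [np.where(rng.random((8, 8)) > 0.5, 255.0, 0.0) for _ in range(40)]
image_blocks = [np.outer(np.linspace(0, 255, 8), np.ones(8))
                + rng.normal(0, 5, (8, 8)) for _ in range(40)]
X = np.array([dct_features(b) for b in text_blocks + image_blocks])
y = np.array([1] * 40 + [0] * 40)                  # 1 = text, 0 = image
clf = GaussianNB().fit(X, y)
```

The intuition the features encode is the one the abstract relies on: text regions concentrate energy in mid and high spatial frequencies, while photographic regions are dominated by low frequencies.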


Achieving Higher Stability in Watermarking According to Image Complexity
 
        M. Jamzad (PhD.)
  • F. Yaghmaee (PhD.)

 

One of the main objectives of all watermarking algorithms is to provide a secure method for detecting all or part of the watermark pattern in the face of the usual attacks on a watermarked image. In this paper, a method is introduced that is suitable for any spatial domain watermarking algorithm and that provides a measure of the level of robustness to be expected when a given watermark is embedded in a known host image. In order to increase the robustness of the watermarked image, a watermark of $M$ bits is embedded $N = s \times M$ times, where $s$ is a small integer. To do this, the entire image is divided into 16 equal-size blocks and the complexity of the sub-image in each block is measured using its quad-tree representation. The number of repetitions of the watermark bits saved in each block is then determined according to the complexity level of that block. This approach not only secures the watermarked image against the usual attacks, but also makes it possible to embed longer watermark bit patterns, while maintaining a good level of similarity between the original image and the watermarked one. To evaluate the performance of this method, it was tested on 2000 images with low, medium and high levels of complexity, and the results were compared with those for the same set of images watermarked without considering the complexity of the sub-images in the blocks. The new method provided 17% higher stability.
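The two ingredients of the scheme, quad-tree complexity per block and complexity-proportional repetition counts, can be sketched as follows. The splitting threshold, grid size handling and allocation rule below are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

# Hedged sketch: (1) block complexity measured by quad-tree size,
# (2) watermark-bit repetition counts allocated per block in
# proportion to complexity. Parameters are made up for illustration.

def quadtree_nodes(block, thresh=16.0):
    """Count quad-tree leaves: a region keeps splitting while its
    intensity range exceeds thresh. More leaves = more complex block."""
    if block.shape[0] <= 1 or block.max() - block.min() <= thresh:
        return 1
    h = block.shape[0] // 2
    w = block.shape[1] // 2
    return (quadtree_nodes(block[:h, :w], thresh)
            + quadtree_nodes(block[:h, w:], thresh)
            + quadtree_nodes(block[h:, :w], thresh)
            + quadtree_nodes(block[h:, w:], thresh))

def repetition_allocation(image, total_repeats, grid=4):
    """Split the image into grid x grid blocks (16 by default, as in
    the paper) and split total_repeats across them by complexity."""
    h, w = image.shape[0] // grid, image.shape[1] // grid
    comp = np.array([[quadtree_nodes(image[i*h:(i+1)*h, j*w:(j+1)*w])
                      for j in range(grid)] for i in range(grid)], float)
    return np.floor(total_repeats * comp / comp.sum()).astype(int)
```

Busy blocks, where extra embedded bits are least visible, thus absorb most of the repetitions, which is what lets the scheme carry longer watermarks without visible degradation.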