Volume 8, Issue 1, Winter and Spring 2008, Pages 1-195


A Comparative Study between Classical Numerical Methods and Monte Carlo Methods

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 1-9
DOI: 10.33899/iqjoss.2008.31512

Recently, there has been great interest in Monte Carlo methods for treating a variety of technical and scientific problems. This research deals with the use of Monte Carlo methods in numerical integration, making a general comparison between the classical methods of integration and the Monte Carlo methods. The research also applies the statistical sampling method to compute approximate values of a number of numerical integrals. From this, we conclude that Monte Carlo methods are efficient in treating such problems.
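
As a minimal sketch of the comparison described above (the integrand, sample sizes and seed are assumptions of this summary, not taken from the paper), the following Python code estimates the same integral by the composite trapezoidal rule and by plain Monte Carlo sampling:

import numpy as np

def f(x):
    return np.exp(-x ** 2)            # example integrand on [0, 1]

# Classical method: composite trapezoidal rule with n subintervals
n = 1000
x = np.linspace(0.0, 1.0, n + 1)
fx = f(x)
trapezoid = (fx[:-1] + fx[1:]).sum() * (x[1] - x[0]) / 2.0

# Monte Carlo method: average the integrand over uniform random points
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=100_000)
fu = f(u)
mc_estimate = fu.mean()
mc_std_error = fu.std(ddof=1) / np.sqrt(u.size)

print(f"trapezoidal rule: {trapezoid:.6f}")
print(f"Monte Carlo     : {mc_estimate:.6f} +/- {mc_std_error:.6f}")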

On Bayesian Estimation in Mixed Linear Models Using the Gibbs Sampler

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 10-34
DOI: 10.33899/iqjoss.2008.31516

This paper tackles the estimation of the parameters of the one-way classification linear mixed random effects model by a Bayesian technique that includes Gibbs sampling. Gibbs sampling is a special case of the Monte Carlo method that uses Markov chains and is therefore called MCMC (Markov Chain Monte Carlo).
The MCMC method depends on partitioning difficult, compound models into simple ones that can be manipulated and easily analyzed, especially for posterior distributions whose final formulae are not easy to find.
In this research the one-way classification linear mixed random effects model is applied to a population of 15 treatments consisting of 15 types of cotton plant. A random sample of 5 types is taken, and the analysis of variance method is used to test the hypothesis that all 15 types have equal effects and to estimate the parameters. Gibbs sampling is also used to estimate the parameters and then to test the hypothesis of equal treatment effects. The results obtained by ANOVA and by Gibbs sampling are nearly the same and encouraging. All algorithms in this research are programmed using the WinBUGS program.
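
The full conditionals of the one-way random effects model are standard, so the Gibbs cycle can be illustrated directly. The sketch below is an assumption of this summary written in Python on simulated data, not the paper's WinBUGS code; the group sizes, priors and seed are hypothetical:

import numpy as np

rng = np.random.default_rng(1)

# Simulated data: k groups (e.g. sampled cotton types) with n observations each
# under the model y_ij = mu + a_i + e_ij, a_i ~ N(0, s2a), e_ij ~ N(0, s2e)
k, n = 5, 8
true_mu, true_s2a, true_s2e = 10.0, 4.0, 1.0
a_true = rng.normal(0.0, np.sqrt(true_s2a), k)
y = true_mu + a_true[:, None] + rng.normal(0.0, np.sqrt(true_s2e), (k, n))

a0, b0 = 0.01, 0.01               # weak inverse-gamma hyperparameters (assumed)
mu, s2a, s2e = y.mean(), 1.0, 1.0  # initial values
a = np.zeros(k)
draws = []

for it in range(5000):
    # a_i | rest : normal full conditional
    prec = n / s2e + 1.0 / s2a
    mean = (n / s2e) * (y.mean(axis=1) - mu) / prec
    a = rng.normal(mean, np.sqrt(1.0 / prec))

    # mu | rest : normal full conditional (flat prior)
    mu = rng.normal((y - a[:, None]).mean(), np.sqrt(s2e / y.size))

    # variances | rest : inverse-gamma full conditionals
    s2a = 1.0 / rng.gamma(a0 + k / 2, 1.0 / (b0 + 0.5 * np.sum(a ** 2)))
    resid = y - mu - a[:, None]
    s2e = 1.0 / rng.gamma(a0 + y.size / 2, 1.0 / (b0 + 0.5 * np.sum(resid ** 2)))

    if it >= 1000:                 # discard burn-in
        draws.append((mu, s2a, s2e))

mu_d, s2a_d, s2e_d = np.array(draws).T
print("posterior means (mu, s2a, s2e):", mu_d.mean(), s2a_d.mean(), s2e_d.mean())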

On the Discrimination between the Inverse Gaussian and Lognormal Distributions

Zakaria Y. AL-Jammal

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 27-52
DOI: 10.33899/iqjoss.2008.31670

Both the inverse Gaussian and lognormal distributions are among the well-known failure time distributions used with positively skewed data. The problem of selecting between them is considered. The logarithm of the ratio of maximized likelihoods is used as a test statistic for discriminating between these two distributions. The test is carried out on nine different real data sets and three simulated data sets.

Keywords: Inverse Gaussian distribution, Lognormal distribution, Ratio of maximized likelihoods, Discrimination.
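
A minimal Python sketch of this discrimination procedure, assuming simulated data and scipy's generic maximum likelihood fitting rather than the paper's data sets:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = stats.lognorm.rvs(s=0.5, scale=2.0, size=100, random_state=rng)

# Fit both candidate models by maximum likelihood (location fixed at zero)
ig_params = stats.invgauss.fit(data, floc=0)
ln_params = stats.lognorm.fit(data, floc=0)

# Maximized log-likelihoods under each model
loglik_ig = np.sum(stats.invgauss.logpdf(data, *ig_params))
loglik_ln = np.sum(stats.lognorm.logpdf(data, *ln_params))

# T > 0 favours the inverse Gaussian, T < 0 favours the lognormal
T = loglik_ig - loglik_ln
print(f"log ratio of maximized likelihoods T = {T:.3f}")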

Revised Model for Radial Basis Function

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 35-56
DOI: 10.33899/iqjoss.2008.31540

In this paper, a revised model for the radial basis function is developed and applied to the sunspot time series, arriving at a revised model for nonlinear time series. The probabilistic characteristics of such series are explained through a chaotic analysis of data simulated from the revised model, which indicates that the revised model is chaotic and that its Bayesian information criterion (BIC), Schwarz information criterion (SIC) and error are better than those of the radial basis function.
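
For orientation, the following is a minimal Python sketch of a basic radial basis function model fitted to a lagged series; the toy series, number of centres and regularization are assumptions of this summary, and the paper's revisions are not reproduced:

import numpy as np

rng = np.random.default_rng(3)

# Toy nonlinear series standing in for the sunspot data
t = np.arange(300)
series = np.sin(0.2 * t) + 0.3 * np.sin(0.5 * t) + 0.1 * rng.normal(size=t.size)

# Build lagged inputs X (p past values) and targets y (next value)
p = 4
X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
y = series[p:]

# Gaussian RBF design matrix with centres chosen from the training inputs
centres = X[rng.choice(len(X), size=20, replace=False)]
dists = np.linalg.norm(X[:, None] - centres[None], axis=2)
width = np.median(dists)
Phi = np.exp(-dists ** 2 / (2 * width ** 2))

# Output weights by regularized least squares
lam = 1e-6
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

pred = Phi @ w
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))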

New CG Method for Large-Scale Unconstrained Optimization Based on Nazareth's Theorem

Khalil K. Abbo

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 53-65
DOI: 10.33899/iqjoss.2008.31676

In this paper we present a new conjugate gradient (CG) method for computing the minimum of a differentiable real-valued function of n variables. The method is derived from Nazareth's theorem, which uses the equivalence of CG and quasi-Newton methods on quadratic functions. The descent property and conjugacy conditions are proved, and the method is compared with some well-known CG methods, showing considerable improvement.
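
A generic nonlinear conjugate gradient loop is sketched below in Python; it uses the classical Polak-Ribière beta (with a nonnegativity restart) purely as a placeholder, since the paper's new beta derived from Nazareth's theorem is not reproduced here, and the test function is an assumption:

import numpy as np
from scipy.optimize import line_search

def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def cg_minimize(f, grad, x0, max_iter=2000, tol=1e-6):
    x = x0.copy()
    g = grad(x)
    d = -g                                   # initial search direction
    for _ in range(max_iter):
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                    # line search failed: restart
            alpha, d = 1e-4, -g
        x_new = x + alpha * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        # Polak-Ribiere+ beta; a new beta formula would be plugged in here
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

x_star = cg_minimize(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print("approximate minimizer (exact optimum is [1, 1]):", np.round(x_star, 4))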

Employing Kruskal's Method for the Assignment Problem

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 57-70
DOI: 10.33899/iqjoss.2008.31548

The assignment problem, which assigns a set of n distinct jobs to n machines so that the total cost is minimum, is considered very important. The problem is solved by many methods, one of which is the Hungarian method. This research treats the problem through graphs by proposing a new method that works on the complete bipartite graph using Kruskal's method. We compare the results for different problems; both methods give the optimal solution, but the new method has a high degree of success and is easy to use.
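
One possible reading of the comparison is sketched below (an assumption of this summary, not the paper's algorithm): the Hungarian method via scipy as the classical baseline, next to a Kruskal-style greedy matching that scans the bipartite edges in increasing order of cost and accepts an edge when both endpoints are still free:

import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(4)
cost = rng.integers(1, 20, size=(5, 5))        # cost[i, j]: job i on machine j

# Classical solution: Hungarian method
rows, cols = linear_sum_assignment(cost)
print("Hungarian cost       :", cost[rows, cols].sum())

# Kruskal-style greedy matching on the complete bipartite graph
edges = sorted((cost[i, j], i, j) for i in range(5) for j in range(5))
used_jobs, used_machines, greedy_cost = set(), set(), 0
for c, i, j in edges:
    if i not in used_jobs and j not in used_machines:
        used_jobs.add(i)
        used_machines.add(j)
        greedy_cost += c
print("greedy (Kruskal-like):", greedy_cost)   # not guaranteed optimal in general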

Qualified Genetic Algorithms for Image Smoothing

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 71-88
DOI: 10.33899/iqjoss.2008.31554

Four types of algorithms are suggested in this paper to smooth images; three of them work in the spatial domain and the fourth works in the frequency domain. The first genetic algorithm uses the smoothing filters known as the median, mean and minimum filters, each in turn, as its objective function, and a scheme is proposed to enhance the mean filter. The second genetic algorithm uses morphological operations to perform the smoothing, while the third and fourth ones use a Gaussian filter as the objective function; they differ in that the third genetic algorithm works in the spatial domain and the fourth in the frequency domain.
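
As a simplified illustration only (not one of the paper's four algorithms), the sketch below evolves a 3x3 smoothing kernel with a small genetic algorithm whose objective is agreement with a median-filtered reference image; the image, population size and genetic operators are assumptions:

import numpy as np
from scipy.ndimage import convolve, median_filter

rng = np.random.default_rng(5)
image = rng.normal(loc=128, scale=30, size=(64, 64))   # stand-in noisy image
target = median_filter(image, size=3)                   # objective reference

def fitness(kernel):
    k = kernel.reshape(3, 3)
    k = k / k.sum()
    return -np.mean((convolve(image, k, mode="reflect") - target) ** 2)

pop = rng.uniform(0.01, 1.0, size=(30, 9))              # population of kernels
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]        # selection: top third
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, 9)
        child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
        child = child + rng.normal(0, 0.05, 9) * (rng.random(9) < 0.1)  # mutation
        children.append(np.clip(child, 0.01, 1.0))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])].reshape(3, 3)
print("best kernel (normalized):\n", best / best.sum())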

Optimum Smoothing Constant for the Exponential Smoothing Model with an Application

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 89-103
DOI: 10.33899/iqjoss.2008.31558

In this paper, we study methods for obtaining the smoothing constant of the exponential smoothing model and suggest another method that obtains the smoothing constant by combining two estimates (new and old information) through adaptive filtering. The model is applied to the time series of the overall national income of Egypt for (1965-2002), and its adequacy is checked using the chi-square (χ2) measure. Simulations have been carried out based on the parameters of the original series with different numbers of observations.
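
A minimal sketch of choosing the smoothing constant, assuming a simulated series instead of the Egyptian national income data, is the grid search below over the one-step-ahead sum of squared errors:

import numpy as np

rng = np.random.default_rng(6)
series = np.cumsum(rng.normal(0.5, 1.0, size=120)) + 100   # stand-in series

def sse(alpha, y):
    level, total = y[0], 0.0
    for value in y[1:]:
        total += (value - level) ** 2       # one-step-ahead forecast error
        level = alpha * value + (1 - alpha) * level
    return total

alphas = np.linspace(0.01, 0.99, 99)
best_alpha = min(alphas, key=lambda a: sse(a, series))
print(f"optimum smoothing constant: {best_alpha:.2f}")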

Employing State Space Models and the Principal Components Approach to Estimate the Delay Time

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 104-112
DOI: 10.33899/iqjoss.2008.31567

This research deals with state space models and the principal components analysis approach to estimate the delay time in linear stochastic dynamical systems. The delay time is statistically important for describing a suitable identification. The results of the proposed approach are compared in simulation experiments, where the principal components approach with the state space representation gives a high degree of success in estimating the delay time, with no failure cases, compared with the original data.
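
The paper's estimator is not reproduced here; as a simplified stand-in on the same problem, the sketch below estimates the delay time of a simulated linear system from the lag at which the input-output cross-correlation peaks (all values are assumptions):

import numpy as np

rng = np.random.default_rng(7)
n, true_delay = 500, 3
u = rng.normal(size=n)                          # input signal
y = np.zeros(n)
y[true_delay:] = 0.8 * u[:-true_delay]          # delayed response
y += 0.2 * rng.normal(size=n)                   # observation noise

max_lag = 10
corr = [np.corrcoef(u[:n - d], y[d:])[0, 1] for d in range(max_lag + 1)]
print("estimated delay time:", int(np.argmax(np.abs(corr))))   # expect 3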

Using Network Sampling for Estimating the Diffusion of Thalassemia Disease

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 113-128
DOI: 10.33899/iqjoss.2008.31638

This paper uses the network sampling technique to estimate the diffusion of a hereditary Mediterranean disease (Thalassemia) in the city of Tala'fur. The efficiency of this technique is compared with some ordinary sampling techniques, and it is concluded that network sampling has a smaller variance and gives estimates closer to the actual values than the remaining well-known ordinary techniques, the most important of which is proportional sampling.
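
A minimal simulation of the multiplicity (network) sampling estimator is sketched below; the population size, linkage pattern and sample size are assumptions of this summary, not the Tala'fur survey data:

import numpy as np

rng = np.random.default_rng(8)
M = 2000                                        # households in the population
true_cases = 150
# Each case is linked to 1-4 households (its own plus relatives)
links = [rng.choice(M, size=rng.integers(1, 5), replace=False)
         for _ in range(true_cases)]
multiplicity = np.array([len(l) for l in links])

# Reverse index: which cases are reported by each household
reported_by = {h: [] for h in range(M)}
for case, households in enumerate(links):
    for h in households:
        reported_by[h].append(case)

s = 300                                         # simple random sample of households
sample = rng.choice(M, size=s, replace=False)
# Each reported case is down-weighted by its multiplicity, then scaled up
total_weight = sum(1.0 / multiplicity[case]
                   for h in sample for case in reported_by[h])
estimate = (M / s) * total_weight
print(f"estimated number of cases: {estimate:.1f} (true {true_cases})")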

The Calculation of Averages of Measurements of Newly Born Babies Using Bayes Estimates Depending on a Prior Distribution of Standard Information

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 129-135
DOI: 10.33899/iqjoss.2008.31640

In this paper the averages of the measurements of newly born babies are calculated, namely the weight, the height, and the circumferences of the head, chest and arm, through which the health of the baby may be judged. The calculation uses Bayes estimates depending on the standard information of the prior distribution, and these are compared with the traditional estimate that depends on sample information only.
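
A minimal sketch of the comparison, assuming a conjugate normal prior with hypothetical "standard" values for birth weight and a simulated sample with known variance (none of these numbers come from the paper):

import numpy as np

rng = np.random.default_rng(9)

# Prior "standard information" for birth weight in kg (assumed values)
prior_mean, prior_var = 3.4, 0.09
# Observed sample of newborn weights with known measurement variance (assumed)
sample = rng.normal(3.2, 0.4, size=25)
sigma2, n = 0.16, sample.size

# Posterior mean is a precision-weighted average of prior mean and sample mean
post_prec = 1.0 / prior_var + n / sigma2
post_mean = (prior_mean / prior_var + sample.sum() / sigma2) / post_prec

print(f"traditional estimate (sample mean): {sample.mean():.3f}")
print(f"Bayes estimate (posterior mean)   : {post_mean:.3f}")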

Extracting Hidden Information in Digital Images

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 136-151
DOI: 10.33899/iqjoss.2008.31646

One of the problems of image processing is that a great deal of information is hidden among the gray levels of images. In this research the histogram technique, which is widely used for image enhancement, is employed to extract that information. In applications to different types of images, the images processed by the proposed algorithm gave better results than the original ones.

Keywords: Digital Image Processing, Image Enhancement, Histogram, Feature Extraction.
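
As one concrete instance of the histogram technique (not necessarily the paper's exact algorithm), the sketch below applies histogram equalization to a hypothetical low-contrast 8-bit image:

import numpy as np

rng = np.random.default_rng(10)
# Stand-in low-contrast image: gray levels squeezed into a narrow band
image = rng.integers(100, 140, size=(128, 128)).astype(np.uint8)

hist = np.bincount(image.ravel(), minlength=256)
cdf = hist.cumsum()
cdf_min = cdf[cdf > 0].min()
# Map each gray level through the normalized cumulative histogram
lut = np.clip(np.round((cdf - cdf_min) / (image.size - cdf_min) * 255),
              0, 255).astype(np.uint8)
equalized = lut[image]

print("original range :", image.min(), "-", image.max())
print("equalized range:", equalized.min(), "-", equalized.max())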

Using the Genetic Algorithm to Maximize the Likelihood Function of the Normal Distribution

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2008, Volume 8, Issue 1, Pages 152-162
DOI: 10.33899/iqjoss.2008.31651

In this research, the genetic algorithm (GA) is applied as a computational treatment to find the value which maximizes the likelihood function. An algorithm is proposed to find the values which maximize the likelihood function of the normal distribution. Applying this algorithm enables us to obtain several solutions, among them the value responsible for maximizing the likelihood function, and the number of solutions equals the number of generations of the algorithm.
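
A minimal sketch of such an algorithm, with assumed GA settings and simulated data (not the paper's implementation), is given below; the GA estimate can be checked against the analytical maximum likelihood estimates:

import numpy as np

rng = np.random.default_rng(11)
data = rng.normal(5.0, 2.0, size=200)

def log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return -np.inf
    return -data.size * np.log(sigma) - np.sum((data - mu) ** 2) / (2 * sigma ** 2)

# Population of candidate (mu, sigma) pairs
pop = np.column_stack([rng.uniform(0, 10, 50), rng.uniform(0.1, 5, 50)])
for gen in range(200):
    scores = np.array([log_likelihood(p) for p in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]              # keep the 10 fittest
    idx_a = rng.integers(10, size=40)
    idx_b = rng.integers(10, size=40)
    children = np.column_stack([parents[idx_a, 0], parents[idx_b, 1]])  # crossover
    children = children + rng.normal(0, 0.1, size=(40, 2))              # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([log_likelihood(p) for p in pop])]
print("GA estimate (mu, sigma):", np.round(best, 3))
print("analytical MLE         :", round(data.mean(), 3), round(data.std(), 3))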