Department of Statistics
Permanent URI for this community: https://pyxida.aueb.gr/handle/123456789/12
The Department of Statistics was founded in 1989 and is the first dedicated Department of Statistics established in Greece. The Department's mission is to advance and transmit knowledge, through research and teaching, in the field of Statistics and related subjects, both theoretical and applied, and to train high-calibre professionals for the needs of the private and public sectors. This mission is pursued through the development of research and education, organized so as to provide graduates both with the appropriate theoretical knowledge and with the ability to apply it to the practical needs of various areas of economic activity (the National Statistical Service of Greece; public and private organizations such as ministries, hospitals, banks with statistical services, industry, insurance companies, polling companies, research centres, etc.). Finally, this goal is also served by the development of international contacts, expressed chiefly through the Erasmus mobility programme. The Department's research activity covers a wide range of theoretical and applied topics in statistics and probability, such as Bayesian statistics, computational statistics, time series, biostatistics, stochastic processes, applied probability, repeated measurements, sampling, distribution theory, actuarial statistics, econometrics, finance, quality control, stochastic analysis, and official statistics. Upon graduation, many career opportunities open up for the Department's graduates in Greek and European industry, in business consulting firms, in the banking and financial sector, and of course in secondary and vocational education.
Many graduates continue their studies at master's and doctoral level, in the corresponding programmes of the Department itself or of other universities in Greece and abroad.
URL: http://www.stat-athens.aueb.gr
Browse
Browsing Department of Statistics by Supervisor "Dellaportas, Petros"
Now showing 1 - 12 of 12
Item: Bayesian model determination and nonlinear threshold volatility models
Petralias, Athanassios; Πετραλιάς, Αθανάσιος; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Ntzoufras, Ioannis
The purpose of this Thesis is to document an original contribution in the areas of model determination and volatility modeling. Model determination is the procedure that evaluates the ability of competing hypothesized models to describe a phenomenon under study. Volatility modeling, in the present context, involves developing models that can adequately describe the volatility process of a financial time series. In this Thesis we focus on the development of efficient algorithms for Bayesian model determination using Markov chain Monte Carlo (MCMC), which are also used to develop a family of nonlinear, flexible models for volatility. We propose a new method for Bayesian model determination that incorporates several desirable characteristics, resulting in better mixing for the MCMC chain and more precise estimates of the posterior density. The new method is compared with various existing methods in an extensive simulation study, as well as in more complex model selection problems based on linear regression, with both simulated and real data comprising 300 to 1000 variables. The method seems to produce rather promising results, outperforming several other existing algorithms in most of the cases analyzed. Furthermore, the method is applied to gene selection using logistic regression, on a well-known dataset of 3226 genes. The problem lies in identifying the genes related to the presence of a specific form of breast cancer. The new method again proves more efficient than an existing Population MCMC sampler, while we extend the findings of previous medical studies on this issue. We present a new class of flexible threshold models for volatility.
In these models the variables included, as well as the number and location of the threshold points, are estimated, while the exogenous variables are allowed to be observed at lower frequencies than the dependent variable. To estimate these models we use the new method for Bayesian model determination, enriched with new move types, the use of which is validated through additional simulations. Furthermore, we propose a comparative model based on splines, where the number and location of the spline knots are related to a set of exogenous variables. The new models are applied to estimate and predict the variance of the Euro-dollar exchange rate, using as exogenous variables a set of U.S. macroeconomic announcements. The results indicate that the threshold models can provide significantly better estimates and projections than the spline model and typical conditional volatility models, while the most important macroeconomic announcements are identified. The threshold models are then generalised to the multivariate case. Under the proposed methodology, only the estimation of the univariate variances is required, along with a rather small collection of regression coefficients. This greatly simplifies inference, while the model is found to perform rather well in terms of predictability. A detailed review of both the available algorithms for Bayesian model determination and nonlinear models for financial time series is also included in this Thesis. We illustrate how the existing methods for model determination are embedded into a common general scheme, while we discuss the properties and advantages each method has to offer. The main argument presented is that there is no globally best or preferable method; rather, their relative performance and applicability depend on the dataset and problem of interest.
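As a rough illustration of the threshold idea discussed in this abstract, the following toy simulation lets the volatility of a return series switch between two regimes according to an exogenous variable. The single threshold, the regime levels and all parameter values are illustrative assumptions, not the specification estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy threshold volatility model (illustrative only): the standard deviation
# of the returns switches between two regimes depending on whether an
# exogenous variable z_t exceeds a threshold c.
T, c = 1000, 0.5
sigma_low, sigma_high = 0.5, 2.0          # regime standard deviations
z = rng.normal(size=T)                    # exogenous variable
sigma = np.where(z > c, sigma_high, sigma_low)
returns = sigma * rng.normal(size=T)      # returns with regime-dependent volatility

# The sample standard deviation within each regime recovers sigma_high / sigma_low
print(returns[z > c].std(), returns[z <= c].std())
```

In the thesis the number and location of such thresholds are themselves objects of Bayesian inference; here they are fixed for clarity.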
With respect to nonlinear models for financial time series and volatility, we present the main parametric and nonparametric classes of these models in a unified manner, and we also review event studies analyzing the effect of news announcements on volatility.
Item: Bayesian modelling of high dimensional financial data using latent Gaussian models
Alexopoulos, Angelis N.; Αλεξόπουλος, Αγγελής Ν.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Papaspiliopoulos, Omiros
The present thesis deals with the problem of developing statistical methodology for modelling and inference of high dimensional financial data. The motivation of our research was the identification of infrequent and extreme movements, called jumps, in the prices of the 600 stocks of the Euro STOXX index. This is known in the financial and statistical literature as the problem of separating jumps from the volatility of the underlying process assumed for the evolution of the stock prices. The main contribution of the thesis is the modelling of, and the development of methods for inference on, the characteristics of jumps across multiple stocks as well as across the time horizon. Following the Bayesian paradigm, we use prior information to model a known characteristic of financial crises, namely that jumps in stock prices tend to occur clustered in time and to affect several markets within a short period of time. An improvement in the prediction of future stock prices has been achieved. The proposed model combines the stochastic volatility (SV) model with a multivariate jump process and belongs to the very broad class of latent Gaussian models.
Bayesian inference for latent Gaussian models relies on a Markov chain Monte Carlo (MCMC) algorithm which alternates between sampling from the distribution of the latent states conditional on the parameters and the observations, and sampling from the distribution of the parameters conditional on the latent states and the observations. In the case of SV models with jumps, sampling the latent volatility process of the model is not a new problem. Over the last few years several methods have been proposed for separating the jumps from the volatility process, but there is no satisfactory solution yet, since sampling from a high dimensional, nonlinear and non-Gaussian distribution is required. In the present thesis we propose a Metropolis-Hastings algorithm in which we sample the whole path of the volatility process of the model without using any approximation. We compare the resulting MCMC algorithm with existing algorithms. We apply the proposed methodology to univariate SV models with jumps in order to identify jumps in the stock prices of the real dataset that motivated our research. To model the propagation of jumps across stocks and across time we combine the SV model with a doubly stochastic Poisson process, also known as a Cox process. The intensity of the jumps in the Poisson process is modelled using a dynamic factor model. Furthermore, we develop an MCMC algorithm to conduct Bayesian inference for the parameters and the latent states of the proposed model. We test the proposed methods on simulated data and apply them to our real dataset. We compare the predictions of future stock prices obtained using the proposed model with those obtained using existing models.
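A minimal simulation of a univariate stochastic volatility model with jumps, the class of model this abstract refers to, might look as follows. All parameter values and the particular jump specification are illustrative assumptions, not those fitted in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Univariate SV model with jumps (illustrative parameters):
#   h_t = mu + phi*(h_{t-1} - mu) + sigma_eta * eta_t   (latent log-variance)
#   y_t = exp(h_t / 2) * eps_t + J_t * k_t              (observed return)
# where J_t ~ Bernoulli(p_jump) indicates a jump and k_t is its size.
T = 500
mu, phi, sigma_eta = -1.0, 0.95, 0.2
p_jump, jump_scale = 0.02, 3.0

h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.normal()

jumps = rng.random(T) < p_jump
y = np.exp(h / 2) * rng.normal(size=T) + jumps * jump_scale * rng.normal(size=T)
print(jumps.sum(), "jumps in", T, "observations")
```

The inferential problem the thesis tackles is the reverse of this simulation: recovering the latent path h and the jump indicators from the observed y alone.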
The proposed model provides better predictions of future stock prices, which is an indication of a predictable part of the jump process of SV models. The MCMC algorithm implemented to conduct Bayesian inference for the aforementioned models is also employed in a demographic application: within the context of latent Gaussian models we present a novel approach to modelling and predicting the mortality rates of individuals.
Item: Bayesian regression models for mortality data
Alexopoulos, Angelis N.; Αλεξόπουλος, Αγγελής Ν.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
The main theme of this thesis is to provide forecasts of mortality rates using two different approaches. First, we employ dynamic nonlinear logistic models based on the Heligman-Pollard formula. Second, we assume that the dynamics of the mortality rates can be modeled through a Gaussian Markov random field. Both methodologies are tested on past data and are used to forecast mortality rates for England and Wales up to 30 years ahead. In this thesis we first carry out a Bayesian analysis of mortality data. Many researchers have noted that estimation of the parameters of the Heligman-Pollard model (Heligman and Pollard, 1980) is problematic because of the overparameterization of the model, and that, when using the weighted least squares approach for estimation, the numerical instabilities can be removed only by fixing two parameters to constants. These problems can be tackled with Bayesian methods. Dellaportas et al. (2001) adopt a Bayesian inference approach for the Heligman-Pollard model. This approach has the following advantages: first, the use of informative priors solves the problem of overparameterization; secondly, the non-normality of the likelihood surface means that the least squares estimates are inadequate. So, as a starting point, we repeated the work of Dellaportas et al.
(2001) for recent mortality data. We then focused on the important issue of dynamic modeling of mortality data. The approach adopted until now has been to first estimate the parameters of a mortality rates model for each year (or 5-year interval), and then to model the estimated parameters via a time series model. Clearly, the parameter uncertainty and the within-model parameter dependence are ignored. In this thesis we worked on this problem with data collected from the Human Mortality Database (www.mortality.org). Our first attempt was to assume a dynamic nonlinear generalized model, where the dynamic part is added to account for the dynamic projections of demographic characteristics across years. Inference for such a model is very hard, and only ad-hoc solutions for dynamic generalized linear models exist. The other model perspective that we attempted is based on non-isotropic Gaussian processes: here our data (mortality rates) were viewed as coming from a huge multivariate normal distribution, for which we estimated the correlation structure and, based on it, made forecasts of future mortality rates.
Item: Context tree weighting for signal processing, Bayesian inference, and model selection: theory and algorithms (23-11-2015)
Skoularidou, Maria; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
The goal of the present thesis is to explore, extend and utilize a family of algorithms that arose over the past twenty years in the Information Theory literature, under the umbrella of "Context Tree Weighting". Although these methods were originally motivated by and applied to problems in source coding and data compression, we argue that their range of applicability extends to a large variety of problems in statistical inference and signal processing. We examine the Maximum A Posteriori Probability Tree (MAPT) algorithm as an efficient method for Bayesian inference in the context of discrete time series data.
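The probability estimator that context tree weighting assigns to each context of the tree is the Krichevsky-Trofimov (KT) sequential estimator; a short, self-contained sketch for binary data:

```python
from fractions import Fraction

def kt_probability(bits):
    """Krichevsky-Trofimov sequential probability of a binary string.

    Each symbol b is predicted with probability (n_b + 1/2) / (n + 1),
    where n_b counts previous occurrences of b and n the total so far.
    This is the per-context building block of context tree weighting.
    """
    prob = Fraction(1)
    counts = [0, 0]
    for b in bits:
        prob *= Fraction(2 * counts[b] + 1, 2 * (counts[0] + counts[1]) + 2)
        counts[b] += 1
    return prob

# A mixed short string gets low probability; a constant one is favoured:
print(kt_probability([0, 1]))        # Fraction(1, 8)
print(kt_probability([0, 0, 0, 0]))  # Fraction(35, 128)
```

The full CTW/MAPT machinery then mixes (or maximizes) these per-context probabilities over all tree models up to a given depth; that recursion is beyond this sketch.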
The MAPT algorithm computes the maximum a posteriori probability tree model, as well as the corresponding model posterior probability. Experimental results are given illustrating its performance, both on independent data and on more complex signals generated by variable-memory Markov chains.
Item: An econometric analysis of high-frequency financial data (12/09/2021)
Lamprinakou, Fiori; Λαμπρινάκου, Φιόρη; Athens University of Economics and Business, Department of Statistics; Papaspiliopoulos, Omiros; Demiris, Nikolaos; Pedeli, Xanthi; Papastamoulis, Panagiotis; Tsionas, Mike; Damien, Paul; Dellaportas, Petros
We present and compare observation-driven and parameter-driven models for predicting integer price changes of high-frequency financial data. We explore Bayesian inference via Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) for the observation-driven activity-direction-size (ADS) model, introduced by Rydberg and Shephard [1998a, 2003]. We extend the ADS model by proposing a parameter-driven model, using a Bernoulli generalized linear model (GLM) with a latent process in the mean. We propose a new decomposition model that uses trade intervals and is applied to data that allow three possible tick movements: a one-tick-up price change, a one-tick-down price change, or no price change. We model each component sequentially using a binomial generalized linear autoregressive moving average (GLARMA) model, as well as a GLM with a latent process in the mean. We perform a simulation study to investigate the effectiveness of the proposed parameter-driven models using different algorithms within a Bayesian framework. We illustrate the analysis by modelling the transaction-by-transaction data of the E-mini Standard and Poor's (S&P) 500 index futures contract traded on the Chicago Mercantile Exchange's Globex platform between May 16th 2011 and May 24th 2011.
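The decomposition idea behind the ADS-style models above can be sketched as a toy simulation: a price change in {-1, 0, +1} ticks is built from an "activity" draw (does the price move?) and a "direction" draw (up or down). The probabilities here are hypothetical constants; in the actual models they are driven by time-varying, possibly latent, processes.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy activity/direction decomposition of tick-by-tick price changes
# (illustrative probabilities, not fitted values from the thesis).
T = 10_000
p_active, p_up = 0.3, 0.5

active = rng.random(T) < p_active                  # 1 if the price changes at all
direction = np.where(rng.random(T) < p_up, 1, -1)  # +1 tick or -1 tick, given a move
price_change = active * direction                  # values in {-1, 0, +1}

counts = {k: int((price_change == k).sum()) for k in (-1, 0, 1)}
print(counts)
```

Modelling each Bernoulli component separately, as the thesis does with GLARMA and latent-mean GLMs, keeps the discrete support exact while letting each component have its own dynamics.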
To assess predictive performance, we compare the mean square error (MSE) and mean absolute error (MAE) criteria, as well as four scalar performance measures derived from the confusion matrix, namely accuracy, sensitivity, precision and specificity.
Item: High dimensional time-varying covariance matrices with applications in finance (10-07-2011)
Plataniotis, Anastasios; Πλατανιώτης, Αναστάσιος; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
The scope of this Thesis is to provide an original contribution in the area of multivariate volatility modeling. Multivariate volatility modeling, in the present context, involves developing models that can adequately describe the covariance matrix process of multivariate financial time series. The development of efficient algorithms for Bayesian model estimation using Markov chain Monte Carlo (MCMC) and nested Laplace approximations is our main objective, in order to provide parsimonious and flexible volatility models. A detailed review of univariate volatility models for financial time series is first presented in this Thesis. We outline the historical background of each model proposed, discuss its properties and advantages, and comment on the several estimation methods that have emerged. We also provide a comparative analysis, via a small simulation example, of the dominant models in the literature. Continuing from the univariate models, we move on to the multivariate case and extensively present competing models for covariance matrices. The main argument presented is that currently no model is able to fully capture the dynamics of higher dimensional covariance matrices; their relative performance and applicability depend on the dataset and problem of interest.
Problems are mainly due to the positive definiteness constraints required by most models, as well as the lack of interpretability of the model parameters in terms of the characteristics of the financial series. In addition, model development so far has focused mostly on parameter estimation and in-sample fit; it is our goal to examine the out-of-sample fit perspective of these models. We conclude the review section by proposing some small improvements to existing models that lead towards more efficient parameter estimates, faster estimation methods and accurate forecasts. Subsequently, a new class of multivariate models for volatility is introduced. The new model is based on the spectral decomposition of the time-changing covariance matrix and on autoregressive modeling of its time-changing elements. In these models we allow a priori all the elements of the covariance matrix to be time-changing, as independent autoregressive processes; then, for any given dataset, we update our prior information and decide on the number of time-changing elements. Theoretical properties of the new model are presented, along with a series of estimation methods, Bayesian and classical. We conclude that, in order to estimate these models, one may use an MCMC method for small portfolios, in terms of the size of the covariance matrix. For higher dimensions, due to the curse of dimensionality, we propose a nested Laplace approximation approach that provides results much faster, with a small loss in accuracy. Once the new model and its estimation methods have been introduced, we compare its performance against competing models on simulated and real datasets; we examine its performance both in small portfolios of fewer than 5 assets and in the high dimensional case of up to 100 assets. Results indicate that the new model provides significantly better estimates and projections than current models in the majority of example datasets.
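The spectral decomposition on which the new model class is built can be sketched as follows; the autoregressive dynamics mentioned in the comment are a schematic of the approach, not the thesis's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(3)

# Spectral decomposition of a covariance matrix: Sigma = P diag(lambda) P'.
# In the time-varying model, the elements of the decomposition (e.g. the
# eigenvalues) are allowed to evolve as independent autoregressive processes.
A = rng.normal(size=(4, 4))
Sigma = A @ A.T + 4 * np.eye(4)            # a valid (positive definite) covariance

eigvals, P = np.linalg.eigh(Sigma)         # Sigma = P @ diag(eigvals) @ P.T
reconstructed = P @ np.diag(eigvals) @ P.T

# A time-varying version could evolve each eigenvalue on the log scale, e.g.
#   log(lambda_t) = a + b * log(lambda_{t-1}) + noise,
# which keeps every reconstructed Sigma_t positive definite by construction.
print(np.allclose(Sigma, reconstructed))   # True
```

Working on the decomposition rather than on Sigma directly is what sidesteps the positive definiteness constraint discussed above.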
We believe that even small improvements in forecasting are of significant importance in the finance industry. In addition, the new model allows for parameter interpretability and parsimony, which is of great importance due to the curse of dimensionality. Simplifying inference and prediction for multivariate volatility models was our initial goal and inspiration. It is our hope that we have made a small step in that direction, and that a new path for studying multivariate financial data series has been revealed. We conclude by providing some proposals for future research that we hope may encourage others to further develop this class of models.
Item: On variance reduction for Markov chain Monte Carlo (02-2012)
Tsourti, Zoi; Τσούρτη, Ζωή; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Kontoyannis, Ioannis
In the present thesis we are concerned with appropriate variance reduction methods for specific classes of Markov chain Monte Carlo (MCMC) algorithms. The variance reduction method of main interest here is that of control variates. More particularly, we focus on control variates of the form U = G − PG, for an arbitrary function G, where PG stands for the one-step ahead conditional expectation; these have been proposed by Henderson (1997). A key issue for the efficient implementation of control variates is the appropriate estimation of the corresponding coefficients. In the case of Markov chains, this involves the solution of the Poisson equation for the function of initial interest, which in most cases is intractable. Dellaportas & Kontoyiannis (2012) have further elaborated on this issue and have proven optimal results for the case of reversible Markov chains, avoiding that function. In this context, we concentrate on the implementation of those results for the Metropolis-Hastings (MH) algorithm, a popular MCMC technique.
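The control variate U = G − PG is easiest to demonstrate on a chain where PG is available in closed form. The toy AR(1) chain below is such a case (unlike the MH setting the thesis targets, where PG must itself be estimated); the choice of G here is the textbook Poisson-equation solution for this chain, used purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Control variate U = G - PG on an AR(1) chain x_t = rho*x_{t-1} + N(0,1).
# Target: estimate E[f(X)] with f(x) = x; the true value is 0.
rho, n = 0.8, 200_000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = rho * x[t-1] + rng.normal()

f = x
# For this chain the Poisson equation is solvable: G(x) = x/(1-rho) gives
# (PG)(x) = E[G(X_{t+1}) | X_t = x] = rho*x/(1-rho), so U = G - PG = x = f,
# i.e. the optimal control variate removes essentially all the variance.
G = x / (1 - rho)
PG = rho * x / (1 - rho)
U = G - PG                                     # control variate, E[U] = 0

theta = np.cov(f, U, ddof=0)[0, 1] / U.var()   # estimated optimal coefficient
plain = f.mean()
adjusted = plain - theta * U.mean()
print(plain, adjusted)  # the adjusted estimate collapses to (essentially) 0
```

In the MH setting no such closed form exists, which is exactly why the thesis studies Monte Carlo estimation of the PG terms.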
In the case of MH, the main issue of concern is the assessment of one-step ahead conditional expectations, since these are not usually available in closed form. The main contribution of this thesis is the development and evaluation of appropriate techniques for using the above type of control variates in the MH setting. The basic approach suggested is the use of the Monte Carlo method for estimating one-step ahead conditional expectations as empirical means. In the case of MH this is a straightforward task requiring minimal additional analytical effort. However, it is rather computationally demanding, and hence alternative methods are also suggested. These include importance sampling of the data already produced by the algorithm (that is, the initially proposed or finally accepted values), an additional application of the notion of control variates for the estimation of the PG's, and the parallel exploitation of values that are produced within an MH algorithm but not included in the resulting Markov chain (a hybrid strategy). The ultimate purpose is the establishment of a genuinely efficient strategy, that is, one where the variance reduction attained outweighs the additional computational cost imposed. The applicability and efficiency of the methods are illustrated through a series of diverse applications.
Item: Particle filters (04-2013)
Pierroutsakos, Konstantinos X.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
The subject of this master's thesis is the presentation of a methodology called particle filters, a class of sequential Monte Carlo (SMC) methods used to sample sequentially from a sequence of high-dimensional and complex probability distributions. A cloud of particles evolves over time, as new observations become available, to approximate the posterior distribution of the state variables.
Since they were first introduced by Gordon et al. (1993) [17], particle filters have been used in many applications, including signal and image processing, forecasting economic data, and tracking the position of aircraft or cars. In this thesis the particle filter algorithm is illustrated, as well as a relatively new technique introduced by Christophe Andrieu, Arnaud Doucet and Roman Holenstein (2010) [8], called particle Markov chain Monte Carlo (PMCMC), which combines MCMC with SMC methods. PMCMC is used when the state-space model depends on unknown parameters, a case in which standard particle filters fail. The main objective of this thesis is to illustrate the basics of these methodologies in a theoretical framework, together with a simple example.
Item: A simulation approach to the problem of moments (Athens University of Economics and Business, 1998)
Vargemidis, Nikolaos G.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
Thesis - Athens University of Economics and Business. Postgraduate, Department of Statistics
Item: Spatial statistics for methylated DNA genomes (Athens University of Economics and Business, 08-2012)
Vradi, Eleni; Dellaportas, Petros
Thesis - Athens University of Economics and Business. Postgraduate, Department of Statistics
Item: Time series and forecasting (Athens University of Economics and Business, 02-2015)
Kazatzi, Angeliki V.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
Thesis - Athens University of Economics and Business.
Postgraduate, Department of Statistics
Item: A reinforcement learning approach on matrix triangularization using rotations (21-11-2023)
Λώλος, Αλέξανδρος; Lolos, Alexandros; Athens University of Economics and Business, Department of Statistics; Demiris, Nikolaos; Livada, Alexandra; Dellaportas, Petros
In this thesis we study in detail the Householder transformation, Givens rotations, CORDIC Givens rotations, the Gram-Schmidt algorithm and its modified version. All of the methods mentioned so far are methods for QR factorization, differing in complexity and error. We also discuss the basic theory of reinforcement learning and explore the differences between algorithms such as SARSA and Q-learning. In the experiments chapter we cast QR factorization as a board game: using reinforcement learning, and specifically the Q-learning algorithm, we steer the Givens rotations along alternative paths of our algorithm, so that for particular input matrices the QR factorization is reached in fewer iterations.
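The Givens-rotation triangularization that the thesis recasts as a sequential decision problem can be sketched as follows. This is the standard bottom-up elimination order, not the learned ordering the thesis searches for.

```python
import numpy as np

def givens_qr(A):
    """QR factorization via Givens rotations: each subdiagonal entry is
    zeroed by a 2x2 rotation of two rows. Returns Q, R with A = Q @ R."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for j in range(n):                       # column by column
        for i in range(m - 1, j, -1):        # zero R[i, j] against row i-1
            a, b = R[i-1, j], R[i, j]
            r = np.hypot(a, b)
            if r == 0.0:
                continue                     # entry already zero
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])  # 2x2 Givens rotation
            R[[i-1, i], :] = G @ R[[i-1, i], :]
            Q[:, [i-1, i]] = Q[:, [i-1, i]] @ G.T
    return Q, R

rng = np.random.default_rng(6)
A = rng.normal(size=(4, 4))
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0))  # True True
```

Each rotation is one "move" on the board; the Q-learning agent in the thesis chooses which entry to eliminate next, trading rotation count against the fixed schedule shown here.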