
Working Papers


Archives: WP versions of published papers

  • "Testing for Alpha in Linear Factor Pricing Models with a Large Number of Securities", by M. Hashem Pesaran and Takashi Yamagata, March 2017

    Abstract: This paper proposes a novel test of zero pricing errors for the linear factor pricing model when the number of securities, N, can be large relative to the time dimension, T, of the return series. The test is based on Student t tests of individual securities and has a number of advantages over the existing standardised Wald type tests. It allows for non-Gaussianity and general forms of weakly cross correlated errors. It does not require estimation of an invertible error covariance matrix, it is much faster to implement, and is valid even if N is much larger than T. Monte Carlo evidence shows that the proposed test performs remarkably well even when T = 60 and N = 5,000. The test is applied to monthly returns on securities in the S&P 500 at the end of each month in real time, using rolling windows of size 60. Statistically significant evidence against the Sharpe-Lintner CAPM and Fama-French three factor models is found mainly during the recent financial crisis. We also find a significant negative correlation between twelve-month moving-average p-values of the test and excess returns of long/short equity strategies (relative to the return on the S&P 500) over the period November 1994 to June 2015, suggesting that abnormal profits are earned during episodes of market inefficiencies.
    JEL Classifications: C12, C15, C23, G11, G12.
    Key Words: CAPM, Testing for alpha, Weak and spatial error cross-sectional dependence, S&P 500 securities, Long/short equity strategy.
    Full Text:
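The core of the test is easy to sketch. The illustration below is a deliberately simplified version, not the authors' exact statistic: each security's excess return is regressed on the factors, the t-ratio of the estimated alpha is collected, and the sum of squared t-ratios is standardised. The published test adds degrees-of-freedom and error cross-correlation corrections omitted here; all function and variable names are illustrative.

```python
import numpy as np

def alpha_test_stat(R, F):
    """Standardised sum of squared alpha t-ratios (simplified sketch).

    R : (T, N) excess returns, F : (T, m) factor returns.
    Regresses each security on the factors, takes the t-ratio of the
    intercept (alpha), and standardises sum(t_i^2) using the large-T
    approximation E[t^2] = 1, Var[t^2] = 2 under the null of zero alphas.
    """
    T, N = R.shape
    X = np.column_stack([np.ones(T), F])      # intercept + factors
    k = X.shape[1]
    XtX_inv = np.linalg.inv(X.T @ X)
    B = XtX_inv @ X.T @ R                     # (k, N) OLS coefficients
    resid = R - X @ B
    s2 = (resid ** 2).sum(axis=0) / (T - k)   # per-security error variance
    se_alpha = np.sqrt(s2 * XtX_inv[0, 0])    # standard error of each alpha
    t = B[0] / se_alpha                       # alpha t-ratios
    return (t ** 2 - 1.0).sum() / np.sqrt(2.0 * N)

# Null model: returns generated with zero alphas
rng = np.random.default_rng(0)
T, N, m = 60, 200, 3
F = rng.standard_normal((T, m))
R0 = F @ rng.standard_normal((m, N)) + rng.standard_normal((T, N))
J0 = alpha_test_stat(R0, F)        # roughly standard normal under the null
J1 = alpha_test_stat(R0 + 0.5, F)  # far in the tail when alphas are non-zero
```

Note the T = 60, N = 200 configuration: the statistic needs no inversion of an N x N error covariance matrix, which is what keeps the approach feasible when N greatly exceeds T.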

  • "Double-question Survey Measures for the Analysis of Financial Bubbles and Crashes", by M. Hashem Pesaran and Ida Johnsson, December 2016, revised June 2017

    Abstract: This paper proposes a new double-question survey whereby an individual is presented with two sets of questions: one on beliefs about current asset values and another on price expectations. A theoretical asset pricing model with heterogeneous agents is advanced and the existence of a negative relationship between price expectations and asset valuations is established, which is tested using survey results on equity, gold and house prices. Leading indicators of bubbles and crashes are proposed and their potential value is illustrated in the context of a dynamic panel regression of realized house price changes across a number of key MSAs in the US.
    JEL Classifications: C83, D84, G12, G14.
    Key Words: Price expectations, bubbles and crashes, house prices, belief valuations.
    Full Text:

  • "Econometric Analysis of Production Networks with Dominant Units", by M. Hashem Pesaran and Cynthia Fan Yang, October 2016, revised August 2017

    Abstract: This paper considers production and price networks with unobserved common factors, and derives an exact expression for the rate at which aggregate fluctuations vary with the dimension of the network. It introduces the notions of strongly and weakly dominant and non-dominant units, and shows that at most a finite number of units in the network can be strongly dominant. The pervasiveness of a network is measured by the degree of dominance of the most pervasive unit in the network, and is shown to be equivalent to the inverse of the shape parameter of the power law fitted to the network outdegrees. New cross-section and panel extremum estimators for the degree of dominance of individual units in the network are proposed and their asymptotic properties investigated. Using Monte Carlo techniques, the proposed estimator is shown to have satisfactory small sample properties. An empirical application to US input-output tables spanning the period 1972 to 2007 is provided which suggests that no sector in the US economy is strongly dominant. The most dominant sector turns out to be the wholesale trade with an estimated degree of dominance ranging from 0.72 to 0.82 over the years 1972-2007.
    JEL Classifications: C12, C13, C23, C67, E32
    Key Words: Aggregate fluctuations, strongly and weakly dominant units, spatial models, outdegrees, degree of pervasiveness, power law, input-output tables, US economy.
    Full Text:

  • "Half-Panel Jackknife Fixed Effects Estimation of Panels with Weakly Exogenous Regressors", by Alexander Chudik, M. Hashem Pesaran and Jui-Chung Yang, SSRN Working Paper No. 281, September 2016

    Abstract: This paper considers estimation and inference in fixed effects (FE) linear panel regression models with lagged dependent variables and/or other weakly exogenous (or predetermined) regressors when N (the cross section dimension) is large relative to T (the time series dimension). The paper first derives a general formula for the bias of the FE estimator which is a generalization of the Nickell type bias derived in the literature for the pure dynamic panel data models. It shows that in the presence of weakly exogenous regressors, inference based on the FE estimator will result in size distortions unless N/T is sufficiently small. To deal with the bias and size distortion of the FE estimator when N is large relative to T, the use of a half-panel jackknife FE estimator is proposed and its asymptotic distribution is derived. It is shown that the bias of the proposed estimator is of order T^(-2), and for valid inference it is only required that N/T^3 → 0, as N, T → ∞ jointly. Extensions to panel data models with time effects (TE), for balanced as well as unbalanced panels, are also provided. The theoretical results are illustrated with Monte Carlo evidence. It is shown that the FE estimator can suffer from large size distortions when N > T, with the proposed estimator showing little size distortion. The use of the half-panel jackknife FE-TE estimator is illustrated with two empirical applications from the literature.
    JEL Classifications: C32, E17, E32, F44, F47, O51, Q43.
    Key Words: Panel Data Models, Weakly Exogenous Regressors, Lagged Dependent Variable, Fixed Effects, Time Effects, Unbalanced Panels, Half-Panel Jackknife, Bias Correction
    Full Text:
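The half-panel jackknife itself is a one-line combination of three FE estimates. The sketch below applies it to the within slope of a simple dynamic panel with a single regressor; names are illustrative, and the paper covers general weakly exogenous regressors, time effects and unbalanced panels.

```python
import numpy as np

def fe_slope(y, x):
    """Within (fixed effects) slope for a balanced panel; y, x are (N, T)."""
    yd = y - y.mean(axis=1, keepdims=True)    # demean within each unit
    xd = x - x.mean(axis=1, keepdims=True)
    return (xd * yd).sum() / (xd ** 2).sum()

def hpj_fe(y, x):
    """Half-panel jackknife FE estimator: combine the full-sample FE
    estimate with FE estimates from the two half panels so that the
    O(1/T) bias term cancels:
        theta_hpj = 2 * theta_full - (theta_half1 + theta_half2) / 2
    """
    h = y.shape[1] // 2
    full = fe_slope(y, x)
    half1 = fe_slope(y[:, :h], x[:, :h])
    half2 = fe_slope(y[:, h:], x[:, h:])
    return 2.0 * full - 0.5 * (half1 + half2)

# Dynamic panel y_it = alpha_i + rho * y_i,t-1 + e_it with short T, large N
rng = np.random.default_rng(1)
N, T, rho = 500, 21, 0.5
alpha = rng.standard_normal(N)
y = np.zeros((N, T + 50))
for t in range(1, T + 50):
    y[:, t] = alpha + rho * y[:, t - 1] + rng.standard_normal(N)
y = y[:, 50:]                                  # drop burn-in
fe = fe_slope(y[:, 1:], y[:, :-1])             # Nickell-biased downwards
hpj = hpj_fe(y[:, 1:], y[:, :-1])              # first-order bias removed
```

With T around 20 the plain FE estimate of rho sits visibly below the true value, while the jackknifed estimate lands much closer, which is the bias-correction mechanism the paper formalises.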

  • "Oil Prices and the Global Economy: Is It Different This Time Around?", by Kamiar Mohaddes and M. Hashem Pesaran, July 2016

    Abstract: The recent plunge in oil prices has brought into question the generally accepted view that lower oil prices are good for the US and the global economy. In this paper, using a quarterly multi-country econometric model, we first show that a fall in oil prices tends relatively quickly to lower interest rates and inflation in most countries, and increase global real equity prices. The effects on real output are positive, although they take longer to materialize (around 4 quarters after the shock). We then re-examine the effects of low oil prices on the US economy over different sub-periods using monthly observations on real oil prices, real equity prices and real dividends. We confirm the perverse positive relationship between oil and equity prices over the period since the 2008 financial crisis highlighted in the recent literature, but show that this relationship has been unstable when considered over the longer time period of 1946-2016. In contrast, we find a stable negative relationship between oil prices and real dividends which we argue is a better proxy for economic activity (as compared to equity prices). On the supply side, the effects of lower oil prices differ widely across the different oil producers, and could be perverse initially, as some of the major oil producers try to compensate for their loss of revenues by raising production. Taking demand and supply adjustments to oil price changes as a whole, we conclude that oil markets equilibrate but rather slowly, with large episodic swings between low and high oil prices.
    JEL Classifications: C32, E17, E32, F44, F47, O51, Q43.
    Key Words: Oil prices, equity prices, dividends, economic growth, oil supply, global oil markets, and international business cycle.
    Full Text:


  • "A One-Covariate at a Time, Multiple Testing Approach to Variable Selection in High-Dimensional Linear Regression Models", by Alexander Chudik, George Kapetanios and M. Hashem Pesaran, February 2016, revised November 2016

    Abstract: Model specification and selection are recurring themes in econometric analysis. Both topics become considerably more complicated in the case of large-dimensional data sets where the set of specification possibilities can become quite large. In the context of linear regression models, penalised regression has become the de facto benchmark technique used to trade off parsimony and fit when the number of possible covariates is large, often much larger than the number of available observations. However, issues such as the choice of a penalty function and tuning parameters associated with the use of penalised regressions remain contentious. In this paper, we provide an alternative approach that considers the statistical significance of the individual covariates one at a time, whilst taking full account of the multiple testing nature of the inferential problem involved. We refer to the proposed method as the One Covariate at a Time Multiple Testing (OCMT) procedure. The OCMT provides an alternative to penalised regression methods: It is based on statistical inference and is therefore easier to interpret and relate to the classical statistical analysis, it allows working under more general assumptions, it is faster, and performs well in small samples for almost all of the different sets of experiments considered in this paper. We provide extensive theoretical and Monte Carlo results in support of adding the proposed OCMT model selection procedure to the toolbox of applied researchers. The usefulness of OCMT is also illustrated by an empirical application to forecasting U.S. output growth and inflation.
    JEL Classifications: C52, C55
    Key Words: One covariate at a time, multiple testing, model selection, high dimensionality, penalised regressions, boosting, Monte Carlo experiments.
    Full Text:
    Supplement 1:
    Supplement 2:
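At its simplest, one OCMT stage reduces to N univariate regressions compared against a critical value that grows with N. The sketch below shows a single screening stage only (the published procedure iterates further stages conditioning on already-selected variables); names and the delta parameterisation are illustrative.

```python
import numpy as np
from statistics import NormalDist

def ocmt_screen(y, X, p=0.05, delta=1.0):
    """One-covariate-at-a-time screen (single OCMT-style stage, simplified).

    Regresses y on each column of X separately and keeps covariates whose
    absolute t-ratio exceeds c_p(N, delta) = Phi^{-1}(1 - p / (2 N^delta)),
    a critical value that accounts for the N tests being performed.
    """
    T, N = X.shape
    cp = NormalDist().inv_cdf(1.0 - p / (2.0 * N ** delta))
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    ssx = (Xc ** 2).sum(axis=0)
    b = (Xc * yc[:, None]).sum(axis=0) / ssx       # univariate slopes
    resid = yc[:, None] - Xc * b
    s2 = (resid ** 2).sum(axis=0) / (T - 2)
    t = b / np.sqrt(s2 / ssx)                      # t-ratio of each slope
    return np.flatnonzero(np.abs(t) > cp)

# y depends on only 3 of 100 candidate covariates
rng = np.random.default_rng(2)
T, N = 200, 100
X = rng.standard_normal((T, N))
y = X[:, 0] + X[:, 1] - X[:, 2] + rng.standard_normal(T)
selected = ocmt_screen(y, X)
```

With independent candidates a single pass typically recovers the signal variables with few false positives; handling correlated covariates and hidden signals is what the additional OCMT stages are for.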


  • "Quasi Maximum Likelihood Estimation of Spatial Models with Heterogeneous Coefficients", by Michele Aquaro, Natalia Bailey and M. Hashem Pesaran, June 2015

    Abstract: This paper considers spatial autoregressive panel data models and extends their analysis to the case where the spatial coefficients differ across the spatial units. It derives conditions under which the spatial coefficients are identified and develops a quasi maximum likelihood (QML) estimation procedure. Under certain regularity conditions, it is shown that the QML estimators of individual spatial coefficients are consistent and asymptotically normally distributed when both the time and cross section dimensions of the panel are large. It derives the asymptotic covariance matrix of the QML estimators allowing for the possibility of non-Gaussian error processes. Small sample properties of the proposed estimators are investigated by Monte Carlo simulations for Gaussian and non-Gaussian errors, and with spatial weight matrices of differing degrees of sparseness. The simulation results are in line with the paper's key theoretical findings and show that the QML estimators have satisfactory small sample properties for panels with moderate time dimensions and irrespective of the number of cross section units in the panel, under certain sparsity conditions on the spatial weight matrix.
    JEL Classifications: C21, C23
    Key Words: Spatial panel data models, heterogeneous spatial lag coefficients, identification, quasi maximum likelihood (QML) estimators, non-Gaussian errors.
    Full Text:


  • "Tests of Policy Ineffectiveness in Macroeconometrics", by M. Hashem Pesaran and Ron P. Smith, CAFE Research Paper No. 14.07, June 2014, revised January 2015

    Abstract: This paper considers tests of the null hypothesis of the ineffectiveness of a policy intervention, defined as a change in the parameters of a policy rule, in the context of a macroeconometric dynamic stochastic general equilibrium (DSGE) model. This is an ex post evaluation of an intervention in a single country, where data are available before and after the intervention. The tests are based on the difference between the realisations of the outcome variable of interest and counterfactuals based on no policy intervention, using only the pre-intervention parameter estimates, and in consequence the Lucas Critique does not apply. We show that such tests will have power to detect the effect of a policy intervention on a target outcome variable that changes the steady state value of that variable, e.g. the target inflation rate. They will have less power against interventions which do not change the steady state, since these typically only have transitory effects. Asymptotic distributions of the proposed tests are derived both when the post intervention sample is fixed as the pre-intervention sample expands, and when both samples rise jointly but at different rates. The performance of the test is illustrated by a simulated policy analysis of a three equation New Keynesian Model.
    JEL Classifications: C18, C54, E65.
    Key Words: Counterfactuals, policy analysis, policy ineffectiveness test, macroeconomics.
    Full Text:
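The logic of the test can be illustrated with a deliberately simple one-equation version, whereas the paper works with a full DSGE system and derives the asymptotics formally. The sketch below, with illustrative names, estimates a reduced form on pre-intervention data only, builds counterfactuals for the post-intervention sample, and tests whether the average realised-minus-counterfactual gap exceeds sampling noise.

```python
import numpy as np

def policy_ineffectiveness_stat(y, t0):
    """Mean-difference policy ineffectiveness statistic (sketch).

    Fits an AR(1) with intercept to the pre-intervention sample y[:t0],
    builds one-step counterfactuals for y[t0:] using only the
    pre-intervention parameter estimates, and returns an approximate
    t-ratio for the mean gap. The standard error here ignores
    parameter-estimation uncertainty, which the paper accounts for.
    """
    pre_y, pre_x = y[1:t0], y[:t0 - 1]
    X = np.column_stack([np.ones(t0 - 1), pre_x])
    (a, b), *_ = np.linalg.lstsq(X, pre_y, rcond=None)
    sigma = (pre_y - a - b * pre_x).std(ddof=2)
    d = y[t0:] - (a + b * y[t0 - 1:-1])       # realised minus counterfactual
    return d.mean() / (sigma / np.sqrt(len(d)))

def simulate(shift, n=300, t0=200, rho=0.5, seed=4):
    """AR(1) whose intercept rises by `shift` from period t0 onwards."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = (shift if t >= t0 else 0.0) + rho * y[t - 1] \
               + rng.standard_normal()
    return y

stat_policy = policy_ineffectiveness_stat(simulate(shift=2.0), 200)  # large
stat_none = policy_ineffectiveness_stat(simulate(shift=0.0), 200)    # small
```

Because the counterfactual uses only pre-intervention parameter estimates, the comparison does not require modelling how expectations change under the new rule, which is the sense in which the Lucas Critique does not apply.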


  • "A Multiple Testing Approach to the Regularisation of Large Sample Correlation Matrices", by Natalia Bailey, M. Hashem Pesaran and L. Vanessa Smith, CAFE Research Paper No. 14.05, May 2014, revised September 2016

    Abstract: This paper proposes a regularisation method for the estimation of large covariance matrices that uses insights from the multiple testing (MT) literature. The approach tests the statistical significance of individual pair-wise correlations and sets to zero those elements that are not statistically significant, taking account of the multiple testing nature of the problem. The effective p-values of the tests are set as a decreasing function of N (the cross section dimension), the rate of which is governed by the maximum degree of dependence of the underlying observations when their pair-wise correlation is zero, and the relative expansion rates of N and T (the time dimension). In this respect, the method specifies the appropriate thresholding parameter to be used under Gaussian and non-Gaussian settings. The MT estimator of the sample correlation matrix is shown to be consistent in the spectral and Frobenius norms, and in terms of support recovery, so long as the true covariance matrix is sparse. The performance of the proposed MT estimator is compared to a number of other estimators in the literature using Monte Carlo experiments. It is shown that the MT estimator performs well and tends to outperform the other estimators, particularly when N is larger than T.
    JEL Classifications: C13, C58.
    Key Words: High-dimensional data, Multiple testing, Non-Gaussian observations, Sparsity, Thresholding, Shrinkage.
    Full Text:
    Supplementary Material:
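The thresholding rule at the heart of the method can be sketched in a few lines. The version below is a simplification with illustrative names: it Bonferroni-corrects over all N(N-1)/2 pairs, whereas the paper derives the appropriate scaling of the effective p-values and its interaction with the relative rates of N and T.

```python
import numpy as np
from statistics import NormalDist

def mt_correlation(X, p=0.05):
    """Multiple-testing (MT) regularised correlation matrix (sketch).

    Computes the sample correlation matrix of the (T, N) data X and sets
    to zero every off-diagonal entry that is insignificant once the
    N(N-1)/2 simultaneous tests are accounted for, using the null
    approximation sqrt(T) * rho_hat ~ N(0, 1).
    """
    T, N = X.shape
    R = np.corrcoef(X, rowvar=False)
    n_tests = N * (N - 1) / 2.0
    thresh = NormalDist().inv_cdf(1.0 - p / (2.0 * n_tests)) / np.sqrt(T)
    R_mt = np.where(np.abs(R) >= thresh, R, 0.0)  # keep significant pairs
    np.fill_diagonal(R_mt, 1.0)
    return R_mt

# Independent series plus one genuinely correlated pair
rng = np.random.default_rng(3)
T, N = 100, 30
X = rng.standard_normal((T, N))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(T)
R_mt = mt_correlation(X)
```

The surviving entry for the correlated pair and the near-empty remainder of the matrix illustrate the support-recovery property the paper establishes under sparsity.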


  • "Transformed Maximum Likelihood Estimation of Short Dynamic Panel Data Models with Interactive Effects", by Kazuhiko Hayakawa, M. Hashem Pesaran and L. Vanessa Smith, CAFE Research Paper No. 14.06, May 2014

    Abstract: This paper proposes the transformed maximum likelihood estimator for short dynamic panel data models with interactive fixed effects, and provides an extension of Hsiao et al. (2002) that allows for a multifactor error structure. This is an important extension since it retains the advantages of the transformed likelihood approach, whilst at the same time allows for observed factors (fixed or random). Small sample results obtained from Monte Carlo simulations show that the transformed ML estimator performs well in finite samples and outperforms the GMM estimators proposed in the literature in almost all cases considered.
    JEL Classifications: C12, C13, C23.
    Key Words: short T dynamic panels, transformed maximum likelihood, multi-factor error structure, interactive fixed effects.
    Full Text:


  • "Uncertainty and Economic Activity: A Global Perspective", by Ambrogio Cesa-Bianchi, M. Hashem Pesaran and Alessandro Rebucci, March 2014

    Abstract: The 2007-2008 global financial crisis and the subsequent anemic recovery have rekindled academic interest in quantifying the impact of uncertainty on macroeconomic dynamics based on the premise that uncertainty causes economic activity to slow down and contract. In this paper, we study the interrelation between financial market volatility and economic activity assuming that both variables are driven by the same set of unobserved common factors. We further assume that these common factors affect volatility and economic activity with a time lag of at least a quarter. Under these assumptions, we show analytically that volatility is forward looking and that the output equation of a typical VAR estimated in the literature is mis-specified, as least squares estimates of this equation are inconsistent. Empirically, we document a statistically significant and economically sizable impact of future output growth on current volatility, and no effect of volatility shocks on business cycles, over and above those driven by the common factors. We interpret this evidence as suggesting that volatility is a symptom rather than a cause of economic instability.
    JEL Classifications: E44, F44, G15.
    Key Words: Uncertainty, Realized volatility, GVAR, Great Recession, Identification, Business Cycle, Common Factors.
    Full Text:


  • "Counterfactual Analysis in Macroeconometrics: An Empirical Investigation into the Effects of Quantitative Easing", by M. Hashem Pesaran and Ron P Smith, IZA Discussion Paper No. 6618, May 2012, revised June 2014

    Abstract: The policy innovations that followed the recent Great Recession, such as unconventional monetary policies, prompted renewed interest in the question of how to measure the effectiveness of such policy interventions. To test policy effectiveness requires a model to construct a counterfactual for the outcome variable in the absence of the policy intervention and a way to determine whether the differences between the realised outcome and the model-based counterfactual outcomes are larger than what could have occurred by chance in the absence of policy intervention. Pesaran & Smith (2014b) propose tests of policy ineffectiveness in the context of macroeconometric rational expectations dynamic stochastic general equilibrium models. When we are certain of the specification, estimation of the complete system imposing all the cross-equation restrictions implied by the full structural model is more efficient. But if the full model is misspecified, one may obtain more reliable estimates of the counterfactual outcomes from a parsimonious reduced form policy response equation, which conditions on lagged values, and on the policy measures and variables known to be invariant to the policy intervention. We propose policy ineffectiveness tests based on such reduced forms and illustrate the tests with an application to the unconventional monetary policy known as quantitative easing (QE) adopted in the UK.
    JEL Classifications: C18, C54, E65
    Key Words: Counterfactuals, policy analysis, policy ine¤ectiveness test, macroeconomics, quantitative easing (QE)
    Full Text:


  • "Testing CAPM with a Large Number of Assets", by M. Hashem Pesaran and Takashi Yamagata, CWPE Working Paper No. 1210, IZA Discussion Paper No. 6469, February 2012, under revision

    Abstract: This paper is concerned with testing the time series implications of the capital asset pricing model (CAPM) due to Sharpe (1964) and Lintner (1965), when the number of securities, N, is large relative to the time dimension, T, of the return series. In the case of cross-sectionally correlated errors, using a threshold estimator of the average squares of pair-wise error correlations a test is proposed and is shown to be valid even if N is much larger than T. Monte Carlo evidence shows that the proposed test works well in small samples. The test is then applied to all securities in the S&P 500 index with 60 months of return data at the end of each month over the period September 1989-September 2011. Statistically significant evidence against the Sharpe-Lintner CAPM is found mainly during the recent financial crisis. Furthermore, a strong negative correlation is found between twelve-month moving-average p-values of the test and the returns of long/short equity strategies relative to the return on the S&P 500 over the period December 2006 to September 2011, suggesting that abnormal profits are earned during episodes of market inefficiencies.
    JEL Classifications: C12, C15, C23, G11, G12
    Key Words: CAPM, Testing for alpha, Market efficiency, Long/short equity returns, Large panels, Weak and strong cross-sectional dependence.
    Full Text:


  • "Business Cycle Effects of Credit Shocks in a DSGE Model with Firm Defaults", by M. Hashem Pesaran and TengTeng Xu, CWPE Working paper. No. 1159, CESifo Working Paper No. 3609, IZA Discussion Paper No. 6027, October 2011, revised April 2016

    Abstract: This paper proposes a new theoretical framework for the analysis of the relationship between credit shocks, firm defaults and volatility. The key feature of the modelling approach is to allow for the possibility of default in equilibrium. The model is then used to study the impact of credit shocks on business cycle dynamics. It is assumed that firms are identical ex ante but differ ex post due to different realizations of firm-specific technology shocks, possibly leading to default by some firms. The implications of firm defaults for the balance sheets of households and banks and their subsequent impacts on business fluctuations are investigated within a dynamic stochastic general equilibrium framework. Results from a calibrated version of the model suggest that, in the steady state, a firm's default probability rises with its leverage ratio and the level of uncertainty in the economy. A positive credit shock, defined as a rise in the loan-to-deposit ratio, increases output, consumption, hours and productivity, and reduces the spread between loan and deposit rates. Interestingly, the effects of the credit shock tend to be highly persistent, even without price rigidities and habit persistence in consumption behavior.
    JEL Classifications: E32, E44, E50, G21.
    Key Words: Firm Defaults; Credit Shocks; Financial Intermediation; Interest Rate Spread; Uncertainty.
    Full Text:
    Supplement: 5Oct2011_Supplement.pdf


  • "Optimality and Diversifiability of Mean Variance and Arbitrage Pricing Portfolios", by M. Hashem Pesaran and Paolo Zaffaroni, CESifo Working Paper No. 2857, October 2009

    Abstract: This paper investigates the limit properties of mean-variance (mv) and arbitrage pricing (ap) trading strategies using a general dynamic factor model, as the number of assets diverges to infinity. It extends the results obtained in the literature for the exact pricing case to two other cases of asymptotic no-arbitrage and the unconstrained pricing scenarios. The paper characterizes the asymptotic behaviour of the portfolio weights and establishes that in the non-exact pricing cases the ap and mv portfolio weights are asymptotically equivalent and, moreover, functionally independent of the factors' conditional moments. By implication, the paper sheds light on a number of issues of interest such as the prevalence of short-selling, the number of dominant factors and the granularity property of the portfolio weights.
    JEL Classifications: C32, C52, C53, G11.
    Key Words: Large Portfolios, Factor Models, Mean-Variance Portfolio, Arbitrage Pricing, Market (Beta) Neutrality, Well Diversification.
    Full Text: