Engle, R. F. (Robert F.)
Most widely held works about R. F. Engle
Most widely held works by R. F. Engle
Anticipating correlations: a new paradigm for risk management by R. F. Engle (file)
15 editions published in 2009 in English and held by 1,305 libraries worldwide
Financial markets respond to information virtually instantaneously. Each new piece of information influences the prices of assets and their correlations with each other, and as the system rapidly changes, so too do correlation forecasts. This fast-evolving environment presents econometricians with the challenge of forecasting dynamic correlations, which are essential inputs to risk measurement, portfolio allocation, derivative pricing, and many other critical financial activities. In Anticipating Correlations, Nobel Prize-winning economist Robert Engle introduces an important new method for es…
Technical capabilities necessary for regulation of systemic financial risk: summary of a workshop by Workshop on Technical Capabilities Necessary for Regulation of Systemic Financial Risk (file)
4 editions published in 2010 in English and held by 946 libraries worldwide
Long-run economic relationships: readings in cointegration by Clive William John Granger (Book)
13 editions published between 1991 and 1992 in English and Undetermined and held by 425 libraries worldwide
ARCH: selected readings by R. F. Engle (Book)
15 editions published between 1995 and 2004 in English and Undetermined and held by 299 libraries worldwide
In the early 1980s, R. F. Engle pioneered the econometric technique of AutoRegressive Conditional Heteroskedasticity (ARCH), which has subsequently generated a considerable literature. This collection brings together the leading papers that have shaped ARCH research from its inception to the latest developments. The papers present both theory and financial market analysis, and discuss the key issues in the use of ARCH models to study volatility and correlation: which model to use, what time intervals to employ, how to model multivariate systems, how to apply the models to price and trade options, and how to model volatility spillovers across markets and within the day.
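The ARCH mechanism underlying this collection is compact enough to sketch in a few lines. The following is an illustrative simulation of an ARCH(1) process, not code from the book; the function name and parameter values are hypothetical:

```python
import math
import random

def simulate_arch1(omega, alpha, n, seed=0):
    """Simulate r_t = sqrt(h_t) * z_t with h_t = omega + alpha * r_{t-1}^2,
    Engle's ARCH(1): a large shock today raises tomorrow's conditional
    variance, producing the volatility clustering seen in asset returns."""
    rng = random.Random(seed)
    r_prev = 0.0
    returns, variances = [], []
    for _ in range(n):
        h = omega + alpha * r_prev ** 2          # conditional variance
        r_prev = math.sqrt(h) * rng.gauss(0.0, 1.0)
        returns.append(r_prev)
        variances.append(h)
    return returns, variances
```

For 0 < alpha < 1 the unconditional variance is omega / (1 - alpha), while the conditional variance alternates between quiet and turbulent spells.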
Cointegration, causality, and forecasting: a festschrift in honour of Clive W. J. Granger (Book)
7 editions published in 1999 in English and held by 297 libraries worldwide
"Clive W. J. Granger is a pioneer in econometrics, perhaps best known for his work on cointegration; this book is a collection of essays dedicated to him and his work. Central themes of Granger's work are reflected in the book, with attention given to tests for unit roots and cointegration, tests of misspecification, forecasting models and forecast evaluation, nonlinear and nonparametric econometric techniques, and, overall, a careful blend of practical empirical work and strong theory. The book shows the scope of Granger's research and the range of the profession that has been influenced by his work." (Book jacket)
Volatility and time series econometrics: essays in honor of Robert F. Engle (Book)
9 editions published between 2009 and 2010 in English and held by 220 libraries worldwide
This volume celebrates and develops the work of Nobel Laureate Robert Engle. It includes original contributions from some of the world's leading econometricians that further Engle's work in time series economics.
Handbook of econometrics by James J. Heckman (file)
20 editions published between 1983 and 1994 in English and held by 218 libraries worldwide
Handbook of econometrics, v. 4
Handbook of econometrics (Book)
19 editions published between 1994 and 2007 in English and held by 89 libraries worldwide
The econometrics of ultra-high frequency data by R. F. Engle (Book)
15 editions published between 1996 and 2000 in English and held by 85 libraries worldwide
Abstract: Ultra-high frequency data are complete transactions data, which inherently arrive at random times. Marked point processes provide a theoretical framework for the analysis of such data sets. The ACD model developed by Engle and Russell (1995) is applied to IBM transactions data to develop semiparametric hazard estimates and measures of instantaneous conditional variances. The variances are negatively influenced by surprisingly long durations, as suggested by some of the market microstructure literature.
Option hedging using empirical pricing kernels by Joshua Rosenberg (Book)
13 editions published in 1997 in English and held by 83 libraries worldwide
Abstract: This paper develops a method for option hedging that is consistent with time-varying preferences and probabilities. The preferences are expressed in the form of an empirical pricing kernel (EPK), which measures the state price per unit probability, while probabilities are derived from an estimated stochastic volatility model of the GARCH-components-with-leverage form. State prices are estimated using the flexible risk-neutral density method of Rosenberg (1995) and a daily cross-section of option premia. Time-varying preferences over states are linked to a dynamic model of the underlying price to obtain a one-day-ahead forecast of derivative price distributions and minimum variance hedge ratios. Empirical results suggest that risk aversion over S&P 500 return states is substantially higher than the risk aversion implied by Black-Scholes state prices and probabilities using long-run estimates of S&P 500 return moments. It is also found that the daily level of risk aversion is strongly positively autocorrelated, negatively correlated with S&P 500 price changes, and positively correlated with the spread between implied and objective volatilities. Hedging results reveal that typical hedging techniques for out-of-the-money S&P 500 index options, such as Black-Scholes or historical minimum variance hedging, are inferior to the EPK hedging method. Thus, time-varying preferences and probabilities appear to be an important factor in the day-to-day pricing of S&P 500 options.
GARCH gamma by R. F. Engle (Book)
12 editions published between 1995 and 1998 in English and held by 82 libraries worldwide
Abstract: This paper addresses the issue of hedging option positions when the underlying asset exhibits stochastic volatility. By parameterizing the volatility process as GARCH and utilizing risk-neutral valuation, we estimate hedging parameters (delta and gamma) using Monte Carlo simulation. We estimate hedging parameters for options on the Standard and Poor's 500 index, a bond futures index, a weighted foreign exchange rate index, and an oil futures index. We find that Black-Scholes (BS) and GARCH deltas are similar for all the options considered, while GARCH gammas are significantly higher than BS gammas for all options. For near-the-money options, GARCH gamma hedge ratios are higher than BS hedge ratios when hedging a long-term option with a short-term option. Away from the money, GARCH gamma hedge ratios are lower than BS.
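The simulation approach the abstract describes — risk-neutral Monte Carlo under a GARCH volatility process, with hedge parameters taken from finite differences — might be sketched as follows. Function names and parameter values are illustrative, not the paper's, and the discretization is a simplified one (zero interest rate, mean-corrected log returns):

```python
import math
import random

def mc_garch_call(s0, strike, omega, alpha, beta, h0, steps, paths, seed=0):
    """Monte-Carlo price of a European call (zero interest rate) when the
    log return follows a GARCH(1,1) conditional variance process."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(paths):
        s, h, shock = s0, h0, 0.0
        for _ in range(steps):
            h = omega + alpha * shock ** 2 + beta * h
            shock = math.sqrt(h) * rng.gauss(0.0, 1.0)
            s *= math.exp(-0.5 * h + shock)   # mean-corrected log return
        total += max(s - strike, 0.0)
    return total / paths

def mc_garch_delta(s0, strike, bump=0.5, **kw):
    """Central finite-difference delta; reusing the same seed gives common
    random numbers across the bumped valuations, so most noise cancels."""
    up = mc_garch_call(s0 + bump, strike, **kw)
    down = mc_garch_call(s0 - bump, strike, **kw)
    return (up - down) / (2.0 * bump)
```

A gamma estimate would follow the same pattern with a second central difference.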
Hedging options in a GARCH environment: testing the term structure of stochastic volatility models by R. F. Engle (Book)
13 editions published in 1994 in English and held by 82 libraries worldwide
Abstract: This paper develops a methodology for testing the term structure of volatility forecasts derived from stochastic volatility models, and implements it to analyze models of S&P 500 index volatility. Volatility models are compared by their ability to hedge option positions sensitive to the term structure of volatility. Overall, the most effective hedge is a Black-Scholes (BS) delta-gamma hedge, while the BS delta-vega hedge is the least effective. The most successful volatility hedge is GARCH-components delta-gamma, suggesting that the GARCH-components estimate of the term structure of volatility is the most accurate. The success of the BS delta-gamma hedge may be due to mispricing in the options market over the sample period.
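The term structure of volatility that these hedges depend on has a simple closed form for a plain GARCH(1,1): multi-step variance forecasts decay geometrically toward the unconditional level. A minimal sketch, with hypothetical parameter values:

```python
def garch11_term_structure(omega, alpha, beta, h_next, horizon):
    """k-step-ahead conditional variance forecasts from a GARCH(1,1):
        E_t[h_{t+k}] = vbar + (alpha + beta)**(k-1) * (h_{t+1} - vbar),
    where vbar = omega / (1 - alpha - beta) is the unconditional variance
    and h_next is the one-step-ahead variance h_{t+1}."""
    vbar = omega / (1.0 - alpha - beta)
    return [vbar + (alpha + beta) ** (k - 1) * (h_next - vbar)
            for k in range(1, horizon + 1)]
```

The persistence alpha + beta controls how slowly a volatility shock dies out of the term structure, which is exactly what a term-structure-sensitive hedge is exposed to.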
Measuring, forecasting, and explaining time varying liquidity in the stock market by R. F. Engle (Book)
12 editions published in 1997 in English and held by 81 libraries worldwide
Abstract: The paper proposes a new measure of market liquidity, VNET, which directly measures the depth of the market. The measure is constructed from the excess volume of buys or sells during a market event defined by a price movement. As this measure varies over time, it can be forecast and explained. Using TORQ data, it is found that market depth varies positively but less than proportionally with past volume, and negatively with the number of transactions. Both findings suggest that, over time, high volumes are associated with an influx of informed traders and reduce market liquidity. High expected volatility, as measured by the ACD model of Engle and Russell (1995), and wide spreads both reduce expected depth. If the asymmetric trades are transacted in shorter-than-expected times, the costs will be greater, giving an estimate of the value of patience.
Forecasting transaction rates: the autoregressive conditional duration model by R. F. Engle (Book)
14 editions published between 1994 and 1995 in English and held by 81 libraries worldwide
Abstract: This paper proposes a new statistical model for the analysis of data that do not arrive in equal time intervals, such as financial transactions data, telephone calls, or sales data on commodities that are tracked electronically. In contrast to fixed-interval analysis, the model treats the time between observation arrivals as a stochastic time-varying process, and is therefore in the spirit of the models of time deformation initially proposed by Tauchen and Pitts (1983) and Clark (1973), and more recently discussed by Stock (1988), Lamoureux and Lastrapes (1992), Muller et al. (1990), and Ghysels and Jasiak (1994), but it does not require auxiliary data or assumptions on the causes of time flow. Strong evidence is provided for duration clustering beyond a deterministic component for the financial transactions data analyzed. We show that a very simple version of the model can successfully account for the significant autocorrelations in the observed durations between trades of IBM stock on the consolidated market. A simple transformation of the duration data allows us to include volume in the model.
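The core ACD recursion — expected duration driven by past durations, like a GARCH for time between trades — is compact enough to sketch. Below is an illustrative simulator for an ACD(1,1) with unit-exponential innovations; the parameter values are hypothetical:

```python
import random

def simulate_acd(omega, alpha, beta, n, seed=0):
    """Simulate durations x_i = psi_i * eps_i, eps_i ~ Exp(1), where the
    conditional expected duration follows the ACD(1,1) recursion
        psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
    A long duration raises the expected length of the next one, which is
    the duration clustering the paper documents."""
    rng = random.Random(seed)
    psi = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    x = psi
    durations = []
    for _ in range(n):
        psi = omega + alpha * x + beta * psi
        x = psi * rng.expovariate(1.0)
        durations.append(x)
    return durations
```

With alpha + beta < 1 the unconditional mean duration is omega / (1 - alpha - beta).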
Modeling the impacts of market activity on bid-ask spreads in the option market by Young-Hye Cho (Book)
13 editions published in 1999 in English and held by 80 libraries worldwide
Abstract: In this paper, we examine the impact of market activity on the percentage bid-ask spreads of S&P 100 index options using transactions data. We propose a new market microstructure theory, which we call derivative hedge theory, in which option market percentage spreads are inversely related to the option market maker's ability to hedge his positions in the underlying market, as measured by the liquidity of that market. In a perfect-hedge world, spreads arise from the illiquidity of the underlying market, rather than from inventory risk or informed trading in the option market itself. We find that option market volume is not a significant determinant of option market spreads. This finding leads us to question the use of volume as a measure of liquidity, and it supports the derivative hedge theory. Option market spreads are positively related to spreads in the underlying market, again supporting our theory. However, option market duration does affect option market spreads, with very slow and very fast option markets both leading to bigger spreads. The fast-market result would be predicted by asymmetric information theory, while the inventory model predicts big spreads in slow markets. Neither result would be observed if the underlying securities market provided a perfect hedge. We interpret these mixed results as meaning that the option market maker is able to hedge his positions in the underlying securities market only imperfectly. Our result of insignificant options volume casts doubt on the price discovery argument between the stock and option markets (Easley, O'Hara, and Srinivas (1998)): asymmetric information costs in either market are naturally passed to the other through the market maker's hedging, and it is therefore unimportant where informed traders trade.
Time-varying betas and asymmetric effects of news: empirical analysis of blue chip stocks by Young-Hye Cho (Book)
11 editions published in 1999 in English and held by 77 libraries worldwide
We investigate whether a beta increases with bad news and decreases with good news, just as volatility does. Using daily returns for nine stocks in a double-beta model with EGARCH specifications, we show that news asymmetrically affects the betas of individual stocks. We find that betas depend on two sources of news: market shocks and idiosyncratic shocks. Some stock betas depend on both, while others depend on only one. We categorize each stock return as belonging to one of three beta process models (joint, idiosyncratic, or market) based on the role of market shocks and idiosyncratic shocks. Our conclusions differ from those of Brown, Nelson, and Sunnier (1995), who worked with monthly aggregated data in a bivariate EGARCH model. We believe that the stock price aggregation in this previous research resulted in a loss of cross-sectional variation and consequently led to weak results. If the asymmetric effect is more readily apparent in daily data, this may also explain previous researchers' inability to detect asymmetric effects. Our findings shed light on the controversy as to whether abnormalities in stock returns result from overreaction to information or from changes in expected returns in an efficient market. Finding an asymmetric effect in betas leads us to conclude that abnormalities can, at least partially, be explained by changes in expected returns through a change in beta.
Theoretical and empirical properties of Dynamic Conditional Correlation Multivariate GARCH by R. F. Engle (Book)
12 editions published in 2001 in English and held by 73 libraries worldwide
Abstract: In this paper, we develop the theoretical and empirical properties of a new class of multivariate GARCH models capable of estimating large time-varying covariance matrices: Dynamic Conditional Correlation Multivariate GARCH. We show that the problem of multivariate conditional variance estimation can be simplified by estimating univariate GARCH models for each asset and then, using the transformed residuals from the first stage, estimating a conditional correlation estimator. The standard errors for the first-stage parameters remain consistent, and only the standard errors for the correlation parameters need be modified. We use the model to estimate the conditional covariance of up to 100 assets using S&P 500 Sector Indices and Dow Jones Industrial Average stocks, and we conduct specification tests of the estimator using an industry-standard benchmark for volatility models. The new estimator demonstrates very strong performance, especially considering its ease of implementation.
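The two-stage logic — univariate GARCH models first, then a correlation recursion on the standardized residuals — can be illustrated for the bivariate case. This sketch implements only the second stage with scalar parameters a and b; the function name, the inputs (assumed to be GARCH-standardized residuals), and the parameter values are hypothetical:

```python
def dcc_correlations(e1, e2, a, b):
    """Second-stage DCC recursion for two series of standardized residuals:
        q_ij,t = (1 - a - b) * qbar_ij + a * e_{i,t-1} e_{j,t-1} + b * q_ij,t-1
    with the dynamic correlation read off as rho_t = q12 / sqrt(q11 * q22)."""
    n = len(e1)
    qbar11 = sum(x * x for x in e1) / n
    qbar22 = sum(y * y for y in e2) / n
    qbar12 = sum(x * y for x, y in zip(e1, e2)) / n
    q11, q22, q12 = qbar11, qbar22, qbar12
    rhos = []
    for x, y in zip(e1, e2):
        rhos.append(q12 / (q11 * q22) ** 0.5)
        q11 = (1 - a - b) * qbar11 + a * x * x + b * q11
        q22 = (1 - a - b) * qbar22 + a * y * y + b * q22
        q12 = (1 - a - b) * qbar12 + a * x * y + b * q12
    return rhos
```

The normalization by sqrt(q11 * q22) is what keeps each rho_t inside [-1, 1], so the implied covariance matrix stays well behaved however many assets are stacked up.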
CAViaR: conditional value at risk by quantile regression by R. F. Engle (Book)
10 editions published in 1999 in English and held by 69 libraries worldwide
Abstract: Value at Risk has become the standard measure of market risk employed by financial institutions for both internal and regulatory purposes. Despite its conceptual simplicity, its measurement is a very challenging statistical problem, and none of the methodologies developed so far gives a satisfactory solution. Interpreting Value at Risk as a quantile of future portfolio values conditional on current information, we propose a new approach to quantile estimation that does not require any of the extreme assumptions invoked by existing methodologies (such as normality or i.i.d. returns). The Conditional Autoregressive Value at Risk, or CAViaR, model moves the focus of attention from the distribution of returns directly to the behavior of the quantile. We postulate a variety of dynamic processes for updating the quantile and use regression quantile estimation to determine the parameters of the updating process. Tests of model adequacy utilize the criterion that, in each period, the probability of exceeding the VaR must be independent of all past information. We use a differential evolutionary genetic algorithm to optimize an objective function that is non-differentiable and hence cannot be optimized using traditional algorithms. Applications to simulated and real data provide empirical support for our methodology and illustrate the ability of these algorithms to adapt to new risk environments.
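One of the dynamic quantile processes considered in this literature is the symmetric absolute value specification, in which yesterday's return moves today's VaR estimate. A minimal sketch of that recursion follows; the coefficients here are simply given, whereas in the paper they would be fit by regression quantile estimation:

```python
def caviar_sav(returns, beta0, beta1, beta2, f0):
    """Symmetric-absolute-value CAViaR: the conditional quantile evolves as
        f_t = beta0 + beta1 * f_{t-1} + beta2 * |r_{t-1}|,
    so a big move in either direction pushes the VaR estimate out, and the
    beta1 term makes it mean-revert slowly afterwards."""
    f = [f0]
    for r in returns[:-1]:
        f.append(beta0 + beta1 * f[-1] + beta2 * abs(r))
    return f
```

Note that the recursion models the quantile directly, with no distributional assumption on the returns — which is the point of the CAViaR approach.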
Estimating sectorial cycles using cointegration and common features by R. F. Engle (Book)
11 editions published in 1993 in English and held by 69 libraries worldwide
This paper investigates the degree of short-run and long-run comovement in U.S. sectoral output data by estimating sectoral trends and cycles. A theoretical model based on Long and Plosser (1983) is used to derive a reduced form for sectoral output from first principles. Cointegration and common features (cycles) tests are performed, and the sectoral output data appear to share a relatively high number of common trends and a relatively low number of common cycles. A special trend-cycle decomposition of the data set is performed, and the results indicate very similar cyclical behavior across sectors and very different behavior for trends. In a variance decomposition exercise, for prominent sectors such as Manufacturing and Wholesale/Retail Trade, the cyclical innovation is more important than the trend innovation.
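The cointegration tests used here build on the Engle-Granger two-step idea: regress one trending series on another and examine whether the residuals are stationary. The first step is ordinary least squares, sketched below with illustrative data; the second step, a unit-root test on the residuals, is omitted:

```python
def engle_granger_step1(y, x):
    """First stage of the Engle-Granger two-step procedure: OLS of y on x,
    returning (intercept, slope, residuals). If y and x are cointegrated,
    the residuals are stationary even though y and x each have unit roots;
    the second step would run an ADF-type test on them."""
    n = len(y)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [yi - intercept - slope * xi for xi, yi in zip(x, y)]
    return intercept, slope, resid
```

The second-step unit-root test must use cointegration-specific critical values rather than the ordinary Dickey-Fuller tables, because the residuals come from a fitted regression.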
Index-option pricing with stochastic volatility and the value of accurate variance forecasts by R. F. Engle (Book)
12 editions published in 1993 in English and held by 64 libraries worldwide
Abstract: In pricing primary-market options and in making secondary markets, financial intermediaries depend on the quality of forecasts of the variance of the underlying assets. Hence, the gain from improved pricing of options is a measure of the value of such a forecast. NYSE index returns over the period 1968-1991 are used to suggest that pricing index options of up to 90 days' maturity would be more accurate when: (1) using ARCH specifications in place of a moving average of squared returns; (2) using Hull and White's (1987) adjustment for stochastic variance in Black and Scholes's (1973) formula; and (3) accounting explicitly for weekends and the slowdown of variance whenever the market is closed.
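Point (2), Hull and White's (1987) adjustment, amounts under their independence assumptions to pricing with the average expected variance over the option's life instead of a constant one. A minimal sketch, assuming zero dividends and illustrative inputs:

```python
import math

def bs_call(s, k, t, r, sigma):
    """Black-Scholes European call price (no dividends)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    ncdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * ncdf(d1) - k * math.exp(-r * t) * ncdf(d2)

def call_with_variance_path(s, k, t, r, h_path):
    """Hull-White-style approximation: plug the average forecast variance
    over the option's life (e.g. from an ARCH term structure) into the
    Black-Scholes formula in place of a constant variance."""
    avg_var = sum(h_path) / len(h_path)
    return bs_call(s, k, t, r, math.sqrt(avg_var))
```

When the forecast variance path is rising (or falling), this average differs from the spot variance, which is where an ARCH forecast adds pricing value over a flat moving-average estimate.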
Associated Subjects
Bartholomew, Harland; Business cycles--Econometric models; Business forecasting; Capital assets pricing model; Cointegration; Correlation (Statistics); Econometric models; Econometrics; Economic forecasting--Econometric models; Economic forecasting--Mathematical models; Economic forecasting--Statistical methods; Economic history; Economics; Engle, R. F. (Robert F.); Estimation theory; Finance--Econometric models; Finance--Mathematical models; Financial engineering; Financial futures--Econometric models; Financial risk; Financial risk management; Granger, C. W. J. (Clive William John); Haavelmo, Trygve; Heckman, James J. (James Joseph); Hedging (Finance); Hedging (Finance)--Econometric models; Heteroscedasticity; Liquidity (Economics); Liquidity (Economics)--Econometric models; McFadden, Daniel; Options (Finance); Options (Finance)--Prices--Econometric models; Parameter estimation; Rate of return--Forecasting--Econometric models; Risk management; Risk management--Econometric models; Risk management--Mathematical models; Risk perception--Econometric models; Securities; Stochastic processes; Stochastic processes--Econometric models; Stock exchanges--Econometric models; Stock options--Econometric models; Stock price forecasting; Stock price forecasting--Econometric models; Stocks--Econometric models; Tibbitts, Armand; Time-series analysis; Time-series analysis--Econometric models; United States

Alternative Names
Engle, R. F.; Engle, R. F., 1942-; Engle, Rob, 1942-; Engle, Robert; Engle, Robert, 1942-; Engle, Robert F.