Lars Peter Hansen


Thank you, it’s my great privilege to be here today and to be part of this very nice celebration. Before I get started here, I have been asked by my family to convey some congratulations. My father-in-law, Sho-Chieh Tsiang, and mother-in-law, Hsi-Tsin Tsiang, have been long-standing friends of Gregory and Paula Chow. Hsi-Tsin Tsiang and her daughter, Grace Tsiang, send their congratulations and were very pleased to see Gregory honored in such a special way. Grace would also like to send congratulations to Xiaohong Chen. Both Grace and I have known Xiaohong for many years, and it’s been our privilege to watch her develop intellectually. It’s been a lot of fun for us.

Today, I thought I would take a little bit of a different perspective on the research of Xiaohong Chen and Gregory Chow.  Rather than giving an in-depth characterization of their specific contributions, I am going to describe why I believe this research to be fundamentally important, and I will suggest productive ways it could be applied and extended in the future. So, let’s get down to business.

Gregory Chow has made many contributions, but before I delve into them, it's valuable to get a sense of how he nurtured scholarship. I first got to know him when I was straight out of graduate school. I was collaborating on research with Thomas Sargent, and I was sent by Sargent to a conference that he could not attend. The conference was sponsored by the Society of Economic Dynamics and Control. At the time, Gregory was the President of the Society and was an acknowledged intellectual. He was very kind to welcome me at the conference. He made sure that I was well treated and received lots of wonderful feedback. For me, the conference was a valuable experience at the outset of my professional career and one that I truly appreciated.

Today I will add some context to three of Gregory's important contributions: a) testing for structural breaks, b) developing and adapting control theory to economic applications, and c) the econometric modeling of the Chinese economy. For Xiaohong Chen, I will discuss some of her research connected to a) sieve-based methods for conditional moment estimation, b) copula-based methods for time series econometrics, and c) temporal dependence in nonlinear models. There is much jargon attached to these different lines of research. Instead of delving into specifics, I will indicate why I think these methods and ideas are truly important in econometric practice.

So, let me start with a peculiar slide. It contains a couple of pictures spliced together, and let me give a little bit of context for why I did this. Looking at this slide you might not see an obvious connection to Gregory and Xiaohong's papers. But let me try to convince you otherwise. It turns out that probability theory, prior to Jacob Bernoulli, was developed to analyze games of chance. For instance, when people throw dice or flip coins, we know probabilities, but we don't know outcomes. As I have been educated by my colleague Stephen Stigler, who is an expert in the history of statistics, the first person who really looked at social scientific data using probabilistic methods was Jacob Bernoulli. His research was conducted over 300 years ago. His featured result became known as the so-called "Law of Large Numbers." So, I think of Jacob Bernoulli as a person who took probability theory from a situation in which we know probabilities to one in which we use social scientific evidence to figure them out. It is the advent of statistics applied to the social sciences.
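Bernoulli's result is easy to illustrate numerically. The following is a minimal sketch, purely for illustration: an observer who does not know a coin's probability of heads in advance can recover it from the empirical frequency as the number of observations grows.

```python
import numpy as np

# Simulated fair-coin flips; the "observer" does not know the coin's bias.
rng = np.random.default_rng(42)
flips = rng.integers(0, 2, size=100_000)

# Running empirical frequency of heads after 1, 2, 3, ... flips.
running_mean = flips.cumsum() / np.arange(1, flips.size + 1)

# Early estimates are noisy; the long-run estimate settles near 0.5,
# which is Bernoulli's Law of Large Numbers in action.
print(running_mean[9], running_mean[-1])
```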

So, on your left, you can see a picture of Jacob Bernoulli. What's he doing? Jacob Bernoulli is trying to look at observed behavior and figure things out about it. Why is economics different from physics? To capture this distinction, I have Jacob Bernoulli looking out on a marketplace. The painting, by Pissarro, is called "The Market Place." To build economic models, what do we do? We include people in those models, along with enterprises and other forms of economic activity. These economic players also have to try to figure stuff out. In the context of the painting, in advance of their trip to the marketplace, the sellers cannot be fully confident about the amount of demand there will be for their goods. They also don't know what prices will be agreed to. In advance of their trip to the marketplace, they have to make production decisions based on guesses about what's going to happen in the future. The usual role of a statistician is that of Jacob Bernoulli, but for an economist as a model builder, there is another role for a statistician that I consider to be every bit as important. People inside the models economists build have to speculate about the future when making decisions.

So, I see two roles for statistics in economic analysis, and whenever I think about contributions to statistics and econometrics, I find it valuable to put these two roles on the table. The first is an outside-of-a-model perspective. By that, I mean a Bernoulli-like perspective: look at the economy through the lens of economic models, estimate unknown parameters of these models, and assess their implications. The second emerges when you look at how a model is constructed: we have economic actors, people or enterprises, inside the model that also face statistical challenges when coping with uncertainty. How they behave has consequences for market outcomes and resource allocations. So, as we explore the importance of statistical advances, we should remind ourselves of these two distinct roles for statistics and how they are useful in economic analyses.

I also like to think about uncertainty in broader terms than is typical in economic analyses. With this in mind, I find the following categorization of uncertainty to be revealing. One category is risk. When we teach economics classes, we often make reference to risk aversion. Formal references to risk aversion typically presume that we know probabilities, but we don't know outcomes. I think of this as uncertainty within a model. A second category is what I will call ambiguity: a situation in which we are unsure which is the correct model, or in which there are unknown parameters within a given model. This captures uncertainty across models and poses a challenge as to what weights to assign to the alternative possible models. A third category is the potential for model misspecification. All models are abstractions; they are simplifications, and therefore they are necessarily wrong. Uncertainty comes into play as we struggle with how to use models with unknown flaws sensibly. If the flaws in a model were known, we would just go back and fix them. Consequently, it is the potential for unknown flaws that gives rise to this third component of uncertainty. In many respects, this third one is the hardest to address, but it may also be the most important.

One of the reasons I find the statistical tools central to both Gregory's and Xiaohong's research so important, even if at times they are under-appreciated, is that they provide a framework for understanding complexity in the environment. If we put economic actors or agents inside a complicated environment, it will be challenging for them to figure things out. Similarly, when researchers are trying to sort out a complex world, it will be challenging for them to construct credible models. Statistical methods and statistical thinking help us conceptualize when the uncertainties we face are more or less complicated.

When is it challenging to learn and to draw inferences? Economics has always been about the behavior of economic actors or agents. The so-called "behavioral approach" to economics and finance features appeals to psychology, albeit at times in superficial ways. While I do not doubt the potential value of psychology, I also embrace the hope for more applications of statistical ideas. Arguably, there is more scope for so-called behavioral distortions when people are placed in a more complex setting, making learning and inference more of a struggle. Such conjectures lead naturally to questions like the following: How might statistical uncertainty induce fluctuations in market prices? How might it affect resource allocations? And how should statistical uncertainty alter how we design policy responses to economic problems?


What are some advances that address these issues in the recent award we just heard about?  How do these advances contribute to statistics and econometrics?


Gregory Chow conducted the initial and important work on testing for structural breaks, or structural change. Suppose we start with a model specified in a tractable and simple parametric form. Gregory's test was designed to assess whether the underlying parameter vector is invariant over time or whether it might have changed. Gregory's initial contribution opened the door to assessing models in more complex environments than envisioned in the original parametric specification. In these more complex settings, parameters change, drift, or are altered in specific ways. This work led subsequent researchers to go beyond just constructing tests and instead to formulate ways to capture and model such variation. This research is a nice example of the type of complexities that might be out there in dynamic economic modeling, and it shows how statistics and econometrics help us to detect and accommodate these complexities.
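To make the idea concrete, here is a minimal sketch of the classic Chow test for a break at a known point, written in Python purely for illustration; the simulated data and split point are hypothetical.

```python
import numpy as np

def chow_test(y, X, split):
    """F-statistic for a structural break at a known index.

    Compares the residual sum of squares from one pooled regression
    against the combined residual sums from regressions fit separately
    on the two subsamples.
    """
    def rss(y_s, X_s):
        beta, *_ = np.linalg.lstsq(X_s, y_s, rcond=None)
        resid = y_s - X_s @ beta
        return resid @ resid

    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_1 = rss(y[:split], X[:split])
    rss_2 = rss(y[split:], X[split:])
    return ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))

# Hypothetical data: the intercept shifts halfway through the sample.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=n)
y[n // 2:] += 2.0  # the structural break
f_stat = chow_test(y, X, n // 2)
print(f_stat)
```

Under the null of no break, the statistic follows an F distribution with k and n − 2k degrees of freedom, so a value far out in the tail is evidence that the parameters changed.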

Xiaohong Chen developed so-called semiparametric methods, a field in which another prominent person in the audience here, Peter Robinson, has been a leader. This is a very important approach. With it, an econometrician can impose economic structure along some dimensions, for instance where there are key parameters with direct interpretations, while allowing flexibility along other dimensions. This research shows how to take a complex environment and allow for flexibility in some aspects of the model while still focusing the measurement on a few really central insights.


Taken together, the Chow and Chen contributions enhance our understanding of statistical complexity in different and very complementary ways. There is a lot of interest in so-called "big data." We should not expect big data to magically reveal everything of interest. For economists, this certainly is not the case. Big data and machine learning methods can tell us a lot about some things, but not so much about others. As economists, we engage in counterfactual analysis and policy analysis. In so doing, we need to extrapolate from places where data might be very informative to places in which there is data sparsity. Doing so requires explicit economic structure, and it is this interplay between the economics and the data that I view as very important. Addressing this challenge is where I think econometricians have very important contributions to make, including those by Xiaohong Chen and Gregory Chow.

How about confronting uncertainty? Once we get better characterizations of the statistical complexity of the underlying economic environments, how should this influence decision making? I think the best approach here is to be systematic and formal and draw upon insights from a variety of literatures.  

There is a big literature on decision theory with its origins in statistics. Savage wrote an elegant axiomatic treatment of decision-making under uncertainty, but even Savage realized the limits of his own approach. Such limitations have led to many extensions by a variety of researchers. The so-called axiomatic approach that economists have been so fascinated with proceeds as follows: once you commit to a set of axioms, they inform how to represent the preferences of decision makers. Arguably, these axioms are informative about what it means to be rational, and they help in structuring sensible approaches to decision-making under uncertainty. But there are important gaps between axioms and practice. Here is where engineers and control theorists contribute. Many economic problems are fundamentally dynamic in nature. Control theorists have also been concerned with how best to confront uncertainty while being cognizant of tractability. Gregory was a really early researcher in economics to embrace control-theoretic methods. Indeed, there are a lot of insights economists can extract from control theory, sometimes with important modifications. In collaboration with Thomas Sargent, I have certainly followed Gregory's lead in adapting and at times altering control theory insights to study economic dynamics under uncertainty. Moreover, the interplay between the decision theory coming out of economics and the robust formulations coming out of control theory has been very valuable in our research. Recent advances have allowed for the presence of alternative and potentially complex forms of uncertainty. This has resulted in applications to the modeling of economic agents, including households and entrepreneurs, as well as to providing guidance for economic policy.

In the analysis of financial markets there is often reference made to "bears versus bulls." I cannot say that either Xiaohong or Gregory has directly explained the "bears versus bulls" tension in financial markets, but I actually think some of the ideas they have been exploring can help us to understand why financial markets appear bold at some times and cautious at others. As decision-makers, investors at some points in time struggle with the economic environment they are situated in and, as a consequence, behave cautiously. At other times, they may proceed with more confidence. A statistical characterization of the environment and how it evolves over time can, I think, be an important input into understanding behavior like this: situations in which investors are more bearish and others in which they are more bullish. While this is a personal perspective, I believe that integrating a more statistical perspective can help us better understand market behavior.

Next, I turn to policy. I start with a quote from Hayek's Nobel address. For me, quoting Hayek is a bit strange. When you win the Nobel Prize you are supposed to write an essay explaining the research pertaining to your prize. This, of course, is kind of a boring thing to do, so many writers deviate a bit and add some newer perspectives. Hayek, after winning a Nobel Prize in Economic Sciences, wrote an essay questioning whether economics is really a science. He is very harsh on econometrics, and he basically argues that econometrics has accomplished little after many centuries' worth of attempts. He also says that while mathematics is useful as a language, quantitative components of economic analysis are highly suspect. I have to say I don't agree with all of Hayek's perspectives in his essay. This should be evident from what economists call "revealed preference," in other words, from my own choice of research. But I extracted from Hayek's essay a quote that I think is really an important one:

"Even if true scientists should recognize the limits of studying human behaviour, as long as the public has expectations, there will be people who pretend or believe that they can do more to meet popular demand than what is really in their power."

There is a bias in economic policy-making: the public and politicians want people to say things with great confidence even when that confidence is not warranted.

I was informed by one of my collaborators, William Brock, of a quote from Lyndon Johnson, who was President of the U.S. a few decades back. He was hoping economists could provide him with some forecasts and guidance. The economists answered with caution and avoided giving sharp numerical predictions. When they made reference to a range of outcomes that might happen, Lyndon B. Johnson looked at them and said, "Ranges are for cows, give me the number." There is a real sense in policy-making that politicians just want a simple answer, without qualification. Johnson's reaction captures some of Hayek's point in the quote I provided. I see the role of statistical and econometric methods as providing a more sober discussion of potential policy outcomes, and the role of decision theory as helping us use the outcomes of well-conceived statistical analyses in sensible ways when structuring policy responses.

I think of Gregory Chow as a pioneer in building econometric models of the Chinese economy. China has faced, and continues to confront, important and in some ways unique policy challenges. Gregory initiated econometric modeling of the Chinese economy because he wanted, on the one hand, to take account of uncertainty, and, on the other hand, to provide informed guidance to make the economy work better.

Let me mention some other policy challenges for which I think uncertainty is absolutely critical. A few years back, Steven E. Koonin wrote an editorial in the Wall Street Journal that was heavily criticized by the climate science community. Some of the criticisms challenged whether he was reporting evidence correctly. On this, I do not have the expertise to weigh in. But part of Koonin's essay makes the following statement:


"Any serious discussion of the changing climate must begin with acknowledging not only the scientific certainties but also the uncertainties, especially in projecting the future. Recognizing those limits, rather than ignoring them, will lead to a more sober and ultimately more productive discussion of climate change and climate policies."


There is a lot of interest now in quantifying how economic activity has altered the climate and how it will impact the climate in the future. While climate science models can be incredibly elaborate, they also have acknowledged flaws. Understanding how to use climate models in economic analyses, and confronting the resulting uncertainties when making projections, is a critically important modeling challenge. I have made this argument in other settings, and sometimes people in the audience conclude that I am a denier of the human input into the climate. They speculate that I will use such uncertainty concerns as evidence that we should do nothing until our understanding is much sharper. But the argument of "do nothing" rests on a misunderstanding of decision theory. Decision theory says that, when it makes sense, one should act now based on the possibilities of bad outcomes. There is nothing inherent in decision theory claiming the need for certainty in order to proceed with mitigation. The fear of acknowledging uncertainties seems to me to be largely misguided. Drawing on statistical characterizations of uncertainty within the context of formal decision theory can contribute to discussions of policy designed to counteract human impacts on the climate.

Uncertainty also should play a central role in discussions of financial market oversight. After the Dodd–Frank Act, the US formed the Financial Stability Oversight Council. China just recently announced the creation of its Financial Stability and Development Committee. These bodies engage in the oversight of financial markets with an eye on their macroeconomic consequences. This is an area in which our knowledge within the economics profession remains a little bit sparse. There is a term called "systemic risk." Systemic risk is something that, in the academic literature, was pretty sparsely used prior to the financial crisis and since then has been used extensively. Many recent papers have featured the term in a variety of different ways. It is challenging to sort out, quantitatively, which sources of so-called systemic risk in our financial systems are the most important and most critical. Our knowledge still remains somewhat sparse, and I would personally prefer to see the term "systemic risk" replaced by "systemic uncertainty."

Part of what is interesting in this recent intellectual history is that macroeconomics, before the financial crisis, typically had a limited role for the impact of financing restrictions. The models themselves were primarily at least approximately linear.  Much of the formal statistical analyses were done imposing the linear approximations.  

What is new to our thinking as we work through some of the potential financial impacts in the macro-economy? Nonlinearity. Nonlinearity emerges in how random impulses and shocks impact the macro-economy over subsequent time periods. Thus, there is nonlinearity in the so-called transition mechanisms. It remains an issue for subsequent research to quantitatively assess these nonlinearities. Understanding how this nonlinearity impacts the macroeconomy is going to be a very critical intellectual challenge as this research progresses.

Xiaohong Chen has provided some very important characterizations of nonlinear dependence that are valuable in helping us understand how these nonlinear mechanisms work and how best to characterize them. Moreover, she has developed refinements of copula techniques that are specifically designed to go beyond what very low-dimensional nonlinear models can handle while preserving tractability. Thus, I see promise in research that connects the type of contributions Xiaohong Chen has made to substantive problems that emerge in finance and macroeconomics. Such advances would be very fruitful directions for the future, ones that I hope will be explored in subsequent research.
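To give a flavor of the copula idea itself (a generic textbook illustration, not Xiaohong Chen's specific estimators), the sketch below constructs a Gaussian copula: each variable is marginally uniform, yet the pair is strongly dependent, because the dependence structure is specified separately from the marginal distributions.

```python
import math
import numpy as np

# Draw correlated bivariate normals; rho is the copula's dependence parameter.
rng = np.random.default_rng(1)
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=50_000)

# Applying the standard normal CDF coordinate-wise makes each margin
# Uniform(0, 1) while leaving the joint dependence intact.
std_normal_cdf = np.vectorize(lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))))
u = std_normal_cdf(z)

# Uniform margins, but far from independent.
print(u.mean(axis=0), np.corrcoef(u[:, 0], u[:, 1])[0, 1])
```

The same uniform margins could then be mapped through any desired marginal distributions, which is what makes copulas a flexible way to model nonlinear dependence in time series.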


In summary, I firmly believe that the types of problems successfully addressed in the Chow and Chen research, and that scholars continue to wrestle with, are essential to integrate into our tools for policy analysis. They help us to formulate and confront statistical complexity, and it is the interplay between this complexity and how we build and analyze economic models that helps us incorporate broad notions of uncertainty, all the way from individuals and firms to the outcomes of economic policy.

I end my discussion by quoting from a rather well-known Chinese philosopher: Confucius. I can’t verify the translation here, so others will have to judge whether I did a good job or not. The English translation is the following:

"When you know a thing, hold that you know it; and when you do not know a thing, allow that you do not know it; this is knowledge."


Thank you!



Source: National Economics Foundation (please indicate the source if authorized)

Photo: National Economics Foundation